The duration of load effect in lumber as stochastic degradation
Samuel W K Wong (Department of Statistics, University of Florida, Gainesville, FL) and James V Zidek (Department of Statistics, University of British Columbia, Vancouver, BC)

September 28, 2018
Abstract
This paper proposes a gamma process for modelling the damage that accumulates over time in the lumber used in structural engineering applications when stress is applied. The model separates the stochastic processes representing features internal to the piece of lumber on the one hand, from those representing external forces due to applied dead and live loads. The model applies those external forces through a time-varying population level function designed for time-varying loads. The application of this type of model, which is standard in reliability analysis, is novel in this context, which has been dominated by accumulated damage models (ADMs) for more than half a century. The proposed model is compared with one of the traditional ADMs. Our statistical results, based on a Bayesian analysis of experimental data, highlight the limitations of using accelerated testing data to assess long-term reliability, as seen in the wide posterior intervals. This suggests the need for more comprehensive testing in future applications, or for encoding appropriate expert knowledge in the priors used for Bayesian analysis.
Keywords: gamma process; degradation; duration of load; wood products; accumulated damage models

Introduction
Wood placed under a sustained load over a period of time in an engineering application will sustain damage due to the "duration of load" (DOL) effect (Karacabeyli and Soltis, 1991). The importance of this effect led to the development of models for predicting it, so that it could be incorporated into the establishment of design values and safety factors in applications. However, shortcomings in these models, which we now describe, led the authors to develop the alternative approach described in this paper. The two approaches are then compared in an illustrative application.

Long term damage in a piece of lumber, resulting in a reduction of its strength-bearing capacity, depends on the load level and how it is applied (e.g., via bending, compression, tension). The speed at which a piece weakens over time may also depend on a combination of factors such as the visco-elasticity of the wood, temperature, and moisture. Even a relatively small constant load applied over a sufficiently long period of time may lead to failure (known as creep rupture). According to the review of Rosowsky and Bulleit (2002), the effect was first recognized by Haupt (1867), but it does not seem to have been formally incorporated into design standards until Wood et al. (1960) produced the so-called Madison Curve for doing so. The curve is still in use today for estimating the DOL effect on the strength of wood. However, the purely empirical approach of Wood led only to a fitted curve, so an alternative dynamic model was developed to describe how damage accumulates over time as a function of the stress load profile (Barrett and Foschi, 1978a,b; Gerhards, 1979).
These accumulated damage models (ADMs) differ in detail, but the idea is the same: to focus on the rate at which damage accumulates rather than the damage itself, using an ordinary differential equation (ODE)

\[ \frac{d\alpha(t)}{dt} = F[\alpha(t), \sigma(t), \varphi], \qquad (1) \]

which represents the rate of damage accumulation for a randomly selected piece of lumber. The vector \(\varphi\) contains (random) parameters associated with the piece itself, with their joint probability distribution depending on population parameters that must also be fitted to implement the model. Once a piece is selected, the model (1) deterministically describes the rate at which damage accumulates in that piece. While the accumulated damage at time t, α(t), is unobservable, the ODE provides a framework onto which the other elements of the model can be attached. It is calibrated so that α(t) = 0 when t = 0 and no damage has occurred, and α(T_l) = 1 at time t = T_l when the piece fails. Here σ(t) = τ(t)/τ_s, where τ(t) (psi) is the applied stress at time t and τ_s (psi) is the 'short-term breaking strength' of the piece (commonly defined to be the stress at which the piece would fail were it to be subjected to a short ramp load test). One widely used ADM, known as the Canadian model, specifies F in Equation (1) as a function of α(t), namely

\[ \dot\alpha(t) = a\,[\tau(t) - \sigma_0 \tau_s]_+^{\,b} + c\,[\tau(t) - \sigma_0 \tau_s]_+^{\,n}\, \alpha(t), \qquad (2) \]

where a, b, c, n, σ₀ are log-normally distributed random effects for the piece. Here σ₀ is known as the stress ratio threshold, and x₊ = max(x, 0), so that no damage accumulates while τ(t) < σ₀τ_s. However, Ellingwood and Rosowsky (1991) point out that the Canadian model cannot be nondimensionalized. That is a serious issue, since a model that represents a natural process cannot ultimately depend on how the quantities involved in the model are measured. The concern was seen to be of sufficient importance that Ellingwood and Rosowsky (1991) exclude the model from their comparative analysis of ADMs.
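For intuition, the deterministic evolution that Equation (2) prescribes for a single piece can be integrated numerically. The sketch below uses a forward Euler scheme with purely illustrative random-effect values and a hypothetical ramp load; it shows the mechanics of an ADM, not the calibrated Canadian model.

```python
def damage_rate(alpha, tau, a, b, c, n, sigma0, tau_s):
    """Rate of damage accumulation in an ADM of the form of Equation (2)."""
    excess = max(tau - sigma0 * tau_s, 0.0)  # x_+ = max(x, 0)
    return a * excess**b + c * excess**n * alpha

def failure_time(load, a, b, c, n, sigma0, tau_s, dt=0.001, t_max=100.0):
    """Integrate the ODE by forward Euler until alpha reaches 1 (failure)."""
    alpha, t = 0.0, 0.0
    while t < t_max:
        alpha += damage_rate(alpha, load(t), a, b, c, n, sigma0, tau_s) * dt
        t += dt
        if alpha >= 1.0:
            return t
    return None  # censored: no failure within t_max

# Illustrative (not fitted) parameters for a hypothetical piece under a ramp load:
ramp = lambda t: 4000.0 * t  # applied stress in psi per unit time
T = failure_time(ramp, a=0.5, b=1.2, c=0.1, n=1.5, sigma0=0.5, tau_s=5000.0)
print(T)
```

Damage only begins once the ramp passes the piece's stress ratio threshold (here at σ₀τ_s = 2500 psi), after which α(t) grows rapidly to 1.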
The review of ADMs in Hoffmeyer and Sørensen (2007) instead modifies the Canadian model to correct the dimensions; this difficulty with the Canadian model, as well as with other ADMs, was also described in Zhai (2011) and Zhai et al. (2012). Wong and Zidek (2016) also address the problem by invoking the Buckingham π theorem to build reparametrized models that are dimensionally consistent while retaining their functional form.

Another difficulty associated with the ADM approach is its computational burden, due to the need to solve ODEs such as Equation (2) numerically for each individual piece of lumber; this restricts the use of standard likelihood-based methods for their analysis. As a result, uncertainties in both the parameter estimates and subsequent reliability calculations are difficult to quantify. Yang et al. (2017) address this latter difficulty by proposing approximate Bayesian computation techniques to perform the analysis on a solid statistical platform; however, a large cluster of CPUs is needed to carry out the subsequent reliability calculations with high-accuracy ODE numerical solvers.

As a final limitation of the ADM approach, randomness in the process of damage accumulation within a given piece is ignored, which may not be realistic. In consequence, estimates of ADM parameters are difficult to interpret, as population level and piece-specific modeling are inherently intertwined. So this paper presents a new approach for modelling the DOL effect that overcomes the difficulties described above. It is based on the gamma process, a standard approach to modelling degradation that has a long history (Lawless and Crowder, 2004).
But to the authors' knowledge it has not previously been used for wood products. To successfully apply this approach to lumber, we formulate a model for the time-varying shape parameter that accounts for the time-varying loads which pieces must sustain.

In a major point of departure from the ADM approach above, the degradation of a piece of lumber under the gamma process remains random (even conditional on it having been selected): it is represented by a stochastic process Y_t, t ≥ 0, that describes the damage accumulated up to time t. That process is internal to the piece, and can be thought of as representing its random progress of damage. The future combination of dead and live loads, which may be a random process, is external to that piece. Given a realized load profile, these two ingredients are fused through the deterministic time-varying population level shape parameter. This separation of internal and external sources of variability has advantages in terms of the interpretability of the results and facilitates the use of principled statistical methods of analysis.

The paper presents a number of notable findings that we now summarize.

• Weak evidence is found of a threshold effect below which no degradation in the population occurs, that being an estimated threshold level of 413psi for the population from which the test data in our illustrative application are drawn. But the posterior credibility bands are wide, making the possibility of no threshold quite plausible.

• The experimental data suggest that the degradation as a function of time for a lumber population cannot be explained by the simple power law commonly used in gamma process applications.

• Our reliability analysis suggests that under a simulated future dynamic occupancy load, the chance of failure of a piece of lumber before the end of fifty years is 9%. This contrasts with a more optimistic estimate of 1.5% obtained by application of an ADM under the same simulated future.

• Finally, the residual life of a piece of lumber that has survived at least four years under a constant load of 3000psi has a long right-tailed distribution, with a median survival time in the range of 22 to 333 years if that same load is sustained indefinitely. The substantial amount of uncertainty revealed by our analysis points to the need for much more testing to precisely estimate reliability under low sustained loads.

The paper is organized as follows.
Section 2 introduces the gamma process as a way of describing the damage due to the stress applied to wood when placed in service. Section 3 describes how the gamma process may be used to model degradation; in particular, it is shown how the random degradation process for a randomly chosen piece of lumber is coupled to a model for its population, and how the applied stress profile operates to cause damage to accumulate. A major contribution follows in Section 4: a Bayesian approach for applying the gamma process model is developed and applied to data obtained in an accelerated testing experiment designed to explore the duration of load effect. A discussion of the lessons learned in Section 4 about the reliability of lumber comes next in Section 5. Another application follows in Section 6, where it is shown how the residual life of a piece of lumber in service can be predicted. Further discussion and concluding remarks follow in Section 7.
In this section we briefly review the basics of the gamma process as it relates to modeling lumber degradation. Let Y_t, t ≥ 0, denote the accumulated damage at time t. Assume Y_0 = 0 and that Y_t is nondecreasing in t. The specimen fails at time t = T when the damage exceeds a pre-specified threshold level indicating failure. Without loss of generality, we may scale the degradation process so that failure occurs at Y_T = 1. Virtually, the degradation process can be thought to continue for t > T even though by that time the specimen will have failed.

Conditional on the parameters for a randomly selected lumber specimen, assume Y_t, t ≥ 0, has independent increments: for any 0 ≤ t_0 < t_1 < ··· < t_n, the increments Y_{t_i} − Y_{t_{i−1}}, i = 1, ..., n, are stochastically independent. The distribution of these increments may depend on factors internal to the specimen as well as the external effects of the applied stress, resulting in damage that accumulates as a series of successive jumps of random size. The particularly simple family of models we adopt assumes Y_t is a compound Poisson process with intensity function λ_t, t > 0, i.e.

\[ Y_{t_i} - Y_{t_{i-1}} = \sum_{j=1}^{N(t_{i-1}, t_i)} X_j, \]

where, conditional on the model parameters,

\[ P[N(t_{i-1}, t_i) = n] = \frac{\Lambda(t_{i-1}, t_i)^n \exp\{-\Lambda(t_{i-1}, t_i)\}}{n!}, \quad \text{with } \Lambda(t_{i-1}, t_i) = \int_{t_{i-1}}^{t_i} \lambda_t \, dt, \]

while the random jumps X_j, which are independent of the Poisson count process, have a gamma distribution with shape parameter η and scale ξ. Standard theory then implies that, conditional on the model parameters,

\[ E[Y_{t_i} - Y_{t_{i-1}}] = \xi\eta\, \Lambda(t_{i-1}, t_i), \qquad Var[Y_{t_i} - Y_{t_{i-1}}] = \xi^2 \eta (1 + \eta)\, \Lambda(t_{i-1}, t_i). \]

As the intensity parameter increases and the gamma shape parameter decreases, we approach in the limit the so-called gamma process, which has an infinite number of infinitesimally small jumps. That model has been used extensively to model degradation. More formally,

\[ Y_{t_i} - Y_{t_{i-1}} \sim Ga[\xi,\; \eta_{t_i} - \eta_{t_{i-1}}], \]

where η_t, t ≥ 0, is a nondecreasing function and Ga[ξ, η_t] denotes the gamma distribution with scale parameter ξ and shape parameter η_t. The scale ξ = ξ(x) is a scalar-valued quantity that could also depend on fixed covariates x associated with a specimen, such as the modulus of elasticity. From standard theory we then obtain

\[ E[Y_t \mid \xi, \eta_t] = \xi(x)\, \eta_t, \qquad Var[Y_t \mid \xi, \eta_t] = \xi(x)^2\, \eta_t, \qquad CV[Y_t \mid \xi, \eta_t] = \eta_t^{-1/2}. \]

Provided that multiple gamma processes have the same scale ξ (which in effect means each has a scale that is a known multiple of ξ), their sum is also a gamma process. More precisely, assume that conditional on η_{it} and ξ, the {Y_{it}}, i = 1, ..., r, are independent gamma processes with shapes η_{it} and scales ξ. Then the sum

\[ Y_t = Y_{1t} + \cdots + Y_{rt} \qquad (3) \]

is also a gamma process with shape η_t = Σ_i η_{it} and scale ξ. This is a useful property since it provides a convenient framework for combining different processes that contribute to degradation. The process corresponding to damage due to the applied load profile is that of primary interest in this paper.
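The limiting gamma process is straightforward to simulate from its independent-increments definition. The sketch below uses an illustrative power-law shape function η_t = t^0.4 and scale ξ = 0.05 (stand-ins, not fitted values) and checks the Monte Carlo mean of Y_t against E[Y_t] = ξη_t.

```python
import numpy as np

rng = np.random.default_rng(0)
xi = 0.05                  # common scale parameter (illustrative)
eta = lambda t: t**0.4     # an increasing shape function (illustrative power law)

def sample_path(times, eta, xi, rng):
    """Sample a gamma-process path at the given times via independent increments."""
    d_eta = np.diff(eta(times), prepend=0.0)  # shape increments eta_{t_i} - eta_{t_{i-1}}
    return np.cumsum(rng.gamma(shape=d_eta, scale=xi))

times = np.linspace(0.01, 10.0, 200)
paths = np.array([sample_path(times, eta, xi, rng) for _ in range(5000)])

# E[Y_t] = xi * eta_t: compare the Monte Carlo mean at t = 10 with the theory
print(paths[:, -1].mean(), xi * eta(10.0))
```

Because the shape increments of independent paths with a shared scale simply add, the additivity property in Equation (3) holds by construction in this simulation scheme.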
As potential extensions, other external factors that contribute to damage, such as the time-varying moisture content and temperature of the environment, could be incorporated as separate components in Equation (3). However, we do not at present have the data to illustrate these refinements to the model.

The gamma process induces a probability distribution for the failure time T, which we briefly review as follows; detailed proofs of these results can be found in Paroissin and Salami (2014). The survival function for T is

\[ P[T > t \mid \xi, \eta_t] = P[Y_t \le 1 \mid \xi, \eta_t] = \int_0^1 \frac{u^{\eta_t - 1}\, e^{-u/\xi}}{\xi^{\eta_t}\, \Gamma(\eta_t)} \, du = 1 - \frac{\Gamma(\eta_t, 1/\xi)}{\Gamma(\eta_t)}, \qquad (4) \]

where Γ(·,·) denotes the upper incomplete gamma function. When η_t is differentiable, it follows that the probability density of T needed for the construction of the likelihood function is

\[ f_T[t \mid \xi, \eta_t] = -\frac{dP[T > t \mid \xi, \eta_t]}{dt} = \dot\eta_t\, \big(\Psi(\eta_t) - \log(1/\xi)\big) \left(1 - \frac{\Gamma(\eta_t, 1/\xi)}{\Gamma(\eta_t)}\right) + \frac{\dot\eta_t}{\eta_t^2\, \Gamma(\eta_t)}\, (1/\xi)^{\eta_t}\; {}_2F_2\big(\eta_t, \eta_t;\, \eta_t + 1, \eta_t + 1;\, -1/\xi\big), \qquad (5) \]

where Ψ is the digamma function, {}_pF_q is the generalized hypergeometric function of order (p, q), and the failure threshold has been scaled to 1 as above.

Of primary interest is characterizing the gamma process representing damage due to the applied stress. Suppose the population is stressed with load profile τ(t), t ≥ 0. Then τ(t) has a fundamental role in determining the corresponding value of η_t. In particular, η_t must account for the degradation effects of the entire load history profile until time t. For lumber degradation, we assume two basic properties for η_t:

(i) If τ(t) ≤ τ* for δ₁ ≤ t ≤ δ₂, then η̇_t = 0 for all t ∈ (δ₁, δ₂), where τ* is a threshold stress level below which the population does not undergo degradation.

(ii) If τ(t) is held at a constant level larger than τ* for δ₁ ≤ t ≤ δ₂, then η̇_t is decreasing over the interval δ₁ ≤ t ≤ δ₂.

The first property implies that degradation does not progress during periods when the stress is too low to cause damage.
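Equation (4) is easy to evaluate with standard special-function libraries, since Γ(η_t, 1/ξ)/Γ(η_t) is the regularized upper incomplete gamma function. A minimal sketch, with a hypothetical shape function standing in for the model developed below; the density is obtained here by differencing rather than via the hypergeometric form of Equation (5):

```python
from scipy.special import gammaincc
from scipy.stats import gamma

def survival(t, xi, eta):
    """Equation (4): P[T > t] = 1 - Gamma(eta_t, 1/xi) / Gamma(eta_t)."""
    return 1.0 - gammaincc(eta(t), 1.0 / xi)

def density(t, xi, eta, h=1e-5):
    """f_T(t) = -dP[T > t]/dt, by central differences in place of Equation (5)."""
    return (survival(t - h, xi, eta) - survival(t + h, xi, eta)) / (2 * h)

xi = 0.5
eta = lambda t: t**0.4  # hypothetical shape function, not the fitted model

# Cross-check: survival equals P[Y_t <= 1] for Y_t ~ Ga(shape eta_t, scale xi)
t = 5.0
print(survival(t, xi, eta), gamma.cdf(1.0, a=eta(t), scale=xi), density(t, xi, eta))
```

The cross-check against the gamma CDF confirms the incomplete-gamma identity used in Equation (4).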
The threshold τ* is a population analogue of the damage threshold commonly seen in ADMs (see Introduction). The second property captures the DOL effect: if the load is held constant at a stress level high enough to cause failures in the population, degradation continues as that constant load is maintained, but the rate at which it occurs is expected to slow over time. These properties will guide the specific choice of η_t.

We now develop a specific functional form for the shape parameter η_t, along with parameters to be estimated from data in the illustrative example. The 'power law' and its variants have been commonly used to model degradation and serve as a useful starting point for developing specific model implementations. Suppose τ is a given load level held constant over time. Then we can conceive a simple form for η_t to characterize the degradation in a population of pieces subject to that load from time 0 to t as

\[ \eta_t = g(t) \times (u\tau - v)_+, \qquad (6) \]

where g(·) is an increasing function that captures the DOL effect, u and v are positive constants, and x₊ = max(x, 0). The term (uτ − v)₊ is constant over time, depending only on the size of the load; it is zero when that stress level is sufficiently low, in accord with property (i). That is, there is a threshold τ* = v/u below which no degradation occurs. The function g(t) governs the rate of degradation in the population over time under that fixed load. The simple form g(t) = t^a with a > 0 is the usual choice in gamma process applications; we instead adopt g(t) = t^a + bt^c, where a, b, c are all positive parameters with a < c, which has the feature of mixing two different power law growth rates. In particular, by setting the constraint a < c we expect that t^a will capture the shorter-term effect well, while the role of bt^c becomes more important over longer time durations.

In practice, the load may vary over time. Let 0 = τ_0 < τ_1 < τ_2 < ··· < τ_m denote a sequence of load levels spanning the range of loads to which the population may be subjected. Then for each load level τ_i, i = 1, ..., m, we can consider the amount of incremental degradation due to load τ_i beyond that which was sustained from load τ_{i−1}. A natural analogue to Equation (6) for this load increment, for time 0 to t, is

\[ g(\tilde t_i)\, \big[(u\tau_i - v)_+ - (u\tau_{i-1} - v)_+\big], \]

where \(\tilde t_i = \int_0^t I(\tau(t') \ge \tau_i)\, dt'\) is the total time duration for which the load has equalled or exceeded τ_i. Thus the constant term [(uτ_i − v)₊ − (uτ_{i−1} − v)₊] captures the incremental 'jump' in η_t that occurs due to load level τ_i being reached. Similarly g(t̃_i), as a function of the total length of time for which the load level τ_i is sustained, now models its corresponding DOL effect.

We can then combine the contributions of all the load levels to construct η_t for any arbitrary given load profile. Using our chosen form for g(t), we thus obtain

\[ \eta_t = \sum_{i=1}^m \big(\tilde t_i^{\,a} + b\, \tilde t_i^{\,c}\big)\, \big[(u\tau_i - v)_+ - (u\tau_{i-1} - v)_+\big], \qquad (7) \]

which reduces to Equation (6) in the special case that the load is held constant at τ from time 0 to t. It can be seen that η_t is differentiable, since if τ_j ≤ τ(t) < τ_{j+1}, we have

\[ \dot\eta_t = \sum_{i=1}^j \big(a\, \tilde t_i^{\,a-1} + bc\, \tilde t_i^{\,c-1}\big)\, \big[(u\tau_i - v)_+ - (u\tau_{i-1} - v)_+\big]. \qquad (8) \]

Thus when the exponents a and c are each less than 1, η̇_t is decreasing over any period with a fixed load level, in accord with property (ii).

A specific sequence of load levels τ_1, ..., τ_m needs to be chosen for computation. These serve as the incremental thresholds over which additional degradation contributions are added into the model. For example, if τ_j = 3000psi and τ_{j+1} = 3020psi, then any loads in the interval [3000, 3020) contribute to η_t in this model as a load of exactly 3000psi. Naturally, the range of loads may be discretized as finely as desired to faithfully reproduce the stress history, at the cost of additional computation time. In our demonstration we use an equally-spaced sequence for τ_1, ..., τ_m with intervals of 20psi.
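A direct implementation of Equation (7) on a discretized time grid might look as follows. The parameter values here are placeholders for illustration, not the fitted posterior estimates; for a constant load the result can be checked against Equation (6).

```python
import numpy as np

def eta_t(t_grid, load, levels, a, b, c, u, v):
    """Evaluate the shape function eta_t (Equation 7) on a time grid,
    for a load profile and discretized load levels tau_1 < ... < tau_m."""
    tau = load(t_grid)                       # load at each grid time
    dt = np.diff(t_grid, prepend=t_grid[0])  # grid spacing (first step is 0)
    pos = lambda x: np.maximum(x, 0.0)       # x_+ = max(x, 0)
    eta = np.zeros_like(t_grid)
    prev = 0.0
    for tau_i in levels:
        jump = pos(u * tau_i - v) - pos(u * prev - v)  # incremental threshold term
        if jump > 0:
            # time spent at or above tau_i up to each grid point (t-tilde)
            t_tilde = np.cumsum(dt * (tau >= tau_i))
            eta += (t_tilde**a + b * t_tilde**c) * jump
        prev = tau_i
    return eta

# Hypothetical parameter values (illustration only, not the fitted posterior)
a, b, c, u, v = 0.02, 0.01, 0.4, 0.001, 0.4
t_grid = np.linspace(0.0, 1000.0, 10001)
const_load = lambda t: np.full_like(t, 3000.0)  # constant 3000 psi
levels = np.arange(20.0, 4500.0, 20.0)          # 20 psi discretization
eta_const = eta_t(t_grid, const_load, levels, a, b, c, u, v)
```

Under the constant load the threshold terms telescope, so the final value should equal (t^a + bt^c)(uτ − v)₊ exactly, as in Equation (6).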
An artifact of the discrete load levels in the model is that if the load profile τ(t) has periods of continuous increase, the resulting η_t becomes jagged as the load passes the different thresholds, rather than smoothly increasing with the load. In this case a line segment can be used to smooth η_t between the time points when successive load thresholds are reached, to serve as an acceptable approximation.

This section presents a Bayesian analysis of data from an accelerated testing experiment designed to explore the duration of load effect.
The real data we subsequently analyze come from the DOL experiment reported in Foschi and Barrett (1982). It consists of a total of 637 pieces of visually graded 2x6 Western Hemlock, divided for testing under three different load profiles (all time units in hours unless otherwise indicated):

1. 198 pieces were assigned the load profile τ(t) = 388440t for t ≤ 3000/388440, and τ(t) = 3000 for 3000/388440 < t ≤ 4 years, i.e., the load was increased linearly until reaching 3000psi and held at that constant level for 4 years. Hence pieces that do not fail by the end of the 4-year period, when the test is truncated, have their failure time censored.

2. 300 pieces were assigned the load profile τ(t) = 388440t for t ≤ 4500/388440, and τ(t) = 4500 for 4500/388440 < t ≤ 1 year, which is similar to the above, now with a constant load level of 4500psi for 1 year. Pieces that do not fail by the end of the 1-year period when the test is truncated have their failure time censored.

3. 139 pieces were assigned the load profile τ(t) = 388440t until failure.

In the DOL literature, profiles 1 and 2 are known as 'constant load' tests, while profile 3 is known as a 'ramp load' test. These are so-called 'accelerated' testing schemes that were originally designed to help elucidate the long-term DOL effect using tests of relatively shorter duration (Barrett and Foschi, 1978b). Each piece that failed during the test had its failure time recorded. Pieces that did not fail during the test duration had their censoring times recorded (i.e., 4 years for group 1 and 1 year for group 2). No covariates for individual specimens were recorded in the data.

We now perform an illustrative analysis of these accelerated testing data based on the model developed, using the techniques of Bayesian inference. Let θ denote the vector of parameters to be inferred, which consists of the five parameters associated with the model for η_t along with the gamma process scale parameter ξ, namely θ = (a, b, c, u, v, ξ). Let π(θ) denote the joint prior distribution on θ.
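Assuming the constant-load profiles use the same ramp rate as profile 3 (an assumption for illustration; the rate for profiles 1 and 2 is not stated separately), the test profiles can be encoded as simple functions of time in hours:

```python
HOURS_PER_YEAR = 24 * 365.25
RAMP_RATE = 388440.0  # psi per hour, the rate reported for the ramp load test

def profile_constant(t, level, hold_years):
    """Ramp up at RAMP_RATE, then hold at `level` psi for `hold_years` years."""
    t_ramp = level / RAMP_RATE
    if t <= t_ramp:
        return RAMP_RATE * t
    elif t <= hold_years * HOURS_PER_YEAR:
        return level
    return 0.0  # test has ended (surviving pieces are censored)

# Profile 1: 3000 psi held for 4 years; profile 2: 4500 psi held for 1 year
print(profile_constant(2.0, 3000.0, 4.0))  # two hours in: constant-load phase
```

The ramp phase is very short relative to the hold phase (under a minute of the 4-year test), which is what makes these 'constant load' tests in the DOL terminology.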
Then using the likelihood in Equation (5), the posterior distribution of θ based on an independent sample of test specimens with recorded failure times t_1, t_2, ..., t_n is given by

\[ \pi(\theta \mid t_1, t_2, \ldots, t_n) \propto \pi(\theta) \prod_{i=1}^n f_T(t_i \mid \xi, \eta_{t_i}), \qquad (9) \]

where η_{t_i} denotes evaluating Equation (7) for η_t at time t_i according to the load profile τ(t) associated with specimen i. For some specimens the actual failure times are not observed, as the test ended after a specified duration without the specimen failing. The likelihood contribution f_T(t_i | ξ, η_{t_i}) for those specimens is then replaced by the corresponding survivor function, namely P(T_i > t_c | ξ, η_{t_c}) computed by Equation (4), where t_c is the truncation time. Equation (9) thus can accommodate all the test data to be analyzed under the different loading profiles employed in the experiment. Importantly, we emphasize it is assumed that the same set of parameters can model the degradation of the population under any loading scenario. That assumption, which implies that the parameters of a fitted model can then be used with any load profile τ(t) of interest, has been fundamental to much of the previous work with ADMs that involves the probabilistic assessment of long-term lumber reliability. An example follows in Section 5.

To proceed with the analysis, we use vague independent Normal(µ = 0, σ = 1000) priors for each of the parameters in θ, along with the restriction a < c. As the form of the posterior is intractable for direct sampling, we employ Markov chain Monte Carlo (MCMC) techniques to obtain sample draws from it. To obtain reasonable starting values for the MCMC, we first used Nelder-Mead iterations to optimize the posterior.
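A sketch of the log of Equation (9), with right-censored specimens contributing survival terms from Equation (4). The `eta_at` and `log_density` callables are placeholders standing in for Equations (7) and (5):

```python
import numpy as np
from scipy.special import gammaincc
from scipy.stats import norm

def log_posterior(theta, failures, censored, eta_at, log_density):
    """Log of Equation (9): vague N(0, 1000) priors, density terms for observed
    failure times, and survival terms for right-censored specimens.

    theta        -- (a, b, c, u, v, xi)
    failures     -- list of (t_i, profile id) observed failure times
    censored     -- list of (t_c, profile id) censoring times
    eta_at       -- function(t, profile, theta) -> eta_t, as in Equation (7)
    log_density  -- function(t, profile, theta) -> log f_T(t), as in Equation (5)
    """
    a, b, c, u, v, xi = theta
    if not (0 < a < c) or min(b, u, v, xi) <= 0:
        return -np.inf  # enforce positivity and the a < c restriction
    lp = norm.logpdf(theta, loc=0.0, scale=1000.0).sum()  # vague priors
    for t, prof in failures:
        lp += log_density(t, prof, theta)
    for t_c, prof in censored:
        surv = 1.0 - gammaincc(eta_at(t_c, prof, theta), 1.0 / xi)  # Equation (4)
        lp += np.log(surv)
    return lp

# Demo with stand-in callables (not the real Equations (7) and (5))
demo = log_posterior((0.1, 1.0, 0.5, 1.0, 1.0, 0.05),
                     failures=[(1.0, 0)], censored=[(2.0, 0)],
                     eta_at=lambda t, p, th: t ** th[0],
                     log_density=lambda t, p, th: -1.0)
print(demo)
```

In practice this function would be handed to the optimizer for starting values and then to the MCMC sampler.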
Then, to improve convergence and the efficiency of posterior exploration via MCMC, we used parallel tempering (Swendsen and Wang, 1986) distributed over 120 compute cores, with each core running an MCMC chain using simple Metropolis-Hastings iterations and temperatures geometrically spaced from 1 to 20. Swaps between chains were performed every five iterations. The first 5,000 iterations were discarded as burn-in, and the following 15,000 iterations from the chain representing the target posterior distribution constitute our final samples.

Summaries of the posterior samples of the parameters are shown in Table 1. A few observations can be noted. First, there is a clear distinction between the powers a and c, with posterior means of 0.019 and 0.40 respectively, indicating that a single power law does not adequately explain the observed degradation over time. Second, there is only weak evidence for a stress threshold τ* below which no population degradation occurs; the MCMC samples yield a posterior mean for the threshold level v/u of 413psi and a highly uncertain 95% posterior interval whose lower endpoint is 43psi. Third, there is substantial uncertainty in the parameter b, whose central 95% posterior probability interval spans two orders of magnitude.

[Table 1: Posterior summaries of the parameters a, b, c, u, v, ξ.]

[Figure 1: Cumulative distribution functions (CDFs) associated with the fitted gamma process model, on the three calibration datasets (constant load 4 years at 3000psi, constant load 1 year at 4500psi, and ramp load), plotted against log failure time. In each plot, the black points show the empirical CDF of the dataset. The dashed curve is the CDF associated with the set of parameters with the highest posterior density among the MCMC samples, while the grey area represents the 95% posterior probability interval of the CDF based on the MCMC samples. The vertical dotted lines indicate the censoring times for the two constant load scenarios.]

We now turn to applying the fitted model to an example of a predictive scenario, such as those analyzed in reliability assessments. Foschi et al. (1989) use stochastic processes to characterize load profiles on individual lumber members over the lifetime of a wood structure, and an adapted example of a heavier than typical 50-year load profile for a residential dwelling unit is shown in the left panel of Figure 2. This profile is a piecewise constant function obtained by summing different component loads. Intuitively, the total load at any given time includes the constant dead weight of the structure, along with load from occupancy which varies by resident. In addition, the 'spikes' correspond to various short-term loads that are expected to occur periodically in homes.
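Given posterior draws of η at the end of a load profile and of ξ, the failure probability from Equation (4) can be evaluated per draw to produce posterior summaries like those reported in this section. The draws below are synthetic stand-ins for the actual MCMC output:

```python
import numpy as np
from scipy.special import gammaincc

def failure_prob(eta_end, xi):
    """P[T <= t_end] = Gamma(eta_end, 1/xi) / Gamma(eta_end), from Equation (4)."""
    return gammaincc(eta_end, 1.0 / xi)

# Stand-in posterior draws of (eta at 50 years, xi); real values come from the MCMC
rng = np.random.default_rng(1)
eta_draws = np.abs(rng.normal(3.0, 0.5, size=4000)) + 0.1
xi_draws = np.abs(rng.normal(0.18, 0.02, size=4000)) + 1e-4

probs = failure_prob(eta_draws, xi_draws)
print(probs.mean(), np.percentile(probs, [2.5, 97.5]))
```

Each MCMC draw yields one failure probability, so the spread of `probs` directly reflects the posterior uncertainty in the 50-year reliability assessment.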
[Figure 2: Reliability assessment example. The left panel shows an example of a simulated residential 50-year load profile (load in psi versus year), adapted from Foschi et al. (1989). The right panel shows the corresponding η_t of the fitted gamma process model under this load profile. The black curve shown is computed on the set of parameters with the highest posterior density among the MCMC samples, while the grey area represents the 95% posterior probability interval based on the MCMC samples.]

Using the parameters from the fitted model, we may compute η_t corresponding to this load profile using Equation (7). The solid black curve shows η_t computed for this 50-year period using the sampled parameter vector with the highest posterior density. It can be seen that η_t increases rapidly the first time the load exceeds a new threshold, for example, at around 15 years; when a load of similar magnitude recurs at around 48 years, its effect on η_t is much more diminished. As before, the grey area represents 95% posterior bands based on the MCMC samples.

Ultimately the probability of failure by the end of the 50-year period is of primary interest. This is determined by the value of η_t at 50 years, along with the scale parameter ξ of the gamma process, according to Equation (4). We obtain a posterior mean for the probability of failure of 0.090; the accompanying central 95% posterior interval is again wide.

As a further application of the fitted Bayesian model, we may use the MCMC samples to compute the posterior probability distributions of the residual life for pieces that have not failed up to a given time t′. This requires knowledge of η_{t′}, which in our model is computed from the load profile τ(t), 0 ≤ t ≤ t′, and the fitted parameters, as well as a characterization of the expected future loads τ(t), t > t′. Letting T denote the random variable for the failure time, of interest is the distribution of T_r := [T | T > t′] − t′, which represents the remaining lifetime. It has survivor function

\[ P[T_r > t_r \mid \xi, \eta_t, \eta_{t'}] = \frac{P[T > t' + t_r \mid \xi, \eta_t]}{P[T > t' \mid \xi, \eta_{t'}]}, \]

which may be computed using Equation (4).

To illustrate, we use the two constant-load scenarios in the experimental data, where specimens were held at load levels of 3000psi and 4500psi for 4 years and 1 year respectively. Consider the distribution of remaining lifetime of the surviving specimens, if these constant load levels were maintained indefinitely. These survivor functions are shown, for up to 100 more years, in Figure 3, with posterior uncertainty shown by the grey bands. These distributions have very long right tails, corresponding to the strongest members of the population which can carry these load levels almost indefinitely. As such, the mean residual lifetime is not very meaningful.
Instead, quantities such as the time until 50% of the survivors fail, namely the median of these distributions, may be of interest. Using the MCMC samples, we calculate the 95% posterior interval of the median to span roughly 22 to 333 years under the 3000psi load, with a considerably narrower interval under the 4500psi load. It can be seen that there is much higher uncertainty associated with these distributions at the lower load level.
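The residual-life survivor function and its median can be computed per posterior draw from Equation (4); a sketch with hypothetical parameter values (not the fitted posterior), with time measured in years under a constant load:

```python
from scipy.special import gammaincc
from scipy.optimize import brentq

def survival(t, xi, eta):
    """Equation (4): P[T > t] = 1 - Gamma(eta_t, 1/xi) / Gamma(eta_t)."""
    return 1.0 - gammaincc(eta(t), 1.0 / xi)

def residual_survival(t_r, t_prime, xi, eta):
    """P[T_r > t_r] = P[T > t' + t_r] / P[T > t'] for a piece surviving to t'."""
    return survival(t_prime + t_r, xi, eta) / survival(t_prime, xi, eta)

# Hypothetical shape function and scale (illustration only)
xi = 0.5
eta = lambda t: 0.5 * (t**0.3 + 0.2 * t**0.8)

t_prime = 4.0  # survived the 4-year constant-load test
# Median residual life: the t_r at which the residual survival drops to 0.5
median = brentq(lambda tr: residual_survival(tr, t_prime, xi, eta) - 0.5, 1e-6, 1e6)
print(median)
```

Repeating this root-finding for each MCMC draw yields the posterior distribution of the median residual lifetime, whose 95% intervals are reported above.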
In the analysis of the experimental data we found that the effect of degradation from a constant load over time, as modeled in the shape parameter, was not a power law t^a. This is evident from examining the plots in Figure 1. With a simple power law, the CDF would be approximately linear as a function of log-time during the constant load period. Instead, the empirical CDF increases quite nonlinearly with time on the log scale. This led us to posit adding a second power term to the model, yielding t^a + bt^c with a < c. This form provides a
good fit to the data, albeit with wide posterior intervals for the parameters b and c.

[Figure 3: Residual lifetime example. Survivor functions of the remaining lifetime (in years) for specimens under a continued 3000psi constant load after surviving the 4-year test period (left panel), and for specimens under a continued 4500psi constant load after surviving the 1-year test period (right panel). The black curve shown is the posterior mean, while the grey area represents the 95% posterior interval based on the MCMC samples.]

That in turn translates to the high uncertainty that we find associated with using tests of 1 and 4 year durations to predict reliability and residual lifetime over much longer periods, such as 50 years. Larger tests, or tests over longer periods, would be necessary to reduce this variability.

In the work by Foschi et al. (1989), a crucial parameter in the Canadian ADM used for reliability analysis is the 'stress threshold' σ₀. In that model it is hypothesized that an individual piece of lumber does not accumulate damage when the load is below σ₀τ_s, where τ_s is the strength of that piece as measured in a short-term ramp load test. That work reported an estimate of 0.533 for the population mean of σ₀. In a subsequent Bayesian re-analysis, however, σ₀ was found to be highly uncertain and a strong prior was needed to stabilize its estimate (Yang et al., 2017). Moreover, σ₀ as a piece-level parameter in the ADM does not have a direct relationship with our estimated damage threshold of 413psi for the population. In the ADM, the population mean of σ₀τ_s is the load below which the average piece in the population is undamaged; however, the realization of σ₀τ_s cannot be assessed for any individual piece since it is unobservable.
In contrast, the 413psi population threshold in our model represents the stress level below which all members of the population are undamaged. Nonetheless, as discussed above, both approaches show little evidence of a high damage threshold when the Hemlock data are analyzed alone and uncertainty is considered. Specialized proof-loading tests (e.g., Woeste et al., 2007) may instead be more useful if estimating the damage threshold is of primary interest.

Another point of comparison between the ADM and our proposed approach lies in the number of parameters to be estimated. Fitting the Canadian ADM in particular requires estimating 10 population parameters (the five log-normal means and variances from which the random effects in Equation (2) are drawn for specific pieces of lumber), some of which do not have a clear physical interpretation. As found in Yang et al. (2017), a number of different sets of these population parameters could lead to essentially the same likelihood, suggesting that while the Canadian ADM can fit the empirical data well, it may be over-parametrized, leading to worse prediction performance due to the inflated uncertainty about the individual parameters. Our model fits the empirical data well with four fewer parameters (six), and it is simpler to see that the resulting uncertainty in prediction stems primarily from the uncertainty in estimating the degradation rate over longer periods from accelerated testing data.

It can be said that the results of applying the accumulated damage modeling approach, along with its predecessor, the empirical model of Wood et al. (1960), have laid a foundation for incorporating long term stress effects into the calculation of design values that has stood the test of time. So why a critical review of these models at this time?
The answer lies in the need to apply the methods to a new generation of forest products, such as strand-based wood composites (Wang et al., 2012a,b), that are also susceptible to DOL effects. Given that the new applications do not automatically inherit the record of success of the ADM, prudence suggests a re-evaluation of the approach given its limitations as described in the Introduction, one that takes full advantage of the new computational and statistical methods now available. Since engineered wood composites have much lower short-term strength variability than lumber, the size of the DOL effect (and its estimation) for these materials will play a more significant role in determining appropriate safety factors.

The above considerations led the authors to explore the alternative to the ADM presented in this paper, which was found to overcome many of the difficulties described above with the ADM approach. The model based on the gamma process is simpler to interpret with fewer parameters, separates external (population) and internal (individual piece) sources of variability, and lends itself well to standard statistical assessments of uncertainty. The degradation approach also led to a number of new findings, as previously summarized in the Introduction. In particular, a key finding from our analysis is that accelerated testing data yield poor predictors of the long-term future of a piece of lumber in service. Our analysis shows very wide credibility bands for the median time to failure, particularly when the sustained load level for the test is low. This finding suggests that much larger accelerated tests are needed to ensure the reliability of predictions.

Acknowledgements
The work reported in this manuscript was partially supported by FPInnovations and a CRD grant from the Natural Sciences and Engineering Research Council of Canada. The data analysed in this paper were provided by FPInnovations. We are greatly indebted to Conroy Lum and Erol Karacabeyli of FPInnovations for their extensive advice during the conduct of the research reported herein.
References
Barrett, J. and Foschi, R. (1978a). Duration of load and probability of failure in wood. Part I: Modelling creep rupture. Canadian Journal of Civil Engineering, 5(4):505–514.

Barrett, J. and Foschi, R. (1978b). Duration of load and probability of failure in wood. Part II: Constant, ramp, and cyclic loadings. Canadian Journal of Civil Engineering, 5(4):515–532.

Ellingwood, B. and Rosowsky, D. (1991). Duration of load effects in LRFD for wood construction. Journal of Structural Engineering, 117(2):584–599.

Foschi, R. O. (1984). Reliability of wood structural systems. Journal of Structural Engineering, 110(12):2995–3013.

Foschi, R. O. and Barrett, J. D. (1982). Load-duration effects in western hemlock lumber. Journal of the Structural Division, 108:1494–1510.

Foschi, R. O., Folz, B., and Yao, F. (1989). Reliability-Based Design of Wood Structures. Number 34. Dept. of Civil Engineering, University of British Columbia.

Gerhards, C. C. (1979). Time-related effects on wood strength: A linear cumulative damage theory. Wood Science, 11:139–144.

Haupt, H. (1867). General Theory of Bridge Construction. Appleton, New York.

Hoffmeyer, P. and Sørensen, J. D. (2007). Duration of load revisited. Wood Science and Technology, 41(8):687–711.

Karacabeyli, E. and Soltis, L. A. (1991). State of the art report on duration of load research for lumber in North America. In Proceedings of the 1991 International Timber Engineering Conference, London, United Kingdom.

Lawless, J. and Crowder, M. (2004). Covariates and random effects in a gamma process model with application to degradation and failure. Lifetime Data Analysis, 10(3):213–227.

Paroissin, C. and Salami, A. (2014). Failure time of non homogeneous gamma process. Communications in Statistics - Theory and Methods, 43(15):3148–3161.

Rosowsky, D. V. and Bulleit, W. M. (2002). Another look at load duration effects in wood. Journal of Structural Engineering, 128(6):824–828.

Swendsen, R. H. and Wang, J.-S. (1986). Replica Monte Carlo simulation of spin-glasses. Physical Review Letters, 57(21):2607.

Wang, J. B., Foschi, R. O., and Lam, F. (2012a). Duration-of-load and creep effects in strand-based wood composite: a creep-rupture model. Wood Science and Technology, 46(1-3):375–391.

Wang, J. B., Lam, F., and Foschi, R. O. (2012b). Duration-of-load and creep effects in strand-based wood composite: experimental research. Wood Science and Technology, 46(1-3):361–373.

Woeste, F., Green, D., Tarbell, K., and Marin, L. (2007). Proof loading to assure lumber strength. Wood and Fiber Science, 19(3):283–297.

Wong, S. W. and Zidek, J. V. (2016). Dimensional and statistical foundations for accumulated damage models. arXiv preprint arXiv:1708.03018.

Wood, L. W. et al. (1960). Relation of Strength of Wood to Duration of Load. Madison, Wis.: US Dept. of Agriculture, Forest Service, Forest Products Laboratory.

Yang, C.-H., Zidek, J. V., and Wong, S. W. (2017). Bayesian analysis of accumulated damage models in lumber reliability. arXiv preprint arXiv:1706.04643.