Role of an opportunistic pathogen in the decline of stressed oak trees
Summary
1. The importance of opportunistic pathogens, in particular Armillaria species, in forest decline has often been open to debate.
2. In order to assess the role of Armillaria gallica in the decline of oak trees, 60 Quercus robur trees with high (HIP trees) or low (LIP trees) levels of A. gallica inoculum, as measured by the density of epiphytic rhizomorphs on the root collar, were artificially defoliated for 2 years. Half of the HIP trees were treated when first defoliated with boric acid to reduce the A. gallica inoculum potential (BHIP trees). The ability of in situ rhizomorphs to colonize plant material was similar for LIP and BHIP trees, but lower than for HIP trees, indicating that the boric acid treatment reduced the level of A. gallica inoculum.
3. Tree growth was similar between treatments, as determined by dendrochronological comparisons. Although defoliation greatly reduced both tree growth and sapwood starch reserves at the beginning of autumn, the growth response to defoliation and the sapwood starch concentration at the beginning of autumn were similar for LIP, BHIP and HIP trees.
4. HIP trees suffered considerably greater crown deterioration and mortality following defoliation than either BHIP or LIP trees (62%, 32% and 5% mortality rates, respectively). The trees that died had very low sapwood starch concentrations. In addition, at similar levels of sapwood starch, HIP trees were much more likely to die than LIP or BHIP trees.
5. Two other factors influenced tree mortality. Past stress that reduced tree growth a few years before the start of the experiment was shown to impair the trees' ability to cope with defoliation. Oak mildew selectively infected the defoliated trees and increased the severity of the defoliation stress.
6. Thus, trees subjected to high levels of A. gallica inoculum had a lower ability to overcome defoliation stress. These findings support the forest decline model developed by Manion (1991) and show that it is important to take into account the role of opportunistic pathogens in tree mortality processes.
Introduction
Tree mortality remains a poorly understood process that is often difficult to predict (Franklin et al. 1987; Pedersen 1998). Pathogens are important agents of tree mortality that may regulate host demography and strongly alter the structure of plant communities (Hansen 1999; Gilbert 2002; Møller 2005). Several examples of forest declines linked to severe epidemics have been described (Anagnostakis 1987; Weste & Marks 1987; Gibbs et al. 1999). In many other cases of forest decline, however, the role of pathogens is less clear-cut because the process appears to be multifactorial, involving only opportunistic parasites, i.e. organisms unable to colonize a host unless it has first been weakened by another stress. Manion (1991) developed a conceptual model of forest decline that postulates that three different types of factors must act in conjunction for a decline to begin: predisposing factors act over the long term to weaken the trees; inciting factors are short-term stresses that trigger the decline; and contributing factors, mostly opportunistic organisms, act on the weakened trees to increase or accelerate decline and mortality. However, the importance of opportunistic organisms in forest decline has been controversial, in part because they tend to be late invaders of trees that are about to die. While they have sometimes been considered to play an important part in decline, greatly aggravating the problem (Guillaumin et al. 1985; Wargo & Harrington 1991; Houston 1992), they have also been regarded as minor components of declines attributed mainly to tree ageing, location on inadequate sites, or pollution (Becker & Lévy 1982; Mueller-Dombois 1992; Landmann et al. 1993; Thomas et al. 2002; Frey et al. 2004). Nevertheless, forest dieback results from ecosystem processes that both result from and induce community imbalances.
We investigated the decline of oak trees to assess the importance of opportunistic pathogens in forest decline. Oak decline has been an episodic problem in Europe during the 20th century and typically involves the action of several biotic and abiotic factors operating in sequence (Thomas et al. 2002). Such declines have been a growing concern, in particular because of the possible relationship with climatic change. In oak decline, fungal pathogens colonizing either the root system (Armillaria sp.) or the bole bark (Biscogniauxia mediterranea) and bark insects such as Agrilus species have been reported as important opportunistic parasites, contributing to tree mortality. As in most cases of forest decline, the importance of these opportunistic pathogens is controversial (Thomas et al. 2002).
A. gallica, one of the Armillaria species most frequently found on declining oaks, is a wood-decaying, root-rotting fungus and an aggressive colonizer of oak stumps. From colonized wood, it forms a network of rhizomorphs in the forest soil; these perennial, cord-like organs are capable of colonizing the bases of most oak trees in a forest, where they develop epiphytically on the root collar (Redfern & Filip 1991; Marçais & Caël 2006). A. gallica is an opportunistic pathogen with a low level of aggressiveness and is unable to colonize vigorously growing hosts (Wargo & Harrington 1991). However, the fungus often invades trees weakened by insect defoliation or drought. Inoculum potential, defined by Garrett (1956) as a combination of rhizomorph abundance at the host surface and the vitality of those propagules, has been considered an important feature determining the ability of Armillaria to invade hosts. The importance of inoculum potential has mainly been documented for aggressive Armillaria species. In forests of north-western America, A. ostoyae was shown to rapidly invade, from quiescent lesions, tree stumps created by selective logging and subsequently to develop an increased inoculum potential in the vicinity, resulting in increased infection of surrounding trees (Cruickshank et al. 1997; Morrison et al. 2001). Comparatively little information exists on the importance of inoculum potential for opportunistic Armillaria species such as A. gallica.
In healthy mature oaks, the highest concentration of total non-structural carbohydrates is found in the stem in October, just prior to leaf fall (Barbaroux & Bréda 2002). Artificial defoliation has been found to reduce sugars and amino acids in the roots of oak seedlings (Parker & Patton 1975), while starch levels are affected only when trees are defoliated severely enough to cause refoliation within the same season (Wargo et al. 1972). Several studies have shown that physiological imbalances between starch and sugars in declining trees may decrease their resistance to insects (Dunn et al. 1990) and fungi (Wargo 1981). Renaud & Mauffette (1991) reported that crown dieback in Sugar Maple was associated with reduced carbohydrate concentrations, and suggested that such a sugar/starch imbalance may decrease the resistance of trees to biotic and abiotic stresses. Tree carbohydrate reserves may therefore be used as a physiological marker of a tree's ability to overcome stress. Nevertheless, no clear threshold of total non-structural carbohydrates below which tree mortality risk increases has been established.
Our aim was to determine the effect of A. gallica and leaf loss on the survival of young pedunculate oak trees. This soil-borne pathogen is so common that it is difficult to find comparable trees with and without epiphytic rhizomorph colonization. However, A. gallica inoculum potential can be highly heterogeneous within a stand (Marçais & Caël 2006), making it possible to compare the response to defoliation of trees subjected to high or low inoculum potential. Our hypothesis was that trees with dense root collar colonization by epiphytic A. gallica rhizomorphs would exhibit greater decline or mortality following stress (i.e. leaf loss) than trees with sparser root collar colonization. Stress was induced by defoliation and its physiological consequences were quantified by determining end-of-season carbohydrate reserves.
Materials and methods
Study plot and experimental design
The study plot was set up in a 20-year-old stand of naturally regenerated Quercus robur, 8–12 cm in diameter at 1.3 m above soil level, in the Champenoux communal forest in NE France. Oak is the dominant tree species, with an understory of Carpinus betulus. The soil was homogeneous across the stand, consisting of a hydromorphic clay loam with a calcareous clay layer 30–45 cm below the soil surface. The humus layer was a eutrophic mull (pH 4.7). A previous study (Marçais & Caël 2006) showed that the Armillaria gallica inoculum potential was highly heterogeneous within this stand, presenting an aggregated pattern with a range of 10 m that could be related to the colonization pattern of stumps of the previous stand by Armillaria.
Fifteen blocks of five trees, with approximately 8–12 m between the furthest trees within a block, were selected in the spring of 2000 (Fig. 1). Five blocks consisted of trees selected to have less than 1.2 mg cm−2 of epiphytic Armillaria rhizomorphs on the bark of the collar area (referred to as ‘low inoculum potential’ trees, LIP), while 10 blocks consisted of trees selected with a collar rhizomorph density between 5.5 and 9 mg cm−2 (referred to as ‘high inoculum potential’ trees, HIP). Rhizomorphs on tree collars were measured using the method of Marçais & Caël (2006): briefly, a small section of the collar was exposed and the density of epiphytic rhizomorphs on the tree collar was estimated by rhizomorph counts on a grid. Five randomly selected blocks containing HIP trees were treated with boric acid at the beginning of July 2000 in order to reduce the Armillaria inoculum potential (referred to as the BHIP treatment). The humus layer was brushed away up to 1 m from the tree base and a litre of 3% boric acid was sprayed on to the soil (Bauce & Allen 1992).
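For illustration, the selection rule can be written out explicitly. The sketch below is ours, not part of the original protocol; only the density thresholds come from the text, and the function name and example values are hypothetical.

```python
# Minimal sketch of the tree-selection rule described above. Thresholds are
# taken from the text; everything else is illustrative.

def classify_tree(rhizomorph_density: float) -> str:
    """Assign a tree to an inoculum potential group from the dry weight of
    epiphytic Armillaria rhizomorphs on the root collar (mg cm-2)."""
    if rhizomorph_density < 1.2:
        return "LIP"  # low inoculum potential
    if 5.5 <= rhizomorph_density <= 9.0:
        return "HIP"  # high inoculum potential (half of these blocks -> BHIP)
    return "excluded"  # intermediate densities were not retained in the design

# Trees with intermediate collar colonization fall outside both groups.
for density in (0.8, 3.0, 6.2):
    print(f"{density} mg cm-2 -> {classify_tree(density)}")
```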
Four trees per block were artificially defoliated at the end of June 2000 and again at the beginning of July 2001 to mimic the action of late insect defoliators such as Lymantria dispar or Thaumetopoea processionea. The tree crowns were bent to the soil by pulling the upper trunk with a rope and all the leaves were cut at the petiole with scissors. The trees were then released to recover their former position. A control tree in each block was bent but not defoliated.
Measuring tree growth
To determine radial growth, each tree was cored to the pith at 1.3 m above soil level (one core per tree) during the winter of 2001–02. Tree radial growth was not measured after 2001 because severe mortality resulting from the treatments occurred in the autumn of 2001 and would have strongly biased the growth results. Early wood, late wood and total ring widths were measured under a microscope to the nearest 0.01 mm. Following the measurements, individual ring-width series were cross-dated to ensure dating accuracy (Becker 1989). The effects of time, defoliation and Armillaria inoculum potential treatment on past tree growth were analysed with a mixed model using SAS (SAS/STAT 8.1, SAS Institute Inc., Cary, NC). Blocks were treated as random variables and a first-order autoregressive covariance structure was assumed for the ring widths of successive years.
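As a rough illustration of this analysis outside SAS, the sketch below fits a comparable mixed model in Python with statsmodels. The file and column names are hypothetical, and statsmodels' MixedLM offers no direct equivalent of the AR(1) residual structure used in PROC MIXED, so this is an approximation of the model described above rather than a re-implementation.

```python
# Approximate statsmodels analogue of the SAS mixed-model growth analysis
# (hypothetical data layout: one row per tree x year).
import pandas as pd
import statsmodels.formula.api as smf

rings = pd.read_csv("ring_widths.csv")  # columns assumed: ring_width, year,
                                        # inoculum, defoliated, block

# Fixed effects: year, inoculum potential treatment, defoliation and all
# their interactions; random intercept for block. Note that MixedLM does
# not model an AR(1) correlation among successive rings of the same tree.
model = smf.mixedlm(
    "ring_width ~ C(year) * C(inoculum) * C(defoliated)",
    data=rings,
    groups="block",
)
print(model.fit().summary())
```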
Determination of Armillaria inoculum potential
In order to monitor the impact of the boron treatment, the colonization rate of wood segments by soil rhizomorphs was determined. Fifteen days after the boron treatment, soil at the collar of each of the 75 trees was removed to expose an Armillaria rhizomorph, taking care not to disturb the rhizomorph network. A freshly cut pedunculate oak branch, 3–4 cm in diameter and 15 cm long, was attached to the rhizomorph within 10–20 cm of the collar with a rubber band, and the soil was replaced. After 1 year, the wood segments were retrieved and checked for the presence of Armillaria mycelial fans beneath the bark. Whenever Armillaria fans were present, a sample was taken to determine the species, using AluI-digested rDNA intergenic spacer profiles (Harrington & Wingfield 1995). The difference in frequency of branch colonization by A. gallica between the three treatments was analysed by logistic regression, including a block effect. The density of epiphytic rhizomorphs on the tree collars was also checked at the end of the experimental period, in the summer of 2003 (Marçais & Caël 2006).
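A comparable logistic regression can be sketched as follows (in Python/statsmodels rather than SAS; the data layout is hypothetical). The likelihood-ratio test at the end plays the same role as the treatment chi-square reported in the Results, with 2 d.f. for the three treatments.

```python
# Sketch of the logistic regression on branch-segment colonization, with
# block entered as a categorical term (hypothetical columns: colonized 0/1,
# treatment, block).
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

branches = pd.read_csv("branch_colonization.csv")

full = smf.logit("colonized ~ C(treatment) + C(block)", data=branches).fit()
reduced = smf.logit("colonized ~ C(block)", data=branches).fit()

# Likelihood-ratio test of the treatment effect (2 d.f. for 3 treatments).
lr = 2 * (full.llf - reduced.llf)
print(f"LR chi-square = {lr:.2f}, P = {chi2.sf(lr, 2):.3f}")
```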
Measuring crown status and carbohydrate reserves
Twelve defoliated trees (eight LIP + four HIP) and six control trees (four LIP + two HIP) were monitored in greater detail. The leaves of the 12 defoliated trees were collected during the artificial defoliations of June 2000 and July 2001 in order to determine each tree's total leaf area. Individual leaf area was measured on subsamples from both upper and lower crown positions (20 sun-exposed and 20 shade leaves per defoliated tree) using a portable area meter coupled to a transparent belt conveyer (LI-3000A and LI-3050A, LI-Cor, Lincoln, Nebraska, USA). Samples were oven-dried for 24 hours at 60 °C and the specific leaf area was determined as the ratio of leaf area to dry weight (cm2 g−1). This ratio was used to compute individual tree leaf areas from the total leaf dry weight.
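The scaling from subsample to whole tree is simple arithmetic. The numbers in the sketch below are invented, chosen only so that the result falls within the 9–20 m2 leaf-area range reported later.

```python
# Worked example of the leaf-area computation: specific leaf area (SLA)
# from a subsample, then scaled by the tree's total leaf dry weight.
# All values are hypothetical.

subsample_area_cm2 = 950.0   # subsample area from the LI-3000A area meter
subsample_dry_wt_g = 10.2    # subsample dry weight after 24 h at 60 degrees C
sla = subsample_area_cm2 / subsample_dry_wt_g   # SLA, cm2 g-1

tree_leaf_dry_wt_g = 1600.0                      # whole-tree leaf dry weight
tree_leaf_area_m2 = sla * tree_leaf_dry_wt_g / 1e4   # convert cm2 to m2

print(f"SLA = {sla:.1f} cm2 g-1; tree leaf area = {tree_leaf_area_m2:.1f} m2")
```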
The 18 trees (defoliated + controls) were sampled to determine the concentration of total non-structural carbohydrates (TNC, including starch, glucose, fructose and sucrose) in the sapwood of tree trunks in June 2000 and 2001, prior to defoliation, and in October 2000 and 2001, after leaf abscission but prior to the first frost. These June and October dates correspond to the times of minimum and maximum TNC concentrations, respectively, as determined by seasonal analysis of TNC in oak stems (Barbaroux & Bréda 2002). Two short cores including the whole sapwood were extracted, one from the base of the stem (at a height of 0–1.3 m) and the other from a major root, then frozen and stored at −20 °C until freeze-dried. Heartwood was removed from the cores and the sapwood and bark were analysed together. TNC concentration, i.e. starch and soluble sugars, was determined enzymatically according to Barbaroux & Bréda (2002) and Barbaroux et al. (2003). The effects of time and defoliation on TNC concentration were analysed with a mixed model using the ‘mixed’ procedure of SAS. The significance of TNC changes between June and October of 2000 and 2001 for both defoliated and control trees was tested with contrasts.
In October 2001, the defoliated trees (n = 57) were sampled in the bole and analysed for TNC concentration as previously described. However, no control trees were sampled at this time. The relationship between the starch concentration in the sapwood in autumn 2001 and the Armillaria inoculum potential was analysed by analysis of variance using the SAS ‘mixed’ procedure, with block introduced as a random variable. To control for variation caused by oak mildew attack and past stress events, we introduced mildew severity in 2000 and relative growth reduction in 1995 (RGR95) as covariates. Indeed, a severe oak mildew infection developed on the studied trees in the summer of 2000: following defoliation, newly emerged leaves were infected by oak mildew, Erysiphe alphitoides, while the spring-formed leaves of control trees were not. In addition, a past stress event that had an impact on tree growth in 1995 was detected.
The crown status was monitored several times per year over a period of 4 years. The level of oak mildew infection in the summer of 2000 was rated as: 1 (no visible leaf necrosis, but presence of infection); 2 (5–33% of leaves with necrosis); 3 (more than 33% of leaves with necrosis of the leaf margin, but normal leaf size); and 4 (very severe necrosis on the majority of leaves, with greatly reduced leaf size). Ten leaves from the new flush of foliage were collected in September 2000 to quantify total chlorophyll content using a SPAD-502 chlorophyll meter (Minolta, Osaka, Japan) and to measure leaf area. The calibration used between transmittance T and chlorophyll content was:
[chlorophyll] (µmol m−2) = 0.08 × T2 + 11.597 × T − 98.548
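The quadratic above is reconstructed from a garbled rendering, on the assumption of a standard second-order calibration in T. As a sanity check, the sketch below evaluates it at readings (themselves inferred, not from the source) that reproduce the chlorophyll contents reported in the Results.

```python
# Evaluation of the reconstructed SPAD-502 calibration. Readings of roughly
# 24.5 and 37.5 recover the chlorophyll contents reported in the Results
# (about 234 and 448 umol m-2); the readings are our inference.

def chlorophyll_umol_m2(t: float) -> float:
    """Total chlorophyll content (umol m-2) from SPAD-502 transmittance T."""
    return 0.08 * t**2 + 11.597 * t - 98.548

for t in (24.5, 37.5):
    print(f"T = {t} -> {chlorophyll_umol_m2(t):.0f} umol m-2")
```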
In the spring of each year, the crown was rated as: 0 (healthy); 1 (moderately declining, with sparse foliage over the entire crown but no major dead limbs); 2 (severely declining, with sparse foliage and the death of major limbs); or 3 (dead trunk and collar). The difference in crown status between the three Armillaria inoculum potential treatments (LIP, HIP, BHIP) was analysed with the ‘genmod’ procedure of SAS, using a multinomial distribution with the cumulative logit link. Two trees (one HIP and one BHIP) were damaged as a result of the trunk bending during artificial defoliation in 2001 and were discarded from further analysis. The trunk sapwood starch concentration, as well as the interaction between trunk sapwood starch concentration and treatment, was introduced as a covariate. Trunk sapwood starch concentration was used as a surrogate for the level of stress the trees had sustained. The block factor, nested within treatment, was introduced as a fixed effect. Differences between the three inoculum potential treatments were tested using contrasts. To determine possible colonization by Armillaria, the collar area and major roots of trees that died were checked for mycelial fans beneath the bark in the cambium area. Whenever suspected Armillaria mycelial fans were detected, a sample was taken to determine the Armillaria species.
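An ordinal regression of this kind can be sketched in Python with statsmodels' OrderedModel, a proportional-odds (cumulative logit) model comparable in spirit to the SAS ‘genmod’ specification. The data layout is hypothetical, block is dummy-coded rather than formally nested, and the starch × treatment interaction is omitted for brevity.

```python
# Sketch of a cumulative-logit model for the 0-3 crown status score
# (hypothetical columns: status, treatment, block, starch).
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

crowns = pd.read_csv("crown_status.csv")

# Dummy-code treatment and block as fixed effects; keep starch continuous.
exog = pd.get_dummies(crowns[["treatment", "block"]].astype(str),
                      drop_first=True).astype(float)
exog["starch"] = crowns["starch"]

fit = OrderedModel(crowns["status"].astype(int), exog,
                   distr="logit").fit(method="bfgs")
print(fit.summary())
```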
Results
Past tree radial growth
There was no difference in past growth, or in growth reduction following artificial defoliation, between trees exposed to low or high A. gallica inoculum potential, or between boron-treated and untreated trees (Fig. 2, Table 1). The growth of defoliated and undefoliated trees did not differ significantly prior to the defoliation in 2000. Artificial defoliation induced a growth reduction of 38% in 2000 and 80% in 2001. In 2001, defoliated trees produced only early wood (i.e. a single layer of large vessels).
Effect | Numerator d.f. | Denominator d.f. | F-value | Pr > F |
---|---|---|---|---|
Year | 16 | 171 | 50.7 | < 0.001 |
Inoculum potential | 2 | 14.2 | 0.1 | 0.908 |
Year × Inoculum potential | 32 | 172 | 0.8 | 0.827 |
Defoliation | 1 | 45.4 | 6.3 | 0.016 |
Year × Defoliation | 16 | 171 | 3.8 | < 0.001 |
Inoculum potential × Defoliation | 2 | 45.3 | 0.2 | 0.798 |
Year × Inoculum potential × Defoliation | 32 | 172 | 0.9 | 0.663 |
- * The three Armillaria inoculum potential treatments are: low A. gallica IP; high A. gallica IP treated with boron; and high A. gallica IP not treated with boron.
- The block effect is specified as a random effect and is thus not included in this table of fixed effects. The between-block variance is 0.023 and is not significant (z = 1.15, P = 0.125).
In the recent past, the trees had experienced a severe reduction in radial growth in 1995–96, which coincided with an especially rainy spring. Using a soil water balance model (Granier et al. 1999) and climatic data from a nearby weather station, we calculated a water excess of 492 mm in 1995, compared with an average of 340 mm year−1 for that stand. This could have induced severe waterlogging above the clay layer. The severity of this growth reduction for individual trees was measured by the relative growth reduction in 1995 (RGR95 = [mean radial growth 1990–94 − radial growth 1995]/mean radial growth 1990–94). RGR95 was not significantly correlated with epiphytic rhizomorph density at the tree collar (r = 0.052, P = 0.69). It was slightly higher for BHIP trees (0.64 ± 0.07) than for LIP and HIP trees (0.56 ± 0.04 and 0.58 ± 0.07, respectively).
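As a worked example of this index (with invented ring widths, chosen so the result lands near the stand averages of 0.56–0.64 reported above):

```python
# Worked example of RGR95 = (mean ring width 1990-94 - ring width 1995)
#                           / mean ring width 1990-94. Values are made up.

growth_1990_94 = [2.1, 1.9, 2.3, 2.0, 2.2]  # ring widths 1990-94, mm
growth_1995 = 0.9                           # ring width 1995, mm

mean_ref = sum(growth_1990_94) / len(growth_1990_94)  # 2.1 mm
rgr95 = (mean_ref - growth_1995) / mean_ref

print(f"RGR95 = {rgr95:.2f}")  # 0.57: a severe relative growth reduction
```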
Impact of boric acid treatment
The boric acid treatment had a significant impact on epiphytic rhizomorph density. At the start of the experiment, the epiphytic rhizomorph dry weight on tree collars was 1.0 ± 0.2, 5.4 ± 0.2 and 5.6 ± 0.3 mg cm−2 for the low inoculum potential (LIP), boron-treated high inoculum potential (BHIP) and untreated high inoculum potential (HIP) groups, respectively; by 2003 it was 1.5 ± 0.3, 1.0 ± 0.2 and 4.3 ± 0.4 mg cm−2. In addition, the frequency of branch segment colonization by Armillaria was significantly reduced by the boron treatment, with colonization rates of 62.5%, 52.2% and 87% for the LIP, BHIP and HIP groups, respectively (χ2 = 6.66, P = 0.036). The Armillaria species that colonized the segment was determined in 40 cases and only A. gallica was detected.
Refoliation of defoliated trees
The trees’ leaf area in June 2000 ranged from 9 to 20 m2. After the first defoliation, new leaves developed in approximately 2 weeks. As a consequence of the severe oak mildew infection in 2000, the newly formed leaves had a reduced leaf area (18.6 ± 1.5 cm2 for defoliated trees vs. 48.9 ± 8.6 cm2 for controls, t = 5.9, P < 0.001) and chlorophyll content (234 ± 12 µmol m−2 for defoliated trees vs. 448 ± 19 for controls, t = 19.9, P = 0.003) relative to leaves from trees that were not defoliated. Infection was extremely severe on 42% of the trees, almost amounting to a second defoliation caused by the oak mildew. There was no significant difference in mildew infection between LIP, BHIP and HIP trees (χ2 = 0.68, P = 0.713). In July 2001, just prior to the second artificial defoliation, the average total tree leaf area was 5.8 m2, compared with 13.5 m2 in June 2000, prior to the first defoliation (a reduction of 56%, paired t-test = 9.5, P < 0.001). This reduction was mainly in individual leaf area (paired t-test = 2.4, P = 0.035), while the number of leaves per tree decreased slightly, but not significantly. No E. alphitoides infection occurred on the newly formed leaves following the 2001 defoliation.
Carbohydrate reserves and mortality
Very similar results were obtained for the analyses of trunk and root sapwood starch concentrations; only the results for trunk sapwood starch concentrations are presented. The analysis of variance showed that defoliation, time and the defoliation–time interaction all significantly influenced trunk sapwood starch concentrations (results not shown; Fig. 3). While the starch concentration increased from late June 2000 to October 2000 in the control trees (F = 19.36, P ≤ 0.001), it decreased in the defoliated trees (F = 178.30, P ≤ 0.001). Similarly, in 2001, the sapwood starch concentration increased between June and October in the control trees (F = 27.58, P ≤ 0.001) while it remained constant in the defoliated trees (F = 0.70, P = 0.408). In October 2001, although the defoliated trees had significantly lower starch concentrations in the lower bole sapwood than the non-defoliated controls (Fig. 3), they had similar concentrations of soluble sugars.
The starch concentrations measured in the trunk sapwood in June 2000 did not differ between LIP and HIP trees (4.1 ± 1.1 and 3.5 ± 1.6, respectively; t = 0.71, P = 0.489). In addition, in October 2001, there was no significant difference in trunk sapwood starch concentration between the three treatments LIP, HIP and BHIP (Table 2), although LIP trees tended to have higher starch reserves than HIP trees (1.6 ± 0.4, 1.0 ± 0.4 and 0.8 ± 0.4, respectively, for LIP, BHIP and HIP trees). Trees that experienced the highest relative growth reduction in 1995 (RGR95), or that were more severely infected by oak mildew following the 2000 defoliation, had significantly lower starch reserves in the trunk sapwood in October 2001 (Table 2). In June 2000, RGR95 and the starch concentration of the lower bole sapwood were not significantly correlated (r = −0.18, P = 0.464).
Sources | Numerator d.f. | Denominator d.f. | F-value | P-value |
---|---|---|---|---|
RGR95* | 1 | 39 | 6.28 | 0.014 |
Mildew severity in 2000 | 1 | 39 | 5.05 | 0.039 |
Inoculum potential† | 2 | 39 | 2.36 | 0.119 |
- * Relative growth reduction in 1995 (see section ‘Past tree growth’ in the Results).
- † The three inoculum potential treatments are: low A. gallica IP; high A. gallica IP treated with boron; and high A. gallica IP not treated with boron.
- The block effect is specified as a random effect and is thus not included in this table of fixed effects. The between-block variance is 0.03 and is not significant (z = 0.24, P = 0.406).
Only one tree died in the winter of 2000–01, in the HIP treatment. Mortality was greater in 2001. Most of it occurred between October and December (15 trees out of 19), while some trees died in the spring of 2002, with some additional mortality in 2004. Dieback always started in the upper crown and progressed downwards. No colonization by Agrilus bilineatus, an opportunistic invader of the bark of stressed oak trees, was detected in the upper crown. The last step of the process was the invasion of the collar and lower bole by Armillaria. Seventeen of the 19 dead trees were invaded by Armillaria, and all tested isolates were A. gallica. Three trees were uprooted for a more detailed investigation of symptoms; their root systems were completely invaded by Armillaria. Isolates from all lesions that could have resulted from independent infections were checked for species identity, and only A. gallica was detected (n = 8).
Mortality differed markedly between the three treatments, with 62% mortality for HIP trees, 32% for BHIP trees and 5% for LIP trees. The decline status of the crown also differed strongly between the treatments (Table 3, Fig. 4): contrast analysis showed the crowns of HIP trees to be in a significantly worse state than those of both LIP and BHIP trees (χ2 = 6.14, P = 0.013 and χ2 = 5.26, P = 0.022, respectively), while LIP and BHIP trees did not differ from each other (χ2 = 0.04, P = 0.850). Trees with low starch reserves in the trunk sapwood in the autumn of 2001 experienced greater crown deterioration and mortality than trees with high sapwood starch concentrations (Table 3, Fig. 5). Most of the trees that died had very low sapwood starch concentrations. In addition, trees that had experienced a severe growth reduction in 1995, i.e. a high RGR95 (χ2 = 4.33, P = 0.037), and trees that were more severely infected by oak mildew following the 2000 defoliation (χ2 = 4.89, P = 0.027), showed greater deterioration of crown status in 2004 and higher mortality. However, when these two effects were introduced into a model already containing the trees’ trunk sapwood starch levels, they did not add significant information (results not shown).
Sources | d.f. | Likelihood χ2 | P-value |
---|---|---|---|
Starch sapwood concentration (SSC) | 1 | 7.44 | 0.006 |
Inoculum potential* | 2 | 7.83 | 0.020 |
SSC × Inoculum potential | 2 | 0.57 | 0.751 |
Block (Inoculum potential) | 12 | 15.50 | 0.215 |
- * The three inoculum potential treatments are: low A. gallica IP; high A. gallica IP treated with boron; and high A. gallica IP not treated with boron.
Discussion
In the forest decline model developed by Manion (1991), three different types of factors (predisposing, inciting and contributing) must act in conjunction for a decline to begin. Indeed, in our experiment, decline was more severe in trees predisposed by the past stress period of 1995, and the conjunction of defoliation, severe oak mildew infection and high A. gallica inoculum potential was necessary for decline and mortality to occur. These findings thus bring experimental support to a model that until now has been supported mainly by observations of forest decline (Manion 1991; Pedersen 1997; Cherubini et al. 2002; Suarez et al. 2004).
Figure 6 summarizes our hypothesis regarding the chain of events that led to tree mortality during this study. Manual defoliation had a severe impact on oak physiology, inducing a carbon assimilation deficit: both tree growth and storage were severely affected. Pedersen (1997) developed a model of tree mortality following acute stress. In this model, the stress-induced reduction in photosynthate availability initiates a feedback loop in which reduced allocation to fine roots and foliage decreases fine root and foliage biomass and, as a consequence, further reduces photosynthate availability. Our data partly support such a mechanism, as we found that leaf area in the year following the first defoliation was reduced by approximately 50%, probably as a result of limited starch availability for maintenance functions and spring reactivation. Mortality was also correlated with insufficient photosynthate availability. The level of tree carbohydrate reserves in the autumn of 2001 was a key factor. Most of the trees died between October and December 2001, and the first step was a rapid death of the crown and upper bole; a possible explanation is poor tissue hardening, due to the limited starch availability in defoliated trees, followed by tree death at the first frosts. Indeed, poor tree hardening following defoliation, and the importance of sufficient carbohydrate reserves for adequate hardening, have been well documented (Gregory et al. 1986; Ameglio et al. 2001; Thomas et al. 2004). The level of carbohydrate reserves in the autumn of 2001 was influenced by a past stress that impacted tree growth in 1995 (possibly spring waterlogging), by defoliation, and by the oak mildew infection, but not by the A. gallica inoculum potential to which the trees were exposed.
However, one major difference from Pedersen's (1997) model is that the predisposing and inciting stresses were not sufficient to induce significant mortality under our conditions. The results indicate that Armillaria gallica, although unable to attack vigorous trees and despite intervening very late in the decline process, was actively involved in the trees’ decline following defoliation. Trees subjected to a high A. gallica inoculum potential, with both a high rhizomorph density on the root collar and rhizomorphs with a high colonizing capacity, experienced a 10-fold increase in post-defoliation mortality compared with trees with a low A. gallica inoculum potential. As the experiment was conducted in an uncontrolled environment, we cannot rule out interference from undetermined factors. Nevertheless, treating trees exposed to high A. gallica inoculum potential with boric acid (BHIP trees) reduced both the inoculum potential and the mortality level, suggesting that the lower mortality rate was indeed caused by the decreased presence of A. gallica.
The crown status of BHIP trees was intermediate between that of HIP and LIP trees; this was linked to a more severe 1995 stress (a higher RGR95) in these trees and to lower starch reserves in the autumn of 2001. When this was accounted for, the difference in crown status between LIP and BHIP trees was not significant. Possibly, trees with low starch reserves that were exposed to high A. gallica inoculum potential were unable both to defend themselves against the pathogen and to mobilize sufficient soluble carbohydrates to harden adequately against frost. In agreement with this hypothesis, it has been demonstrated that trees with very low starch reserves (under 5 mg g−1 dry weight) are prone to attack by opportunistic organisms (Dunn et al. 1987; Wargo & Harrington 1991). According to the opportunistic pathogen concept, trees subjected to the same level of stress should have a higher likelihood of decline or mortality when faced with a high level of A. gallica inoculum (Gregory et al. 1991). Although we applied a uniform defoliation stress, the trees differed as a consequence of past events (RGR95), and an unexpected pathogen, oak mildew, infected the trees with varying severity during the course of the experiment. An integrated way to assess the level of stress experienced by the trees is to compare trees with similar carbohydrate reserves at the end of the 2001 growing season. Our results show that, for similar levels of carbohydrate reserve, trees supporting high levels of A. gallica inoculum indeed experienced greater decline and mortality.
Infection of new leaves by oak mildew following defoliation appears to have amplified the negative effects of the defoliation stress and to have been a significant factor in the decline of these trees. Usually, oak mildew attacks are not severe because the pathogen can only infect expanding leaves and is not present at the beginning of the season, when the foliage of mature oak trees develops. The impact of mildew on trees whose phenology is disturbed by defoliation has been documented previously (Thomas et al. 2004); however, its importance may have been under-appreciated. In 2000, many of the trees in this study were nearly defoliated a second time by the oak mildew infection. The importance of this infection on the new leaves of previously defoliated trees has previously been demonstrated by chemically protecting the refoliation of oak trees against oak mildew following defoliation by Thaumetopoea processionea (B. Marçais, unpublished results). These findings stress the importance of taking into account interactions between parasites. Pathogens that have little impact on their own can, by interacting with other parasites, have a much greater impact on their host and on the surrounding plant community. Such interactions between pathogens and parasites have been documented for mortality processes leading either to succession or to altered dominance between two competing plants (De Rooij-Van der Goes 1995; Holah & Alexander 1999).
Tree mortality during this study appeared to be a complex process fitting Manion's (1991) model of forest decline, i.e. involving several factors acting on different time-scales and interacting with each other (defoliation/oak mildew/Armillaria). Attempts to assign the mortality of forest trees to single causes may thus be misguided, particularly in the case of decline or background mortality, and may explain why this process has often been difficult to predict (Bigler et al. 2004; Suarez et al. 2004). More generally, it has recently been stressed that more attention should be paid to indirect mechanisms, such as increased predation likelihood, by which parasites can regulate host populations (Møller 2005). Such an indirect mechanism is reported in this study: A. gallica did not have any direct impact on host morbidity, but the pathogen affected the host's ability to cope with an acute stress such as defoliation. If A. gallica similarly affects a tree's ability to withstand chronic stress resulting from, for example, competition, this pathogen could influence tree density, which is an important forest attribute.
Acknowledgements
The authors are especially grateful to C. Barbaroux, J. Biedermann and L. Lhoste for their contribution to the carbohydrate measurements, to François Gérémia, who contributed the dendrochronological work, to Olivier Caël, who helped in the fieldwork, to Everett Hansen for reviewing the manuscript, and finally to the members of the forest pathology and phytoecology laboratories who helped with the manual defoliation.