Large deviations for Branching Processes
in Random Environment
Abstract
A branching process in random environment is a generalization of Galton-Watson processes in which, at each generation, the reproduction law is picked at random. In this paper we give several results which belong to the class of large deviations. In contrast to the Galton-Watson case, here the random environments and the branching process can conspire to achieve atypical events, such as growth at a rate smaller than the typical geometric growth rate.
One way to obtain such an atypical rate of growth is to have a typical realization of the branching process in an atypical sequence of environments. This gives a general lower bound on the rate of decrease of the probability of such events.
When each individual leaves at least one offspring in the next generation almost surely, we compute the exact rate function of these events and we show that, conditionally on the large deviation event, the trajectory converges in probability to a deterministic function, in the sense of the uniform norm. The most interesting case arises when we allow individuals to have exactly one offspring in the next generation. In this situation, conditionally on the deviation event, the population size stays fixed at 1 until a time . After that time, an atypical sequence of environments lets the process grow at the appropriate rate () to reach the target value. The corresponding map is piecewise linear: it is 0 on and linear on .
AMS 2000 Subject Classification. 60J80, 60K37, 60J05, 92D25
Key words and phrases. Branching processes, random environments, large deviations.
1 Introduction
Let be the space of probability measures on the nonnegative integers, that is
and denote by the mean of :
A branching process in random environment (BPRE for short) with environment distribution is a discrete-time Markov process which evolves as follows: at each time we draw an environment according to , independently of the past, and then each individual reproduces independently according to the same law, i.e. the probability that an individual gives birth to offspring in the next generation is , for each . We will denote by the law of this process started from individuals. When we write , and unless otherwise mentioned, we mean that the initial state is .
Thus, we consider an i.i.d. sequence of random environments with common distribution . Traditionally, the study of BPREs has relied on analytical tools such as generating functions. More precisely, denoting by the probability generating function of , one can note that the BPRE is characterized by the relation
For classical references on these processes see [1, 2, 3, 6, 15, 23].
A good picture to keep in mind when thinking of a BPRE is the following: consider a population of plants with a one-year life cycle (so generations are discrete and non-overlapping). Each year the climate or weather conditions (the environment) vary, which impacts the reproductive success of the plants. Given the climate, all the plants reproduce according to the same mechanism. In this context, can be thought of as the distribution which governs the successive climates, which are supposed to be i.i.d., and the plant population then obeys a branching process in random environment. By taking a Dirac mass for we recover the classical case of Galton-Watson processes.
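The dynamics described above are easy to simulate. The following sketch (all names are ours, and the reproduction laws in the usage note are purely illustrative) draws an i.i.d. environment each generation and then lets every individual reproduce independently according to it:

```python
import random

def simulate_bpre(envs, weights, n, z0=1, rng=random):
    """Simulate one trajectory (Z_0, ..., Z_n) of a branching process in
    random environment.

    envs    -- list of reproduction laws, each given as [q(0), q(1), ...]
    weights -- probability of drawing each environment (the law of the climate)
    n       -- number of generations to simulate
    """
    z, traj = z0, [z0]
    for _ in range(n):
        # draw this generation's environment, independently of the past
        q = rng.choices(envs, weights=weights)[0]
        # every individual alive reproduces independently according to q
        z = sum(rng.choices(range(len(q)), weights=q)[0] for _ in range(z))
        traj.append(z)
    return traj
```

For instance, `simulate_bpre([[0.0, 0.5, 0.5], [0.2, 0.6, 0.2]], [0.5, 0.5], 20)` simulates twenty generations with two equiprobable environments, a "good" one with mean 1.5 and a "bad" one with mean 1.0.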
At least intuitively, one easily sees that some information on the behavior of the BPRE can be read from the process , and that their typical behaviors should be similar:
Hence the following dichotomy is hardly surprising: a BPRE is supercritical (resp. critical, resp. subcritical) if the expectation of with respect to the law of the environments:
is positive (resp. zero, resp. negative). In the supercritical case, the BPRE survives with positive probability; in the critical and subcritical cases, it becomes extinct a.s.
Moreover, in the supercritical case, we have the following expected result [3, 16]. Assuming that for some , there exists a finite r.v. such that
which ensures that conditionally on the nonextinction of
This result is a generalization to random environments of the well-known Kesten-Stigum theorem for Galton-Watson processes: let be the reproduction law of the GW process and let be its mean. Assume that ; then
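For reference, the statement we have in mind reads as follows (in standard notation, which we introduce here for convenience: $m$ for the offspring mean and $q$ for the extinction probability):

```latex
% Kesten--Stigum theorem: for a supercritical Galton--Watson process with
% offspring mean m = E[Z_1] > 1, the nonnegative martingale W_n = Z_n / m^n
% converges a.s. to a limit W, and
\[
  W_n := \frac{Z_n}{m^n} \xrightarrow[n\to\infty]{\mathrm{a.s.}} W,
  \qquad
  \mathbb{E}\bigl[Z_1 \log^+ Z_1\bigr] < \infty
  \iff \mathbb{E}[W] = 1
  \iff \mathbb{P}(W = 0) = q,
\]
% where q is the extinction probability; when the log-moment fails, W = 0 a.s.
```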
The distribution of is completely determined by that of , and a natural question concerns the tail behavior of near 0 and infinity. Results in this direction can be found, for instance, in [8, 12, 13, 22] for the Galton-Watson case and in [17] for the BPRE case. In a large deviation context, the tail behavior of can be related to events where grows at an atypical rate. Another way to study such events is to consider the asymptotic behavior of . This is the approach taken in [5] to prove that decays supergeometrically when , assuming that . Yet another approach is the study of so-called moderate deviations (see [21] for the asymptotic behavior of with ).
Finally, we observe that the Kesten-Stigum theorem for Galton-Watson processes can be reinforced into the following statement:
in the sense of the uniform norm almost surely (see for instance [20] for this type of trajectorial results, unconditioned and conditioned on abnormally low growth rates).
In this work we will consider large deviation events for BPREs of the form
and we are interested in how fast the probability of such events is decaying. More precisely, we are interested in the cases where
Let us discuss the Galton-Watson case very briefly first (see [14, 20, 22]). Assume first that the Galton-Watson process is supercritical () and that all the moments of the reproduction law are finite. If we are in the Böttcher case (), then there are no large deviations, i.e.
If, on the other hand, we are in the Schröder case (), then can be nontrivial for . This case is discussed in [20] (see also [14] for finer results on lower deviations), where it is shown that to achieve a lower-than-normal rate of growth, the process first refrains from branching for a long time, until it can start to grow at the normal rate and reach its objective. More precisely, it is a consequence of Theorem 2 below that, conditionally on ,
in probability in the sense of the uniform norm, where . When the reproduction law has infinite moments, the rate function is nontrivial for . In the critical or subcritical case, there are no large deviations.
We will see that the situation for BPREs differs in many respects from that of the Galton-Watson case: for instance, the rate function is nontrivial as soon as is not constant and exceeds with positive probability. This is due to the fact that we can deviate by following an atypical sequence of environments, as explained in the next section, and as already observed by Kozlov for upper values in the supercritical case [18]. When we condition on and assume , the process still converges in probability, uniformly, to a function which has the same shape as above; that is, there exists such that for , and then is linear and reaches , but the slope of this latter piece can now differ from the typical rate .
2 Main results
Denote by the sequence of i.i.d. log-means of the successive environments,
and
Define the Laplace transform of and let be the large deviation function associated with :
We briefly recall some well-known facts about the rate function (see [11] for a classical reference on the matter). The map is strictly convex and in the interior of the set where . Furthermore, , and is strictly decreasing on the left of and strictly increasing on its right.
The map is called the rate function for the following large deviation principle associated with the random walk . We have for every ,
(1) 
and for every
(2) 
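In the notation we assume for this sketch ($L_n$ for the random walk of log-means, $\bar L$ for its mean drift, and $\psi$ for its rate function), the two bounds (1) and (2) are the standard Cramér estimates:

```latex
% Cramer's theorem for the i.i.d. random walk L_n (notation assumed):
\[
  \lim_{n\to\infty} \frac{1}{n} \log \mathbb{P}\bigl(L_n \ge \theta n\bigr)
    = -\psi(\theta) \quad \text{for every } \theta \ge \bar L,   \tag{1}
\]
\[
  \lim_{n\to\infty} \frac{1}{n} \log \mathbb{P}\bigl(L_n \le \theta n\bigr)
    = -\psi(\theta) \quad \text{for every } \theta \le \bar L.   \tag{2}
\]
```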
Roughly speaking, one way to get
is to follow environments with a good sequence of reproduction laws:
We then have the following upper bound on the rate function, valid for any BPRE under a moment condition analogous to that used in [16]. The proof is deferred to Section 4.
Proposition 1.
Assume that for some . Then, for every :
As Theorem 2 below shows, the inequality may be strict. Moreover, this shows that even in the subcritical case there may be large deviations, contrary to what happens in the Galton-Watson case. More precisely, as soon as and is not constant almost surely, the rate function is nontrivial on .
2.1 Lower deviation in the strongly supercritical case.
We focus here on the so-called strongly supercritical case
(in which the environments are almost surely supercritical). Let us define for every ,
It is quite easy to prove that this infimum is reached at a unique point (see Lemma 6):
and that . We can thus define the function for each as follows (see figure ):
We will need the following moment assumption .
() 
Observe that the condition in Proposition 1 ( such that ) is included in .
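Since the displayed expression of the cost being minimized is elided above, here is only a generic numerical sketch: assuming a cost of the heuristic form "stay bounded on a fraction $t$ of the time, then deviate with the environments", i.e. $t\,c + (1-t)\,\psi(\theta/(1-t))$ (all names here are our assumptions), the minimizer can be located by a simple grid search:

```python
def optimal_takeoff(cost_bounded, psi, theta, grid=10_000):
    """Grid-search the infimum over t in [0, 1) of
        t * cost_bounded + (1 - t) * psi(theta / (1 - t)),
    i.e. the heuristic cost of staying bounded until time t*n and then
    growing at rate theta / (1 - t) on the remaining fraction of time.
    Returns the pair (t_star, minimal value)."""
    best_t, best_v = 0.0, float("inf")
    for i in range(grid):
        t = i / grid
        v = t * cost_bounded + (1 - t) * psi(theta / (1 - t))
        if v < best_v:
            best_t, best_v = t, v
    return best_t, best_v
```

With a quadratic toy rate function `psi = lambda x: (x - 1) ** 2`, a large `cost_bounded` forces `t_star = 0` (no waiting phase), while `cost_bounded = 0` makes waiting free and moves the takeoff point into the interior of the interval.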
The main result is the following theorem, which gives the large deviation cost of and the asymptotic behavior of the trajectory of conditioned on .
Theorem 2.

Assume that and that hypothesis holds. Then:

If , then for every ,
and furthermore, conditionally on ,

If , then for every ,
and furthermore for every , conditionally on ,
Let us note that if , then the takeoff point of the trajectory may be zero, be equal to , or belong to (see Section 3 for examples).
Moreover, when is deterministic, as in the case of a GW process,

If (Böttcher case), then and .

If (Schröder case), then .
Let us first give a heuristic interpretation of the above theorem. Observe that
and that
so that we have
and is just the “optimal” cost of such an event with respect to the choice of . It is not hard to see that the event is asymptotically included in , and hence is an upper bound on the rate function for . Adding that once is large enough it has no choice but to follow the random walk of the log-means of the environment sequence, is in fact the natural candidate for the rate function.
Thus, roughly speaking, to deviate below , the process stays bounded until an optimal time and then deviates in a straight line to , thanks to a good sequence of environments.
The proofs in Sections 5 and 6 follow this heuristic.
Another heuristic comment concerns the behavior of the environment sequence conditionally on the event . Before time , we see a sequence of i.i.d. environments which are picked according to the original law biased by the probability of having one offspring (think of the case where charges only two environments). After time , we know that the distribution of the sequence is that of a sequence of i.i.d. environments conditioned on . This implies that the law of the environments is that of an exchangeable sequence with common distribution tilted by the log-means.
To conclude this section, we comment on the hypothesis . It is known (see [6]) that for a Galton-Watson process with survival probability and generating function , under the condition, for all ,
(*) 
where and . In the case where (no death), , which tells us that the cost of staying bounded is the cost of keeping the population size fixed at , a fact that we also use in our analysis of BPREs. This suggests that the analogue of for BPREs should also play a role in the lower deviation events when . However, there is not yet an analogue of for BPREs, and the situation is probably more complex.
2.2 Upper deviation in the strongly supercritical case
Assume as above that
and that for every ,
we have the following large deviation result for upper values.
Theorem 3.
For every ,
and furthermore for , conditionally on ,
To put it in words, this says that the cost of achieving a higher-than-normal rate of growth is just the cost of seeing an atypical sequence of environments in which this rate is expected. Furthermore, conditionally on , the trajectory is asymptotically a straight line.
Kozlov [18] gives the upper deviations of in the case where the generating functions are a.s. linear fractional and satisfy a.s. . In the strongly supercritical case and under these hypotheses, he proves that for every , there exists such that
Thus, Kozlov obtains a finer result in the linear-fractional case with a.s. , by proving that the upper deviations of the BPRE are exactly given by the large deviations of the random walk .
Proposition 1 shows that the rates of upper and lower deviations are at least those of the environments, but Theorem 2 and the remark below show that the converse is not always true.
Theorem 3 is the counterpart, for upper deviations, of case (b) of Theorem 2 for lower deviations. It is natural to ask whether there is an analogue of case (a) as well. In this direction, we make the following two remarks.

If there exists such that
then the cost of reaching can be less than , since the BPRE might “explode” to a very large value in the first generation and then follow a geometric growth. This mirrors nicely what happens for lower deviations in case (a). However, we do not have an equivalent of Theorem 2 for upper deviations, as such a result seems for now much harder to obtain.

In the case when
then by Theorem 3 in [16],
Thus, the BPRE might deviate from the exponential of the random walk of environments :
which would yield a more complicated rate function for deviations.
2.3 No large deviation without supercritical environment
Finally, we consider the case when the environments are a.s. subcritical or critical:
and we assume that for every , there exists such that
Note that the condition implies , simply by considering .
In that case, even if , there is no large deviation, as in the case of a Galton-Watson process.
Proposition 4.
Suppose that and that . Then, for every ,
We recall that by Proposition 1, this result does not hold if .
The next short section presents a concrete example where is nontrivial. Section 4 is devoted to the proof of Proposition 1. Section 5 establishes two key lemmas which are then used repeatedly: the first gives the cost of keeping the population bounded for a long time; the second tells us that once the population passes a threshold, it grows geometrically, following the product of the means of the environments. In Section 6, we first compute the rate function and then describe the trajectory. Section 7 is devoted to upper large deviations, while Section 8 treats the case when the environments are a.s. subcritical or critical.
3 A motivating example: the case of two environments
Suppose we have two environments and , with . Call and their respective log-means, and suppose . The random walk is thus the sum of i.i.d. variables
Recall that if is a Bernoulli variable with parameter , the Fenchel-Legendre transform of is
Hence the rate function for the large deviation principle associated to the random walk is defined for by
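For completeness, the Fenchel-Legendre transform of a Bernoulli variable of parameter $p$ is the standard relative-entropy expression (a classical fact, stated here in notation we introduce):

```latex
% X ~ Bernoulli(p): Lambda(s) = log(1 - p + p e^s), and for x in [0, 1]
\[
  \Lambda^*(x) \;=\; x \log \frac{x}{p} \;+\; (1 - x) \log \frac{1 - x}{1 - p},
  \qquad x \in [0, 1],
\]
% with the conventions 0 log 0 = 0 and Lambda^*(x) = +infinity outside [0, 1].
```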
Recall that is the probability that an individual has exactly one descendant in the next generation.
Figure 2 below shows the function ; is the minimum of this function, and is the point where this minimum is reached. Figure 2 is drawn using the values , and . Thus, we ask whereas behaves normally as , and this example illustrates Theorem 2 (a) with .
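As a purely illustrative companion to this example, one can estimate the probability of the lower deviation event by brute-force Monte Carlo in a toy two-environment BPRE (the reproduction laws and all names below are our choices, not the values used for Figure 2):

```python
import math
import random

def estimate_lower_deviation(theta, n, trials=2000, seed=1):
    """Monte Carlo estimate of P(Z_n <= exp(theta * n)) for a toy BPRE with
    two equiprobable environments, both with q(0) = 0 (strongly
    supercritical case: no individual ever dies childless)."""
    rng = random.Random(seed)
    q_bad = [0.0, 0.7, 0.3]    # bad year: mean offspring 1.3
    q_good = [0.0, 0.1, 0.9]   # good year: mean offspring 1.9
    hits = 0
    for _ in range(trials):
        z = 1
        for _ in range(n):
            # pick this generation's environment, then reproduce
            q = q_bad if rng.random() < 0.5 else q_good
            z = sum(rng.choices((0, 1, 2), weights=q)[0] for _ in range(z))
        if z <= math.exp(theta * n):
            hits += 1
    return hits / trials
```

Plotting `-log(estimate) / n` against `theta` for moderate `n` already suggests the shape of the rate function, although the exponentially small probabilities quickly call for importance sampling rather than naive simulation.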
As an illustration and a motivation, we propose the following model for parasite infection. In [7], we consider a branching model for parasite infection with cell division. In every generation, each cell gives birth to two daughter cells, so the cell population is the binary tree. We want to take into account unequal sharing of parasites, following experiments made in Tamara’s laboratory at Hôpital Necker (Paris), and we distinguish a first (resp. a second) daughter cell. We then call (resp. ) the number of descendants of a given parasite of the mother cell in the first (resp. the second) daughter, where is any couple of random variables (it may be non-symmetric, dependent, …). A key role in the limit theorems is played by the process which gives the number of parasites in a random cell line (choosing randomly one of the two daughter cells at each cell division and counting the number of parasites inside). This process is a branching process with two equiprobable environments with respective reproduction laws and . Thus, here , and .
We are interested in the number of cells with a large number of parasites, and we call (resp. ) the number of cells in generation which contain fewer (resp. more) than parasites, for . An easy computation (which follows from (17) in [7]) shows that
If , Section 2.1 ensures that for every ,
Moreover Section 2.2 ensures that for every ,
4 Proof of Proposition 1 for general BPRE
Proposition 1 follows from the continuity of and the following lemma.
Lemma 5.
For every and ,
Proof.
Let . Recall that ,
and this supremum is reached at such that
Then introduce the probability on defined by
Under this new probability
so under is a random walk with drift and is a supercritical BPRE under with survival probability . Then, for every ,
(3) 
Moreover, for every bounded measurable function ,
We will use the above with to obtain that, for every ,
Now, under , is a random walk with positive drift, which tends to infinity as tends to infinity. Using (3), we see that under , almost surely, so that
This ensures, by Fatou’s lemma,
Since , we get
Letting gives
which completes the proof. ∎
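The change of measure in the proof above is the standard exponential tilting of Cramér theory; in generic notation (which we introduce here), it takes the form:

```latex
% Exponential tilt of the environment law mu by the log-mean, at parameter s:
\[
  \frac{d\widetilde{\mu}_s}{d\mu}(q)
    \;=\; \exp\bigl(s \log m(q) - \Lambda(s)\bigr),
  \qquad
  \Lambda(s) := \log \mathbb{E}_{\mu}\bigl[m(Q)^{s}\bigr].
\]
% Under the tilted law the walk of log-means has drift Lambda'(s); choosing s
% with Lambda'(s) = theta makes the deviation event typical, and the price of
% the tilt is exactly the Legendre transform psi(theta) = s*theta - Lambda(s).
```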
5 Key lemmas for lower deviation
5.1 The function
We recall that
Lemma 6.
There exists a unique such that
and .
Proof.
Put and . Then we have ; we thus want to solve the equation , and if we let
Assume that has two solutions, both in . Then there exists such that , i.e.
This is impossible since . Adding that completes the proof. ∎
5.2 The cost of staying bounded
We start with the following elementary result, which says that staying bounded has the same exponential cost as staying fixed at 1.
Lemma 7.
For every ,
Moreover, if , then for every fixed there is a constant such that for every ,
Proof.
We call the number of offspring of a random lineage. More explicitly, we call the size of the offspring of the ancestor in generation and pick uniformly one individual among this offspring. We then call the size of the offspring of this individual, and so on.
Note that are i.i.d. with common distribution . Hence, for every , recalling that ,
Adding that
allows us to conclude. ∎
Our proof actually shows the stronger bound for .
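The computation behind the lemma can be summarized as follows (notation assumed: $Q_1, Q_2, \dots$ are the i.i.d. environments and $Q_i(1)$ the probability of exactly one offspring; the population started from one individual stays at 1 exactly when every generation produces a single child):

```latex
% Cost of keeping the population fixed at 1 for n generations:
\[
  \mathbb{P}(Z_n = 1)
    \;=\; \mathbb{E}\!\left[\prod_{i=1}^{n} Q_i(1)\right]
    \;=\; \mathbb{E}\bigl[Q(1)\bigr]^{n},
  \qquad\text{hence}\qquad
  \frac{1}{n} \log \mathbb{P}(Z_n = 1) \;=\; \log \mathbb{E}\bigl[Q(1)\bigr].
\]
```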
5.3 The cost of deviating from the environments
The aim of this section is to show that once the process “takes off” (i.e. once the population passes a certain threshold), it has to follow the product of the means of the environment sequence.
Lemma 8.
Assuming , for all and , there exist such that for all and ,
so that
Define for every ,
so that
For all , and define the function
(this quantity does not depend on by the Markov property) and
The proof will use the following lemma, whose proof is given at the end of this section.
Lemma 9.
Fix . There exist and such that
where is a random probability with law
We proceed with the proof of Lemma 8 assuming that the above result holds.
Proof.
Let us fix and , and show that there exists such that
(6) 
For every ,