Population Ecology using Probability

A mathematical exploration supporting our series of articles on population dynamics, aimed at advanced students.

Problem

Branching Processes

Branching processes, or tree graphs, model the growth and eventual size of a population. If we know the probabilities of the number of offspring produced at each generation, then we can determine the probability of ultimate extinction, or the eventual population size.


Probability Generating Functions

Consider a random variable X, where $P(X=0)=p_0,   P(X=1)=p_1,   ...$

This is an integer-valued variable whose mass function forms a sequence. We impose two conditions:

  1. All probabilities must be non-negative:  $ p_k \geq 0 $
  2. Only one event can and must occur, so $p_0+p_1+...=\displaystyle\sum\limits_{k=0}^{\infty} p_k =1$

 

The probability generating function G is an ordinary function of s: $$G_X(s)=p_0+p_1 s+p_2 s^2+...$$ Question:    What is the value of G(s) when $s=0$? And when $s=1$?
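
To make the definition concrete, here is a minimal Python sketch of a PGF for a variable with finitely many possible values. The function name `pgf` and the example distribution are illustrative choices, not part of the mathematics above.

```python
# Minimal sketch: evaluate G(s) = p_0 + p_1*s + p_2*s^2 + ... for a finite
# list of probabilities.  The example distribution is arbitrary.
def pgf(probs, s):
    return sum(p * s**k for k, p in enumerate(probs))

# An illustrative mass function on {0, 1, 2, 3}: check the two conditions
# (non-negative, sums to 1), then evaluate the PGF at s = 0.5.
probs = [0.1, 0.4, 0.3, 0.2]
assert all(p >= 0 for p in probs) and abs(sum(probs) - 1) < 1e-12
print(pgf(probs, 0.5))
```

Trying a few values of s yourself (including $s=0$ and $s=1$) is a quick way to answer the question above.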

 

Example:    Consider a random variable Y following the geometric distribution with parameter p.

Then $P(Y=k)=p(1-p)^{k-1}=pq^{k-1}$ for $k=1,2,...$, where $q=1-p$.

So Y has PGF given by:  $$\begin{align*} G_Y(s) & = \displaystyle \sum_{k=1}^{\infty} p q^{k-1} s^k \\ &= ps \displaystyle \sum_{k=0}^{\infty} (qs)^k \\  &= \frac {ps}{1-qs} \end{align*}$$
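
As a quick numerical check, the sketch below compares a truncated version of this series with the closed form $\frac{ps}{1-qs}$; the values of p and s are arbitrary choices for illustration.

```python
# Sketch: truncate the series sum_{k>=1} p q^(k-1) s^k and compare it with
# the closed form ps/(1 - qs).  Parameter values are arbitrary.
p, s = 0.3, 0.5
q = 1 - p

series = sum(p * q**(k - 1) * s**k for k in range(1, 200))
closed = p * s / (1 - q * s)
print(series, closed)   # the two values agree to many decimal places
```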

Expectation

We can relate the PGF to the mean, or expectation. Recall that: $$E(X)=\bar x = \displaystyle \sum_{\text{all } x} xP(X=x)$$

We can extend this definition from a single variable to a function of a variable: $$E(g(X))=\bar{g}(x) = \displaystyle \sum_{\text{all } x} g(x) P(X=x)$$

This definition reminds us of our PGF polynomial, with the important result: $$ G_X(s)=p_0+p_1 s+p_2 s^2+...=E(s^X)$$

In particular, differentiating term by term and setting $s=1$ recovers the mean: $G_X'(1)=p_1+2p_2+3p_3+...=E(X)$.
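
The sketch below checks both relations by simulation for the geometric example above: the sample average of $s^Y$ should be close to $G_Y(s)$, and the sample mean of Y should be close to $G_Y'(1)$ (estimated here with a crude finite difference). The sampling function, parameter values and sample size are illustrative assumptions.

```python
import random

# Sketch: verify E[s^Y] = G_Y(s) and E[Y] = G_Y'(1) by simulation for the
# geometric(p) variable.  Parameters and sample size are arbitrary.
p, s, trials = 0.3, 0.5, 100_000
q = 1 - p

def G(u):                      # closed-form PGF of the geometric(p) variable
    return p * u / (1 - q * u)

def sample_Y():                # count Bernoulli(p) trials up to the first success
    y = 1
    while random.random() > p:
        y += 1
    return y

ys = [sample_Y() for _ in range(trials)]
h = 1e-5
print(sum(s**y for y in ys) / trials, G(s))        # E[s^Y]  vs  G_Y(s)
print(sum(ys) / trials, (G(1) - G(1 - h)) / h)     # E[Y]    vs  G_Y'(1) = 1/p
```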

 

Random Sums Formula

Consider a population of meerkats, where each individual has a random number of offspring in the next generation. Knowing the distribution of offspring per individual, we can determine the distribution, and hence the expected size, of future generations.

First let $N, X_1, X_2, ...$ be independent random variables, with $X_1, X_2, ...$ all having the same probability generating function G.  Think of each $X_i$ as the number of offspring of an individual meerkat in our population. This means that our PGF is given by $G(s)=p_0+p_1s+p_2s^2+...$, where $p_0=P(\text{no offspring}),\ p_1=P(\text{one offspring}),\ ...$

 

We are interested in finding the PGF of the random sum $T=X_1+X_2+...+X_N$. Since the $X_i$ are independent with common PGF G, conditioning on $N=n$ gives $E\big[s^T|N=n\big]=E\big[s^{X_1+...+X_n}\big]=G(s)^n$, so: $$\begin{align*} G_T(s) & = E[s^T] \\ &= \displaystyle \sum_{n=0}^{\infty} E\Big [s^T|N=n\Big ] P(N=n) \\ & = \displaystyle \sum_{n=0}^{\infty} G(s)^n P(N=n) \\ & = E[G(s)^N] \\ &= G_N \Big( G(s) \Big) \end{align*} $$
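
A simulation makes the formula easy to believe. In the sketch below, N is taken to be binomial and each $X_j$ geometric; these particular distributions (and the parameter values) are assumptions chosen purely for illustration.

```python
import random

# Sketch of the random sums formula G_T(s) = G_N(G(s)), with N ~ Binomial(m, r)
# and each X_j ~ Geometric(p).  All distributions and parameters are arbitrary.
m, r, p, s, trials = 5, 0.4, 0.3, 0.6, 100_000
q = 1 - p

G_X = lambda u: p * u / (1 - q * u)        # PGF of each X_j
G_N = lambda u: (1 - r + r * u)**m         # PGF of N

def sample_X():                            # one geometric(p) variate
    x = 1
    while random.random() > p:
        x += 1
    return x

def sample_T():                            # T = X_1 + ... + X_N
    n = sum(random.random() < r for _ in range(m))
    return sum(sample_X() for _ in range(n))

ts = [sample_T() for _ in range(trials)]
print(sum(s**t for t in ts) / trials, G_N(G_X(s)))   # empirical E[s^T] vs G_N(G(s))
```
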
Example:    Elephants (in most cases) only have one offspring at a time, with probability p, say. We can model the number of offspring using the Bernoulli distribution with parameter p.
Generation n+1 consists of the offspring of generation n.

Let $Z_{n+1}= \displaystyle \sum_{j=1}^{Z_n} X_j$, where $X_j$ is the number of offspring of the $j$th individual in generation $n$.

 

Starting from a single individual, in the first generation:    $G_{Z_1} (s)=G_X(s)=(1-p)+ps$

In the second generation:     $G_{Z_2} (s)=G_{Z_1} \bigg(G_X (s) \bigg)=(1-p)+p\big((1-p)+ps\big)=(1-p^2)+p^2 s$

Continuing, we see that at the nth generation:     $G_{Z_n} (s)=(1-p^n)+p^n s$
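
The same recursion can be checked numerically: composing $G_X$ with itself n times should reproduce the closed form $(1-p^n)+p^n s$. The values of p, s and the number of generations in this sketch are arbitrary.

```python
# Sketch: iterate G_{Z_{n+1}}(s) = G_{Z_n}(G_X(s)) for the Bernoulli (elephant)
# example and compare with the closed form (1 - p^n) + p^n * s.
p, s, generations = 0.8, 0.25, 6

G_X = lambda u: (1 - p) + p * u            # Bernoulli(p) offspring PGF

value = s
for n in range(1, generations + 1):
    value = G_X(value)                     # n-fold composition applied to s
    closed = (1 - p**n) + p**n * s
    print(n, value, closed)                # the two columns agree
```

Note that $G_{Z_n}(0)=1-p^n$ is the probability that the population has died out by generation n, which is exactly the kind of extinction question taken up in the follow-up article on branching processes.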

 

The follow-up article on branching processes shows how we can use probability to determine the likelihood of a population becoming extinct.