
Probability distributions and likelihood

Readings

- Ecological Detective
- Chapter 3 Probability distributions
- Chapter 7 Likelihood

Overview

- Probability distributions: binomial, Poisson, normal, lognormal, negative binomial, beta
- Likelihood
- Likelihood profile
- The concept of support
- Model selection: likelihood ratio, AIC
- Robustness and contradictory data

The binomial distribution: discrete outcomes, discrete trials

- Consider a discrete outcome: a coin is heads or tails, an animal (or plant) lives or dies
- We examine a fixed number of such events: a number of flips of the coin, a certain number of animals that may or may not survive

The binomial formula

P(Z | N, p) = [N! / (Z! (N - Z)!)] p^Z (1 - p)^(N - Z)

Z is the observed number of outcomes, N is the number of trials, and p is the probability of the event happening on a given trial.
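The slide's formula (an image in the original deck) can be sketched directly in Python; the function name and the coin-flip example are illustrative, not from the slides.

```python
from math import comb

def binomial_pmf(Z, N, p):
    """Probability of observing Z events in N trials when each trial
    succeeds with probability p: C(N, Z) * p**Z * (1-p)**(N-Z)."""
    return comb(N, Z) * p**Z * (1 - p)**(N - Z)

# Probability of 7 heads in 10 flips of a fair coin
print(binomial_pmf(7, 10, 0.5))  # 0.1171875
```

Summing the pmf over all Z from 0 to N returns 1.0, the property that likelihood (below) does not share.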

Factorial term

You may remember the concept of "N things taken k at a time" - then again, you may not.

The Poisson: outcomes discrete, continuous number of observations

r is the expected number of events; it can be defined as a rate multiplied by the time t over which events are counted.
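The Poisson formula on the slide (an image in the original) can be sketched as follows; the r = 2 example is illustrative.

```python
from math import exp, factorial

def poisson_pmf(k, r):
    """Probability of observing k events when the expected number is r
    (r can be a rate multiplied by an observation time t)."""
    return r**k * exp(-r) / factorial(k)

# With r = 2 expected events, the chance of seeing exactly 0, 1, 2 events:
probs = [poisson_pmf(k, 2.0) for k in range(3)]
```

Note that the mean and the variance of this distribution are both r, which is exactly the limitation the next slide raises.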

Limitations of the Poisson

- Has only one parameter, which is both the mean and the variance
- We often have discrete count data, but want the variance to be estimable, or at least larger than the Poisson allows

Thus we often use the negative binomial

- Also discrete outcomes with continuous observations
- Is derived from the Poisson when the rate parameter is itself a random variable

The negative binomial: outcomes discrete, continuous observations

R is the expected number of observations and k is a parameter related to the variance.
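Since the slide's formula is an image, here is a sketch using one common ecological parameterization (mean m, overdispersion k), under which the variance is m + m²/k and the distribution approaches the Poisson as k grows large. The parameter names follow that convention, not necessarily the slide's notation.

```python
from math import lgamma, log, exp

def negbin_pmf(x, m, k):
    """Negative binomial with mean m and overdispersion parameter k
    (variance = m + m**2 / k). Computed on the log scale via lgamma
    for numerical stability."""
    logp = (lgamma(x + k) - lgamma(k) - lgamma(x + 1)
            + k * log(k / (k + m)) + x * log(m / (k + m)))
    return exp(logp)
```

With m = 3 and k = 2, the variance works out to 3 + 9/2 = 7.5, visibly larger than the Poisson's 3.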

The normal distribution: continuous

This is the familiar bell-shaped curve.

Quiz: but what is the Y axis - what units?

The Y axis is probability density - the first derivative of the cumulative probability distribution, with units of probability per unit of x.

The log normal distribution

Key notes re lognormal distribution

- Since x is a constant (the data are known) when calculating likelihoods, we often drop the 1/x term
- If the s.d. is fixed, then the entire first term is a constant (also true for the normal) and can be ignored
- The expected value of the lognormal is not exp(mean of the logs) - that is the median; the mean is exp(mu + sigma^2/2)
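The last note can be checked by simulation: if log X is normal with parameters mu and sigma, then exp(mu) is the median of X while the mean is the larger exp(mu + sigma^2/2). The parameter values below are illustrative.

```python
import random
from math import exp

random.seed(1)
mu, sigma = 1.0, 0.5
draws = sorted(random.lognormvariate(mu, sigma) for _ in range(100_000))

median_sim = draws[len(draws) // 2]     # should sit near exp(mu)
mean_sim = sum(draws) / len(draws)      # should sit near exp(mu + sigma**2 / 2)
```

The simulated mean exceeds the simulated median, confirming the skew the bullet points at.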

The beta distribution

Shapes of the beta

Summary by nature of trials and observations


Moving from probability distributions to likelihood

Probability

The probability of observing data Y_i given parameter p. If Y is Poisson distributed, then in one unit of time the probability of observing k events is given by the Poisson formula.

When using data, the data are known and the hypothesis (parameter) is unknown. Thus we ask: given the data, how likely are alternative hypotheses?

Note that now the subscript is on the hypothesis! In probability the hypothesis is known and the data unknown; in likelihood the data are known and the hypothesis unknown. We assume that likelihood is proportional to probability.

The probability of all outcomes for a given hypothesis must sum to 1.0. This is not true for likelihood: the likelihood of all hypotheses for a given outcome will not sum to 1.0. Assume a Poisson model and that we observed k = 4.
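A sketch of that calculation: the Poisson likelihood of the single observation k = 4, evaluated across candidate values of r (the grid of hypotheses is illustrative).

```python
from math import exp, factorial

def poisson_pmf(k, r):
    return r**k * exp(-r) / factorial(k)

k = 4
hypotheses = [1, 2, 3, 4, 5, 6, 7, 8]              # candidate values of r
likelihoods = [poisson_pmf(k, r) for r in hypotheses]

# The likelihood peaks at r = 4, but the values do not sum to 1 across hypotheses
total = sum(likelihoods)
rescaled = [L / max(likelihoods) for L in likelihoods]   # rescale to max = 1
```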


Rescale to max = 1

Log likelihoods

Multiple observations

- If observations are independent, then the total likelihood is the product of the individual likelihoods (and the total log likelihood is the sum)

Mark recapture example

- We tagged 100 fish
- Went back a few days later (after mixing etc)
- And recaptured 100 fish
- 5 were tagged.
- We use the Poisson distribution to explore the likelihood of different population sizes

What we need

- Data: number marked, number recaptured, and tags recaptured
- P(tagged) is marked / population size
- Expected recoveries is P(tagged) × number recaptured
- Expected recoveries is r of the Poisson
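With the numbers above (100 marked, 100 recaptured, 5 tags), the expected recoveries are r = 100 × 100 / N, and the Poisson likelihood of seeing 5 tags can be scanned over candidate population sizes N. The grid of candidates is illustrative.

```python
from math import exp, factorial

marked, recaptured, tags = 100, 100, 5

def likelihood(N):
    r = marked * recaptured / N      # expected tag recoveries for population size N
    return r**tags * exp(-r) / factorial(tags)

candidates = range(500, 5001, 100)
mle = max(candidates, key=likelihood)
# Maximized where r equals the 5 observed tags, i.e. N = 10000 / 5 = 2000
```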



Multiple observations

- Assume we go out twice more, capture 100 animals each time, and 3 and then 4 tagged animals are recaptured


Combining all data
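Combining the three surveys (5, 3, and 4 tags) under independence: log likelihoods add, and the combined curve peaks where the expected recoveries match the average count. A sketch, with an illustrative grid of population sizes:

```python
from math import log, factorial

marked, recaptured = 100, 100
tags_observed = [5, 3, 4]            # tag recoveries from the three surveys

def total_log_likelihood(N):
    r = marked * recaptured / N      # expected recoveries, the same in each survey
    return sum(t * log(r) - r - log(factorial(t)) for t in tags_observed)

candidates = range(1000, 5001, 100)
mle = max(candidates, key=total_log_likelihood)
# Peak where r = mean(5, 3, 4) = 4, i.e. N = 10000 / 4 = 2500
```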

The likelihood profile

- Fix the parameter of interest at discrete values and find the maximum likelihood by searching over all other parameters
- In the bad old days when people reported confidence intervals, you could use the likelihood profile to calculate a confidence interval
- Add demo from logistic model using macro
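A minimal sketch of a profile: fix mu at grid values of a normal model and, for each, maximize the likelihood over sigma, which has the closed-form conditional MLE sigma² = mean squared deviation. The data and grid here are made up for illustration.

```python
from math import log, pi

data = [2.1, 2.9, 3.4, 3.8, 4.6]     # hypothetical observations
n = len(data)

def profile_loglik(mu):
    """Log likelihood at mu after maximizing over sigma:
    the conditional MLE is sigma^2 = mean((x - mu)**2)."""
    s2 = sum((x - mu)**2 for x in data) / n
    return -0.5 * n * (log(2 * pi * s2) + 1)

grid = [2.0 + 0.1 * i for i in range(21)]          # fix mu at discrete values
profile = [(mu, profile_loglik(mu)) for mu in grid]
best_mu = max(profile, key=lambda t: t[1])[0]      # peaks near the sample mean
```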

The concept of support

- Edwards 1972, Likelihood
- Think of the relative likelihood as the amount of support the data offer for the hypothesis

The lognormal distribution

Lindley, D.V. 1965. Introduction to Probability and Statistics from a Bayesian Viewpoint. Part 1: Probability. Cambridge University Press. 259 p. Lognormal distribution: page 143.

Readings on robustness and contradictory data

- Robustness: Numerical Recipes, p. 539
- Contradictory data: Schnute, J. T. and R. Hilborn. 1993. Analysis of contradictory data sources in fish stock assessment. Canadian Journal of Fisheries and Aquatic Sciences 50: 1916-1923

Robustness

- In the real world, assumptions are not always met
- For instance, data may be mis-recorded, the wrong animal may be measured, the instrument may have failed, or some major assumption may have been wrong
- Outliers exist


What is c?

Contaminated data

Fit with robust estimation

Demonstrate robustness in Excel

- likelihood lecture workbook.xls
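One standard robust choice, in the spirit of the contaminated-normal model these slides reference (where c inflates the spread of the contaminating component): each point is "good" with probability 1 - eps and comes from a c-times-wider normal otherwise. The eps and c values below are illustrative, not taken from the lecture.

```python
from math import exp, log, pi, sqrt

def normal_pdf(x, mu, sd):
    return exp(-0.5 * ((x - mu) / sd)**2) / (sd * sqrt(2 * pi))

def robust_loglik(data, mu, sd, eps=0.05, c=10.0):
    """Contaminated-normal log likelihood: with probability 1-eps a point is
    N(mu, sd); with probability eps it is N(mu, c*sd). Outliers are then
    merely improbable, not impossible, so they no longer dominate the fit."""
    return sum(log((1 - eps) * normal_pdf(x, mu, sd)
                   + eps * normal_pdf(x, mu, c * sd)) for x in data)

clean = [1.0, 1.2, 0.8, 1.1]         # hypothetical well-behaved observations
contaminated = clean + [25.0]        # one gross outlier
```

Under the plain normal likelihood, the single outlier contributes a ruinous penalty of order -((25 - mu) / sd)² / 2; under the mixture it costs only a bounded amount.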

Contradictory data

- We often have two independent measures of something that disagree
- The problem here is not that an individual data point is contaminated, but that the data set isn't measuring what we hope

The infamous northern cod

What they say about r

Likelihoods for contradictory data

Combined likelihood

Challenges in likelihood

- All probability statements are based on the assumptions of the models
- We normally do not admit that either the data are contaminated or the data sets are not reflecting what we think they are
- Thus we almost certainly overestimate the confidence in our analysis