Poisson race distribution
Latest revision as of 10:33, 24 December 2018
Definition
Let [ilmath]X\sim[/ilmath][ilmath]\text{Poi} [/ilmath][ilmath](\lambda_1)[/ilmath] and [ilmath]Y\sim\text{Poi}(\lambda_2)[/ilmath] be given random variables (that are independent), then define a new random variable:
- [ilmath]Z:\eq X-Y[/ilmath]
We[1] call this a "Poisson race", as in some sense they are racing: if [ilmath]Z>0[/ilmath] then [ilmath]X[/ilmath] is winning, if [ilmath]Z\eq 0[/ilmath] they are neck and neck, and if [ilmath]Z<0[/ilmath] then [ilmath]Y[/ilmath] is winning.
- Caveat:Remember that Poisson measures stuff per unit thing. For example, suppose we are talking about "defects per mile of track", with [ilmath]\lambda_1[/ilmath] being the defects per mile of the left rail (defined somehow) and [ilmath]\lambda_2[/ilmath] the defects per mile of the right rail.
- If there are 5 on the left and 3 on the right then [ilmath]Z[/ilmath] for that mile was [ilmath]+2[/ilmath]. The next mile is independent and doesn't start off with this bias: [ilmath]Z\eq 0[/ilmath] for the next mile does not mean the right rail caught up with 2 more than the left.
- So this doesn't model an ongoing race.
Descriptions
- for [ilmath]k\ge 0[/ilmath] and [ilmath]k\in\mathbb{Z} [/ilmath] we have:
- Claim 1: [math]\P{Z\eq k}\eq \lambda_1^k e^{-\lambda_1-\lambda_2} \sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\lambda_2)^i}{i!(i+k)!} [/math][Note 1], which may be written:
- [math]\P{Z\eq k}\eq \lambda_1^k e^{-\lambda_1-\lambda_2}\sum_{i\in\mathbb{N}_0}\left( \frac{(\lambda_1\lambda_2)^i}{(i!)^2}\cdot\frac{i!}{(i+k)!}\right) [/math] - this has some terms that look a bit like a (squared) Poisson term; perhaps it leads to a closed form.
- for [ilmath]k\le 0[/ilmath] and [ilmath]k\in\mathbb{Z} [/ilmath] we can re-use the above result with [ilmath]X[/ilmath] and [ilmath]Y[/ilmath] flipped:
- Can't find where I've written it down, will not guess it
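Claim 1 is easy to check numerically. Below is a minimal Python sketch (the name `poisson_race_pmf`, the 60-term truncation, and the example rates are my own choices, not from the source); for [ilmath]k<0[/ilmath] it re-uses the formula with [ilmath]X[/ilmath] and [ilmath]Y[/ilmath] flipped, as described above:

```python
import math

def poisson_race_pmf(k, lam1, lam2, terms=60):
    """P(Z = k) for Z = X - Y with X ~ Poi(lam1), Y ~ Poi(lam2) independent.

    Evaluates Claim 1: lam1^k e^(-lam1-lam2) * sum_i (lam1*lam2)^i / (i! (i+k)!),
    truncated to `terms` terms; for k < 0 it applies the formula with X, Y flipped.
    """
    if k < 0:
        return poisson_race_pmf(-k, lam2, lam1, terms)
    term = 1.0 / math.factorial(k)            # i = 0 term of the sum
    total = term
    for i in range(1, terms):
        term *= (lam1 * lam2) / (i * (i + k))  # ratio of consecutive terms
        total += term
    return lam1 ** k * math.exp(-lam1 - lam2) * total

# Sanity checks: probabilities sum to 1, and E[Z] = lam1 - lam2
lam1, lam2 = 2.0, 3.0
pmf = {k: poisson_race_pmf(k, lam1, lam2) for k in range(-40, 41)}
print(abs(sum(pmf.values()) - 1.0) < 1e-9)                             # True
print(abs(sum(k * p for k, p in pmf.items()) - (lam1 - lam2)) < 1e-9)  # True
```

The incremental update of `term` avoids computing large factorials inside the loop; the truncation is harmless here because the summand decays roughly like [ilmath]1/(i!)^2[/ilmath].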
Proof of claims
The message provided is:
Page 384 document spans consecutive numbers, squared, A4, no holes
STILL TODO: the [ilmath]k<0[/ilmath] case - method outlined already
Values of [ilmath]X[/ilmath] and [ilmath]Y[/ilmath] compatible with [ilmath]Z\eq k[/ilmath], for a handful of values:

| [ilmath]k[/ilmath] | [ilmath]X[/ilmath] | [ilmath]Y[/ilmath] |
|---|---|---|
| [ilmath]0[/ilmath] | [ilmath]X\eq 0[/ilmath] | [ilmath]Y\eq 0[/ilmath] |
| | [ilmath]X\eq 1[/ilmath] | [ilmath]Y\eq 1[/ilmath] |
| | [ilmath]X\eq 2[/ilmath] | [ilmath]Y\eq 2[/ilmath] |
| | [ilmath]\vdots[/ilmath] | [ilmath]\vdots[/ilmath] |
| | [ilmath]X\eq i[/ilmath] | [ilmath]Y\eq i[/ilmath] |
| [ilmath]1[/ilmath] | [ilmath]X\eq 1[/ilmath] | [ilmath]Y\eq 0[/ilmath] |
| | [ilmath]X\eq 2[/ilmath] | [ilmath]Y\eq 1[/ilmath] |
| | [ilmath]X\eq 3[/ilmath] | [ilmath]Y\eq 2[/ilmath] |
| | [ilmath]\vdots[/ilmath] | [ilmath]\vdots[/ilmath] |
| | [ilmath]X\eq i+1[/ilmath] | [ilmath]Y\eq i[/ilmath] |
| [ilmath]2[/ilmath] | [ilmath]X\eq 2[/ilmath] | [ilmath]Y\eq 0[/ilmath] |
| | [ilmath]X\eq 3[/ilmath] | [ilmath]Y\eq 1[/ilmath] |
| | [ilmath]X\eq 4[/ilmath] | [ilmath]Y\eq 2[/ilmath] |
| | [ilmath]\vdots[/ilmath] | [ilmath]\vdots[/ilmath] |
| | [ilmath]X\eq i+2[/ilmath] | [ilmath]Y\eq i[/ilmath] |
| [ilmath]\cdots[/ilmath] | [ilmath]\cdots[/ilmath] | [ilmath]\cdots[/ilmath] |
| [ilmath]j[/ilmath] | [ilmath]X\eq j[/ilmath] | [ilmath]Y\eq 0[/ilmath] |
| | [ilmath]X\eq j+1[/ilmath] | [ilmath]Y\eq 1[/ilmath] |
| | [ilmath]X\eq j+2[/ilmath] | [ilmath]Y\eq 2[/ilmath] |
| | [ilmath]\vdots[/ilmath] | [ilmath]\vdots[/ilmath] |
| | [ilmath]X\eq i+j[/ilmath] | [ilmath]Y\eq i[/ilmath] |
Finding [ilmath]\P{Z\eq k} [/ilmath] for [ilmath]k\in\mathbb{N}_0[/ilmath]
TODO: Cover why we split the cases (from below: "Whereas: if we have [ilmath]k<0[/ilmath] then [ilmath]Y>X[/ilmath] so if we have [ilmath]Y\eq 0[/ilmath] then [ilmath]X< 0[/ilmath] follows, which we can't do, so we must sum the other way around") - good start
- Let [ilmath]k\in\mathbb{N}_0[/ilmath] be given
- Suppose [ilmath]k\ge 0[/ilmath]
- Then [ilmath]X-Y\eq k[/ilmath] giving [ilmath]X\eq Y+k[/ilmath]
- Specifically, [ilmath]k\ge 0[/ilmath] means [ilmath]X-Y\ge 0[/ilmath] so [ilmath]X\ge Y[/ilmath], as shown for a handful of values in the table above
- As Poisson is only defined for values in [ilmath]\mathbb{N}_0[/ilmath] we can start with [ilmath]Y\eq 0[/ilmath] and [ilmath]X[/ilmath] will be [ilmath]\ge 0[/ilmath] too
- Whereas: if we have [ilmath]k<0[/ilmath] then [ilmath]Y>X[/ilmath] so if we have [ilmath]Y\eq 0[/ilmath] then [ilmath]X< 0[/ilmath] follows, which we can't do, so we must sum the other way around if [ilmath]k<0[/ilmath]. This is why we analyse [ilmath]\P{Z\eq k} [/ilmath] as two cases!
- So we see that if [ilmath]Y[/ilmath] takes any value in [ilmath]\mathbb{N}_0[/ilmath] that [ilmath]X\eq Y+k[/ilmath] means [ilmath]X\in\mathbb{N}_0[/ilmath] too, the key point
- Thus we want to sum over [ilmath]i\in\mathbb{N}_0[/ilmath] of [ilmath]\P{Y\eq i\wedge X\eq i+k} [/ilmath] as for these cases we see [ilmath]X-Y\eq i+k-i\eq k[/ilmath] as required
- So: [math]\P{Z\eq k}:\eq\sum_{i\in\mathbb{N}_0} \P{Y\eq i}\cdot\Pcond{X\eq i+k}{Y\eq i} [/math]
- As [ilmath]X[/ilmath] and [ilmath]Y[/ilmath] are independent distributions we know [ilmath]\Pcond{X\eq u}{Y\eq v}\eq\P{X\eq u} [/ilmath], so we see:
- [math]\P{Z\eq k}\eq\sum_{i\in\mathbb{N}_0} \P{Y\eq i}\cdot\P{X\eq i+k} [/math]
- Then [math]\P{Z\eq k}\eq\sum_{i\in\mathbb{N}_0}{\Bigg[ \underbrace{e^{-\lambda_2}\cdot\frac{\lambda_2^i}{i!} }_{\P{Y\eq i} }\cdot\underbrace{e^{-\lambda_1}\cdot\frac{\lambda_1^{i+k} }{(i+k)!} }_{\P{X\eq i+k} }\Bigg]} [/math] - by definition of the Poisson distribution
- [math]\eq e^{-\lambda_1-\lambda_2}\sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\cdot\lambda_2)^i}{i!}\cdot\frac{\lambda_1^k}{(i+k)!} [/math]
- Notice that [ilmath](i+k)!\eq i!\big((i+1)(i+2)\cdots(i+k)\big)\eq i!\prod_{j\eq 1}^{k}(i+j)[/ilmath], so [math]\frac{1}{(i+k)!}\eq\frac{i!}{i!\,(i+k)!} [/math] with [math]\frac{i!}{(i+k)!}\eq\frac{1}{\prod_{j\eq 1}^{k}(i+j)} [/math]
- So [math]\P{Z\eq k} \eq e^{-\lambda_1-\lambda_2}\sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\cdot\lambda_2)^i\lambda_1^k}{i!}\cdot\frac{i!}{i!(i+k)!} [/math]
- Finally giving:
- [math]\P{Z\eq k} \eq e^{-\lambda_1-\lambda_2}\sum_{i\in\mathbb{N}_0}\lambda_1^k\cdot\frac{(\lambda_1\cdot\lambda_2)^i}{i!i!}\cdot\frac{i!}{(i+k)!} [/math] or
- this is perhaps better written as: [math]\P{Z\eq k} \eq e^{-\lambda_1-\lambda_2}\sum_{i\in\mathbb{N}_0}{\left[\lambda_1^k\cdot\frac{(\lambda_1\cdot\lambda_2)^i}{(i!)^2}\cdot\frac{1}{\prod_{j\eq 1}^{k}(i+j)}\right]} [/math]
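The derivation can be cross-checked numerically: the defining sum [ilmath]\sum_i\P{Y\eq i}\cdot\P{X\eq i+k} [/ilmath] and the boxed series should agree. A Python sketch with illustrative values (the helper `poi_pmf`, the rates and the choice [ilmath]k\eq 4[/ilmath] are my own, not from the source):

```python
import math

def poi_pmf(n, lam):
    """Poisson pmf: e^(-lam) * lam^n / n!"""
    return math.exp(-lam) * lam ** n / math.factorial(n)

lam1, lam2, k = 2.0, 3.0, 4

# Starting point of the derivation: P(Z=k) = sum_i P(Y=i) P(X=i+k)
direct = sum(poi_pmf(i, lam2) * poi_pmf(i + k, lam1) for i in range(60))

# Boxed result: lam1^k e^(-lam1-lam2) sum_i (lam1*lam2)^i / ((i!)^2 prod_{j=1..k}(i+j))
series = sum(
    (lam1 * lam2) ** i
    / (math.factorial(i) ** 2 * math.prod(range(i + 1, i + k + 1)))
    for i in range(60)
) * lam1 ** k * math.exp(-lam1 - lam2)

print(abs(direct - series) < 1e-15)  # True: the two expressions agree
```

Note that `math.prod(range(i + 1, i + k + 1))` is exactly [ilmath]\prod_{j\eq 1}^{k}(i+j)[/ilmath], and that for [ilmath]k\eq 0[/ilmath] the empty product correctly evaluates to 1.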
NOTES for [ilmath]k<0[/ilmath]
Let [ilmath]k':\eq -k[/ilmath] so it's positive and note that:
- [ilmath]X-Y\eq k[/ilmath] means [ilmath]Y-X\eq -k\eq k'[/ilmath]
- Importantly [ilmath]Y-X\eq k'[/ilmath] for [ilmath]k' >0[/ilmath] (as [ilmath]k<0[/ilmath] we see [ilmath]k':\eq -k > 0[/ilmath])
- Let [ilmath]X':\eq Y[/ilmath] and [ilmath]Y':\eq X[/ilmath] then we have:
- [ilmath]X'-Y'\eq k'[/ilmath] for [ilmath]k'>0[/ilmath] and both [ilmath]X'[/ilmath] and [ilmath]Y'[/ilmath] as Poisson distributions with given rates
- We can use the [ilmath]k\ge 0[/ilmath] formula for this as [ilmath]k'>0[/ilmath], so it is certainly [ilmath]\ge 0[/ilmath]
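This swap can be verified numerically against the defining sum. A sketch (the helper names, rates and the choice [ilmath]k\eq -3[/ilmath] are my own illustrative choices):

```python
import math

def poi_pmf(n, lam):
    """Poisson pmf: e^(-lam) * lam^n / n!"""
    return math.exp(-lam) * lam ** n / math.factorial(n)

def claim1(k, lam1, lam2):
    """The k >= 0 series from Claim 1, truncated to 60 terms."""
    assert k >= 0
    return lam1 ** k * math.exp(-lam1 - lam2) * sum(
        (lam1 * lam2) ** i / (math.factorial(i) * math.factorial(i + k))
        for i in range(60)
    )

lam1, lam2, k = 2.0, 3.0, -3          # a k < 0 case
# Direct: P(X - Y = k) = sum_i P(X=i) P(Y=i-k), summing over the values X can take
direct = sum(poi_pmf(i, lam1) * poi_pmf(i - k, lam2) for i in range(60))
# Swap: apply Claim 1 to X' := Y, Y' := X with k' := -k
swapped = claim1(-k, lam2, lam1)
print(abs(direct - swapped) < 1e-15)  # True
```

Swapping both the roles of the rates and the sign of [ilmath]k[/ilmath] is essential; swapping only one of the two gives a different distribution unless [ilmath]\lambda_1\eq\lambda_2[/ilmath].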
Notes
- ↑ Note that [ilmath]\sum_{i\in\mathbb{N}_0}a_i[/ilmath] means [ilmath]\sum^\infty_{i\eq 0}a_i[/ilmath] - see Notes:Infinity notation
References
- ↑ I've heard this somewhere before, I can't remember where from - I've had a quick search, while not established it's in the right area