Poisson race distribution

\newcommand{\P}[2][]{\mathbb{P}#1{\left[{#2}\right]} }
\newcommand{\Pcond}[3][]{\mathbb{P}#1{\left[{#2}\!\ \middle\vert\!\ {#3}\right]} }
\newcommand{\l}[1]{\lambda#1}
\newcommand{\eq}{=}

Definition

Let X\sim\text{Poi}(\lambda_1) and Y\sim\text{Poi}(\lambda_2) be given random variables (that are independent), and define a new random variable:

  • Z:\eq X-Y

We[1] call this a "Poisson race" as, in some sense, they are racing: if Z>0 then X is winning, if Z\eq 0 they are neck and neck, and if Z<0 then Y is winning.

  • Caveat: Remember that Poisson measures stuff per unit thing, meaning, for example, that we might be talking about "defects per mile of track", with \lambda_1 being the defects per mile of the left rail (defined somehow) and \lambda_2 the defects per mile of the right rail.
    • If there are 5 defects on the left and 3 on the right then Z for that mile was +2. The next mile is independent and doesn't start off with this bias; that is, Z\eq 0 for the next mile does not mean the right rail "caught up" by 2 more than the left over that mile.
    • So this doesn't model an ongoing race; a small simulation sketch of this per-unit setup follows.
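
A minimal simulation sketch of this setup in Python, assuming example rates lambda_1 = 2.0 and lambda_2 = 1.5 (arbitrary choices, as are the sample size and seed): each simulated "mile" gets its own independent pair (X, Y), in line with the caveat above.

# Minimal sketch: simulate many independent "miles", each with its own
# X ~ Poi(lambda_1) and Y ~ Poi(lambda_2), and look at Z = X - Y per mile.
# The rates 2.0 and 1.5, the sample size and the seed are arbitrary example values.
import numpy as np

rng = np.random.default_rng(0)
lambda_1, lambda_2 = 2.0, 1.5      # example rates: defects per mile, left and right rail
n_miles = 100_000

X = rng.poisson(lambda_1, size=n_miles)   # left-rail defect counts, one per mile
Y = rng.poisson(lambda_2, size=n_miles)   # right-rail defect counts, independent of X
Z = X - Y                                 # the "race" outcome for each mile

print("mean of Z:", Z.mean(), "(lambda_1 - lambda_2 =", lambda_1 - lambda_2, ")")
print("P(Z > 0) ~", (Z > 0).mean(), " P(Z = 0) ~", (Z == 0).mean(), " P(Z < 0) ~", (Z < 0).mean())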

Descriptions

  • for k\ge 0 and k\in\mathbb{Z} we have:
    • Claim 1: \P{Z\eq k}\eq \lambda_1^k e^{-\lambda_1-\lambda_2} \sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\lambda_2)^i}{i!(i+k)!} [Note 1], which may be written:
      • \P{Z\eq k}\eq \lambda_1^k e^{-\lambda_1-\lambda_2}\sum_{i\in\mathbb{N}_0}\left( \frac{(\lambda_1\lambda_2)^i}{(i!)^2}\cdot\frac{i!}{(i+k)!}\right) - this has some terms that look a bit like a Poisson term (squared) - perhaps it leads to a closed form. (A numerical check of Claim 1 is sketched just after this list.)
  • for k\le 0 and k\in\mathbb{Z} we can re-use the above result with X and Y flipped:
    • The written-down form couldn't be found, so it is not guessed here; by the symmetry argument in the notes for k<0 below it is Claim 1 with \lambda_1 and \lambda_2 swapped and k replaced by -k.
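
A quick sanity-check sketch for Claim 1, comparing the (truncated) series against a Monte Carlo estimate of \P{Z\eq k}; the rates, the 60-term truncation, the sample size and the name claim1_pmf are all arbitrary, illustrative choices.

# Sketch: compare Claim 1's series for P(Z = k), k >= 0, with a Monte Carlo estimate.
# The rates, the truncation point (60 terms) and the sample size are arbitrary choices.
import math
import numpy as np

def claim1_pmf(k, lam1, lam2, terms=60):
    # Claim 1 for k >= 0: lam1^k * e^{-lam1-lam2} * sum_i (lam1*lam2)^i / (i! * (i+k)!)
    s = sum((lam1 * lam2) ** i / (math.factorial(i) * math.factorial(i + k)) for i in range(terms))
    return lam1 ** k * math.exp(-lam1 - lam2) * s

rng = np.random.default_rng(1)
lam1, lam2, n = 2.0, 1.5, 200_000
Z = rng.poisson(lam1, n) - rng.poisson(lam2, n)

for k in range(4):
    print(k, claim1_pmf(k, lam1, lam2), (Z == k).mean())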

Proof of claims


STILL TODO:

  • The k<0 case! - the method is outlined in the notes for k<0 below.
Table: values of X and Y for which X-Y\eq k, shown for a handful of k\ge 0 (these pairs are also enumerated programmatically in the sketch below):

  k    X           Y
  0    X\eq 0      Y\eq 0
       X\eq 1      Y\eq 1
       X\eq 2      Y\eq 2
       \vdots      \vdots
       X\eq i      Y\eq i
  1    X\eq 1      Y\eq 0
       X\eq 2      Y\eq 1
       X\eq 3      Y\eq 2
       \vdots      \vdots
       X\eq i+1    Y\eq i
  2    X\eq 2      Y\eq 0
       X\eq 3      Y\eq 1
       X\eq 4      Y\eq 2
       \vdots      \vdots
       X\eq i+2    Y\eq i
  \vdots
  j    X\eq j      Y\eq 0
       X\eq j+1    Y\eq 1
       X\eq j+2    Y\eq 2
       \vdots      \vdots
       X\eq i+j    Y\eq i
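
A tiny sketch that reproduces the pattern of the table above by listing, for each k, the first few pairs (X, Y) = (Y + k, Y); the cut-off of five pairs per value of k is an arbitrary choice.

# Sketch: list the (X, Y) pairs with X - Y = k, mirroring the table above.
# Only the first five pairs per value of k are shown; the real list is infinite.
for k in (0, 1, 2):
    pairs = [(y + k, y) for y in range(5)]          # X = Y + k for Y = 0, 1, 2, ...
    print(f"k = {k}:", ", ".join(f"(X={x}, Y={y})" for x, y in pairs) + ", ...")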

Finding \P{Z\eq k} for k\in\mathbb{N}_0


TODO: Cover why we split into the cases k\ge 0 and k<0 more fully. The remark from the proof below is a good start: if k<0 then Y>X, so Y\eq 0 would force X<0, which is impossible, so we must sum the other way around.


  • Let k\in\mathbb{N}_0 be given
    • Suppose k\ge 0
      • Then X-Y\eq k giving X\eq Y+k
      • Specifically, k\ge 0 means X-Y\ge 0, so X\ge Y, and so forth, as shown for a handful of values in the table above
        • As a Poisson random variable only takes values in \mathbb{N}_0 we can start with Y\eq 0 and X\eq Y+k will be \ge 0 too
          • Whereas if k<0 then Y>X, so Y\eq 0 would force X<0, which is impossible, so we must sum the other way around if k<0. This is why we analyse \P{Z\eq k} as two cases!
      • So we see that if Y takes any value in \mathbb{N}_0 then X\eq Y+k is in \mathbb{N}_0 too - this is the key point
      • Thus we want to sum \P{Y\eq i\wedge X\eq i+k} over i\in\mathbb{N}_0, as for each such term X-Y\eq i+k-i\eq k as required
        • So: \P{Z\eq k}\eq\sum_{i\in\mathbb{N}_0} \P{Y\eq i}\cdot\Pcond{X\eq i+k}{Y\eq i}
          • As X and Y are independent random variables we know \Pcond{X\eq u}{Y\eq v}\eq\P{X\eq u}, so we see:
            • \P{Z\eq k}\eq\sum_{i\in\mathbb{N}_0} \P{Y\eq i}\cdot\P{X\eq i+k}
          • Then \P{Z\eq k}\eq\sum_{i\in\mathbb{N}_0}{\Bigg[ \underbrace{e^{-\l{_2} }\cdot\frac{\l{_2}^i}{i!} }_{\P{Y\eq i} }\cdot\underbrace{e^{-\l{_1} }\cdot\frac{\l{_1}^{i+k} }{(i+k)!} }_{\P{X\eq i+k} }\Bigg]} - by definition of the Poisson distribution
            \eq e^{-\lambda_1-\l{_2} }\sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\cdot\lambda_2)^i}{i!}\cdot\frac{\lambda_1^k}{(i+k)!}
            • Notice that (i+k)!\eq i!\big((i+1)(i+2)\cdots(i+k)\big), so \frac{1}{(i+k)!}\eq \frac{1}{i!}\cdot\frac{1}{(i+1)(i+2)\cdots(i+k)}\eq \frac{i!}{i!(i+k)!}
          • So \P{Z\eq k} \eq e^{-\lambda_1-\l{_2} }\sum_{i\in\mathbb{N}_0}\frac{(\lambda_1\cdot\lambda_2)^i\lambda_1^k}{i!}\cdot\frac{i!}{i!(i+k)!}
            • Finally giving:
              • \P{Z\eq k} \eq e^{-\lambda_1-\l{_2} }\sum_{i\in\mathbb{N}_0}\lambda_1^k\cdot\frac{(\lambda_1\cdot\lambda_2)^i}{i!i!}\cdot\frac{i!}{(i+k)!} or
              • it is perhaps better written as:
                \P{Z\eq k} \eq e^{-\lambda_1-\l{_2} }\sum_{i\in\mathbb{N}_0}{\left[\lambda_1^k\cdot\frac{(\lambda_1\cdot\lambda_2)^i}{(i!)^2}\cdot\frac{1}{\prod_{j\eq 1}^{k}(i+j)}\right]} - a numerical check of this derivation is sketched just below
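
As a numerical check of the derivation (a sketch under assumed example rates; the function names and the 80-term truncation are illustrative choices only), the final product form can be compared with the direct sum \sum_{i\in\mathbb{N}_0}\P{Y\eq i}\cdot\P{X\eq i+k} that the proof starts from.

# Sketch: check that the final product form agrees with the direct sum
# sum_i P(Y = i) * P(X = i + k).  Truncation at 80 terms and the rates are arbitrary.
import math

def poisson_pmf(n, lam):
    # P(N = n) for N ~ Poi(lam)
    return math.exp(-lam) * lam ** n / math.factorial(n)

def product_form(k, lam1, lam2, terms=80):
    # e^{-lam1-lam2} * sum_i [ lam1^k * (lam1*lam2)^i / (i!)^2 * 1 / prod_{j=1..k}(i+j) ]
    total = 0.0
    for i in range(terms):
        prod = math.prod(i + j for j in range(1, k + 1))   # empty product is 1 when k = 0
        total += lam1 ** k * (lam1 * lam2) ** i / math.factorial(i) ** 2 / prod
    return math.exp(-lam1 - lam2) * total

def direct_sum(k, lam1, lam2, terms=80):
    # sum_i P(Y = i) * P(X = i + k), the expression the proof starts from
    return sum(poisson_pmf(i, lam2) * poisson_pmf(i + k, lam1) for i in range(terms))

lam1, lam2 = 2.0, 1.5
for k in range(4):
    print(k, product_form(k, lam1, lam2), direct_sum(k, lam1, lam2))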

NOTES for k<0

Let k':\eq -k so it's positive and note that:

  • X-Y\eq k means Y-X\eq -k\eq k'
    • Importantly Y-X\eq k' for k' >0 (as k<0 we see k':\eq -k > 0)
      • Let X':\eq Y and Y':\eq X then we have:
        • X'-Y'\eq k' for k'>0, with X'\sim\text{Poi}(\lambda_2) and Y'\sim\text{Poi}(\lambda_1) both Poisson and still independent
          • We can use the k\ge 0 formula for this, as k'>0 is certainly \ge 0 (a sketch of this reduction is given below)
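
A minimal sketch of this reduction: for k<0, swap the rates and apply the k\ge 0 series to -k, then compare with a Monte Carlo estimate. The helper names, rates, truncation and seed are illustrative assumptions.

# Sketch of the reduction for k < 0: P(X - Y = k) = P(Y - X = -k), which is the
# k >= 0 case with the roles (and hence the rates) of X and Y swapped.
import math
import numpy as np

def race_pmf_nonneg(k, lam1, lam2, terms=60):
    # Claim 1: P(X - Y = k) for k >= 0, with X ~ Poi(lam1), Y ~ Poi(lam2)
    s = sum((lam1 * lam2) ** i / (math.factorial(i) * math.factorial(i + k)) for i in range(terms))
    return lam1 ** k * math.exp(-lam1 - lam2) * s

def race_pmf(k, lam1, lam2):
    # For k < 0: P(X - Y = k) = P(Y - X = -k), i.e. the k >= 0 case with the rates swapped.
    return race_pmf_nonneg(k, lam1, lam2) if k >= 0 else race_pmf_nonneg(-k, lam2, lam1)

rng = np.random.default_rng(2)
lam1, lam2, n = 2.0, 1.5, 200_000
Z = rng.poisson(lam1, n) - rng.poisson(lam2, n)

for k in (-3, -2, -1, 0, 1):
    print(k, race_pmf(k, lam1, lam2), (Z == k).mean())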

Notes

  1. Note that \sum_{i\in\mathbb{N}_0}a_i means \sum^\infty_{i\eq 0}a_i - see Notes:Infinity notation

References

  1. I've heard this somewhere before, I can't remember where from - I've had a quick search; while not established, it's in the right area