Alec's sample mean bound
Stub grade: A*
This page is a stub
This page is a stub, so it contains little or minimal information and is on a to-do list for being expanded. The message provided is:
Needs some work, like what is a random variable for which expectation and variance are defined? Can we have complex or vector ones for example?
Notice
It appears that this is actually three inequalities in one, which we shall name as follows (a worked instance of the first two is given after the list):
- Alec's remaining probability bound - that for [ilmath]X[/ilmath] a real, non-negative random variable, [math]\forall \alpha\in\mathbb{R}_{>0}\left[\mathbb{P}\left[X\ge \alpha\right]\le\frac{\mathbb{E}[X]}{\alpha}\right][/math]
- Alec's deviation probability bound - that for a real (possibly negative) random variable [ilmath]X[/ilmath], [math]\forall\beta\in\mathbb{R}_{>0}\left[\mathbb{P}\Big[\big\vert X-\mathbb{E}[X]\big\vert\ge\beta\Big]\le\frac{\text{Var}(X)}{\beta^2}\right][/math]
- Alec's sample mean bound (this page)
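As a worked instance of the first two bounds (the numbers here are illustrative and not from the original page), take a non-negative random variable [ilmath]X[/ilmath] with [ilmath]\mathbb{E}[X]=2[/ilmath] and [ilmath]\text{Var}(X)=1[/ilmath]; then:
- [math]\mathbb{P}\left[X\ge 10\right]\le\frac{\mathbb{E}[X]}{10}=\frac{1}{5}[/math] (the first bound with [ilmath]\alpha=10[/ilmath])
- [math]\mathbb{P}\Big[\big\vert X-\mathbb{E}[X]\big\vert\ge 3\Big]\le\frac{\text{Var}(X)}{3^2}=\frac{1}{9}[/math] (the second bound with [ilmath]\beta=3[/ilmath])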
Inequality
Let [ilmath]X_1,\ldots,X_n[/ilmath] be a collection of [ilmath]n[/ilmath] random variables which are pairwise independent, such that:
- [ilmath]\exists\mu\forall i\in\{1,\ldots,n\}\big[\mathbb{E}[X_i]=\mu\big][/ilmath] - all of the [ilmath]X_i[/ilmath] have the same expectation, and
- Alternatively: [ilmath]\forall i,j\in\{1,\ldots,n\}\big[\mathbb{E}[X_i]=\mathbb{E}[X_j]\big][/ilmath], but note that [ilmath]\mu[/ilmath] is still needed to state the bound
- [ilmath]\exists\sigma\forall i\in\{1,\ldots,n\}\big[\text{Var}(X_i)=\sigma^2\big][/ilmath] - all the [ilmath]X_i[/ilmath] have the same variance
- Alternatively: [ilmath]\forall i,j\in\{1,\ldots,n\}\big[\text{Var}(X_i)=\text{Var}(X_j)\big][/ilmath], but note again that [ilmath]\sigma[/ilmath] is still needed to state the bound
Then
- For all [ilmath]\epsilon>0[/ilmath] we have (a derivation sketch follows this list):
- [math]\mathbb{P}\left[\left\vert\frac{\sum^n_{i=1}X_i}{n}-\mu\right\vert<\epsilon\right]\ge 1-\frac{\sigma^2}{\epsilon^2n}[/math]
- Note that the notation here differs slightly from that in my 2011 research journal, but [ilmath]\sigma[/ilmath] and [ilmath]\mu[/ilmath] were present.
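A sketch of why the bound holds (reconstructed from the statements above, not taken from the original page): because the [ilmath]X_i[/ilmath] are pairwise independent with common mean [ilmath]\mu[/ilmath] and variance [ilmath]\sigma^2[/ilmath], the sample mean satisfies
[math]\mathbb{E}\left[\frac{\sum^n_{i=1}X_i}{n}\right]=\mu\qquad\text{and}\qquad\text{Var}\left(\frac{\sum^n_{i=1}X_i}{n}\right)=\frac{1}{n^2}\sum^n_{i=1}\text{Var}(X_i)=\frac{\sigma^2}{n}[/math]
(pairwise independence is enough for the variances to add). Applying Alec's deviation probability bound to the sample mean with [ilmath]\beta=\epsilon[/ilmath] gives
[math]\mathbb{P}\left[\left\vert\frac{\sum^n_{i=1}X_i}{n}-\mu\right\vert\ge\epsilon\right]\le\frac{\sigma^2}{\epsilon^2 n}[/math]
and taking the complementary event yields the stated inequality.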
History of form
When I "discovered" this inequality I was looking to say something like "the chance of the sample mean being within so much of the mean is at least ..."
I didn't know how to handle [ilmath]\vert X-\mathbb{E}[X]\vert[/ilmath] (what we'd now call [ilmath]\text{Mdm}(X)[/ilmath], the mean deviation from the mean), which is why I applied it to the variance instead, and of course [ilmath]\big(X-\mathbb{E}[X]\big)^2\ge 0[/ilmath] - the only condition required for the first inequality.
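A sketch of the step described above (reconstructed, not taken from the original page): applying the first inequality to the non-negative random variable [ilmath]\big(X-\mathbb{E}[X]\big)^2[/ilmath] with [ilmath]\alpha=\beta^2[/ilmath] gives
[math]\mathbb{P}\Big[\big(X-\mathbb{E}[X]\big)^2\ge\beta^2\Big]\le\frac{\mathbb{E}\Big[\big(X-\mathbb{E}[X]\big)^2\Big]}{\beta^2}=\frac{\text{Var}(X)}{\beta^2}[/math]
and since [ilmath]\big(X-\mathbb{E}[X]\big)^2\ge\beta^2\iff\big\vert X-\mathbb{E}[X]\big\vert\ge\beta[/ilmath] (for [ilmath]\beta>0[/ilmath]), this is exactly the deviation probability bound.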