$X_1,\dots,X_n$ i.i.d. discrete random variables, local limit theorem












Let $X_1,\dots,X_n$ be i.i.d. discrete random variables taking values in $\mathbb{Z}$, with non-trivial finite support. Let $S_n = X_1 + \dots + X_n$.



Prove that there exist $0 < C_1 < C_2 < \infty$ such that for all $n \geq 1$:



$$ C_1/\sqrt{n} \;\leq\; \sup_{k \in \mathbb{Z}} \mathbb{P}(S_n = k) \;\leq\; C_2/\sqrt{n}. $$




I know that this can be proved using the central limit theorem, but we haven't seen that theorem in class, so this exercise should be solvable without the CLT.



So far, here are my thoughts:



The $\sqrt{n}$ comes from the fact that the standard deviation of $S_n$ is $\theta\sqrt{n}$, where $\theta$ is the standard deviation of $X_1$. Hence one possible strategy is to study the normalized random variable $\frac{S_n/n - \mu}{\theta/\sqrt{n}}$. Yet from there I didn't manage to find an upper bound on the probability.



For example, using Chebyshev's inequality I get

$$\mathbb{P}\left( \left|\frac{S_n}{n} - \mu\right| \geq \theta \right) \leq \frac{1}{n}.$$

The problem is that this doesn't help: on the complementary event $\left|S_n/n - \mu\right| \leq \theta$, the sum $S_n$ ranges over an interval containing about $2\theta n$ integers, so a pigeonhole argument only yields a lower bound of order $1/n$.
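Not a proof, but a numerical sanity check of the $\sqrt{n}$ order is easy. For the concrete (illustrative) choice of $X_i$ uniform on $\{0,1\}$, so $\mu = \theta = 1/2$, one can compute $\sup_k \mathbb{P}(S_n = k)$ exactly by convolving the step PMF $n$ times:

```python
import math

# Exact computation of sup_k P(S_n = k) by repeated convolution of the PMF.
# Illustration only: X_i uniform on {0, 1}, so mu = theta = 1/2.
def max_point_mass(pmf, n):
    dist = [1.0]                      # distribution of S_0 (point mass at 0)
    for _ in range(n):                # convolve n times with the step PMF
        new = [0.0] * (len(dist) + len(pmf) - 1)
        for i, p in enumerate(dist):
            for j, q in enumerate(pmf):
                new[i + j] += p * q
        dist = new
    return max(dist)

for n in (10, 100, 400):
    m = max_point_mass([0.5, 0.5], n)
    # A local-limit heuristic predicts m * sqrt(n) -> sqrt(2/pi) ≈ 0.80 here.
    print(n, round(m, 5), round(m * math.sqrt(n), 3))
```

The scaled value $m\sqrt{n}$ stabilizes near a constant, consistent with the claimed two-sided $C/\sqrt{n}$ bounds.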



Thank you!










  • You can apply your Markov/Chebyshev inequality if you know the random variables have finite variance (otherwise you might need some type of truncation argument). The inequality you give is useful because there is an order-$\sqrt{n}$ region about the mean in which $S_n$ lands with probability at least $1/2$. And of course $S_n$ must always be an integer. – Michael, Jan 3 at 6:02










  • @Michael Yes, you are right about variance, so it's even harder to prove in the general case (when we don't necessarily have a finite variance). Yet I still don't see how to prove it when we do have a finite variance. – Jebfiffkkfnfolzbd, Jan 3 at 17:01










  • You can try to get just one of the bounds at first (either the lower or the upper). For example, what happens when most of the probability is concentrated on only a few of the options? Say, if probability $p$ is spread over $k$ options, there must be some option with at least... – Michael, Jan 3 at 18:06












  • I don't think this is true without extra conditions such as finite variance. For intuition with continuous variables, if $\{X_i\}$ are i.i.d. Cauchy random variables with PDF $f(x)$, then the average $S_n/n$ is Cauchy with the same PDF. So \begin{align}P[S_n \in [a,a+1]] &= P[S_n/n \in [a/n, a/n + 1/n]]\\ &= \int_{a/n}^{a/n+1/n} f(x)\,dx \\ &\leq \frac{f_{\max}}{n} = O(1/n),\end{align} and likely a "discrete Cauchy-type" random variable would have similar $O(1/n)$ behavior. – Michael, Jan 5 at 17:30








  • @Jebfiffkkfnfolzbd: $X(\Omega)$ is the set of all real numbers that $X$ can take, and it is always a subset of $\mathbb{R}$ (for any random variable $X:\Omega \rightarrow \mathbb{R}$). By "finite support" I think you mean that $X$ can only take values from some finite interval $[a,b]$. This means that it indeed has a finite mean and variance. Since we also know that $X$ is integer-valued, it means that $X$ can take only a finite number of (integer) values. – Michael, Jan 7 at 1:38


















probability probability-theory probability-limit-theorems














asked Jan 2 at 14:45, edited Jan 6 at 10:49 – Jebfiffkkfnfolzbd












1 Answer

























I can prove the lower bound as follows. By Chebyshev's inequality,

$$P(|S_n - n\mu| \geq c n^a \theta) \leq \frac{1}{c^2 n^{2a-1}}.$$

Thus the largest $P(S_n = k)$ is at least $(1 - c^{-2} n^{1-2a})(2 c n^a \theta + 1)^{-1}$, because this quantity is not greater than $P(|S_n - n\mu| \leq c n^a \theta) \cdot \left|[n\mu - c n^a \theta,\, n\mu + c n^a \theta] \cap \mathbb{Z}\right|^{-1}$, which in turn is a lower bound on $\sup_k P(S_n = k)$ by pigeonhole.

Take $a = 1/2$, $c = 2$: this quantity is at least $\frac{3/4}{4\theta\sqrt{n} + 1}$, which proves the lower bound.

Edit: this was done without looking at Michael's comment.
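As a quick numerical check of this bound (illustrative only: assume $X_i$ uniform on $\{0,1\}$, so $\theta = 1/2$ and $\sup_k P(S_n = k)$ is the maximal binomial point mass $\binom{n}{\lfloor n/2 \rfloor}/2^n$):

```python
import math

# Check the lower bound with a = 1/2, c = 2, i.e. (3/4) / (4*theta*sqrt(n) + 1).
# Illustration only: X_i uniform on {0, 1}, so theta = 1/2 and
# sup_k P(S_n = k) = C(n, n//2) / 2^n (the maximal binomial point mass).
theta = 0.5
for n in (10, 100, 1000):
    exact = math.comb(n, n // 2) / 2 ** n           # sup_k P(S_n = k)
    bound = 0.75 / (4 * theta * math.sqrt(n) + 1)   # the proved lower bound
    assert bound <= exact
    print(n, round(bound, 4), round(exact, 4))
```

The bound is loose by roughly a constant factor, as expected from the crude pigeonhole step, but it has the correct $1/\sqrt{n}$ order.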




















answered Jan 10 at 0:34 – Mindlack





























