Analytical approximation for logit-normal-binomial distribution

As I understand it, there is no closed-form expression for

$$f(x, \mu, \sigma) = \int_0^1 p^{x-1}(1-p)^{n-x-1}\exp\left(-\frac{(\text{logit}(p) - \mu)^2}{2\sigma^2}\right)dp.$$

Is it possible to obtain an analytical approximation for this?
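[Editor's note: as a numerical reference point, the integral above can be evaluated directly by quadrature. Substituting $t = \text{logit}(p)$, so that $p = \text{expit}(t)$ and $dp = p(1-p)\,dt$, absorbs one power each of $p$ and $1-p$ and turns it into an integral over the whole real line. A minimal SciPy sketch; the function name `f` and the parameter values are illustrative:]

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import expit  # logistic sigmoid, the inverse of logit

def f(x, n, mu, sigma):
    """Evaluate the integral by quadrature after the substitution t = logit(p).

    The substitution absorbs one power each of p and (1 - p), leaving
    p^x (1-p)^(n-x) exp(-(t - mu)^2 / (2 sigma^2)) integrated over all of R.
    """
    def integrand(t):
        p = expit(t)
        return p**x * (1.0 - p)**(n - x) * np.exp(-(t - mu)**2 / (2.0 * sigma**2))
    val, _err = quad(integrand, -np.inf, np.inf)
    return val
```

For $\mu = 0$ the substitution $t \to -t$ swaps $p$ and $1-p$, so $f(x, n, 0, \sigma) = f(n-x, n, 0, \sigma)$, which makes a convenient sanity check.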












  • Do you mean "asymptotic"? There are numerous analytic approximations to any continuous function.
    – user14717
    Dec 17 '18 at 7:28

  • I suppose there must be some that are more computationally efficient, i.e. reach a small error with a smaller number of terms? I need to insert this as part of a statistical model, so a tradeoff between accuracy and efficiency is needed.
    – zipzapboing
    Dec 17 '18 at 11:19

  • You could use quadrature to evaluate it at many arguments and then do interpolation.
    – user14717
    Dec 17 '18 at 16:12

  • I suppose that can work. I've never attempted something similar, so I had something like a multivariable Taylor series in mind. I guess theoretical guarantees are irrelevant when there's enough computational power to brute-force the approximation and ensure it works over the range of things I care about.
    – zipzapboing
    Dec 17 '18 at 16:23

  • The theoretical guarantees will be much stronger with an interpolator than with a multivariate Taylor series.
    – user14717
    Dec 17 '18 at 16:27
















integration numerical-methods approximation approximate-integration






asked Dec 17 '18 at 5:00









zipzapboing
979












1 Answer
Here's what you need to do:

  • Decide on an interpolator. I suggest a tricubic B-spline, but finding software for this is going to be painful. To understand this interpolant, start with Rainer Kress's Numerical Analysis, which introduces it in 1D; learn about bicubic B-splines in 2D, and then you'll be able to understand the tricubic case. If you don't like tricubic B-splines, you might also be able to use a multivariate Chebyshev series.

  • Interpolators require data at a particular geometry of points. Figure out what those points are for your chosen interpolator, then evaluate the integral by quadrature at each point. (For tricubic B-splines it's easy: a uniform grid.) Tanh-sinh quadrature is probably the best fit for this integral, but Gaussian or Gauss-Kronrod quadrature will also work fine.

Another alternative is to use quadrature to evaluate $f$ at any point $(x, \mu, \sigma)$ and ditch the interpolator entirely. This reduces the speed by a factor of 10 to 100, but since a quadrature takes about 500 ns to 1 $\mu$s, you might not really care.

If you've never done anything like this, get ready for some effort shock.
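[Editor's note: since the Gaussian factor is the natural weight here, the quadrature step can also use a fixed Gauss-Hermite rule. After $t = \text{logit}(p)$, the further substitution $t = \mu + \sqrt{2}\,\sigma u$ maps the Gaussian factor exactly onto the Hermite weight $e^{-u^2}$. A sketch of this idea; the node count and parameters are illustrative, not tuned:]

```python
import numpy as np
from scipy.special import expit  # logistic sigmoid, the inverse of logit

# Nodes and weights for the weight function exp(-u^2) on the real line.
nodes, weights = np.polynomial.hermite.hermgauss(60)

def f_gh(x, n, mu, sigma):
    """Fixed-node Gauss-Hermite approximation of the integral.

    After t = logit(p) and then t = mu + sqrt(2)*sigma*u, the Gaussian
    factor becomes exactly the Hermite weight exp(-u^2).
    """
    t = mu + np.sqrt(2.0) * sigma * nodes
    p = expit(t)
    return np.sqrt(2.0) * sigma * np.sum(weights * p**x * (1.0 - p)**(n - x))
```

Because the node count is fixed, each evaluation is a single vectorized sum, which may already be fast enough to skip the interpolation step entirely.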






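[Editor's note: the tabulate-then-interpolate route above can be prototyped in 2D before committing to the tricubic case: fix $x$ and $n$, tabulate on a uniform $(\mu, \sigma)$ grid, and fit a bicubic spline. A sketch using SciPy's `RectBivariateSpline`; the grid sizes, ranges, and the quadrature helper `f_gh` are arbitrary placeholders:]

```python
import numpy as np
from scipy.special import expit
from scipy.interpolate import RectBivariateSpline

nodes, weights = np.polynomial.hermite.hermgauss(60)

def f_gh(x, n, mu, sigma):
    # Gauss-Hermite evaluation after t = logit(p), t = mu + sqrt(2)*sigma*u.
    t = mu + np.sqrt(2.0) * sigma * nodes
    p = expit(t)
    return np.sqrt(2.0) * sigma * np.sum(weights * p**x * (1.0 - p)**(n - x))

# Tabulate on a uniform (mu, sigma) grid for fixed x and n,
# then fit an interpolating bicubic spline (s=0 by default).
x, n = 3, 10
mus = np.linspace(-4.0, 4.0, 81)
sigmas = np.linspace(0.1, 3.0, 60)
table = np.array([[f_gh(x, n, m, s) for s in sigmas] for m in mus])
spline = RectBivariateSpline(mus, sigmas, table)

approx = spline.ev(0.5, 1.0)   # cheap interpolated value
exact = f_gh(x, n, 0.5, 1.0)   # direct quadrature, for comparison
```

The spline evaluation avoids the quadrature loop at query time, at the cost of one table per $(x, n)$ pair; the full three-parameter version needs the tricubic machinery the answer describes.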
edited Dec 17 '18 at 17:04
answered Dec 17 '18 at 16:57
user14717
3,863




























