Relating posterior to the least square estimator of W























I'm working on an assignment and I'm stuck, so I could really use some help. I'm given that the prior over my parameters $\textbf{W}$ is a Gaussian pdf, and that the likelihood is a Gaussian pdf as well. Without further proof, the posterior may also be taken to be Gaussian.



My expression for the posterior is:



$$p(\textbf{W}\vert \textbf{X},\textbf{T}) = \exp\left(-\frac{1}{2}\textbf{W}^{T}\Sigma_w^{-1}\textbf{W} + \textbf{W}^{T}\Sigma_w^{-1}\textbf{W}_\mu - \frac{1}{2}\textbf{W}_\mu^{T}\Sigma_w^{-1}\textbf{W}_\mu\right)$$
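If I complete the square in the exponent, this is (up to the normalizing constant) just a Gaussian density centred at $\textbf{W}_\mu$:

$$p(\textbf{W}\vert \textbf{X},\textbf{T}) \propto \exp\left(-\frac{1}{2}(\textbf{W}-\textbf{W}_\mu)^{T}\Sigma_w^{-1}(\textbf{W}-\textbf{W}_\mu)\right),$$

i.e. the posterior is $\mathcal{N}(\textbf{W}_\mu,\Sigma_w)$.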



I've derived expressions for the mean $\textbf{W}_\mu$ and for the precision $\Sigma_w^{-1}$ (the inverse covariance), but I don't think they play an important role in what I'm supposed to do here. I think I should use maximum likelihood, but I can't seem to get the calculations right.
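In case it clarifies what I'm trying to show, here is a small numerical sketch of the connection I have in mind. The setup is my own assumption: likelihood $\textbf{T}\sim\mathcal{N}(\textbf{X}\textbf{W},\sigma^2 I)$ and prior $\textbf{W}\sim\mathcal{N}(0,\alpha^{-1}I)$, in which case $\Sigma_w^{-1}=\alpha I+\sigma^{-2}\textbf{X}^{T}\textbf{X}$ and $\textbf{W}_\mu=\sigma^{-2}\Sigma_w\textbf{X}^{T}\textbf{T}$. As the prior precision $\alpha\to 0$, the posterior mean approaches the least squares estimator $(\textbf{X}^{T}\textbf{X})^{-1}\textbf{X}^{T}\textbf{T}$:

```python
import numpy as np

# Sketch under my assumed setup: likelihood T ~ N(X W, sigma^2 I), prior W ~ N(0, alpha^{-1} I),
# so Sigma_w^{-1} = alpha*I + X^T X / sigma^2 and W_mu = Sigma_w X^T T / sigma^2.
rng = np.random.default_rng(0)
n, d = 100, 3
X = rng.normal(size=(n, d))
W_true = np.array([1.0, -2.0, 0.5])
sigma = 0.1
T = X @ W_true + sigma * rng.normal(size=n)

def posterior_mean(X, T, alpha, sigma):
    """Posterior mean W_mu under the assumed Gaussian prior and likelihood."""
    precision = alpha * np.eye(X.shape[1]) + (X.T @ X) / sigma**2  # Sigma_w^{-1}
    return np.linalg.solve(precision, X.T @ T / sigma**2)          # W_mu

W_ls = np.linalg.lstsq(X, T, rcond=None)[0]           # least squares estimator (X^T X)^{-1} X^T T
W_mu = posterior_mean(X, T, alpha=1e-8, sigma=sigma)  # nearly flat prior: alpha -> 0

print("least squares :", W_ls)
print("posterior mean:", W_mu)  # agrees with the least squares estimate as alpha -> 0
```

With a non-negligible $\alpha$ the posterior mean is the regularized (ridge-type) solution instead, which is why I suspect the least squares estimator only appears in a flat-prior or limiting sense.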










normal-distribution maximum-likelihood






asked yesterday, edited yesterday
A.Maine 448


























