Obtaining the gradient of a vector-valued function











I have read that obtaining the gradient of a vector-valued function $f:\mathbb{R}^n \to \mathbb{R}^m$ is the same as obtaining the Jacobian of this function.

Nevertheless, this function has only one argument (the vector $\mathbf{x} \in \mathbb{R}^n$).

How can I take the gradient of a function $F(\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n)$ with respect to some $\mathbf{x}_i$?
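
For reference, the Jacobian referred to in the first sentence collects all the partial derivatives of $f$ into one matrix. A sketch in the numerator-layout convention (an assumption here; some texts use the transpose):

$$J_f(\mathbf{x}) = \begin{pmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{pmatrix} \in \mathbb{R}^{m \times n}.$$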










derivatives vector-spaces gradient-descent






asked Nov 15 at 8:22









The Bosco

  • A gradient of something with an $n$-dimensional input and an $m$-dimensional output will have $n\times m$ entries. In this sense it is a tensor outer product. For example, a 1-tensor in (say, a $3\times 1$ vector) and a 1-tensor out (a $3\times 1$ vector): $1+1 = 2$ indices, so your output becomes a 2-tensor (a matrix). This is common for 3D vector-field functions.
    – mathreadler
    Nov 15 at 9:01












  • I don't know if it is correct to call it a "gradient". It is called, instead, the Jacobian matrix of $f$ at a point. Yes, to obtain the Jacobian matrix with respect to one of the variables, just assume all the others are constants.
    – Masacroso
    Nov 15 at 9:29
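
To make the comment's recipe concrete: if $F$ maps $\mathbb{R}^{d_1} \times \cdots \times \mathbb{R}^{d_n} \to \mathbb{R}^m$ (the block dimensions $d_1, \ldots, d_n$ are assumed here for illustration), then differentiating with respect to $\mathbf{x}_i$ while holding the other arguments constant produces one block of the full Jacobian:

$$\frac{\partial F}{\partial \mathbf{x}_i} \in \mathbb{R}^{m \times d_i}, \qquad J_F = \left( \frac{\partial F}{\partial \mathbf{x}_1} \;\Big|\; \cdots \;\Big|\; \frac{\partial F}{\partial \mathbf{x}_n} \right) \in \mathbb{R}^{m \times (d_1 + \cdots + d_n)}.$$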




















2 Answers
The "gradient" of something usually means taking all partial derivatives. Therefore,

"taking the gradient of a function $F(x_1, x_2, \ldots, x_n)$ with respect to some $x_i$"

is not really a thing. However, you are right that all partial derivatives of a vector-valued function, arranged in an $m\times n$ matrix, are usually referred to as the Jacobian.
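
A minimal numerical sketch of that arrangement, assuming NumPy and central finite differences (the helper name is illustrative, not from the answer):

    import numpy as np

    def numerical_jacobian(f, x, eps=1e-6):
        # Approximate the m x n Jacobian of f: R^n -> R^m at x.
        # Column j holds the central difference in coordinate j.
        x = np.asarray(x, dtype=float)
        m = np.atleast_1d(f(x)).size
        J = np.zeros((m, x.size))
        for j in range(x.size):
            e = np.zeros_like(x)
            e[j] = eps
            J[:, j] = (f(x + e) - f(x - e)) / (2 * eps)
        return J

    # f maps R^2 -> R^3, so its Jacobian is a 3 x 2 matrix.
    f = lambda v: np.array([v[0] * v[1], v[0] + v[1], np.sin(v[0])])
    print(numerical_jacobian(f, [1.0, 2.0]))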






answered Nov 15 at 8:30
maxmilgram
  • This is a homework question. For some specific function $F$ I have to do that.
    – The Bosco
    Nov 15 at 8:33










  • The question is to calculate the gradient? Then just calculate all partial derivatives. If that doesn't help, please provide the full problem description.
    – maxmilgram
    Nov 15 at 8:34


















A gradient of something with an $n$-dimensional input and an $m$-dimensional output will have $n\times m$ entries. In this sense it is a tensor outer product. For example, a 1-tensor in (say, a $3\times 1$ vector) and a 1-tensor out (a $3\times 1$ vector).

Since $1+1 = 2$, your output becomes a 2-tensor (a matrix). This is common for 3D vector-field functions.
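
A quick shape check of this index-counting argument, assuming NumPy (the derivative helper is illustrative):

    import numpy as np

    def derivative(f, x, eps=1e-6):
        # Stack central differences along a new last axis, so the
        # result carries one more index than f's output.
        x = np.asarray(x, dtype=float)
        cols = []
        for j in range(x.size):
            e = np.zeros_like(x)
            e[j] = eps
            cols.append((np.asarray(f(x + e)) - np.asarray(f(x - e))) / (2 * eps))
        return np.stack(cols, axis=-1)

    p = np.array([1.0, 2.0, 3.0])
    scalar = lambda v: v @ v                          # 0-tensor out
    field = lambda v: np.cross(v, [1.0, 0.0, 0.0])    # 1-tensor out (3D vector field)

    print(derivative(scalar, p).shape)  # (3,): 0 + 1 = 1 index, the gradient
    print(derivative(field, p).shape)   # (3, 3): 1 + 1 = 2 indices, a matrix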






answered Nov 15 at 9:08
mathreadler