Obtaining the gradient of a vector-valued function
I have read that obtaining the gradient of a vector-valued function $f:\mathbb{R}^n \to \mathbb{R}^m$ is the same as obtaining the Jacobian of this function.
Nevertheless, this function has only one argument (the vector $\mathbf{x} \in \mathbb{R}^n$).
How can I take the gradient of a function $F(\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n)$ with respect to some $\mathbf{x}_i$?
Tags: derivatives, vector-spaces, gradient-descent
A gradient of something with $n$ input indices and $m$ output indices will have $n+m$ indices; in this sense it is a tensor outer product. For example: a 1-tensor in (say a $3\times 1$ vector) and a 1-tensor out (a $3\times 1$ vector) give $1+1 = 2$ indices, so your output becomes a 2-tensor (a "matrix"); this is common for 3D vector-field functions.
– mathreadler, Nov 15 at 9:01
I don't know if it is correct to call it a "gradient"; it is called, instead, the Jacobian matrix of $f$ at a point. Yes: to obtain the Jacobian matrix with respect to one of the variables, just treat all the others as constants.
– Masacroso, Nov 15 at 9:29
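A rough numerical sketch of the suggestion in that comment: to differentiate $F$ with respect to its $i$-th vector argument, bump only that argument and hold the others constant. The function `partial_jacobian` and the example `F` below are my own illustrations, not from the thread.

```python
import numpy as np

def partial_jacobian(F, args, i, eps=1e-6):
    """Forward-difference Jacobian of F with respect to args[i],
    all other arguments held fixed; returns an (m x n_i) matrix."""
    x_i = np.asarray(args[i], dtype=float)
    f0 = np.asarray(F(*args), dtype=float)
    J = np.zeros((f0.size, x_i.size))
    for k in range(x_i.size):
        bumped = x_i.copy()
        bumped[k] += eps          # perturb one component of x_i only
        new_args = list(args)
        new_args[i] = bumped      # the other arguments stay constant
        J[:, k] = (np.asarray(F(*new_args), dtype=float) - f0) / eps
    return J

# Example: F(x1, x2) = (x1 . x2, x1 . x1). Its Jacobian with respect to
# x1 is the 2x2 matrix whose rows are x2 and 2*x1.
F = lambda x1, x2: np.array([x1 @ x2, x1 @ x1])
x1 = np.array([1.0, 2.0])
x2 = np.array([3.0, 4.0])
J = partial_jacobian(F, (x1, x2), i=0)  # approximately [[3, 4], [2, 4]]
```
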
asked Nov 15 at 8:22 by The Bosco
2 Answers
The 'gradient' of something usually means taking all of its partial derivatives. Therefore
"taking the gradient of a function $F(\mathbf{x}_1,\mathbf{x}_2,\ldots,\mathbf{x}_n)$ with respect to some $\mathbf{x}_i$"
is not really a thing. However, you are right that all the partial derivatives of a vector-valued function, arranged in an $m\times n$ matrix, are usually referred to as the Jacobian.
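As a concrete illustration (my own example, not from the answer): for $f:\mathbb{R}^2\to\mathbb{R}^3$ with $f(x,y) = (x^2,\; xy,\; \sin y)$, collecting all the partial derivatives row by row gives

```latex
J_f(x,y) =
\begin{pmatrix}
  \partial f_1/\partial x & \partial f_1/\partial y \\
  \partial f_2/\partial x & \partial f_2/\partial y \\
  \partial f_3/\partial x & \partial f_3/\partial y
\end{pmatrix}
=
\begin{pmatrix}
  2x & 0 \\
  y  & x \\
  0  & \cos y
\end{pmatrix},
```

an $m \times n = 3 \times 2$ matrix, matching the convention in the answer.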
This is a homework question. For some specific function $F$ I have to do that.
– The Bosco, Nov 15 at 8:33
The question asks you to calculate the gradient? Then just calculate all the partial derivatives. If that doesn't help, please provide the full problem description.
– maxmilgram, Nov 15 at 8:34
answered Nov 15 at 8:30 by maxmilgram
A gradient of something with $n$ input indices and $m$ output indices will have $n+m$ indices; in this sense it is a tensor outer product. For example: a 1-tensor in (say a $3\times 1$ vector) and a 1-tensor out (a $3\times 1$ vector).
Since $1+1 = 2$, your output becomes a 2-tensor (a "matrix"); this is common for 3D vector-field functions.
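To make the 3D vector-field case concrete (my own example, not from the answer): for $\mathbf{F}(x,y,z) = (yz,\; xz,\; xy)$, one input index and one output index give the 2-tensor

```latex
J_{\mathbf{F}}(x,y,z) =
\begin{pmatrix}
  0 & z & y \\
  z & 0 & x \\
  y & x & 0
\end{pmatrix},
```

a $3\times 3$ matrix: the dimensions multiply to give $3\cdot 3 = 9$ entries, but the index counts add, $1+1 = 2$ indices.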
answered Nov 15 at 9:08 by mathreadler