Identity for divergence of vector product












I'm reviewing this paper and they state the following "vector identity":



$\nabla \cdot (\textbf{u}\textbf{v}) = \nabla \textbf{u} \cdot \textbf{v} + (\nabla \cdot \textbf{v})\textbf{u}$



I tried to find some material online about this identity but didn't have any luck. I'm pretty sure $\textbf{u}$ and $\textbf{v}$ are both vectors, as they're both boldface in the paper. Can someone help provide more information on this identity? Is it valid? Can you give a proof?



EDIT



After some discussion, it seems that $\textbf{u}$ should be interpreted as a scalar (remove the boldface), not a vector. I will clarify this with the original source and follow up if there are any further amendments. If it is a scalar, the answer below is correct.










vector-analysis

asked Jan 1 at 4:02 by ThatsRightJack; edited Jan 1 at 6:35

  • en.wikipedia.org/wiki/… – Duncan, Jan 1 at 4:13
  • Which one are you referring to? It's not immediately obvious to me. – ThatsRightJack, Jan 1 at 4:21
  • Vector dot product – Duncan, Jan 1 at 4:32
  • I'm sorry, but I don't see how that is the same identity I placed in the question. Can you show me how the two are equal? – ThatsRightJack, Jan 1 at 4:44
















3 Answers


















This is, fundamentally, just another case of the product rule for derivatives.



For the notation to make sense ($\nabla u$ and $\nabla\cdot v$ defined), $u$ should be a scalar-valued function of a vector variable, and $v$ should be a vector-valued function of that variable. Then $\nabla u$ is the gradient of $u$, and $\nabla\cdot v$ is the divergence of $v$. The two terms on the right are both scalars - the first is the dot product of the vector-valued gradient of $u$ and the vector-valued function $v$, while the second is the product of the scalar-valued divergence of $v$ and the scalar-valued function $u$.



To prove it, we just go down to components. I'll assume three dimensions here, although it works in more generality (let $A,B,C$ be the components of $v$; $v(x,y,z)=(A(x,y,z),B(x,y,z),C(x,y,z))$):
\begin{align*}
\nabla\cdot (uv)\quad &?\quad \nabla u\cdot v + (\nabla\cdot v)\, u\\
\frac{\partial (uA)}{\partial x}+\frac{\partial (uB)}{\partial y}+\frac{\partial (uC)}{\partial z}\quad &?\quad \left(\frac{\partial u}{\partial x},\frac{\partial u}{\partial y},\frac{\partial u}{\partial z}\right)\cdot (A,B,C) + \left(\frac{\partial A}{\partial x}+\frac{\partial B}{\partial y}+\frac{\partial C}{\partial z}\right)u\\
u\frac{\partial A}{\partial x}+A\frac{\partial u}{\partial x}+u\frac{\partial B}{\partial y}+B\frac{\partial u}{\partial y}+u\frac{\partial C}{\partial z}+C\frac{\partial u}{\partial z} &= A\frac{\partial u}{\partial x}+B\frac{\partial u}{\partial y}+C\frac{\partial u}{\partial z}+u\frac{\partial A}{\partial x}+u\frac{\partial B}{\partial y}+u\frac{\partial C}{\partial z}
\end{align*}

Not too bad there. It's just all the same terms rearranged a bit.
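
For anyone who wants to double-check the algebra mechanically, here is a minimal SymPy sketch of the same componentwise computation (my own addition, not part of the answer); it treats $u$ as a scalar function and $v=(A,B,C)$ as a vector function of $(x,y,z)$, with these symbol names chosen only to match the display above.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
u = sp.Function('u')(x, y, z)
A = sp.Function('A')(x, y, z)
B = sp.Function('B')(x, y, z)
C = sp.Function('C')(x, y, z)

# Left-hand side: divergence of the scaled field u*v
lhs = sp.diff(u*A, x) + sp.diff(u*B, y) + sp.diff(u*C, z)

# Right-hand side: grad(u) . v + (div v) * u
rhs = (sp.diff(u, x)*A + sp.diff(u, y)*B + sp.diff(u, z)*C
       + (sp.diff(A, x) + sp.diff(B, y) + sp.diff(C, z))*u)

print(sp.simplify(lhs - rhs))  # prints 0
```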



There's another way to look at this. If you go on with mathematics long enough, you'll encounter the language of manifolds and differential forms, which provides a unifying framework for all of these vector calculus notions. In particular, both the gradient and the divergence are examples of the exterior derivative, and the exterior derivative has a product rule. Here, working in $n$ dimensions, we have $u$ as a $0$-form and $v$ as an $(n-1)$-form. The gradient takes $0$-forms to $1$-forms, the divergence takes $(n-1)$-forms to $n$-forms, and the dot product between $1$-forms and $(n-1)$-forms is the wedge product, taking us to an $n$-form. Then we just apply the exterior derivative's product rule:
$$\nabla\cdot (uv) = d(u\wedge v) = du\wedge v + (-1)^0 u\wedge dv = \nabla u\cdot v + u(\nabla \cdot v)$$
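
To make that concrete in three dimensions (a worked check of my own, using the standard identification of $v=(A,B,C)$ with a $2$-form): writing
$$\omega_v = A\,dy\wedge dz + B\,dz\wedge dx + C\,dx\wedge dy,\qquad du = \frac{\partial u}{\partial x}\,dx+\frac{\partial u}{\partial y}\,dy+\frac{\partial u}{\partial z}\,dz,$$
the product rule $d(u\,\omega_v)=du\wedge\omega_v+u\,d\omega_v$ gives
$$du\wedge\omega_v = (\nabla u\cdot v)\,dx\wedge dy\wedge dz,\qquad u\,d\omega_v = u\,(\nabla\cdot v)\,dx\wedge dy\wedge dz,$$
which is exactly the identity above.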






answered Jan 1 at 5:03 by jmerry; edited Jan 1 at 5:25

  • Let me just clarify this part: "u should be a scalar-valued function of a vector variable, and v should be a vector-valued function of that variable"...are you saying that "u" must be a scalar and "v" must be a vector? That would imply that my interpretation of the paper notation is wrong (they use 2 bold letters, so I assumed they were both vectors). – ThatsRightJack, Jan 1 at 5:21
  • Then you say "The two terms on the left are both scalars..."? Should that say "right", not "left"? – ThatsRightJack, Jan 1 at 5:21
  • Yes, your interpretation was wrong. $\nabla\cdot$ is the divergence, which applies to a vector. $\nabla$ without the dot is the gradient, which applies to a scalar. That's just what the standard notation means. Other comment - argh, directional dyslexia. I meant the right. Time to edit. – jmerry, Jan 1 at 5:24
  • OK, I don't disagree with what you wrote, but let me throw something at you. In the "material/total" derivative, you often see $(\textbf{u}\cdot \nabla) \textbf{v}$. I would read this as a vector dotted with the gradient of a vector, which is a rank-2 tensor? If I'm right in interpreting it this way, then the gradient operator doesn't necessarily operate on a scalar? I'm just worried that the author did intend "u" and "v" to be vectors. – ThatsRightJack, Jan 1 at 5:50
  • What that notation reads as to me? A directional derivative. $u$ determines a direction by applying multipliers to the partial derivatives, and then we apply that to $v$. It's not the gradient operator at all; it takes a slice of $v$ as a function of one variable, and differentiates that. – jmerry, Jan 1 at 6:08




















Using Einstein notation, we can see it more easily:
$$
\nabla \cdot (u \textbf{v}) = \partial_i (uv_i) = \partial_i u \cdot v_i + u\cdot\partial_i v_i = \nabla u\cdot \textbf{v} + u(\nabla \cdot \textbf{v}).
$$
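
The same index computation can be spelled out explicitly; here is a small SymPy sketch of my own that sums over $i=1,2,3$ instead of using the summation convention (the names `u`, `v1`, `v2`, `v3`, `x1`, `x2`, `x3` are illustrative only).

```python
import sympy as sp

coords = sp.symbols('x1 x2 x3')
u = sp.Function('u')(*coords)
v = [sp.Function(f'v{i+1}')(*coords) for i in range(3)]

# sum_i d/dx_i (u v_i)  versus  sum_i [ (du/dx_i) v_i + u (dv_i/dx_i) ]
lhs = sum(sp.diff(u*v[i], coords[i]) for i in range(3))
rhs = sum(sp.diff(u, coords[i])*v[i] + u*sp.diff(v[i], coords[i]) for i in range(3))

print(sp.simplify(lhs - rhs))  # prints 0
```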






answered Jan 1 at 13:27 by Song






















For this to make sense $u$ is a scalar function and $\mathbf{v} = (v_1, \ldots, v_n)$ is a vector function.

We can calculate the partial derivatives using the product rule:

$$\frac{\partial}{\partial x_j}(u\mathbf{v})_i = \frac{\partial}{\partial x_j}(uv_i) = \frac{\partial u}{\partial x_j}v_i + u\frac{\partial v_i}{\partial x_j}$$



Therefore, the Jacobi matrix is given by
$$\nabla(u\mathbf{v}) = \left[\frac{\partial}{\partial x_j}(u\mathbf{v})_i\right]_{1\le i,j\le n} = \left[\frac{\partial u}{\partial x_j}v_i + u\frac{\partial v_i}{\partial x_j}\right]_{1\le i,j\le n} = \mathbf{v}\otimes (\nabla u) + u (\nabla\mathbf{v})$$

The divergence is obtained by taking the trace:

$$\nabla \cdot (u\mathbf{v}) = \operatorname{Tr}\left(\nabla(u\mathbf{v})\right) = \operatorname{Tr}(\mathbf{v}\otimes (\nabla u)) + u \operatorname{Tr}(\nabla\mathbf{v}) = \mathbf{v}\cdot (\nabla u) + u(\nabla\cdot \mathbf{v})$$
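
Here is a small SymPy illustration of this Jacobian/trace argument in three dimensions (my own sketch, not part of the answer; the symbol names are hypothetical and chosen for readability).

```python
import sympy as sp

coords = sp.symbols('x1 x2 x3')
u = sp.Function('u')(*coords)
v = sp.Matrix([sp.Function(f'v{i+1}')(*coords) for i in range(3)])

# Jacobian of the product field u*v (rows indexed by i, columns by j)
uv = sp.Matrix([u*v[i] for i in range(3)])
J = uv.jacobian(list(coords))

# Divergence of u*v as the trace of its Jacobian
div_uv = J.trace()

# v . grad(u) + u * div(v)
grad_u = sp.Matrix([sp.diff(u, c) for c in coords])
rhs = v.dot(grad_u) + u*sum(sp.diff(v[i], coords[i]) for i in range(3))

print(sp.simplify(div_uv - rhs))  # prints 0
```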






answered Jan 1 at 13:18 by mechanodroid












