Does a sequence of vectors whose norms converge to zero necessarily converge to the zero vector?


























Let $V$ be an $N$-dimensional normed vector space, and let $(v_n)$ be a sequence of vectors such that $\lim \|v_n\| = 0$. Given some basis $\{e^1, e^2, \dots, e^N\}$, we can write each $v_n$ as $a_1^{(n)}e^1 + a_2^{(n)}e^2 + \dots + a_N^{(n)}e^N$.

Is it necessarily true that for each fixed $k \in \{1, 2, \dots, N\}$, $\lim_{n \to \infty} a_k^{(n)} = 0$? If so, how can we prove this using only the axioms for a generic norm?
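
A small worked special case, added here for orientation (not part of the original question): if the norm happens to be the coordinate sup-norm associated with the chosen basis, $\|v\|_\infty = \max_{1\le k\le N} |a_k|$ for $v=\sum_k a_k e^k$, the claim is immediate, since
$$\big|a_k^{(n)}\big| \;\le\; \max_{1\le j\le N}\big|a_j^{(n)}\big| \;=\; \|v_n\|_\infty \;\longrightarrow\; 0 \qquad \text{for each fixed } k.$$
For a generic norm $\|\cdot\|$, the work lies in comparing it with such a coordinate norm using only the norm axioms, which is what the answers below discuss.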











linear-algebra normed-spaces






asked Dec 21 '18 at 5:23 – WillG

















  • Yes, since one has the result that all norms are equivalent. – Lord Shark the Unknown, Dec 21 '18 at 5:26

  • @LordSharktheUnknown: That's kind of a heavy hammer, though. I wonder if there is a proof of this that doesn't need the Heine-Borel theorem? – Nate Eldredge, Dec 21 '18 at 5:41

  • The questions in the title and in the body are different. – zipirovich, Dec 22 '18 at 3:20















3 Answers


















I thought the simplest way is that $\|v_n\| \rightarrow 0$ is the same as $\|v_n - 0\| \rightarrow 0$, i.e. $v_n \rightarrow 0$.



EDIT 1: Let $v_n = a_1^n e_1 + \cdots + a_N^n e_N$. Then $\|v_n\| = \big\|\sum_{i=1}^N a_i^n e_i\big\| \leq \sum_{i=1}^N \|a_i^n e_i\|$ by the triangle inequality. Thus if $v_n \rightarrow 0$ component-wise, i.e. $a_i^n \rightarrow 0$, then $\|a_i^n e_i\| = |a_i^n|\,\|e_i\| \rightarrow 0$ and hence $\|v_n\| \rightarrow 0$, i.e. $v_n \rightarrow 0$ in norm.



On the other hand, if $v_n \rightarrow 0$ in norm but not all $a_i^n \rightarrow 0$, let $v = a_1 e_1 + \cdots + a_N e_N$ where $a_i = \lim_{n\to\infty} a_i^n$. Then $v$ is nonzero, since $(e_j)_{j=1}^N$ is a basis and at least one $a_i \neq 0$, and $\|v_n - v\| \leq \sum_{i=1}^N \|(a_i^n - a_i)e_i\| \rightarrow 0$, so $v_n \rightarrow v$, a contradiction since by continuity of the norm, $\|v_n\| \rightarrow \|v\| \neq 0$.



EDIT 2: The above assumes that the limits $\lim_{n\to\infty} a_i^n$ exist, which may not be true. I'm not sure how to show that they exist.
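
A compact restatement of the bound in EDIT 1 (my reformulation, not the poster's): writing $C=\max_{1\le i\le N}\|e_i\|$, the triangle inequality and absolute homogeneity give
$$\|v_n\| \;=\; \Big\|\sum_{i=1}^N a_i^n e_i\Big\| \;\le\; \sum_{i=1}^N |a_i^n|\,\|e_i\| \;\le\; C\sum_{i=1}^N |a_i^n|,$$
so component-wise convergence to $0$ forces $\|v_n\|\to 0$. The reverse implication is the part that still needs an argument, as the comments below point out.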






answered Dec 21 '18 at 6:27, edited Dec 22 '18 at 4:37 – Riley













  • Hm, is $v_n \to x$ defined to mean $\|v_n-x\| \to 0$, or that the components of $v_n$ approach the components of $x$ in some basis? – WillG, Dec 21 '18 at 6:35










  • I guess what I'm really looking for is a proof that these two definitions are equivalent. – WillG, Dec 21 '18 at 6:36










  • If you're wondering if convergence in norm is equivalent to component-wise convergence in finite-dimensional normed vector spaces, then I think I've edited accordingly. As for your first question, $v_n \rightarrow x$ in a normed vector space is taken to mean $\|v_n - x\| \rightarrow 0$, since the norm induces the metric. – Riley, Dec 21 '18 at 7:12










  • Riley's proof is wrong. One has to prove that $\lim a_i^{n}$ exists and is $0$. You cannot assume that the limit exists and show that the limit must be $0$. – Kavi Rama Murthy, Dec 21 '18 at 9:04










  • Hmm yes, I do have issues showing the limit exists. I'll edit accordingly. – Riley, Dec 22 '18 at 4:37



















Here is a proof that uses no theorem on normed linear spaces: Let $x_n=(a_1^{(n)},a_2^{(n)},\cdots,a_N^{(n)})$ and let $r_n$ be its (usual) norm in $\mathbb R^{N}$. If $\{r_n\}$ is unbounded there is a subsequence $\{r_{n'}\}$ that tends to $\infty$. Now $\frac{v_{n'}}{r_{n'}} \to 0$ and we get unit vectors $(b_1^{(n')},b_2^{(n')},\cdots,b_N^{(n')})$ (obtained by normalizing $(a_1^{(n')},a_2^{(n')},\cdots,a_N^{(n')})$) such that $\sum b_i^{(n')}e_i \to 0$. But any sequence of unit vectors in $\mathbb R^{N}$ has a subsequence which tends to a unit vector. We end up with $\sum c_i e_i=0$ where $(c_1,c_2,\cdots,c_N)$ is a unit vector, contradicting the linear independence of the $e_i$'s. Hence $\{r_n\}$ is bounded. Now extract a convergent subsequence and take the limit in the equation $v_n=\sum a_i^{(n)}e_i$ to see that the limit of the subsequence must be $(0,0,\cdots,0)$. Now you can apply this argument to subsequences of $(a_1^{(n)},a_2^{(n)},\cdots,a_N^{(n)})$ to conclude that every subsequence of this sequence has a further subsequence converging to the zero vector. Hence $a_i^{(n)} \to 0$ for each $i$.



PS: This approach can be used to prove many of the basic theorems about finite-dimensional normed linear spaces from scratch. Unfortunately it is not seen in textbooks. By this approach you can prove that finite-dimensional subspaces are closed, that linear maps on finite-dimensional spaces are continuous, and that all norms on finite-dimensional spaces are equivalent.
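
To spell out the step questioned in the comments (my elaboration, not part of the original answer): suppose, along a further subsequence, the Euclidean unit vectors $b^{(n')}=(b_1^{(n')},\dots,b_N^{(n')})$ converge in $\mathbb R^N$ to a unit vector $c=(c_1,\dots,c_N)$. Then, using only the triangle inequality and homogeneity of $\|\cdot\|$,
$$\Big\|\sum_{i=1}^N c_i e_i\Big\| \;\le\; \sum_{i=1}^N \big|c_i - b_i^{(n')}\big|\,\|e_i\| \;+\; \Big\|\sum_{i=1}^N b_i^{(n')} e_i\Big\| \;=\; \sum_{i=1}^N \big|c_i - b_i^{(n')}\big|\,\|e_i\| \;+\; \frac{\|v_{n'}\|}{r_{n'}} \;\longrightarrow\; 0,$$
so $\sum_i c_i e_i = 0$ with $c \neq 0$, contradicting linear independence. No result beyond the norm axioms is used here.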






answered Dec 21 '18 at 5:45, edited Dec 21 '18 at 9:08 – Kavi Rama Murthy













  • I'm a bit confused by the difference between $v_n$ and $x_n$, and what exactly it means that $v_n/r_n \to 0$. Should that $v_n$ be a norm, and should that $r_n$ be primed? Also, does this generalize to all norms, not just the standard $\mathbb{R}^N$ norm? – WillG, Dec 21 '18 at 5:57












  • I am working simultaneously in two spaces: the given normed linear space and the Euclidean space $\mathbb R^{N}$. $x_n$ is a vector in the latter and $r_n$ is its norm, which is a positive number. I am proving the result for any norm on any vector space, but the proof utilizes the Euclidean norm as a tool. The prime is only a notation for a subsequence (just to avoid typing subscripts). – Kavi Rama Murthy, Dec 21 '18 at 6:01












  • Does $v_n/r_n \to 0$ mean that the norms of $v_n/r_n$ approach zero, or that all the components of these vectors in some particular basis approach zero? I'm trying not to assume these are identical statements. – WillG, Dec 21 '18 at 6:12












  • A sequence of vectors in a normed linear space converges to $0$ iff their norms converge to $0$. It is already given that $\|v_n\| \to 0$, so if $r_{n'} \to \infty$ then certainly $\big\|\frac{v_{n'}}{r_{n'}}\big\|$ also tends to $0$. – Kavi Rama Murthy, Dec 21 '18 at 6:14






















To answer the question in the title (which is a little bit different than the question in the body of the OP):



Let's get back to the definitions! The benefit is that we don't need components, and the answer works in infinite dimensions as well.



In a normed vector space, we have a natural distance between vectors $v$ and $w$ given by $\|v-w\|$. Therefore, convergence in a normed vector space is defined as convergence in the metric space induced by the norm, that is:




Definition $1$: Let $v_n$ be a sequence of vectors in a normed space $(V,\|\cdot\|)$. We say that $v_n$ converges to a vector $v$ in $V$ if, for any $\varepsilon >0$, there exists $N\in\mathbb N$ such that, if $n\geqslant N$, then $\|v_n-v\|<\varepsilon$.




In plain English, we can make $v_n$ as close as we want to $v$ (that is, we can make the distance between $v_n$ and $v$ as small as we want), provided that $n$ is large enough.



Now, consider the sequence $x_n=\|v_n\|$ of the norms of the vectors $v_n$. That's a sequence of real numbers so its convergence is defined by the "classic" (Calculus 1 or 2 course) convergence of sequences:




Definition $2$: Let $x_n$ be a sequence of real numbers. We say that $x_n$ converges to a real number $\ell$ if, for any $\varepsilon >0$, there exists $N\in\mathbb N$ such that, if $n\geqslant N$, then $|x_n-\ell|<\varepsilon$.




This time, we are using the distance between real numbers, that is the absolute value of their difference.



We are finally in position to answer the question in the title (and a little bit more):




Theorem: Let $v_n$ be a sequence of vectors in a normed space $(V,\|\cdot\|)$. Then $v_n$ converges to the zero vector $0$ if and only if its sequence of norms $x_n=\|v_n\|$ converges to $0$.




The proof is essentially that $|x_n-0|=|x_n|=\big|\,\|v_n\|\,\big|=\|v_n\|=\|v_n-0\|$.
Indeed, assume that $x_n=\|v_n\|$ converges to $0$. We will show that $v_n$ has to converge to the zero vector. To this end, let $\varepsilon>0$. Since $x_n$ converges to $0$, there exists $N$ in $\mathbb N$ such that, if $n\geqslant N$, we have $|x_n|<\varepsilon$. Therefore, if $n\geqslant N$, we also have $\|v_n\|<\varepsilon$, so the claim is proved.



It should be obvious that the proof of the converse is almost identical. I think that a source of confusion is that it seems "too easy".
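
A sketch of the converse, under the same definitions (my addition, not in the original answer): assume $v_n$ converges to the zero vector in $(V,\|\cdot\|)$ and let $\varepsilon>0$. By Definition $1$ there exists $N\in\mathbb N$ such that $\|v_n-0\|<\varepsilon$ whenever $n\geqslant N$. Since
$$|x_n-0| \;=\; \big|\,\|v_n\|\,\big| \;=\; \|v_n\| \;=\; \|v_n-0\| \;<\; \varepsilon \qquad \text{for all } n\geqslant N,$$
the sequence $x_n=\|v_n\|$ converges to $0$ in the sense of Definition $2$.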






answered Dec 22 '18 at 3:13 – Taladris












