Linear independence of functions: $x_1(t) = 3$, $x_2(t)=3\sin^2 t$, $x_3(t)=4\cos^2 t$
I want to determine whether three functions are linearly independent:
\begin{align*}
x_1(t) & = 3 \\
x_2(t) & = 3\sin^2(t) \\
x_3(t) & = 4\cos^2(t)
\end{align*}
Definition of linear independence: $c_1x_1 + c_2x_2 + c_3x_3 = 0 \implies c_1=c_2=c_3=0$ (only the trivial solution).
So we have:
\begin{align}
3c_1 + 3c_2\sin^2(t) + 4c_3\cos^2(t) = 0
\end{align}
My first idea is to differentiate both sides and get:
$$6c_2\sin(t)\cos(t) - 8c_3\cos(t)\sin(t) = 0$$
Then we can factor to get:
$$\sin(t)\cos(t)(6c_2 - 8c_3) = 0$$
So $c_3 = \dfrac{6}{8}c_2 = \dfrac{3}{4}c_2$ makes this equation zero. Thus the $c_i$ need not all be $0$, and so $x_1, x_2, x_3$ are linearly dependent.
Is this correct? Or is there a cleaner way to do this?
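(As a quick numerical sanity check, separate from any proof: sampling the three functions at a few points and testing the resulting vectors for dependence is easy to script. This is a sketch in Python with numpy; the sample points $t = 0.1, 0.2, 0.3$ are an arbitrary choice of mine.)

```python
import numpy as np

# Sample points (arbitrary choice; any three generic points would do).
t = np.array([0.1, 0.2, 0.3])

# Columns are x1, x2, x3 evaluated at the sample points.
M = np.column_stack([
    np.full_like(t, 3.0),   # x1(t) = 3
    3 * np.sin(t) ** 2,     # x2(t) = 3 sin^2(t)
    4 * np.cos(t) ** 2,     # x3(t) = 4 cos^2(t)
])

det = np.linalg.det(M)
print(abs(det) < 1e-9)  # True: a (near-)zero determinant is consistent with dependence
```

A nonzero determinant here would have proved independence outright; a zero determinant only suggests dependence, which the algebra then confirms.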
linear-algebra functions vector-spaces
How could you take advantage of the Pythagorean identity?
– David Mitra, Jun 26 '13 at 22:13

Since $\cos^2 \varphi + \sin^2 \varphi \equiv 1$, you can directly see that $4x_2 + 3x_3$ is a constant.
– Daniel Fischer♦, Jun 26 '13 at 22:13

I was thinking about that, but how do I deal with the coefficients? $(\sqrt{3c_2}\sin(t))^2 + (\sqrt{4c_3}\cos(t))^2$
– CodeKingPlusPlus, Jun 26 '13 at 22:15

@CodeKingPlusPlus Linear independence is invariant under multiplication of individual vectors by nonzero constants. So you can just multiply your three vectors by $\frac13, \frac13, \frac14$ respectively, and the problem becomes easy.
– Marc van Leeuwen, Dec 26 '14 at 6:24
edited Dec 1 '18 at 14:55 by Martin Sleziak
asked Jun 26 '13 at 22:11 by CodeKingPlusPlus
3 Answers
Yes, indeed, your answer is fine, and it would have been a particularly good approach for determining the linear (in)dependence of a system of functions that doesn't readily admit the observation about the relationship between $\cos^2 t$ and $\sin^2 t$ $(\dagger)$. Indeed, you're one step away from working with the Wronskian, which is a useful tool for proving linear independence.
$(\dagger)$ Now, to the observation just noted: you could also have used the fact that $$x_1(t) - \left[x_2(t) + \frac 34 x_3(t)\right] = 3 - (3 \sin^2 t + 3\cos^2 t) = 3 - 3\left(\underbrace{\sin^2(t) + \cos^2(t)}_{\large = 1}\right) = 0$$
and saved yourself a little bit of work: you can read off nonzero coefficients $c_i$ to demonstrate their existence, $c_1 = 1, c_2 = -1, c_3 = -\frac 34$, or you could simply express $x_1$ as a linear combination of $x_2, x_3$ to conclude the linear dependence of the vectors. (But don't count on just any random set of vectors turning out so nicely!)
edited Jun 27 '13 at 2:25
answered Jun 26 '13 at 22:42 by amWhy
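(Editor's note: since this answer mentions the Wronskian, here is a small sympy sketch, my addition rather than part of the original answer, that computes it symbolically. For these functions it is identically zero, which is consistent with dependence, though a vanishing Wronskian alone does not prove dependence in general.)

```python
import sympy as sp

t = sp.symbols('t')
funcs = [sp.Integer(3), 3 * sp.sin(t) ** 2, 4 * sp.cos(t) ** 2]

# Wronskian matrix: row k holds the k-th derivatives of the functions.
W = sp.Matrix([[sp.diff(f, t, k) for f in funcs] for k in range(3)])

wronskian = sp.simplify(W.det())
print(wronskian)  # 0
```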
It is much easier to use the known identity $\sin^2 t + \cos^2 t = 1$. We have $x_1(t) - x_2(t) - \frac{3}{4}x_3(t) = 3 - 3\sin^2 t - 3\cos^2 t = 0$, so the functions are linearly dependent.
answered Jun 26 '13 at 22:17 by alans
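(Editor's note: the identity argument can be verified symbolically; a minimal sympy sketch of my own follows.)

```python
import sympy as sp

t = sp.symbols('t')
x1 = sp.Integer(3)
x2 = 3 * sp.sin(t) ** 2
x3 = 4 * sp.cos(t) ** 2

# x1 - x2 - (3/4) x3 = 3 - 3 sin^2 t - 3 cos^2 t, which is 0 by the identity.
combo = sp.simplify(x1 - x2 - sp.Rational(3, 4) * x3)
print(combo)  # 0
```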
This is a pretty straightforward question. First, let me remind you of the definition of linearly dependent functions:
A set of functions $f_1(x), f_2(x), \dots, f_n(x)$ is called linearly dependent if
$c_1f_1(x) + c_2f_2(x) + \dots + c_nf_n(x) = 0$, where $c_1, c_2, \dots, c_n$ are constants, holds with the $c_i$ not all zero.
Coming to the question, we have
$$f_1(x) = 3, \quad f_2(x) = 3\sin^2 x, \quad f_3(x) = 4\cos^2 x.$$
Consider arbitrary constants $c_1, c_2, c_3$ in
$$3c_1 + 3c_2\sin^2 x + 4c_3\cos^2 x.$$
Easily, if $c_1 = -\frac{1}{3}, c_2 = \frac{1}{3}, c_3 = \frac{1}{4}$, then
$$3c_1 + 3c_2\sin^2 x + 4c_3\cos^2 x = -1 + \sin^2 x + \cos^2 x = 0.$$
Since these constants are not all zero, the functions are definitely linearly dependent.
answered Jul 8 '18 at 18:44 by Ariana
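(Editor's note: a quick numeric spot-check of these constants, my addition: with $c_1=-\frac13$, $c_2=\frac13$, $c_3=\frac14$ the combination vanishes at every sampled $t$.)

```python
import math

# Constants from the answer: c1 = -1/3, c2 = 1/3, c3 = 1/4.
c1, c2, c3 = -1/3, 1/3, 1/4

def combo(x):
    # 3*c1 + 3*c2*sin^2(x) + 4*c3*cos^2(x) = -1 + sin^2(x) + cos^2(x) = 0
    return 3*c1 + 3*c2*math.sin(x)**2 + 4*c3*math.cos(x)**2

vals = [combo(0.5 * k) for k in range(10)]
print(all(abs(v) < 1e-12 for v in vals))  # True
```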