True or false statements about a square matrix
Consider the following four statements about an $n \times n$ matrix $A$.
$(i)$ If $\det(A) \neq 0$, then $A$ is a product of elementary matrices.
$(ii)$ The equation $Ax=b$ can be solved using Cramer's rule for any $n \times 1$ matrix $b$.
$(iii)$ If $\det(A)=0$, then either $A$ has a zero row or column, or two equal rows or columns.
$(iv)$ If $B$ is an $l \times n$ matrix, then the rows of $BA$ are linear combinations of the columns of $A$.
How many of the preceding statements are always true?
$(a)$ None.
$(b)$ One.
$(c)$ Two.
$(d)$ Three.
$(e)$ All of them.
My reasoning:
I am pretty sure that $(i)$ is true: since the determinant of $A$ is not zero, the matrix is invertible, and hence it can be expressed as a product of elementary matrices.
I don't think $(ii)$ is true. There might be a matrix $b$ such that when I attempt to use Cramer's rule, the denominator is $0$, which is a problem.
I don't think $(iii)$ is true either. A zero row would imply that the determinant is zero, as would two equal rows, but the implication in $(iii)$ runs in the opposite direction. I don't think the columns change anything, so $(iii)$ is false as well.
I'm not $100\%$ sure about statement $(iv)$, but I don't think it is true either, since I can always express the product $Ax$ as a linear combination of the columns of $A$ with coefficients $x_1,x_2,x_3,\dots$
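Spelling that out: if $a_1,\dots,a_n$ denote the columns of $A$, then $Ax = x_1a_1 + x_2a_2 + \dots + x_na_n$, which involves columns rather than rows.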
Thus, only one statement, $(i)$, is true.
Can someone confirm my reasoning for each statement?
linear-algebra proof-verification systems-of-equations determinant inverse
asked Dec 18 '18 at 8:16
Future Math person
For $(iv)$: Row space($BA$) $=$ Column space($(BA)^T$) $=$ Column space($A^TB^T$). Compare Column space($A$) with Column space($A^TB^T$).
– Yadati Kiran, Dec 18 '18 at 8:36
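A quick numerical illustration of this hint (a NumPy sketch with arbitrary example matrices, not part of the original thread): since each row of $BA$ lies in the row space of $A$, stacking $BA$ under $A$ should leave the rank unchanged.

```python
import numpy as np

# Row space(BA) sits inside Row space(A), so appending the rows of BA
# to A must not increase the rank.
rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(3, 3))   # n x n, n = 3
B = rng.integers(-3, 4, size=(2, 3))   # l x n, l = 2

print(np.linalg.matrix_rank(A) ==
      np.linalg.matrix_rank(np.vstack([A, B @ A])))   # True
```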
2 Answers
For $(ii)$, did you mean to say "...might be a matrix $A$ such that when I attempt to use Cramer's rule..."? The denominator in Cramer's rule is $\det(A)$, which has nothing to do with $b$. If $\det(A)=0$, we can't solve the system using Cramer's rule.
$(iii)$ is incorrect, because $\det(A)=0$ only tells us that the rows/columns of $A$ are linearly dependent, or equivalently that there is at least one zero row/column in the echelon form of $A$, not in $A$ itself. If $A$ had a zero row/column or identical rows/columns, the determinant would certainly be $0$, but this is not necessary. For example, $\det\Big(\begin{bmatrix}1&4\\9&36\end{bmatrix}\Big)=0$.
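To spell the counterexample out: $\det\begin{bmatrix}1&4\\9&36\end{bmatrix}=1\cdot36-4\cdot9=36-36=0$, yet this matrix has no zero row or column and no two equal rows or columns; its second row is simply $9$ times its first.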
$(iv)$ is incorrect too. The rows of $BA$ are linear combinations of the rows of $A$. It is easy to see why this is true through block matrix notation:
$BA=\begin{bmatrix}b_{11}&b_{12}&\dots&b_{1n}\\b_{21}&b_{22}&\dots&b_{2n}\\\vdots&\vdots&\ddots&\vdots\\b_{l1}&b_{l2}&\dots&b_{ln}\end{bmatrix}\cdot\begin{bmatrix}A_1\\A_2\\\vdots\\A_n\end{bmatrix}=\begin{bmatrix}b_{11}A_1+b_{12}A_2+\dots+b_{1n}A_n\\b_{21}A_1+b_{22}A_2+\dots+b_{2n}A_n\\\vdots\\b_{l1}A_1+b_{l2}A_2+\dots+b_{ln}A_n\end{bmatrix},$
where $A_i$ is the $i^{\text{th}}$ row of $A$.
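A quick NumPy check of this row formula (an illustrative sketch with made-up matrices; the identity itself is just the definition of matrix multiplication):

```python
import numpy as np

A = np.array([[1, 2, 0],
              [0, 1, 3],
              [2, 0, 1]])            # n x n with n = 3
B = np.array([[1, -1, 2],
              [0,  3, 1]])           # l x n with l = 2

BA = B @ A
for i in range(B.shape[0]):
    # row i of BA as the combination  sum_j b_ij * (row j of A)
    combo = sum(B[i, j] * A[j] for j in range(A.shape[0]))
    assert np.array_equal(BA[i], combo)
print("each row of BA is the stated combination of the rows of A")
```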
edited Dec 18 '18 at 8:51
answered Dec 18 '18 at 8:43
Shubham Johri
Your justification for (i) is tautological. For (ii) and (iii) you should give counterexamples.
If $\det A\ne0$, the RREF of $A$ is the identity matrix, so $A$ is a product of elementary matrices.
Cramer's rule can be applied only to systems of the form $Ax=b$ with $\det A\ne0$; take $A=0$.
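Indeed, with $A=0$ and any $b\ne0$, the system $Ax=b$ has no solution at all, and every determinant appearing in Cramer's rule, the denominator $\det A$ and each numerator, equals $0$.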
Consider $A=\begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$.
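Here $\det A = 1\cdot4 - 2\cdot2 = 0$, yet $A$ has no zero row or column and no two equal rows or columns.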
Saying that the rows of $BA$ are linear combinations of the columns of $A$ doesn't make sense at all: the rows of $BA$ are $1\times n$ row vectors, while the columns of $A$ are $n\times 1$ column vectors.
answered Dec 18 '18 at 8:37
egreg