Proof that Laplace expansion is row/column invariant, without induction
There are various proofs across the site of the equivalence of the Laplace expansion of the determinant and the Leibniz formula for the determinant.
There are also various proofs of the fact that the Laplace expansion is independent of the row/column chosen, but they all use induction.
First of all, I take the definition of the determinant to be
$$
\det(a_{ij})_{n\times n} =
\begin{cases}
\hfil a_{11} \hfil & \text{if $n = 1$}\\[10pt]
\displaystyle\sum_{k=1}^n (-1)^{k+1}a_{1k}\det(A_{1k}) & \text{otherwise},
\end{cases}
$$
where $A_{ij}$ denotes the submatrix obtained by deleting row $i$ and column $j$.
It is hard to explain to my students (who have not yet been introduced to induction) why, even though the determinant is defined using the first row, it is in fact independent of the row/column chosen. Is there a proof of this fact which does not use induction, or one that uses permutations? Or, at least, a way to illustrate this fact which makes it clear?
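For concreteness, here is one way I can let students see the claim numerically before any proof: a minimal Python sketch (assuming NumPy; the helper names are just for illustration) that implements the first-row expansion from the definition above, plus a variant expanding along an arbitrary row, and checks that all choices of row give the same value.

```python
# Minimal sketch (assumes NumPy). det_first_row implements the definition above;
# det_row expands along row i and should agree with it for every i.
import numpy as np

def minor(A, i, j):
    """Submatrix A_{ij}: delete row i and column j (0-indexed)."""
    return np.delete(np.delete(A, i, axis=0), j, axis=1)

def det_first_row(A):
    """Laplace expansion along the first row (the definition in the question)."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    return sum((-1) ** k * A[0, k] * det_first_row(minor(A, 0, k)) for k in range(n))

def det_row(A, i):
    """Laplace expansion along row i (0-indexed); the claim is that this is independent of i."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    return sum((-1) ** (i + k) * A[i, k] * det_first_row(minor(A, i, k)) for k in range(n))

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(4, 4)).astype(float)
print([round(det_row(A, i), 6) for i in range(4)])   # all four values coincide
print(round(det_first_row(A), 6), round(np.linalg.det(A), 6))
```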
linear-algebra matrices determinant
asked Dec 29 '18 at 15:58 by Luke Collins (edited Dec 29 '18 at 16:07)
I'm afraid if you use recursion to define the determinant, you'll have to prove at least some of its properties by induction. If your students have not been introduced to induction, they should be ASAP; linear algebra really depends on it in many places (e.g., why is a strictly upper-triangular matrix nilpotent? why does Gaussian elimination lead to $PA=LU$?).
– darij grinberg, Dec 29 '18 at 16:13
1 Answer
Your definition of "det" is itself inductive. If, in proving your assertion about $n \times n$ matrices, you're not going to use any properties of "det" on $(n-1) \times (n-1)$ matrices, your definition could equally well be
$$
\det(a_{ij})_{n\times n} =
\begin{cases}
\hfil a_{11} \hfil & \text{if $n = 1$}\\[10pt]
\displaystyle\sum_{k=1}^n (-1)^{k+1}a_{1k}\color{red}{\det'}(A_{1k}) & \text{otherwise},
\end{cases}
$$
where $\color{red}{\det'}$ is some completely different function (e.g., the zero function).
I think that with an inductive definition like this, you pretty much have to do inductive proofs. (Indeed, your sharpest students might well ask "How do you even know that this defines a function?")
There might be some "way to illustrate this fact which makes it clear", but that really amounts to proof-by-vigorous-assertion.
I guess I'd be inclined to write out the det for a 1x1, 2x2, and 3x3 matrix (with indexed entries rather than numerical ones), and say "Look, all possible n-tuples with one index from each row and one index from each column appear!", and then spend a little while noticing which ones have positive or negative coefficients, and say "there's an alternative definition of det that uses this 'sum of permutations' approach instead...and writing it out is really messy. But let's see, in practice, what it produces if we swap two rows of the matrix..."
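For instance, a generic $3\times 3$ determinant, written out fully along the lines suggested above, is
$$
\det\begin{pmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33}\end{pmatrix}
= a_{11}a_{22}a_{33} - a_{11}a_{23}a_{32} - a_{12}a_{21}a_{33} + a_{12}a_{23}a_{31} + a_{13}a_{21}a_{32} - a_{13}a_{22}a_{31},
$$
where each of the six terms uses exactly one entry from every row and every column, with sign given by the parity of the corresponding permutation; swapping two rows of the matrix visibly flips the sign of every term.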
answered Dec 29 '18 at 16:23 by John Hughes