Recurrence Relation with multiple variables
Consider the following recurrence relation in two variables, with similar boundary conditions at both ends:
\begin{align*}
x_{1,t} &= \frac{1}{2-\frac{2}{n}} x_{1,t-1} + \frac{1}{2} x_{2,t-1} \\
x_{i,t} &= \frac{1}{2} x_{i-1,t-1} + \frac{1}{2} x_{i+1,t-1} && 1 < i < n \\
x_{n,t} &= \frac{1}{2} x_{n-1,t-1} + \frac{1}{2-\frac{2}{n}} x_{n,t-1}
\end{align*}
with $x_{i,0} = 1$ for all $1 \leq i \leq n$.
Is there any standard method to solve these?
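For concreteness, the recurrence can be iterated directly; here is a minimal NumPy sketch (the values of `n` and the number of steps are chosen arbitrarily):

```python
import numpy as np

def iterate(n, t_max):
    """Iterate the recurrence from x_{i,0} = 1; a sketch, assuming n >= 2."""
    x = np.ones(n)
    a = 1.0 / (2.0 - 2.0 / n)                 # boundary coefficient 1/(2 - 2/n)
    for _ in range(t_max):
        y = np.empty(n)
        y[0]    = a * x[0] + 0.5 * x[1]       # i = 1
        y[1:-1] = 0.5 * x[:-2] + 0.5 * x[2:]  # 1 < i < n
        y[-1]   = 0.5 * x[-2] + a * x[-1]     # i = n
        x = y
    return x

print(iterate(n=6, t_max=10))
```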
analysis recurrence-relations recursion
asked Dec 5 '18 at 14:44, edited Dec 5 '18 at 14:53 – Jannik
Do you know Markov chains? – G Cab, Dec 5 '18 at 15:46
Yes, but $\frac{1}{2-\frac{2}{n}}$ is larger than $\frac{1}{2}$, and hence it's not stochastic anymore. – Jannik, Dec 5 '18 at 15:48
Well, then you are going to have non-stochastic vectors $x$. The principle of a vector recurrence $\mathbf{x}(t) = \mathbf{M}\,\mathbf{x}(t-1)$ remains. – G Cab, Dec 5 '18 at 16:02
That's true, but this matrix has a strange behavior I cannot predict (it's pretty hard to calculate). That's why I'm asking for a different solution :-) – Jannik, Dec 5 '18 at 16:03
That's why the OP is asked to indicate what he has done and where the obstacle is. In fact, even for small $n$ the matrix diagonalizes "strangely". You would do better to reformulate your post as a question about how to diagonalize that matrix. – G Cab, Dec 5 '18 at 16:28
1 Answer
Your system reads as
$$
\mathbf{x}(t) = \frac{1}{2}
\begin{pmatrix}
1 + \frac{1}{n-1} & 1 & 0 & \cdots & 0 \\
1 & 0 & 1 & \cdots & 0 \\
\vdots & \ddots & \ddots & \ddots & \vdots \\
0 & \cdots & 1 & 0 & 1 \\
0 & \cdots & 0 & 1 & 1 + \frac{1}{n-1}
\end{pmatrix}
\mathbf{x}(t-1)
$$
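A minimal NumPy sketch of this matrix form (with $n$ and the number of steps chosen arbitrarily), reproducing the same iteration as the element-wise recurrence:

```python
import numpy as np

def build_M(n):
    """Transition matrix of x(t) = M x(t-1); a sketch, assuming n >= 2."""
    M = 0.5 * (np.eye(n, k=1) + np.eye(n, k=-1))  # interior rows: (x_{i-1} + x_{i+1}) / 2
    M[0, 0] = M[-1, -1] = 1.0 / (2.0 - 2.0 / n)   # corner entries, equal to (1/2)(1 + 1/(n - 1))
    return M

n, t_max = 6, 20
M = build_M(n)
x = np.ones(n)                                    # x_{i,0} = 1
for _ in range(t_max):
    x = M @ x
print(x)
```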
So it is a random walk with a stop at each end, where an influx is injected (always at the ends) as a constant percentage increase of the population sitting there; that percentage decreases as the dimension $n$ of the vector grows.
Since the matrix does not diagonalize neatly, you are probably looking for an approximation. If so, with respect to what?
-- reply to your comment --
The matrix above (call it $\mathbf M$) is symmetric both with respect to the diagonal and with respect to the antidiagonal, i.e.
$$
\mathbf M = \mathbf M^{T} = \mathbf J \,\mathbf M\, \mathbf J
$$
where $\mathbf J$ is the exchange matrix.
Thus the matrix is centrosymmetric; it is also symmetric tridiagonal, but only almost Toeplitz (the corner entries differ).
As such:
- $\mathbf M$ can be diagonalized by an orthogonal matrix $\mathbf Q$;
- all its eigenvalues are real and distinct;
- if $\mathbf v$ is an eigenvector, then so is its mirror image $\mathbf J \mathbf v$;
- since $\mathbf Q$ is orthogonal, its columns are the (orthonormal) eigenvectors.
For an eigenvector, the ratios between its components stay constant under iteration of the recurrence, and vice versa: a vector whose component ratios stay constant must be an eigenvector.
A calculation for small $n$ reveals that the eigenvector associated with the largest eigenvalue has all components positive and is in fact symmetric (invariant under $\mathbf J$). But I have not yet managed to prove that, nor to obtain a bound on such a ratio.
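A quick numerical check of these claims for a few small $n$ (a sketch using numpy.linalg.eigh; of course this is only evidence, not a proof):

```python
import numpy as np

for n in (3, 5, 8, 12):
    M = 0.5 * (np.eye(n, k=1) + np.eye(n, k=-1))
    M[0, 0] = M[-1, -1] = 1.0 / (2.0 - 2.0 / n)
    J = np.fliplr(np.eye(n))              # exchange matrix
    w, Q = np.linalg.eigh(M)              # real eigenvalues (ascending), orthonormal eigenvectors in columns
    v = Q[:, -1]                          # eigenvector of the largest eigenvalue
    v = v if v[0] > 0 else -v             # fix the overall sign
    print(n,
          np.allclose(M, J @ M @ J),      # centrosymmetric: J M J = M
          np.all(np.diff(w) > 1e-12),     # eigenvalues distinct
          np.all(v > 0),                  # top eigenvector has all positive components
          np.allclose(v, J @ v))          # and is symmetric under J
```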
answered Dec 5 '18 at 17:20, edited Dec 6 '18 at 18:55 – G Cab
Good lower and upper bounds for the values of $x_{i,t}$ would be nice. Perfect would be a proof that the ratio $x_{1,t}/x_{i,t}$ is bounded by a constant for every $i$. – Jannik, Dec 5 '18 at 17:30
Thank you anyway for your time and hints. Actually, you may have helped me solve a different problem, because I wasn't familiar with the theory of tridiagonal matrices. Maybe I can still find something that helps me out here. Thanks a lot :-) – Jannik, Dec 6 '18 at 10:40
@Jannik: glad to have been of at least minimal help. I could find some papers on the net dealing with the diagonalization of a symmetric tridiagonal matrix, but they are mainly concerned with algorithms; I could not yet find a good analytical handle on the values of the eigenvectors. – G Cab, Dec 6 '18 at 18:44