Minimum norm problem in Hilbert space with two constraints
Let $x_0$ be a fixed vector in a Hilbert space $H$ and suppose $\{x_1,\dots,x_n\}$ and $\{y_1,\dots,y_m\}$ are sets of linearly independent vectors in $H$. We seek the vector
$x^* = \operatorname{arg\,min}_x \|x-x_0\|$,
subject to
$x \in M=\operatorname{span}(x_1,\dots,x_n)$ and $\langle x, y_i \rangle=c_i$ for $i=1,\dots,m$, where the $c_i$'s are constants.
ADDITIONAL INFO:
- This question corresponds to exercise 22 in chapter 3 of Luenberger's book Optimization by Vector Space Methods.
- The exercise only asks for a set of linear equations which uniquely determines the solution (and not for the solution itself).
- The exercise should be solved by using the Hilbert projection theorem (and not by other techniques, like Lagrange multipliers).
My attempt:
Let $N=\operatorname{span}(y_1,\dots,y_m)$. Then the linear variety defined by the equality constraints $\langle x, y_i \rangle=c_i$ is a translation of $N^\perp$.
Since $M$ and $N^\perp$ are closed, $M \cap N^\perp$ is also closed and, by the projection theorem, $x^*$ exists and is unique, assuming that $M \cap N^\perp$ is not empty. Furthermore, $x^*-x_0 \in (M \cap N^\perp)^\perp$.
$x^* \in M \iff x^*=\sum_{j=1}^n \alpha_j x_j$ for some scalars $\alpha_j$. Defining $X$ as the matrix with columns $x_j$ and $\alpha$ as the vector of the $\alpha_j$'s, we can write $x^*=X\alpha$.
Using this in the equality constraints yields $\sum_{j=1}^n \alpha_j \langle x_j, y_i \rangle=c_i$ for $i=1,\dots,m$. Defining the $n \times m$ matrix $G$ with elements $G_{j,i}=\langle x_j, y_i \rangle$ and $c$ as the vector of the $c_i$'s, we can write $G'\alpha = c$.
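(In finite dimensions this setup is easy to simulate. The following is only an illustrative sketch, assuming $H=\mathbb{R}^d$ with the standard dot product and randomly generated, hypothetical data for the $x_j$ and $y_i$; the exercise itself makes no such assumption.)

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, m = 5, 3, 2                  # ambient dimension, number of x_j, number of y_i

X = rng.standard_normal((d, n))    # columns x_1, ..., x_n
Y = rng.standard_normal((d, m))    # columns y_1, ..., y_m

# G[j, i] = <x_j, y_i>, so G is n x m and the constraints read G' alpha = c
G = X.T @ Y
print(G.shape)                     # (3, 2)
```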
If $m \geq n$, the solution of $G'\alpha = c$ either does not exist or is unique, so assume $m < n$.
Now, I probably have to use the fact that $\langle x^*-x_0, z \rangle = 0$ for any $z \in M \cap N^\perp$. Using this, we have:
$z \in M \cap N^\perp \implies z \in M \iff z = X\beta$ for some $n$-dimensional vector $\beta$.
Also, $z \in M \cap N^\perp \implies z \in N^\perp \iff \langle z, y_i \rangle=0$ for $i=1,\dots,m$.
Replacing $z = X\beta$ in the equalities $\langle z, y_i \rangle = 0$ yields $G'\beta = 0$, where $G$ is naturally the same as above.
Finally, using $x^* = X\alpha$ and $z = X\beta$ in $\langle x^*-x_0, z \rangle = 0$, we obtain
$\langle X\alpha, X\beta \rangle = \langle x_0, X\beta \rangle$.
And this is where I am stuck right now. We have $2n$ unknowns (the vectors $\alpha$ and $\beta$) and $2m+1$ equations:
$G'\alpha = c$ ($m$ equations),
$G'\beta = 0$ ($m$ equations),
$\langle X\alpha, X\beta \rangle = \langle x_0, X\beta \rangle$ (one equation),
where the last equation is clearly non-linear. Therefore, this cannot be the solution...
vector-spaces hilbert-spaces convex-optimization
There are cases when no such $x^*$ exists. I think the assumption $M\cap N^\perp\neq\emptyset$ does not guarantee existence of $x^*$.
– supinf
Nov 22 at 14:46
Why do you say so? The projection theorem guarantees that if $x$ is in some (non-empty) subspace then a unique solution exists...
– D...
Nov 22 at 14:48
$M\cap N^\perp$ is not the feasible set from your question. You mentioned something about "translation by $c$" in the sentence before that, but that doesn't make much sense either, because $c$ would be a vector in $\mathbb{R}^m$ and $N^\perp$ is a subset of a Hilbert space.
– supinf
Nov 22 at 14:52
Yes, it is a linear variety but of course not a 'translation by $c$'. I just edited my question. The problem remains valid, though.
– D...
Nov 22 at 14:59
The Lagrangian approach results in a linear system.
– LinAlg
Nov 26 at 18:33
asked Nov 22 at 14:33 by D...
edited Dec 10 at 22:04 by Hanno
1 Answer (accepted)
The mistake is at the very last point, where you interpret $\beta$ as a solution (together with $\alpha$). Your condition basically says that
$$
\exists\beta\colon\ G'\beta=0,\ z=X\beta \perp x^*-x_0.
$$
However, it should be
$$
\forall\beta\colon\ G'\beta=0 \Rightarrow z=X\beta \perp x^*-x_0,
$$
or equivalently
$$
\beta\in\ker G' \Rightarrow \beta \perp X'(x^*-x_0).
$$
In other words, $X'(x^*-x_0)$ annihilates the kernel of $G'$. Hence, it must belong to the image of $G$:
$$
\exists\lambda\colon\ X'(x^*-x_0)=G\lambda \quad\Leftrightarrow\quad X'(X\alpha-x_0)=G\lambda.
$$
This equation, together with $G'\alpha=c$, gives $n+m$ equations in the $n+m$ unknowns $(\alpha,\lambda)$.
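(The resulting square system can be sketched numerically. This is only an illustration, assuming $H=\mathbb{R}^d$ with the standard dot product and random, hypothetical data; it is not part of the exercise, which asks only for the equations themselves.)

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, m = 6, 4, 2                  # ambient dimension, dim of span(x_j), number of constraints

X = rng.standard_normal((d, n))    # columns x_1, ..., x_n
Y = rng.standard_normal((d, m))    # columns y_1, ..., y_m
x0 = rng.standard_normal(d)
c = rng.standard_normal(m)

G = X.T @ Y                        # G[j, i] = <x_j, y_i>

# Block system:  [X'X  -G] [alpha ]   [X' x0]
#                [G'    0] [lambda] = [  c  ]
A = np.block([[X.T @ X, -G],
              [G.T, np.zeros((m, m))]])
b = np.concatenate([X.T @ x0, c])
sol = np.linalg.solve(A, b)
alpha, lam = sol[:n], sol[n:]
x_star = X @ alpha

# Feasibility: <x*, y_i> = c_i
assert np.allclose(Y.T @ x_star, c)
# Orthogonality: x* - x0 is orthogonal to every z = X beta with G' beta = 0;
# take one such beta from the null space of G' (exists since m < n).
beta = np.linalg.svd(G.T)[2][-1]
assert np.isclose(beta @ (X.T @ (x_star - x0)), 0)
```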
Wonderful explanation, thank you!
– D...
Dec 10 at 10:05
Just a quick question: when you say that $X\beta \perp x^* - x_0$ is equivalent to $\beta \perp X'(x^* - x_0)$, aren't you assuming that the inner product in $H$ is the usual dot product? In case it is not, it should suffice to define an $n$-dimensional vector $\nu$ whose $j$-th element is $\nu_j = \langle x_j, x^* - x_0 \rangle$. Then we have $\nu'\beta = 0$ and the result follows, right?
– D...
Dec 10 at 11:54
@D... You are right about the vector $\nu$. What I wrote with $\nu=X'(x^*-x_0)$ was for notational simplicity (without any assumptions on the inner product). For any vector $y$ we have $\langle X\beta, y\rangle=\langle \sum_i \beta_i x_i, y\rangle=\sum_i\beta_i\langle x_i, y\rangle=\langle \beta, \nu\rangle$.
– A.Γ.
Dec 10 at 12:10
I'm a bit confused now... You have written $\sum_i \beta_i \langle x_i, y \rangle = \langle \beta, \nu \rangle$, but how is that true if the inner product is, for instance, $\langle a, b \rangle = 2 a' b$?
– D...
Dec 10 at 13:56
@D... Sorry for the confusion. My notation $x'y$ apparently differs from yours. I mean exactly $x'y=\langle x,y\rangle$, in the sense that $x'$ is a linear functional. It is simply a notational thing. In this sense, $X'y$ in my answer above means exactly the vector of $\langle x_i,y\rangle$ and nothing else.
– A.Γ.
Dec 10 at 14:32
answered Dec 9 at 10:34 by A.Γ.