Proof that the Booth encoding and the modified Booth encoding are correct























In this book, sections 9.4 and 10.2 explain how to implement the Booth encoding to speed up operations in hardware. I've been searching for a reference with a formal proof of why this encoding is correct. I understand the rationale behind the encoding, but I don't understand why it works for radices greater than 2.



For example, consider an $n$-bit two's complement number



$$
x = -x_{n-1}2^{n-1} + \sum_{j=0}^{n-2} x_j 2^j
$$



The radix-$2$ Booth encoding consists in defining the digits



$$
y_j = -x_j + x_{j-1} \;,\; 0 \leq j \leq n - 1
$$
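(As a sanity check, here is a small Python sketch I use to convince myself numerically; the function names are mine, not from the book. It builds the radix-$2$ digits from an LSB-first two's complement bit vector and checks exhaustively, for small widths, that $\sum_j y_j 2^j$ gives back $x$.)

```python
# Minimal numerical check of the radix-2 Booth recoding (my own sketch, not
# the implementation from the book).  Bits are LSB-first and the word is read
# in two's complement: x = -x_{n-1} 2^{n-1} + sum_{j < n-1} x_j 2^j.
from itertools import product

def twos_complement_value(bits):
    n = len(bits)
    return -bits[-1] * 2 ** (n - 1) + sum(b * 2 ** j for j, b in enumerate(bits[:-1]))

def booth_radix2_digits(bits):
    # y_j = -x_j + x_{j-1}, with x_{-1} = 0; each digit lies in {-1, 0, 1}.
    return [-bits[j] + (bits[j - 1] if j > 0 else 0) for j in range(len(bits))]

def recoded_value(digits):
    return sum(d * 2 ** j for j, d in enumerate(digits))

# Exhaustive check for all words up to 8 bits wide.
for n in range(1, 9):
    for bits in map(list, product((0, 1), repeat=n)):
        assert recoded_value(booth_radix2_digits(bits)) == twos_complement_value(bits)
print("radix-2 Booth recoding agrees with the two's complement value")
```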



In radix $4$ it becomes (with $j$ a multiple of $2$)



$$
y_{j/2} = -2x_{j+1} + x_j + x_{j-1}
$$



In radix $8$ (with $j$ a multiple of $3$)



$$
y_{j/3} = -4x_{j+2} + 2x_{j+1} + x_j + x_{j-1}
$$



In radix $16$ (with $j$ a multiple of $4$)



$$
y_{j/4} = -8x_{j+3} + 4x_{j+2} + 2x_{j+1} + x_j + x_{j-1}
$$



And so on. But I cannot work out how to prove that the transformation is, essentially, invertible, and I'm not sure how the problem has to be set up in order to prove that. Could you help me?
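(For radices $4$, $8$ and $16$ I can at least check the recoding numerically with a sketch like the one below; the sign extension to a multiple of $k$ bits and the digit count $m = \lceil n/k \rceil$ are my own assumptions, not taken from the book.)

```python
# Sketch of the radix-2^k Booth recoding: take k bits at a time plus the bit
# just below the group.  Bits are LSB-first, two's complement; the word is
# sign-extended so that its length is a multiple of k (my assumption).
from itertools import product

def twos_complement_value(bits):
    n = len(bits)
    return -bits[-1] * 2 ** (n - 1) + sum(b * 2 ** j for j, b in enumerate(bits[:-1]))

def booth_radix2k_digits(bits, k):
    m = -(-len(bits) // k)                                # number of digits, ceil(n / k)
    ext = list(bits) + [bits[-1]] * (m * k - len(bits))   # sign extension
    digits = []
    for i in range(m):
        j = i * k
        d = (-ext[j + k - 1] * 2 ** (k - 1)                    # -2^{k-1} x_{j+k-1}
             + sum(ext[j + l] * 2 ** l for l in range(k - 1))  # + x_{j+l} 2^l
             + (ext[j - 1] if j > 0 else 0))                   # + x_{j-1}
        digits.append(d)                 # each digit lies in [-2^{k-1}, 2^{k-1}]
    return digits

# Exhaustive check of radices 4, 8 and 16 for all words up to 8 bits wide.
for k in (2, 3, 4):
    for n in range(1, 9):
        for bits in map(list, product((0, 1), repeat=n)):
            y = booth_radix2k_digits(bits, k)
            assert sum(d * 2 ** (k * i) for i, d in enumerate(y)) == twos_complement_value(bits)
print("radix 4/8/16 recodings agree with the two's complement value")
```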



What I am sure about is that the transformation that gives the digits $y_j$ is linear, so maybe I could set up a linear system.



I want to formalize this because it is easy to get the intuition that motivates the radix-$2$ Booth encoding, but increasing the radix makes me lose that intuition.



My attempt to formalize the problem is the following. Let's define $r = 2^k$; then



$$
y = \sum_{\left\{\begin{array}{l} i=0\\ j=ki \end{array}\right.}^{m - 1} y_{j/k}\, 2^{ik}
$$



where



$$
y_{j/k} = -x_{j+k-1}2^{k-1} + \sum_{l=0}^{k-2} x_{j+l} 2^{l} + x_{j-1}
$$



However, I don't know how to find the value $m$ in the first place, and I'm also confused about how to use the expansion of $y$ in terms of $y_{j/k}$ to retrieve the digits of $x$.



Update: maybe this could work (I'm changing the notation a bit).



Let's start with radix $2$; we have



$$
\sum_{j=0}^{n-1} x_j 2^j = \sum_{j=0}^n \left(-x_j+x_{j-1}\right)2^j = \sum_{j=0}^n y^1_j 2^j
$$



where



$$
y^1_j = -x_j + x_{j-1} \;,\; 0 \leq j \leq n
$$



where $x_n = x_{-1} = 0$
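(For what it's worth, I think the radix-$2$ identity above can be verified by splitting the sum and shifting the index, using $x_n = x_{-1} = 0$:)

$$
\sum_{j=0}^n \left(-x_j + x_{j-1}\right) 2^j
= -\sum_{j=0}^{n} x_j 2^j + \sum_{j=0}^{n} x_{j-1} 2^j
= -\sum_{j=0}^{n-1} x_j 2^j + 2 \sum_{j=0}^{n-1} x_j 2^j
= \sum_{j=0}^{n-1} x_j 2^j
$$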



Let's move to higher radices now. For radix $4=2^2$ we need to consider the digits obtained as



$$
y^2_j = 2\,y^1_{2j+1} + y^1_{2j} = -2x_{2j+1} + x_{2j} + x_{2j-1} \;,\; 0 \leq j \leq \left\lceil \frac{n-1}{2} \right\rceil
$$



For radix $8=2^3$ we consider instead



$$
y^3_j = 4\,y^1_{3j+2} + 2\,y^1_{3j+1} + y^1_{3j} = -4x_{3j+2} + 2x_{3j+1} + x_{3j} + x_{3j-1} \;,\; 0 \leq j \leq \left\lceil \frac{n-2}{3} \right\rceil
$$



In general, i.e. for radix $2^k$, we have



$$
\begin{multline}
y^k_j = 2^{k-1} y^1_{kj + k - 1} + 2^{k-2} y^1_{kj + k - 2} + \ldots + y^1_{kj} \\
= -2^{k-1} x_{kj+k-1} + 2^{k-2} x_{kj + k - 2} + \ldots + x_{kj} + x_{kj-1} \;,\; \\
0 \leq j \leq \left\lceil \frac{n-k+1}{k} \right\rceil
\end{multline}
$$



Is this sufficient to state that



$$
x = \sum_{j=0}^{n} x_j 2^j = \sum_{j=0}^{\left\lceil \frac{n-k+1}{k}\right\rceil} y_j^k\, 2^{kj}
$$
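(My hope is that the regrouping below justifies it, since every index $i$ can be written uniquely as $i = kj + l$ with $0 \leq l \leq k - 1$, taking the digits $y^1_i$ beyond $i = n$ to be zero:)

$$
\sum_{j} y^k_j\, 2^{kj}
= \sum_{j} \left( \sum_{l=0}^{k-1} y^1_{kj+l}\, 2^{l} \right) 2^{kj}
= \sum_{i} y^1_i\, 2^{i}
= \sum_{j=0}^{n-1} x_j 2^j
$$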

























linear-algebra sequences-and-series computer-arithmetic






asked Dec 15 '16 at 17:13 by user8469759 · edited Nov 21 at 9:12












  • very good book, thanks for it
    – dato datuashvili
    Dec 15 '16 at 20:11

















