Big Oh Notation: Proving that $n! \in \Omega(7^n)$
Problem
I've got the following statement which I'm looking to prove:
$\log_2(n!) \in \mathcal{O}(n \cdot \log_3(n))$
The question is: how to do it?
Steps taken so far
My approach so far was to apply a few laws regarding the logarithms as follows:
$\Leftrightarrow \left(\log_2(n!)\right) \in \mathcal{O}\left(n \cdot \log_3(n)\right)$
$\Leftrightarrow \left(\log_2(n!)\right) \in \mathcal{O}\left(\log_3(n^n)\right)$
$\Leftrightarrow \left(\frac{\ln(n!)}{\ln(2)}\right) \in \mathcal{O}\left(\frac{\ln(n^n)}{\ln(3)}\right)$
$\Leftrightarrow \left(\frac{1}{\ln(2)} \cdot \ln(n!)\right) \in \mathcal{O}\left(\frac{1}{\ln(3)} \cdot \ln(n^n)\right)$
which approximately boils down to:
$\underline{\Leftrightarrow \left(1.44 \cdot \ln(n!)\right) \in \mathcal{O}\left(0.91 \cdot \ln(n^n)\right)}$
Unfortunately, that's still not particularly helpful. Of course, I realize that $n^n$ is going to grow much faster than $n!$. Still, the natural logarithms combined with the constants are making it hard for me to estimate which of the two terms might be the "smaller" one.
Therefore, I'd greatly appreciate your ideas. In case we can't find a fully formal proof, a more informal one would certainly be helpful nevertheless.
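As a quick, informal sanity check (not a proof), one can compare $\log_2(n!)$ with $n \cdot \log_3(n)$ numerically and watch their ratio: if the claim holds, the ratio should stay bounded as $n$ grows. The snippet below is only an illustration; the helper name log2_factorial is an arbitrary choice, not something from the question.

    import math

    def log2_factorial(n):
        # log2(n!) computed as a sum of log2(k), avoiding huge intermediate factorials
        return sum(math.log2(k) for k in range(1, n + 1))

    for n in (10, 100, 1000, 10000):
        bound = n * math.log(n, 3)  # n * log3(n)
        print(f"n = {n:>6}: log2(n!) / (n * log3(n)) = {log2_factorial(n) / bound:.4f}")

The printed ratios stay below $\ln(3)/\ln(2) \approx 1.585$ and slowly approach it, which is consistent with $\log_2(n!) \in \mathcal{O}(n \cdot \log_3(n))$ but of course does not prove it.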
computer-science computational-complexity
Isn't the behavior of $\ln$ such that it tapers off for outrageously large values?
– T.Woody
Nov 17 at 22:14
Sure, it starts growing slower and slower with growing $x$-values, just like any logarithm I suppose. Still, I don't see how that would be helpful to answer this question?
– StckXchnge-nub12
Nov 17 at 22:19
It would help because we can visually see that this is the case as we take the limit to infinity.
– T.Woody
Nov 17 at 22:20
Supposing you know $n! = O(n^n)$ (easy to prove), then $\log(n!) = O(\log(n^n))$. Note $\log(n^n) = n \cdot \log(n)$, and of course you can take care of the different bases of the log...
– Marco Bellocchi
Nov 17 at 22:26
@T.Woody I see limited use in this. Sure, $\ln$ keeps growing more and more slowly, but still, $\lim_{x \rightarrow \infty} \ln(x) = \infty$. It's not as if we had reason to treat $\ln$ as constant after passing a certain $x$-boundary.
– StckXchnge-nub12
Nov 17 at 22:29
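Following up on Marco Bellocchi's hint, here is one way the argument could be finished (a sketch, using only the elementary bound $n! \le n^n$, which holds because each of the $n$ factors of $n!$ is at most $n$):

$$n! \le n^n \;\Rightarrow\; \ln(n!) \le n \cdot \ln(n) \;\Rightarrow\; \log_2(n!) = \frac{\ln(n!)}{\ln(2)} \le \frac{n \cdot \ln(n)}{\ln(2)} = \frac{\ln(3)}{\ln(2)} \cdot n \cdot \log_3(n).$$

So $\log_2(n!) \le c \cdot n \cdot \log_3(n)$ for all $n \ge 1$ with $c = \ln(3)/\ln(2) \approx 1.585$, which is exactly the definition of $\log_2(n!) \in \mathcal{O}(n \cdot \log_3(n))$.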
edited Nov 17 at 22:07 by Rócherz
asked Nov 17 at 22:02 by StckXchnge-nub12