algorithm - When to use Big O instead of theta or little o
A question about asymptotic notation: I've seen a lot of explanations of asymptotic notation that say:
Θ(...) is analogous to =
O(...) is analogous to <=
o(...) is analogous to <
which would seem to imply that if f(n) = O(g(n)), then either f(n) = Θ(g(n)) or f(n) = o(g(n)).
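For reference, the usual definitions behind this analogy (stated here for positive functions) make the correspondence precise:

$$
\begin{aligned}
f(n) = O(g(n)) &\iff \exists\, C > 0,\ \exists\, N,\ \forall\, n \ge N:\ f(n) \le C\, g(n),\\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } g(n) = O(f(n)),\\
f(n) = o(g(n)) &\iff \lim_{n \to \infty} f(n)/g(n) = 0.
\end{aligned}
$$

Note that the analogy with <=, =, < is only heuristic: the ratio f(n)/g(n) can stay bounded while neither tending to 0 nor staying bounded away from 0, and that gap is exactly where O can hold without either Θ or o.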
Is it possible to have f(n) = O(g(n)) such that neither f(n) = Θ(g(n)) nor f(n) = o(g(n))? If so, what is an example of this? And if not, why would we ever use O(...) when Θ(...) or o(...) are stronger descriptors?
Let f(n) = k!, where k is the smallest integer such that n <= k!.

Then f(n) is not Θ(n), since f(k!+1)/(k!+1) = (k+1)!/(k!+1) tends to infinity, and f(n) is not o(n), since f(k!) = k!, i.e., the ratio f(n)/n equals 1 infinitely often. What does hold is f(n) = Ω(n), because n <= f(n) by construction. Swapping the two functions turns this into exactly the example asked for: n = O(f(n)) (as n <= f(n)), yet n is neither Θ(f(n)) (n/f(n) tends to 0 along n = k! + 1) nor o(f(n)) (n/f(n) = 1 whenever n = k!).
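As a quick sanity check of this construction, here is a minimal Python sketch (the helper f and the sampled points are illustrative, chosen to match the two subsequences used in the answer):

    import math

    def f(n):
        # f(n) = k!, where k is the smallest integer with n <= k!
        k = 1
        while math.factorial(k) < n:
            k += 1
        return math.factorial(k)

    # Along n = k!, f(n)/n stays at 1, so f is not o(n);
    # along n = k! + 1, f(n)/n grows like k + 1, so f is not O(n)
    # (and hence not Θ(n)).
    for k in range(2, 8):
        n_eq, n_next = math.factorial(k), math.factorial(k) + 1
        print(k, f(n_eq) / n_eq, f(n_next) / n_next)

The second printed ratio is (k+1)!/(k!+1), roughly k + 1, so the two columns show f(n)/n oscillating between 1 and an unbounded sequence.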