# Can someone explain how Big-Oh works with Summations?

I know this isn't strictly a programming question, but it *is* a computer science question so I'm hoping someone can help me.

I've been working on my Algorithms homework, figuring out the Big-Oh, Big-Omega, Theta, etc., of several algorithms. I'm proving them by finding their c and n_{0} values, and all is going well.

However, I've come to the last two problems in the set and I'm struggling to figure out how to do them (and Google isn't helping much).

I haven't had to figure out the Big-Oh/Omega of summations before.

My last two problems are:

- Show that
**Σ (i=1 to n) of i^{2}** is O(n^{3})

and

- Show that
**Σ (i=1 to n) of [log_{2} i]** is Ω(n log n)

My question is: how do I show that?

For example, in the first one, I can't intuitively see how that summation of i^{2} is O(n^{3}). The second one confuses me even more. Can someone explain how to show the Big-Oh and Big-Omega of these summations?

## Answers:

My *guess* is that the question means you're summing the results of some computation whose running time is proportional to i^{2} in the first case and to log_{2} i in the second. In both cases, the overall sum is dominated by the terms near i = n, so a quick upper bound is n·O(f(n)), where f is the function being summed: there are n terms, and each is at most f(n).
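A standard way to prove both bounds is sketched below (the constants c and n_{0} here are my own choices, not taken from the problem set): bound every term by the largest one for the Big-Oh, and keep only the top half of the terms for the Big-Omega.

```latex
% Upper bound: every term i^2 is at most n^2, and there are n terms.
\sum_{i=1}^{n} i^2 \;\le\; \sum_{i=1}^{n} n^2 \;=\; n \cdot n^2 \;=\; n^3,
\qquad \text{so } \sum_{i=1}^{n} i^2 = O(n^3) \text{ with } c = 1,\; n_0 = 1.

% Lower bound: drop the small terms and keep only i = \lceil n/2 \rceil, \dots, n.
% Each remaining term is at least \log_2(n/2) = \log_2 n - 1, and there are
% at least n/2 of them.
\sum_{i=1}^{n} \log_2 i \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} \log_2 \tfrac{n}{2}
\;\ge\; \frac{n}{2}\bigl(\log_2 n - 1\bigr) \;=\; \Omega(n \log n).
```

The same two tricks handle most summation bounds in an intro algorithms course: for Big-Oh, replace every term with the maximum term; for Big-Omega, throw away the smaller half and lower-bound what's left.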