Hi, good article! I have some points that I don't really agree with, though.

> But for the sake of software engineering technical interviews, all we care about is the best and worst case scenarios, which Big O provides the tightest description of the run time and space.

Big O describes the upper bound. The "tightest description" would be provided by big Theta, which describes both the upper and the lower bound.

A function that is in O(1) is also in O(n), etc., but yes, we prefer the tightest bound to get the most information about the function. In a sense, big O as commonly used in CS really means big Theta.

> Meaning? If the input size is 2, then the function has to be done 4 times. If the input size is 3, then the function has to be done 9 times.

This isn't quite correct. It is the number of instructions/steps that grows, not the number of times the function executes.
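To illustrate the distinction, here is a minimal sketch (the function and counter names are illustrative, not from the article): the function below executes exactly once per call, but the number of basic steps it performs grows quadratically with the input size, matching the 4-steps-for-n=2 and 9-steps-for-n=3 figures quoted above.

```python
def count_pairs(items):
    """Runs a single time per call, but performs about n * n basic steps."""
    steps = 0
    n = len(items)
    for i in range(n):
        for j in range(n):
            steps += 1  # one basic instruction per (i, j) pair
    return steps

# The function is executed once; it is the step count that grows as n^2.
print(count_pairs([1, 2]))     # 4 steps for n = 2
print(count_pairs([1, 2, 3]))  # 9 steps for n = 3
```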

> Most of the time, we would describe an algorithm that has quadratic time when we have to iterate an object at least twice, examples like nested for loops.

"Iterate an object at least twice" reads like a description of two sequential (non-nested) loops rather than nested ones.
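To make that concrete, a small sketch (names are illustrative): iterating over an object twice in sequence stays linear, while nesting one loop inside the other is what produces quadratic time.

```python
def sequential_loops(items):
    """Two loops one after the other: about 2n steps, i.e. O(n)."""
    steps = 0
    for _ in items:
        steps += 1
    for _ in items:
        steps += 1
    return steps

def nested_loops(items):
    """One loop inside another: about n * n steps, i.e. O(n^2)."""
    steps = 0
    for _ in items:
        for _ in items:
            steps += 1
    return steps

data = list(range(10))
print(sequential_loops(data))  # 20 steps -> linear growth
print(nested_loops(data))      # 100 steps -> quadratic growth
```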


Hey Lukáš! Thank you so much for the feedback!

For your first and second comments, I actually paraphrased those from the resources I found. I apologize for the inaccuracy; I intended to use less technical wording, which may have come across as imprecise. I am going to fix that.

For the third comment, I would rearrange the wording; now that I am reading it again, it does sound like two non-nested loops.

Please let us know about any other inaccuracies you may find; we don't want to give false information to other learners who are also trying to understand big-O notation.

Hey, glad my feedback helped at least a little bit. It looks great.

Last time I missed this (maybe it is outside the purpose of the article, since it aims to be less technical?):

> A function with quadratic time complexity has a growth rate of n^2. Meaning? If the input size is 2, then the function will take 4 operations

The functions `n^2`, `n^2 + n`, `n^2 + 1`, etc. are all in O(n^2), but only `n^2` itself (for `n = 2`) would result in exactly 4 operations. You have a similar "issue" with exponential time and some other cases.

But I like the way you show those calculations, as they give a better idea of how the number of operations grows.
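The point can be checked numerically. A minimal sketch (the dictionary name is illustrative): all three step-count functions are in O(n^2), yet only `n^2` itself evaluates to exactly 4 at `n = 2`.

```python
# Three functions that are all in O(n^2), but give different exact counts.
step_counts = {
    "n^2":     lambda n: n**2,
    "n^2 + n": lambda n: n**2 + n,
    "n^2 + 1": lambda n: n**2 + 1,
}

for name, f in step_counts.items():
    print(f"{name}: f(2) = {f(2)}")
# n^2 gives 4, but n^2 + n gives 6 and n^2 + 1 gives 5,
# even though all three share the same O(n^2) growth rate.
```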

Same as Megan, thank you for the feedback and insight Lukáš!

Agreed.