It's well known that asymptotic notation is used to convey the speed and efficiency of code blocks in computer programs. I haven't used it much while working with Python, so I needed to refresh my memory before trying to use this great tool.

Cardinal Rule: Focus primarily on the fastest-growing term in the time complexity expression. All other terms are essentially trumped.

O(n^4+n^2+n^3+nm+100) ~= O(n^4)
Update: assuming m is at most linear in n, i.e. m = O(n).
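A quick numeric sanity check of the rule — a sketch that evaluates each term at a large n, taking m = n (an illustrative choice consistent with the update above):

```python
# Evaluate each term of n^4 + n^2 + n^3 + n*m + 100 at a large n,
# taking m = n for illustration.
n = m = 1_000
terms = {"n^4": n**4, "n^3": n**3, "n^2": n**2, "n*m": n * m, "100": 100}
total = sum(terms.values())

# The n^4 term accounts for nearly all of the total, which is why
# the lower-order terms get dropped from the big-O expression.
share = terms["n^4"] / total
print(f"n^4 share of total: {share:.4%}")
```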

Trump Rules for Time Complexity:

  • Notes
    • Remember that we care most about the upper bound and are not so concerned with the lower (in general)
    • The smaller the upper bound, the better (and consequently, the faster)
  • The Ladder
    • Constants are less than logarithms
    • Logarithms are less than polynomials
    • Polynomials are less than exponentials
    • Exponentials are less than factorials

Notation and Hierarchy (Smaller Is Better):

Constant Θ(1)
Logarithmic Θ(lg n)
Linear Θ(n)
Loglinear Θ(n lg n)
Quadratic Θ(n^2)
Cubic Θ(n^3)
Polynomial O(n^k)
Exponential O(k^n)
Factorial Θ(n!)
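The ladder can be checked numerically — a sketch comparing each rung at a single n, small enough that the factorial is still easy to compute:

```python
import math

# Compare the growth-rate ladder at n = 20.
n = 20
rungs = [
    ("constant", 1),
    ("logarithmic", math.log2(n)),
    ("linear", n),
    ("loglinear", n * math.log2(n)),
    ("quadratic", n**2),
    ("cubic", n**3),
    ("exponential", 2**n),
    ("factorial", math.factorial(n)),
]

# Each rung strictly dominates the one before it at this n.
for (name_a, a), (name_b, b) in zip(rungs, rungs[1:]):
    assert a < b, (name_a, name_b)
```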

Quick Examples:

[i for i in items] {linear}
Functions that operate on lists or generators (sum, map, filter, reduce, min, max, etc.) generally tend to be linear in time complexity.
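Each of these builtins makes a single pass over its input, so they're all O(n). A small sketch (note that in Python 3, reduce lives in functools):

```python
from functools import reduce  # reduce moved to functools in Python 3

data = list(range(10))

# Each call below makes exactly one pass over data: O(n).
total = sum(data)
largest = max(data)
smallest = min(data)
doubled = list(map(lambda x: 2 * x, data))
evens = list(filter(lambda x: x % 2 == 0, data))
running_sum = reduce(lambda a, b: a + b, data)  # same result as sum(data)
```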

[i+k for i in items for k in items] {quadratic}
[i+k for i in list1 for k in list2] O(len(list1) * len(list2)) {quadratic when the lists are the same size, since it's linear * linear}
Add 1 to the exponent for each nested loop. For example, [j+i+k+n for j in list1 for i in list1 for k in list1 for n in list1] would have a time complexity of O(n^4)
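One way to convince yourself of the exponent rule is to count iterations directly — a sketch, with the list size n chosen arbitrarily:

```python
# Count iterations of nested comprehensions over a list of size n;
# a k-level nesting should run exactly n**k times.
n = 7
items = list(range(n))

count2 = sum(1 for i in items for k in items)            # two levels
count4 = sum(1 for j in items for i in items
               for k in items for m in items)            # four levels

assert count2 == n**2   # one exponent per loop level
assert count4 == n**4
```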

Note: Some programmers cut the work of a quadratic nested loop roughly in half when using sorted lists by ensuring that no pair is evaluated more than once: the inner loop starts just past the outer index, so less and less has to be evaluated through each iteration of the outer loop. The complexity is still O(n^2), but the constant factor drops and the code block runs faster. For example:

list1 = [i for i in range(10)]
size = len(list1)
for n in range(size - 1):
    for k in range(n + 1, size):  # start past n so each pair is seen once
        print(n + k)
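The saving can be made concrete by counting iterations of the pattern above against a full double loop — a sketch over the same ten-element list:

```python
# Compare a full double loop with the "start at n+1" triangular loop.
list1 = [i for i in range(10)]
size = len(list1)

full = sum(1 for n in range(size) for k in range(size))           # size**2 pairs
tri = sum(1 for n in range(size - 1) for k in range(n + 1, size)) # each pair once

# The triangular loop does size*(size-1)/2 evaluations -- still
# quadratic, but roughly half the work of the full double loop.
assert full == size ** 2
assert tri == size * (size - 1) // 2
```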