Cubic Time
After quadratic comes cubic time complexity. An algorithm runs in cubic time when its performance is directly proportional to the cube of the problem's size. In big O notation, you express a cubic algorithm as O(n**3). An algorithm with cubic complexity is similar to one with quadratic complexity, except n is raised to the third power instead of the second.
Here is an algorithm with cubic time complexity:
numbers = [1, 2, 3, 4, 5]
for i in numbers:
    for j in numbers:
        for h in numbers:
            x = i + j + h
            print(x)
The equation for this algorithm is as follows:
f(n) = 1 + n * n * n * (1 + 1)
Or as follows:
f(n) = 1 + 2 * n**3
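If it helps to see where those numbers come from, here is a minimal sketch (my own illustration, not from the original text) that tallies the operations the equation describes: the single list assignment, plus the two statements executed on every pass through the innermost loop.

# Count the operations performed by the triple loop over a list of length n.
numbers = [1, 2, 3, 4, 5]
n = len(numbers)

operations = 1  # the list assignment
for i in numbers:
    for j in numbers:
        for h in numbers:
            operations += 2  # x = i + j + h and print(x)

print(operations)       # 251
print(1 + 2 * n**3)     # 251, matching f(n) = 1 + 2 * n**3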
As with quadratic complexity, the most critical part of this equation is n**3, which grows so quickly that it makes the rest of the equation irrelevant, even if it included an n**2 term. So, in big O notation, cubic complexity is as follows:
O(n**3)
While two nested loops are a sign of quadratic time complexity, three nested loops that each run from 0 to n mean the algorithm runs in cubic time. You will most likely encounter cubic time complexity if your work involves data science or statistics.
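A classic place cubic time shows up is naive matrix multiplication, which uses three nested loops over an n-by-n matrix. The sketch below is an illustration of that pattern, not an example from the text.

# Naive matrix multiplication: three nested loops over n, so roughly n**3 steps.
def multiply(a, b):
    n = len(a)
    result = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                result[i][j] += a[i][k] * b[k][j]
    return result

print(multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# [[19, 22], [43, 50]]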
Both quadratic and cubic time complexities are special cases of a larger family of polynomial time complexities. An algorithm that runs in polynomial time scales as O(n**a), where a = 2 for quadratic time and a = 3 for cubic time. When designing your algorithms, you generally want to avoid polynomial scaling when possible because these algorithms can get very slow as n gets larger. Sometimes you can't escape polynomial scaling, but you can find comfort in knowing that polynomial complexity is still far from the worst case.
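To get a feel for why polynomial scaling becomes painful, here is a small sketch (my own illustration) comparing the rough step counts of quadratic and cubic algorithms as n grows.

# Rough step counts for O(n**2) and O(n**3) as n grows.
for n in (10, 100, 1000):
    print(n, n**2, n**3)

# 10 100 1000
# 100 10000 1000000
# 1000 1000000 1000000000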