Big O Notation

computer-science

A mathematical notation that describes an upper bound on how an algorithm's running time or memory use grows with input size. Common complexities, from fastest to slowest growth, include O(1), O(log n), O(n), O(n log n), and O(n^2). Nearly every coding interview expects you to analyze the Big O complexity of your solution and, where possible, optimize it.
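As an illustrative sketch (the function names below are hypothetical, not from any particular library), compare an O(n) linear scan with an O(log n) binary search over the same sorted data: the linear scan may inspect every element, while binary search halves the remaining range on each step.

```python
def linear_search(items, target):
    """O(n): worst case inspects every element."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def binary_search(items, target):
    """O(log n): halves the search space each step; requires sorted input."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(1_000_000))
# Both find the same index, but linear_search may do up to
# 1,000,000 comparisons while binary_search needs at most ~20.
print(linear_search(data, 999_999))
print(binary_search(data, 999_999))
```

For a million elements, log2(1,000,000) is roughly 20, which is why the gap between O(n) and O(log n) dominates at scale.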

Related Terms