We can describe the efficiency of an algorithm, program, or programmatic operation in terms of the time it takes, the memory it uses, or the secondary storage space it needs to do its work. However, these performance measures depend on a number of factors, not least the characteristics of the computer the program is running on. Big O tells us something else: it describes how the time taken by a program (or its memory and storage usage) grows with the amount of data it is given to work on. Big O therefore tells us how well a program scales, and we say that it describes the ‘complexity’ of a program. This video explains what is meant by time complexity and shows how worst case Big O complexities can be derived for some well known algorithms, including the linear search, stack push and pop operations, the bubble sort, the binary search and the merge sort.
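As a rough illustration of the idea (a sketch, not taken from the video itself), here are Python versions of two of the searches mentioned above. A linear search may have to examine every one of the n items, so its worst case is O(n); a binary search halves the remaining range on each step, giving a worst case of O(log n).

```python
def linear_search(items, target):
    """Scan each element in turn; worst case checks all n items -> O(n)."""
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1  # target not present


def binary_search(sorted_items, target):
    """Halve the search range each step; worst case ~log2(n) steps -> O(log n).

    Requires the input list to already be sorted.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1  # discard the lower half
        else:
            hi = mid - 1  # discard the upper half
    return -1  # target not present
```

Doubling the size of the input roughly doubles the worst-case work for `linear_search`, but adds only one extra step for `binary_search`, which is exactly the scaling behaviour Big O captures.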