A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items.

1) Paul E. Black, “big-O notation”, in Dictionary of Algorithms and Data Structures [online], Vreda Pieterse and Paul E. Black, eds., 31 August 2012 (accessed 6 October 2015). Available from:

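As a brief illustrative sketch (the function and variable names below are my own, not from the source): counting the basic operations of a linear search shows what it means for an algorithm's running time to be O(n) in the problem size n.

```python
def linear_search(items, target):
    """Return (index, comparisons); comparisons grows linearly with n."""
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

# Worst case (target absent): comparisons equals n, so doubling the
# input size doubles the work -- the hallmark of O(n) behavior.
for n in (1000, 2000, 4000):
    _, ops = linear_search(list(range(n)), -1)
    print(n, ops)
```

Here the measure is the count of comparisons rather than wall-clock time; big-O notation abstracts away constant factors and machine speed, keeping only how the cost grows with n.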