Big O Notation (Time Complexities)

August 2, 2022
Prerequisites: Principles of Programming 
Course hours: 5-15 hours 
Assessments: Summative Quiz 
Accreditation: NIL 
Certificate: WYWM Certificate of Completion
Instructor Support: Yes 
Difficulty: Intermediate 

In computer science, Big O notation is used to classify algorithms according to how their running time increases as the input size grows. Big O notation formalises the notion of “how long an algorithm takes to run”. We use it to describe the worst-case runtime. 
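
To make "worst-case runtime" concrete, here is a minimal sketch of a linear search (written in Python purely as an illustration; the course does not prescribe a language). Its running time depends on where, or whether, the target appears in the list:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == target:   # best case: the target is the first element
            return i
    return -1                 # worst case: every one of the n elements was checked

# Worst case: the target is missing, so all n elements are inspected -> O(n).
print(linear_search([3, 1, 4, 1, 5], 9))  # -1
```

Big O describes this worst case: no matter how lucky the input is, the search never takes more than a number of steps proportional to n.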

By taking this course, you will learn to optimise your code so it runs more efficiently. It will also help you understand why code can take far longer to run when it is written inefficiently!

After completing this course, students will be able to:  

  • Identify the time complexity of an algorithm on a graph
  • Explain why the time complexity of an algorithm is given a specific label (see the sketch after this list):
    1. O(1)
    2. O(log n)
    3. O(n)
    4. O(n²)
    5. O(n log n)
  • Interpret algorithms to determine their time complexity
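
As an illustrative sketch (again assuming Python, since the course does not mandate a language), here is one small function for each complexity class listed above:

```python
def constant(items):          # O(1): one operation, regardless of input size
    return items[0]

def logarithmic(n):           # O(log n): the problem is halved at every step
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def linear(items):            # O(n): each element is touched exactly once
    return sum(items)

def quadratic(items):         # O(n^2): every element is paired with every other
    pairs = 0
    for a in items:
        for b in items:
            pairs += 1
    return pairs

def linearithmic(items):      # O(n log n): merge sort as a classic example
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left, right = linearithmic(items[:mid]), linearithmic(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

data = [5, 3, 8, 1, 9, 2]
print(constant(data), logarithmic(len(data)), linear(data))
print(quadratic(data), linearithmic(data))
```

Doubling the input size leaves the O(1) function unchanged, adds one step to the O(log n) function, doubles the O(n) work, and quadruples the O(n²) work, which is the behaviour the course teaches you to recognise on a graph.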

Course Includes

  • 2 Lessons
  • Course Certificate