Q1
Q1 Which of the following is a characteristic of an efficient algorithm?
Uses minimal CPU time
Uses minimal memory
Is easy to implement
All of the above
Q2
Q2 The process of breaking a complex problem into smaller, more manageable parts is known as what?
Decomposition
Abstraction
Encapsulation
Inheritance
Q3
Q3 What is the primary purpose of pseudocode?
To document algorithms in natural language
To compile and execute algorithms
To debug code
To optimize algorithms
Q4
Q4 In algorithm analysis, what does asymptotic complexity refer to?
The complexity in the best case
The complexity in the worst case
The complexity in the average case
The behavior of an algorithm as the input size grows
Q5
Q5 What will be the output of the following pseudocode if the input is 5?
function factorial(n):
    if n == 1:
        return 1
    else:
        return n * factorial(n - 1)
5
24
120
None of the above
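Reference sketch for Q5 (not part of the quiz): a minimal runnable Python version of the pseudocode above, assuming the same base case of n == 1.
def factorial(n):
    if n == 1:                       # base case from the pseudocode
        return 1
    return n * factorial(n - 1)      # multiply n by the factorial of n - 1
print(factorial(5))                  # prints 120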
Q6
Q6 Consider the following algorithm for calculating the nth Fibonacci number:
function fib(n):
    if n <= 1:
        return n
    else:
        return fib(n - 1) + fib(n - 2)
What is the time complexity of this algorithm?
O(n)
O(log n)
O(n^2)
O(2^n)
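Reference sketch for Q6 (not part of the quiz): a direct Python transcription of the algorithm. Each call spawns two further calls, so the call tree, and hence the running time, grows roughly like 2^n.
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)   # two recursive calls per invocation
print(fib(10))                       # 55; already slow for n around 40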
Q7
Q7 An algorithm that is supposed to calculate the sum of the numbers from 1 to n returns a higher value than expected.
What is the most likely mistake?
Starting the loop from 0
Not initializing the sum variable
Adding n twice
All of the above
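Reference sketch for Q7 (not part of the quiz): a correct linear sum of 1..n next to a hypothetical buggy variant that counts n twice, the kind of mistake the question describes. Both function names are made up for illustration.
def sum_correct(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total
def sum_buggy(n):
    total = n                        # bug: n is counted here...
    for i in range(1, n + 1):
        total += i                   # ...and again inside the loop
    return total
print(sum_correct(5), sum_buggy(5))  # 15 20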
Q8
Q8 Given an algorithm that always returns the first element of a supposedly sorted list instead of the smallest element, what is likely the issue?
The algorithm incorrectly assumes the first element is the smallest
The list is not properly sorted
A loop iterates incorrectly
All of the above
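Reference sketch for Q8 (not part of the quiz): a correct minimum-finding loop for contrast with the faulty behaviour in the question; the helper name is hypothetical.
def find_min(arr):
    smallest = arr[0]                # start from the first element...
    for value in arr[1:]:
        if value < smallest:         # ...but keep scanning the rest
            smallest = value
    return smallest
print(find_min([4, 2, 9, 1]))        # 1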
Q9
Q9 Which Big O notation represents constant time complexity?
O(1)
O(n)
O(log n)
O(n^2)
Q10
Q10 For a linear search in an unsorted array of n elements, what is the average case time complexity?
O(1)
O(n)
O(log n)
O(n^2)
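Reference sketch for Q10 (not part of the quiz): a plain linear search. On average it inspects about half of the n elements, which is still O(n).
def linear_search(arr, target):
    for i, value in enumerate(arr):
        if value == target:
            return i                 # index of the first match
    return -1                        # not found after scanning all n elements
print(linear_search([7, 3, 9, 1], 9))  # 2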
Q11
Q11 What does the Big O notation O(n^2) signify about an algorithm's growth rate?
Linear growth
Quadratic growth
Logarithmic growth
Exponential growth
Q12
Q12 In Big O notation, what does O(log n) typically represent?
The time complexity of binary search
The time complexity of linear search
The space complexity of sorting algorithms
The space complexity of hashing
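Reference sketch for Q12 (not part of the quiz): an iterative binary search on a list assumed to be sorted in ascending order, the textbook O(log n) example because each step halves the remaining range.
def binary_search(sorted_arr, target):
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            lo = mid + 1             # discard the lower half
        else:
            hi = mid - 1             # discard the upper half
    return -1
print(binary_search([1, 3, 7, 9, 12], 9))  # 3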
Q13
Q13 Which of the following best describes the time complexity of inserting an element into a binary search tree?
O(1)
O(log n)
O(n)
O(n log n)
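Reference sketch for Q13 (not part of the quiz): a minimal binary search tree insert. On a reasonably balanced tree the insert walks one root-to-leaf path, O(log n); a degenerate, list-like tree degrades this to O(n). The class and function names are illustrative.
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
def bst_insert(root, key):
    if root is None:
        return Node(key)             # empty spot found: attach the new node
    if key < root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root
root = None
for k in [5, 2, 8, 1]:
    root = bst_insert(root, k)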
Q14
Q14 What is the worst-case time complexity of quicksort?
O(n log n)
O(n)
O(n^2)
O(log n)
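Reference sketch for Q14 (not part of the quiz): a simple quicksort that always picks the first element as the pivot. On already-sorted input every partition is maximally unbalanced, which is what produces the O(n^2) worst case.
def quicksort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[0]                   # naive pivot choice
    smaller = [x for x in arr[1:] if x < pivot]
    larger = [x for x in arr[1:] if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)
print(quicksort([3, 1, 4, 1, 5]))    # [1, 1, 3, 4, 5]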
Q15
Q15 How does the space complexity of an iterative solution compare to a recursive solution for the same problem?
Iterative solutions always use more space
Recursive solutions always use more space
Depends on the specific problem
They use the same amount of space
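Reference sketch for Q15 (not part of the quiz): two factorial implementations. The recursive one keeps a stack frame per call, so it uses O(n) extra space; the iterative one uses O(1) extra space. Which form is cheaper in general depends on the specific problem.
def factorial_recursive(n):
    return 1 if n <= 1 else n * factorial_recursive(n - 1)   # n stack frames at the deepest point
def factorial_iterative(n):
    result = 1                       # single accumulator, constant extra space
    for i in range(2, n + 1):
        result *= i
    return result
print(factorial_recursive(5), factorial_iterative(5))         # 120 120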
Q16
Q16 What is the time complexity of the following code snippet?
for i in range(n):
    print(i)
O(1)
O(n)
O(log n)
O(n^2)
Q17
Q17 What is the time complexity of the following code?
for i in range(n):
    for j in range(n):
        print(i, j)
O(1)
O(n)
O(n log n)
O(n^2)
Q18
Q18 Analyze the time complexity of the following function:
def func(n):
    if n <= 1:
        return n
    return func(n // 2) + func(n // 2)
O(n)
O(log n)
O(n^2)
O(n log n)
Q19
Q19 An algorithm that should run in O(n log n) time runs significantly slower than expected.
The likely cause is:
Incorrect base case in recursion
Excessive memory allocation
Poor choice of pivot in sorting
All of the above
Q20
Q20 A function designed to run in O(n) time slows down far more than linearly as n increases.
What might have been overlooked?
Nested loops
Constant factors
Linear operations
None of the above
Q21
Q21 A recursive algorithm expected to run in O(log n) time is running noticeably slower.
The likely issue is:
Not halving the input on each recursive call
Incorrect termination condition
Stack overflow
All of the above
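Reference sketch for Q21 (not part of the quiz): the first function halves its input on every call and so makes O(log n) calls; the hypothetical buggy second one only decrements, so it makes O(n) calls instead.
def count_halving(n):
    if n <= 1:
        return 0
    return 1 + count_halving(n // 2)   # input halved: O(log n) calls
def count_decrement(n):
    if n <= 1:
        return 0
    return 1 + count_decrement(n - 1)  # input shrinks by 1: O(n) calls
print(count_halving(256), count_decrement(256))  # 8 255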
Q22
Q22 Which data structure should be used to store a collection of characters in a sequence?
Array
Stack
Queue
Graph
Q23
Q23 What is the time complexity of accessing an element in an array by its index?
O(1)
O(n)
O(log n)
O(n^2)
Q24
Q24 Which of the following is NOT a valid reason to use a StringBuilder in Java instead of concatenating strings with the + operator?
It reduces memory usage
It is faster for concatenating multiple strings
It is immutable
It can be used in multi-threaded environments
Q25
Q25 In an unsorted array of integers, what is the best time complexity achievable for searching for a specific value?
O(1)
O(n)
O(log n)
O(n^2)
Q26
Q26 For a character array representing a string of length n, what is the space complexity of storing the string?
O(1)
O(n)
O(log n)
O(n^2)
Q27
Q27 Which operation on a dynamic array has a worse time complexity than usual when the array needs to expand its size?
Accessing an element by index
Appending an element at the end
Inserting an element at the beginning
Searching for an element
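Reference sketch for Q27 (not part of the quiz): a toy dynamic array. When the buffer is full, an append has to allocate a larger buffer and copy every existing element, so that particular append costs O(n) even though appends are O(1) amortized. The hand-rolled class is purely illustrative; Python's built-in list does this internally.
class DynamicArray:
    def __init__(self):
        self.capacity = 2
        self.size = 0
        self.data = [None] * self.capacity
    def append(self, value):
        if self.size == self.capacity:
            self.capacity *= 2                     # expansion: O(n) copy
            new_data = [None] * self.capacity
            for i in range(self.size):
                new_data[i] = self.data[i]
            self.data = new_data
        self.data[self.size] = value               # the ordinary O(1) step
        self.size += 1
arr = DynamicArray()
for v in range(5):
    arr.append(v)                                  # the 3rd and 5th appends trigger an expansion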
Q28
Q28 What is the output of the following Python code snippet?
arr = ['a', 'b', 'c', 'd']
print(arr[1:3])
['a', 'b']
['b', 'c']
['c', 'd']
['b', 'c', 'd']
Q29
Q29 Given an array of integers, which of the following operations will NOT mutate the original array in JavaScript?
arr.sort()
arr.push(5)
[...arr, 5]
arr.pop()
Q30
Q30 What is the result of concatenating two arrays in Python with the + operator, given arr1 = [1, 2, 3] and arr2 = [4, 5, 6]?
A new array [1, 2, 3, 4, 5, 6]
The original arrays are mutated to include the elements of the other
A syntax error
None of the above
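Reference sketch for Q30 (not part of the quiz): a quick check that + on Python lists builds a new list and leaves both operands untouched.
arr1 = [1, 2, 3]
arr2 = [4, 5, 6]
combined = arr1 + arr2
print(combined)                      # [1, 2, 3, 4, 5, 6]
print(arr1, arr2)                    # originals unchanged: [1, 2, 3] [4, 5, 6]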