Grade 11 – Computer Science – Data Structures and Algorithms (Advanced) – Multiple Choice Questions

Topic: Arrays and Linked Lists
Grade: 11

Question 1:
What is the time complexity of accessing an element in an array?
a) O(1)
b) O(log n)
c) O(n)
d) O(n^2)

Answer: a) O(1)
Explanation: Accessing an element in an array takes constant time, regardless of the size of the array. This is because array elements are stored contiguously in memory, so the address of any element can be computed directly from the array's base address, the element size, and the index. For example, in an array of integers, the element at index 5 can be read in constant time because its address is simply the base address plus 5 times the size of an integer.
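
A minimal Python sketch of this idea (the base address and element size are hypothetical values chosen for illustration):

    # Accessing an element by index is one arithmetic step plus one memory read.
    scores = [72, 85, 90, 64, 78, 88]
    print(scores[5])             # direct access in O(1) -> 88

    # Conceptually, for a C-style array of 4-byte integers:
    base_address = 0x1000        # hypothetical start of the array in memory
    element_size = 4             # assumed bytes per integer
    index = 5
    address = base_address + index * element_size
    print(hex(address))          # 0x1014 -> the element is fetched in one step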

Question 2:
Which data structure is best suited for implementing a stack?
a) Array
b) Linked List
c) Queue
d) Tree

Answer: b) Linked List
Explanation: A stack is a Last-In-First-Out (LIFO) data structure, and a linked list is well-suited for implementing this behavior. This is because a linked list provides efficient insertion and deletion at the beginning of the list, which corresponds to the push and pop operations of a stack. For example, to push an element onto a stack implemented with a linked list, we can simply add a new node at the head of the list.
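
A minimal linked-list stack sketch in Python (the class and method names are illustrative); push and pop both work at the head of the list, so each is O(1):

    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    class Stack:
        def __init__(self):
            self.head = None              # top of the stack

        def push(self, value):
            # The new node becomes the head: O(1)
            self.head = Node(value, self.head)

        def pop(self):
            if self.head is None:
                raise IndexError("pop from empty stack")
            value = self.head.value       # value at the top
            self.head = self.head.next    # unlink the head: O(1)
            return value

    s = Stack()
    s.push(1); s.push(2); s.push(3)
    print(s.pop(), s.pop())               # 3 2 (last in, first out)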

Question 3:
Which of the following data structures can be used to efficiently implement a dictionary?
a) Array
b) Linked List
c) Queue
d) Hash Table

Answer: d) Hash Table
Explanation: A hash table is a data structure that provides efficient insertion, deletion, and search operations. It uses a hash function to map keys to indices in an array, giving average-case constant-time access to values by key. This makes it well-suited for implementing a dictionary, where key-value pairs are stored and retrieved. For example, a dictionary of words and their definitions implemented as a hash table can look up a definition quickly from the word.
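
In Python, the built-in dict is itself a hash table, so the word-definition example can be sketched directly (the entries are illustrative):

    definitions = {}                           # a Python dict is a hash table
    definitions["array"] = "a contiguous block of elements accessed by index"
    definitions["stack"] = "a last-in-first-out collection"

    print(definitions["stack"])                # average-case O(1) lookup by key
    print("queue" in definitions)              # membership test, also O(1) on average -> False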

Question 4:
What is the time complexity of searching for an element in a sorted linked list?
a) O(1)
b) O(log n)
c) O(n)
d) O(n^2)

Answer: c) O(n)
Explanation: Searching for an element in a sorted linked list requires traversing the list from the beginning until the element is found, a larger value is reached, or the end of the list is passed. Even though the list is sorted, binary search is not practical because a linked list does not support constant-time access to its middle element, so the search is linear with a time complexity of O(n); in the worst case the target is at the end of the list or not present at all. For example, in a sorted linked list of names, finding a specific name may require visiting every node.
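
A sketch of this linear search in Python, reusing an illustrative Node class; the sorted order only lets the search stop early once it passes where the target would be:

    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    def search_sorted(head, target):
        current = head
        while current is not None:
            if current.value == target:
                return True
            if current.value > target:     # sorted order: target cannot appear later
                return False
            current = current.next         # no random access, so we move one node at a time
        return False

    # Build the sorted list 1 -> 3 -> 5 -> 7
    head = Node(1, Node(3, Node(5, Node(7))))
    print(search_sorted(head, 5), search_sorted(head, 4))   # True False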

Question 5:
Which data structure is best suited for implementing a queue?
a) Array
b) Linked List
c) Stack
d) Hash Table

Answer: a) Array
Explanation: A queue is a First-In-First-Out (FIFO) data structure, and an array works well for it when it is used as a circular buffer. Removing the front element of a plain array would mean shifting every remaining element, an O(n) operation, so a practical array-based queue keeps a head index and a tail index that wrap around the array; enqueue writes at the tail and dequeue reads at the head, both in O(1) time.
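
A minimal circular-buffer queue sketch in Python (the class name and fixed capacity are illustrative); the head index wraps around the array so nothing ever has to be shifted:

    class ArrayQueue:
        def __init__(self, capacity):
            self.data = [None] * capacity      # fixed-size backing array
            self.head = 0                      # index of the front element
            self.size = 0

        def enqueue(self, value):
            if self.size == len(self.data):
                raise OverflowError("queue is full")
            tail = (self.head + self.size) % len(self.data)
            self.data[tail] = value            # O(1): write at the tail
            self.size += 1

        def dequeue(self):
            if self.size == 0:
                raise IndexError("dequeue from empty queue")
            value = self.data[self.head]       # O(1): read at the head
            self.head = (self.head + 1) % len(self.data)
            self.size -= 1
            return value

    q = ArrayQueue(4)
    q.enqueue("a"); q.enqueue("b"); q.enqueue("c")
    print(q.dequeue(), q.dequeue())            # a b (first in, first out)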

Topic: Sorting Algorithms
Grade: 11

Question 6:
Which sorting algorithm repeatedly compares and swaps adjacent elements, giving it a worst-case time complexity of O(n^2)?
a) Bubble Sort
b) Insertion Sort
c) Merge Sort
d) Quick Sort

Answer: a) Bubble Sort
Explanation: Bubble Sort repeatedly compares adjacent elements and swaps them if they are in the wrong order, passing over the array until it is sorted. In the worst case, when the input array is in reverse order, it performs roughly n - 1 comparisons on each of n - 1 passes, about n(n - 1)/2 comparisons in total, which gives a time complexity of O(n^2). For example, for the array [5, 4, 3, 2, 1], Bubble Sort performs 10 comparisons and 10 swaps, as the sketch below confirms.
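
A short Bubble Sort sketch in Python that counts comparisons, so the figure for [5, 4, 3, 2, 1] can be checked directly:

    def bubble_sort(a):
        comparisons = 0
        n = len(a)
        for i in range(n - 1):                 # each pass bubbles one element into place
            for j in range(n - 1 - i):
                comparisons += 1
                if a[j] > a[j + 1]:            # adjacent elements out of order
                    a[j], a[j + 1] = a[j + 1], a[j]
        return comparisons

    data = [5, 4, 3, 2, 1]
    print(bubble_sort(data), data)             # 10 [1, 2, 3, 4, 5]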

Question 7:
Which sorting algorithm uses the divide-and-conquer strategy of splitting the array into halves, sorting them recursively, and merging the sorted halves?
a) Bubble Sort
b) Insertion Sort
c) Merge Sort
d) Quick Sort

Answer: c) Merge Sort
Explanation: Merge Sort divides the input array into two halves, recursively sorts each half, and then merges the sorted halves to obtain the final sorted array. This divide-and-conquer strategy results in a time complexity of O(n log n) in all cases. For example, if we have an array [4, 2, 8, 6, 1], Merge Sort would split it into [4, 2, 8] and [6, 1], recursively sort each half, and then merge them to obtain the sorted array [1, 2, 4, 6, 8].
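
A Merge Sort sketch in Python; the split point here is chosen so that [4, 2, 8, 6, 1] divides into [4, 2, 8] and [6, 1], matching the example above:

    def merge(left, right):
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:            # <= keeps equal keys in original order
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])                # copy whatever remains
        merged.extend(right[j:])
        return merged

    def merge_sort(a):
        if len(a) <= 1:
            return a
        mid = (len(a) + 1) // 2                # split into two halves
        return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

    print(merge_sort([4, 2, 8, 6, 1]))         # [1, 2, 4, 6, 8]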

Question 8:
Which of these sorting algorithms is usually the fastest in practice on average, even though its worst-case time complexity is O(n^2)?
a) Bubble Sort
b) Insertion Sort
c) Merge Sort
d) Quick Sort

Answer: d) Quick Sort
Explanation: Quick Sort is a divide-and-conquer sorting algorithm that selects a pivot element and partitions the array around it, so that elements smaller than the pivot come before it and elements greater than the pivot come after it; the sub-arrays are then sorted recursively. On average, Quick Sort runs in O(n log n) time. Merge Sort shares that average-case bound, but Quick Sort is usually faster in practice because it partitions in place and has small constant factors, although a poor choice of pivots can degrade it to O(n^2). For example, given the array [5, 2, 8, 1, 4], Quick Sort might choose 4 as the pivot, partition the array into [2, 1] (smaller than the pivot) and [5, 8] (greater than the pivot), and recursively sort each sub-array.
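
A Quick Sort sketch in Python that uses the last element as the pivot, matching the [5, 2, 8, 1, 4] example (production implementations usually partition in place and choose the pivot more carefully):

    def quick_sort(a):
        if len(a) <= 1:
            return a
        pivot = a[-1]                                   # 4 for [5, 2, 8, 1, 4]
        smaller = [x for x in a[:-1] if x < pivot]      # [2, 1]
        larger = [x for x in a[:-1] if x >= pivot]      # [5, 8]
        return quick_sort(smaller) + [pivot] + quick_sort(larger)

    print(quick_sort([5, 2, 8, 1, 4]))                  # [1, 2, 4, 5, 8]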

Question 9:
Which sorting algorithm is not suitable for large input sizes due to its space complexity?
a) Bubble Sort
b) Insertion Sort
c) Merge Sort
d) Quick Sort

Answer: c) Merge Sort
Explanation: Merge Sort needs an auxiliary buffer to hold elements while the sorted sub-arrays are merged, giving it a space complexity of O(n). This extra memory can be a concern for large inputs. In contrast, Bubble Sort and Insertion Sort sort in place using O(1) extra space, and Quick Sort needs only the recursion stack, which is O(log n) on average. For example, sorting an array with millions of elements using Merge Sort requires allocating a temporary buffer roughly as large as the array itself.

Question 10:
Which of these sorting algorithms is both stable and runs in O(n log n) time?
a) Bubble Sort
b) Insertion Sort
c) Merge Sort
d) Quick Sort

Answer: c) Merge Sort
Explanation: A sorting algorithm is stable if it keeps elements with equal keys in their original relative order. Merge Sort is stable because, when two sorted sub-arrays are merged, ties are broken by taking the element from the left sub-array first, so equal keys stay in their original order. Bubble Sort and Insertion Sort are also stable, but they run in O(n^2) time, and typical Quick Sort implementations are not stable. Stability matters when sorting records by more than one attribute: for example, if a list of student records is first sorted by name and then stably sorted by age, students with the same age remain in alphabetical order, as shown below.
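
Python's built-in sort is also stable, so the idea can be demonstrated directly with illustrative records (the same behaviour holds for the Merge Sort sketched above):

    people = [("Dana", 15), ("Alice", 16), ("Bob", 15), ("Cara", 16)]

    by_name = sorted(people, key=lambda p: p[0])          # sort by name first
    by_age = sorted(by_name, key=lambda p: p[1])          # then stably sort by age

    # Within each age group the names stay in alphabetical order,
    # because the stable sort preserves the earlier ordering of equal keys.
    print(by_age)
    # [('Bob', 15), ('Dana', 15), ('Alice', 16), ('Cara', 16)]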
