Sorting
Question 1 
An array of 25 distinct elements is to be sorted using quicksort. Assume that the pivot element is chosen uniformly at random. The probability that the pivot element gets placed in the worst possible location in the first round of partitioning (rounded off to 2 decimal places) is _____.
0.08  
0.01  
1  
8 
Step1: The pivot element is chosen uniformly at random from the 25 distinct elements.
Step2: The worst possible location for the pivot after partitioning is either the first or the last position (i.e., the pivot is the smallest or the largest element).
Step3: So there are 2 favourable cases among 25 distinct elements
= 2/25
= 0.08
Question 2 
There are n unsorted arrays: A_{1}, A_{2}, ..., A_{n}. Assume that n is odd. Each of A_{1}, A_{2}, ..., A_{n} contains n distinct elements. There are no common elements between any two arrays. The worst-case time complexity of computing the median of the medians of A_{1}, A_{2}, ..., A_{n} is
O(n)  
O(n log n)  
Ω(n^{2} log n)  
O(n^{2}) 
Finding the median of one unsorted array can be done in O(n) worst-case time using the deterministic selection (median-of-medians) algorithm. It is similar in spirit to quicksort, but unlike quicksort it does not pay for repeatedly unbalanced partitioning.
→ For a range of indices i to j, the median position is (i+j)/2:
1. If the number of elements is odd, the value is Ceil((i+j)/2)
2. If the number of elements is even, the value is Floor((i+j)/2)
→ Here, the total number of arrays is n, each of size n, so computing all n medians takes
⇒ O(n) × O(n)
⇒ O(n^{2})
and the final median of the n medians adds only O(n) more.
Note:
They clearly state that all elements are distinct.
There are no common elements between any two arrays.
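Breaking down the cost (assuming the deterministic linear-time selection algorithm is used for each median):
Median of one unsorted array of n elements = O(n)
Medians of all n arrays = n × O(n) = O(n^{2})
Median of the n medians = O(n)
Total = O(n^{2}) + O(n) = O(n^{2})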
Question 3 
The worst case running times of Insertion sort, Merge sort and Quick sort, respectively, are:
Θ(nlogn), Θ(nlogn), and Θ(n^{2})  
Θ(n^{2} ), Θ(n^{2} ), and Θ(nlogn)  
Θ(n^{2}), Θ(nlogn), and Θ(nlogn)  
Θ(n^{2}), Θ(nlogn), and Θ(n^{2}) 
Question 4 
If the input is already sorted in ascending order, which of the following statements are TRUE?
I. Quicksort runs in Θ(n^{2}) time
II. Bubblesort runs in Θ(n^{2}) time
III. Mergesort runs in Θ(n) time
IV. Insertion sort runs in Θ(n) time
I and II only  
I and III only  
II and IV only  
I and IV only 
→ The recurrence relation for quicksort, if the elements are already sorted, is
T(n) = T(n−1) + O(n); by the substitution method it takes O(n^{2}).
→ The recurrence relation for merge sort, if the elements are already sorted, is
T(n) = 2T(n/2) + O(n); by the substitution method it takes O(nlogn).
We can also use the Master theorem [a=2, b=2, k=1, p=0] for the above recurrence.
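For completeness, unrolling the quicksort recurrence by substitution:
T(n) = T(n−1) + cn
= T(n−2) + c(n−1) + cn
= ...
= c(1 + 2 + ... + n)
= cn(n+1)/2
= O(n^{2})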
Question 5 
You have an array of n elements. Suppose you implement quicksort by always choosing the central element of the array as the pivot. Then the tightest upper bound for the worst case performance is
O(n^{2})  
O(n log n)  
Θ(n logn)  
O(n^{3}) 
Question 6 
Which one of the following is the tightest upper bound that represents the number of swaps required to sort n numbers using selection sort?
O(log n)  
O(n)  
O(n log n)  
O(n^{2}) 
Selection sort has time complexity O(n^{2}) in terms of the number of comparisons. Each scan requires at most one swap, so n−1 swaps suffice for n elements (the final element is already in place).
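A minimal C sketch of this bound (the array contents and the swap counter are illustrative, not part of the question): each pass makes at most one swap, so swaps never exceed n−1 even though comparisons are O(n^{2}).

#include <stdio.h>

/* Selection sort with a swap counter: one swap per pass at most,
   so at most n-1 swaps in total, while comparisons remain O(n^2). */
void selection_sort(int a[], int n, int *swaps) {
    for (int i = 0; i < n - 1; i++) {
        int min = i;
        for (int j = i + 1; j < n; j++)
            if (a[j] < a[min]) min = j;
        if (min != i) {
            int t = a[i]; a[i] = a[min]; a[min] = t;
            (*swaps)++;
        }
    }
}

int main(void) {
    int a[] = {31, 8, 22, 5, 13};
    int swaps = 0;
    selection_sort(a, 5, &swaps);
    printf("swaps = %d\n", swaps);  /* never more than n-1 = 4 */
    return 0;
}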
Question 7 
Which one of the following is the tightest upper bound that represents the time complexity of inserting an object into a binary search tree of n nodes?
O(1)  
O(log n)  
O(n)  
O(n log n) 
Question 8 
What is the time complexity of the Bellman-Ford single-source shortest path algorithm on a complete graph of n vertices?
Θ(n^{2})  
Θ(n^{2} log n)  
Θ(n^{3})  
Θ(n^{3} log n) 
Bellman–Ford runs in O(V * E) time, where V and E are the number of vertices and edges respectively.
Note: For a complete graph: E = n(n−1)/2 = O(n^{2})
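Substituting: T(n) = O(V × E) = O(n × n^{2}) = Θ(n^{3}). The bound is tight because Bellman-Ford relaxes every edge in each of its n−1 passes.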
Question 9 
The number of elements that can be sorted in Θ(log n) time using heap sort is
Θ(1)  
Θ(√logn)  
Θ (logn/loglogn)  
Θ(log n) 
Heap sort on k elements takes Θ(k logk) time, so we need the largest k with k logk = Θ(logn); take k = Θ(logn/log logn).
So, Θ((logn/log logn) log(logn/log logn))
= Θ((logn/log logn) (log logn − log log logn))
= Θ((logn/log logn) × log logn)
= Θ(logn)
Hence, option (C) is the correct answer.
Question 10 
What is the number of swaps required to sort n elements using selection sort, in the worst case?
θ(n)  
θ(n log n)  
θ(n^{2})  
θ(n^{2} logn) 
Question 11 
In quick sort, for sorting n elements, the (n/4)^{th} smallest element is selected as pivot using an O(n) time algorithm. What is the worst case time complexity of the quick sort?
θ(n)  
θ(n log n)  
θ(n^{2})  
θ(n^{2} log n) 
The larger part shrinks by a factor of 3/4 at each level: n → (3/4)n → (3/4)^{2}n → ... → n/(4/3)^{k} = 1
n/(4/3)^{k} = 1
⇒ n = (4/3)^{k} ⇒ k = log_{4/3}n [k = number of levels]
At each level the work done = Cn
So, total work done = Cn⋅log_{4/3}n = θ(n log_{4/3}n) = θ(n logn)
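Equivalently, with the (n/4)-th smallest element as pivot, the recurrence is
T(n) = T(n/4) + T(3n/4) + cn
and the chain above follows the deeper 3n/4 branch, giving θ(n logn).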
Question 12 
Consider the Quicksort algorithm. Suppose there is a procedure for finding a pivot element which splits the list into two sublists each of which contains at least onefifth of the elements. Let T(n) be the number of comparisons required to sort n elements. Then
T(n) ≤ 2T(n/5) + n  
T(n) ≤ T(n/5) + T(4n/5) + n  
T(n) ≤ 2T(4n/5) + n  
T(n) ≤ 2T(n/2) + n 
So, T(n/5) comparisons are needed for the first sublist and T(4n/5) comparisons for the second sublist.
Now, suppose one sublist contains more than n/5 elements; then the other sublist contains fewer than 4n/5 elements, and the time taken will be less than
T(n/5) + T(4n/5) + n
because the recursion tree will be more balanced.
Question 13 
Which one of the following in-place sorting algorithms needs the minimum number of swaps?
Quick sort  
Insertion sort  
Selection sort  
Heap sort 
Question 14 
In a permutation a_{1}...a_{n} of n distinct integers, an inversion is a pair (a_{i}, a_{j}) such that i < j and a_{i} > a_{j}.
If all permutations are equally likely, what is the expected number of inversions in a randomly chosen permutation of 1...n ?
n(n−1)/2  
n(n−1)/4  
n(n+1)/4  
2n[log_{2}n] 
Question 15 
In a permutation a_{1}...a_{n} of n distinct integers, an inversion is a pair (a_{i}, a_{j}) such that i < j and a_{i} > a_{j}.
What would be the worst case time complexity of the Insertion Sort algorithm, if the inputs are restricted to permutations of 1...n with at most n inversions?
Θ(n^{2})  
Θ(n log n)  
Θ(n^{1.5})  
Θ(n) 
Question 16 
Randomized quicksort is an extension of quicksort where the pivot is chosen randomly. What is the worst case complexity of sorting n numbers using randomized quicksort?
O(n)  
O(n log n)  
O(n^{2})  
O(n!) 
Question 17 
Let s be a sorted array of n integers. Let t(n) denote the time taken for the most efficient algorithm to determine if there are two elements with sum less than 1000 in s. Which of the following statements is true?
t(n) is O(1)  
n ≤ t(n) ≤ n log_{2} n  
n log_{2} n ≤ t(n) < (n/2)  
t(n) = (n/2) 
Question 18 
A sorting technique is called stable if
it takes O (nlog n) time  
it maintains the relative order of occurrence of non-distinct elements  
it uses divide and conquer paradigm  
it takes O(n) space 
Question 19 
If one uses the straight two-way merge sort algorithm to sort the following elements in ascending order:
20, 47, 15, 8, 9, 4, 40, 30, 12, 17
then the order of these elements after second pass of the algorithm is:
8, 9, 15, 20, 47, 4, 12, 17, 30, 40  
8, 15, 20, 47, 4, 9, 30, 40, 12, 17  
15, 20, 47, 4, 8, 9, 12, 30, 40, 17  
4, 8, 9, 15, 20, 47, 12, 17, 30, 40 
Question 20 
A two dimensional array A[1...n][1...n] of integers is partially sorted if
∀i, j ∈ [1...n−1], A[i][j] < A[i][j+1] and A[i][j] < A[i+1][j]
Fill in the blanks:
(a) The smallest item in the array is at A[i][j] where i=............and j=..............
(b) The smallest item is deleted. Complete the following O(n) procedure to insert item x (which is guaranteed to be smaller than any item in the last row or column) still keeping A partially sorted.
procedure insert (x: integer);
var i, j: integer;
begin
(1) i := 1; j := 1; A[i][j] := x;
(2) while (x > ...... or x > ......) do
(3)   if A[i+1][j] < A[i][j] ......... then begin
(4)     A[i][j] := A[i+1][j]; i := i+1;
(5)   end
(6)   else begin
(7)     ............
(8)   end
(9) A[i][j] := .............
end
Theory Explanation. 
Question 21 
Merge sort uses
Divide and conquer strategy  
Backtracking approach  
Heuristic search  
Greedy approach 
Question 22 
Consider the following sequence of numbers
92, 37, 52, 12, 11, 25
Use bubble sort to arrange the sequence in ascending order. Give the sequence at the end of each of the first five passes.
Theory Explanation. 
Question 23 
Consider the recursive algorithm given below:
procedure bubblesort (n);
var i, j: index; temp: item;
begin
  for i := 1 to n-1 do
    if A[i] > A[i+1] then begin
      temp := A[i]; A[i] := A[i+1]; A[i+1] := temp
    end;
  bubblesort (n-1)
end
Let a_{n} be the number of times the 'if…then…' statement gets executed when the algorithm is run with value n. Set up the recurrence relation by defining a_{n} in terms of a_{n−1}. Solve for a_{n}.
Theory Explanation. 
Question 24 
Following algorithm(s) can be used to sort n integers in the range [1...n^{3}] in O(n) time
Heapsort  
Quicksort  
Mergesort  
Radixsort 
Radix sort is not a comparison-based sort (it distributes keys into buckets, using counting sort on each digit), so it can run in linear time: writing the integers in base n, values in [1...n^{3}] have at most 4 digits, so the time is O(4n) = O(n).
Question 25 
The minimum number of comparisons required to sort 5 elements is _____
7 
= ⌈log_{2}(n!)⌉
= ⌈log_{2}(5!)⌉
= ⌈log_{2}(120)⌉
= 7
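Numerically: 2^{6} = 64 < 120 ≤ 128 = 2^{7}, so ⌈log_{2}(120)⌉ = 7.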
Question 26 
Quicksort is ________ efficient than heapsort in the worst case.
LESS. 
Question 27 
Let P be a quicksort program to sort numbers in ascending order. Let t_{1} and t_{2} be the time taken by the program for the inputs [1 2 3 4 5] and [5 4 3 2 1] respectively. Which of the following holds?
t_{1} = t_{2}  
t_{1} > t_{2}  
t_{1} < t_{2}  
t_{1} = t_{2} + 5 log 5 
Question 28 
If we use Radix Sort to sort n integers in the range [n^{k/2}, n^{k}], for some k>0 which is independent of n, the time taken would be?
Θ(n)  
Θ(kn)  
Θ(nlogn)  
Θ(n^{2}) 
where n = number of keys, w = word size (number of digits per key)
w = log_{2}(n^{k}) = k × log_{2}(n)
Complexity Θ(wn) = Θ(k × log_{2}(n) × n) = Θ(n logn), since k is a constant independent of n
Question 29 
Let a and b be two sorted arrays containing n integers each, in non-decreasing order. Let c be a sorted array containing 2n integers obtained by merging the two arrays a and b. Assuming the arrays are indexed starting from 0, consider the following four statements:
1. a[i] ≥ b[i] => c[2i] ≥ a[i]
2. a[i] ≥ b[i] => c[2i] ≥ b[i]
3. a[i] ≥ b[i] => c[2i] ≤ a[i]
4. a[i] ≥ b[i] => c[2i] ≤ b[i]
Which of the following is TRUE?
only I and II  
only I and IV  
only II and III  
only III and IV 
Since both 'a' and 'b' are sorted in the beginning, there are i elements smaller than or equal to a[i] in a, and similarly i elements smaller than or equal to b[i] in b. So, a[i] ≥ b[i] means there are 2i elements smaller than or equal to a[i], and hence in the merged array a[i] will come after these 2i elements. So, c[2i] ≤ a[i].
Similarly, a[i] ≥ b[i] says for b that, there are not more than 2i elements smaller than b[i] in the sorted array. So, b[i] ≤ c[2i].
So, option (C) is correct.
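To see it concretely, here is a small C check on a made-up pair of sorted arrays; it verifies c[2i] ≤ a[i] and c[2i] ≥ b[i] wherever the premise a[i] ≥ b[i] holds.

#include <stdio.h>

int main(void) {
    /* illustrative sorted inputs; c is their merge */
    int a[] = {2, 5, 8, 11};
    int b[] = {1, 4, 9, 10};
    int c[8], i = 0, j = 0, k = 0;
    while (i < 4 && j < 4) c[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
    while (i < 4) c[k++] = a[i++];
    while (j < 4) c[k++] = b[j++];
    for (int x = 0; x < 4; x++)
        if (a[x] >= b[x])   /* premise of the statements */
            printf("i=%d: c[2i]=%d <= a[i]=%d, c[2i]=%d >= b[i]=%d\n",
                   x, c[2*x], a[x], c[2*x], b[x]);
    return 0;
}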
Question 30 
m*n  
maximum of m and n  
minimum of m and n  
m+n–1 
For example: take two sorted subarrays of size 3, such as 1,3,5 and 2,4,6. Merging them takes 5 comparisons, which is m+n−1 and can be taken as O(m+n).
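A short C sketch (using the example arrays above) that counts comparisons during the merge; the function names are mine, for illustration only.

#include <stdio.h>

/* Merge two sorted arrays while counting element comparisons.
   In the worst case (the runs interleave) this is m + n - 1. */
int merge_count(const int a[], int m, const int b[], int n, int out[]) {
    int i = 0, j = 0, k = 0, cmp = 0;
    while (i < m && j < n) {
        cmp++;
        out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
    }
    while (i < m) out[k++] = a[i++];
    while (j < n) out[k++] = b[j++];
    return cmp;
}

int main(void) {
    int a[] = {1, 3, 5}, b[] = {2, 4, 6}, out[6];
    printf("comparisons = %d\n", merge_count(a, 3, b, 3, out)); /* 5 = m+n-1 */
    return 0;
}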
Question 31 
Merge Sort  
Insertion Sort  
Selection Sort  
Quick Sort 
Question 32 
Greedy method  
Divideandconquer  
Dynamic Programming  
Backtracking 
1) The subarray which is already sorted.
2) Remaining subarray which is unsorted.
In every iteration of selection sort, the minimum element (considering ascending order) from the unsorted subarray is picked and moved to the sorted subarray. Clearly, it is a greedy approach to sort the array.
Question 33 
O(n^{2}), O(n^{2})  
O(n^{2}), O(nlog_{2}n)  
O(n log_{2}n), O(n^{2})  
O(n log_{2}n), O(n log_{2}n) 
Question 34 
Insertion Sort  
Quick Sort  
Heap Sort  
Selection Sort 
Question 35 
11  
12  
13  
10 
Step2: 8,7,9,22,31,5,13
Step3: 8,7,9,22,5,31,13
Step4: 8,7,9,22,5,13,31
Step5: 7,8,9,22,5,13,31
Step6: 7,8,9,5,22,13,31
Step7: 7,8,9,5,13,22,31
Step8: 7,8,5,9,13,22,31
Step9: 7,5,8,9,13,22,31
Step10: 5,7,8,9,13,22,31
Note: A total of 10 swaps are required to sort the array.
Question 36 
1  
5  
10  
20 
The 1st pass will compare 4 numbers with the 5th
The 2nd pass will compare 3 numbers with the 4th
The 3rd pass will compare 2 numbers with the 3rd
The 4th pass will compare 1 number with the 2nd
So, the total number of comparisons is 4 + 3 + 2 + 1 = 10
It can be viewed as the sum of the sequence of the first (n−1) numbers starting from 1:
S = ((1 + (n−1)) × (n−1))/2
S = 10
Question 37 
Dynamic programming  
Backtracking  
Divide and conquer  
Greedy method 
1. Quick sort
2. Merge sort
3. Heap sort
Note: Binary search also follows divide and conquer.
Question 38 
Quick sort  
Insertion sort  
Selection sort  
Heap sort 
The algorithm finds the minimum value, swaps it with the value in the first position, and repeats these steps for the remainder of the list. It does no more than n−1 swaps, and thus is useful where swapping is very expensive.
Question 39 
Greedy method  
Backtracking  
Dynamic programming  
Divide and Conquer 
Conceptually, a merge sort works as follows:
1. Divide the unsorted list into n sublists, each containing one element (a list of one element is considered sorted).
2. Repeatedly merge sublists to produce new sorted sublists until there is only one sublist remaining. This will be the sorted list.
Question 40 
Insertion sort, Quick sort  
Quick sort, Quick sort  
Quick sort, Insertion sort  
Insertion sort, Insertion sort 
Question 41 
 Quicksort
 Heapsort
 Mergesort
1 and 2 only  
2 and 3 only  
3 only  
1, 2 and 3 
Question 42 
45  
72  
360  
450 
Each octal number has → 5 digits
The octal number system base (radix) value = 8
The maximum number of comparisons = (number of items) × (number of digits) × (radix)
= 9 × 5 × 8
= 360
Question 43 
True,True  
False, False  
True,False  
False,False 
Question 44 
Quick sort  
Merge sort  
Shell sort  
Bubble sort 
Question 45 
O(nlogn),O(nlogn) and O(n^{2})  
O(n^{2}),O(n^{2}) and O(nlogn)  
O(n^{2}),O(nlogn) and O(nlogn)  
O(n^{2}),O(nlogn) and O(n^{2}) 
merge sort → O(nlogn)
quick sort → O(n^{2})
Question 46 
Insertion sort  
merge sort  
Quick sort  
bubble sort 
Question 47 
Place reference to them in and array an sort the array  
place them in a linked list and sort the linked list  
place pointers to them in an array and sort the array  
place them in an array and sort the array 
Option B is not correct because a linked list adds extra overhead and makes searching very difficult.
Option D is suitable only for a small number of objects/elements.
Question 48 
If it takes O(n log n) time  
It uses divide and conquer technique  
Relative order of occurrence of nondistinct elements is maintained  
It takes O(n) space 
→ That is, a sorting algorithm is stable if whenever there are two records R and S with the same key and with R appearing before S in the original list, R will appear before S in the sorted list.
Question 49 
A. 1, 2, 3……n
B. n, n−1, n−2, ......, 1
Let C1 and C2 be the number of comparisons made for A and B respectively. Then
C1>C2  
C1=C2  
C1 < C2  
Cannot say anything for arbitrary n 
→ If the elements are already in sorted order, quicksort exhibits its worst case complexity, O(n^{2}).
→ If all elements are equal, it likewise exhibits its worst case complexity.
Question 50 
No. of inputs  
Arrangement of elements in an array  
Size of elements  
Pivot element 
Question 51 
50.2 sec  
6.7 sec  
72.7 sec  
11.2 sec 
The worst case time complexity is O(n^{2}).
The average and best case time complexity is O(n logn).
Step1: Time taken to sort 1000 names by quicksort:
100 = c × n logn
= c × 1000 × log 1000
⇒ c = 1/(10 log 1000)
Step2: Time taken to sort 100 names by quicksort:
= c × n logn
= (1/(10 log 1000)) × 100 × log 100
≈ 6.7 sec
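Evaluating with base-10 logarithms: log 1000 = 3 and log 100 = 2, so c = 100/(1000 × 3) = 1/30, and the time for 100 names is (1/30) × 100 × 2 = 200/30 ≈ 6.7 sec.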
Question 52 
20, 47, 15, 8, 9, 4, 40, 30, 12, 17
then the order of these elements after second pass of the algorithm is:
8, 9, 15, 20, 47, 4, 12, 17, 30, 40  
8, 15, 20, 47, 4, 9, 30, 40, 12, 17  
15, 20, 47, 4, 8, 9, 12, 30, 40, 17  
4, 8, 9, 15, 20, 47, 12, 17, 30, 40 
Question 53 
Which of the following data structures is most suitable for radix sort?
Stack  
Binary search tree  
Tree  
Linked list 
→ This concept of a circularly linked trie structure is similar to the concept of a threaded binary tree. This structure will be called a circularly threaded trie.
Question 54 
Bubble sort and Insertion sort  
Bubble sort and Quicksort  
Bubble sort and merge sort  
Quicksort and merge sort 
The quick sort and merge sort algorithms are based on the divide and conquer paradigm and work in quite similar ways.
In both algorithms, the larger problem is divided into smaller ones until the pieces are trivially solvable; the smaller solutions are then combined into the final solution.
Question 55 
Consider an array with n elements. Which of the following options gives the maximum number of swap(a_{j}, a_{j+1}) operations to sort the array elements using efficient bubble sort algorithm?
n^{2}/2  
n(n−1)/2  
n^{2}  
n(n+1)/2 
→ This means that in the first pass it has to look at n elements; after that it looks at n − 1 elements (since the biggest integer is at the end), and so on, until 1 comparison occurs.
→ For n numbers, the total number of swaps done in the worst case will be (n−1) + ... + 2 + 1
= (n × (n−1))/2
= O(n^{2})
Question 56 
Linear search  
Hash search  
Binary search  
All of these 
→ Binary search requires the elements to be in sorted order; at every step we search half of the remaining elements, based on the mid value.
Question 57 
A process of rearranging a given set of objects in a specific order  
To facilitate the later search for members of the sorted set  
Is a relevant and essential activity, particularly in data processing  
All of these 
● ordering: arranging items in a sequence ordered by some criterion;
● categorizing: grouping items with similar properties.
The most common uses of sorted sequences are:
● making lookup or search efficient;
● making merging of sequences efficient.
● enabling processing of data in a defined order.
Question 58 
Bubble  
Shake  
Tree  
Insertion 
Question 59 
Insertion sort  
Heap sort  
Merge sort  
Bubble sort 
1. Mergesort
2. Quicksort.
Question 60 
floor ((i+1)/2)  
ceiling ((i+1)/2)  
floor (i/2)  
ceiling (i/2) 
Left child is at index: 2i
Right child is at index: 2i + 1
Question 61 
O(n log n)  
O(n ^{2} log n)  
O(n ^{2} + log n)  
O(n ^{3} ) 
1. Merge sort on n strings has log n levels (bottom-up).
2. The length of each string is n, so each comparison takes O(n) time.
3. At each level, O(n/2) = O(n) string comparisons are performed.
4. So the comparison work per level is O(n) × O(n) = O(n^{2}).
5. Copying from level to level also takes O(n^{2}) time.
So, per level = O(n^{2}) + O(n^{2}) = O(n^{2})
Over all log n levels: (logn) × O(n^{2}) = O(n^{2} logn)
Question 62 
Stage (i) 12,3,40,2
Stage (ii) 12,3,2,40
Stage (iii) 3,12,2,40
Stage (iv) 3,2,12,40
Stage (v) 2,3,12,40
The array has been sorted using
Insertion sort  
Selection sort  
Quick sort  
Bubble sort 
→ The above stages are the swap steps of bubble sort passes.
→ In bubble sort, after each pass completes, the largest remaining element reaches the last position.
Question 63 
O(log_{2} n)  
O(n log_{2} n)  
O(1)  
O(n) 
1. Stable: Equal keys aren't reordered.
2. Operates in place, requiring O(1) extra space.
3. Worst case O(nlog(n)) key comparisons.
4. Worst case O(n) swaps.
5. Adaptive: Speeds up to O(n) when data is nearly sorted or when there are few unique keys.
Question 64 
Selection sort  
Bubble sort  
Radix sort  
Insertion sort 
Phase 1:
1. We use n bins, one for each of the integers 0, 1, ..., n−1. We place each integer i of the list to be sorted into the bin numbered (i mod n); each bin will then contain a list of integers leaving the same remainder when divided by n.
2. At the end, we concatenate the bins in order to obtain a list L.
Phase 2:
1. The integers on the list L are redistributed into bins, but using the bin selection function ⌊i/n⌋.
2. Now append integers to the ends of the lists. Finally, concatenate the lists to get the final sorted sequence.
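A compact C sketch of this two-phase scheme, assuming values in [0, n^{2}) and using a stable counting-sort pass for each phase (the sample data and helper names are illustrative):

#include <stdio.h>

#define N 5  /* n values, each in the range [0, N*N) */

/* One stable counting-sort pass on the key (x / key_div) % N. */
static void pass(const int in[], int out[], int key_div) {
    int count[N] = {0};
    for (int i = 0; i < N; i++) count[(in[i] / key_div) % N]++;
    for (int k = 1; k < N; k++) count[k] += count[k - 1];
    for (int i = N - 1; i >= 0; i--)          /* walking backwards keeps it stable */
        out[--count[(in[i] / key_div) % N]] = in[i];
}

int main(void) {
    int a[N] = {17, 3, 24, 8, 12}, tmp[N];
    pass(a, tmp, 1);  /* Phase 1: bin by (i mod n)  */
    pass(tmp, a, N);  /* Phase 2: bin by floor(i/n) */
    for (int i = 0; i < N; i++) printf("%d ", a[i]);  /* 3 8 12 17 24 */
    printf("\n");
    return 0;
}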
Question 65 
O(1)  
O(Log n)  
O(n)  
O(n ^{2} ) 
→ The heapify function does not take O(logn) space; it takes O(logn) time in the worst case. Here we don't implement heapify through recursion: it can be done iteratively, so no auxiliary space is required in this sorting. We use only the input array.
So, we do heap sort in O(1) space complexity.
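A minimal iterative sift-down in C (function and variable names are mine, for illustration), showing why heapify needs no recursion stack:

#include <stdio.h>

/* Iterative max-heap sift-down: O(log n) time, O(1) extra space, no recursion. */
void sift_down(int a[], int n, int i) {
    for (;;) {
        int largest = i, l = 2*i + 1, r = 2*i + 2;  /* 0-based children */
        if (l < n && a[l] > a[largest]) largest = l;
        if (r < n && a[r] > a[largest]) largest = r;
        if (largest == i) break;        /* heap property restored */
        int t = a[i]; a[i] = a[largest]; a[largest] = t;
        i = largest;                    /* continue down iteratively */
    }
}

int main(void) {
    int a[] = {3, 9, 7, 1, 4};
    for (int i = 5/2 - 1; i >= 0; i--) sift_down(a, 5, i);  /* build max-heap */
    printf("root = %d\n", a[0]);        /* 9 */
    return 0;
}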
Question 66 
Bubble sort  
Selection sort  
Quick sort  
Insertion sort 
So, we can eliminate Option B and Option C because they give their worst case complexity, O(n^{2}), when the array is already (almost) sorted.
Bubble sort and insertion sort both give the best case complexity, O(n), on sorted input, but the question adds the small constraint that "there are a few random elements". So, insertion sort is the more appropriate answer.
Question 67 
The array elements form a heap.  
Elements in each half of the array are sorted amongst themselves.  
Elements in the first half of the array are less than or equal to elements in second half of the array.  
All of the above 
→Divide the unsorted list into n sublists, each containing one element (a list of one element is considered sorted).
→Repeatedly merge sublists to produce new sorted sublists until there is only one sublist remaining. This will be the sorted list.
As per merge sort, the elements in each half of the array are sorted amongst themselves.
Question 68 
0.5 β lg n  
0.5 (1 – β) lg n  
– (lg n)/(lg β)  
– (lg n)/lg (1 – β) 
Question 69 
45  
72  
360  
450 
Each octal number has → 5 digits
The octal number system base (radix) value = 8
The maximum number of comparisons = (number of items) × (number of digits) × (radix)
= 9 × 5 × 8
= 360
Question 70 
O(log n)  
O(n log n)  
O(n)  
None of the above 
Question 71 
O(nm log m)  
O(mn log n)  
O(m + n)  
O(mn) 
Another strategy to achieve complexity O(mn log n) is to build a min-heap of size n, consisting of the first element of each of the n lists. We then extract the minimum element from the heap to the output and replace it in the heap with the next element from the list from which the minimum came. Thus, it takes time O(log n) to generate each element in the output and there are O(mn) output elements in all, so the overall time is O(mn log n).
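A C sketch of this heap strategy, with a hand-rolled sift-down standing in for a heap library (the list contents and sizes are illustrative):

#include <stdio.h>

#define NLISTS 3  /* n sorted lists */
#define LEN 4     /* m elements per list */

typedef struct { int val, list, idx; } Node;

/* Sift the root down to restore the min-heap of current list heads. */
static void sift(Node h[], int size, int i) {
    for (;;) {
        int s = i, l = 2*i + 1, r = 2*i + 2;
        if (l < size && h[l].val < h[s].val) s = l;
        if (r < size && h[r].val < h[s].val) s = r;
        if (s == i) break;
        Node t = h[i]; h[i] = h[s]; h[s] = t;
        i = s;
    }
}

int main(void) {
    int lists[NLISTS][LEN] = {{1,5,9,13},{2,6,10,14},{3,7,11,15}};
    Node heap[NLISTS];
    int size = NLISTS;
    for (int k = 0; k < NLISTS; k++)  /* heap holds the head of each list */
        heap[k] = (Node){lists[k][0], k, 0};
    for (int i = size/2 - 1; i >= 0; i--) sift(heap, size, i);
    while (size > 0) {                /* mn extractions, O(log n) work each */
        Node min = heap[0];
        printf("%d ", min.val);
        if (min.idx + 1 < LEN)        /* replace root with the list's next element */
            heap[0] = (Node){lists[min.list][min.idx + 1], min.list, min.idx + 1};
        else
            heap[0] = heap[--size];   /* list exhausted: shrink the heap */
        sift(heap, size, 0);
    }
    printf("\n");
    return 0;
}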
On the other hand, if we can apply min() to n elements at a time, we can merge all n lists in parallel. Let the n lists be l_{1}, l_{2}, ..., l_{n}. We maintain indices i_{1}, i_{2}, ..., i_{n} into these n lists, initially set to 1. At each stage we add j = min(l_{1}[i_{1}], l_{2}[i_{2}], ..., l_{n}[i_{n}]) to the sorted output list and increment the corresponding index. Overall, we add mn elements to the sorted output and each element is generated in constant time, so the time taken is O(mn).
Question 72 
Descriptive Explanation 
First of all, the question is interesting only if the lists have duplicate elements. Otherwise, we just merge the two lists as usual and after k steps we have the answer.
Let us first solve a simpler problem and find the k-th smallest number in a single sorted list of length n. The first number in the list is clearly the smallest number. Let us call this number x_{1}. The next number must be at least x_{1}+1, so we search for x_{1}+1 in the list using binary search. (Recall that binary search is an example of divide and conquer.) Either we find the number, in which case the second smallest number x_{2} is x_{1}+1, or we fail to find it, in which case we will be at the boundary between the last occurrence of x_{1} and the first occurrence of the next larger number, which is x_{2}. We now search for x_{2}+1 and discover x_{3}. Repeat this k−1 times to find x_{k}. Each binary search takes time log n, so overall this procedure takes time k log n, which is O(log n) if we treat k as a constant.
If we have two lists, we can find the first k numbers in the first list, which takes time O(log m), and then find the first k numbers in the second list, which takes time O(log n), and then merge these in time O(k) (which is a constant!) to find the k-th smallest number overall.
In fact, we can reduce the number of binary searches to k and avoid the merge step. Maintain indices i_{1} and i_{2} for the two lists, pointing initially to the first element in lists 1 and 2, respectively. At each stage, the smaller of the numbers pointed to by i_{1} and i_{2} is the next number to be enumerated. We then advance i_{1} or i_{2}, as the case may be, using binary search as described above. After k such steps, we would have found the k-th smallest number overall. Some steps may involve a binary search in list 1, which takes time O(log m), and others may involve a binary search in list 2, which takes time O(log n), so each step is bounded by max(O(log m), O(log n)). This gives an overall complexity of max(O(log m), O(log n)), which is equivalent to O(log m + log n).
Question 73 
Insertion in a sorted array takes constant time.  
Insertion in a sorted linear linked list takes constant time.  
Searching for a key in a sorted array can be done in O(log n) time.  
Searching for a key in a sorted linear linked list can be done in O(log n) time. 
Because the elements of the array are in sorted order, we can use binary search, which takes only O(logn) time in the worst case.
Question 74 
[(9, 0, 9), (7, 1, 8), (6, 1, 4), (0, 2, 5), (6, 5, 9), (3, 5, 7)]  
[(0, 2, 5), (3, 5, 7), (6, 1, 4), (6, 5, 9), (7, 1, 8), (9, 0, 9)]  
[(9, 0, 9), (7, 1, 8), (6, 1, 4), (0, 2, 5), (3, 5, 7), (6, 5, 9)]  
[(9, 0, 9), (6, 1, 4), (7, 1, 8), (0, 2, 5), (3, 5, 7), (6, 5, 9)] 
Question 75 
Give an O(log n) time algorithm to find the largest element in a circularly shifted array. (The number of positions through which it has been shifted is unknown to you.)
Descriptive Explanation 
The following code describes this algorithm; a/b gives the integer part of dividing a by b. The required solution is obtained by calling f(T, 0, n−1), where n is the length of T.
function f(T, first, last)
  if (T[first] <= T[last])
    return T[last]
  else {
    mid := first + (last - first)/2
    if (T[mid] > T[mid+1])
      return T[mid]
    else if (T[first] > T[mid])
      return f(T, first, mid-1)
    else if (T[mid] >= T[last])
      return f(T, mid+1, last)
  }
Question 76 
Ω(n)  
Ω(n/k)  
Ω(nlogk )  
Ω(n/klogn/k) 
(k!)^{n/k} ≤ 2^{h}
Taking the logarithm of both sides, we get:
h ≥ lg (k!)^{n/k}
= (n/k) lg(k!)
≥ (n/k)(k/2) lg(k/2)
= (1/2) n lg k − (1/2) n
= Ω(nlogk)
Question 77 
Ω(lg n)  
Ω(n)  
Ω(n lg n)  
Ω(n^{2}) 
Proof:
→ The number of leaves l ≥ n!
→ Any binary tree of height h has ≤ 2^{h} leaves.
→ Hence, n! ≤ l ≤ 2^{h}
→ Take logs: h ≥ lg(n!)
→ Use Stirling’s approximation: n! > (n/e)^{n}
→ We get:
h ≥ lg (n/e)^{n}
= n lg(n/e)
= n lg n − n lg e
= Ω(n lg n)
Question 78 
aiii, bi, cii, div  
aii, bi, civ, diii  
aii, bi, ciii, div  
biii, bii, ci, div 
Huffman coding→ Greedy approach
Optimal polygon triangulation→ Dynamic programming
Subset sum problem → Backtracking
Note: The subset sum problem can be solved by dynamic programming as well as by backtracking.
Question 79 
O (k (n + d))  
O(d (n + k))  
O((n + k) lg d)  
O((n + d) lg k) 
Question 80 
O(d n k)  
O(d n^{k})  
O((d +n) k)  
O(d (n + k))

Question 81 
Greedy algorithm, θ (V^{3})  
Greedy algorithm, θ (V^{2} lgn)  
Dynamic programming, θ (V^{3})  
Dynamic programming, θ (V^{2} lgn) 
Question 82 
O(log n)  
O(n)  
O(n log n)  
O(n^{2}) 
But using radix sort we can get linear time, O(n).
Question 83 
Merge sort, Quick sort  
Quick sort, Insertion sort  
Insertion sort, Quick sort  
insertion sort, Merge sort 
Method 1:
Merge sort has best, average and worst case time complexity O(nlogn), just like heap sort. Quick sort is the same as heap sort and merge sort except that its worst case time complexity is O(n^{2}). According to the question, P is merge sort and Q is quick sort.
Method 2:
Heap sort and insertion sort do not follow the divide and conquer technique, so they are like P.
Merge sort follows divide and conquer, so it is not like Q.
Note: The official key gives Option 4 as the answer.
Question 84 
Question 85 
Insertion sort  
Quick sort  
Merge sort  
Selection sort 
Question 86 
67, 12, 10, 5, 4, 7, 23  
4, 7, 10, 23, 67, 12, 5  
4, 5, 7, 67, 10, 12, 23  
10, 7, 4, 67, 23, 12, 5 
Pass1: 4, 10, 7, 23, 67, 12, 5
Pass2: 4, 7, 10, 23, 67, 12, 5
Pass3: 4, 7, 10, 23, 67, 12, 5 [ No change because the value 10 is placed in the same position ]
Question 87 
O(n)  
O(1)  
O(n^{2})  
O(n log n) 
Question 88 
Merge sort  
Radix sort  
Heap sort  
Address Calculation Sort 
Merge sort has space complexity of O(n). Hence it is not an in-place sorting algorithm.
Question 89 
The worst case time complexity of Quick Sort is __________
O(n^{2})  
O(log n)  
O(n)  
O(n logn)

The worst case of quicksort arises in only 2 situations:
1. The elements are already in sorted order.
2. All elements have the same value.
Question 90 
The worst case time complexity of Merge Sort is _______
O(n^{2})  
O(log n)  
O(n)  
O(n logn) 
Merge sort does not depend on the arrangement of the input: its best, average and worst case time complexity are all O(nlogn).
Question 91 
Which of the following sorting procedures is the slowest?
Quick sort  
Heap sort  
Shell sort  
Bubble sort

Question 92 
Which of the following sorting methods would be most suitable for sorting a list which is almost sorted?
Bubble Sort  
Insertion Sort  
Selection Sort  
Quick Sort 
Question 93 
The running time of insertion sort is
O(n^{2})  
O(n)  
O(log n)  
O(n log n) 
Question 94 
A sort which compares adjacent elements in a list and switches where necessary is _________
Insertion sort  
Heap sort  
Quick sort  
Bubble sort 
Question 95 
The correct order of the efficiency of the following sorting algorithms according to their overall running time comparison is
insertion>selection>bubble  
insertion>bubble>selection  
selection>bubble>insertion  
bubble>selection>insertion 
Question 96 
A sort which iteratively passes through a list to exchange the first element with any element less than it and then repeats with a new first element is called
insertion sort  
selection sort  
heap sort  
quick sort 
Question 97 
The algorithm that will efficiently sort an array that is nearly sorted except for the interchange of some adjacent pairs of numbers like {1,3,2,5,4,6} is
quick sort  
bubble sort  
merge sort  
selection sort 
Best case of Bubble sort is O(n)
Best case of merge sort is O(n logn)
Best case of selection sort is O(n^{2})
So the algorithm that will efficiently sort an array that is nearly sorted except for the interchange of some adjacent pairs of numbers is Bubble sort.
Question 98 
Assume that the algorithms considered here sort the input sequences in ascending order. If the input is already in ascending order, which of the following are TRUE?
I. Quick sort runs in Θ(n^{2}) time
II. Bubble sort runs in Θ(n^{2}) time
III. Mergesort runs in Θ(n^{2}) time
IV. Insertion sort runs in Θ(n^{2}) time
I and II only  
I and III only  
II and IV only  
I and IV only  
None of the above. 
Bubble sort runs in O(n) time if the input is already in sorted order.
Merge sort runs in O(nlogn) time.
Insertion sort runs in O(n) time.
Question 99 
How much extra space is used by heap sort?
O(1)  
O(log n)  
O(n)  
O(n^{2})

Heap sort uses the Max-Heapify function, which calls itself, but it can be written with a simple while loop, making it an iterative function that takes no extra space.
So, the space complexity of heap sort can be reduced to O(1).
> The heapify function does not take O(logn) space; it takes O(logn) time in the worst case. Implemented iteratively rather than through recursion, it requires no auxiliary space: we use only the input array.
So, we do heap sort in O(1) space complexity.
Question 100 
What is meant by stable sorting algorithm?
A sorting algorithm is stable if it preserves the order of all keys  
A sorting algorithm is stable if it preserves the order of non-duplicate keys  
A sorting algorithm is stable if it preserves the order of duplicate keys  
A sorting algorithm is stable if it doesn't preserve the order of duplicate keys 
Examples of stable sorting algorithms are insertion sort, merge sort, counting sort, etc.
Question 101 
Consider the following statements
1. Insertion sort and merge sort are stable
2. Heap sort is inherently unstable
3. Selection sort is not inherently stable, but may be coded in such a way that it is stable
Only statement 1 is correct  
All statements are correct  
Statement 3 is not correct  
Only statements 1 and 2 are correct 
S1 is true.
S2 is True. Heapsort is not stable because operations on the heap can change the relative order of equal items: when sorting in ascending order, heapsort first picks the largest element and puts it at the end of the list.
S3 is True. Selection sort is unstable by nature but can be made stable with some changes to the algorithm; hence it is not inherently unstable.
An inherently unstable algorithm is one which cannot be made stable even by modifying the algorithm.
Question 102 
What is the complexity of merge sort?
O(n)  
O(n^{2} logn)  
O(nlogn)  
O(n^{2}) 
Question 103 
Time complexity of Merge sort is
O(n)  
O(n^{2})  
O(log n)  
O(n log n) 
Question 104 
In which sorting algorithm average number of comparison is (n^{2} – n) / 2?
Insertion sort  
Selection sort  
Merge sort  
Quick sort 
Question 105 
Quick Sort  
Heap Sort  
Merge sort  
All the above 
Question 106 
Dynamic programming  
Back tracking  
Divide and conquer  
Greedy method 
Question 107 
Bubble Sort  
Quick Sort  
Merge Sort  
Heap Sort 
Question 108 
Quick sort is run on two inputs shown below to sort in ascending order:
I : 1, 2, 3, …, n    II : n, n−1, …, 2, 1
Let k1 and k2 be the number of comparisons made for the inputs I and II respectively. Then
k1 < k2  
k1 = k2  
k1 > k2  
None 
Question 109 
Heap Sort  
Quick Sort  
Insertion Sort  
None of the above 
An example of external sorting is merge sort.
Question 110 
–lgn/ lg(1 – α)  
–(lg (1 – α) / lgn)  
–lgn / lgα  
–lgα / lgn 
Question 111 
function mystery (A[0..99]) {
int i,j,m;
for (i = 0; i < 100; i++) {
m = i;
for (j = i; j < 100; j++) {
if (A[j] > A[m]) {
m = j;
}
}
reverse(A,i,m);
}
return;
}
When the procedure terminates, the array A has been:
Sorted in descending order  
Sorted in ascending order  
Reversed  
Left unaltered 
Question 112 
O(n log n) but not O(n)  
O(n) but not O(√n)  
O(√n) but not O(log n)  
O(log n) but not O(1)  
O(1) 
Question 113 
A can be sorted with constant · kn comparisons but not with fewer comparisons  
A cannot be sorted with less than constant · n log n comparisons  
A can be sorted with constant · n comparisons  
A can be sorted with constant · n log k comparisons but not with fewer comparisons  
A can be sorted with constant · k^{2}n comparisons but not fewer 
Question 114 
The running time of the algorithm is Θ(n).  
The running time of the algorithm is Θ(n log n)  
The running time of the algorithm is Θ(n^{1.5}).  
The running time of the algorithm is Θ(n^{2})  
None of the above. 
Question 115 
ascending order  
descending order  
random order  
all of the above 
Question 116 
O(n),O(n)  
O(n),O(n^{2})  
O(n^{2}),O(n)  
O(1),O(n) 
Bubble sort will take O(n^{2}) time
Question 117 
(n−1)  
n/2  
0  
log n 
Question 118 
Insertion sort done on a heap data structure instead of a list.  
Selection sort done on a heap data structure instead of a list.  
Bubble sort done on a heap data structure instead of a list.  
None of the above. 
We do something similar in selection sort: we find the min (or max) element, exchange it with the first element, and then repeat the procedure on the remaining elements.
Question 119 
The correct result of Lexicographically sorting
109, 20, 119, 2,207, 27, 19
in ascending order is:
19, 109, 119, 2, 20, 27, 207  
109, 119, 19, 2,20,27,207  
109, 119, 19, 2,20,207,27  
2, 19, 20, 27, 109, 119 , 207 
So the correct answer is,
109,119,19,2,20,207,27
Question 120 
Quick Sort  
Merge Sort  
Radix Sort  
Bubble Sort 
Question 121 
Heap sort  
Insertion sort  
Quick sort  
Merge sort 
One example of external sorting is the external merge sort algorithm, which sorts chunks that each fit in RAM, then merges the sorted chunks together. We first divide the file into runs such that the size of a run is small enough to fit into main memory. Then we sort each run in main memory using the merge sort algorithm. Finally, we merge the resulting runs together into successively bigger runs, until the file is sorted.
Question 122 
Quicksort and Radix sort  
Heap sort and Merge sort  
Bubble sort and Selection sort  
None of the above. 
Merge sort has worst case time complexity O(nlogn) and average case time complexity O(nlogn).
Bubble sort has worst case time complexity O(n^{2}) and average case time complexity O(n^{2}).
Selection sort has worst case time complexity O(n^{2}) and average case time complexity O(n^{2}).
Quick sort has worst case time complexity O(n^{2}) and average case time complexity O(nlogn).
Radix sort has worst case time complexity O(n+K) and average case time complexity O(n+K).
Question 123 
insertion sort only  
selection sort only  
bubble sort and selection sort  
bubble sort and insertion sort 
Question 124 
bubble sort  
insertion sort  
selection sort  
quicksort 
Whereas in bubble sort the average number of swaps is O(n^{2}), in insertion sort it is O(n^{2}), and in quick sort it is O(nlogn).
Question 125 
Quick Sort  
Merge Sort  
Radix Sort  
Bubble Sort 
The average time complexity of merge sort is O(nlogn), which is not quadratic.
The average time complexity of radix sort is O(n+k), which is not quadratic.
The worst case time complexity of quick sort is O(n^{2}), which is quadratic.
Question 126 
the first two use recursive partitioning, while the third uses a static tree-like structure  
All of them use recursive partitioning  
the first two use a static tree-like structure, while the third uses recursive partitioning  
the first two use nested do loops where both loops are bounded by the array size, while the third uses a static tree-like structure 
Question 127 
15  
12 to 13  
10  
25 
A B C D E
Beginning with E, it takes n − 2 = 3 comparisons to recognize that B in the sublist is already in its correct position. Sorting is now reduced to sorting the following sublist with n − 2 = 3 letters:
A B C D E
Consequently we have 4 + 3 + 2 + 1 = 10.
So using quick sort it takes
C = (n−1) + (n−2) + (n−3) + ... + 2 + 1 = n(n−1)/2 = O(n^{2})
≈ 5^{2}
= 25
Whenever the sequence is sorted either in descending order or ascending order, the time complexity of quick sort becomes O(n^{2}). Since the given sequence of elements is in sorted order, the answer will be approximately n^{2} = 5^{2} = 25.
Question 128 
0  
1  
10  
20 
(n−1) + (n−2) + (n−3) + ... + 1 = n(n−1)/2 = 5×4/2 = 10
Question 129 
bubble sort  
Shell sort  
Merge Sort  
Selection sort 
Question 130 
001<010<011<0001<0101  
0001<001<010<0101<011  
0001<0101<001<010<011  
001<010<0001<0101<011 
0001<001<010<0101<011
Note: Lexicographical order is nothing but dictionary based order.
Question 131 
Insertion sort  
Internal sort  
External sort  
Radix sort 
Question 132 
Distribution sort  
Internal sort  
External sort  
Radix sort 
Question 133 
8, 9, 15, 20, 47, 4, 12, 17, 30, 40  
8, 15, 20, 47, 4, 9, 30, 40, 12, 17  
15, 20, 47, 4, 8, 9, 12, 30, 40, 17  
4, 8, 9, 15, 20, 47, 12, 17, 30, 40 
Pass1: (20,47), (8,15), (4,9), (30,40), (12,17)
Pass2: (8,15,20,47), (4,9,30,40), (12,17)
Pass3: (4,8,9,15,20,30,40,47), (12,17)
Pass4: (4,8,9,12,15,17,20,30,40,47)
So within 4 passes we can sort all the elements.
Question 134 
It is faster  
Consumes less memory  
Detects whether the input is already sorted  
All of the options 
Option A is definitely FALSE.
Option B is not an advantage compared to other sorting techniques.
Option C: Bubble sort gives the best case O(n) when the elements are already in sorted order, i.e., it detects that the input is already sorted.
Question 135 
O(nlogn)  
O(logn)  
O(n)  
O(n^{2}) 
T(n)=2T(n/2)+n
=O(nlogn)
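By the Master theorem with a = 2, b = 2, k = 1, p = 0: since a = b^{k}, T(n) = Θ(n^{k} log n) = Θ(nlogn).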
Question 136 
Impossible to sort in linear time  
Radix Sort  
Insertion Sort  
Bubble Sort 
Radix sort gives linear time even when the input values range from 0 to n^{6}: written in base n they have only a constant number of digits, so the total time is O(n).
Question 137 
4  
2  
1  
3 
Note: The official answer key gives Option D as the correct answer.
Question 138 
procedure mystery() {
for (i = 0; i < 100; i++) { C[i] = A[i]; }
p = 99;
for (i = 0; i < 100; i++) {
B[p] = C[0];
p = p - 1;
for (j = 1; j < 100; j++) {
C[j-1] = C[j];
}
}
}
When the procedure terminates, which of the following statements can be asserted about the array B?
All elements of B are equal to A[0]  
B contains the elements of A sorted in descending order  
All values of B are the same  
B contains the elements of A in reverse order 
Question 139 
procedure mystery() {
for (i = 0; i < 100; i++) { C[i] = A[i]; }
p = 99;
for (i = 0; i < 100; i++) {
B[p] = C[0];
p = p - 1;
for (j = 1; j < 100; j++) {
C[j-1] = C[j];
}
}
}
When the procedure terminates, which of the following statements can be asserted about the array C?
It contains the elements of A sorted in ascending order  
It contains the elements of A sorted in descending order  
All values are equal to A[99]  
All values are equal to A[0] 
At the end of iteration i, C[99], which is the same as A[99], is copied to positions 99−i, 99−i+1, ..., 99 of C. Since the outer loop runs for 100 iterations, the result follows.
Question 140 
Descriptive explanation 
Now i, k ∈ Si and j, k ∈ Sj, and both i and j are good. Therefore A[i] ≤ A[k] and A[k] ≤ A[j], and hence A[i] ≤ A[j]. The desired result follows.
Question 141 
O(n log n) but not O(n log log n)  
O(n log log n) but not O(n)  
O(n) but not O(n/ log log n)  
O(n/ log log n)  
None of the above. 
Question 142 
c = 3n  
c = 2n + 5  
c ≥ 3n − 1  
c ≤ n  
c ≤ 2n + 3 
Question 143 
Further, let count_IS be the number of comparisons made by the insertion sort algorithm on the array a.
Which of the following statements is TRUE for some constant c?
For all N ≥ c, there exists an array of size N for which count_IS ≥ N^{2}/c, while count_FN ≤ cN  
For all N ≥ c, there exists an array of size N for which count_FN ≥ N^{2}/c, while count_IS ≤ cN  
For all N ≥ c, for all arrays of size N, count_FN ≤ count_IS ≤ c × count_FN  
For all N ≥ c, for all arrays of size N, count_FN ≥ N^{2}/c  
None of the above 