NTA UGC NET JUNE-2023 Paper-2
November 18, 2023
Software-Engineering


Question 17
Consider the following statements:
A. Fuzzy C-means clustering is a supervised method of learning
B. PCA is used for dimension reduction
C. Apriori is not a supervised technique
D. When a machine learning model becomes so specifically tuned to its exact input data that it fails to generalize to other similar data, it is called underfitting

Choose the correct answer from the options given below

A. A and B
B. B and C
C. C and D
D. D and A
Question 17 Explanation: 
Statement B is correct. PCA (Principal Component Analysis) is indeed used for dimension reduction in machine learning and data analysis.
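As a minimal sketch of how PCA reduces dimension, the projection onto the top-k principal components can be computed with NumPy's SVD (the data here is an illustrative assumption; in practice a library implementation such as scikit-learn's `PCA` is typically used):

```python
import numpy as np

def pca_reduce(X, k):
    """Project data X (n_samples x n_features) onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                          # center each feature
    # Right singular vectors of the centered data are the principal axes
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                             # coordinates in the reduced space

# Toy usage: four 3-D points reduced to 2-D
X = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.1],
              [1.0, 1.0, 0.9],
              [3.0, 3.0, 1.0]])
Z = pca_reduce(X, 2)
print(Z.shape)  # dimension reduced from 3 features to 2
```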

Statement C is also correct. Apriori is a frequent itemset mining algorithm used in association rule learning and is not a supervised technique in machine learning.
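To see why Apriori is unsupervised: it mines frequent itemsets from unlabeled transactions, with no target variable to predict. A minimal sketch of its level-wise candidate generation, using brute-force support counting on hypothetical basket data (a full implementation would also prune candidates with infrequent subsets):

```python
def apriori(transactions, min_support):
    """Return all itemsets whose support (fraction of transactions) >= min_support."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    # Level 1: frequent single items
    items = {i for t in transactions for i in t}
    frequent = {frozenset([i]) for i in items if support(frozenset([i])) >= min_support}
    result = set(frequent)
    k = 2
    while frequent:
        # Join step: unions of frequent (k-1)-itemsets that have size k
        candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
        frequent = {c for c in candidates if support(c) >= min_support}
        result |= frequent
        k += 1
    return result

# Toy unlabeled market-basket data (hypothetical)
baskets = [{"milk", "bread"}, {"milk", "bread", "eggs"},
           {"bread", "eggs"}, {"milk", "eggs"}]
print(apriori(baskets, min_support=0.5))
```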

Statements A and D are not correct:

Statement A is incorrect. Fuzzy C-means is a clustering algorithm, and clustering is an unsupervised method of learning, so describing it as a supervised method is wrong.

Statement D is also incorrect. “Underfitting” is when a model is too simple and fails to capture the underlying patterns in data. It is the opposite of overfitting, which is when a model becomes too specialized to its training data.

Correct Answer: Option B (B and C)
