T4Tutorials.PK

NSCT-Feature Engineering & Selection MCQs

1. Feature engineering is:

(A) Encrypting features only


(B) The process of creating new input features from existing data to improve model performance


(C) Compressing datasets only


(D) Backup only




2. Feature selection is:

(A) The process of choosing the most relevant features for model training


(B) Encrypting selected features


(C) Compressing features


(D) Backup only




3. Why is feature engineering important?

(A) It helps improve model accuracy, interpretability, and efficiency


(B) Encrypts features only


(C) Compresses datasets


(D) Backup only




4. Common feature engineering techniques include:

(A) Compressing features only


(B) Encrypting variables only


(C) Encoding categorical variables, scaling, creating interaction terms, and aggregations


(D) Backup only




5. One-hot encoding is used to:

(A) Convert categorical variables into binary vectors


(B) Encrypt categories


(C) Compress data


(D) Backup only
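A minimal pure-Python sketch of one-hot encoding (the helper name `one_hot_encode` and the sample data are illustrative, not from any particular library):

```python
def one_hot_encode(values):
    # Map each distinct category to a binary vector with a single 1.
    categories = sorted(set(values))
    index = {c: i for i, c in enumerate(categories)}
    return [[1 if index[v] == i else 0 for i in range(len(categories))]
            for v in values]

colors = ["red", "green", "red", "blue"]
encoded = one_hot_encode(colors)
# sorted categories: ["blue", "green", "red"], so "red" -> [0, 0, 1]
```

In practice a library encoder (e.g. scikit-learn's `OneHotEncoder`) handles unseen categories and sparse output, but the binary-vector idea is the same.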




6. Label encoding is:

(A) Compressing labels


(B) Encrypting labels


(C) Converting categorical variables into numeric labels


(D) Backup only
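Label encoding can be sketched in a few lines (the alphabetical ordering here is an arbitrary choice for the example):

```python
def label_encode(values):
    # Assign each distinct category a numeric label (alphabetical order here).
    mapping = {c: i for i, c in enumerate(sorted(set(values)))}
    return [mapping[v] for v in values]

sizes = ["small", "large", "medium", "small"]
labels = label_encode(sizes)
# "large" -> 0, "medium" -> 1, "small" -> 2
```

Note the numeric labels impose an artificial order, which is why one-hot encoding is often preferred for nominal categories.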




7. Feature scaling is important because:

(A) Encrypting scaling


(B) Many algorithms like SVM, KNN, and gradient descent are sensitive to feature magnitude


(C) Compressing features


(D) Backup only




8. Standardization scales features by:

(A) Subtracting mean and dividing by standard deviation


(B) Encrypting data


(C) Compressing data


(D) Backup only
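The subtract-mean, divide-by-standard-deviation recipe from option (A), sketched in plain Python (population standard deviation is assumed here):

```python
def standardize(xs):
    # z-score: subtract the mean, divide by the (population) standard deviation.
    mean = sum(xs) / len(xs)
    std = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - mean) / std for x in xs]

z = standardize([2.0, 4.0, 6.0, 8.0])
# result has mean 0 and standard deviation 1
```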




9. Normalization scales features by:

(A) Compressing values


(B) Encrypting values


(C) Bringing values to a fixed range, often [0,1]


(D) Backup only
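Min-max normalization to the [0, 1] range from option (C), as a small sketch (this version assumes the values are not all identical, or the division would fail):

```python
def min_max_normalize(xs):
    # Rescale values linearly into the range [0, 1].
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

scaled = min_max_normalize([10, 20, 15, 30])
# 10 maps to 0.0, 30 maps to 1.0, the rest fall in between
```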




10. Feature interaction involves:

(A) Creating new features by combining two or more existing features


(B) Encrypting features


(C) Compressing features


(D) Backup only
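Creating an interaction term by multiplying two existing features, as in option (A) — a minimal sketch with a made-up two-feature dataset:

```python
def add_interaction(rows, i, j):
    # Append the product of features i and j as a new interaction feature.
    return [row + [row[i] * row[j]] for row in rows]

data = [[2.0, 3.0], [4.0, 5.0]]
with_interaction = add_interaction(data, 0, 1)
# each row gains a third column: the product of its first two features
```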




11. Principal Component Analysis (PCA) is used for:

(A) Backup only


(B) Encrypting components


(C) Compressing features


(D) Dimensionality reduction by transforming features into uncorrelated components
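A compact PCA sketch using NumPy (assumed available): center the data, diagonalize the covariance matrix, and project onto the top components. The sample matrix is illustrative only.

```python
import numpy as np

def pca(X, n_components):
    # Center the data, then project onto the top eigenvectors of the covariance matrix.
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]        # sort components by explained variance
    components = eigvecs[:, order[:n_components]]
    return Xc @ components

X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9],
              [1.9, 2.2], [3.1, 3.0], [2.3, 2.7]])
reduced = pca(X, 1)   # six samples projected from 2-D down to 1-D
```

The resulting components are uncorrelated by construction, which is exactly the property option (D) names.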




12. Recursive Feature Elimination (RFE) is:

(A) Encrypting features


(B) A method to select important features by recursively removing less important ones


(C) Compressing features


(D) Backup only
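The recursive-removal loop can be sketched in pure Python. Real RFE refits a model each round and uses its coefficients as importance scores; this toy version substitutes absolute correlation with the target as the importance score, and the feature names and data are invented for illustration.

```python
def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs) ** 0.5
    vy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (vx * vy)

def rfe(features, target, n_keep):
    # Recursively drop the feature with the weakest |correlation| to the target.
    remaining = dict(features)
    while len(remaining) > n_keep:
        weakest = min(remaining, key=lambda f: abs(pearson(remaining[f], target)))
        del remaining[weakest]
    return sorted(remaining)

features = {
    "relevant": [1, 2, 3, 4, 5],   # perfectly tracks the target
    "noise":    [5, 1, 4, 2, 2],   # unrelated to the target
}
target = [2, 4, 6, 8, 10]
kept = rfe(features, target, 1)
```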




13. Mutual information in feature selection measures:

(A) The dependency between features and the target variable


(B) Encrypting dependency


(C) Compressing data


(D) Backup only
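For discrete variables, mutual information has a direct formula, I(X;Y) = Σ p(x,y) log₂[p(x,y) / (p(x)p(y))], which a short sketch can compute from observed counts (the example arrays are illustrative):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    # Estimate I(X;Y) in bits from empirical joint and marginal frequencies.
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

target = [0, 0, 1, 1]
copy_f = [0, 0, 1, 1]   # fully dependent on the target: maximal information
rand_f = [0, 1, 0, 1]   # statistically independent of the target: zero information
```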




14. Correlation-based feature selection aims to:

(A) Backup only


(B) Encrypt correlations


(C) Compress features


(D) Remove redundant features that are highly correlated with others
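A greedy sketch of redundancy removal: keep a feature only if it is not highly correlated with any feature already kept. The threshold 0.95 and the height/weight columns are made-up illustrations.

```python
def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs) ** 0.5
    vy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (vx * vy)

def drop_correlated(features, threshold=0.95):
    # Greedily keep features that are not highly correlated with an already-kept one.
    kept = {}
    for name, col in features.items():
        if all(abs(pearson(col, kc)) < threshold for kc in kept.values()):
            kept[name] = col
    return list(kept)

features = {
    "height_cm": [150, 160, 170, 180],
    "height_in": [59.1, 63.0, 66.9, 70.9],  # same quantity in inches: redundant
    "weight_kg": [55, 80, 62, 70],
}
selected = drop_correlated(features)
```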




15. L1 regularization (Lasso) can be used for:

(A) Compressing coefficients


(B) Encrypting coefficients


(C) Feature selection by shrinking less important feature coefficients to zero


(D) Backup only
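The mechanism behind option (C) is the soft-thresholding operator, which appears inside coordinate-descent Lasso solvers: coefficients smaller in magnitude than the penalty collapse to exactly zero. This sketches the operator only, not a full solver; the coefficient values are invented.

```python
def soft_threshold(coef, alpha):
    # L1 penalty shrinks a coefficient toward zero; small ones become exactly 0.
    if coef > alpha:
        return coef - alpha
    if coef < -alpha:
        return coef + alpha
    return 0.0

coefs = [2.5, -0.3, 0.1, -4.0]
shrunk = [soft_threshold(c, alpha=0.5) for c in coefs]
# the two weak coefficients become exactly 0, effectively dropping those features
```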




16. Feature importance scores from tree-based models help to:

(A) Compress features


(B) Encrypt features


(C) Identify the most influential features for predictions


(D) Backup only




17. Removing irrelevant or noisy features helps to:

(A) Encrypt features


(B) Improve model performance and reduce overfitting


(C) Compress datasets


(D) Backup only




18. Dimensionality reduction helps to:

(A) Backup only


(B) Encrypt features


(C) Compress features


(D) Reduce computational cost while retaining important information




19. Interaction terms are useful when:

(A) Encrypting interactions


(B) Relationships between features affect the target variable


(C) Compressing interactions


(D) Backup only




20. The main purpose of feature engineering and selection is to:

(A) Encrypt features only


(B) Prepare high-quality, relevant features to improve model accuracy, efficiency, and interpretability


(C) Compress datasets only


(D) Backup only



