NSCT-Ensemble Learning MCQs

1. Ensemble learning is:
(A) Encrypting models
(B) A machine learning technique that combines multiple models to improve overall performance
(C) Compressing datasets
(D) Backup only

2. The main goal of ensemble learning is to:
(A) Reduce errors, variance, and bias compared to individual models
(B) Encrypt predictions
(C) Compress predictions
(D) Backup only

3. Bagging in ensemble learning stands for:
(A) Balanced Algorithm
(B) Binary Aggregation
(C) Bootstrap Aggregating
(D) Backup only

4. Bagging improves model performance by:
(A) Compressing predictions
(B) Encrypting models
(C) Training multiple models on random subsets of data and averaging predictions
(D) Backup only

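A minimal sketch of the bagging procedure described in option (C), assuming scikit-learn is available; the synthetic dataset and every parameter value below are illustrative choices, not part of the original question.

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic dataset (arbitrary sizes, for illustration only).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: each tree is trained on a bootstrap sample of the training
# data, and the final prediction aggregates the individual votes.
bagging = BaggingClassifier(
    DecisionTreeClassifier(),  # base learner
    n_estimators=50,
    random_state=0,
)
bagging.fit(X_train, y_train)
print("Bagging accuracy:", bagging.score(X_test, y_test))
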
5. Random Forest is:
(A) Encrypting trees
(B) An ensemble of decision trees using bagging and feature randomness
(C) Compressing trees
(D) Backup only

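Option (B)'s combination of bagging plus feature randomness can be sketched the same way (again an illustrative scikit-learn example with arbitrary parameters): each tree sees a bootstrap sample, and each split considers only a random subset of features.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# max_features limits how many features each split may consider,
# which decorrelates the trees before their predictions are averaged.
forest = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                                random_state=0)
forest.fit(X_train, y_train)
print("Random Forest accuracy:", forest.score(X_test, y_test))
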
6. Boosting is:
(A) Backup only
(B) Encrypting boosting
(C) Compressing boosting
(D) An ensemble method that trains models sequentially, giving more weight to previously misclassified instances

7. AdaBoost stands for:
(A) Adaptive Boosting
(B) Automatic Boosting
(C) Algorithmic Boosting
(D) Backup only

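Putting questions 6 and 7 together: Adaptive Boosting trains models one after another and re-weights the instances the earlier models misclassified. A hedged scikit-learn sketch (dataset and parameters are arbitrary):

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Models are fit sequentially; misclassified instances receive larger
# weights so later learners focus on the hard cases.
ada = AdaBoostClassifier(n_estimators=50, random_state=0)
ada.fit(X_train, y_train)
print("AdaBoost accuracy:", ada.score(X_test, y_test))
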
8. Gradient Boosting works by:
(A) Encrypting gradient
(B) Optimizing a loss function by sequentially adding models to correct errors
(C) Compressing gradients
(D) Backup only

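A short sketch of option (B), assuming scikit-learn: each added tree is fit to the gradient of the loss, i.e. it corrects the residual errors of the ensemble built so far. The learning rate shrinks each tree's contribution.

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Sequentially adds trees that fit the loss gradient (the current
# errors); learning_rate controls how much each tree contributes.
gb = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                random_state=0)
gb.fit(X_train, y_train)
print("Gradient Boosting accuracy:", gb.score(X_test, y_test))
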
9. Stacking in ensemble learning is:
(A) Compressing predictions
(B) Encrypting stacks
(C) Combining predictions of multiple models using a meta-model
(D) Backup only

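As a minimal sketch of the meta-model idea in option (C), assuming scikit-learn (the base models and meta-model below are arbitrary choices):

from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Base models make predictions; a meta-model (here a logistic
# regression) learns how to combine those predictions.
stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_train, y_train)
print("Stacking accuracy:", stack.score(X_test, y_test))
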
10. Voting classifiers in ensemble learning:
(A) Encrypt votes
(B) Make predictions based on majority voting from multiple models
(C) Compress votes
(D) Backup only

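A hedged sketch of the majority voting in option (B), assuming scikit-learn; the three member models are illustrative picks:

from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hard voting: the predicted class is the majority vote of the members.
vote = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("tree", DecisionTreeClassifier(random_state=0)),
                ("nb", GaussianNB())],
    voting="hard",
)
vote.fit(X_train, y_train)
print("Voting accuracy:", vote.score(X_test, y_test))
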
11. Ensemble methods are used to:
(A) Compress datasets
(B) Encrypt models
(C) Increase accuracy and robustness of machine learning models
(D) Backup only

12. A key advantage of ensemble learning is:
(A) Encrypting predictions
(B) Reducing overfitting and improving generalization
(C) Compressing features
(D) Backup only

13. Random Forest handles overfitting by:
(A) Using multiple decision trees and averaging their predictions
(B) Encrypting trees
(C) Compressing trees
(D) Backup only

14. A key difference between bagging and boosting is:
(A) Bagging encrypts data
(B) Bagging trains models in parallel, while boosting trains them sequentially
(C) Boosting compresses features
(D) Backup only

15. Out-of-bag (OOB) error is used in:
(A) Encrypting error
(B) Random Forest to estimate prediction error without separate test data
(C) Compressing OOB data
(D) Backup only

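A minimal sketch of option (B), assuming scikit-learn: each bootstrap sample leaves out roughly one third of the training rows, and those out-of-bag rows act as a built-in validation set.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# oob_score=True evaluates each sample using only the trees that did
# not see it during training, so no separate test split is needed.
forest = RandomForestClassifier(n_estimators=100, oob_score=True,
                                random_state=0)
forest.fit(X, y)
print("OOB score (estimated accuracy):", forest.oob_score_)
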
16. Ensemble methods are particularly useful when:
(A) Backup only
(B) Encrypting models
(C) Compressing datasets
(D) Individual models have high variance or bias

17. Weighted voting in ensemble learning:
(A) Assigns different importance to each model's prediction
(B) Encrypts weights
(C) Compresses votes
(D) Backup only

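Option (A) can be sketched by extending the earlier voting example (again assuming scikit-learn; the weight values here are arbitrary, purely to show the mechanism):

from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# weights=[2, 1]: the logistic regression's probabilities count twice
# as much as the tree's when the soft votes are averaged.
weighted = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("tree", DecisionTreeClassifier(random_state=0))],
    voting="soft",
    weights=[2, 1],
)
weighted.fit(X_train, y_train)
print("Weighted voting accuracy:", weighted.score(X_test, y_test))
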
18. Bagging reduces:
(A) Data size
(B) Bias
(C) Variance of model predictions
(D) Backup only

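One rough way to see the variance reduction in option (C): compare the spread of cross-validation scores for a single deep tree against a bagged ensemble of the same trees. This is only a proxy for prediction variance, and the dataset is synthetic, so treat it as an illustration rather than a proof.

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Spread (std) of fold scores is a crude proxy for prediction variance.
tree_scores = cross_val_score(DecisionTreeClassifier(random_state=0),
                              X, y, cv=10)
bag_scores = cross_val_score(
    BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                      random_state=0),
    X, y, cv=10)
print("single tree:  mean %.3f, std %.3f"
      % (tree_scores.mean(), tree_scores.std()))
print("bagged trees: mean %.3f, std %.3f"
      % (bag_scores.mean(), bag_scores.std()))
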
19. Boosting reduces:
(A) Backup only
(B) Variance only
(C) Data size
(D) Bias and improves model accuracy

20. The main purpose of ensemble learning is to:
(A) Encrypt all models
(B) Combine multiple models to create a stronger, more accurate, and robust predictive model
(C) Compress all features
(D) Backup only
