NPTEL Introduction to Machine Learning Assignment 7 Answers 2023

Hello Learners, in this post you will find the NPTEL Introduction to Machine Learning Assignment 7 (Week 7) Answers 2023. All the answers are provided below as a reference to help students; please don't look for the solutions straight away.


Note: First try to solve the questions by yourself. If you find any difficulty, then look for the solutions.

COURSE NAME | ANSWER
NPTEL Introduction to Machine Learning Assignment 1 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 2 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 3 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 4 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 5 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 6 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 7 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 8 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 9 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 10 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 11 Answers | Click Here
NPTEL Introduction to Machine Learning Assignment 12 Answers | Click Here

NPTEL Introduction to Machine Learning Assignment 7 Answers 2023:

We are updating the answers soon; please check back for updates.

Q.1. For the given confusion matrix, compute the recall

  • 0.73
  • 0.7
  • 0.6
  • 0.67
  • 0.78
  • None of the above
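
The confusion matrix referred to in Q.1 is not reproduced above, but recall is computed the same way for any binary confusion matrix: recall = TP / (TP + FN). A minimal sketch with a made-up matrix (the numbers are purely illustrative, so the result below is not the answer to Q.1):

```python
import numpy as np

# Hypothetical 2x2 confusion matrix, laid out as
# rows = actual class, columns = predicted class:
# [[TN, FP],
#  [FN, TP]]
cm = np.array([[50, 10],
               [15, 45]])

TN, FP = cm[0]
FN, TP = cm[1]

recall = TP / (TP + FN)        # also called sensitivity / true positive rate
precision = TP / (TP + FP)

print(f"recall    = {recall:.2f}")     # 45 / 60 = 0.75 for this made-up matrix
print(f"precision = {precision:.2f}")
```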

Q.2. You have 2 multi-class classifiers A and B. A has accuracy = 0% and B has accuracy = 50%. Which classifier is more useful?

  • A
  • B
  • Both are equally good
  • Depends on the number of classes
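
One way to reason about Q.2: a binary classifier that is always wrong can be turned into a perfect one simply by flipping its predictions, while with many classes a wrong prediction only rules out a single label. A tiny illustration of the binary case, with made-up labels:

```python
import numpy as np

y_true = np.array([0, 1, 0, 1, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1])   # always wrong: accuracy = 0%

accuracy = (y_pred == y_true).mean()
flipped_accuracy = ((1 - y_pred) == y_true).mean()   # invert every prediction

print(accuracy)          # 0.0
print(flipped_accuracy)  # 1.0 -- a 0%-accuracy binary classifier still carries information
```

With more than two classes the same inversion trick no longer works directly, which is why the number of classes matters when judging how useful such a classifier is.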

Q.3. For large datasets, we should always choose a large k when doing k-fold cross-validation to get better performance on the test set.

  • True
  • False

Q.4. We have a dataset with 1000 samples and 5 classes for classification. What would be the training set size for 20-fold cross-validation?

  • 50
  • 200
  • 800
  • 950
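
The arithmetic behind k-fold splits (relevant to both Q.3 and Q.4): with n samples and k folds, each held-out fold contains roughly n/k samples and the model trains on the remaining n − n/k. A short sketch using scikit-learn's KFold on dummy data (the feature values are placeholders; the number of classes does not affect the split sizes):

```python
import numpy as np
from sklearn.model_selection import KFold

X = np.zeros((1000, 5))   # 1000 samples, 5 dummy features

for k in (5, 10, 20):
    kf = KFold(n_splits=k)
    train_idx, test_idx = next(iter(kf.split(X)))
    print(f"k={k:2d}: train size = {len(train_idx)}, test size = {len(test_idx)}")

# k= 5: train size = 800, test size = 200
# k=10: train size = 900, test size = 100
# k=20: train size = 950, test size =  50
```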

Q.5. Which of the following are true?

TP – True Positive, TN – True Negative, FP – False Positive, FN – False Negative

  • Answer: A and B
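
The statements labelled A and B are not reproduced above, but the standard quantities built from TP, TN, FP, and FN are worth keeping at hand when checking them. A small reference sketch (the counts passed at the end are made up):

```python
def classification_metrics(TP, TN, FP, FN):
    """Standard binary-classification metrics from confusion-matrix counts."""
    return {
        "accuracy":    (TP + TN) / (TP + TN + FP + FN),
        "precision":   TP / (TP + FP),
        "recall":      TP / (TP + FN),          # sensitivity, true positive rate
        "specificity": TN / (TN + FP),          # true negative rate
        "fpr":         FP / (FP + TN),          # false positive rate = 1 - specificity
        "f1":          2 * TP / (2 * TP + FP + FN),
    }

print(classification_metrics(TP=45, TN=50, FP=10, FN=15))  # illustrative counts only
```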

Q.6. In the ROC plot, what are the quantities along x and y axes respectively?

  • Precision, Recall
  • Recall, Precision
  • True Positive Rate, False Positive Rate
  • False Positive Rate, True Positive Rate
  • Specificity, Sensitivity
  • True Positive, True Negative
  • True Negative, True Positive
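
As a memory aid for the ROC convention, scikit-learn's roc_curve returns the false positive rate and the true positive rate (in that order), and a ROC plot conventionally puts FPR on the x-axis and TPR on the y-axis. A minimal sketch with made-up labels and scores:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

y_true  = np.array([0, 0, 1, 1, 0, 1, 1, 0])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.6, 0.5])  # illustrative classifier scores

fpr, tpr, thresholds = roc_curve(y_true, y_score)
print("FPR (x-axis):", fpr)
print("TPR (y-axis):", tpr)
print("AUC:", roc_auc_score(y_true, y_score))
```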
NPTEL Introduction to Machine Learning Week 7 Answers Join Group👇
CLICK HERE

Q.7. How does bagging help in improving the classification performance?

  • If the parameters of the resultant classifiers are fully uncorrelated (independent), then bagging is inefficient.
  • It helps reduce variance
  • If the parameters of the resultant classifiers are fully correlated, then bagging is inefficient.
  • It helps reduce bias
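
The intuition behind bagging is that averaging many high-variance models trained on bootstrap resamples reduces variance (it does little for bias, and the benefit shrinks as the individual models become more correlated). A minimal scikit-learn sketch on synthetic data, comparing a single decision tree with a bagged ensemble of trees:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic classification data, purely for illustration
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

single_tree = DecisionTreeClassifier(random_state=0)                        # low bias, high variance
bagged_trees = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,  # 50 bootstrap-trained trees
                                 random_state=0)

print("single tree :", cross_val_score(single_tree, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged_trees, X, y, cv=5).mean())
```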

Q.8. Which method among bagging and stacking should be chosen in the case of limited training data, and what is the appropriate reason for your preference?

  • Bagging, because we can combine as many classifiers as we want by training each on a different sample of the training data
  • Bagging, because we use the same classification algorithms on all samples of the training data
  • Stacking, because we can use different classification algorithms on the training data
  • Stacking, because each classifier is trained on all of the available data
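
For reference, stacking fits each first-level classifier on (essentially) all of the training data and then trains a second-level model on their out-of-fold predictions, whereas bagging trains each member on a bootstrap sample. A minimal stacking sketch with scikit-learn; the synthetic data and the particular choice of base learners are illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("svm", SVC(probability=True)),
        ("tree", DecisionTreeClassifier()),
    ],
    final_estimator=LogisticRegression(),  # second-layer model learns how to combine the base learners
    cv=5,                                  # base-learner predictions for the combiner come from CV folds
)

print("stacked accuracy:", cross_val_score(stack, X, y, cv=5).mean())
```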

Q.9. Which of the following statements are false when comparing Committee Machines and Stacking?

  • Committee Machines are, in general, special cases of 2-layer stacking where the second-layer classifier provides uniform weightage.
  • Both Committee Machines and Stacking have similar mechanisms, but Stacking uses different classifiers while Committee Machines use similar classifiers.
  • Committee Machines are more powerful than Stacking
  • Committee Machines are less powerful than Stacking
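
The relationship between the two can be seen directly in code: a committee machine combines its members with a fixed, equally weighted rule (e.g. averaging or majority voting), while stacking replaces that fixed rule with a trained second-layer model. A hedged sketch contrasting scikit-learn's VotingClassifier (committee-style, uniform weights) with a StackingClassifier on the same synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
members = [("lr", LogisticRegression(max_iter=1000)),
           ("nb", GaussianNB()),
           ("tree", DecisionTreeClassifier())]

committee = VotingClassifier(estimators=members, voting="soft")            # members weighted uniformly
stacked   = StackingClassifier(estimators=members,
                               final_estimator=LogisticRegression())       # learned combination of members

for name, model in [("committee (voting)", committee), ("stacking", stacked)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```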

Disclaimer: These answers are provided only for discussion purposes; if any answer turns out to be wrong, please do not blame us. If you have any doubts or suggestions regarding any question, kindly leave a comment. The solutions are provided by Brokenprogrammers. This tutorial is only for discussion and learning purposes.

About NPTEL Introduction to Machine Learning Course:

With the increased availability of data from varied sources, there has been increasing attention paid to various data-driven disciplines such as analytics and machine learning. In this course, we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms.

Course Layout:
  • Week 0: Probability Theory, Linear Algebra, Convex Optimization – (Recap)
  • Week 1: Introduction: Statistical Decision Theory – Regression, Classification, Bias Variance
  • Week 2: Linear Regression, Multivariate Regression, Subset Selection, Shrinkage Methods, Principal Component Regression, Partial Least squares
  • Week 3: Linear Classification, Logistic Regression, Linear Discriminant Analysis
  • Week 4: Perceptron, Support Vector Machines
  • Week 5: Neural Networks – Introduction, Early Models, Perceptron Learning, Backpropagation, Initialization, Training & Validation, Parameter Estimation – MLE, MAP, Bayesian Estimation
  • Week 6: Decision Trees, Regression Trees, Stopping Criterion & Pruning, Loss Functions, Categorical Attributes, Multiway Splits, Missing Values, Decision Trees – Instability, Evaluation Measures
  • Week 7: Bootstrapping & Cross Validation, Class Evaluation Measures, ROC curve, MDL, Ensemble Methods – Bagging, Committee Machines and Stacking, Boosting
  • Week 8: Gradient Boosting, Random Forests, Multi-class Classification, Naive Bayes, Bayesian Networks
  • Week 9: Undirected Graphical Models, HMM, Variable Elimination, Belief Propagation
  • Week 10: Partitional Clustering, Hierarchical Clustering, Birch Algorithm, CURE Algorithm, Density-based Clustering
  • Week 11: Gaussian Mixture Models, Expectation Maximization
  • Week 12: Learning Theory, Introduction to Reinforcement Learning, Optional videos (RL framework, TD learning, Solution Methods, Applications)
CRITERIA TO GET A CERTIFICATE:

Average assignment score = 25% of average of best 8 assignments out of the total 12 assignments given in the course.
Exam score = 75% of the proctored certification exam score out of 100

Final score = Average assignment score + Exam score

YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF AVERAGE ASSIGNMENT SCORE >=10/25 AND EXAM SCORE >= 30/75. If one of the 2 criteria is not met, you will not get the certificate even if the Final score >= 40/100.

If you have not registered for the exam, kindly register through https://examform.nptel.ac.in/
