NPTEL Introduction to Machine Learning Assignment 4 Answers 2023

Hello Learners! In this post, you will find the NPTEL Introduction to Machine Learning Assignment 4 (Week 4) Answers 2023. All the answers are provided below as a reference to help students; please try not to jump straight to the solutions.

NPTEL Introduction to Machine Learning Assignment 5 Answers👇

CLICK HERE

Note: First try to solve the questions by yourself. If you find any difficulty, then look for the solutions.

COURSE NAME – ANSWER
NPTEL Introduction to Machine Learning Assignment 1 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 2 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 3 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 4 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 5 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 6 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 7 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 8 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 9 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 10 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 11 Answers – Click Here
NPTEL Introduction to Machine Learning Assignment 12 Answers – Click Here

NPTEL Introduction to Machine Learning Assignment 4 Answers 2023:

We are updating the answers soon. Join the group for updates: CLICK HERE

Q.1. Consider a Boolean function in three variables that returns True if two or more of the three variables are True, and False otherwise. Can this function be implemented using the perceptron algorithm?

  • no
  • yes
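
A quick way to see the answer: the majority-of-three function is linearly separable, so a perceptron can implement it. The minimal sketch below (True/False encoded as 1/0; the weights and threshold are just one illustrative choice) verifies this exhaustively.

```python
# Check that the 3-variable "majority" function is linearly separable, which is
# why a single perceptron can represent it. Weights (1, 1, 1) and threshold 1.5
# are one illustrative choice; inputs are encoded as 0/1.
from itertools import product

w = (1, 1, 1)   # illustrative weights
theta = 1.5     # illustrative threshold

for x in product([0, 1], repeat=3):
    majority = sum(x) >= 2                                     # target Boolean function
    perceptron = sum(wi * xi for wi, xi in zip(w, x)) > theta  # perceptron output
    assert majority == perceptron

print("A perceptron with w = (1, 1, 1) and threshold 1.5 implements the majority function.")
```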

Q.2. For a support vector machine model, let xᵢ be an input instance with label yᵢ. If yᵢ(β̂₀ + xᵢᵀβ̂) > 1, where β̂₀ and β̂ are the estimated parameters of the model, then

  • xᵢ is not a support vector
  • xᵢ is a support vector
  • xᵢ is either an outlier or a support vector
  • Depending upon other data points, xᵢ may or may not be a support vector.
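
To build intuition for the margin condition, here is a sketch on synthetic toy data (not the assignment dataset): points with yᵢ(β̂₀ + xᵢᵀβ̂) > 1 lie strictly outside the margin, so they do not appear among the fitted SVM's support vectors.

```python
# Toy illustration (not the assignment data): points with y_i * (b0 + x_i^T beta) > 1
# lie strictly outside the margin, so they are not support vectors of the fitted SVM.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

clf = SVC(kernel="linear", C=1e3).fit(X, y)        # nearly hard-margin fit
margin = y * clf.decision_function(X)              # y_i * (b0 + x_i^T beta)

outside = set(np.where(margin > 1 + 1e-3)[0])      # strictly outside the margin
print("Support vector indices:", clf.support_.tolist())
print("Overlap with points outside the margin:", sorted(outside & set(clf.support_)))  # expect []
```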

Q.3. Suppose we use a linear kernel SVM to build a classifier for a 2-class problem where the training data points are linearly separable. In general, will the classifier trained in this manner always be the same as the classifier trained using the perceptron training algorithm on the same training data?

  • Yes
  • No
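
To see why the two need not coincide, here is a sketch on toy separable data (not the assignment dataset): the perceptron stops as soon as it finds any separating hyperplane (and the result depends on initialization and update order), while the linear-kernel SVM returns the unique maximum-margin separator, so the two boundaries usually differ.

```python
# Toy illustration: the perceptron stops at *some* separating hyperplane, while the
# linear-kernel SVM finds the unique maximum-margin one, so the two usually differ.
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

perc = Perceptron(random_state=0).fit(X, y)
svm = SVC(kernel="linear", C=1e3).fit(X, y)

# Compare the (normalized) directions of the two separating hyperplanes.
print("Perceptron direction:", perc.coef_[0] / np.linalg.norm(perc.coef_))
print("SVM direction:       ", svm.coef_[0] / np.linalg.norm(svm.coef_))
```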

For Q4,5: Kindly download the synthetic dataset from the following link

Click here to view the dataset

The dataset contains 1000 points and each input point contains 3 features.

Q.4. Train a linear regression model (without regularization) on the above dataset. Report the coefficients of the best-fit model in the following format:

β0, β1, β2, β3. (You can round off the coefficient values to two decimal places.)

  • -1.2, 2.1, 2.2,
  • 1, 1.2, 2.1, 2.2
  • -1, 1.2, 2.1, 2.2
  • 1, -1.2, 2.1, 2.2
  • 1, 1.2, -2.1, -2.2
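
A minimal sklearn sketch for Q4, assuming the downloaded synthetic dataset is a CSV whose first three columns are the features and whose last column is the target (the file name and column layout here are assumptions; adjust them to the actual file).

```python
# Sketch for Q4. Assumptions: a CSV named "synthetic_dataset.csv" whose first
# three columns are features and whose fourth column is the regression target.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("synthetic_dataset.csv")          # hypothetical file name
X, y = df.iloc[:, :3], df.iloc[:, 3]               # 3 features, target in the last column

model = LinearRegression().fit(X, y)
coefs = [round(model.intercept_, 2)] + [round(c, 2) for c in model.coef_]
print("beta0, beta1, beta2, beta3 =", coefs)       # compare with the options above
```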

Q.5. Train an l2 regularized linear regression model on the above dataset. Vary the regularization parameter from 1 to 10. As you increase the regularization parameter, the absolute value of the coefficients (excluding the intercept) of the model:

  • increase
  • first increase then decrease
  • decrease
  • first decrease then increase
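
For Q5, the same data can be refit with an L2 penalty while sweeping the regularization parameter (called alpha in sklearn's Ridge) from 1 to 10, watching the size of the non-intercept coefficients; they typically shrink as the penalty grows. The same file-layout assumptions as in the Q4 sketch apply.

```python
# Sketch for Q5 (same file-layout assumptions as the Q4 sketch): sweep the L2
# regularization strength and track the magnitude of the non-intercept coefficients.
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge

df = pd.read_csv("synthetic_dataset.csv")          # hypothetical file name
X, y = df.iloc[:, :3], df.iloc[:, 3]

for alpha in range(1, 11):                         # regularization parameter 1..10
    ridge = Ridge(alpha=alpha).fit(X, y)
    print(alpha, np.round(np.abs(ridge.coef_).sum(), 4))   # typically decreases with alpha
```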

For Q6,7: Kindly download the modified version of the Iris dataset from this link.

Available at: (Click here to view the Iris dataset)

The dataset contains 150 points, and each input point contains 4 features and belongs to one of three classes. Use the first 100 points as the training data and the remaining 50 as test data. In the following questions, report accuracy on the test dataset. You can round off the accuracy value to two decimal places. (Note: Do not change the order of the data points.)

Q.6. Train an l2 regularized logistic regression classifier on the modified Iris dataset. We recommend using sklearn. Use only the first two features for your model. We encourage you to explore the impact of varying different hyperparameters of the model. Kindly note that the C parameter mentioned below is the inverse of the regularization parameter λ. As part of the assignment, train a model with the following hyperparameters:
Model: logistic regression with a one-vs-rest classifier, C = 1e4
For the above set of hyperparameters, report the best classification accuracy.

  • 0.88
  • 0.86
  • 0.98
  • 0.68
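
A minimal sklearn sketch for Q6, assuming the modified Iris file is a CSV with the 4 features in the first four columns and the class label in the fifth (file name and layout are assumptions). One-vs-rest is obtained here by wrapping the classifier in OneVsRestClassifier; keeping the original row order, the first 100 rows are used for training and the last 50 for testing.

```python
# Sketch for Q6. Assumptions: CSV named "modified_iris.csv" with 4 feature columns
# followed by the class label. First 100 rows = train, last 50 = test (original order).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.multiclass import OneVsRestClassifier

df = pd.read_csv("modified_iris.csv")              # hypothetical file name
X, y = df.iloc[:, :2], df.iloc[:, 4]               # only the first two features

X_train, y_train = X.iloc[:100], y.iloc[:100]      # do not shuffle the data
X_test, y_test = X.iloc[100:], y.iloc[100:]

# C is the inverse of the regularization strength, so C = 1e4 means a weak L2 penalty.
clf = OneVsRestClassifier(LogisticRegression(C=1e4, max_iter=1000)).fit(X_train, y_train)
print("Test accuracy:", round(accuracy_score(y_test, clf.predict(X_test)), 2))
```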

Q.7. Train an SVM classifier on the modified Iris dataset. We recommend using sklearn. Use only the first two features for your model. We encourage you to explore the impact of varying different hyperparameters of the model; specifically, try different kernels and the associated hyperparameters. As part of the assignment, train models with the following set of hyperparameters:

RBF kernel, gamma = 0.5, one-vs-rest classifier, no feature normalization. Try C = 0.01, 1, 10. For the above set of hyperparameters, report the best classification accuracy along with the total number of support vectors on the test data.

  • 0.92, 69
  • 0.88, 40
  • 0.88, 69
  • 0.98, 41
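
A minimal sklearn sketch for Q7, with the same file and split assumptions as the Q6 sketch. Note that sklearn's SVC handles multi-class problems one-vs-one internally; decision_function_shape="ovr" only reshapes the decision function, so treat this as an approximation of the one-vs-rest setting described above.

```python
# Sketch for Q7 (same file/split assumptions as the Q6 sketch): RBF kernel,
# gamma = 0.5, no feature normalization, C in {0.01, 1, 10}. Report the test
# accuracy and the total number of support vectors for each setting.
import pandas as pd
from sklearn.metrics import accuracy_score
from sklearn.svm import SVC

df = pd.read_csv("modified_iris.csv")              # hypothetical file name
X, y = df.iloc[:, :2], df.iloc[:, 4]
X_train, y_train = X.iloc[:100], y.iloc[:100]
X_test, y_test = X.iloc[100:], y.iloc[100:]

for C in (0.01, 1, 10):
    clf = SVC(kernel="rbf", gamma=0.5, C=C, decision_function_shape="ovr").fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"C={C}: test accuracy = {acc:.2f}, support vectors = {clf.n_support_.sum()}")
```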

Disclaimer: These answers are provided only for discussion purposes; if any answer turns out to be wrong, please don't blame us. If you have any doubts or suggestions regarding any question, kindly comment. The solutions are provided by Brokenprogrammers. This tutorial is only for discussion and learning purposes.

About NPTEL Introduction to Machine Learning Course:

With the increased availability of data from varied sources, there has been increasing attention paid to various data-driven disciplines such as analytics and machine learning. In this course, we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms.

Course Layout:
  • Week 0: Probability Theory, Linear Algebra, Convex Optimization – (Recap)
  • Week 1: Introduction: Statistical Decision Theory – Regression, Classification, Bias Variance
  • Week 2: Linear Regression, Multivariate Regression, Subset Selection, Shrinkage Methods, Principal Component Regression, Partial Least squares
  • Week 3: Linear Classification, Logistic Regression, Linear Discriminant Analysis
  • Week 4: Perceptron, Support Vector Machines
  • Week 5: Neural Networks – Introduction, Early Models, Perceptron Learning, Backpropagation, Initialization, Training & Validation, Parameter Estimation – MLE, MAP, Bayesian Estimation
  • Week 6: Decision Trees, Regression Trees, Stopping Criterion & Pruning, Loss Functions, Categorical Attributes, Multiway Splits, Missing Values, Decision Trees – Instability, Evaluation Measures
  • Week 7: Bootstrapping & Cross Validation, Class Evaluation Measures, ROC curve, MDL, Ensemble Methods – Bagging, Committee Machines and Stacking, Boosting
  • Week 8: Gradient Boosting, Random Forests, Multi-class Classification, Naive Bayes, Bayesian Networks
  • Week 9: Undirected Graphical Models, HMM, Variable Elimination, Belief Propagation
  • Week 10: Partitional Clustering, Hierarchical Clustering, Birch Algorithm, CURE Algorithm, Density-based Clustering
  • Week 11: Gaussian Mixture Models, Expectation Maximization
  • Week 12: Learning Theory, Introduction to Reinforcement Learning, Optional videos (RL framework, TD learning, Solution Methods, Applications)
CRITERIA TO GET A CERTIFICATE:

Average assignment score = 25% of average of best 8 assignments out of the total 12 assignments given in the course.
Exam score = 75% of the proctored certification exam score out of 100

Final score = Average assignment score + Exam score

YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF AVERAGE ASSIGNMENT SCORE >=10/25 AND EXAM SCORE >= 30/75. If one of the 2 criteria is not met, you will not get the certificate even if the Final score >= 40/100.
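
For example, if the average of your best 8 assignment scores is 80/100 and your proctored exam score is 60/100, then the assignment component is 25% of 80 = 20/25 and the exam component is 75% of 60 = 45/75, giving a final score of 20 + 45 = 65/100; both eligibility conditions (20 >= 10/25 and 45 >= 30/75) are also satisfied.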

If you have not registered for the exam, kindly register through https://examform.nptel.ac.in/
