The NPTEL Introduction to Machine Learning course for the July-October 2024 session covers critical topics in Week 3, including linear classification, logistic regression, and Linear Discriminant Analysis (LDA). This week's assignment tests understanding of these concepts through questions designed to challenge and reinforce learning.
Question 1
For a two-class problem using discriminant functions $g_1(x)$ and $g_2(x)$ (where $g_i(x)$ is the discriminant function for class $i$), where is the separating hyperplane located?
Given:
- The separating hyperplane is where $g_1(x) = g_2(x)$.
Since $g_1(x) = g_2(x)$, we get $g_1(x) - g_2(x) = 0$.
Therefore, the separating hyperplane is defined by:
$$g_1(x) - g_2(x) = 0$$
Answer: $g_1(x) = g_2(x)$
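A minimal sketch of this decision rule in Python, assuming linear discriminants $g_i(x) = w_i \cdot x + b_i$ (the weights and biases below are hypothetical):

```python
import numpy as np

# Hypothetical linear discriminants g_i(x) = w_i . x + b_i
w1, b1 = np.array([1.0, 2.0]), -1.0
w2, b2 = np.array([2.0, -1.0]), 0.5

def g(x, w, b):
    """Linear discriminant score for one class."""
    return w @ x + b

x = np.array([0.3, 0.4])
# Points with g1(x) - g2(x) == 0 lie exactly on the separating hyperplane;
# the sign of the difference decides the class everywhere else.
diff = g(x, w1, b1) - g(x, w2, b2)
print("class 1" if diff > 0 else "class 2", f"(g1 - g2 = {diff:.2f})")
```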
Question 2
Given the following dataset consisting of two classes, $A$ and $B$, calculate the prior probability of each class.
| Feature 1 | Class |
|---|---|
| 2.3 | A |
| 1.8 | A |
| 3.2 | A |
| 1.2 | A |
| 2.1 | A |
| 1.9 | B |
| 2.4 | B |
Calculate $P(A)$ and $P(B)$:
- Number of samples for class A: $N_A = 5$
- Number of samples for class B: $N_B = 2$
- Total number of samples: $N = 7$
Prior probabilities:
$$P(A) = \frac{N_A}{N} = \frac{5}{7} \approx 0.714, \qquad P(B) = \frac{N_B}{N} = \frac{2}{7} \approx 0.286$$
Answer: $P(A) = \frac{5}{7} \approx 0.714$, $P(B) = \frac{2}{7} \approx 0.286$
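These priors are just class frequencies, as the short sketch below shows (plain Python, using the labels from the table above):

```python
from collections import Counter

# Class labels taken from the table above.
labels = ["A", "A", "A", "A", "A", "B", "B"]

counts = Counter(labels)
n = len(labels)
priors = {c: counts[c] / n for c in sorted(counts)}
print(priors)  # {'A': 0.7142857142857143, 'B': 0.2857142857142857}
```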
Question 3
In a 3-class classification problem using linear regression, the output vectors for three data points are $y_1$, $y_2$, and $y_3$. To which classes would these points be assigned?
Assignment is based on the highest output value for each data point (see the sketch after this list):
- Data point $y_1$ -> Class 1 (0.8 is the highest component)
- Data point $y_2$ -> Class 2 (0.6 is the highest component)
- Data point $y_3$ -> Class 2 (0.4 is the highest; the tie between Class 2 and Class 3 is broken in favor of the lower-numbered class)
Answer:
- $y_1$ -> Class 1
- $y_2$ -> Class 2
- $y_3$ -> Class 2
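The sketch below reproduces this argmax rule with NumPy. Only the winning values 0.8, 0.6, and 0.4 come from the question; the remaining components are hypothetical fillers consistent with the reasoning:

```python
import numpy as np

# Output vectors: the winning entries (0.8, 0.6, 0.4) are from the
# question, the other components are hypothetical.
outputs = np.array([
    [0.8, 0.1, 0.1],  # y1
    [0.2, 0.6, 0.2],  # y2
    [0.3, 0.4, 0.4],  # y3: tie between classes 2 and 3
])

# argmax returns the first maximal index, so the tie for y3 resolves
# to the lower-numbered class (Class 2).
classes = outputs.argmax(axis=1) + 1  # 1-based class labels
print(classes)  # [1 2 2]
```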
Question 4
If you have a 5-class classification problem and want to avoid masking using polynomial regression, what is the minimum degree of the polynomial you should use?
For a $K$-class problem, to avoid masking, we need to use a polynomial of degree $K - 1$.
For 5 classes, the minimum degree of the polynomial is $5 - 1 = 4$.
Answer: 4
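To see why degree $K - 1$ matters, here is a small NumPy sketch of masking with $K = 3$ classes in one dimension (the data layout is hypothetical; the same argument gives degree 4 for 5 classes):

```python
import numpy as np

np.random.seed(0)
# Hypothetical 1-D data: three classes laid out left / middle / right.
x = np.concatenate([np.random.normal(-4, 1, 50),
                    np.random.normal(0, 1, 50),
                    np.random.normal(4, 1, 50)])
y = np.repeat([0, 1, 2], 50)
Y = np.eye(3)[y]  # one-hot indicator targets, one column per class

def fit_predict(degree):
    # Fit one polynomial per class indicator, then assign each point
    # to the class whose polynomial gives the largest output.
    coefs = [np.polyfit(x, Y[:, k], degree) for k in range(3)]
    scores = np.stack([np.polyval(c, x) for c in coefs], axis=1)
    return scores.argmax(axis=1)

for degree in (1, 2):
    acc = (fit_predict(degree) == y).mean()
    print(f"degree {degree}: training accuracy = {acc:.2f}")
# With degree 1 the middle class is masked (its nearly flat indicator
# fit never wins the argmax); degree 2 = K - 1 recovers it.
```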
Question 5
Consider a logistic regression model where the predicted probability for a given data point is 0.4. If the actual label for this data point is 1, what is the contribution of this data point to the log-likelihood?
The log-likelihood contribution for logistic regression is given by:
$$\ell = y \log(p) + (1 - y)\log(1 - p)$$
where $y$ is the actual label and $p$ is the predicted probability.
Given: $y = 1$, $p = 0.4$
Contribution to log-likelihood:
$$1 \cdot \log(0.4) + (1 - 1)\log(1 - 0.4) = \log(0.4) \approx -0.916$$
Answer: $\log(0.4) \approx -0.916$
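A one-line check of this arithmetic in Python:

```python
import math

def log_likelihood_contribution(y, p):
    """Per-example log-likelihood: y*log(p) + (1 - y)*log(1 - p)."""
    return y * math.log(p) + (1 - y) * math.log(1 - p)

print(log_likelihood_contribution(y=1, p=0.4))  # log(0.4) ≈ -0.916
```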
Question 6
What additional assumption does LDA make about the covariance matrix, compared to the basic assumption of Gaussian class-conditional densities?
Linear Discriminant Analysis (LDA) assumes that the covariance matrix is the same for all classes.
Answer: The covariance matrix is the same for all classes.
Question 7
What is the shape of the decision boundary in LDA?
In LDA, the decision boundary is linear: because all classes share the same covariance matrix, the quadratic terms cancel when comparing class discriminants, leaving a linear function of $x$.
Answer: Linear
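Both answers can be checked with scikit-learn's LDA, which fits a single pooled covariance matrix and therefore exposes one weight vector and intercept per boundary; a minimal sketch on hypothetical toy data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical two-class data; both classes share the same covariance
# by construction, matching the LDA assumption.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1.0, (50, 2)),
               rng.normal([3, 3], 1.0, (50, 2))])
y = np.repeat([0, 1], 50)

lda = LinearDiscriminantAnalysis().fit(X, y)
# A single weight vector w and intercept b: the decision boundary
# w . x + b = 0 is a straight line in 2-D.
print(lda.coef_, lda.intercept_)
```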
Question 8
For two classes $C_1$ and $C_2$ with within-class variances $s_1^2$ and $s_2^2$ respectively, if the projected means are $\mu_1$ and $\mu_2$, what is the Fisher criterion $J(w)$?
The Fisher criterion is given by:
$$J(w) = \frac{(\mu_1 - \mu_2)^2}{s_1^2 + s_2^2}$$
Substituting the given projected means and within-class variances into this ratio yields $J(w) = 0.8$.
Answer: 0.8
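A quick numerical check, using hypothetical values chosen to reproduce the stated answer (e.g. $\mu_1 = 2$, $\mu_2 = 4$, $s_1^2 = 2$, $s_2^2 = 3$):

```python
def fisher_criterion(m1, m2, s1_sq, s2_sq):
    """Fisher criterion J(w) = (m1 - m2)^2 / (s1^2 + s2^2)."""
    return (m1 - m2) ** 2 / (s1_sq + s2_sq)

# Hypothetical values consistent with the stated answer of 0.8.
print(fisher_criterion(m1=2.0, m2=4.0, s1_sq=2.0, s2_sq=3.0))  # 0.8
```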
Question 9
Given two classes $C_1$ and $C_2$ with means $\mu_1$ and $\mu_2$ respectively, what is the direction vector $w$ for LDA when the within-class covariance matrix is the identity matrix $S_W = I$?
For LDA, the direction vector is given by:
$$w \propto S_W^{-1}(\mu_1 - \mu_2)$$
Calculate $\mu_1 - \mu_2$ from the given means. Since $S_W = I$, we have $S_W^{-1} = I$, so the direction vector is:
$$w \propto \mu_1 - \mu_2$$
Answer: $w = \mu_1 - \mu_2$
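A short NumPy sketch of this computation, with hypothetical class means:

```python
import numpy as np

# Hypothetical class means; S_W is the identity, as in the question.
mu1 = np.array([2.0, 3.0])
mu2 = np.array([1.0, 1.0])
S_W = np.eye(2)

# w is proportional to S_W^{-1} (mu1 - mu2); with S_W = I this is
# just the difference of the class means.
w = np.linalg.solve(S_W, mu1 - mu2)
print(w)  # [1. 2.]
```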