Question
Suppose we are doing ordinary least-squares linear regression with a fictitious dimension. Which of the
following changes can never make the cost function’s value on the training data smaller?
A: Discard the fictitious dimension (i.e., don’t append a 1 to every sample point).
B: Append quadratic features to each sample point.
C: Project the sample points onto a lower-dimensional subspace with PCA (without changing the labels) and
perform regression on the projected points.
D: Center the design matrix (so each feature has mean zero).
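Before looking at the solution, the effect of each proposed change on the training cost (residual sum of squares) can be checked numerically. The sketch below, using NumPy on synthetic data, fits OLS with a fictitious (bias) column and then recomputes the training RSS under each of the four modifications; the data dimensions and noise level are arbitrary choices for illustration, not part of the original question.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 3
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 1.5 + 0.1 * rng.normal(size=n)

def rss(A, y):
    """Training residual sum of squares for least-squares fit on design matrix A."""
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ w
    return r @ r

ones = np.ones((n, 1))
base = rss(np.hstack([X, ones]), y)        # original: features + fictitious dimension

# A: discard the fictitious dimension (no column of 1s)
no_bias = rss(X, y)

# B: append quadratic (squared) features, bias kept
quad = rss(np.hstack([X, X**2, ones]), y)

# C: project the points onto the top-2 principal components, then regress
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Xc @ Vt[:2].T
pca = rss(np.hstack([P, ones]), y)

# D: center each feature (bias kept); this does not change the column space,
# since subtracting a mean is subtracting a multiple of the 1s column
centered = rss(np.hstack([Xc, ones]), y)

tol = 1e-9
print(no_bias >= base - tol)           # A removes a column: RSS cannot drop
print(quad <= base + tol)              # B adds columns: RSS cannot rise
print(pca >= base - tol)               # C shrinks the column space: RSS cannot drop
print(np.isclose(centered, base))      # D leaves RSS unchanged
```

On any run, A, C, and D never produce a smaller training RSS than the base fit, while B typically does: removing columns or projecting onto a subspace can only shrink the set of fits the model can represent, whereas appending features can only enlarge it.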
Expert Solution
Step 1: Introduction to ordinary least-squares linear regression with a fictitious dimension.
Step 2: A: Discard the fictitious dimension (i.e., don’t append a 1 to every sample point).
Step 3: B: Append quadratic features to each sample point.
Step 4: C: Project the sample points onto a lower-dimensional subspace with PCA (without changing the labels).
Step 5: D: Center the design matrix (so each feature has mean zero).