PS#6
School: California Lutheran University
Course: IDS575
Subject: Mathematics
Date: Apr 3, 2024
Pages: 9
Q1 Boundary and margin
25 Points
Q1.1
5 Points
Given a dataset with binary labels in the following figure, which line is more likely to be the decision boundary of an SVM? (Hint: Whether you use a hard margin or a soft margin does not matter.)

A
B
C
None
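For intuition (the original figure is not reproduced in this copy), here is a minimal sketch of fitting a linear SVM on a made-up 2-D dataset and reading off the learned boundary; the data and every value below are hypothetical, for illustration only:

```python
# Hypothetical 2-D toy data; the SVM picks the line that maximizes the
# gap (margin) between the two classes, midway between the closest points.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 1], [2, 1], [1, 2],   # negative class
              [4, 4], [5, 4], [4, 5]])  # positive class
y = np.array([-1, -1, -1, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6)  # large C approximates a hard margin
clf.fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
print(f"boundary: {w[0]:.2f}*x1 + {w[1]:.2f}*x2 + {b:.2f} = 0")
```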
Q1.2
7 Points
You have four positive examples and five negative examples. Among the three examples {A, B, C} (red-circled), choose EVERY example that would change the decision boundary if it were removed and the SVM (re)trained. A quick computational way to test this is sketched below.
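Since the red-circled figure is not reproduced here, the following hedged sketch (hypothetical data, not the figure's points) shows the underlying principle: refit after deleting each point and compare coefficients. Removing a non-support vector never moves a hard-margin boundary; removing a support vector may.

```python
# Hypothetical data: drop each point in turn, retrain, and check whether
# the boundary (w, b) changes. Only support vectors can matter.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 1], [2, 2], [0, 2], [4, 4], [5, 3], [5, 5]])
y = np.array([-1, -1, -1, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C ~ hard margin
print("support vector indices:", clf.support_)

for i in range(len(X)):
    mask = np.arange(len(X)) != i
    clf_i = SVC(kernel="linear", C=1e6).fit(X[mask], y[mask])
    moved = not (np.allclose(clf_i.coef_, clf.coef_)
                 and np.allclose(clf_i.intercept_, clf.intercept_))
    print(f"remove point {i}: boundary {'changes' if moved else 'unchanged'}")
```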
Q1.3
8 Points
The figure shows positive (blue circles) and negative (red circles) examples with a decision boundary (solid line) and its margin borders (dashed lines). Suppose an optimal decision boundary $(w^*, b^*)$ is learned by our standard SVM formulation. Choose the right option for the following four values:

N1 = $w^{*T}x + b^*$, where $x$ is a hypothetical point (not drawn) on the solid line.
N2 = $w^{*T}x + b^*$, where $x$ is one of the three blue dots on the upper dashed line.
N3 = $w^{*T}x + b^*$, where $x$ is one of the two red dots on the lower dashed line.
N4 = the actual distance between the two dashed lines.
N1 = 0, N2 = -1, N3 = 1, N4 = $1/\lVert w^* \rVert$
N1 = 0, N2 = 1, N3 = -1, N4 = $1/\lVert w^* \rVert$
N1 = 0, N2 = -1, N3 = 1, N4 = $2/\lVert w^* \rVert$
N1 = 0, N2 = 1, N3 = -1, N4 = $2/\lVert w^* \rVert$

Q1.4
5 Points
Assume you have a linearly separable dataset, learning the best parameters $w^*$ and $b^*$ by a hard-margin SVM. If we double them as $2w^*$ and $2b^*$, then:

the decision boundary hyperplane will change, the functional margin will change, and the geometric margin will change
the decision boundary hyperplane will not change, the functional margin will change, and the geometric margin will change
the decision boundary hyperplane will not change, the functional margin will not change, and the geometric margin will change
the decision boundary hyperplane will not change, the functional margin will change, and the geometric margin will not change
the decision boundary hyperplane will not change, the functional margin will not change, and the geometric margin will not change
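For reference, a short derivation under the canonical SVM scaling, where margin points satisfy $y\,(w^{*T}x + b^*) = 1$ (assuming, since the figure is not reproduced here, that the blue dots are the positive class):

```latex
% Signed values on the three lines under the canonical scaling:
w^{*T}x + b^* = 0   \quad \text{on the solid boundary, so } N_1 = 0,
w^{*T}x + b^* = +1  \quad \text{on the positive-class dashed line},
w^{*T}x + b^* = -1  \quad \text{on the negative-class dashed line}.
% Distance between the two dashed lines:
N_4 = \frac{(+1) - (-1)}{\lVert w^* \rVert} = \frac{2}{\lVert w^* \rVert}.
% Q1.4: replacing (w^*, b^*) with (2w^*, 2b^*) leaves the zero set
% \{x : w^{*T}x + b^* = 0\} unchanged (same boundary); the functional
% margin y(2w^{*T}x + 2b^*) doubles; the geometric margin divides that
% by \lVert 2w^* \rVert, so the factor of 2 cancels and it is unchanged.
```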
Q2 Hard optimal-margin classifier
48 Points
Given the following toy dataset consisting of seven examples with just two features, you are supposed to learn a maximal margin classifier.
Q2.1
10 Points
Visualize all seven training examples and sketch the optimal separating hyperplane. Write down the equation for this hyperplane.
Q2.1.pdf
Q2.2
10 Points
Clearly state the rule of classification for predicting "No". (Hint: It should be of the form $b + w_1 x_1 + w_2 x_2 \le 0$.)

$0.5 - x_1 + x_2 \le 0$
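A minimal executable sketch of the stated rule (the function name and the two test points are hypothetical, chosen only to exercise both sides of the line):

```python
# Predict "No" exactly when 0.5 - x1 + x2 <= 0, per the rule above.
def predict(x1: float, x2: float) -> str:
    return "No" if 0.5 - x1 + x2 <= 0 else "Yes"

print(predict(4, 1))  # 0.5 - 4 + 1 = -2.5 <= 0  -> "No"
print(predict(1, 4))  # 0.5 - 1 + 4 =  3.5  > 0  -> "Yes"
```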
Q2.3
5 Points
Is the data linearly separable?
Q2.4
12 Points
The value of the geometric margin of the second point, $\gamma_g^{(2)}$. You can assume $y^{(2)} = 1$. Limit your computation to 3 decimal places.
0.354
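The arithmetic behind 0.354, assuming (the original data table is not reproduced here, but this is consistent with the option sets in Q2.5) that the second example is $x^{(2)} = (2, 2)$ and using the hyperplane $0.5 - x_1 + x_2 = 0$ from Q2.2, i.e., $w^* = (-1, 1)$, $b^* = 0.5$:

```latex
\gamma_g^{(2)} = \frac{y^{(2)}\,\bigl(w^{*T}x^{(2)} + b^*\bigr)}{\lVert w^* \rVert}
              = \frac{1 \cdot (0.5 - 2 + 2)}{\sqrt{(-1)^2 + 1^2}}
              = \frac{0.5}{\sqrt{2}} \approx 0.354
```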
Q2.5
6 Points
The support vectors, given as points $(x_1, x_2)$, are:

{(2, 2), (2, 1), (4, 3)}
{(2, 2), (4, 3)}
{(2, 2), (4, 4), (2, 1), (4, 3)}
{(2, 2), (2, 1)}
Q2.6
5 Points
Would a slight perturbation of the 6th example affect the optimal margin hyperplane?
Yes
No
Q3 Lagrange Duality and Optimization
27 Points
Q3.1
8 Points
Assume our loss function to minimize is $f(x, y) = (3x - 2y)^2$, and we have a constraint $x + 1 = y$. Formulate a Lagrangian with the Lagrange multiplier $\nu$. (Hint: As you have only one constraint, your $\nu$ must be a single-dimensional scalar.)

$\mathcal{L}(x, y, \nu) = (3x - 2y)^2 - \nu(x - y + 1)$
Q3.2
8 Points
Continuing from your answer in Q3.1, compute the optimal $x^*$ and $y^*$. (Hint: You can easily verify whether or not your answer is correct by finding the optimum of a univariate quadratic function.)

$(x^*, y^*) = (2, 3)$
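One way to verify $(x^*, y^*) = (2, 3)$: set the gradient of the Lagrangian from Q3.1 to zero (stationarity) and combine with the constraint:

```latex
\partial\mathcal{L}/\partial x = 6(3x - 2y) - \nu = 0
\partial\mathcal{L}/\partial y = -4(3x - 2y) + \nu = 0
% Adding the two equations: 2(3x - 2y) = 0, so 3x = 2y and \nu^* = 0.
% With the constraint y = x + 1:  3x = 2(x + 1)  gives  x^* = 2,\; y^* = 3.
% Univariate check (per the hint): substituting y = x + 1 into f gives
% (3x - 2x - 2)^2 = (x - 2)^2, a quadratic minimized at x = 2.
```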
Q3.3
5 Points
The function $f(x) = \max(1/2,\, x,\, x^2)$ is convex. (Hint: Think visually by drawing the individual graphs, taking the max, and checking its shape.)

True
False
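A standard fact that settles this question: a pointwise maximum of convex functions is convex, and each piece here is convex ($1/2$ is constant, $x$ is affine, $x^2$ is convex):

```latex
% For convex f_i and f(x) = \max_i f_i(x), any a, b and \theta \in [0, 1]:
f(\theta a + (1-\theta) b) = \max_i f_i(\theta a + (1-\theta) b)
  \le \max_i \bigl[\theta f_i(a) + (1-\theta) f_i(b)\bigr]
  \le \theta f(a) + (1-\theta) f(b)
```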
Q3.4
6 Points
Is the following optimization problem convex?

minimize $f_0(x)$ subject to $x^2 \le 0$, $x \le 100$, and $x^3 = 0$
Yes
No
Cannot be decided
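For reference, the standard-form test (the keyed answer is not shown in this copy): a problem is a convex program only if every inequality-constraint function is convex and every equality constraint is affine.

```latex
% x^2 \le 0 : convex inequality (its sublevel set is \{0\})
% x \le 100 : affine, hence a convex inequality
% x^3 = 0   : not affine, so the problem fails the standard definition of
%             a convex program, even though the feasible set \{0\} is convex.
```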
GRADED: Problem Set (PS) #06
STUDENT: Urvashiben Patel
TOTAL POINTS: 94 / 100 pts

QUESTION 1 (Boundary and margin): 25 / 25 pts
1.1 (no title): 5 / 5 pts
1.2 (no title): 7 / 7 pts
1.3 (no title): 8 / 8 pts
1.4 (no title): 5 / 5 pts

QUESTION 2 (Hard optimal-margin classifier): 48 / 48 pts
2.1 (no title): 10 / 10 pts
2.2 (no title): 10 / 10 pts
2.3 (no title): 5 / 5 pts
2.4 (no title): 12 / 12 pts
2.5 (no title): 6 / 6 pts
2.6 (no title): 5 / 5 pts

QUESTION 3 (Lagrange Duality and Optimization): 21 / 27 pts
3.1 (no title): 8 / 8 pts
3.2 (no title): 8 / 8 pts
3.3 (no title): 5 / 5 pts
3.4 (no title): 0 / 6 pts