
Question

Please help with parts (b) and (c).

3. Interpreting large datasets is a necessary yet difficult challenge in most data science disciplines. Hence, dimensionality-reduction techniques are often used in data pre-processing, where the result should preserve the most significant features of the original dataset with minimal information loss. In this question, we will illustrate a dimensionality-reduction technique to determine the "most popular" show on a streaming service based on three metrics: the average rating, the average number of recommendations, and the retention rate.
    Name                      Rating   Recommendations   Retention
 1  The Butcher of Blaviken    4.0          2.0             0.60
 2  Salary Swindle             4.2          2.1             0.59
 3  Unusual Objects            3.9          2.0             0.58
 4  Calamari Contest           4.3          2.1             0.62
 5  The Sicilian Defense       4.1          2.2             0.63
The covariance matrix Σ contains the variances of the variables along the main diagonal and the covariances between each pair of variables in the other matrix positions, as follows:

        ( 0.025     0.0075    0.00175 )
    Σ = ( 0.0075    0.0070    0.00135 )
        ( 0.00175   0.00135   0.00043 )
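Since Σ is stated without derivation, it can be checked numerically: the sample covariance of the five rows of the table reproduces it. A minimal sketch, assuming NumPy is available; the names X and Sigma are illustrative:

```python
import numpy as np

# Rows are the five shows; columns are (Rating, Recommendations, Retention),
# copied from the table above.
X = np.array([
    [4.0, 2.0, 0.60],
    [4.2, 2.1, 0.59],
    [3.9, 2.0, 0.58],
    [4.3, 2.1, 0.62],
    [4.1, 2.2, 0.63],
])

# Sample covariance (np.cov divides by n-1 by default); this matches
# the matrix quoted in the question.
Sigma = np.cov(X, rowvar=False)
print(Sigma.round(5))
```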
For this question, quote numerical answers to at least 3 decimal places or 3 significant figures, whichever is more precise.

(a) Given that v₁ = (12.241, 4.462, 1)ᵀ is an eigenvector of Σ, find its associated eigenvalue λ₁.

Answer for (a): 0.027875
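This value can be verified from the eigenvector equation Σv₁ = λ₁v₁: each component ratio of Σv₁ to v₁ estimates λ₁, and the Rayleigh quotient (v₁ᵀΣv₁)/(v₁ᵀv₁) gives a single combined estimate. A minimal sketch, assuming NumPy:

```python
import numpy as np

Sigma = np.array([
    [0.025,   0.0075,  0.00175],
    [0.0075,  0.0070,  0.00135],
    [0.00175, 0.00135, 0.00043],
])
v1 = np.array([12.241, 4.462, 1.0])

# Component-wise ratios of Sigma @ v1 to v1; small disagreements
# reflect v1 being quoted to only 3 decimal places.
print((Sigma @ v1) / v1)              # ~0.0279 in every component

# Rayleigh quotient: a single estimate of lambda1.
print((v1 @ Sigma @ v1) / (v1 @ v1))  # ~0.02788
```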
(b) The eigenvalues of Σ and their associated eigenvectors are summarised as follows:

    Eigenvalues            Eigenvectors
    λ₁: from part (a)      v₁ = (12.241, 4.462, 1)ᵀ
    λ₂ = 0.000159          v₂ = (-0.016, -0.180, 1)ᵀ
    λ₃ = 0.00439           v₃ = (-2.184, 5.767, 1)ᵀ

Let wᵢ = λᵢ / (λ₁ + λ₂ + λ₃) be the weight of each eigenvalue λᵢ. Determine each wᵢ and show that λ₁ has the largest weight.
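Each weight is simply the eigenvalue divided by the sum of all three (which equals the trace of Σ). A minimal sketch, assuming NumPy and the rounded eigenvalues quoted above:

```python
import numpy as np

lam = np.array([0.027875, 0.000159, 0.00439])

# w_i = lambda_i / (lambda_1 + lambda_2 + lambda_3)
w = lam / lam.sum()
print(w.round(3))  # ~ [0.860, 0.005, 0.135]: lambda_1 dominates
```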
(c) We perform dimension reduction by projecting each data point onto the eigenspace E_{λ₁}(Σ) of the eigenvalue λ₁ with the largest weight.

Write each data point as a three-dimensional vector xᵢ = (Rating, Recommendations, Retention)ᵀ, e.g. x₁ = (4.0, 2.0, 0.60)ᵀ. Compute π_{λ₁}(xᵢ), the orthogonal projection (with respect to the standard inner product) of each xᵢ onto the eigenspace of λ₁.
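Since the eigenspace of λ₁ is the line spanned by v₁, the projection reduces to π_{λ₁}(x) = ((x·v₁)/(v₁·v₁)) v₁. A minimal sketch, assuming NumPy; the helper names u, coords and proj are illustrative:

```python
import numpy as np

X = np.array([
    [4.0, 2.0, 0.60],   # 1  The Butcher of Blaviken
    [4.2, 2.1, 0.59],   # 2  Salary Swindle
    [3.9, 2.0, 0.58],   # 3  Unusual Objects
    [4.3, 2.1, 0.62],   # 4  Calamari Contest
    [4.1, 2.2, 0.63],   # 5  The Sicilian Defense
])
v1 = np.array([12.241, 4.462, 1.0])

# Unit vector spanning the eigenspace of lambda_1.
u = v1 / np.linalg.norm(v1)

# Scalar coordinate of each show along u, then the projected 3-D point.
coords = X @ u
proj = np.outer(coords, u)
print(proj.round(3))
```

Comparing the scalar coordinates along u then gives a one-dimensional ranking of the five shows, which is presumably how the "most popular" show is meant to be read off.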