• Implement and test:
  • Logistic regression (LR) with L1 regularization
    • LR is differentiable, but the L1 norm is not
    • Use proximal gradient descent; for the L1 norm, the proximal step is soft-thresholding (a minimal sketch is given below)
  • Use the TensorFlow library
  • Dataset – the same as in HW2: classify two digits from the MNIST dataset
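For reference, the proximal operator of the L1 penalty is elementwise soft-thresholding: prox_{t*||.||_1}(w) = sign(w) * max(|w| - t, 0). A minimal sketch of it in TensorFlow (assuming TF 2.x eager mode; the function name is just illustrative):

    import tensorflow as tf

    def soft_threshold(w, threshold):
        # prox_{threshold * ||.||_1}(w) = sign(w) * max(|w| - threshold, 0)
        return tf.sign(w) * tf.maximum(tf.abs(w) - threshold, 0.0)

    # Example: entries with magnitude <= 0.5 are driven exactly to zero.
    w = tf.constant([-1.0, -0.3, 0.0, 0.2, 2.0])
    print(soft_threshold(w, 0.5).numpy())   # [-0.5  0.   0.   0.   1.5]

Those exact zeros are the point of the L1 penalty: the proximal step is what makes the learned weight vector sparse.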

• See: tensorflow_minimizeF.py (a sketch of its loop follows this list)
  • Performs projected gradient descent on a simple function
  • The function has its global minimum at w1 = -0.25, w2 = 2
  • But the feasible set Q is the set of all vectors with nonnegative coordinates, i.e., for 2D, w1 >= 0, w2 >= 0
  • For this function, the best feasible solution is w1 = 0, w2 = 2
  • The code does the following, in a loop:
    • A gradient step on the function, followed by a proximal step
    • Here, the proximal step is just "make w nonnegative": negative values are replaced with 0, the closest nonnegative value (i.e., projection onto Q)
  • In your actual code, you should use soft-thresholding instead (the soft_threshold sketch above)
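The file tensorflow_minimizeF.py itself is not reproduced here, so the following is only a sketch of the loop it is described as running, assuming TF 2.x; the quadratic f below is an assumed stand-in whose global minimum sits at w1 = -0.25, w2 = 2:

    import tensorflow as tf

    # Assumed stand-in for the "simple function": global minimum at (-0.25, 2).
    def f(w):
        return (w[0] + 0.25) ** 2 + (w[1] - 2.0) ** 2

    w = tf.Variable([1.0, -1.0])   # arbitrary starting point
    step_size = 0.1

    for _ in range(200):
        with tf.GradientTape() as tape:
            loss = f(w)
        grad = tape.gradient(loss, w)
        w.assign(w - step_size * grad)     # gradient step on the function
        w.assign(tf.maximum(w, 0.0))       # proximal step: project onto Q = {w : w >= 0}

    print(w.numpy())   # approaches [0., 2.], the best feasible point

For the assignment, the structure of this loop stays the same; the function becomes the logistic loss on MNIST data, and the projection is replaced by soft-thresholding.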

• See: tensorflow_leastSquares.py (sketched below)
  • Performs gradient descent on a function based on data
  • We have some fake data x, y, where y = w*x + b + small_gaussian_noise
  • The code tries to find the best w_best, b_best that predict y
  • It uses the squared loss (y - y_predicted)^2, where y_predicted = w_best*x + b_best
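That file is also not reproduced here; a rough sketch of the described behavior, assuming TF 2.x (the true w and b, the noise level, and the step size are arbitrary choices for illustration):

    import tensorflow as tf

    # Fake data: y = w*x + b + small Gaussian noise (true values chosen arbitrarily).
    true_w, true_b = 3.0, -1.0
    x = tf.random.normal([200])
    y = true_w * x + true_b + 0.1 * tf.random.normal([200])

    w_best = tf.Variable(0.0)
    b_best = tf.Variable(0.0)
    step_size = 0.05

    for _ in range(500):
        with tf.GradientTape() as tape:
            y_predicted = w_best * x + b_best
            loss = tf.reduce_mean((y - y_predicted) ** 2)   # squared loss
        dw, db = tape.gradient(loss, [w_best, b_best])
        w_best.assign_sub(step_size * dw)
        b_best.assign_sub(step_size * db)

    print(w_best.numpy(), b_best.numpy())   # should land near 3.0 and -1.0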

• In your code (a combined sketch follows this list):
  • x, y will be taken from the MNIST dataset
  • the loss should be the logistic loss
  • you need to add the proximal step / soft-thresholding
  • The constant L (which would set the step size 1/L) is unknown, so you should try several gradient step sizes
  • The constant in front of the L1 penalty is unknown, so you should try several values
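Putting the pieces together, here is one possible sketch of the required code, assuming TF 2.x and the Keras MNIST loader; the digit pair (3 vs. 8), the step size, and the L1 weight are placeholders, since your own # fixes the digits and the constants are meant to be swept over several values:

    import numpy as np
    import tensorflow as tf

    def soft_threshold(w, threshold):
        return tf.sign(w) * tf.maximum(tf.abs(w) - threshold, 0.0)

    # Two-digit subset of MNIST, flattened and scaled to [0, 1]; labels become 0/1.
    (x_tr, y_tr), _ = tf.keras.datasets.mnist.load_data()
    digit_a, digit_b = 3, 8                  # placeholders: use your assigned digits
    mask = (y_tr == digit_a) | (y_tr == digit_b)
    x = tf.constant(x_tr[mask].reshape(-1, 784).astype(np.float32) / 255.0)
    y = tf.constant((y_tr[mask] == digit_b).astype(np.float32))

    w = tf.Variable(tf.zeros([784]))
    b = tf.Variable(0.0)
    step_size = 0.01     # 1/L is unknown: try several values
    l1_weight = 0.001    # L1 penalty constant: try several values

    for _ in range(300):
        with tf.GradientTape() as tape:
            logits = tf.linalg.matvec(x, w) + b
            # Smooth part only (logistic loss); the L1 term is handled by the prox step.
            loss = tf.reduce_mean(
                tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits))
        dw, db = tape.gradient(loss, [w, b])
        w.assign(soft_threshold(w - step_size * dw, step_size * l1_weight))  # prox step
        b.assign_sub(step_size * db)         # bias left unpenalized

Sweeping step_size and l1_weight over a small grid (for example, powers of ten) covers the two "unknown constant" bullets; larger l1_weight values push more coordinates of w to exactly zero.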

A report in PDF:
  • Results of tests of the method on the MNIST dataset, for decreasing training set sizes (include your #, and state which two digits define your two-class problem).
  • Code in Python for solving the MNIST classification problem (for the full size of the training set).
