
15.084  Nonlinear Optimization

Spring 2016


Professor: Robert Michael Freund

Teaching Assistant: Haihao Lu

Lecture:  T, Th 11:00-12:30  (E25-111)
Recitation (alt. 1):  F 10:30-11:30  (34-303)
Recitation (alt. 2):  F 11:30-12:30  (34-303)
TA office hours:  W 3:00-5:00  (2-334)  

Description: 

The goal of this course is to provide a unified analytical and computational approach to nonlinear optimization problems. Topics include unconstrained and constrained optimization, linear and conic convex optimization, optimality conditions for unconstrained and constrained optimization, Lagrange and conic duality theory, interior-point algorithms and theory, general first-order methods for structured optimization, Newton’s method, semidefinite programming, and extensions. Algorithmic methods include steepest descent, Newton's method, subgradient descent, interior-point methods, conjugate gradient and quasi-Newton methods, and related computational schemes. Team projects allow students to apply the material to current research applications.
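As a rough illustrative sketch (not part of the course materials), two of the algorithmic methods named above, steepest descent and Newton's method, can be compared on a simple convex quadratic. The function, step size, and tolerances below are chosen for illustration only.

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a smooth function by repeatedly stepping opposite its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

def newton(grad, hess, x0, tol=1e-10, max_iter=50):
    """Minimize by exactly solving the local quadratic model at each iterate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)  # Newton step: solve H d = -g
    return x

# Example problem: f(x) = (1/2) x'Qx - b'x, minimized at x* = Q^{-1} b
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: Q @ x - b
hess = lambda x: Q

x_gd = steepest_descent(grad, np.zeros(2))
x_nt = newton(grad, hess, np.zeros(2))
x_star = np.linalg.solve(Q, b)
```

On a quadratic, Newton's method reaches the minimizer in a single step, while steepest descent converges linearly at a rate governed by the conditioning of Q.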

Announcements

upcoming talk at MIT by Yurii Nesterov

I invite all students (and faculty too) to attend the upcoming talk at MIT by Yurii Nesterov on May 19 at 3pm.  Details are as follows:
Date/time:  May 19, 3pm
Venue:  E51-149
Title:  Universal Newton Method
 

Abstract:  In this talk we present a second-order method for unconstrained minimization of convex functions. It can be applied to functions with Hölder continuous Hessians. Our main scheme is the Cubic Regularization of Newton Method, equipped with a special line-search procedure. We show that the global rate of convergence of this scheme depends continuously on the smoothness parameter. Thus, our method can be used even for minimizing functions with discontinuous Hessians. At the same time, the line-search procedure is very efficient: the average number of oracle calls per iteration is two. We show that for finding a point with small norm of the gradient, the Universal Newton Method must be equipped with a special termination criterion for the line-search.
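For readers unfamiliar with cubic regularization, here is a minimal one-dimensional sketch of the underlying step. It uses a fixed regularization parameter M rather than the adaptive line search the talk describes, and the test function and constants are assumptions for illustration only.

```python
import math

def cubic_newton_1d(df, ddf, x0, M, tol=1e-8, max_iter=200):
    """One-dimensional cubic-regularized Newton method.

    Each step exactly minimizes the model g*d + (1/2)*H*d^2 + (M/6)*|d|^3,
    where g = f'(x) and H = f''(x) >= 0 for a convex f.
    """
    x = x0
    for _ in range(max_iter):
        g, H = df(x), ddf(x)
        if abs(g) < tol:
            break
        # The optimal step length t >= 0 solves H*t + (M/2)*t^2 = |g|
        t = (-H + math.sqrt(H * H + 2.0 * M * abs(g))) / M
        x -= math.copysign(t, g)  # step opposite the gradient's sign
    return x

# Example: f(x) = x^4, whose Hessian 12 x^2 is Hölder continuous
f = lambda x: x**4
df = lambda x: 4.0 * x**3
ddf = lambda x: 12.0 * x**2
x_min = cubic_newton_1d(df, ddf, x0=2.0, M=10.0)
```

The cubic term shrinks the Newton step, so the iterates decrease monotonically toward the minimizer at 0 even though the Hessian vanishes there.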

 

Announced on 11 May 2016  12:39  p.m. by Robert Michael Freund

Project presentation on Thursday

Hi Class,

The following groups will give a presentation in Thursday's class.
11:05-11:20 Phil Chodrow, Steven Morse, Ilias Zadik
11:20-11:35 Neel Doshi
11:35-11:50 Tao Lei, Tom Yan
11:50-12:05 Hanzhang Qin, Li Wang, Zhe Liu
12:05-12:20 Eduardo Candela, David Hunter, Bradley Sturt

If your group is listed above, please send me your slides by 10am Thursday. Let me know if you have any questions or concerns.

Best,
Sean

Announced on 10 May 2016  10:31  p.m. by Haihao Lu

Project presentation on Tuesday

Hi Class,

The following groups will give a presentation in tomorrow's class.
11:05-11:20 Phil Chodrow, Steven Morse, Ilias Zadik
11:20-11:35 Kenji Kawaguchi
11:35-11:50 Remi Lam
11:50-12:05 Yuanchu Dang
12:05-12:20 Florian Feppon

If your group is listed above, please send me your slides by 10am Tuesday. Let me know if you have any questions or concerns.

Best,
Sean

Announced on 09 May 2016  11:43  a.m. by Haihao Lu

This Wednesday was the last office hour

Hi Class,

This Wednesday's office hour was the last one of the semester. Good luck with your projects!

Best,
Sean

Announced on 05 May 2016  10:01  p.m. by Haihao Lu

Course project and presentation

Hi Class,

The course project report is due at 11:59pm this Sunday. Email me your report as a PDF file named 'NLP_report_Your names_Project name'. No late submissions will be accepted unless you receive an extension from Prof. Freund.

Presentations will be held in class next week, and every group should be prepared to present. We will randomly choose five groups for Tuesday's presentations, announced before midday Monday, and another five groups for Thursday's presentations, announced before midday Wednesday. To keep the selection fair, each group is chosen with equal probability.

Best,
Sean

Announced on 05 May 2016  4:51  p.m. by Haihao Lu
