STAT 20: Introduction to Probability and Statistics

- Linear Models Review
- Concept Questions
- Optimization with Algorithms
- Problem Set 6.1: Method of Least Squares

Go to `pollev.com`

and get ready for a kahoot.

An engineer at Waymo is working to solve a problem: when it rains, reflections of other cars in puddles can disorient a self-driving car. Their team is building a model to determine whether the car is seeing a reflection of a car or a real car.

Please identify A. the response, B. the predictor(s), and C. whether this is regression or classification (the poll only asks about C).

An analyst at the UC Berkeley Admissions Office is helping the university decide how many students to send offer letters to. They have a target class size (one that fits within the budget and residence halls), but they're not sure how many students will accept the offer. How many should they admit?

Certain models (like least squares) can be fit analytically: take partial derivatives, set them to 0, and solve.
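As a sketch of that closed-form approach, the slope and intercept that solve the least squares equations can be computed directly in R (the data here is made up for illustration):

```r
# Hypothetical data for illustration
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)

# Closed-form solution from setting the partial derivatives of
# the sum of squared errors to zero and solving:
b1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
b0 <- mean(y) - b1 * mean(x)

c(intercept = b0, slope = b1)

# lm() arrives at the same coefficients:
coef(lm(y ~ x))
```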

There are many iterative algorithms that accomplish the same task, some better than others. Two examples:

- Gradient Descent: currently the most widely used algorithm; it is used to fit deep learning models.
- Nelder-Mead: an older and more general (and generally not as reliable!) algorithm.

The downhill simplex method now takes a series of steps, most steps just moving the point of the simplex where the function is largest (“highest point”) through the opposite face of the simplex to a lower point. These steps are called reflections, and they are constructed to conserve the volume of the simplex (and hence maintain its nondegeneracy). When it can do so, the method expands the simplex in one or another direction to take larger steps. When it reaches a “valley floor”, the method contracts itself in the transverse direction and tries to ooze down the valley. If there is a situation where the simplex is trying to “pass through the eye of a needle”, it contracts itself in all directions, pulling itself in around its lowest (best) point. (from Wikipedia)

Can we use Nelder-Mead to find the minimum value of this function (with zero calculus)?

\[ f(x) = \left(x + 0.5\right)^2 \]

- Functions are created with `function()` and assigned to an object (here, our new function is `f()`).
- The arguments go inside the parens of `function()`.
- The guts of the function go between `{}`.
- Once you run this code once, you'll have access to `f()` in your environment.
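The steps above can be sketched as a short R definition of the function from the slide:

```r
# f(x) = (x + 0.5)^2 -- the argument x goes inside the parens of
# function(), and the body goes between the curly braces.
f <- function(x) {
  (x + 0.5)^2
}

# Once this has been run, f() is available in the environment:
f(0)     # returns 0.25
f(-0.5)  # returns 0, the true minimum
```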

`optim()`

- The function to optimize is passed to `fn`.
- You provide a starting point for the algorithm with `par` (which must be a scalar or a vector).
- This is a random algorithm: each time you run it you may get a (slightly) different answer.
- The best guess of the algorithm will be returned as `$par`.
**optim’s guess**: -0.4

**true answer**: -0.5
