# Optimisation

## What is optimization?

Optimization is the process of **finding extrema** (maximum or minimum values) of a particular function. In this guide we will use **differentiation** to determine these extrema of functions of one or more variables.

## How to find extrema

Consider the following image:

As you can see, the function has a maximum value, and the tangent at this point is horizontal. This means the gradient of the function at the maximum is 0. Therefore, to find these 'stationary points', you need to find where the derivative of the function is 0.

So differentiate the given function, set the derivative equal to 0, and solve. This will give you the point(s) where the function has extrema.

To determine whether a stationary point is a maximum or a minimum, you need to find the second derivative (i.e. differentiate again). If the value of the second derivative at the stationary point is less than 0, the point is a maximum; if it is greater than 0, the point is a minimum. (If it equals 0, the test is inconclusive.)
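The procedure can be sketched numerically. Below is a minimal Python sketch using central finite differences; the function f(x) = x³ − 3x is an illustrative assumption, not taken from the text:

```python
# Classify stationary points numerically using finite differences.
# Illustrative function (assumed): f(x) = x^3 - 3x, whose stationary
# points are x = -1 (a maximum) and x = 1 (a minimum).

def f(x):
    return x**3 - 3*x

def derivative(g, x, h=1e-5):
    # Central-difference approximation of g'(x).
    return (g(x + h) - g(x - h)) / (2 * h)

def second_derivative(g, x, h=1e-4):
    # Central-difference approximation of g''(x).
    return (g(x + h) - 2 * g(x) + g(x - h)) / h**2

for x0 in (-1.0, 1.0):
    # Gradient is (approximately) 0, so these are stationary points.
    assert abs(derivative(f, x0)) < 1e-6
    kind = "maximum" if second_derivative(f, x0) < 0 else "minimum"
    print(f"x = {x0}: {kind}")
```

Running this prints that x = −1.0 is a maximum and x = 1.0 is a minimum, matching the second-derivative test described above (f''(x) = 6x).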

**Example 1**

*The effectiveness of a particular drug is given by the function:*

where E represents the effectiveness and C is the concentration of the drug. Find the concentration of the drug where it is most effective.

Ensure that this value is a maximum:

Therefore this stationary point is a maximum.

**Example 2**

*A man needs to run from one point to another, along a flat road measuring 200 m and up a slope of length 50 m. Given that he can run at 6 m s^{-1} along the road and 4 m s^{-1} up the slope, find the value of x (see diagram below) that minimises the time taken, and hence find the minimum time.*

As can be seen, the shortest path is along the road and then up the slope at an angle [marked in red]. From mechanics, time is distance divided by speed.

Let the horizontal distance be x; therefore the time for that section = x/6.

To work out the slope distance we use Pythagoras' theorem, i.e. dist = [(200 − x)^{2} + 50^{2}]^{0.5}, and therefore the time taken on the slope is [(200 − x)^{2} + 50^{2}]^{0.5}/4.

Finally this gives us our equation for the time: T(x) = x/6 + [(200 − x)^{2} + 50^{2}]^{0.5}/4.

Differentiation gives us the equation: dT/dx = 1/6 − (200 − x)/(4[(200 − x)^{2} + 50^{2}]^{0.5}).

This is equal to 0 when x = 200 − 20√5 ≈ 155.2786404 m.

This means that the minimum time is T(200 − 20√5) ≈ 42.65 s.

## Off-site courses

### Convex optimization

- *CVX101 Convex Optimization*, Stephen Boyd, Stanford University, Spring 2014
- *Convex Optimization I*, Stephen Boyd, Stanford University, Summer quarter 2013–14
- *Convex Optimization II*, Stephen Boyd, Stanford University, Spring quarter 2013–14

### Discrete optimization

- Discrete Optimization, Pascal Van Hentenryck and Carleton Coffrin, The University of Melbourne (through Coursera)

## Resources

- Hyper-Textbook: Optimization Models and Applications, L. El Ghaoui, EECS Department, UC Berkeley.