Optimization Methods

Pablo Winant

In this tutorial you will learn to code and use common optimization algorithms for static models.


Profit optimization by a monopolist

A monopolist produces quantity \(q\) of good X at price \(p\). Its cost function is \(c(q) = 0.5 + q (1-qe^{-q})\).

The consumer’s demand at price \(p\) is \(x(p)=2 e^{-0.5 p}\) (demand with constant semi-elasticity with respect to price).

  1. Write down the profit function of the monopolist and find the optimal production (if any). Don’t use any library except for plotting.
c(q) = 0.5 + q*(1-q*exp(-q))
x(p) = 2*exp(-0.5p)
π(p) = p*x(p) - c(x(p))
# π(p::Vector) = ...
# π(p) = let xx = x(p) ; p*xx - c(xx) end
π (generic function with 1 method)
using Plots
pvec = 0.1:0.01:4 # this is a range (an iterable)
# in case we want a vector we can do
pvec_ = [pvec...];
# [π(p) for p in pvec]
# equivalent to vectorized call
πvec = π.(pvec);
plot(pvec, πvec)
# let's look at the index of the maximized profit

i_max = argmax(πvec)
p_max = pvec[i_max]
2.54
pvec = range(;start=0.1, stop=4.0, length=100000)
πvec = π.(pvec);
i_max = argmax(πvec)
p_max = pvec[i_max]
2.539301728339302
# this version allocates much less memory (no construction of πvec)
@time argmax( (π(p) for p in pvec) )
 15.184407 seconds (205.79 k allocations: 9.831 MiB, 0.24% compilation time)
625461982
# Let's try Newton Raphson
# we need π' and π''

c1(q) = (1 - q*exp(-q)) + q*(q*exp(-q) - exp(-q))
x1(p) = -exp(-0.5p)
π1(p) = x(p) + p*x1(p) - c1(x(p))*x1(p)
π1 (generic function with 1 method)
π1(0.2)
2.4825253505808935
using ForwardDiff
π1(p) = ForwardDiff.derivative(π, p)
π2(p) = ForwardDiff.derivative(π1, p)
π2 (generic function with 1 method)
f(p) = (π(p), π1(p), π2(p))
f (generic function with 1 method)
f(0.10)
(-1.6722012721418804, 2.7322286500588673, -2.5860651404210704)
"""Maximizes twice-differentiable function f"""
function newton_raphson(fun, x0; T=100, τ_η = 1e-10, verbose=false)
    
    
    fun1(p) = ForwardDiff.derivative(fun, p)
    fun2(p) = ForwardDiff.derivative(fun1, p)
    
    f(p) = (fun(p), fun1(p), fun2(p))

    for t=1:T
        # evaluate function derivatives
        (_, f1, f2) = f(x0)
        # apply Newton Raphson step
        x1 = x0 - f1/f2
        # compute successive approximation errors
        η = abs(x1-x0)

        # optionally print iterations
        verbose ? println("$t: $η") : nothing
        
        # check convergence:
        if η < τ_η
            return x1
        end
        # update the guess
        x0 = x1
    end

    error("Algorithm didn't converge after $T iterations.")

end
newton_raphson
@time newton_raphson(π, 0.5)
  0.000003 seconds
2.539301752872053
?newton_raphson
search: newton_raphson


Maximizes twice-differentiable function f

newton_raphson(
    u->log(1+u)*exp(-u),
    1.0
    )
0.7632228343518966

Constrained optimization

Consider the function \(f(x,y) = 1-(x-0.5)^2 -(y-0.3)^2\).

  1. Use Optim.jl to maximize \(f\) without constraint. Check you understand diagnostic information returned by the optimizer.
using Optim
f(x,y)  = 1- (x-0.5)^2 - (y-0.3)^2
f (generic function with 1 method)
# reformulate the function to maximize as a function of vectors
x0 = [0.1, 0.2]
res = optimize(u -> -f(u[1], u[2]), x0)
 * Status: success

 * Candidate solution
    Final objective value:     -1.000000e+00

 * Found with
    Algorithm:     Nelder-Mead

 * Convergence measures
    √(Σ(yᵢ-ȳ)²)/n ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    28
    f(x) calls:    56
res.minimizer
2-element Vector{Float64}:
 0.49993816060911933
 0.3000756301799443
  2. Now, consider the constraint \(x<0.3\) and maximize \(f\) under this new constraint.
# your code here
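One possible approach (a sketch, assuming Optim.jl's `Fminbox` box-constrained wrapper; the strict inequality \(x<0.3\) is treated as the box constraint \(x \leq 0.3\)):

```julia
using Optim

f(x, y) = 1 - (x - 0.5)^2 - (y - 0.3)^2

# box constraints: x ≤ 0.3, with loose bounds elsewhere
lower = [-10.0, -10.0]
upper = [0.3, 10.0]
x0 = [0.1, 0.2]   # initial guess must lie strictly inside the box

res = optimize(u -> -f(u[1], u[2]), lower, upper, x0, Fminbox(LBFGS()))
res.minimizer
```

Since the unconstrained maximizer \((0.5, 0.3)\) violates \(x \leq 0.3\), the constraint binds and the solution lies on the boundary, close to \((0.3, 0.3)\).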
  3. Reformulate the problem as a root-finding problem using a Lagrangian. Write the complementarity conditions.
# your code here
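One possible formulation, writing the constraint as \(g(x,y) = 0.3 - x \geq 0\) with multiplier \(\lambda\): the Lagrangian is \(\mathcal{L}(x,y,\lambda) = f(x,y) + \lambda (0.3 - x)\), and the first-order and complementarity conditions are

\[ -2(x-0.5) - \lambda = 0, \qquad -2(y-0.3) = 0, \qquad 0 \leq \lambda \perp (0.3 - x) \geq 0. \]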
  1. Solve using NLsolve.jl
# your code here
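A sketch using NLsolve.jl's `mcpsolve` (mixed complementarity solver), with unknowns \((x, y, \lambda)\) and \(\lambda\) restricted to \([0, \infty)\) so that the last equation is complementary to \(\lambda\):

```julia
using NLsolve

# KKT system for max f(x,y) s.t. 0.3 - x ≥ 0, with L = f + λ(0.3 - x)
function kkt!(F, z)
    x, y, λ = z
    F[1] = -2*(x - 0.5) - λ   # ∂L/∂x = 0
    F[2] = -2*(y - 0.3)       # ∂L/∂y = 0
    F[3] = 0.3 - x            # 0 ≤ λ ⊥ (0.3 - x) ≥ 0
end

res = mcpsolve(kkt!, [-Inf, -Inf, 0.0], [Inf, Inf, Inf], [0.1, 0.2, 1.0];
               autodiff = :forward)
res.zero
```

At the solution \(x = 0.3\), \(y = 0.3\) and the multiplier \(\lambda = -2(0.3 - 0.5) = 0.4\) is positive, confirming that the constraint binds.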

Consumption optimization

A consumer has preferences \(U(c_1, c_2)\) over two consumption goods \(c_1\) and \(c_2\).

Given a budget \(I\), consumer wants to maximize utility subject to the budget constraint \(p_1 c_1 + p_2 c_2 \leq I\).

We choose a Stone-Geary specification where

\(U(c_1, c_2)=\beta_1 \log(c_1-\gamma_1) + \beta_2 \log(c_2-\gamma_2)\)

  1. Write the Karush-Kuhn-Tucker necessary conditions for the problem.
# your code here
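For reference, with multiplier \(\lambda\) on the budget constraint and assuming an interior solution \(c_i > \gamma_i\), the KKT conditions are

\[ \frac{\beta_1}{c_1-\gamma_1} = \lambda p_1, \qquad \frac{\beta_2}{c_2-\gamma_2} = \lambda p_2, \qquad \lambda \geq 0, \qquad p_1 c_1 + p_2 c_2 \leq I, \qquad \lambda (I - p_1 c_1 - p_2 c_2) = 0. \]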
  2. Verify the KKT conditions are sufficient for optimality.
# your code here
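A sketch of the argument: on its domain \(c_i > \gamma_i\), the Hessian of \(U\) is \(\mathrm{diag}\left(-\beta_1/(c_1-\gamma_1)^2,\ -\beta_2/(c_2-\gamma_2)^2\right)\), which is negative definite whenever \(\beta_1, \beta_2 > 0\), so \(U\) is strictly concave. The budget constraint is affine, hence the feasible set is convex and the KKT conditions are sufficient for a global maximum.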
  3. Derive the demand functions and the shadow price analytically.
# your code here
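One way to state the result: define the supernumerary income \(m = I - p_1 \gamma_1 - p_2 \gamma_2\) (assumed positive). The first-order conditions give \(p_i (c_i - \gamma_i) = \beta_i / \lambda\); summing over \(i\) and using the binding budget constraint yields

\[ \lambda = \frac{\beta_1+\beta_2}{m}, \qquad c_i = \gamma_i + \frac{\beta_i}{\beta_1+\beta_2}\,\frac{m}{p_i}, \quad i = 1, 2. \]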
  4. Interpret this problem as a complementarity problem and solve it using NLsolve.jl.
# your code here
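A possible sketch with `mcpsolve` from NLsolve.jl; the parameter values below are illustrative assumptions, not given in the exercise:

```julia
using NLsolve

# illustrative parameters (assumptions)
β1, β2 = 0.5, 0.5
γ1, γ2 = 0.1, 0.2
p1, p2, I = 1.0, 2.0, 3.0

# unknowns z = (c1, c2, λ); the budget residual is complementary to λ ≥ 0
# the lower bounds keep c_i ≥ γ_i (the FOCs diverge there, so they never bind)
function kkt!(F, z)
    c1, c2, λ = z
    F[1] = β1/(c1 - γ1) - λ*p1
    F[2] = β2/(c2 - γ2) - λ*p2
    F[3] = I - p1*c1 - p2*c2
end

res = mcpsolve(kkt!, [γ1, γ2, 0.0], [Inf, Inf, Inf], [1.0, 1.0, 1.0];
               autodiff = :forward)
# analytic solution for these parameters: c1 = 1.35, c2 = 0.825, λ = 0.4
```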
  5. Produce some nice graphs with isoutility curves, the budget constraint and the optimal choice.
# your code here
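A minimal sketch with Plots.jl, using illustrative parameter values (assumptions); adjust to taste:

```julia
using Plots

# illustrative parameters (assumptions)
β1, β2 = 0.5, 0.5
γ1, γ2 = 0.1, 0.2
p1, p2, I = 1.0, 2.0, 3.0

U(c1, c2) = β1*log(c1 - γ1) + β2*log(c2 - γ2)

c1grid = range(γ1 + 0.01, 3.0; length = 100)
c2grid = range(γ2 + 0.01, 2.0; length = 100)

# isoutility curves as contour lines of U
contour(c1grid, c2grid, (a, b) -> U(a, b); xlabel = "c1", ylabel = "c2")
# budget line p1*c1 + p2*c2 = I
plot!(c1grid, (I .- p1 .* c1grid) ./ p2; label = "budget")
# optimal choice, computed analytically for these parameters
scatter!([1.35], [0.825]; label = "optimum")
```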