```julia
c(q) = 0.5 + q*(1 - q*exp(-q))
x(p) = 2*exp(-0.5*p)
π(p) = p*x(p) - c(x(p))
# π(p::Vector) = ...
# π(p) = let xx = x(p); p*xx - c(xx) end
```
Pablo Winant
In this tutorial you will learn to code and use common optimization algorithms for static models.
A monopolist produces quantity \(q\) of good X at price \(p\). Its cost function is \(c(q) = 0.5 + q (1-qe^{-q})\).
The consumer’s demand at price \(p\) is \(x(p)=2 e^{-0.5 p}\) (constant semi-elasticity of demand with respect to price).
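With these primitives, profit can be coded directly. A minimal self-contained sketch (restating the definitions above) that evaluates profit at two candidate prices; the specific prices are illustrative:

```julia
# Primitives of the monopolist's problem (as defined above).
c(q) = 0.5 + q*(1 - q*exp(-q))   # cost of producing quantity q
x(p) = 2*exp(-0.5*p)             # demand at price p
π(p) = p*x(p) - c(x(p))          # profit: revenue minus cost of quantity sold
                                 # (shadows Base's π constant, fine in a fresh scope)

# Giving the good away (p = 0) loses money; a higher price is profitable.
π(0.0), π(2.0)
```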
"""Maximizes function f"""
function newton_raphson(fun, x0; T=100, τ_η = 1e-10, verbose=false)
fun1(p) = ForwardDiff.derivative(fun, p)
fun2(p) = ForwardDiff.derivative(fun1, p)
f(p) = (fun(p), fun1(p), fun2(p))
for t=1:T
# evaluate function derivatives
(_, f1, f2) = f(x0)
# apply Newton Raphson step
x1 = x0 - f1/f2
# compute successive approximation errors
η = abs(x1-x0)
#optionnally print iterations
verbose ? println("$t: $η") : nothing
# check convergence:
if η < τ_η
return x1
end
# update the guess
x0 = x1
end
error("Algorithm didn't converge after $T iterations.")
endnewton_raphson
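As a sanity check, the update rule \(x_1 = x_0 - f'(x_0)/f''(x_0)\) should land on the maximizer of a concave quadratic in a single step. A self-contained sketch with hand-coded derivatives (the helper `newton_quadratic_check` is hypothetical, mirroring the loop in `newton_raphson` above without the ForwardDiff dependency):

```julia
# Hand-rolled Newton-Raphson with analytic derivatives, mirroring the
# update rule x1 = x0 - f'(x0)/f''(x0) used in `newton_raphson` above.
function newton_quadratic_check(x0; τ=1e-10, T=100)
    f1(x) = -2*(x - 2)   # derivative of f(x) = -(x-2)^2, maximum at x = 2
    f2(x) = -2.0         # second derivative (constant)
    for t in 1:T
        x1 = x0 - f1(x0)/f2(x0)   # Newton-Raphson step
        abs(x1 - x0) < τ && return x1
        x0 = x1
    end
    error("no convergence")
end

newton_quadratic_check(0.0)   # → 2.0, reached after one step
```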
Consider the function \(f(x,y) = 1-(x-0.5)^2 -(y-0.3)^2\).
```
 * Status: success

 * Candidate solution

    Final objective value:     -1.000000e+00

 * Found with

    Algorithm:     Nelder-Mead

 * Convergence measures

    √(Σ(yᵢ-ȳ)²)/n ≤ 1.0e-08

 * Work counters

    Seconds run:   0  (vs limit Inf)
    Iterations:    28
    f(x) calls:    56
```
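The report above is Optim.jl's standard summary for a Nelder-Mead run. A sketch of a call that produces this kind of output (assuming the Optim package is installed; note that Optim minimizes, so we flip the sign of \(f\)):

```julia
using Optim

f(x, y) = 1 - (x - 0.5)^2 - (y - 0.3)^2

# Optim minimizes, so minimize -f over the vector v = [x, y],
# starting the simplex from the origin.
res = optimize(v -> -f(v[1], v[2]), [0.0, 0.0], NelderMead())

Optim.minimizer(res)   # close to [0.5, 0.3], where f attains its maximum of 1
```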
A consumer has preferences \(U(c_1, c_2)\) over two consumption goods \(c_1\) and \(c_2\).
Given a budget \(I\), the consumer wants to maximize utility subject to the budget constraint \(p_1 c_1 + p_2 c_2 \leq I\).
We choose a Stone-Geary specification where
\(U(c_1, c_2)=\beta_1 \log(c_1-\gamma_1) + \beta_2 \log(c_2-\gamma_2)\)
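Since \(U\) is increasing in both goods, the constraint binds at the optimum, and the first-order conditions give Stone-Geary demands in closed form: each good gets its subsistence level \(\gamma_i\) plus a share \(\beta_i/(\beta_1+\beta_2)\) of supernumerary income \(I - p_1\gamma_1 - p_2\gamma_2\), divided by its price. A sketch with hypothetical parameter values (the helper `demands` and the numbers are illustrative):

```julia
# Closed-form Stone-Geary demands: the budget binds, and each good receives
# a share βᵢ/(β₁+β₂) of supernumerary income S = I - p₁γ₁ - p₂γ₂.
# Parameter values below are hypothetical, for illustration only.
function demands(p1, p2, I; β1=0.5, β2=0.5, γ1=0.1, γ2=0.1)
    S = I - p1*γ1 - p2*γ2            # supernumerary income
    c1 = γ1 + β1/(β1 + β2) * S / p1
    c2 = γ2 + β2/(β1 + β2) * S / p2
    (c1, c2)
end

c1, c2 = demands(1.0, 1.0, 2.0)   # p₁ = p₂ = 1, I = 2
```

By construction the whole budget is spent: \(p_1 c_1 + p_2 c_2 = I\).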