Multiple variables in SciPy's optimize.minimize
Question:
According to the SciPy documentation, it is possible to minimize functions with multiple variables, yet it doesn’t say how to optimize such functions.
from scipy.optimize import minimize
from math import *
def f(c):
    return sqrt((sin(pi/2) + sin(0) + sin(c) - 2)**2 + (cos(pi/2) + cos(0) + cos(c) - 1)**2)
print(minimize(f, 3.14/2 + 3.14/7))
The above code tries to minimize the function f, but for my task I need to minimize with respect to three variables.
Simply introducing a second argument and adjusting minimize accordingly yields an error:
TypeError: f() takes exactly 2 arguments (1 given)
How does minimize work when minimizing with multiple variables?
Answers:
Pack the multiple variables into a single array:
import scipy.optimize as optimize
def f(params):
    # print(params)  # <-- you'll see that params is a NumPy array
    a, b, c = params  # <-- for readability you may wish to assign names to the component variables
    return a**2 + b**2 + c**2
initial_guess = [1, 1, 1]
result = optimize.minimize(f, initial_guess)
if result.success:
    fitted_params = result.x
    print(fitted_params)
else:
    raise ValueError(result.message)
yields
[ -1.66705302e-08 -1.66705302e-08 -1.66705302e-08]
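If a function is already written with separate parameters, a thin wrapper can unpack the array before calling it. The function g below is a hypothetical example, not from the original question:

```python
from scipy.optimize import minimize

# hypothetical function written with three separate arguments
def g(a, b, c):
    return (a - 1)**2 + (b - 2)**2 + (c - 3)**2

# the lambda unpacks the parameter array into g's three arguments
result = minimize(lambda params: g(*params), [0, 0, 0])
print(result.x)  # close to [1, 2, 3]
```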
scipy.optimize.minimize takes two mandatory arguments: the objective function and an initial guess for its variables (so len(initial) == len(variables) must hold). Because it is an iterative algorithm, it needs a starting point, and that starting point should be an educated guess: a poor one can keep the algorithm from converging, or lead it to a minimum other than the one you want.
Also, if the objective function uses any extra arguments (e.g. coefficients of the objective function), they cannot be passed as kwargs; they have to be passed via the args= argument of minimize (which expects a tuple).
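As a minimal sketch of the args= mechanism (the function and the coefficient here are made up for illustration), the extra argument is forwarded positionally after the parameter array:

```python
from scipy.optimize import minimize

def f(params, coef):
    a, b, c = params
    # coef arrives via args=, not as a keyword argument
    return coef * (a**2 + b**2 + c**2)

result = minimize(f, [1, 1, 1], args=(2.0,))
print(result.x)  # near [0, 0, 0]
```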
Since the OP doesn’t have a multi-variable objective function, let’s use a common problem: least squares minimization.
The optimization problem solves for values where the objective function attains its minimum. As unutbu explained, the variables must be passed as a single object (variables in the function below) to the objective function; and as mentioned before, we must pass an educated guess for them in order for the algorithm to converge.
import numpy as np
from scipy.optimize import minimize

def obj_func(variables, coefs):
    gap = coefs[:, 0] - (variables * coefs[:, 1:]).sum(axis=1)
    return (gap**2).sum()

initial = [0, 0]
coefs = np.array([[0.4, 1, 0], [2, 1, 1], [5, 1, 2], [7, 1, 3], [8, 1, 4],
                  [11, 1, 5], [13, 1, 6], [14, 1, 7], [16, 1, 8], [19, 1, 9]])
result = minimize(obj_func, initial, args=(coefs,))
minimizers = result.x   # [0.50181826, 2.00848483]
minimum = result.fun    # 2.23806060606064
Since OLS regression coefficients are themselves the solution of a least squares minimization, you can verify that the above indeed computes them as follows:
from statsmodels.api import OLS
ols_coefs = OLS(coefs[:, 0], coefs[:, 1:]).fit().params
np.allclose(ols_coefs, minimizers) # True
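For a cross-check that does not require statsmodels, the same closed-form least squares solution is available from NumPy's np.linalg.lstsq. This sketch reuses the coefs array from above:

```python
import numpy as np

coefs = np.array([[0.4, 1, 0], [2, 1, 1], [5, 1, 2], [7, 1, 3], [8, 1, 4],
                  [11, 1, 5], [13, 1, 6], [14, 1, 7], [16, 1, 8], [19, 1, 9]])

# solve min ||A x - b||^2 in closed form: A is the regressors, b the targets
lstsq_coefs, *_ = np.linalg.lstsq(coefs[:, 1:], coefs[:, 0], rcond=None)
print(lstsq_coefs)  # approximately [0.50181818, 2.00848485]
```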