scipy optimize one iteration at a time

Question:

I want to control the objective of my optimization as a function of the number of iterations. In my real problem, I have a complicated regularization term that I want to control using the iteration number.

Is it possible to call a scipy optimizer one iteration at a time, or at least to be able to access the iteration number in the objective function?

Here is an example showing my best attempt so far:

from scipy.optimize import fmin_slsqp
from scipy.optimize import minimize as mini
import numpy as np

# define objective function
# x is the design input
# iteration is the iteration number
# the idea is that I want to control a regularization term using the iteration number
def objective(x, iteration):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2 + 10 * np.sum(x ** 2) / iteration

x = np.ones(2) * 5
for ii in range(20):
    # note: ii starts at 0 here, so the first call divides by zero and the
    # regularization term evaluates to inf
    x = fmin_slsqp(objective, x, iter=1, args=(ii,), iprint=0)

    if ii == 5: print('at iteration 5, I expect to get ~ [0, 0], but I get', x)

truex = mini(objective, np.ones(2) * 5, args=(200,)).x
print('the final result is ', x, 'instead of the correct answer, which is close to [1, 1] (', truex, ')')

output:

at iteration 5, I expect to get ~ [0, 0], but I get [5. 5.]
the final result is  [5. 5.] instead of the correct answer, [1, 1] ([0.88613989 0.78485145])
Asked By: kilojoules


Answers:

No, I don’t think scipy offers this option.

Interestingly, PyTorch does. See this example of optimizing one iteration at a time:

import numpy as np
import torch

# define the rosenbrock function
a = 1
b = 5
def f(x):
    return (a - x[0]) ** 2 + b * (x[1] - x[0] ** 2) ** 2


# create a stochastic rosenbrock function by randomly rescaling the objective
def f_rand(x):
    return f(x) * np.random.uniform(0.5, 1.5)

# starting point
x0 = np.array([0.1, 0.1])

# set up an Adam optimizer on a tensor initialized at x0
learning_rate = 0.1  # learning rate for Adam (arbitrary choice)
x_tensor = torch.tensor(x0, requires_grad=True)
optimizer = torch.optim.Adam([x_tensor], lr=learning_rate)

# the closure re-evaluates the (stochastic) objective and its gradient
def closure():
    optimizer.zero_grad()
    loss = f_rand(x_tensor)
    loss.backward()
    return loss

# optimize one iteration at a time
for ii in range(200):
    optimizer.step(closure)

print('optimal solution found: ', x_tensor, f(x_tensor))
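
Since this loop exposes the iteration counter directly, the iteration-dependent regularization from the question can simply be folded into the closure. A minimal sketch reusing the question's objective (the choice of Adam, the lr=0.1 value, and the ii + 1 offset that avoids dividing by zero are my assumptions, not part of the original):

import numpy as np
import torch

# the question's objective, with a regularization term that decays with the iteration number
def objective(x, iteration):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2 + 10 * torch.sum(x ** 2) / iteration

x_tensor = torch.tensor(np.ones(2) * 5, requires_grad=True)
optimizer = torch.optim.Adam([x_tensor], lr=0.1)  # lr chosen arbitrarily

for ii in range(200):
    def closure():
        optimizer.zero_grad()
        # ii + 1 keeps the first call from dividing by zero
        loss = objective(x_tensor, ii + 1)
        loss.backward()
        return loss
    optimizer.step(closure)

print('solution with decaying regularization:', x_tensor.detach().numpy())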

If you really need to use scipy, you can make a class that counts how many times the objective has been called. Be aware that the counter advances on every function evaluation (including the extra evaluations used for finite-difference gradients), not once per optimizer iteration, and be careful when mixing this with an algorithm that approximates the inverse Hessian matrix.

from scipy.optimize import fmin_slsqp
from scipy.optimize import minimize as mini
import numpy as np

# objective without the decaying regularization term (used for the reference solution)
def objective(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2 + 10 * np.sum(x ** 2)


# wrap the objective in a class that counts how many times it has been called
class myclass:
    def __init__(self):
        self.iteration = 0

    def call(self, x):
        # note: this counts function evaluations (including those used for
        # finite-difference gradients), not optimizer iterations
        self.iteration += 1
        return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2 + 10 * np.sum(x ** 2) / self.iteration

x = np.ones(2) * 5
obj = myclass()
x = fmin_slsqp(obj.call, x, iprint=0)

truex = mini(objective, np.ones(2) * 5).x
print('the final result is ', x, ', which is not the correct answer and is not close to [1, 1] (', truex, ')')
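
If counting every function evaluation is too coarse, scipy's minimize also accepts a callback that is invoked once per optimizer iteration, so the counter can be advanced there instead of inside the objective. This still does not step the optimizer one iteration at a time; it only exposes an iteration count to the objective. A sketch along those lines (the class and its names are illustrative, not from the original answer):

from scipy.optimize import minimize
import numpy as np

class IterationObjective:
    def __init__(self):
        self.iteration = 1  # start at 1 so the first evaluation does not divide by zero

    def callback(self, xk):
        # minimize() calls this once per optimizer iteration with the current iterate
        self.iteration += 1

    def objective(self, x):
        return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2 + 10 * np.sum(x ** 2) / self.iteration

obj = IterationObjective()
res = minimize(obj.objective, np.ones(2) * 5, method='SLSQP', callback=obj.callback)
print('result with a per-iteration counter:', res.x)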
Answered By: kilojoules