How to make Min-plus matrix multiplication in python faster?

Question:

So I have two matrices, A and B, and I want to compute the min-plus product as given here: Min-plus matrix multiplication. For that I’ve implemented the following:

import numpy as np

def min_plus_product(A, B):
    B = np.transpose(B)
    Y = np.zeros((len(B), len(A)))
    for i in range(len(B)):
        # row i of Y holds min over k of A[:, k] + B^T[i, k]
        Y[i] = (A + B[i]).min(1)
    return np.transpose(Y)

This works fine, but is slow for big matrices. Is there a way to make it faster? I’ve heard that implementing it in C or using the GPU might be good options.
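For concreteness, the min-plus (tropical) product is defined by C[i, j] = min_k (A[i, k] + B[k, j]). A brute-force reference implementation is handy for checking any optimized version against; this sketch (the function name and the 2×2 values are mine) follows the definition directly:

```python
import numpy as np

def min_plus_reference(A, B):
    """Brute-force min-plus product: C[i, j] = min_k (A[i, k] + B[k, j])."""
    n, m = A.shape
    m2, p = B.shape
    assert m == m2, "inner dimensions must match"
    C = np.empty((n, p))
    for i in range(n):
        for j in range(p):
            # A[i, :] + B[:, j] is the vector of all k-sums for this entry
            C[i, j] = np.min(A[i, :] + B[:, j])
    return C

A = np.array([[1.0, 4.0],
              [2.0, 0.0]])
B = np.array([[3.0, 5.0],
              [1.0, 2.0]])
print(min_plus_reference(A, B))  # [[4. 6.] [1. 2.]]
```

Here C[0, 0] = min(1 + 3, 4 + 1) = 4, and so on; any faster routine should agree with this on random inputs.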

Asked By: Rael


Answers:

Here is an algorithm that saves time when the middle dimension is large enough and the entries are uniformly distributed. It exploits the fact that the smallest sum will typically come from two small terms.

import numpy as np

def min_plus_product(A, B):
    B = np.transpose(B)
    Y = np.zeros((len(B), len(A)))
    for i in range(len(B)):
        Y[i] = (A + B[i]).min(1)
    return np.transpose(Y)


def min_plus_product_opt(A,B, chop=None):
    if chop is None:
        # not sure this is optimal
        chop = int(np.ceil(np.sqrt(A.shape[1])))
    B = np.transpose(B)
    Amin = A.min(1)
    Y = np.zeros((len(B),len(A)))
    for i in range(len(B)):
        o = np.argsort(B[i])
        Y[i] = (A[:, o[:chop]] + B[i, o[:chop]]).min(1)
        if chop < len(o):
            idx = np.where(Amin + B[i, o[chop]] < Y[i])[0]
            for j in range(chop, len(o), chop):
                if len(idx) == 0:
                    break
                x, y = np.ix_(idx, o[j : j + chop])
                slmin = (A[x, y] + B[i, o[j : j + chop]]).min(1)
                slmin = np.minimum(Y[i, idx], slmin)
                Y[i, idx] = slmin
                if j + chop < len(o):  # guard: o[j + chop] would be out of bounds on the last block
                    nidx = np.where(Amin[idx] + B[i, o[j + chop]] < Y[i, idx])[0]
                    idx = idx[nidx]
    return np.transpose(Y)

A = np.random.random(size=(1000,1000))
B = np.random.random(size=(1000,2000))

print(np.allclose(min_plus_product(A,B), min_plus_product_opt(A,B)))

import time
t = time.time();min_plus_product(A,B);print('naive {}sec'.format(time.time()-t))
t = time.time();min_plus_product_opt(A,B);print('opt {}sec'.format(time.time()-t))

Sample output:

True
naive 7.794037580490112sec
opt 1.65810227394104sec
Answered By: Paul Panzer

A possible simple route is to use numba.

from numba import njit  # autojit has been removed from numba; njit is the current equivalent
import numpy as np

@njit
def min_plus_product(A, B):
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            minimum = A[i, 0] + B[0, j]
            for k in range(1, n):
                minimum = min(A[i, k] + B[k, j], minimum)
            C[i, j] = minimum
    return C

Timings on 1000×1000 A,B matrices are:

1 loops, best of 3: 4.28 s per loop for the original code

1 loops, best of 3: 2.32 s per loop for the numba code

Answered By: Francesco Turci

Here is a succinct, fully NumPy solution, without any Python-level loops:

(np.expand_dims(a, 0) + np.expand_dims(b.T, 1)).min(axis=2).T
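Note that this materialises the full three-dimensional sum array (shape `(p, n, m)` for `a` of shape `(n, m)` and `b` of shape `(m, p)`) before reducing, so memory use grows with all three dimensions. A quick sanity check on small inputs (the 2×2 values are mine, lowercase `a`, `b` as in the expression above):

```python
import numpy as np

a = np.array([[1.0, 4.0],
              [2.0, 0.0]])
b = np.array([[3.0, 5.0],
              [1.0, 2.0]])

# expand_dims(a, 0) has shape (1, n, m); expand_dims(b.T, 1) has shape (p, 1, m).
# Their broadcast sum has shape (p, n, m); taking min over axis 2 and
# transposing yields the (n, p) min-plus product.
c = (np.expand_dims(a, 0) + np.expand_dims(b.T, 1)).min(axis=2).T
print(c)  # [[4. 6.] [1. 2.]]
```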
Answered By: wingedsubmariner