Intensity normalization of an image using Python+PIL – Speed issues

Question:

I’m working on a little problem in my spare time involving analysis of some images obtained through a microscope. It is a wafer with some stuff here and there, and ultimately I want to make a program to detect when certain materials show up.

Anyway, the first step is to normalize the intensity across the image, since the lens does not give uniform lighting. Currently I use an image with nothing on it, only the substrate, as a background, or reference, image. I find the maximum intensity value of each of the three RGB channels.

from PIL import Image
from PIL import ImageDraw

rmax = 0;gmax = 0;bmax = 0;rmin = 300;gmin = 300;bmin = 300

im_old = Image.open("test_image.png")
im_back = Image.open("background.png")

maxx = im_old.size[0] # Get the image width
maxy = im_old.size[1] # Get the image height
im_new = Image.new("RGB", (maxx,maxy))


pixback = im_back.load()
for x in range(maxx):
    for y in range(maxy):
        if pixback[x,y][0] > rmax:
            rmax = pixback[x,y][0]
        if pixback[x,y][1] > gmax:
            gmax = pixback[x,y][1]
        if pixback[x,y][2] > bmax:
            bmax = pixback[x,y][2]


pixnew = im_new.load()
pixold = im_old.load()
for x in range(maxx):
    for y in range(maxy):
        r = float(pixold[x,y][0]) / ( float(pixback[x,y][0])*rmax )
        g = float(pixold[x,y][1]) / ( float(pixback[x,y][1])*gmax )
        b = float(pixold[x,y][2]) / ( float(pixback[x,y][2])*bmax )
        pixnew[x,y] = (r,g,b)

The first part of the code determines the maximum intensity of the RED, GREEN and BLUE channels, pixel by pixel, of the background image, but only needs to be done once.

The second part takes the “real” image (with stuff on it), and normalizes the RED, GREEN and BLUE channels, pixel by pixel, according to the background. This takes some time, 5-10 seconds for a 1280×960 image, which is way too slow if I need to do this to several images.

What can I do to improve the speed? I thought of moving all the images to numpy arrays, but I can’t seem to find a fast way to do that for RGB images.
I’d rather not move away from Python, since my C++ knowledge is quite limited, and getting working FORTRAN code would probably take longer than I could ever save in terms of speed 😛

Asked By: Happy


Answers:

See http://docs.scipy.org/doc/scipy/reference/generated/scipy.misc.fromimage.html#scipy.misc.fromimage

You can say

import numpy
import scipy.misc

databack = scipy.misc.fromimage(im_back)  # background image as a numpy array
rmax = numpy.max(databack[:,:,0])
gmax = numpy.max(databack[:,:,1])
bmax = numpy.max(databack[:,:,2])

which should be much faster than looping over all (r,g,b) triplets of your image.
Then you can do

dataold = scipy.misc.fromimage(im_old).astype(float)
databack = databack.astype(float)  # cast to float to avoid integer division/overflow
r = dataold[:,:,0] / ( databack[:,:,0] * rmax )
g = dataold[:,:,1] / ( databack[:,:,1] * gmax )
b = dataold[:,:,2] / ( databack[:,:,2] * bmax )

datanew = numpy.array((r,g,b))
imnew = scipy.misc.toimage(datanew)

The code is not tested, but it should work with minor modifications.
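
Note: scipy.misc.fromimage and scipy.misc.toimage have been removed from recent SciPy releases. Here is a minimal sketch of the same vectorized idea using only NumPy and PIL; it assumes the result should be rescaled to 0-255 before saving, that the background image contains no zero pixels, and the output filename is just an example:

import numpy as np
from PIL import Image

# Load both images as float arrays of shape (height, width, 3)
back = np.asarray(Image.open("background.png").convert("RGB")).astype(float)
old = np.asarray(Image.open("test_image.png").convert("RGB")).astype(float)

# Per-channel maxima of the background (same as the first loop in the question)
chan_max = back.max(axis=(0, 1))

# Vectorized version of the question's per-pixel normalization
new = old / (back * chan_max)

# Rescale each channel to 0-255 so the result can be stored as an 8-bit image
new = 255.0 * new / new.max(axis=(0, 1))
Image.fromarray(new.astype("uint8"), "RGB").save("normalized_example.png")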

Answered By: rocksportrocker

import numpy as np
from PIL import Image

def normalize(arr):
    """
    Linear normalization
    http://en.wikipedia.org/wiki/Normalization_%28image_processing%29
    """
    arr = arr.astype('float')
    # Do not touch the alpha channel
    for i in range(3):
        minval = arr[...,i].min()
        maxval = arr[...,i].max()
        if minval != maxval:
            arr[...,i] -= minval
            arr[...,i] *= (255.0/(maxval-minval))
    return arr

def demo_normalize():
    img = Image.open(FILENAME).convert('RGBA')
    arr = np.array(img)
    new_img = Image.fromarray(normalize(arr).astype('uint8'),'RGBA')
    new_img.save('/tmp/normalized.png')
Answered By: unutbu

This is partially from the FolksTalk webpage:

from PIL import Image
import numpy as np

# Read image file
in_file = "my_image.png"
# convert('RGB') for PNG file type
image = Image.open(in_file).convert('RGB')
pixels = np.asarray(image)

# Convert from integers to floats
pixels = pixels.astype('float32')

# Normalize to the range 0-1
pixels /= 255.0
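
If the scaled array then needs to be written back out as an image, a short follow-up sketch (the output filename is just an example):

# Scale back to 0-255 and convert to an 8-bit PIL image so it can be saved
out_image = Image.fromarray((pixels * 255.0).astype('uint8'), 'RGB')
out_image.save("my_image_normalized.png")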
Answered By: Cloud Cho