OpenCV Python: Quantize to a given color palette

Question:

Given some arbitrary input color palette, how can I quantize an image to that given color palette?

While there are a lot of Stack Overflow solutions using k-means, I want to do something similar with a preset palette, specifically in OpenCV. For example, this could be thought of as clustering where k is fixed and the centroids are predetermined by the palette, so that each pixel gets assigned to the nearest color in the palette.

For example, I would like to quantize a picture to the nearest colors in white, black, blue, green and red.

My BGR palette would look something like this:

palette = [
    [0,0,0],
    [255,0,0],
    [0,255,0],
    [0,0,255],
    [255,255,255]
]

Given some input picture, I would like to have each pixel in the picture assigned to the closest item in the above palette.

So the end result would look something like the following:

import cv2

palette = [
    [0,0,0],
    [255,0,0],
    [0,255,0],
    [0,0,255],
    [255,255,255]
]

def quantize_to_palette(image, palette):
    # do something
    # return the same image with all colors quantized to the nearest palette item
    return

input = cv2.imread('input.jpg')

output = quantize_to_palette(image=input, palette=palette)

cv2.imwrite('output.jpg', output)
Asked By: conmak


Answers:

You can treat it as a nearest-neighbor problem where:

  1. each image pixel is a point
  2. the palette is the index
  3. all image pixels are queries

So for each query (a pixel in your image) you need to find the nearest palette color under some distance.
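
For a small palette like this, a brute-force search is already fast enough. The following is a minimal sketch of that idea in plain NumPy (the function name quantize_brute_force is just for illustration), assuming a BGR uint8 image and a palette shaped (K, 3):

import numpy as np

def quantize_brute_force(image, palette):
    # Flatten pixels to (N, 3) and compute squared L2 distances to every
    # palette color via broadcasting; the result has shape (N, K).
    pixels = image.reshape(-1, 3).astype(np.float32)
    pal = np.asarray(palette, dtype=np.float32)
    dists = ((pixels[:, None, :] - pal[None, :, :]) ** 2).sum(axis=2)

    # Index of the closest palette color for each pixel.
    nearest = dists.argmin(axis=1)

    # Replace every pixel with its nearest palette color and restore the shape.
    return pal[nearest].reshape(image.shape).astype(np.uint8)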

Original image:

[original image]

Quantized with your palette:

[image quantized to the 5-color palette]

Quantized with some random 4-color palette:
palette = np.random.randint(0, 255, size=(4, 3))

[image quantized to the random palette]

Example:

import cv2
import numpy as np

palette = np.array([[0, 0, 0], [255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 255]])


def quantize_to_palette(image, palette):
    # Flatten the image to one pixel per row; k-NN samples must be float32.
    X_query = image.reshape(-1, 3).astype(np.float32)
    X_index = palette.astype(np.float32)

    # Train a 1-NN "classifier" whose responses are the palette indices.
    # OpenCV's ml module only accepts int32/float32 responses, so set the dtype explicitly.
    knn = cv2.ml.KNearest_create()
    knn.train(X_index, cv2.ml.ROW_SAMPLE, np.arange(len(palette), dtype=np.float32))

    # For every pixel, find the index of the nearest palette color.
    ret, results, neighbours, dist = knn.findNearest(X_query, 1)

    # Map each pixel to its nearest palette color, restore the image shape,
    # and return uint8 so cv2.imwrite can handle the result.
    quantized_image = palette[neighbours.astype(int).ravel()]
    quantized_image = quantized_image.reshape(image.shape)
    return quantized_image.astype(np.uint8)


input = cv2.imread("tiger.png")
output = quantize_to_palette(image=input, palette=palette)
cv2.imwrite("output.jpg", output)

You may also consider other distances and color spaces. For example, L2 distance in the LAB color space better reflects how people perceive color similarity.

https://en.wikipedia.org/wiki/CIELAB_color_space#Perceptual_differences
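
As a rough sketch of that variant (not from the original answer; the helper name quantize_to_palette_lab is made up here), you can convert both the image and the BGR palette to LAB, do the matching in LAB, and still output the original BGR palette colors:

import cv2
import numpy as np

def quantize_to_palette_lab(image, palette_bgr):
    # cvtColor expects an image-shaped uint8 array, so reshape the
    # K x 3 palette to 1 x K x 3 before converting it to LAB.
    palette_bgr = np.asarray(palette_bgr, dtype=np.uint8)
    image_lab = cv2.cvtColor(image, cv2.COLOR_BGR2LAB).reshape(-1, 3).astype(np.float32)
    palette_lab = cv2.cvtColor(palette_bgr.reshape(1, -1, 3), cv2.COLOR_BGR2LAB)
    palette_lab = palette_lab.reshape(-1, 3).astype(np.float32)

    # Squared L2 distances in LAB approximate perceptual color difference.
    dists = ((image_lab[:, None, :] - palette_lab[None, :, :]) ** 2).sum(axis=2)
    nearest = dists.argmin(axis=1)

    # The output colors still come from the original BGR palette.
    return palette_bgr[nearest].reshape(image.shape)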

Image taken from https://www.publicdomainpictures.net/en/view-image.php?image=156214

Answered By: u1234x1234