Do We Really Need 24-bit Color?
When I painted as a child, I only had 24 colors, and sometimes I bought only 4 of them (red, green, blue and white). Yet blending those few colors could still create vivid pictures.
(image from Pinterest)
On the other hand, storing digital colors on a computer can be expensive: with 24-bit color, each pixel can take any of 256*256*256 = 16,777,216 values, which means 3 bytes per pixel. An uncompressed 1920x1080 image already needs about 6 MB.
How can we reduce that? Color quantization is very useful in computer graphics, image compression and the visual web. However, naive color quantization suffers from ugly banding and blocky color regions.
(Image from Wikipedia)
I learned about image compression from iq's work on wavelet compression.
I got some great results, but I was not satisfied with using the entire color space. So how can we go further?
Initially, people did color quantization with the popularity (population) method, which essentially builds a histogram over equal-sized ranges of the color cube and picks the ranges containing the most pixels as the palette, as sketched below.
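To make that concrete, here is a minimal sketch of the popularity method in plain C++. This is my own illustration, not taken from any particular library; the 5-bit-per-channel bucketing and the name popularityPalette are just choices for the example.

#include <algorithm>
#include <array>
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

using Color = std::array<uint8_t, 3>;

// Popularity method: bucket colors into a coarse 5-bit-per-channel histogram
// and return the centers of the k most populated buckets as the palette.
std::vector<Color> popularityPalette(const std::vector<Color>& pixels, size_t k)
{
    std::map<uint32_t, size_t> histogram;   // bucket id -> pixel count
    for (const Color& c : pixels) {
        uint32_t key = (c[0] >> 3) << 10 | (c[1] >> 3) << 5 | (c[2] >> 3);
        ++histogram[key];
    }

    // Sort buckets by population, largest first.
    std::vector<std::pair<uint32_t, size_t>> buckets(histogram.begin(), histogram.end());
    std::sort(buckets.begin(), buckets.end(),
              [](const auto& a, const auto& b) { return a.second > b.second; });

    std::vector<Color> palette;
    for (size_t i = 0; i < buckets.size() && palette.size() < k; ++i) {
        uint32_t key = buckets[i].first;
        // Use the bucket center as the representative color.
        palette.push_back({ uint8_t(((key >> 10) & 31) << 3 | 4),
                            uint8_t(((key >> 5)  & 31) << 3 | 4),
                            uint8_t(( key        & 31) << 3 | 4) });
    }
    return palette;
}

Its weakness is exactly what you would expect: rare but important colors (say, a small red logo on a large blue background) never make it into the palette.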
More modern approaches cluster the colors, for example with octrees. The most popular algorithm by far is median cut, which recursively splits the color cube; k-means clustering, whose centers implicitly decompose the color cube into a Voronoi diagram, is another common choice.
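Here is a rough sketch of median cut in the same spirit (again my own simplification; medianCutPalette and the fixed recursion depth are illustrative, not a reference implementation): repeatedly split the box of colors along the channel with the widest range, at the median, until there are as many boxes as palette entries, then average each box.

#include <algorithm>
#include <array>
#include <cstdint>
#include <vector>

using Color = std::array<uint8_t, 3>;

// Median cut: recursively split colors[begin, end) along the channel with the
// widest range, at the median, for `depth` levels (2^depth boxes), then
// represent each box by its average color.
static void medianCut(std::vector<Color>& colors, size_t begin, size_t end,
                      int depth, std::vector<Color>& palette)
{
    if (depth == 0 || end - begin <= 1) {
        long sum[3] = {0, 0, 0};
        for (size_t i = begin; i < end; ++i)
            for (int k = 0; k < 3; ++k) sum[k] += colors[i][k];
        size_t n = std::max<size_t>(1, end - begin);
        palette.push_back({ uint8_t(sum[0] / n), uint8_t(sum[1] / n), uint8_t(sum[2] / n) });
        return;
    }

    // Find the channel with the largest range inside this box.
    int widestChannel = 0, widestRange = -1;
    for (int k = 0; k < 3; ++k) {
        uint8_t lo = 255, hi = 0;
        for (size_t i = begin; i < end; ++i) {
            lo = std::min(lo, colors[i][k]);
            hi = std::max(hi, colors[i][k]);
        }
        if (hi - lo > widestRange) { widestRange = hi - lo; widestChannel = k; }
    }

    // Split at the median of that channel and recurse on both halves.
    size_t mid = begin + (end - begin) / 2;
    std::nth_element(colors.begin() + begin, colors.begin() + mid, colors.begin() + end,
                     [widestChannel](const Color& a, const Color& b) {
                         return a[widestChannel] < b[widestChannel];
                     });
    medianCut(colors, begin, mid, depth - 1, palette);
    medianCut(colors, mid, end, depth - 1, palette);
}

// Build a palette of 2^depth colors, e.g. depth = 6 gives 64 colors.
std::vector<Color> medianCutPalette(std::vector<Color> colors, int depth)
{
    std::vector<Color> palette;
    medianCut(colors, 0, colors.size(), depth, palette);
    return palette;
}

Calling medianCutPalette(pixels, 6) would give a 64-color palette, the same size as the one used later in this post.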
Another useful library is scolorq: A Spatial Color Quantization Utility
In this post, I would like to introduce Floyd–Steinberg dithering, which is a very inexpensive way to generate vivid pictures using only a few colors. An example can be seen in the cover image, and here is the comparison:
(image taken by myself)
Almost no difference, eh?
Floyd–Steinberg dithering is an image dithering algorithm first published in 1976 by Robert W. Floyd and Louis Steinberg. It is commonly used by image manipulation software, for example when an image is converted into GIF format that is restricted to a maximum of 256 colors. The algorithm achieves dithering using error diffusion, meaning it pushes (adds) the residual quantization error of a pixel onto its neighboring pixels, to be dealt with later. It spreads the debt out according to the distribution (shown as a map of the neighboring pixels):
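The standard distribution of the error is:

            X      7/16
    3/16   5/16    1/16

where X is the pixel currently being processed. All of the weight goes to pixels that have not been visited yet (to the right and in the row below), and the four fractions sum to 1, so the error is conserved rather than lost.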
Here is working C++ code for Floyd–Steinberg dithering, which I modified from a StackOverflow answer. It first builds a 64-color palette with k-means (in Lab space, so distances are closer to perceptual differences) and then dithers the image against that palette:
#include <opencv2/opencv.hpp>
#include <iostream>
#include "time.h"

#define PALETTE_DIM 8
#define PALETTE_LENGTH 64

using namespace cv;
using namespace std;

cv::Mat floydSteinberg(cv::Mat img, cv::Mat palette);
cv::Vec3b findClosestPaletteColor(cv::Vec3b color, cv::Mat palette);

Mat palette(PALETTE_DIM, PALETTE_DIM, CV_8UC3);
Mat palette_large(PALETTE_DIM << 2, PALETTE_DIM << 2, CV_8UC3);

int main(int argc, char** argv)
{
    clock_t begin_time = clock();

    // number of clusters (colors in the result image)
    int nrColors = PALETTE_LENGTH;

    // load the image and convert it to Lab so that color distances
    // are closer to perceptual differences
    cv::Mat imgBGR = imread(argv[1], 1);
    cv::Mat img;
    cvtColor(imgBGR, img, CV_BGR2Lab);

    cv::Mat colVec = img.reshape(1, img.rows * img.cols); // reshape to an Nx3 matrix, one row per pixel
    cv::Mat colVecD;
    colVec.convertTo(colVecD, CV_32FC3, 1.0);             // convert to floating point for k-means

    // cluster the pixels: the cluster centers become the palette
    cv::Mat labels, centers;
    cv::kmeans(colVecD, nrColors, labels,
               cv::TermCriteria(CV_TERMCRIT_ITER, 100, 0.1),
               3, cv::KMEANS_PP_CENTERS, centers);

    // replace every pixel by its corresponding cluster center (plain posterization)
    cv::Mat imgPosterized = img.clone();
    for (int i = 0; i < img.rows; ++i)
        for (int j = 0; j < img.cols; ++j)
            for (int k = 0; k < 3; ++k)
                imgPosterized.at<Vec3b>(i, j)[k] =
                    centers.at<float>(labels.at<int>(j + img.cols * i), k);

    // convert the palette back to uchar
    cv::Mat palette;
    centers.convertTo(palette, CV_8UC3, 1.0);

    // run Floyd-Steinberg dithering against the palette
    cv::Mat fs = floydSteinberg(img, palette);

    cv::Mat imgPosterizedBGR, fsBGR;
    cvtColor(imgPosterized, imgPosterizedBGR, CV_Lab2BGR);
    cvtColor(fs, fsBGR, CV_Lab2BGR);

    // print the palette values (note: this local `palette` shadows the global 8x8 one)
    for (int i = 0; i < PALETTE_DIM; ++i)
    {
        for (int j = 0; j < PALETTE_DIM; ++j)
        {
            for (int k = 0; k < 3; ++k)
            {
                cout << (int)(uchar)centers.at<float>(j + PALETTE_DIM * i, k) << " ";
                palette.at<Vec3b>(i, j)[k] = (uchar)centers.at<float>(j + PALETTE_DIM * i, k);
            }
            cout << endl;
        }
    }

    //cv::Mat paletteBGR;
    //cvtColor(palette, paletteBGR, CV_Lab2BGR);
    //resize(paletteBGR, palette_large, palette_large.size(), 0, 0, INTER_NEAREST);

    imshow("input", imgBGR);             // original image
    imshow("result", imgPosterizedBGR);  // posterized image
    imshow("fs", fsBGR);                 // Floyd-Steinberg dithering
    imshow("palette", palette);          // palette

    cout << "* Color quantization costs: "
         << float(clock() - begin_time) / CLOCKS_PER_SEC << " s" << endl;

    waitKey();
    return 0;
}

cv::Mat floydSteinberg(cv::Mat imgOrig, cv::Mat palette)
{
    cv::Mat img = imgOrig.clone();
    cv::Mat resImg = img.clone();
    for (int i = 0; i < img.rows; ++i)
    {
        for (int j = 0; j < img.cols; ++j)
        {
            // quantize the current pixel to the nearest palette color
            cv::Vec3b newpixel = findClosestPaletteColor(img.at<Vec3b>(i, j), palette);
            resImg.at<Vec3b>(i, j) = newpixel;

            // diffuse the quantization error to the not-yet-visited neighbors
            // with the standard Floyd-Steinberg weights 7/16, 3/16, 5/16, 1/16
            for (int k = 0; k < 3; ++k)
            {
                int quant_error = (int)img.at<Vec3b>(i, j)[k] - newpixel[k];
                if (j + 1 < img.cols)                     // right
                    img.at<Vec3b>(i, j + 1)[k] = min(255, max(0, (int)img.at<Vec3b>(i, j + 1)[k] + (7 * quant_error) / 16));
                if (i + 1 < img.rows && j - 1 >= 0)       // below-left
                    img.at<Vec3b>(i + 1, j - 1)[k] = min(255, max(0, (int)img.at<Vec3b>(i + 1, j - 1)[k] + (3 * quant_error) / 16));
                if (i + 1 < img.rows)                     // below
                    img.at<Vec3b>(i + 1, j)[k] = min(255, max(0, (int)img.at<Vec3b>(i + 1, j)[k] + (5 * quant_error) / 16));
                if (i + 1 < img.rows && j + 1 < img.cols) // below-right
                    img.at<Vec3b>(i + 1, j + 1)[k] = min(255, max(0, (int)img.at<Vec3b>(i + 1, j + 1)[k] + (1 * quant_error) / 16));
            }
        }
    }
    return resImg;
}

// Euclidean distance between two colors
float vec3bDist(cv::Vec3b a, cv::Vec3b b)
{
    return sqrt(pow((float)a[0] - b[0], 2) + pow((float)a[1] - b[1], 2) + pow((float)a[2] - b[2], 2));
}

// brute-force nearest-neighbor search over the palette
cv::Vec3b findClosestPaletteColor(cv::Vec3b color, cv::Mat palette)
{
    int minI = 0;
    float minDistance = vec3bDist(color, palette.at<Vec3b>(0));
    for (int i = 1; i < palette.rows; ++i)
    {
        float distance = vec3bDist(color, palette.at<Vec3b>(i));
        if (distance < minDistance)
        {
            minDistance = distance;
            minI = i;
        }
    }
    return palette.at<Vec3b>(minI);
}
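If you want to run this yourself: the program takes the input image path as its first argument, and it uses the pre-OpenCV-4 constant names (CV_BGR2Lab, CV_TERMCRIT_ITER, and so on), so it should be built against OpenCV 2.x or 3.x; on OpenCV 4 you would need to switch to the newer names such as cv::COLOR_BGR2Lab, or include the legacy *_c.h headers.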
Reference: