spacejunkjim 5 minutes ago

When I saw this, I immediately had flashbacks to a little project I did for my CS course when I was an undergrad! We were each assigned a computer graphics algorithm and tasked with building an animation explaining how it works.

This was nearly eight years ago, but I managed to find it this morning and uploaded it to YouTube.

Here's the resulting animation: https://youtu.be/FHrIQOWeerg

I remember I used Processing to build it, and it took so long to animate as I had to export it frame-by-frame. Fun days!

mattdesl an hour ago

It might be worth using a lightness estimate like the L from OKLab, OKLrab[1], or CIE Lab instead of the RGB luminance weighting, as it should produce a more perceptually accurate result.

The other issue with your code right now is that it uses Euclidean distance in RGB space to choose the nearest color. It would probably be more accurate to use a perceptual color-difference metric; a very simple choice is Euclidean distance on OKLab colors.
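
For example, here's a rough sketch of that nearest-color step in Python (the conversion constants are from Björn Ottosson's published reference implementation; the palette and sample pixel are just made-up illustrations):

    def srgb_to_linear(c):
        # sRGB transfer function, c in [0, 1]
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def srgb_to_oklab(rgb):
        # Linear sRGB -> OKLab, matrices from Ottosson's reference code.
        r, g, b = (srgb_to_linear(c / 255) for c in rgb)
        l = 0.4122214708*r + 0.5363325363*g + 0.0514459929*b
        m = 0.2119034982*r + 0.6806995451*g + 0.1073969566*b
        s = 0.0883024619*r + 0.2817188376*g + 0.6299787005*b
        l_, m_, s_ = l ** (1/3), m ** (1/3), s ** (1/3)
        return (0.2104542553*l_ + 0.7936177850*m_ - 0.0040720468*s_,
                1.9779984951*l_ - 2.4285922050*m_ + 0.4505937099*s_,
                0.0259040371*l_ + 0.7827717662*m_ - 0.8086757660*s_)

    def nearest(color, palette):
        # Euclidean distance in OKLab as a cheap perceptual metric.
        target = srgb_to_oklab(color)
        return min(palette, key=lambda p: sum(
            (a - b) ** 2 for a, b in zip(srgb_to_oklab(p), target)))

    palette = [(0, 0, 0), (128, 0, 128), (255, 255, 0), (255, 255, 255)]
    print(nearest((250, 240, 60), palette))  # yellowish pixel -> (255, 255, 0)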

I think dithering is a pretty interesting area of exploration, especially as a lot of the popular dithering algorithms are quite old and were optimized for the compute constraints of their era. It would be nice to see dithering that isn't using 8 bits for errors, is based on perceptual accuracy, and perhaps uses something like a neural net to diffuse the error in the best way possible.

[1] https://bottosson.github.io/posts/colorpicker/

magicalhippo 34 minutes ago

I've always been curious to what degree, if any, color constancy[1] affects color dithering.

It seems that at some level it should; perhaps not directly at the pixel level, given the high frequency of the per-pixel differences, but maybe at the coarser "averaged" level?

One of those things I've wanted to explore but remains on my to-do list...

[1]: https://en.wikipedia.org/wiki/Color_constancy

Clamchop 3 days ago

They may not want to imply that didder's linearized rabbit is wrong, but I'm comfortable saying so. It's not just a little dark, it's way dark, to the point of hiding detail.

The linearized RGB palette is similarly awful. It clobbers a whole swath of colors, rendering them as nearly black. Purples are particularly brutalized. Yellows disappear and become white.

On my phone, the middle palette doesn't appear too bright to my eyes, either.

Even the linearized gradient looks worse.

Maybe linear is not best for perceptual accuracy.

  • badmintonbaseba 35 minutes ago

    I agree. I think the problem is a banal missing color transformation somewhere in the pipeline, like converting the palette and image to a linear colorspace, doing the dithering there, and then mistakenly writing the linear color values into the image instead of sRGB values.

    Others suggest the bug is using the wrong metric for choosing the closest color, but I disagree. That wouldn't cause such drastic, systematic darkening, as the palette is probably still pretty dense in the RGB cube.

    Where the linearisation really matters is the arithmetic of the error diffusion: you definitely want to diffuse the error in a linear colorspace. You are free to choose a good perceptual space for picking the closest color at each pixel, but the error itself should be calculated in a linear space.

    Visual perception is weird. But when you squint your eyes to blur the image, you are definitely mixing in a linear colorspace, as that's physical mixing of light intensities before the light even reaches your retina. So you have to match that when diffusing the error.
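
    A minimal sketch of that split (hypothetical Python; a black-and-white palette for brevity, so the nearest-color step is trivial, but note the diffused error stays in linear light):

        import numpy as np

        def srgb_to_linear(c):  # c in [0, 1]
            return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

        def dither_bw(img_srgb):
            # img_srgb: 2D float array in [0, 1]; returns 0.0 or 1.0 per pixel.
            lin = srgb_to_linear(np.asarray(img_srgb, dtype=np.float64))
            h, w = lin.shape
            out = np.zeros_like(lin)
            for y in range(h):
                for x in range(w):
                    old = lin[y, x]
                    new = 1.0 if old >= 0.5 else 0.0   # nearest of {black, white}
                    out[y, x] = new
                    err = old - new                    # error in *linear* light
                    if x + 1 < w:
                        lin[y, x + 1] += err * 7 / 16  # Floyd-Steinberg weights
                    if y + 1 < h:
                        if x > 0:
                            lin[y + 1, x - 1] += err * 3 / 16
                        lin[y + 1, x] += err * 5 / 16
                        if x + 1 < w:
                            lin[y + 1, x + 1] += err * 1 / 16
            return out  # 0 and 1 encode identically in linear and sRGB

    For a color palette you'd swap the threshold for a perceptual nearest-color search, but keep the err arithmetic in linear, and convert the final values back to sRGB.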

    edit:

    It also doesn't help that most (all?) browsers do color mixing wrong when images are scaled, so unless you view the dithered images at 100% without DPI scaling, you might get significantly distorted colors from that too.

  • Sesse__ 2 hours ago

    For perceptual color difference, there are much better metrics than “distance in linear RGB”. CIE has defined several versions of a metric called ΔE*, for instance.

    I don't know if they actually do well in dithering, though. My experience is that dithering actually works better in gamma space than when trying to linearize anything, since the quantization fundamentally happens after gamma.
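
    For reference, the oldest of those, ΔE*ab from 1976, is just Euclidean distance in CIELAB; the later ΔE94 and ΔE2000 formulas correct its known non-uniformities. A rough hand-rolled sketch (D65 white point, standard sRGB-to-XYZ matrix):

        def srgb_to_lab(rgb):
            # sRGB (0-255) -> linear -> XYZ (D65) -> CIELAB
            lin = [c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
                   for c in (v / 255 for v in rgb)]
            x = 0.4124564*lin[0] + 0.3575761*lin[1] + 0.1804375*lin[2]
            y = 0.2126729*lin[0] + 0.7151522*lin[1] + 0.0721750*lin[2]
            z = 0.0193339*lin[0] + 0.1191920*lin[1] + 0.9503041*lin[2]
            def f(t):  # CIELAB nonlinearity
                return t ** (1/3) if t > (6/29) ** 3 else t / (3 * (6/29) ** 2) + 4/29
            fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
            return 116*fy - 16, 500*(fx - fy), 200*(fy - fz)

        def delta_e_76(rgb1, rgb2):
            # ΔE*ab (1976): Euclidean distance in CIELAB
            return sum((a - b) ** 2 for a, b in
                       zip(srgb_to_lab(rgb1), srgb_to_lab(rgb2))) ** 0.5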

  • obrhubr 3 days ago

    Thanks for your comment! I'm glad you're seeing the same thing :) I re-implemented the linearised dithering in Python and got similar results. I checked and rechecked the colour profiles in GIMP, nothing... At this point I can only hope for an expert to appear and tell me what exactly I am doing wrong.

  • nextts 2 hours ago

    > We have just committed a mortal sin of image processing. I didn’t notice it, you might not have noticed either, but colour-space enthusiasts will be knocking on your door shortly.

kaoD an hour ago

> Dithering a black-to-white gradient will be wrong without linearising first.

TBH both look wrong to me. If I squint, neither dithering pattern matches the original gradient... but the non-linearized one looks more similar.

What could be causing this?
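
For what it's worth, here's a quick way to sanity-check the squint test (a tiny sketch, assuming a standard sRGB display): a 50/50 black-and-white dither mixes to linear 0.5, which encodes to roughly sRGB 188, so the pattern should match a solid 188 gray rather than a solid 128 one.

    def linear_to_srgb(c):
        # inverse sRGB transfer function, c in [0, 1]
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    print(round(255 * linear_to_srgb(0.5)))  # -> 188

Browser scaling (mentioned elsewhere in the thread) can also skew the comparison if the image isn't viewed at 100%.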

  • gus_massa 19 minutes ago

    Mac vs PC?

    They have different default gammas, so they may show a different gray level.

    (It bit me a long time ago. I made a gif that had the same RGB background as the webpage. On my PC it was fine, but on a Mac the border was very visible and the result horrible. My solution was to change the background of the webpage from an RGB number to a 1-pixel gif, tiled or scaled to fill the page.)

  • hagbard_c 42 minutes ago

    > What could be causing this?

    Hypercorrection, in this case over-linearisation.