Surface Blur and Median (photopea.com)
63 points by IvanK_net on Sept 30, 2016 | 34 comments


The article buried the lede, so here's your TL;DR summary:

- GIMP & Photoshop use the same O(n²) algorithm

- GIMP's implementation is single-threaded, while Photoshop's is multithreaded and therefore performs much faster on modern multi-core CPUs

- Photopea uses a different algorithm with O(n) characteristics

- O(n) in single-threaded JavaScript is faster than O(n²) in parallel C/C++ for large values of n

- Photo editing used to involve relatively small photos, so the choice of algorithm didn't matter much. But with modern digital cameras it does: photo filters now work with much larger values of n (pixel radius in this case).

Therefore, choosing the right algorithm matters.


Not exactly. I wanted to avoid too much mathematical notation.

- GIMP uses a Θ(n * r²) algorithm

- Photoshop uses a Θ(n * r) algorithm

- r <= 10: Photopea uses a Θ(n * r) algorithm

- r > 10: Photopea uses a Θ(n) algorithm with a large linear coefficient

For r <= 10, Photopea and Photoshop use the same algorithm. PP is slower not only because of JavaScript, but also because PS is multithreaded.
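
To make the asymptotics concrete, here is a minimal sketch of a Θ(n * r²) surface blur of the kind described above (grayscale, single channel; an illustration only, not GIMP's actual code):

    // Naive surface blur: for every pixel, scan the full (2r+1)^2 window
    // and average the neighbours whose value differs from the centre by
    // less than `threshold`. The inner double loop is Θ(r²) per pixel,
    // which makes the whole filter Θ(n * r²).
    function surfaceBlurNaive(
      src: Uint8Array, w: number, h: number,
      r: number, threshold: number
    ): Uint8Array {
      const dst = new Uint8Array(w * h);
      for (let y = 0; y < h; y++) {
        for (let x = 0; x < w; x++) {
          const c = src[y * w + x];
          let sum = 0, count = 0;
          for (let dy = -r; dy <= r; dy++) {
            for (let dx = -r; dx <= r; dx++) {
              const nx = x + dx, ny = y + dy;
              if (nx < 0 || nx >= w || ny < 0 || ny >= h) continue;
              const v = src[ny * w + nx];
              if (Math.abs(v - c) < threshold) { sum += v; count++; }
            }
          }
          dst[y * w + x] = count ? Math.round(sum / count) : c;
        }
      }
      return dst;
    }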


Would you consider porting your implementation to GIMP? I'm sure the community would love to see it.


The actual implementation has about 50 lines of code. But I am afraid it would take me hours or days to set up the sources and the compiler and to get into the structure of the GIMP code.

BTW. I think my method corresponds to this: http://registry.gimp.org/node/24208


Is there any article/paper about the algorithm?


It is based on well-known histogram algorithms; there are several links in this discussion :)


OK, but is it actually computing the same thing? Are the blurred images from Photopea and Photoshop identical? The article mentions always doing no more than N*10 steps -- is this an approximation algorithm?


Photopea and GIMP do compute the same thing (Surface Blur by definition). However, in Photoshop, the results at radii >= 50px are slightly different. I suspect that Photoshop uses an approximation algorithm.

The N*10 algorithm has some overhead: it needs a little more time (and memory) per pixel, but the cost does not depend on the radius. It is an exact algorithm (no approximation).
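
The comments here don't spell out the exact method, but the standard radius-independent trick is a sliding histogram: update 256 bins in O(1) as the window moves, then compute each output pixel from the bins instead of rescanning the window. A minimal 1D sketch of that idea (my illustration, not Photopea's actual code):

    // Histogram-based surface blur on one row. The window histogram is
    // updated in O(1) per step, and each output costs a fixed scan of at
    // most 2*threshold - 1 bins, so the total work is Θ(n) regardless of r.
    function surfaceBlurRow(src: Uint8Array, r: number, threshold: number): Uint8Array {
      const n = src.length;
      const dst = new Uint8Array(n);
      const hist = new Uint32Array(256);
      // Prime the histogram with the first window, clamped to the row.
      for (let i = 0; i <= Math.min(r, n - 1); i++) hist[src[i]]++;
      for (let x = 0; x < n; x++) {
        const c = src[x];
        // Only values with |v - c| < threshold contribute to the average.
        let sum = 0, count = 0;
        const lo = Math.max(0, c - threshold + 1);
        const hi = Math.min(255, c + threshold - 1);
        for (let v = lo; v <= hi; v++) { sum += v * hist[v]; count += hist[v]; }
        dst[x] = count ? Math.round(sum / count) : c;
        // Slide the window: one pixel enters on the right, one leaves on the left.
        const enter = x + r + 1, leave = x - r;
        if (enter < n) hist[src[enter]]++;
        if (leave >= 0) hist[src[leave]]--;
      }
      return dst;
    }

In 2D the same idea is typically applied with one histogram per image column (as in constant-time median filtering), which is where a large but radius-independent per-pixel constant comes from.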


I can't speak to the actual algorithm, but Photoshop also does various types of color space conversions unless you explicitly set it not to. I didn't see anywhere in the article where that was mentioned, or whether Photopea does those same conversions.


I hope that Photoshop does not do any conversion when it is not necessary. Color space conversion is performed e.g. when changing saturation, vibrance, etc. But I don't think it is necessary in this case.


Blurs can look a bit more natural when done in a linear color space. Does your app work in linear or sRGB or something else?


It works in sRGB.
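
For reference, blurring "in linear" as the parent suggests would mean decoding each sample to linear light before averaging and re-encoding afterwards. A minimal sketch of the standard sRGB transfer functions (IEC 61966-2-1 constants):

    // Standard sRGB <-> linear-light conversions for values in [0, 1].
    // A linear-space blur decodes with srgbToLinear, averages, then
    // re-encodes the result with linearToSrgb.
    function srgbToLinear(c: number): number {
      return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }
    function linearToSrgb(c: number): number {
      return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
    }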


Since it's not mentioned in the article, this is also called bilateral filtering [1]. The external links section of the Wikipedia article lists some famous SIGGRAPH papers describing fast implementations, and there is some code available [2].

[1] https://en.wikipedia.org/wiki/Bilateral_filter

[2] http://people.csail.mit.edu/sparis/bf/
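
For reference, the bilateral filter from [1] is usually written as below, with Gaussians over space (σ_s) and intensity (σ_r); Surface Blur is roughly the same construction with a hard threshold in place of the range kernel:

    BF[I]_p = \frac{1}{W_p} \sum_{q \in S} G_{\sigma_s}(\lVert p - q \rVert) \, G_{\sigma_r}(\lvert I_p - I_q \rvert) \, I_q

    W_p = \sum_{q \in S} G_{\sigma_s}(\lVert p - q \rVert) \, G_{\sigma_r}(\lvert I_p - I_q \rvert)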


Note that Sylvain Paris now works for Adobe’s Advanced Technology Lab.

As far as I can tell, the reason that Adobe can't replace their current Photoshop bilateral filter implementation (which is now 15 years old or something) with a better one is that the precise pixel-level details would be slightly different, potentially affecting someone's existing workflow.

This stuff should really be done using one of the fast GPU algorithms.


Cool stuff, congrats.

I really like your idea of having a "willitwork" page that dynamically lists the specific deficiencies in the user's browser. So many web applications just silently fail. This is way better. Every web application should have a willitwork page.


It sounds like a good idea, but every time I've seen this implemented, it's been as a simple whitelist lookup against the user agent. This has two problems. First, the user agent is... unreliable. Second, people who use less common systems are erroneously told the page won't work. (I've lost count of the number of times a website has told me that my operating system isn't supported, yet the page works fine. What does my OS have to do with a web page, anyway?)


I have been developing my app for years and I am trying to be as OS-independent and browser-independent as possible.

To detect the availability of a feature, I am checking whether the constructor is available, or whether some property of an object is available. I never try to whitelist or blacklist any specific OS or browser. I have never made any part of the code behave differently for some OS or some browser. But I have reported dozens of bugs to browser developers, and most of them were fixed.

Photopea runs quite well on phones and tablets. If your microwave oven passes WillItWork, it will work on it too :)
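
For illustration, hand-rolled probing of that kind looks roughly like this (the feature list and names are hypothetical, not Photopea's actual willitwork tests):

    // Probe features by checking constructors and object properties
    // directly, never the user agent string. Illustrative only.
    interface FeatureReport { feature: string; ok: boolean; }

    function probeFeatures(): FeatureReport[] {
      const checks: Array<[string, () => boolean]> = [
        // Is the constructor present at all?
        ["Typed arrays", () => typeof Uint8Array === "function"],
        ["File reading", () => typeof FileReader === "function"],
        // Is a specific property/method of an object available?
        ["Canvas 2D", () => !!document.createElement("canvas").getContext("2d")],
        ["requestAnimationFrame", () => typeof window.requestAnimationFrame === "function"],
      ];
      // Probes can throw on exotic platforms, so treat exceptions as "missing".
      return checks.map(([feature, test]) => {
        let ok = false;
        try { ok = test(); } catch { /* feature absent or broken */ }
        return { feature, ok };
      });
    }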


Very cool. Cheers.


Photopea appears to do it properly with feature testing (no UA checks): https://www.photopea.com/willitwork/test.js

FYI, Modernizr is a fantastic library for browser feature testing. You can just "add to cart" the tests you want to perform. No hand rolling necessary: https://modernizr.com/
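
A minimal sketch of that, assuming a custom Modernizr build that includes the real "canvas" and "webgl" detects (the two handler functions are placeholders):

    // Modernizr exposes one boolean per detect included in the build.
    declare const Modernizr: { canvas: boolean; webgl: boolean };

    const startEditor = () => console.log("all required features present");
    const showWillItWorkReport = () => console.log("list what is missing");

    if (Modernizr.canvas && Modernizr.webgl) {
      startEditor();
    } else {
      showWillItWorkReport();
    }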


Is there a typical number of radius pixels people use? I'm not a photographer, so I can't really say.

Below 10px, your tool is the worst; from 10-30px, GIMP is awful and Photoshop is the best; above 30px, your tool is the best. However, Photoshop doesn't seem to fall too far behind in actual seconds, even as you approach 100px.

So do photo editing people typically use a smaller radius or larger radius, or does it just depend?


Below 10px radius, all programs take under 0.5 seconds, so you don't even notice the difference.

The picture in the example was made with a radius of about 12px. But Photoshop and GIMP allow larger radii, so I think large radii are used sometimes, too.

BTW, it really surprised me that Adobe Photoshop uses such a simple algorithm, when faster (and only slightly more complex) algorithms have existed for quite a long time.


I guess it depends on the resolution. To remove a scratch of a given absolute size in cm: if your photo has twice the resolution, the scratch spans twice as many pixels, so you need twice the radius in pixels.


Twice the resolution is 4 times the pixels, no?


Yes, but twice the radius (in pixels) is also four times the (affected) pixels, since the window area grows with the square of the radius.


I don't do photo work, but I went and played with the tool, and, at least for the photo provided, anything above 10px really started taking the photo into "uncanny valley" territory.


As someone else pointed out - the radius is relative to the resolution. 10px is nothing when you're working at high DPI.


Here's a cool median filtering algorithm (extensible to bilateral filtering/surface blur) which exhibits O(log r) performance: http://www.shellandslate.com/download/fastmedian_5506.pdf http://www.shellandslate.com/download/medianslides.pdf

Also related are Adobe's local Laplacian filters: http://people.csail.mit.edu/hasinoff/pubs/ParisEtAl11-lapfil...

At recent SIGGRAPHs, they showed off the Halide DSL, which can optimize the filtering/scheduling of image-processing kernels for CPUs/GPUs.

Also pretty cool are the related domain transform edge-aware filters: http://www.inf.ufrgs.br/~eslgastal/DomainTransform/ If you use OpenCV, they're in 3.0 as DTFilter.


Am I the only one surprised by the gap between GIMP and Photoshop in calculating the surface blur example?


I was very surprised that GIMP uses a naive algorithm for computing the surface blur. It is like using Selection Sort for sorting.

I have seen a much faster Surface Blur for GIMP available as a plugin. I wonder why it wasn't made the default method.


Not really. Photoshop is an expensive closed-source application designed for professional use. GIMP is open source, and the crossover between professional users and open-source GIMP coders is likely next to nil, meaning that any significant production performance issues are pretty much invisible to GIMP's developers.

In my experience with the art/design Mac/Photoshop crowd, something like GIMP earns only blank stares.


That's backwards reasoning. You're literally arguing "this is bad, so it's no surprise that it's bad." The reality is that it's quite surprising that the solution is simple, yet nobody has added it to GIMP.


Not at all - the reasoning is that the overlap between programmers and users is tiny, the programmers won't use it in the way artists do, and so problems don't get noticed. This is just what happens when you don't eat your own dogfood.

As tools go, Photoshop is pretty embedded, and most professional artists use it. Few professional artists use GIMP. (I've worked with lots of artists - this is just how things are. Most haven't even heard of it, and don't have any interest in it, which is fair enough, because most of the time it can't even load Photoshop files properly.)

Out of the artists that do use GIMP, professional or not, even fewer work on GIMP, as in, write the code that makes it do stuff. (Few artists have much inclination and/or talent when it comes to programming.)

This means that the intersection of people that write stuff for GIMP and people that use it is small; the intersection between GIMP programmers and GIMP professional users is tiny. So if there's something wrong with GIMP's implementation of stuff that professional artists use all the time, there's a good chance it won't even get noticed, let alone fixed.

I'm guessing a bit here about all of this stuff, I do admit. (How much insight do I have into the GIMP development process? None, of course.) But I've got good evidence for the theory, since most of the time GIMP can't even load Photoshop files properly! Even though that's, like, requirement #1 for anything that purports to be a useful tool for artists. GIMP is about as much use as a spreadsheet that can't open xlsx files and has no OLE automation support.


Mods: in this case, a terrible decision to change the post title. Please allow us to salvage articles with titles that provide no context.


Very cool!



