Hacker News | mikeyk's comments

Beautiful tribute — you captured Peter perfectly.


Thanks Mike. Big hugs.


Best I’ve found are the ones at Homestate in LA: https://www.myhomestate.com/


IG co-founder here: users 1 and 2 were our first two attempts at creating users end to end when getting Instagram v0.1 hooked up to the backend we'd written. There was a bug so they were left in an incomplete state; post bugfix, 3 was my co-founder Kevin and 4 is me.


Two ghost users, left in an incomplete state by a bug in a previous version of the codebase... you basically created the Twins from The Matrix!


Hey Aaron, sorry to hear and thanks for posting. We're hiring at Instagram for roles in NYC, SF and Menlo Park; if you or folks on your team want a direct line, feel free to email me and I'll connect you to the right folks on the team.

mike [at] instagram [dot] com.


Browsers are all over the place, unfortunately. It's part of why sRGB became the only reasonable color profile for Web use. I think we'll see wide color become common in apps before the Web.


All over the place in what way? Support for different color profiles? Actually handling color spaces at all? The fact that there's no consistency when it comes to untagged images? The mess which is plugins? The ability to specify CSS colors in a specific color space?


We built this in already! We don't have a "1x" or "2x" indicator, but the dual lens camera is fully used in Instagram now and will do the smart transition between 1x>2x optical and 2x+ digital zoom.


Oh no way! That's awesome, apologies for not knowing this and thanks!


They mentioned this in the post but it was fairly well hidden, can't blame you for missing it.


I used the same approach as the WebKit image, so the same applies here, too (it's also why we only serve Display P3 photos to iOS clients with wide color screens; most Android devices would treat them incorrectly).


Thanks for the confirmation!


Good to know--I didn't run it through my Pixel. Some devices will do a relative projection from Display P3 to sRGB, which means that it will look "relatively" like it would in Display P3 but projected onto the sRGB color space.

Edited to add: some other devices do something even less fancy, which is to ignore the color profile entirely, assume sRGB, and display the image incorrectly, taking, for example, what would have been the maximum red point in Display P3 and making it the maximum red point in sRGB.
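For anyone curious what those two behaviors look like in code, here's a minimal sketch in Python with Pillow; the "display_p3.icc" path and the file names are made up, since Pillow doesn't ship a Display P3 profile.

    from PIL import Image, ImageCms

    im = Image.open("wide_color_photo.jpg")  # hypothetical JPEG tagged with a Display P3 profile

    # 1) Color-managed path: convert P3 -> sRGB relative to the white point,
    #    so out-of-gamut colors are mapped into sRGB sensibly.
    srgb = ImageCms.createProfile("sRGB")
    p3 = ImageCms.getOpenProfile("display_p3.icc")  # assumed profile file on disk
    managed = ImageCms.profileToProfile(
        im, p3, srgb, renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC)
    managed.save("managed_srgb.jpg")

    # 2) Naive path: ignore the embedded profile and reinterpret the raw values
    #    as sRGB -- P3's maximum red simply becomes sRGB's maximum red.
    naive = im.copy()
    naive.info.pop("icc_profile", None)  # strip the tag, keep the numbers
    naive.save("naive_srgb.jpg")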


Since you brought up your Pixel: what is the point of adding something maybe 1% of your customers can see, instead of fixing that horrible, horrible compression that makes uploads from Android (80% worldwide market share) look like crap?

(Not an Android user; I just want to figure out how a company of your size prioritises between bugs and features.)


I can't speak for OP, but say this does affect 1% of users today: what percentage does it affect in six months, or a year? Not bad to be proactive.

And regarding the Android compression issues, although resources are always finite, I imagine in this case the Android team is fairly separate, so they may very well be working on that compression issue while iOS is pushing forward into new terrain.


Instagram should hire some faster Android developers if that is the case; it's been an issue since 2012:

https://www.xda-developers.com/a-look-into-instagrams-compre...


> I imagine in this case the android team is fairly separate

This is likely it, right here. So many people forget that larger companies have different teams working on different things. I bet a lot of their "iOS people" that are working on this project have no clue how the Android app works, and Instagram likely has a separate team working on the compression issues.


"Working on" since 2012?


I don't use IG, so I wasn't aware of that problem or how long it had been around. That said, the general sentiment stands: one of their teams working on one thing doesn't show that they don't have another team working on something unrelated.


> (80% world wide market share)

Not of Instagram users. Not of app users.


Well, given how Instagram has treated its Android users, can you blame them?

I've seen a number of SV companies release ugly and buggy Android apps, then use their shrinking Android user base as proof that Android users don't like their services.

To be honest, things could be worse. You could be a Tinder user on Windows Phone...


We started using OpenGL in 2011. Our CPU-based image filters used to take 4+ seconds with pixel-by-pixel manipulation, and now can render 30+ times per second.

If you have some sample images where the current image pipeline is going wrong let me know and we can look into improving.


I think that's back when Gotham died :/

http://randsinrepose.com/archives/rip-gotham/


I also went through that process with my app Picfx. Using OpenGL for filters is much quicker; the only downside I've found is being limited by the texture size. I did set up a way to process images in tiles but ultimately decided to just limit images to the texture size. Great info on the colour space, I'm sure it will be useful.


Instead of fixing the horrible JPEG encoding, can you please add support for WebP? It's quite a bit smaller and well supported with polyfills, since it's just a single VP8 frame.


Polyfills in general are a really awful user experience.

They are typically pushed by people who use the latest Chrome, so they have an excuse not to care about other browsers.

Their performance and usability are almost invariably terrible.


You don't need a polyfill to deploy WebP. Chrome automatically includes WebP in the Accept header for image requests, so at the CDN level you could implement logic to seamlessly swap in WebP images for the browsers that support it. Imgix does this, for instance.
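As a rough illustration (not how Imgix or Instagram actually implement it), the negotiation can be as simple as checking the Accept header server-side; here's a hypothetical Flask-style sketch:

    from flask import Flask, request, send_file

    app = Flask(__name__)

    @app.route("/photos/<name>")
    def photo(name):
        # WebP-capable browsers advertise support, e.g.
        # "Accept: image/webp,image/apng,image/*,*/*;q=0.8"
        if "image/webp" in request.headers.get("Accept", ""):
            resp = send_file(f"photos/{name}.webp", mimetype="image/webp")
        else:
            resp = send_file(f"photos/{name}.jpg", mimetype="image/jpeg")
        resp.headers["Vary"] = "Accept"  # so caches keep the variants separate
        return resp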


Here if anyone has questions on the implementation or such.


It's great you listen. So I'll try.

1. Scaling down in linear colorspace is essential. One example is [1], where [2] is sRGB and [3] is linear. There are some canary images too [4].

2. Plain bicubic filtering is not good anymore. EWA (Elliptical Weighted Averaging) filtering by Nicolas Robidoux produces much better results [5].

3. Using default JPEG quantization tables at quality 75 is not good anymore. That's what people are referring to as horrible compression. MozJPEG [6] is a much better alternative (see the sketch after the references below). With edge detection and quality assessment, it's even better.

4. You have to realize that 8-bit wide-gamut photographs will show noticeable banding on sRGB devices. Here's my attempt [7] to reveal the issue using sRGB as a wider gamut colorspace.

[1] https://unsplash.com/photos/UyUvM0xcqMA

[2] https://cloud.githubusercontent.com/assets/107935/13997633/a...

[3] https://cloud.githubusercontent.com/assets/107935/13997660/b...

[4] https://cloud.githubusercontent.com/assets/72159/11488537/3d...

[5] http://www.imagemagick.org/Usage/filter/nicolas/

[6] https://github.com/mozilla/mozjpeg

[7] https://twitter.com/vmdanilov/status/745321798309412865
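On point 3, here's a minimal sketch of what swapping in MozJPEG for the final encode could look like; the mozjpeg_encode helper and the file paths are hypothetical, and it assumes mozjpeg's cjpeg binary is on PATH.

    import subprocess
    from PIL import Image

    def mozjpeg_encode(src, dst, quality=75):
        # cjpeg reads uncompressed PPM/BMP/Targa, so hand it a PPM.
        Image.open(src).convert("RGB").save("tmp.ppm")
        subprocess.run(
            ["cjpeg", "-quality", str(quality), "-outfile", dst, "tmp.ppm"],
            check=True)

    mozjpeg_encode("upload.png", "feed_1080.jpg", quality=80)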


I thought there was something familiar about that name Robidoux … I see he's the same one that got some crowdfunded work on GIMP's new scaling methods: http://libregraphicsworld.org/blog/entry/advanced-samplers-f...


"When we started this project, none of us at IG were deep experts in color."

This is pretty astonishing to me. I always thought applying smartly designed color filters to pictures was basically Instagram's entire value proposition. In particular, I would have thought that designing filters to emulate the look and feel of various old films would have taken some fairly involved imaging knowledge. How did Instagram get so far without color experts?


In the "How I Built This" podcast for Instagram, Kevin Systrom specifically says the filters were created to take the relatively low-quality photos early smartphone cameras were capable of and making them look like photographer-quality photos. Filters were about taking average photos and making them "pop" but not necessarily by virtue of having deep domain knowledge of color.


I was never under this impression. I was always under the impression that the filters were just created by a designer playing around with various effects until they looked nice.


Because it isn't complicated or novel to make compressed 8-bit JPEGs have color filters. There are tools for the job and they've been around for a long time.

Working in a non-standard color space requires a little bit of familiarity and finesse that modifying 8-bit JPEGs for consumption on the internet did not require.

Many photographers and printers are familiar with this dilemma in a variety of circumstances, where cameras capture images in a wider color space and at a higher bit depth than most displays can reproduce or the eye can readily distinguish.


I'm sure the comment you're replying to wasn't thinking of the algorithm that applies a filter to a jpeg, but the process by which that filter is created in the first place. The assumption being that there's some sort of theory to colour that allows you to systematically improve the aesthetic qualities of images.

As an analogy, think of the value of music theory (e.g. https://en.wikipedia.org/wiki/Scale_(music)#Harmonic_content) for composition.


The creative process isn't novel. Most mobile apps, including Instagram, don't even offer layer masking, compared to pre-existing, more robust tools on desktop (and other mobile apps), which severely limits the 'technical interestingness' to begin with.


Instagram's value proposition is that other, mostly young, people use it.


Instagram's value prop was, and is, mobile.


Bit of a jump in topic, but I'm kind of curious: I don't use Instagram myself, but I'm sure it resizes images for online viewing, saving bandwidth and such. Does it do so with correct gamma[0]? Since that's a thing many image-related applications get wrong.

[0] http://blog.johnnovak.net/2016/09/21/what-every-coder-should...


If it's using Pillow for the resizing, probably not. I've not looked at Pillow specifically, but PIL certainly wasn't very good.


Pillow doesn't do anything with gamma by default, nor does it require that color management be compiled in.

(I'm a maintainer of pillow)
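For what it's worth, here's a minimal sketch of doing a gamma-correct downscale on top of Pillow, using the common 2.2 approximation rather than the exact sRGB transfer curve; this isn't Instagram's actual pipeline, just the idea from the linked article, and the file names are made up.

    import numpy as np
    from PIL import Image

    def resize_linear(im, size, gamma=2.2):
        # Decode each channel to (approximately) linear light, resize there,
        # then re-encode -- instead of averaging gamma-compressed values.
        channels = []
        for band in im.convert("RGB").split():
            x = np.asarray(band, dtype=np.float64) / 255.0
            linear = Image.fromarray((x ** gamma).astype(np.float32), mode="F")
            small = np.asarray(linear.resize(size, Image.LANCZOS), dtype=np.float64)
            channels.append(np.clip(small, 0.0, 1.0) ** (1.0 / gamma))
        out = np.stack(channels, axis=-1)
        return Image.fromarray((out * 255.0 + 0.5).astype(np.uint8), "RGB")

    resize_linear(Image.open("photo.jpg"), (1080, 1080)).save("thumb.jpg")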


That would be a bit sad, given that they went through all the trouble of getting wide colour.


They don't store full resolution images:

Square Image: 1080px in width by 1080px in height

Vertical Image: 1080px in width by 1350px in height

Horizontal Image: 1080px in width by 566px in height


Not necessarily on topic - but given your code samples are written in Objective-C, how much of your codebase is still in Objective-C and what's your opinion on porting it over to Swift?


Good q. All of it is still Objective-C and C++; there are a few blockers to starting to move to Swift, including the relatively large amount of custom build + development tooling that we and FB have in place.


Hi Mike, since Instagram is listed in the React Native Showcase, could you tell us where you are using it? Thanks in advance.

https://facebook.github.io/react-native/showcase.html


It's getting used more and more in our app--a few recent examples are the "Promote Post" UI if you're a business account and want to promote a post from inside Instagram, the Saved Posts feature, and the comment moderation tools we now provide around comment filtering.


I'd like to know this, too.


The Swift compiler is probably too slow for projects as large as Instagram.


Peculiar this is being downvoted - compiler speed, reliability, and support for incremental builds are all issues with large Swift projects. I've even seen people go as far as splitting up their project into frameworks in order to bring build times that had grown as long as 10 minutes back down and restore incremental compilation.


The Swift compiler is used in many large projects.

And you probably overestimate how large Instagram, the app, is.


I know there's not much love for the API these days, but is / will there be a way to access wide color images from the API? iPads and Macbook Pros also have wide color screens nowadays, so it would make sense to display them for specific use-cases in third party clients.


Great article. Question not directly related to your writeup: in your casual usage of a phone/photo app with the new, wider color space, do you notice a difference in your experience of using the app/looking at images? Or, in other words, does the wider color space feel at all different?


Is Wide Color preserved when processing video?


Photos only. Apple's APIs only capture in Wide Color when shooting in photo mode, and their documentation only recommends using wide color/Display P3 for images.


Our iOS dev hunkered down today and may have come up with a decent solution to your EAGLView issue. You can find the write-up on medium: https://medium.com/imgly/bringing-wide-color-to-photoeditor-....


Your link somehow got truncated. Here’s the full thing: https://medium.com/imgly/bringing-wide-color-to-photoeditor-...


Is this why iPhone photos of red roses, or other vibrant roses, look poor? I love the idea of a photo room; is it hand painted?


Yes, they look poor if you view them in non-color-managed software environments (which should not happen on Apple devices). See here a sample image: https://code.google.com/p/android/issues/detail?id=225281


Do you test with Adobe RGB (which many prosumer cameras can output) or only P3 (which AFAIK only Apple devices output)?


Just P3, though the wider gamut available in our graphics operations should benefit photos brought in using Adobe RGB too since iOS is fully color managed.


What is the pixel format of your OpenGL surfaces?


Maybe a stupid question, but how do you save an image as a Display P3 JPEG from Photoshop? I want to play with this color standard.

