Agreed. After reading several paragraphs about how speed as the only toggle was a brilliant feature, I started wondering, "Wait a minute. Am I reading reporting or an aggressive attempt at framing the lack of features by a PR team?"
Am I the only one bothered by the misuse of the term Native Advertising? Why not use advertorial, which is more accurate and descriptive?
Yes, an advertorial is a type of native ad...but paid search ads are also native ads.
Turning atomic consumption units into ad units (the actual definition of native ads) is good for users. It creates a seamless experience and puts the onus on advertisers to engage authentically in a native way. It encourages content creation and doesn't interrupt our lives.
Yes, advertorials have a foul taste about them and seem more manipulative than anything else...but it's time we stop bashing native ads; advertorials are a very small piece of the native ad world.
Apple did not coin the term Yosemite either, but if a new product from Google came and was called "Yosemite" (and it was also an OS) it would be a little awkward and worthy of a mention.
It's not just that "hyperlapse" is a pre-existing word (despite my spell-check underlining it right now), but that "hyperlapse" is exactly what this app does.
Microsoft did not invent hyperlapse last week. They showed off a method of producing a hyperlapse. Instagram now has another method of producing a hyperlapse.
Here's a video that, coincidentally, was also published last week about shooting a hyperlapse with a DSLR.
Interesting. I think that the confusion arises because the Microsoft paper was many people's first exposure to the term, which led people to assume that they had coined the term. Based on the Wikipedia article, it looks like it is a relatively new term, with the first usage I could find in 2011, but that it does predate the Microsoft paper.
It looks like Dan Eckert coined the term for this video: https://vimeo.com/19441262 then later used it to refer to the general technique, for which he registered http://www.hyperlapse.com/ in 2012. Other people started using it, like https://vimeo.com/50238512, and now Microsoft and Facebook are both developing applications that use the term to automatically create hyperlapse sequences.
Microsoft Research was manually computing a motion map using heavy image-processing techniques that they then used to decide which frames to use and how to align them. Their work can do things like throw away a series of 'noisy' frames that wouldn't fit the overall scene motion. For instance, if you had a head mounted camera and were riding around on your bike, it could throw away the moments where you look both ways before crossing the street.
Instagram's technology utilizes extra sensor data to make their frame corrections. As far as I can tell from the brief article, the technology doesn't do much besides stabilization. That's not to belittle it, as it's a really neat application demonstrating the benefits of mobile sensor integration. And you might be able to create a similar motion map using the gyroscope data that the folks at Microsoft create through their image processing techniques!
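To make that speculation concrete: a gyro-derived "motion map" could be as simple as per-frame rotation rates, with frames flagged for dropping when they deviate sharply from their neighbors, in the spirit of Microsoft's noisy-frame rejection. This is a toy sketch; the function names, threshold, and data shapes are all hypothetical, not either company's actual algorithm.

```python
# Sketch: flag 'noisy' frames whose gyro-measured rotation rate deviates
# sharply from the local average, similar in spirit to dropping frames
# that don't fit the overall scene motion. All names are hypothetical.

def flag_noisy_frames(angular_speeds, window=5, threshold=2.0):
    """angular_speeds: per-frame rotation rate in rad/s (one value per frame).
    Returns indices of frames to drop."""
    drops = []
    for i, speed in enumerate(angular_speeds):
        lo = max(0, i - window)
        hi = min(len(angular_speeds), i + window + 1)
        neighbors = [s for j, s in enumerate(angular_speeds[lo:hi], lo) if j != i]
        mean = sum(neighbors) / len(neighbors)
        if abs(speed - mean) > threshold * max(mean, 1e-6):
            drops.append(i)
    return drops

# A steady pan with two sudden head-turns mixed in:
speeds = [0.1, 0.1, 0.1, 2.5, 0.1, 0.1, 3.0, 0.1, 0.1]
print(flag_noisy_frames(speeds))  # → [3, 6]
```

The real systems obviously do far more than this, but it shows how cheap sensor data could stand in for image-derived motion estimates.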
A "motion map"? I think you'd better read the paper again. They compute a full 3D reconstruction. Much, much more computationally expensive. You would not "be able to create a similar motion map using the gyroscope data".
Different. Microsoft's technology only requires the video stream. Instagram is sniffing the gyroscope sensor data on the phone so it doesn't have to do all that expensive image processing.
I also find it weird that they chose a name that was already taken in the Apple App Store. If I search for "Hyperlapse" they are the second result. I bet you the developer of that $0.99 app is seeing an exponential increase in revenue!
Wow... I know there are services that track apps in the app store... I wonder if someone has the data (besides the developer) on the amount of downloads they received.
I almost bought the wrong application myself. But - I bet the store reverses the positions of the apps within 24 hours.
Apple store is not great on launch day for new hot products - but almost always within 24-36 hours, the proper app gets moved into position #1. (For some definition of proper)
Yep, I was thinking that as I saw this, but we all have to remember that Facebook and Microsoft have a partnership. Maybe Microsoft has finally been kind enough to let all the good stuff from the Research division be implemented into consumer-ready products! A welcome change! :D
For folks who haven't seen the Hyperlapse technique before, here's a great example of the style done professionally blended with traditional timelapse:
Maybe Instagram saw the Microsoft project and decided to push their product out as soon as possible, hence no Android version yet. Also helps them give them an edge over Horizon, although a similar tactic failed to have them squash Vine. Full disclosure: I have no idea what I am talking about.
Since the cameras and the sensors, and the APIs for them, are pale imitations of the camera and sensor hardware and software in the iPhone and iOS, I wouldn't expect a decent version of this app for Android, ever. Maybe, at a stretch, they could get it to work on specific handsets, but I'd be surprised if they bothered.
>Maybe Instagram saw the Microsoft project and decided to push their product out as soon as possible, hence no Android version yet.
Sounds plausible, but the "hence no Android version yet" seems like the standard operating mode for Instagram and most startups. It's not like most come with Android versions from the start.
It’s not the same. To be honest, with this product out I’m struggling to see the usefulness of the time lapse feature. In isolation it’s maybe somewhat useful in some cases, but in a world with Hyperlapse out Apple should just remove it from the camera app and concentrate on the basics with their apps and leave the cool stuff to other people.
The usefulness (or at least coolness) of Hyperlapse is immediately obvious upon using it and it’s a good fit for smartphones, unlike the time lapse feature.
Smartphones are small and light. Standing them on an edge is in the best case precarious and in the worst case impossible (because the edges are sometimes rounded). Depending on where you prop them up (e.g. stone walls) it might also be damaging to the phone. Also, since they are so small and light (or not so much small, just not really ergonomically shaped for steady holding) holding them steady is a real challenge, especially when moving.
Hyperlapse tackles these problems with time lapse on a phone very well. It’s much more thoughtful and considerate of the hardware it’s running on than a simple time lapse feature.
(Plus, that UI is fucking amazing. One button to record, one slider to adjust the speed. This is inside the bounds of complexity Apple lets in their own very basic camera app. The slow motion feature Apple offers has similar UI complexity. So this isn’t even a super-complex feature from a user perspective.)
>Smartphones are small and light. Standing them on an edge is in the best case precarious and in the worst case impossible (because the edges are sometimes rounded). Depending on where you prop them up (e.g. stone walls) it might also be damaging to the phone. Also, since they are so small and light (or not so much small, just not really ergonomically shaped for steady holding) holding them steady is a real challenge, especially when moving.
Huh? There are 200000 mounts available for all kinds of tripods.
Exactly. As I said, “… it’s maybe somewhat useful in some cases …”
If you need to mount your device and buy a mount to actually mount your smartphone then that is and always will be a niche feature. (Also, practically no mount helps much when you’re moving the phone around.) It’s not completely useless, but it is useless to most people. Hyperlapse is a much more useful and device-appropriate implementation of the same idea, more or less, that works without the user really having to be very careful or creative or having to buy anything. Plus, the UI is super-simple and not at all confusing.
I first read about this app today, downtown, while drinking a coffee. To get decent video out of it I had to hold the camera in front of me and walk around a bit, not really trying to hold it super-steady, and I got cool results. Also, I didn’t have to buy a mount. Do the same with the time lapse feature and it’s a disaster.
Just because something is possible doesn’t mean it’s a great experience – or a great addition to the OS. My point is this: Hyperlapse is much more useful than the time lapse feature for most users, all the while not being more difficult to use.
> *Hyperlapse is a much more useful and device-appropriate implementation of the same idea*
What same idea? These do entirely different things. Hyperlapse is a time-sped-up video with intelligent image stabilization, and timelapse is, well, a timelapse.
Not the same idea or feature at all. In fact even Apple themselves describe timelapse for what it is: a timelapse feature for static scenes: "Capture the experience of the sun setting, a city street bustling, or a flower blooming with the new Time-lapse mode in Camera".
No, it’s the same, with one actually useful on smartphones and one not. The thing is, keeping a smartphone static is really fucking hard, so the correct thing to do (for a default feature – if you have to buy some crazy attachment it’s no longer a basic feature that deserves any space in the camera app) is to not force you to keep the camera static. Which Hyperlapse does, so, so, so much more elegantly, and it perfectly fits the device; time lapse is an embarrassment in comparison.
So a user will be able to combine the powers of a native time-lapse feature and stabilize it through Hyperlapse. Now that I have actually taken the time to think about it I'm truly impressed.
I guess bad gyroscope data on lots of Android devices, lots of fragmentation in OS and hardware capabilities, and fewer Android users in the market segments most valuable to advertisers all limit developers' will to ship on both platforms simultaneously.
It's similar with music apps, where Android has crappy audio APIs and latency issues. Almost all synths, samplers, etc. from professional music companies like Korg and co. are made for iOS only.
Is it not enough to have the user run a calibration step when first using the app? I've had panorama apps do this.
Edit: Bounden has a calibration step every time you use the app, and again every time it detects too much drift. If it was a quality issue, they could just warn people that they'll get crummy results if they have crummy sensors.
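For illustration, a calibration step of the kind described usually amounts to estimating the gyroscope's at-rest bias (the small non-zero reading it reports while sitting still) and subtracting it from later samples. A minimal sketch, with hypothetical names and made-up sample values:

```python
# Sketch: estimate per-axis gyro bias from stationary readings, then
# subtract it from live samples. Data shapes and values are hypothetical.

def estimate_bias(rest_samples):
    """rest_samples: list of (x, y, z) angular-velocity readings taken
    while the device is stationary. Returns the per-axis mean bias."""
    n = len(rest_samples)
    return tuple(sum(s[axis] for s in rest_samples) / n for axis in range(3))

def correct(sample, bias):
    """Remove the estimated bias from one live reading."""
    return tuple(v - b for v, b in zip(sample, bias))

rest = [(0.02, -0.01, 0.005), (0.018, -0.012, 0.004), (0.022, -0.008, 0.006)]
bias = estimate_bias(rest)
print(correct((0.52, -0.01, 0.105), bias))  # roughly (0.5, 0.0, 0.1)
```

This only handles constant bias; per-device scale-factor and drift differences are presumably what makes broad Android support harder.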
Nah, I talked to the guys who develop Horizon, and the Android APIs and devices are very spotty about gyroscopes/accelerometers. Each device needs to be developed for almost separately, and the iPhone APIs are higher-quality on these things in general.
That doesn't mean you can't do it, it's just easier on the iPhone, which sucks for us Android users.
They're getting better. But unfortunately due to the way Android phones receive updates many phones have different, older, API levels which don't support many of the features.
So while they could likely make and release an Android app, expect only the flagship phones from the last 12-24 months to be supported, nothing a day older (and even a lot of phones within that window still lack the latest API updates).
Phone manufacturers will sometimes release manufacturer-specific APIs for Android that only work on their phones.
I've always been disappointed at how/why Google allows fragmentation.
It certainly seems to me that they have the power to force a reunification in Android (and that if they did, the world would cheer).
Why not tell Samsung that Google must determine the update schedule, not Samsung? And if Samsung adds non-standard hardware, they must provide working updated code by this schedule.
And if Samsung doesn't like it, they are welcome to move everything over to their Tizen OS.
Even if Google cannot fix the past, why not do this going forward?
Google doesn't control Android after they release it. It's open software (which is why Samsung gets to "skin" it the way they do) and Samsung could add anything they want or even rip bits out. It's not Google's decision.
Edit: this is a lot of downvotes for no comments. What's wrong with my comment?
The downvotes might be because you're slightly mistaken about the level of control that Google possesses. While the code base of Android itself is open source, that means little because Google apps such as Gmail, Maps, and the all-important Play Store are not part of the AOSP code base. If a manufacturer wants to include those (and let's face it, they have to) they need to agree to several onerous terms and conditions that Google stipulates.
An example of such a condition is that if a manufacturer makes any phone that includes Google apps, they must not manufacture a phone that runs without them. This is why Amazon had such a hard time finding a manufacturer for their Fire tablet/phone. As for skinning, Samsung can only do that because Google explicitly allows it.
Lastly, I feel that someone should have pointed this out instead of downvoting you.
Looks like it depends on gyroscope data gathered while capturing video in this case. But I agree that a solution like Microsoft's (not commercially available) Hyperlapse is more ideal when you have another camera or footage already shot.
I have a category in the back of my mind labeled "Stuff Apple should do to prove that they can do it without Jobs." Top of that list is an iOS laptop. But just below that is getting back to their tradition of being a popular, easier-to-use option for professional and prosumer media/art.
Stuff like this should be coming from Apple. For example, it would be awesome if they created prosumer software for filming with a wireless multi-camera setup and editing it all live on a MacBook. I'd love it if upstart young Turks like web shows could get closer to the production quality of TV talk shows.
Apple seems to not want to get into the prosumer market because it is a tiny fraction of the size of the general iPhone/iPad/Beats consumer market. The buzz about future Apple products seems to be about things that could achieve that sort of scale. (TV, watch, something video game related, etc)
Their prosumer computers essentially only exist so that developers can enrich their mobile platform. Growth is essentially flat on the Mac line, whereas iPhone sales have grown steadily since launch.
Amusingly, Jobs actually panned the idea of a laptop with a touch screen. Doesn't mean he wouldn't have done it if he changed his mind/marketing, but he did state it was an ergonomic disaster, and the history suggests he (and Apple) believed it.
Getting a bit of hyperbole out of this (shocking, I know).
Stabilization is not something that costs $15k, at least at this quality. Yes, it's interesting on a technical level (both with Instagram and MS), but there are low-tech solutions that handle this aptly.
So in essence we're talking about a video accelerator (and presumably decelerator)? This doesn't exactly wow me, but then again Instagram was founded on applying filters to photos, so I should recognize the value of hype (and subsequent audience size).
Um, that's what I was saying. Stabilization (at least at this level) is still not a $15,000 investment. Prosumer ones start in the $500 range.
Software stabilization obviously also supplants hardware cost, which makes even that number moot. Then there's the acceleration, which can be done by low-level video production software already.
I'm seriously not trying to just poo-poo this, it's just very underwhelming as a total package.
We do this all the time, and the guidelines are as they have always been: the original title is preferred except when it is misleading or linkbait. There are also a few more detailed rules, one of which is that we take out arbitrary numbers from titles. The "$15,000" in the original title arguably trips all three of those criteria. When users complain about titles being misleading or linkbait, we particularly listen.
The arbitrary number rule very specifically calls out list type posts:
If the original title begins with a number or number + gratuitous adjective, we'd appreciate it if you'd crop it. E.g. translate "10 Ways To Do X" to "How To Do X," and "14 Amazing Ys" to "Ys." Exception: when the number is meaningful, e.g. "The 5 Platonic Solids."
The $15,000 doesn't seem to qualify.
That being said, you could certainly qualify it as linkbait, but why just take out the number? The simile is ruined, and the title just sounds stupid now. If you're going to edit it to make it not linkbait, change it to something like:
Hyperlapse, Instagram's new video stabilization app
Certainly you can moderate how you see fit, just thought I'd offer my 2 cents. I get annoyed when I come back to the same article I've already read with a different title so the inconsistency is frustrating sometimes. Sometimes arguably better titles get changed to match the source, and then when (someone else decides) the source has a bad title we change that too...
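For what it's worth, the number-cropping guideline quoted earlier can be approximated mechanically. A rough regex sketch, purely illustrative, and no substitute for the human judgment the "meaningful number" exception requires:

```python
import re

# Sketch: the title-cropping guideline as regexes. Hypothetical helper;
# the "meaningful number" exception still needs human judgment.

def crop_title(title):
    # "10 Ways To Do X" -> "How To Do X"
    t = re.sub(r'^\d+\s+Ways\s+To\s+', 'How To ', title)
    if t != title:
        return t
    # "14 Amazing Ys" -> "Ys" (leading count plus optional gratuitous adjective)
    return re.sub(r'^\d+\s+(?:[A-Z]\w+\s+)?', '', title)

print(crop_title("10 Ways To Do X"))        # → How To Do X
print(crop_title("14 Amazing Ys"))          # → Ys
print(crop_title("The 5 Platonic Solids"))  # unchanged
```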
Yes, the arbitrary number guideline should be reworded. I've added that to my todo list.
We'll change the title to your suggestion. Thanks!
All: the best way to complain about a bad title is to suggest a better one. We see hundreds of articles and titles a day—the idea of getting all of those perfect is cruel. Therefore we're delighted to get help, as long as the principles are clear: an HN title should be accurate and neutral while staying as close to the article's own words as possible.
IMO there should be two titles: the resource's given title and an editorial title, which would at least initially be provided by the submitter if they felt the given title was not descriptive. That should curtail titular editorialisations.
Honestly at first this did seem odd, but if someone can remove some level of marketeering at any point down the social aggregation line, it feels like a win.
I think that the MS Research project probably has a few advantages over this although I've only looked at Instagram's example videos on Vimeo so far.
The main advantage I see for MS Research is that they enable the full dropping of multiple frames to remove large camera motions such as having a head mounted camera and briefly looking from side to side.
In the Instagram examples, the camera stays on a fixed subject and their Hyperlapse algorithm reduces the shake. I suspect that if there were large camera motions, they would be translated into the final product which could detract a lot from the appeal of that video.
The major advantage is that Microsoft's solution just requires the video. Instagram's app needs data from a gyroscope, so if you exported the video or you filmed it on a camera without a gyroscope, you're out of luck.
The MS Research Hyperlapse and this are totally different. This is just video stabilization but it's not as computationally expensive and so can be done quickly and on a phone. MS Hyperlapse requires an absurd amount of computing power, but it goes above and beyond stabilization, creating an entire 3d model of the world and then rendering that. The downside is it has a lot of weird artifacts and jumpiness.
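The cheap-stabilization half of that contrast can be illustrated with a path-smoothing toy: smooth the per-frame camera angle with a moving average, and the per-frame correction is the difference between the smoothed and original paths. This is a generic sketch, not Instagram's actual pipeline:

```python
# Sketch: simple camera-path smoothing, the cheap alternative to full 3D
# reconstruction. Illustrative only; not either company's real algorithm.

def smooth_path(angles, radius=2):
    """angles: per-frame camera angle (radians). Returns a moving-average path."""
    smoothed = []
    for i in range(len(angles)):
        lo, hi = max(0, i - radius), min(len(angles), i + radius + 1)
        smoothed.append(sum(angles[lo:hi]) / (hi - lo))
    return smoothed

shaky = [0.0, 0.3, -0.2, 0.4, -0.1, 0.2, 0.0]
target = smooth_path(shaky)
# Rotate (or shift) each frame by this much to land on the smooth path:
corrections = [t - a for t, a in zip(target, shaky)]
```

The 3D-reconstruction approach instead synthesizes entirely new viewpoints, which is why it can look better and fail stranger at the same time.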
Hi Colin, all Hyperlapse videos get saved out to your Camera Roll, and no FB or IG account is needed to use the app (there are sharing options for those two in the app, but they aren't a requirement to use it).
Yeah, for me the biggest advantage of the Microsoft project is I don't have an iPhone and even if the Instagram software ran on my Nexus 5, I'm still primarily interested in using this with video from non-phone cameras (specifically my Canon 70D and my Sony A7).
For the curious, this is a video rendered with my iPhone 4S, and the road was very very bumpy. This is the quality you get with an old phone and with zero setup / art direction.
I had worse luck while trying to walk with it. I didn't want to be that guy recording people, so I just dangled the iPhone in my arm. And yeah. I was worried I was recording the pavement and adjusted too much.
"Our algorithm first reconstructs the 3D input camera path as well as dense, per-frame proxy geometries. We then optimize a novel camera path for the output video (shown in red) that is smooth and passes near the input cameras while ensuring that the virtual camera looks in directions that can be rendered well from the input."
Instagram:
"... Smartphones didn’t have nearly enough power to replicate video-editing software, but they did have built-in gyroscopes. On a smartphone, instead of using power-hungry algorithms to model the camera’s movement, he could measure it directly. And he could funnel those measurements through a simpler algorithm that could map one frame to the next, giving the illusion that the camera was being held steady"
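To make the quoted idea concrete: for small rotations, a pure camera rotation shifts the image by roughly the focal length (in pixels) times the rotation angle, so a measured gyro delta can be turned directly into a crop offset without any image analysis. A sketch with made-up numbers:

```python
import math

# Sketch: turning a measured frame-to-frame rotation into a crop offset.
# For small angles, a camera rotation of dtheta radians shifts the image
# by about f * dtheta pixels, where f is the focal length in pixels.
# Values below are hypothetical.

def crop_offset(dtheta_yaw, dtheta_pitch, focal_px):
    """Pixel offset that undoes a small camera rotation."""
    dx = -focal_px * math.tan(dtheta_yaw)    # horizontal shift
    dy = -focal_px * math.tan(dtheta_pitch)  # vertical shift
    return dx, dy

# A 0.5-degree jitter on a camera with a 1200 px focal length:
dx, dy = crop_offset(math.radians(0.5), 0.0, 1200)
print(round(dx, 1))  # → -10.5 (pixels)
```

This is why the app records with a margin and crops: the correction is just a per-frame shift inside that margin.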
Wow. I would never have guessed that this would be possible to do by analyzing the gyroscope data. It must be sensitive enough to detect small / tiny jerks in your phone as those seemingly tiny jerks can often translate into enormous motion in your camera lens especially if a subject is far away.
This seems like a simpler version of what Microsoft demonstrated in the paper. Between the final and original shots in Instagram's demos you can still see much of the original motion. This is especially apparent in the car clip.
What it looks like the Instagram app is doing is selecting and stabilizing individual frames instead of the full 3D scene reconstruction that the MSFT paper described. E.g. there's no pop in and out of details like in the MSFT demos, and there's still a small but noticeable judder in some of the videos.
Is this a surprise? Microsoft had the Surface like a decade before Apple had the iPhone or iPad, and nothing came of it. Another stupid mistake by not focusing on innovation but instead focusing on how to preserve their Windows monopoly.
Ah of course, Nokia, should have guessed. Interesting video on their stabilisation technology - on the 925 it doesn't seem to add that much thickness, that one is only 8.8mm thick, vs 7.7mm for iPhone 5. They should have licensed their camera tech to all manufacturers. Wonder how much of it will pop up in iPhone 6 now that they snagged one of the main PureView engineers.
I think the earlier 1020 etc were thick because of huge sensor size.
I have OIS dSLR lenses but none of them are quite as good as that Sony's Optical Steady Shot linked above - it's almost as good as some brushless gimbals I'm using on drones, which are rock steady. Sony claim it's 13 times more effective than their previous stabilisation.
There's a difference between gyroscopic adjustment at/near real-time, vs multi-frame interpolation for generation of a path view outside the camera frame/path requiring hours/weeks of processing time.
Interesting it can be done real time on the phone (I assume).
Personally I've been using free video stabilizer Deshaker for 10 years with similar (or better) results. The downside is two-pass rendering. Upside is that it is fairly configurable and can stabilize existing videos. You can get ok results with fairly terrible source video.
I was excited to download and try it, but honestly it didn't seem like it was stabilizing the frame at all...the camera shaking was still in the video...
iOS 8 has timelapse, implemented as a series of photos taken a few seconds apart and then stitched together. This appears to be a video stream that is then smoothed and then (optionally) sped up.
Looks like it only has native share options for Instagram and Facebook. Seems a bit petty not to offer Twitter. How about letting me decide where I want to share?
From an implementation point of view, this app doesn't use UIActivityViewController, but instead one they've rolled themselves.
They could in fact have added Instagram to the built-in share view controller but decided not to. Rolling their own also takes away the Twitter, SMS, and email options.
What does the year have to do with it? 2014 isn't the year Google fixed Android fragmentation. It isn't the year it managed to get OEMs and carriers to start shipping timely OS updates. It isn't the year Android users started actually buying stuff. I'm an Android user, so I'm just as frustrated by it as you, but it's a problem inherent to Android, not to the developers.
I think a significant part of this problem is that it's hard to know when choosing an Android device exactly what tradeoffs you're making. How many reviews include information about the quality and calibration of the gyroscope, motion sensor and magnetometer systems? Other than perhaps some test shots, how much do you really know about the camera quality? Are all the relevant APIs implemented, and how good a job are the drivers doing?
It's great that Android offers so much choice in terms of features and price, but the upshot is that, from the point of view of app support, you're not buying a device with an Android-sized slice of market share; for many purposes you're buying a device with a that-device-sized slice of market share.
In the old PC days the exact spec of a machine didn't usually matter much. However for mobile devices, the plethora of advanced sensors they come with and the complexity of software capabilities that take advantage of them has made fragmentation a much bigger issue, but really only if you're interested in the most advanced features.
For most phone users it doesn't make that much difference, but if you're interested in cutting-edge features the quality of hardware and software integration and application support has become critical. Samsung almost has enough critical mass to establish a stable platform for advanced Android applications, but the problem is it just doesn't have the vision and discipline to do so by actually establishing a stable baseline for the features and software in its phones.
In this particular case, they specifically say it is because they are waiting for certain Camera API changes. I suspect it is probably related to the need to gather gyroscope / accelerometer data that is keyed to the individual frames of the video.
Instagram is certainly capable of putting out apps for both platforms simultaneously. I don't think they'd skip that without a good reason.
Besides the fact that it's monumentally more difficult to support all Android platforms instead of one iOS platform: do you get Mac versions of software at the same time as Windows?
(the simple answer is 'usually no' because it would mean the porting team would have to be working with the main dev team and the launch delayed until the port was done)
I get this complaint all the time and it's getting old. What's clear to developers is usually not clear to the user: the combination of device + hardware + OS makes developing "an Android version" a royal pain in the app store.
Some of the smaller shops including solo founders and two man teams only have the resources to build one app at a time. If they're building something in their spare time, it's even harder.
It's not the end of the world. I'm sure the Android version will be developed faster than Instagram for Android was. My guess is this has to do with a lot of Camera API tuning, since it uses the gyros on the phone to stabilize. Easier to do it on iOS and get to market with a decent user base, then start figuring out Android devices one by one.
Hopefully the new Camera API in Android L makes this app possible -- you have much better control of the incoming frames with accurate timestamps, so matching them up to recorded gyroscope information and cropping/offsetting them before writing them out to the encoder would be much easier.
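The matching step that accurate timestamps enable is essentially a nearest-neighbor lookup of gyro samples per frame. A small sketch with hypothetical data shapes:

```python
import bisect

# Sketch: match each video frame to the nearest gyroscope sample by
# timestamp — the bookkeeping that accurate per-frame timestamps make
# easy. Data shapes are hypothetical.

def nearest_gyro(frame_ts, gyro):
    """gyro: list of (timestamp, reading) tuples sorted by timestamp."""
    times = [t for t, _ in gyro]
    i = bisect.bisect_left(times, frame_ts)
    # The nearest sample is either just before or just after frame_ts.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(gyro)]
    best = min(candidates, key=lambda j: abs(gyro[j][0] - frame_ts))
    return gyro[best]

gyro = [(0.00, 'a'), (0.01, 'b'), (0.02, 'c'), (0.03, 'd')]
print(nearest_gyro(0.018, gyro))  # → (0.02, 'c')
```

In practice you would interpolate between the two bracketing samples rather than pick one, but the timestamp alignment is the part the new API makes reliable.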
PDF here: https://research.microsoft.com/en-us/um/redmond/projects/hyp...
Examples here: http://research.microsoft.com/en-us/um/redmond/projects/hype...
It's even more weird that the WIRED writer didn't mention this. It was major news all over the place two weeks ago. Good PR folks at Instagram / FB.