One must wonder why they chose to expend significant R&D to make a complicated, power-sucking adapter for use on a mobile device that provides sub-par output and costs quite a bit more than a "dumb" cable.
Maybe I'm missing something, but I feel like Lightning is just the next Firewire without all the usefulness that FW had (low latency, dedicated bus that wasn't USB, lots of pro audio gear that worked with it, ability to daisy-chain, faster real-world speeds than USB, near-ubiquity before FW800 came out, super small form factor (the 4pin is tiny; not much bigger than a lightning)).
The Lightning connector doesn't have enough bandwidth to drive an HDMI connection directly.
Of course, compressing (inside the iPad) for streaming, sending this through a limited-bandwidth channel and then decompressing is complicated, but it looks like it's the easiest solution (because all the components - especially the software ones - are off the shelf).
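Some rough back-of-the-envelope math (my own estimates, not anything Apple has published) on why compression is unavoidable if the link really is USB 2.0-class:

    # Raw 1080p60 video vs. a USB 2.0-class link. All figures are rough
    # public numbers, nothing here is a confirmed Lightning spec.
    width, height, fps = 1920, 1080, 60
    bits_per_pixel = 24                       # 8-bit RGB, ignoring blanking/overhead

    raw_bps = width * height * fps * bits_per_pixel
    usb2_effective_bps = 280e6                # ~280 Mbit/s real-world USB 2.0 throughput

    print(f"raw 1080p60: {raw_bps / 1e9:.2f} Gbit/s")                   # ~2.99 Gbit/s
    print(f"USB 2.0-class link: {usb2_effective_bps / 1e6:.0f} Mbit/s")
    print(f"compression needed: ~{raw_bps / usb2_effective_bps:.0f}x")  # ~11x

So even before HDMI's own overhead, you need roughly an order of magnitude of compression if the link is in USB 2.0 territory.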
It's like a mini Raspberry Pi in that adapter, amazing! (if slightly overpriced)
Apple could be losing a decent amount of money on each adapter sold. They can afford to lose money on an adapter few will buy (in comparison to devices sold), especially while they're now making money on all the new third-party Lightning accessories.
Hopefully it's in pursuit of a connector that's somewhat forward & backward compatible for the next decade. It would be nice to learn more about Lightning. In particular the 1600x900 limitation is irritating. It'd be reassuring if this is something that will be addressed with a firmware update (similarly with the artifacts, though I have somewhat less hope for improvements there).
I'm gung-ho for Lightning but if it turns out to be nothing more than a fancy way to make USB expensive and proprietary I'll be having another think. If it guarantees accessory compatibility for 6-10 years it's worth a couple compromises but I do expect 1080P output as advertised.
After careful parsing of Apple's spec language, I now believe that mirroring from sub-1080p devices (iPad mini, iPhone, iPod touch) is limited to 1600x900, but that 1080p video playback works (presumably without lame encode artifacts). It appears full-size iPads will mirror at 1080p.
The form factor of Lightning is top notch - all of the USB variants seem terribly designed after using it. Too bad Lightning isn't free to license like Firewire was.
That said, the design for this AV adapter is puzzling. I wonder what the hardware limitation was that prevented straight digital video out. (# of pins?)
There are definitely not enough pins to do straight pin-for-pin HDMI through Lightning. Of course, if you're going to go to the trouble to do what they did, you may as well bitstream it through the adapter, ...
According to Wikipedia MHL allows proprietary connectors, and MHL-through-microUSB is most often done with 5 pins. Lightning is 8 pins + shield, so it would have been possible for Apple to put HDMI signalling inside the devices, and get native 1080p/60 out.
However, MHL-through-Lightning might have prevented the AV adapter from having a daisy-chained Lightning socket of its own. Maybe that's why Apple jumped through these hoops.
Reading the reviews on the Apple site, the passthrough Lightning connector on the adapter is apparently charging-only, which you can do just fine with MHL-through-microUSB IIRC.
The current design prevents users from accessing the actual 1080p video stream from the device, too.
The cynic in me says this is an end-run around any DRM issues giving access to that 1080p bitstream could introduce: instead let's just output a degraded, upscaled stream and call it HD.
This isn't an answer, and in fact, supports the opposite of your point :)
It would have been just as easy to use micro-USB 3.0, which has plenty of bandwidth, and do the exact same pin or format conversions.
Standardized serial buses that are less complicated already existed. What exactly do you think USB 3.0 is?
If the only reason for Lightning was "a general-purpose next-generation serial format", then it is definitely a horrible idea.
Of course, none of this (except DRM concerns) answers why they'd not bitstream the format and convert it to HDMI signaling instead of doing the weird crap they did here.
>One must wonder why they chose to expend significant R&D to make a complicated, power-sucking adapter for use on a mobile device that provides sub-par output and costs quite a bit more than a "dumb" cable.
And people have answered this convincingly. From comments on TFA:
= = = = =
The reason behind this stupidly expensive solution: it seems that the Lightning connector is similar in idea to the Thunderbolt connector.
With USB you have Power/USB/MHL output. Each method requires a specific combination of pins. Updating to a different protocol or supporting other devices requires difficult pin sharing (and thus hardware modification), and support for currently unknown future connections is impossible. If something new appears in the future, it won't work with older USB devices.
With Lightning you can connect anything. Similar to Thunderbolt, it does not use a specific protocol to handle a specific task. It uses a general method to communicate between adapter and iOS device. That way Apple can send anything through this Lightning adapter. The adapter itself gets updated automatically because it loads its firmware from the connected device. So Apple can send USB 2.0, USB 3.0, analog audio, digital audio, HDMI, and any new standard through it, in theory.
The idea is great: the Lightning adapter will be Apple's only port and it will remain future-proof.
In reality, however, the Lightning connector also has limited bandwidth, so they have to compress their data; the conversion to and from the Apple Lightning protocol requires lots of processing power on both sides and reduces quality. And finally, it's expensive.
Apple should have tried to use Thunderbolt instead of their own proprietary technology, but then they would have been using a common technology that other manufacturers could use and benefit from too. That's not the way Apple likes it, so they introduced their own thing, which is crap right now.
AND:
Just remember, this is v1 hardware for a tough future-proofing problem. I genuinely appreciate the lengths Apple went to in order to A) create a reversible plug, so you can stop mashing and flipping, and B) handle all the connector options.
The built-in decoder module could take a compatible 1080p video stream and push that to the TV at full resolution; the screen mirroring is a separate protocol. Now I'm curious if the Lightning-SVGA adapter has the same rig!
So, v1 of the Lightning connector is probably (I'm assuming) running at USB 2.0 speeds. In v2 of the connector we could see USB 3.0 (5 Gbit/s) or Thunderbolt (10 Gbit/s) speeds, but all the accessories would still work.
Assuming the faster speeds come, it’ll support 4K TV output video streams, same plug. Yay future.
The reason you can't just serialize everything instead of using different pin functions is that it's both ridiculously complex and the bandwidth is simply not there. Apparently Lightning can't even serialize 1080p, so they have to do lossy compression. This is not a connector, it's an active data corruption stick. And then they even try to hide it by horribly upscaling to 1080p.
Thunderbolt definitely is designed as a replacement for Firewire, USB, 10 Gbit Ethernet, HDMI, and all other current wired connections. It's been advertised as such for a while.
I first heard about Lightning just 2 days ago. I'm now confused between Thunderbolt and Lightning, and frustrated that I'll have to go learn about their differences.
So Lightning is an 8-pin replacement for the 30-pin connector.
And Thunderbolt is still the all-in-one replacement for all wired connections. So apparently the purpose behind the Thunderbolt connector subsumes the purpose behind the Lightning connector, yet the existence of Thunderbolt does not obviate Lightning's existence.
Nicely done. The really frightening thing for me is the clunky move to wireless. There are about 20 Macs in the office, and when half the keyboard batteries go flat after a long weekend and you need to quickly check something, scrounging the building for a USB keyboard is tiring. This, combined with AirPlay's momentary dropouts and refusals to start (infrequent but annoying), makes me sad. It's like we're so close to the cableless world, but just far enough away that it's awful.
Because once you accept Lightning as the connector, there is no "dumb". You have USB and this is where you have to start. You don't get DVI/HDMI signals, analog audio, or anything else.
Someone isn't practiced enough at weasel words yet. Had they said 'mirrors in its entirety what you see on your device', they would have been accurate while misleading; as it stands, they're just wrong.
So instead of just going with USB 3.0 like other phone manufacturers and having native HDMI, they created their own interface that results in a picture with noticeable artifacts.
The Micro USB plug is an abomination, only exceeded in horribleness by Apple’s predecessor to the Lightning plug, the Dock Connector.
Hardly anyone will use the HDMI out. Prioritizing the plug seems like the correct tradeoff to me. And make no mistake, this is what this is. It’s always about tradeoffs.
Most non-Apple smartphones have micro-USB ports for charging, and most made in the past few years also support sending an HDMI signal through this port, either via MHL [1] or SlimPort (notably, Nexus 4). All it takes is a $10 adapter cable rather than a $40 SoC transcoding things.
They also have hardware in the phone to generate HDMI (although electrically modified to fit on the USB connector). Given that only a tiny percentage of people will ever plug their phone into HDMI, it makes sense to have that minority pony up $40 instead of making everyone pay an extra dollar for hardware they will never use.
But if they cared about cutting costs they wouldn't have invented their own cable to begin with. When has Apple ever worried about making people pay for things they don't use?
I'm disappointed because Lightning was billed as the next innovation in design, but if the picture quality is worse, then what's the point?
Apple has no problem making people pay – but they do have a problem making people pay for stuff they don't use. Nothing to do with costs per se; it's more a philosophical stance.
I'm not sure they even bothered to create their own interface. All the evidence so far is consistent with Lightning being USB 2.0 OTG plus a few extra pins for a lockout chip bundled into a proprietary connector. (More accurately, the usual multiplexed USB 2.0 and TTL serial that smartphones tend to have.)
I think what makes this particular dongle odd is that unlike most active cables, which contain just an encoder/decoder chip, Apple has put in a full ARM SoC.
If bored, you could try probing those various little gold test pads on the off chance they have a debug port[1] somewhere. Then again, Apple paranoia makes that overwhelmingly unlikely.
If this thing decodes the video stream and gets the code it runs from the iOS device when connected, might we see a future iOS update that enables full HD, or would the video decoder hardware be the limiting factor here?
Also, for jailbreakers: find that code, and improve it.
Finally: I find it a pity that Panic reports this now. It would have been funny if it were published exactly one month later (or is the clock in their CMS off by a month?)
I heard somewhere that the Apple Lightning cable that comes with newer iPods, iPhones, and iPads contains a chip and will be very hard to replicate (by Chinese cloners). Any insight on this?
Having just spent an evening trying to sort a strange video issue which turned out to be an intermittent issue with a component cable, the thought of this isn't nice.
I think they're way off on the theory the cable is running iOS.
I think what's most likely is that the phone is putting out an H.264 stream and the chip is just decoding it, rather than running a whole iOS stack and AirPlay protocol.
Sure, this shares some similarities with how AirPlay works (showing an H.264 stream from the device), but it wouldn't require a whole system-on-a-chip just for showing this feed.
Why not? AirPlay is a pretty high-level protocol, and it's got an ARM SoC, so the natural thing for Apple to do would be to use the ARM version of XNU with a minimal userland that has their AirPlay implementation and whatever codecs it needs.
At least, if someone handed me one of these dongles with a copy of the protocol spec and asked me to write the software for it, I'd do something similar, except using Linux. (Sidenote: buildroot is great for creating embedded systems running Linux.)
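To make that concrete, the userland could be little more than a receive-and-decode loop along these lines (a sketch of what I'd write; /dev/host_link, the length-prefixed framing, and the hw_decoder module are all made up, standing in for whatever the real driver and decoder API would be):

    # Sketch of a minimal "read stream from host, hand to decoder" userland.
    # The device node, framing, and decoder binding are hypothetical.
    import struct

    import hw_decoder  # hypothetical binding to a hardware H.264 decoder + HDMI out

    def main():
        dec = hw_decoder.open(output="hdmi")
        with open("/dev/host_link", "rb", buffering=0) as link:
            while True:
                header = link.read(4)
                if len(header) < 4:
                    break                                  # host disconnected
                (length,) = struct.unpack(">I", header)    # 4-byte big-endian frame length
                dec.push(link.read(length))                # decoder scans out to HDMI

    if __name__ == "__main__":
        main()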
This provides no useful information whatsoever. I wouldn't be hugely surprised if they wrote code for the ARM MCU (it has 256MB of RAM to work with), but what does "stub copy of iOS" even mean?
Also. Jimmy Hoffa is buried inside that microcontroller. I won't corroborate this further, but this is the dead-on truth.
No need to be sarcastic. I find it interesting that somebody goes through the trouble of creating a new user account, just to confirm a theory that nobody could confirm without getting in trouble.
It doesn't prove anything, but it certainly fits the context. I would say the theory (and the extra bit of information) merit being analyzed further.
FWIW, a "stub copy of iOS" would mean in this context "a copy of iOS where every OS function is stubbed except what's necessary to run AirPlay."
Apple has fired people over less. /. has a long-standing tradition of ACs breaking rank and posting insider stuff; I see no reason why the same couldn't happen on HN.
It makes a lot of sense to me now. The adapter is tied to iOS's supported video formats. If it will play on the device, it will play on the adapter. The downside being that mirrored content has to go through the not-awesome encoder (at least on the A5). Using the same optimized software stack and shared SoC logic probably eliminates a bunch of headaches and saves money.
If it really works like that it's kind of brilliant. Future devices will have better mirroring output due to better hardware encoders (I haven't scrutinized the mirrored output from an A6, it may already be a lot better than the output pictured in the post). It ought to work with every Lightning device. The only question is future video format support, but since it's limited to 1080P it's probably future proof enough.
Right, definitely just the video transport part of AirPlay. This is exactly how Miracast works too, except that's over a direct wifi connection between 2 devices.
Someone is very confused as to what "AirPlay" is. This doesn't make any sense. AirPlay is a network protocol tied to mDNS/Bonjour. There's no way the Lightning controller is supporting an entire network stack plus video transcoding. Calling this only half-thought-out seems generous. I thought it was well known that these were smart cables. Thunderbolt is the same way. At best the SoC is some sort of smarter adapter.
I honestly don't know. Someone is somehow implying that the adapter is implementing a full stack capable of being an AirPlay receiver and transcoding video on the fly. I think Cabel is wrong or has explained what they think is occurring incorrectly. /shrug
In this case, AirPlay is our easier-to-understand way of saying "the results you're getting are exactly the same as when you stream iOS video using AirPlay". (Which is really weird for a video-out dongle! Especially since the former, non-Lightning one did proper video out.)
I've tweaked the post to make it clearer that AirPlay isn't _necessarily_ the _exact_ mechanism being used! It could just be H.264 or MPEG or whatever.
Mostly I'd love to know exactly what this chip/system does, so if anyone here with far more advanced hardware knowledge than any of us have feels up to hacking around, that'd be amazing! :)
The key takeaways from the post are:
1. It feels unusual that an AV adapter would have a full ARM-based SoC with RAM etc., not just a video encoder/decoder chip. Is it unusual? Let me know. :)
2. It's a bummer that the Video Out isn't very good, and not true 1080p, and has MPEG artifacts.
> Mostly I'd love to know exactly what this chip/system does, so if anyone here with far more advanced hardware knowledge than any of us have feels up to hacking around, that'd be amazing! :)
It's converting an encoded stream to HDMI output. There aren't enough pins on a Lightning connector to directly output HDMI, and even if there were, you'd still need a transceiver somewhere (not particularly trivial in terms of space or power to stuff into the phone).
The SoC is most likely a little ARM core to manage things (Cortex-M0/M3) with a video decoder and HDMI transceiver; the 256MB of RAM is there primarily for the decoder to use. It also needs to mux the audio stream into the HDMI encoding.
As for 'it runs iOS', you're really splitting hairs; it's pretty normal for the master device to load a slave device with its firmware when plugged in (rather than storing the firmware in flash on the slave). The fact that they would use some knocked-down version of iOS isn't terribly surprising. Embedded versions of more powerful OSes are used all the time; there is a non-zero chance your stove and microwave are 'running Linux'.
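The host side of that load-firmware-at-connect pattern is usually nothing exotic, something on the order of this sketch (the device path, command bytes, and framing are invented for illustration; this is the generic pattern, not Apple's actual protocol):

    # Host pushing a firmware image to an accessory's RAM at connect time.
    # Everything concrete here (paths, command bytes) is made up.
    CHUNK = 4096

    def load_firmware(link_path="/dev/accessory_link", image_path="adapter_fw.bin"):
        with open(image_path, "rb") as fw, open(link_path, "r+b", buffering=0) as link:
            link.write(b"\x01")                              # hypothetical "enter download mode"
            while True:
                chunk = fw.read(CHUNK)
                if not chunk:
                    break
                link.write(len(chunk).to_bytes(2, "big") + chunk)
            link.write(b"\x02")                              # hypothetical "boot from RAM"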
> There aren't enough pins on a Lightning connector to directly output HDMI, and even if there were, you'd still need a transceiver somewhere (not particularly trivial in terms of space or power to stuff into the phone).
MicroUSB+MHL accomplishes the same thing quite well on the current crop of Android phones. Why Apple chose the method they did is still quite odd.
MHL is an option, but it only fixes the number-of-pins issue. It has the same problem as HDMI out: you need a transceiver capable of multi-Gbit data rates. You also have to get the video output to the transceiver. It's not the easiest thing to do inside a phone, and it's really not the easiest thing to do when your next phone release is focused on 'thinner and lighter'.
Apple already made a strong commitment to AirPlay, so they already had a focus on building a fast, smooth, low-power encoder that could encode the entire screen. Once encoded, the stream is probably only a few Mbit/s, a data rate that can easily be transmitted with single-ended protocols like SPI. Almost all SoCs already have multiple SPI buses, so there's no need to change any hardware.
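To put rough numbers on that (these bitrates are my guesses, not measured Lightning figures):

    # An encoded mirroring stream vs. what a modest SPI link can carry.
    encoded_stream_mbps = 8        # plausible H.264 bitrate for ~1600x900 mirroring
    audio_mbps = 0.3               # AAC or similar
    total_mbps = encoded_stream_mbps + audio_mbps

    spi_clock_mhz = 24             # an unremarkable SPI clock
    spi_mbps = spi_clock_mhz       # 1 bit per clock on a single data line

    print(f"stream needs ~{total_mbps:.1f} Mbit/s")
    print(f"24 MHz SPI offers ~{spi_mbps} Mbit/s, headroom ~{spi_mbps / total_mbps:.1f}x")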
Their solution may seem inelegant to some, but I think it is great. They managed to support a feature with almost zero hardware cost on the core device (depending on how you account for the Lightning connector), a feature that I'm willing to bet only a very small percentage of users will ever use. That is a big win: not having to dump extra hardware into a device that only 5% of users will ever activate does great things for your margin and design flexibility.
> There is a non-zero chance your stove and microwave are 'running linux'.
Unless Linux comes with a BSD license, there is, in fact, zero chance. Apple is known to run a NetBSD variant on its AirPort routers - I'd say that's what's likely here, or whatever the hell a "stub version of iOS" means.
There is a false assumption that manufacturers don't use Linux because of the GPL. Most products don't use Linux because they don't need it, and they use something like FreeRTOS or another commercial RTOS. Plenty of companies use Linux in all sorts of home appliances, mostly TV products, but an increasing number of fridges from companies like Samsung do have Linux and they do release the code. What manufacturers don't do is release the applications that they run on the OS (despite the false assumption that they should).
The GPL requires that any changes (made to code used in released products) be released under the GPL. I'm unaware of any home appliance manufacturer that freely releases the OS of their products under the GPL, as they would have to if they were using Linux.
If they did, they would probably do it the way Amazon does with Kindle or Samsung does with Android. A giant tarball you can download from somewhere on their site, but not necessarily easy to just know about (like a link on the homepage).
If you go here: http://opensource.samsung.com you can see that they have the OS for some TVs. This is the typical way they comply (with unhelpful, unannotated giant dumps).
It's not that weird at all, in a sense. With this mystery cleared up, all the evidence is basically consistent with "Lightning" being just USB 2.0 over a proprietary connector plus some proprietary authentication chips in the cables. There is literally nothing it does that could not be done with a standard micro USB 2.0 connector.
Knowing what we know now, that blog post is kind of amusing. Apple's Lightning appears to be less sophisticated than the MHL video out over micro-USB connectors it complains about. It's like Apple said "screw trying to get proper video out, we'll just cram it all down USB 2.0 and lossily compress the hell out of it to get it to fit, then stick an entire ARM SoC in the adapter to decompress it again" - it's the most kludgy solution imaginable. (I have no idea whether Lightning actually uses USB 2.0, but it seems to be in the same ballpark bandwidth-wise.)
Yeah, someone mentioned something similar elsewhere in the thread and it certainly makes sense to me. It fits with the comparison to Miracast and the need for hardware support for full-screen mirroring over AirPlay. I can see the adapter processing a stream in a format similar to AirPlay's (without being a classic "AirPlay receiver", which is what I was hung up on).
For your two points:
1. I think that comes with the territory of being a Lightning accessory and makes sense if they're doing some sort of compression beforehand. It certainly fits with Apple's persona: control the content, don't let just anybody create a video adapter. They have to deal with Apple's handshake/stream/whatever because Apple doesn't pin out HDMI on the Lightning connector...
2. That is a major bummer and seems like one of those "details" that Apple would get right. I'm certainly curious the more I think about it as well.
No, I understand. "Someone is confused about AirPlay" was more accusatory than I meant. I genuinely don't know if Cabel knows much about AirPlay; I only know what I know from working with it and UPnP/DLNA for the last several months, and that made this seem unlikely to me. He may well know more than I do, and I'll learn something new and useful. :)
The 30-pin dock connector had 30 pins so it could put video out directly, and things like that.
Lightning is a SERIAL FORMAT with 9 pins. So it streams audio and video out in an encoded form.
The AV adapter needs to take that audio and video and turn it into a standardized AV format for the AV plugs.
Now, rather than a lot of odd incompatibilities because Apple added new features to new devices that older docks don't support, we have a common communication format in Lightning that should be much more robust going forward.
Apple can add whatever protocols it needs over the serial connection to support future tech, rather than the old way of redefining what some of those 30 pins meant from period to period -- remember, the 30-pin connector started out in a time when FireWire was taking up some of those pins!
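The multiplexing idea is simple enough to sketch. This is generic framing for illustration only; Apple's actual Lightning wire format is undocumented, and these protocol IDs and header layout are made up:

    # Generic "many protocols over one serial pipe" framing, for illustration.
    import struct

    PROTO_AUDIO, PROTO_VIDEO, PROTO_CONTROL = 0x01, 0x02, 0x03

    def frame(proto_id, payload):
        # 1-byte protocol ID + 4-byte payload length, then the payload itself
        return struct.pack(">BI", proto_id, len(payload)) + payload

    def parse(stream):
        while True:
            header = stream.read(5)
            if len(header) < 5:
                return
            proto_id, length = struct.unpack(">BI", header)
            yield proto_id, stream.read(length)

    # A future device could add PROTO_WHATEVER = 0x42 without touching the pins.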
People like to ascribe nefarious purposes to Apple or claim Apple is "ripping them off" because a small computer that does digital AV conversion costs $30... and they don't realize that the 30-pin connector didn't do any conversion of formats; it just brought the signals out to a standard connector. This one actually has to do work, which is why it has a SoC on it.
People don't realize that the 30-pin connector didn't do any conversion of formats because they don't care. They want a connector, and what Apple delivered is a power-sipping SoC that actively throws information away, given the artifacts seen, and then upscales to 1080p because they didn't even have enough throughput to get lossily compressed 1080p through the bus.
So yes, you get your $30 worth of components, but it's horribly inferior to a $5 adapter you can get for every other connector on the planet. It's a rip-off all right.
A rip-off in the sense that it's crappy output, not in that they're overcharging. I've seen mention that Apple is likely making no money, or actually losing money, on this cable.
I really don't get the logic. As if the cable were the thing you'd absolutely want to keep going from one version of the iPhone to the next. If a new technology comes out (let's say 4K output), you'd have to upgrade memory and CPU to handle it anyway, along with getting a new TV or monitor and pretty much everything else along the chain.
Hypothetically being able to keep using the same cable really isn't the problem anyway.
Which goes absolutely nowhere toward explaining why my Pioneer car stereo head unit, which used to allow remote control of my iPhone's music collection and a Pandora app, no longer works with the iPhone 5 - "iPod Out" would be one of the first protocols you'd expect to be supported, even out of the box, but no, it's not, rendering an $800 head unit a lot less useful.
Perhaps Apple should think of such things first, before "future proofing" things.
Despite your defense of them, and whatever its merits... this change has the benefit of netting Apple a healthy profit and rendering billions of dollars of accessories obsolete.