The Promise and Confusion of USB Type-C (recode.net)
149 points by lxm on Jan 21, 2016 | 101 comments



"Eventually, most of these connections will likely become wireless."

I have my doubts. It takes a lot of effort to enable 10 Gbit/s wireless transmission. The spectrum is physically constrained by bandwidth, noise and interference, and those constraints become very evident in wireless. A fairly cheap Ethernet cable and router will carry several Gbit/s (up to 400 Gbit/s soon); that's simply impossible in the 2.4GHz band given the spectrum and the current interference situation. So you need to use a less overcrowded, shorter-range band with multiple physical antennas of large size, and you get beautiful monstrosities such as this:

http://www.amazon.com/dp/B016EWKQAQ

And the only way up from there (about 10 Gbit/s), IMO, is to move to other frequencies entirely, and lose many of the properties that make wireless convenient, like wall penetration and long range.

So I think wireless isn't a very elegant solution for high bandwidth. What would be ideal, I think, is for every outlet to have a Type-C port that you could plug your device into to get both power and high-speed internet.

Keep WiFi and other wireless technologies low-power and low-bandwidth (actually improving the interference situation), for browsing, reading remote sensors and controlling appliances -- for things and situations that are meant to be remote and wireless.

Note: this was already addressed in some comments here... I'll leave this for my own reference :)


Those antennas are just for show.

There are plenty of short-range wireless technologies currently under development that use the SHF/EHF bands to send data at very high speeds. There is plenty of bandwidth up at 60GHz. Furthermore, the processing power of affordable chips is very high, so they can control an array of antennas that send the signal exactly where the other end of the connection is (WiFi is starting to do this, and cellular networks have done this for a while). For connecting your hard drive and monitor to a nearby laptop, I think this will work fine.

The big "problem" with these bands is that they don't propagate through walls, or even your hand or a sheet of paper. But with beamforming they can go around your hand, so it's not a problem for things like connecting to monitors, in theory.

Wall penetration is a big problem for WiFi. WiFi's very limited bandwidth is shared among all stations that can hear each other. When walls block the signal, it's like you're creating a brand new Universe free of other stations and interference, and so you get free bandwidth. This is why corporate AP deployments turn _down_ the power of their access points, so they only cover a small area, thus allowing the system to have more total bandwidth.

Ideally the clients would also turn down their transmit power, but they don't because that's bad for benchmarks and reviews, so they kind of ruin this by bleeding their signals intended for the nearby AP into the range where other clients that can't hear that AP are. Fortunately things like beamforming and MU-MIMO are fixing this, by allowing receivers to move a null in their receive antenna system to where the unwanted device is, thus providing isolation from nearby devices and increasing the total bandwidth of the system. We've been promised this for years but it's still in what I think are the early stages.

(As for the use case of "I want one WiFi router to cover my whole city!" we're not there yet. You need an AP in every room, with a wired backhaul. Meshing might solve this... but it is also in the very early stages.)


The antennas are not just for show. Having multiple antennas allows you to increase bandwidth, reliability and range.

https://en.wikipedia.org/wiki/MIMO

https://en.wikipedia.org/wiki/Antenna_diversity

https://en.wikipedia.org/wiki/Beamforming


The post I replied to linked to this: http://www.amazon.com/dp/B016EWKQAQ

You do not need your antennas to look like that, that's pure marketing.

Here's an access point with 12 antennas: https://on.google.com/hub/


Oh, you're right, you can pack them more discreetly, but that should perform worse than the one I showed. That's because the coherence length of a wireless signal is fundamentally related to its wavelength -- so to get independent signals you need a certain physical separation, no matter how clever you are with antenna placement.


Antenna length and placement certainly are a huge factor in wireless performance. But those specific antennas on the linked router are thick and huge mostly because of the plastic decoration around them. As the GP said, it's marketing.


The wavelength is 6cm for 5GHz and 12cm for 2.4GHz, so a 1/2 wave dipole will be half that. (Nobody uses dipoles for WiFi, though. Undesirable radiation pattern, etc.)
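
A quick back-of-envelope check of those numbers (wavelength = c / f, and a half-wave dipole is half the wavelength), sketched in Python:

    # Wavelengths at WiFi frequencies and the corresponding half-wave dipole sizes.
    C = 299_792_458  # speed of light, m/s

    for freq_hz in (2.4e9, 5.0e9):
        wavelength_cm = C / freq_hz * 100
        print(f"{freq_hz/1e9} GHz: wavelength {wavelength_cm:.1f} cm, "
              f"half-wave dipole {wavelength_cm/2:.1f} cm")

    # 2.4 GHz: wavelength 12.5 cm, half-wave dipole 6.2 cm
    # 5.0 GHz: wavelength 6.0 cm, half-wave dipole 3.0 cm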


From what I understand, beamforming won't allow high frequencies like 60GHz to 'bend it like Beckham'. If your hand can block 60GHz, then the radio waves probably aren't bouncing around the room enough for it to make a difference. That's because, despite the name, beamforming is simply phase modulation used to alter the constructive and destructive interference patterns of the broadcast radio waves. So at 2.4GHz, where you already have radio waves coming in from multiple directions, the constructive interference patterns can give you a higher signal strength.


> "Eventually, most of these connections will likely become wireless." ... I have my doubts.

Eventually. Not tomorrow. Not next year. But it will come. If you include the full quote, it becomes in my opinion clearer what he thinks:

> Eventually, most of these connections will likely become wireless, but given the need for power and the expected challenges around delivering wireless power to many devices, it’s clear variations on USB Type-C, particularly Thunderbolt 3.0 and later iterations, will be around for some time to come.

While there are physical limitations to what wireless can do, over the last decades we've had groundbreaking research that has repeatedly increased the bandwidth available to wireless systems by orders of magnitude.

Why do you think we've hit the peak now and it can't happen again?

> The proliferation of USB Type-C clearly marks the dawn of a great new era of connectivity for our devices, but it may require a bit of homework on your part to fully enjoy it.

Fully agreed on this. In my head USB Type-C was ... just USB Type-C until I read this article.


> While there are physical limitations to what wireless can do, over the last decades we've had groundbreaking research that has repeatedly increased the bandwidth available to wireless systems by orders of magnitude.

And they have been outpaced by the bandwidth improvements on wired networking.

Why settle for some increase in speed (maybe, assuming you don't have signal quality problems – which only grow worse as adoption increases), when you can have a much larger increase in speed (guaranteed)?

Sure, wired doesn't always make sense: Anything that can be sensibly battery powered is a candidate for wireless connectivity. But if you're stringing wires anyway, why not use much more reliable wired networking while you're at it? We're putting power delivery into Ethernet and USB, and Ethernet into HDMI, so we're increasingly not even needing additional wires.


I'm not a fan of wireless because of its performance problems, but wire consolidation makes a lot of sense. For instance, it would make sense for all laptops to route the Ethernet connection, and other connections besides, through the power supply: a single wire to plug in when you go wired, and all wireless when on the move.

Docking stations sort of do that, but you lose the "lap top" ability. I think most people use their laptop in bed or on a couch when not working. And even at a table it's nice to have the wiring underneath it.


That's not a half-bad idea; you could do it relatively easily with current-gen tech by building a powerline-Ethernet adapter into the charger brick. You might need a "dock" connector that has pins broken out for the Ethernet.

Powerline Ethernet has some issues: its bandwidth is limited and it can be susceptible to interference. But when you can't run a real Ethernet cable, it's a very solid option. In particular, it's much lower latency than WiFi.


I would use a custom connector for the power adapter. Powerline Ethernet is slow and messy. Pretty much every laptop uses a custom power port anyway, so that wouldn't be a problem in itself.

In fact, the Surface does pretty much that: https://www.microsoft.com/surface/en-us/accessories/surface-... except that it comes with a very short cable, which makes it unsuitable for sofas or beds.


Again, powerline Ethernet isn't ideal. Its two selling points are (1) not having to run a new cable, and (2) significantly improved latency and packet loss compared to WiFi while maintaining reasonable bandwidth. I normally get about 70-80 Mbps across my PLE adapters. They are susceptible to picking up some interference at times, which cuts performance in half, but during those times the WiFi is unusable for any sort of realtime/latency-sensitive application. I have local GigE segments in my living room and computer room, but I would need to run a drop to connect them, and I rent.

USB-C and Thunderbolt are point-to-point bus connectors; they're not designed to have multiple clients on a network. Even if you designed a switch that could use Ethernet-over-Type-C as a physical layer, the maximum length for an active cable is only 50 feet at USB 3.0 speeds and 82 feet at 2.0 speeds. Passive cables top out at 15ft. That's not really enough for long runs inside a wall, so we're back to asking the question: OK, I plugged my USB Type-C into the wall. Now what?

If you're going to actually run a new cable to your outlets, running Type-C seems like a poor choice. You could stick a Type-C Ethernet adapter in a wall-socket form factor, run CAT-6 in the walls, and call it a day. That gets you runs of up to 330ft for GigE. But if your vision is "plug into the wall and instantly get on my network, without pulling Ethernet to every outlet", then you need something that runs over a power line, because that's the only wire guaranteed to be run to a power outlet.


Again, not powerline ethernet.

You can create a custom connector for your power plug (power supply to device) with 40-ish pins that carries all the individual wires through one cable (2 pins for power, 8 for Ethernet, 10 for USB, 19 for HDMI, and I am sure there are synergies).

Then the power supply has an Ethernet socket, a USB socket, etc., and all your accessories (printer, TV, etc.) are connected to the power supply instead of to the device.

The Surface dock does effectively that, except that I suspect they only pass a USB 3 wire along with the power, put a USB switch inside the dock, and run Ethernet over USB instead of directly.


> So that a single wire to plug when you go wired and all wireless when on the move.

That's the whole idea behind USB-C with Thunderbolt 3 and USB 3.1, yes.


Thunderbolt can accommodate many protocols, but can it run them at the same time? I.e., drive a 4K monitor, and an Ethernet connection, and power the laptop, and connect an external hard drive and printer over USB? To go the power-supply-hub route you need simultaneity.


>Eventually. Not tomorrow. Not next year. But it will come.

Physics and physical limits don't really change with time.


But naive methods give way to more clever ones.


Yes, but clever methods can't bypass physical limits either. Sometimes that's just it.


Bypassing limits is exactly what clever solutions do. We can't break physics, but we can use different methods to solve the same problem. Like wires to get into a room and a Li-Fi or laser network inside of it, or wireless between the rooms with furniture-sized antennas. Or, more likely, something far more clever.


Wireless, pff. Give me a wire any day.

At any given technology stage a wired connection will be faster than wireless. Not to mention you don't have to worry about interference, and losing the signal.

If wireless is so great, tell me this, how come wireless routers have wires _inside_ them! :-)


Li-Fi can be extremely high bandwidth.

https://en.wikipedia.org/wiki/Li-Fi

"It is so far measured to be about 100 times faster than some Wi-Fi implementations, reaching speeds of 224 gigabits per second."

I know because a colleague of mine has tried it, and yes, in practice it can work.


It only works with line of sight, though. That's not useful or practical when, in most line-of-sight situations, you could just run a cable anyway and not have to worry about potential interference.


Seems pretty practical for e.g. a docking station scenario - rather than connecting a cable, just toss your phone on the desk and it connects to your monitor/docking station.


There are multiple problems with Li-Fi. The most obvious are diffusion and diffraction, as well as line of sight (you need a bright light very close to your device). But Li-Fi also relies on there being essentially zero power fluctuation in the actual lights. That might be possible in a tech demo where the lights have been on for 5 minutes, but not for lights that have seen several months of use.

Not to mention that the power costs are immense (only a fraction of the power of Li-Fi is actually spent on data; the rest goes to powering an array of lights).


> http://www.amazon.com/dp/B016EWKQAQ

Hell with wifi, I just want one of those to hang on my wall as a sci-fi art piece.


> The crux of the problem is that not all USB Type-C connectors support all of these different capabilities and, with one important exception, it’s almost impossible for an average person to figure out what a given USB Type-C equipped device supports without doing a good deal of research.

A Google engineer has been reviewing USB Type-C cables on Amazon, in order to resolve some of the confusion[0]. His reviews have since been collected in a spreadsheet[1].

0: http://arstechnica.com/gadgets/2015/11/google-engineer-leave...

1: https://www.reddit.com/r/Nexus6P/comments/3robzo/google_spre...


Benson's reviews aren't really about capabilities. He's warning people against poorly designed A-to-C cables which identify as high-power sources, permitting the USB-C device to draw up to 3A from a USB-A port that is only required to handle 0.5A.

It's a sad state of affairs that so many cable manufacturers can't get a simple ID resistor right.


Well, the USB power specs have been an unholy mess since 3.1 was introduced.

You have your default 0.5A USB.

Then you have the battery charging spec, which goes up to about 2A (either negotiated in 0.1A increments, or in bulk via a data-pin short at the supply end).

Then you have the C port spec, which defines C-to-A or C-to-B via a resistor in the cable, and no fewer than two tiers of C-to-C via two other resistors.

And then you have the power delivery spec. It builds on the battery charging spec by allowing the voltage to go up to 20V depending on the cable, and it works with any port (A, B or C).

Basically, what is happening is that various cables are being made according to the C spec, but using a C-to-C resistor where they should be using a C-to-A/B resistor.

I suspect the confusion stems from the fact that both the battery charging spec and the C port spec use resistors -- one of them in the cable, the other in the port -- to signal that the device can draw more than 0.5A.

Why any of that is even there, when a C port has way more pins than any A or B port can deliver -- making it bleedingly obvious when you are not dealing with a C-to-C cable -- is beyond my comprehension.
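
To make the resistor coding concrete, here's a minimal sketch (in Python) of how a sink classifies the source's current advertisement from the CC-pin voltage. The nominal Rp pull-ups (56k/22k/10k to 5V) and the 5.1k Rd are from the Type-C spec as I read it; the thresholds are approximate and the function names are illustrative:

    # Sketch: a USB-C sink reading the source's current advertisement.
    # The source pulls the CC pin up through Rp (56k / 22k / 10k ohms to 5 V
    # for Default / 1.5 A / 3.0 A); the sink pulls down through Rd = 5.1k.
    # The resulting divider voltage tells the sink what it may draw.

    RD = 5.1e3  # sink-side pull-down (Rd)

    def cc_voltage(rp: float, vcc: float = 5.0) -> float:
        """Voltage the sink sees on CC for a given source pull-up Rp."""
        return vcc * RD / (RD + rp)

    def advertisement(v_cc: float) -> str:
        """Classify the CC voltage (thresholds approximate the spec's)."""
        if v_cc < 0.2:
            return "nothing attached"
        if v_cc < 0.66:
            return "Default USB power (0.5 A / 0.9 A)"
        if v_cc < 1.23:
            return "1.5 A @ 5 V"
        return "3.0 A @ 5 V"

    for rp in (56e3, 22e3, 10e3):
        v = cc_voltage(rp)
        print(f"Rp = {rp/1e3:.0f}k -> CC = {v:.2f} V -> {advertisement(v)}")

The bad A-to-C cables discussed above are essentially hard-wiring the 10k ("3.0 A") pull-up into a cable whose far end is a USB-A port that may only be good for 0.5A.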


Honestly, as complicated as it is, if you are a manufacturer it's your job to read and understand those standards.

Every spec has edge cases and little things that you need to watch out for, and if these manufacturers are missing pretty major pieces of this spec, well, I'm not confident they aren't missing parts of others as well.


Can't the owner of the USB copyright mark revoke it and sue them for using it on a non-standard implementation?

If you don't follow the spec, and nobody punishes you, don't be surprised if more people follow suit...


Many cables don't include the symbol because it is expensive to get and requires passing tests they might fail. Go look at your cables, I bet you'll find ones without it, even ones that came with more expensive products.


It's a trademark too, so if they don't enforce it then they'll lose it? Perhaps enforceability is already lost...


It's trademark, not copyright. And... ya, maybe.


Not related to the issue discussed in the article. It's about type-C supporting like 10 different speeds and modes.


I wish they had put their foot down and said any cable/device that supports USB-C must support (a) USB 3.1 and (b) USB Power Delivery.

I am looking at vendors like OnePlus and Google/LGE, whose OnePlus 2 and Nexus 5X do not support USB 3.1 transfer speeds.


No, that's a terrible idea. Not all devices need USB 3.1 speed or Power Delivery. Requiring every device to support both would lead manufacturers to avoid the standard altogether.


They could stay with Micro-USB B, though? Or perhaps at least require that all cables support USB 3.1 and power delivery?


The problem isn't the cable, it's what you're plugging into on the other end. I need to be able to plug my new phone into my old laptop.


If the USB Type-C cable were USB 3.1 compatible and the device (phone?) with the USB Type-C port were USB 3.1 compatible, I'd assume they'd still be able to talk to a USB 2.0 device (notebook PC?) on the other end. That's how I imagine USB works: negotiate the highest possible specification and degrade from there. We had a chance to standardize, to say "not only is it reversible, but we guarantee a fast route if it is Type-C on both ends".



"Just because a device has USB Type-C connectors does not mean it supports power or any other alternate mode, such as support for video standards DisplayPort or MHL (used on some smartphones to drive larger >In fact, technically, it’s even possible to have USB Type-C ports that don’t support USB 3.1, although in reality, that’s highly unlikely to ever occur."

This is actually a bigger problem than the author theorizes. Both the Nexus 6p and the Nexus 5x support USB-C on USB 2.0 rather than 3.0 or even 3.1. When USB-C computers become more prevalent, people might be sad to see their fancy device lacking the promised bandwidth they associate with the connector rather than the protocol.


I am designing some embedded devices that would only support USB 2.0, and I considered using Type C because it would be more convenient and also more compatible in a future world where most people have type-C cables.

So I am wondering why the author thinks type-C ports that don’t support USB 3.1 would be that rare.

Edit: The author might have meant the downstream-facing ports on a computer or hub; it would be rare for those Type-C ports not to support USB 3.1. That would make more sense.


For the same reason you're using USB 2.0 when you could maybe get by with USB 1.1: in a few years a big majority of the mainstream embedded chips you'll use will support USB 3.1, so you'll effectively get it for free.

I say "a few years", but I seem to recall it taking 5 or more for the transition from USB 1.1.


High speed USB 2.0 support is quite rare on cheap chips. Most are still at USB 1.1 speeds.

USB 3 connectors? WAY COOL. I'm running USB 1.1 on the inner pairs and Ethernet on the USB 3 pairs for several projects.

USB 3 standard? Oh, hell, no. The signal integrity requirements are outrageous. And most embedded chips can't even transmit at the 400+Mbps necessary to saturate even USB 2.0.


> I'm running USB 1.1 on the inner pairs and Ethernet on the USB 3 pairs for several projects

That sounds like a nice trick. Could you elaborate, or is there any public information you could link to?


Nothing proprietary. Just look at a Type C pinout.

http://arstechnica.com/gadgets/2015/01/usb-3-1-and-type-c-th...

Two TX pairs/Two RX pairs. Standard USB 2.0 in the middle.
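
To sketch the pin budget for that trick (pin names per the standard Type-C receptacle pinout as I understand it; the Ethernet-to-pair assignment is a project-specific choice, not anything in the spec):

    # Pin budget for running USB 1.1 + Ethernet over a Type-C connector.
    # Pin names follow the standard Type-C receptacle pinout; the mapping of
    # Ethernet pairs onto the SuperSpeed pairs is an arbitrary local choice.

    usb2_pair = ("A6 D+", "A7 D-")  # the classic USB data pair in the middle

    superspeed_pairs = {
        "TX1": ("A2 SSTXp1", "A3 SSTXn1"),
        "RX1": ("B11 SSRXp1", "B10 SSRXn1"),
        "TX2": ("B2 SSTXp2", "B3 SSTXn2"),
        "RX2": ("A11 SSRXp2", "A10 SSRXn2"),
    }

    # 100BASE-TX Ethernet needs 2 pairs; 1000BASE-T needs all 4.
    print(f"{len(superspeed_pairs)} spare high-speed pairs + USB on {usb2_pair}")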


Embedded chips (i.e. microcontrollers) won't support USB 3 for a long time (think 10 years at least). Very very few even support USB 2 High Speed and that was released 15 years ago.


There's a division forming in embedded. "Real embedded", like my dishwasher, uses an 8-bit microcontroller to turn pumps and valves on and off in sequence; it may never have a USB port, and for price reasons will never support anything above USB 1. It has very little stored state to talk about, and the more state and sensors, the less reliable and productive it'll be, so that's unlikely to change any time soon.

The other division of embedded is best described as duct taping a tablet computer permanently to a refrigerator, and those will have USB-C like next year. In the '80s we put TVs and VCRs into the same case and called it innovation... This is the '10s version.

One type of embedded is like industrial control, the other type is like product tying.

One segment is extremely price conscious because China sells the USA 10 million value engineered dishwashers per year, and using a microcontroller that costs $1 more to do something the market is completely uninterested in is a $10M loss which the market will not tolerate. The other segment is luxury gadgets for rich people where price is no object and sales never exceed the thousands, although the web pages are extremely elaborate and expensively designed.


I care more about symmetrical plugs than I do about bandwidth. Hallelujah! My prayers have been answered!


It's my favorite thing about the Nexus 5X. The 3A charger ain't bad either.


The new Google Pixel has a type-C on both sides. It is heaven. I can pick the pixel up with my eyes closed and always get it in (please, no jokes). And on my lap the cord can drape out either side.

These features may seem like small ones but they make a big difference in everyday convenience.


Imagine if the connector was round...


I can't remember the last time I plugged my phone into my laptop. Bandwidth for USB-C is way low on my list of reasons to like it on my 6P.

#1 for me, personally, is that it's more durable: the micro-USB ports on my phones always started to loosen up, and the cables would stop staying plugged in (yes, even after cleaning out pocket fuzz). My 6P's connection, so far, seems much more robust.

But also, more power, 3 amps is pretty sweet, and the reversible connection is very nice to have, if not really a huge deal.


That's because Google has been actively discouraging the use of the cable for data transfer in order to encourage cloud use. Your phone won't transfer to Mac or Linux at all except via ugly buggy apps you learn about after digging through online forums.


Oh? An Android phone plugged into a Ubuntu 14.04 LTS system appears as a USB drive. Android brings up a popup asking if you want to enable USB access (a basic security measure against hostile charging ports) and, if allowed, the phone's storage appears as a folder.


An Android 2.x phone, maybe. Android 4.x+ replaced the mass-storage (MSD) protocol with MTP, which nobody seems to get right, so you have FUSE filesystems like jmtpfs, mtpfs-simple, etc. to deal with your particular brand of breakage.
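
For what it's worth, jmtpfs (mentioned above) usually does get you a mountable folder; a minimal sketch, assuming the phone is plugged in with MTP enabled:

    # Sketch: mount an MTP phone as a folder using jmtpfs (FUSE), then unmount.
    import pathlib
    import subprocess

    mountpoint = pathlib.Path.home() / "phone"
    mountpoint.mkdir(exist_ok=True)

    subprocess.run(["jmtpfs", str(mountpoint)], check=True)   # mount the phone
    # ... copy your files out of mountpoint here ...
    subprocess.run(["fusermount", "-u", str(mountpoint)], check=True)  # unmount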


That was a pretty valid criticism about 2 years ago. I haven't really had any problems with MTP on Fedora or Ubuntu since about that time. Whatever gvfs is using for MTP works fine these days.


I sometimes plug my Android 4.x+ phone into a desktop with Ubuntu 14.04 LTS to copy some photos from the phone. It works fine, with no extra software or configuration required. On the graphical interface, it appears like a USB drive.


Yeah, that's not happening with my Samsung Galaxy on 4.4.2. Plug-and-play.


Maybe this is a Windows bug. Linux users don't seem to be having problems. Is it not working on Windows?


I really doubt an issue with FUSE drivers is a Windows bug since Windows does not have support for FUSE.


Agreed, I don't really care so much about USB data speeds. What I do love is how solid it feels and how fast it charges. I am very impressed with the rapid charging on my 6P.


USB is still the primary method of backing up your phone for most people, although, admittedly, a lot of people don't bother doing that.


Or finding out that my Chromebook Pixel 3 has USB Type-C, but not Thunderbolt 3... which means I can't have an eGPU.

And with the Razer Blade + Core set to actually make eGPUs a big thing... I'm pretty disappointed. I'll probably end up with a Razer Blade sooner rather than later.


I may get downvoted into oblivion for this, but I much prefer to pay for and use Apple's Lightning cable with MFi for charging.

The number of absolutely terrible USB cables, alongside disastrous chargers, is simply insane.

I hope there is a USB-C certification -- or heck, rename it USB-D or something -- that provides guarantees on speed (USB 3.1), Power Delivery, etc.

Personally I don't see much confusion with the USB-C cable itself: as long as you plug it in where the same logo is shown on both sides, it should be OK, the logo indicating whether it is carrying HDMI, Thunderbolt, etc.


First we had mSATA, and M.2 with key types A, B, C, D, E, F, G, H, I, J, K, L and M and profiles S1, S2, S3, D1, D2, D3, D4 and D5. Now we have USB Type-C, which may or may not support a handful of other standards, speeds and wattages.

What happened? Did the hw/cable vendors take over the standards bodies and "growth hack" in some defensive differentiation?


+1 for the "virtualization of physical ports" insight.

What we need now is a logo for each supported protocol on the back of every device, or above the port (where possible): power / DisplayPort / USB / Thunderbolt.


Don't worry, they'll all be there on the back of your game console in raised black on black plastic.

If it's not on the back, it will be printed in special low contrast gray paint.


"Okay, I’ll admit it — it’s not exactly the sexiest topic in tech."

I actually find USB-C one of the sexiest. As the author himself puts it, it's a "virtualization" of connectivity ports. But I think it virtualizes power delivery too. With a high-powered adapter (e.g. 80 watts) and USB PD (Power Delivery) support on both ends, I can use a single adapter with many devices. Wait a while and you'll see adapters with multiple USB-C ports, each supplying different voltages to connected devices. The possibilities are mouth-watering.

I'm currently researching building a DC-DC converter, and I'm thinking of providing a USB-C output with PD support.
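
To give a flavor of what PD support means at the wire level: a source advertises each supply option as a 32-bit Power Data Object. A sketch of encoding a fixed-supply PDO, assuming the PD spec's layout (voltage in 50mV units at bits 19:10, max current in 10mA units at bits 9:0; the flag bits are left at zero here):

    # Sketch: encode a USB PD "Fixed Supply" Power Data Object (PDO).
    # Layout (per the PD spec, as I read it): bits 31:30 = 00 for a fixed
    # supply, voltage in 50 mV units at bits 19:10, max current in 10 mA
    # units at bits 9:0. All capability/flag bits are left at 0.

    def fixed_supply_pdo(volts: float, max_amps: float) -> int:
        voltage = int(volts * 1000 / 50)     # 50 mV units
        current = int(max_amps * 1000 / 10)  # 10 mA units
        return (voltage << 10) | current

    for v, a in ((5.0, 3.0), (12.0, 3.0), (20.0, 5.0)):
        print(f"{v:4.1f} V @ {a} A -> PDO 0x{fixed_supply_pdo(v, a):08X}")

    #  5.0 V @ 3.0 A -> PDO 0x0001912C
    # 12.0 V @ 3.0 A -> PDO 0x0003C12C
    # 20.0 V @ 5.0 A -> PDO 0x000641F4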


"I can" More like "you could".

The point of an insanely complex protocol with zillions of shipped variants is to benefit top to bottom silo manufacturers (apple, etc) while destroying the market for non silo manufacturers.

USB 1 really could charge off almost any port and almost any port could access almost any flash drive. That is being eliminated other than in silo'd ecosystems.

My favorite part of Type-C is the high voltages; it's going to be fun watching Chinese-grade cables short out and utterly fry and ignite USB-connected devices. Much like the fire department inspector gets out of whack about seeing extension cords plugged into extension cords, the inspector of the future is going to condemn office buildings where USB cables of any sort are present, due to USB-C contamination.


x86 and Unix are insanely complex too and a big part of that complexity has to do with backward compatibility. I've scanned through the specs and I haven't seen any complexity just for the sake of it. Care to point out any specifics? Also note that some of the added complexity is to support new features, such as power provider/consumer switching and power safety.

As for the substandard Chinese-grade products, they've always done that and will continue to do it. The bulk of existing USB chargers have horrible ripple and power factors that seem to have been engineered to be embarrassingly low.


Even the standard C-C cables aren't all the same: some are wired for USB 2.0 (with 6 wires) and others for USB 3.1 (with 17 wires).

As one might expect, the 3.1 cables are thicker, more expensive, and pretty much impossible to find in lengths greater than 1 meter.


It's even more confusing: There are 5 different profiles for USB power delivery [1] so you have effectively 10 different types of cables with no marking whatsoever.

[1] https://en.wikipedia.org/wiki/USB#PD


USB 3.1 has a huge issue: it radiates noise all over the 2.4GHz/5GHz space, reducing WiFi/Bluetooth to almost useless.

Longer cables exacerbate the issue, because the cable then acts as a nice long antenna.


Not really. Once it's longer than 1/4 wavelength at the frequency in question (about an inch at WiFi frequencies) it doesn't matter how long it is.


With the new MacBook it seems like Apple is moving towards USB C as the does-everything connector, including for charging.

However on the iPhone and other recent mobile devices they seem to really be behind Lightning.

Is there some reason why the iPhone and MacBook couldn't or shouldn't use the same?


Ultimately Apple controls the Lightning spec, and they can add capabilities to it as they see fit without having to wait on an external body to ratify them. This is super-important for them, as the iPhone is the core of their business and they need to be able to iterate as they see fit.

One of the biggest death knells for Type-C coming to the iPhone is that the connector is bigger than Lightning. Apple continues to move towards thinner and lighter with fewer ports. There's no way they're going to move to a connector that is bigger than their current one.


USB-C is too thick to fit in the (thin) iPhone, so Apple must use something else.


That's a myth. With Lightning you additionally need space for the pins in the device; with USB Type-C the pins lie inside the port.


In March 2015, Karsten Nohl was quoted on Type-C in the context of BadUSB:

'"The additional openness and flexibility of USB Type-C comes with more attack surface," says Karsten Nohl, one of the researchers who first discovered BadUSB. "No solution for BadUSB is in sight even with this new standard."'

(source: http://www.theverge.com/2015/3/16/8226193/new-apple-macbook-... )

Any new developments or new information available on that, other than, say, iOS prompting the user on 'Trust This Computer'?


I'm certainly not excited about the idea of all the random USB devices I plug into my machine having a PCIe lane with DMA capabilities available to it. Random vendor-provided USB mass storage sticks being able to read all the system memory? What could possibly go wrong?

In theory, IOMMU can mitigate these risks. In practice, barely any OS actually enables those protections, and AFAIK the CPU manufacturers (at least Intel) are still using availability of IOMMU as a differentiating factor for high-end CPUs.


I recall being utterly astonished to read[0] that the USB C cable that comes with the latest MacBook only supports USB 2 speeds!

"You can also use the USB-C Charge Cable to transfer data at USB 2.0 speeds between your MacBook and another USB-C device."

[0] https://support.apple.com/en-gb/HT204360


OK, so if I have this straight:

* USB Type-C is the physical connector, and in practice usually implies USB 3.1 gen-2 support

* USB 3.1 gen-2 is a signalling specification which depending on the device will support some combination of USB 3.0 data transfer, advanced power/charging capability, video signals, and PCIe lanes

* USB 3.1 gen 1 is just USB 3.0

* However, Type-C devices can fall back to USB 2.0/1.1 speeds if they don't support super-/hi-speed transfer, or if the cable is not wired with the extra pins for 3.0.

Allowing Type-C cables to be sold with only the 6 wires for 2.0 instead of the 17 wires for 3.1, in particular, is a real head-scratcher.

Also, renaming 3.0 to 3.1 Gen 1 is a bonehead move since they're totally different. One is a data transfer standard while the other is a Thunderbolt-style multi-stream connector. In practice that makes the "USB 3.1" designation entirely meaningless.
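
A cheat sheet of the naming shuffle, for anyone keeping score (rates per the USB-IF's 2016-era branding):

    # The USB-IF naming shuffle, circa early 2016: name -> signalling rate.
    usb_naming = {
        "USB 2.0 (Hi-Speed)":          "480 Mbps",
        "USB 3.0 (SuperSpeed)":        "5 Gbps",
        "USB 3.1 Gen 1":               "5 Gbps (USB 3.0, renamed)",
        "USB 3.1 Gen 2 (SuperSpeed+)": "10 Gbps",
    }
    for name, rate in usb_naming.items():
        print(f"{name:<30} {rate}")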


It's not the cable that matters, but rather what kind of "controllers" are connected at each end of it. That causes a lot of confusion, as people think of all those features as being cable-based rather than controller-based.


The progression of USB Type-C is simply going to further splinter the web of port connectors, as we enter a season where Thunderbolt 3.0 is already confusing everyday users relative to Thunderbolt 1 and 2.

What we need is a single identifier for a wireless standard, not continued proliferation of ports and the icon-design problems that come with them. USB was supposed to be the universal port; even if, as the author suggests, this is the dawn of a new era of connectivity, what we really should be relishing are new steps in wireless methodology.


Wireless is a pipe dream. You can't power something wirelessly over any reasonable distance with any reasonable power draw... furthermore, the bandwidth capacity needed just keeps increasing such that even wired standards have a hard time keeping up. Ever tried running a 4k monitor over HDMI? Older versions couldn't even do it.

So, you need a cable for power, and you need that cable for video, you might as well make it bring over all your data, too.


In addition to the sibling comment: there is only so much wireless bandwidth, and crowding it with several point-to-point mini wireless networks is a surefire way to get poor user experiences due to interference. Wires can use all of their bandwidth with very little interference to other wires.


Speed is an issue too. The 2015 MacBook has USB-C at 3.0 speeds (which is branded "USB 3.1 Gen 1" or something; it's not the new high-speed mode 3.1 adds, see: http://arstechnica.com/gadgets/2015/08/what-the-usb-if-is-do...)


> Also, full-bandwidth Thunderbolt 3 cables can be expensive, because they require active electronics inside them.

So far, all Thunderbolt cables have needed active electronics. Now you can use cheap full-featured USB-C cables for Thunderbolt (at reduced speed).

> it’s even possible to have USB Type-C ports that don’t support USB 3.1, although in reality, that’s highly unlikely to ever occur.

The cited MacBook doesn't support 10Gbit/s.


How I wish there were some nice diagrams in this article explaining which standards are supersets of which, and what the different connectors look like.


I haven't gone searching for answers, but purely from speculation, the sturdiness of USB Type-C ports seems like a regression in comparison to USB 1/2/3 ports. I love how rugged USB ports have felt in the past (that is, before Type-C).


I find the opposite. USB Type-C ports seem more sturdy to me than prior USB standards.

It's possible that's because my Type-C hardware is newer, and the other hardware I'm comparing it to has years of use. I find that with regular use, connectors and ports naturally loosen up over time.

One other possibility might be spec compliance. A lot of the third-party Type-C hardware out there right now isn't fully compliant. If you're shopping for Type-C adapters or cords, definitely only buy stuff that's 100% compliant.


Funny, because data says otherwise:

Original USB: 1,500 connect-disconnect cycles

Mini USB: 5000 cycles

Micro-USB: 10,000 cycles

USB Type-C: 10,000 cycles

[0] http://www.anandtech.com/show/8377/usb-typec-connector-speci...


Connect-Disconnect cycle performance doesn't have much to do with a cable's "rugged"-ness, i.e. robustness to normal stressors encountered during normal use. I've destroyed a million percent more Micro-USB cables than previous USB standard cables because the connectors are prone to irrevocably deforming when stepped on or when the cable gets lightly tugged up or down when plugged into a device.

Simply put, connect-disconnect cycles aren't the biggest threat that USB style cables encounter in normal use, and so this metric has little to say on the topic of cable ruggedness.


Surviving connect cycles is very different from ruggedness. The power supply to my 2015 Pixel (60 watts, btw) goes over Type-C; I accidentally stood up with it plugged in, and the wires were pulled out of the connector. I didn't feel much resistance. The connector was still in the socket, and the laptop was fine.

The worst part is that the cable is hard-wired to the supply, so I had to throw out a very expensive supply instead of just replacing a cable. I had to pay for a real Google supply because it is the only 60W one available; the second closest is Apple's at 35W.

And the Google supply comes into stock once a month and sells out within two days each time. I had to wait and check the site twice a day. I suspect people buy them for other devices, since they can't be matched.


I can't really comment on the ports, but the connector at least seems like a step up from micro-USB connectors for phones etc. Those have always seemed flimsy to me, and I've damaged a couple trying to plug them in backwards. I don't foresee having those issues with Type-C (especially as it is reversible as well).


Have you used a USB type-c device? They look small, but they have a really strong hold and feel much stronger than any other connector type I've used.


How would you compare them to Apple's Lightning connectors in that regard?


I find the sturdy feeling is mostly derived from the specific devices in question. My "main" laptop has looser ports than my old netbook; when I connect some brands of flash drive, they tilt a little bit out of the port. But with other brands the fit is solid. In all cases we're talking about USB2 devices.



