The fact that every port used to be different is actually a feature: it helps you identify what a device supports based on the ports it has. If you see an HDMI port on a laptop, you can be sure it's going to output some kind of video signal over it. If you see Thunderbolt, then you're sure the machine supports Thunderbolt.
USB-C just jams all these incompatible standards into a pin-count-constrained port (so they can't even all fit, and trade-offs need to be made) without any way for the user to tell, or select, which features are enabled on which port.
My 12-inch MacBook, for example, has a USB-C port just like the MacBook Air and Pro, and yet if I plug in a Thunderbolt Display it won't work on the 12-inch one but will on the Air and Pro. As a user there is no easy way to tell that the 12-inch MacBook's USB-C port is different from the Air's and Pro's unless you explicitly search the tech specs (which is counter-intuitive, as the whole point of USB-C is to be universal, so in an ideal world you wouldn't even think to search for that).
Once this is sorted out we would have a wonderful world. Using a single Thunderbolt port I connect my notebook to a dock and immediately convert the small notebook into a desktop with three external monitors plus many other ports. I can also magically use the same USB-C hub I use with the notebook on my mobile phone to add a keyboard and mouse plus an HDMI monitor. I have never seen this in the past: my previous notebook needed a specific dock and had a two-monitor limit (with three different display ports available), while the phone needed a specific accessory/cable to connect to an external monitor.
And we have not started talking about eGPUs yet ;-)
I tried buying into this, got burned. Dell XPS Dev Edition, top of the line Dell Thunderbolt Dock, should work great, I thought. Nothing but trouble, tons of complaints online, Dell refused a refund as it's "not defective" - it just doesn't work with Ubuntu well. The OS preinstalled on _their_ hardware. Most of the complaints about the Dell Thunderbolt docks are from Windows users, to top it all off.
My point being that the scenario you envision is now being sold by some manufacturers, but not necessarily delivered. These docks also have their own firmware, sometimes _multiple_ different firmwares, and at least the ones I've tried only let you update via Windows. There are UEFI updates that are supposed to help as well, again only installable via Windows. So far, I can't say I'm super impressed with the reality of non-Mac Thunderbolt.
That world will not happen unless every single device including entry level phones and size-constrained devices somehow magically include all the circuitry required to handle all the different alternate modes.
Also, as far as I understand there just aren't enough pins to support all the modes at once, so trade-offs have to be made...
I can plug an X1 Carbon ThinkPad into my monitor (some ultrawide Dell) with USB-C support just fine. I then get video, audio, USB (the monitor has a hub where my mouse and keyboard are connected) _and_ charging/powering of my laptop over a single link.
Works today with Linux here! It's just magical, I tell you.
That world has already started to happen. It may not become all that good or 100% universal soon, or ever, but it's good enough for widespread adoption.
I think the frustration is that it isn't totally known what you're getting until you try something.
The Nintendo Switch uses the USB-C connector, but it's off-spec, which led to people breaking their devices by connecting them to cabling/docks that weren't built to the Switch's implementation. I think there was more nuance to it than just that, but in the grand scheme of things, the whole fiasco highlights the concern: no one wants to brick their devices.
This seems like a relatively easy fix that, as you mention, isn't 100% there, but exists. On my MacBook Pro, the only reason I know my ports are Thunderbolt is because I know their specifications: the ports aren't marked in any particular way. The bodies involved with USB/Thunderbolt should make it a requirement that ports (and perhaps cables) indicate what they are capable of, similar to how USB 3.x Type-A/B cables have blue coloring within the port and the plug. At that point, if I see a blue-colored port or plug, I know I'm dealing with a USB 3.x one. If you refuse to or can't meet the specification, you then shouldn't be able to use USB Type-C for your product.
Instead, in their rush to get everything on USB Type C, they have seemingly tossed out any pretense of standards needing to be met to use the port and may stifle the port's adoption going forward.
I also recently bought a monitor with USB-C (Thunderbolt 3); I can plug its single cable into my Dell XPS and charge it, output 4K@60Hz to the display, and use my mouse and keyboard, which are connected to the back of the monitor.
The downside is when I want to use the same kb/mouse with my desktop, I have to unplug them from the monitor and plug them into the desktop.
I use the same solution (not a hub like you, but a 4x USB switch originally intended for printers, with an extra hub for the keyboard/mouse). Not all brands work for keyboard/mouse, but I have several that do.
That's a great idea, thanks. It's a shame the monitor didn't integrate this for when I switch display input; I was hoping to reduce the number of cables and boxes I use :(
Which monitor did you buy? I've been looking for one with USB-C support as I also have an XPS13. I have a 9360 and I know it uses a neutered TB3 PCIe lane (or something of that nature) so I'm concerned about being able to use a 4k 60 fps display with it. Have you had any issues? I use both Ubuntu and Windows on the machine, have you tried both with the monitor?
I don't run Windows so I can't answer that but Ubuntu 19.04 runs fine, although I've had minor issues with the laptop coming out of sleep and not outputting to the display occasionally.
EDIT: It's really nice in portrait for coding. I don't use the supplied stand but a desk mounted arm.
And if you plug a Samsung Galaxy S9/10/Note into that same USB-C, a full desktop will appear on your monitor as well using your existing ethernet, mouse, keyboard and audio outputs. Something not possible with full sized HDMI ports.
Interesting; I have the S9 and I've done this by accident while charging, but the monitor only shows what's normally on the phone screen, it doesn't stretch to a "full desktop". The mouse does work though.
Is there some setting I need to enable on my phone, or is my setup just different in some way?
It depends on the type of USB-C dongle, I think - Samsung obviously recommends theirs, but it worked for me with my MacBook USB-C "dock" dongles with HDMI+ethernet+USB+power. IIRC you need one that supplies power or you just get screen mirroring.
There's also a setting to switch between modes under Settings -> Connections -> More connection settings -> HDMI mode.
My ASUS Chromebook Flip C302CA (released January 5th, 2017) supports this, so it's really a hardware support issue. Take into account that most devices that would benefit from USB C/Thunderbolt 3 only really began supporting it in (very) late 2018/beginning of 2019. Most consumer devices just don't have the right connectivity, but that's changing.
Same here... I have a Dell USB-C dock and a Dell Precision, and when I plugged in my boss's 2017 MacBook Pro, everything worked out of the box, including ethernet and charging. We were impressed.
Then I plugged in my LG G6 and could download at 80mb/s over the ethernet cable, use my mouse and keyboard, and it even recognized my NAS.
Has anybody yet made a GPU and monitor that use a type-c alternate mode (whether it be display-port or something else) plus power delivery, such that your monitor does not need a separate power cable?
I would love to reach the point where a desktop computer only had a single power cable leading to the wall, with everything else getting powered via it.
It should theoretically be possible, since even fancy high-luminance monitors only run about 30W in use.
The real limiting factor here is that, assuming you could get a GPU with USB-C-out, you'd have to pass that power and any for accessories hooked up to the monitor through the GPU, and the nice ones currently are hitting the limits of available power already all by themselves (75W PCIE, 2x 150W power connectors - and a Vega 64, for example, can briefly peak at 360W).
So, to power your monitor over the same cable (and have at least another 15W for accessories plugged into it), you'd need a GPU that maxes out well under current ones on the market, which kind of defeats the point, you know?
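To make the power budget above concrete, here's a quick back-of-the-envelope sketch using the numbers from the comments (75 W slot, 2x 150 W connectors, ~30 W monitor, 15 W accessory headroom — all illustrative figures, not measurements):

```python
# Rough power-budget arithmetic for a hypothetical GPU that also feeds a
# monitor and its accessories over USB-C. All numbers come from the
# discussion above and are illustrative, not from any real card's datasheet.

SLOT_POWER_W = 75      # PCIe x16 slot budget
CONNECTOR_W = 150      # per auxiliary PCIe power connector
N_CONNECTORS = 2

MONITOR_W = 30         # typical high-luminance monitor draw
ACCESSORY_W = 15       # headroom for devices hung off the monitor's hub

total_budget = SLOT_POWER_W + N_CONNECTORS * CONNECTOR_W  # what the card can pull
downstream = MONITOR_W + ACCESSORY_W                      # what it must pass through
gpu_ceiling = total_budget - downstream                   # what's left for the GPU

print(f"Total budget: {total_budget} W")        # 375 W
print(f"Left for the GPU itself: {gpu_ceiling} W")  # 330 W
```

With a Vega 64 briefly peaking at 360 W, that 330 W ceiling is exactly the "maxes out well under current cards" problem.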
On a side note, the way GPUs are increasingly hitting power constraints is why you see new designs switching to weird proprietary stuff like Apple's MPX modules, which allow a dedicated 500W per graphics card.
It would be hell on earth if you had to perform the same actions with multiple ports. If I had to choose between curing polio, and not having to use multiple ports, it'd be ports every time.
All the issues you are reporting are just growing pains and will be sorted out/disappear alongside adoption and as the standard matures. USB-C is what a computer connector should have been from day 1.
I wish that were true, but I won't hold my breath.
What is far more likely is that more standards and complexity will infiltrate the market before USB-C stabilizes into something that vendors can be consistent with and which consumers understand.
You have to be a detective to track down which cable is compatible with which port on your computer. Of course most e-commerce platforms don't show features of the cable, it's usb-c and be done with it...
I have no idea why the standard isn't mandating clear markings (eg something like the bands on resistors, or whatever that works for color blind folks too).
There was a comment about that here on HN a few months back that kind of convinced me it's to enable the very, very cheap gadgets to be at least compliant. Cheap chips, slow clocks, but at least they follow the protocol. That's the theory, at least.
Apple wants white cables, some people want red, blue or yellow cables, so on. And the connectors are so small there is no place to put anything on them.
That's not true. It'd help a lot. Apple can use their cable and simply not say that it's USB3/Type-C/FancyX. No one would care if it actually worked.
But at least there would be clear markings for those who care.
It would take up about 2 cm on each end of the cable, a set of bands, like a barcode. It doesn't even have to be that strikingly different in color.
Yes, to me that's the worst part of the USB-C family of standards: it expects the operating system to tell the user "you're plugging it wrong" (and it does have ways in the protocols to detect all these cases, like using a 2.0-only cable to a 3.x device, or the host not supporting the desired alternate mode, or a lower grade cable when both ends could use a higher grade cable), but that requires the operating system to care about it, and doesn't help at all before you buy and unpack the cable/device.
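The "lower grade cable" failure mode is easy to model: every party in the link — host, device, and cable — caps the speed, and the result is silently the minimum of the three. This is a toy illustration (the grade names and the function are made up, not real USB signalling), but it shows why a 2.0-only cable drags everything down with no visible warning:

```python
# Toy model of link-speed fallback: host, device and cable each have a
# maximum supported grade, and the negotiated speed is simply the minimum.
# Grade names and this function are illustrative, not part of any USB API.

GRADES = {
    "usb2": 480,          # Mbit/s
    "usb3-gen1": 5_000,
    "usb3-gen2": 10_000,
    "tb3": 40_000,
}

def negotiated_speed(host: str, device: str, cable: str) -> int:
    """Return the best speed all three parties support, in Mbit/s."""
    return min(GRADES[host], GRADES[device], GRADES[cable])

# A 2.0-only cable between two Gen 2 ports: everything falls back to 480,
# and nothing about the connector tells you why.
print(negotiated_speed("usb3-gen2", "usb3-gen2", "usb2"))  # 480
```

The protocols can detect this (that's the OS-notification hook mentioned above), but the cable itself gives no clue before purchase.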
> it's usb-c and be done with it...
It's even worse: most cables I've seen in stores labeled "USB-C cable" are actually USB-A to USB-C cables. Other than these, so far I've only seen an unlabeled USB-C to USB-C cable, and a Sony brand USB 2.0 USB-C to USB-C cable, both costing around R$ 100 (way more expensive than I'd expect for a basic 2.0-only cable).
My experience watching a shop deal with transitioning to new Macs was that it was impossible to actually know a macbook+cable+device combo would work until you tried it. You could research all you wanted and there was still a chance it'd fail, or be glitchy and weird, unless you found someone with exactly the same combo to confirm that it worked fine.
Ditto dongles to connect to older devices. Will it work? No way to know, just try it and find out.
A nitpick, but Thunderbolt never had its own port. Versions 1 and 2 used the Mini DisplayPort plug, version 3 uses USB-C, and USB4 is essentially a renamed Thunderbolt 3.
USB4 is more than just "renamed Thunderbolt 3". It adds USB 3.x tunneling, while Thunderbolt 3 could only tunnel PCIe and DisplayPort (AFAIK, it faked USB 3.x tunneling by putting a PCIe USB 3.x host controller in the device).
For work I had to connect a portable audio player to a USB audio amp recently. The amp is designed to plug into a regular computer, using standard USB-A at the computer end.
Since the audio player had USB-C we needed a basic USB-C to USB-A adapter.
Being audio only, it was low bandwidth, USB 2.0, no power transfer: we didn't need any fancy features from the cable. Just a plain USB-C to USB-A adapter.
I think we tried five different USB-C to USB-A adapter cables from different vendors before the audio player recognised the amp it was connected to.
It was so tempting to think there was a firmware bug, until we found out it was critically dependent on some mysterious, unknowable quality of the adapter.
Due to their high bandwidth, I think we should have standardized around display connectors. It actually looks like USB-C "gives up" and just pushes out DisplayPort on otherwise-unused copper? So it's like some super high bandwidth negotiation layer that could just... not be there.
Or, it looks like thunderbolt would have been a good option. But it seems overengineered -- the transport layer doesn't need to know anything about the data.
To add to this I’ve just had USB-C cause a major inconvenience; it appears the USB-C cable I brought with me doesn’t play nice with the official Apple charger and my laptop is now out of charge. It’s an AmazonBasics cable that worked fine with an Anker charger for ages so I didn’t think it would be a problem. I’m in a remote country with no Apple Stores in sight and so my only alternative is to play roulette by buying random unbranded cables from the local mall and hope they won’t have the same issue (given even AmazonBasics is apparently not good enough for the Apple charger) nor blow up my laptop by being completely out of spec (happened to that Google guy that was testing cables, can’t remember his name now).
These are pretty funny examples considering the number of different HDMI port and cable versions, and the fact that Thunderbolt doesn't even have its own port. It's either a Mini DisplayPort or a USB-C port.
This post shows exactly how useless relying on physical port was.
Not only that, but the USB-C plug itself is the worst-designed plug I have seen in a decade. It's flimsy sheet metal on the plug side with a nice big cantilever, and flimsy sheet metal soldered to a PCB on the other. Perfect recipe for broken ports and cables.
I break about a cable a week and break a device every now and then. Mostly doing very normal things like charging a phone in my pocket while hiking.
USB-C seems like a perfect example of a product designed by people who sit in an office chair all day and never go hiking, skiing, ice skating, rock climbing, or otherwise seek to understand what their users normally do with plugged-in devices in the field.
I miss the old DC barrel connectors -- they never broke, and they were usually designed such that housing of the electronics took the brunt of everything, not the PCB.
> I miss the old DC barrel connectors -- they never broke, and they were usually designed such that housing of the electronics took the brunt of everything, not the PCB.
I've patched up so many of these over the years that that claim doesn't really seem accurate. Though it is relatively true for properly designed stuff (where the connector bottoms out on the case and not on the jack) and for things using the rather rare panel-mount jacks, which are indeed very solid, if lacking in ingress protection.
Agreed. We had a bronze PowerBook and replaced the DC socket on it over and over before giving up. Worse yet, the socket was underneath a bunch of stuff inside the laptop, so the process was even slower than it might be.
And then there's all the guitar pedals I make with chassis-mounted jacks... they can take an amazing amount of abuse.
Same for audio jacks. PCB-mounted 3.5mm jacks need resoldering often enough, but panel-mounted lasts forever.
I disagree with the notion that just because they break that I'm "doing it wrong". Please listen to the customer. Rather, I'd say that they're just not rugged enough for my (consumer) standard of use, which needs to be more rugged than office use. I have legitimate needs to engage in sports while devices are plugged in and accessible. Other consumers get stuff bitten by children and pets, stepped on, tangled, and caught in things. That's consumer life. It should be accepted as a valid design problem to make a product rugged enough to tolerate that. Often a simple change in housings and shrouds would deal with this problem inexpensively -- in the case of USB, all it would take is for the shroud to be a part of the standard itself, and the housing it mates with to hold and relieve strain on the shroud.
What about IEC cables? They're super rugged and virtually impossible to break. Proportionally downsizing that connector to be small enough would likely maintain the same level of ruggedness while being small.
I'd appreciate if people don't downvote me just because you disagree with me. That discourages useful discussion on HN. I downvote trolls and people who abuse the system with spam, not people I disagree with.
>What about IEC cables? They're super rugged and virtually impossible to break. Proportionally downsizing that connector to be small enough would likely maintain the same level of ruggedness while being small.
Those are only designed to carry power though... not a complex list of protocols...
I've heard that USB-C is intentionally designed to break the cable rather than the port, in situations where one of those must break. That's absolutely been true in my experience.
It doesn't happen often, but I've dropped two phones while they're charging onto the cable. The micro-USB phone never charged the same again; depending on the cable, you needed to put pressure on it to tilt it in the right direction. My new USB-C phone cleanly bent the cable, and the connection seems as solid and as reliable as it ever was with the replacement.
I like the intent, but I would prefer ruggedizing them to the extent that neither of them breaks. Many automotive connectors, for example, are both inexpensive and extremely difficult to break either.
I want a phone that's small, light, is very powerful and has a battery that lasts a month.
Look, what you are asking for is not economically possible. The connectors you are comparing with are much larger and don't have anywhere near the number of lanes USB-C cables do. It's a useless comparison.
There exist magnetic charging cables that can help with the breakage issue you're having. The connector and charging cable are joined by two magnets and some pins; instead of the cable or the device itself breaking, the cable will just disconnect.
What they say is they have a charger in their pocket and connected to their phone while hiking. So they're like yanking the cable, sometimes slightly, sometimes harder, thousands of times per day, as they walk, climb, etc. I'd say put the thing in a backpack, or use wireless, or whatever...
I'd say it should be designed for being able to put a phone in your pocket while connected to a charger and engaging in sports. That sounds like a perfectly normal consumer-friendly feature, and consumers will do it anyway. Putting it in a backpack means you cannot easily look at it.
The solution is fairly simple -- the /housing/ around the plug needs to be standardized and have a snug fit into the housing of the phone. That would relieve strain on the connector or contact.
I really don't understand why people are downvoting me here. I thought downvoting was to deal with trolls and abusers. When did people start downvoting people they disagree with? That discourages meaningful discussion.
Didn't downvote, but I can see how you might come across as being disingenuous or trollish by saying it should be designed for a particular use case (which implies it isn't), observing that it doesn't work, and suggesting it be redesigned. Most people have learned that it's a bad idea to keep your phone in your pocket while charging, decide not to do that, and subsequently don't have this problem.
Well, they keep breaking on me and (a) DC barrel connectors (b) headphone jacks (c) XT30U (d) BNC (e) DB9 and a bunch of other connectors almost never break on me.
I haven't straight up broken any connectors, but my Pixel 3's USB-C port is super picky about what cables will fit well enough to actually charge. I end up plugging it in, flipping the cable around, trying again, moving the phone so the cable hangs slightly different, etc. Port is clean (apparently they collect pocket lint), tried 5+ cables, only the original one has a decent hold once plugged in. All this means I basically _can't_ charge my phone on the go, in the car, etc.
I think future devices are supposed to support as many popular functions as possible. Due to the trend toward thin devices, we need smaller ports and fewer of them. Multiple possible uses of the same port helps.
> As a user there is no easy way to tell that the 12-inch MacBook's USB-C port is different from the Air's and Pro's unless you explicitly search the tech specs
Seems like manufacturer failure.
> If you see an HDMI port on a laptop, you can be sure it's going to output some kind of video signal over it.
But even HDMI/DP have versions. It's especially visible now, as we have >=4k HDR high-frequency displays.
> Due to the trend toward thin devices, we need smaller ports and fewer of them.
People keep repeating this lie for some weird reason. Similar to the "3.5 had to go because phones are too thin for it" lie. Why? Why do you keep lying? What possible incentive do all these people have? Is it self-deception? "Oh yeah I'm totally fine with losing X, Y and Z because I get thinness and weight reduction in exchange!" You get neither.
iPhone 5 124x59x7.6 mm, 112 g
Six years later
iPhone 11 150x75x8.3 mm, 194 g
Now yes, it's true, the iPhone 5 was part of a group of lightweight and thin phones that are now extinct, because clearly things are getting smaller, thinner, and more lightweight to boot.
For notebooks the same observations can be made. The 2008 MBA is only about 100 g heavier than the current model, the thinnest part is the same(!) and thickest part about 4 mm thicker (1.9 vs 1.5 cm). A slight but minuscule improvement. You could have a 1.3 kg notebook at the end of the 90s, btw.
I blame marketing. People will run after every fad as though it is mission critical stuff and will re-tell marketing copy verbatim as though it is their own thoughts. I've seen this with a friend who watches Top Gear a lot, I heard him say something that did not make sense for him to say and tracked it back to a specific episode of Top Gear and confronted him with it. He was surprised himself!
This stuff really works and it gets under people's skins in ways that they are not aware of. If you want to keep an even keel I would like to propose that the only way you will be able to do so is to radically limit your ingestion of any kind of media.
USB-C is a train-wreck for those of us who work in electronics design.
USB-C's chipset support isn't all that good, it seems. With USB 3 Type-A, you pretty much just run your USB lines to the ports on your CPU. Simple, straightforward. Manufacturers are happy to do it, because it's dirt cheap and requires nothing except the connectors and maybe some ESD protection.
USB-C is a different beast entirely. Just to use the USB 3 lanes, you need to put down a high-speed mux just to deal with both connector orientations.
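Even before the mux can be switched, something has to figure out which way the plug went in. That's done by sensing the CC pins: the sink pulls both down, the source pulls one up, and whichever CC pin sees a voltage tells you the orientation (and, via the voltage level, the advertised current). A rough sink-side sketch, with approximate threshold values from the Type-C spec (the exact figures and this function are simplifications, not firmware from any real controller):

```python
# Simplified sink-side CC-pin logic for a USB-C receptacle. The sink measures
# the voltage on CC1 and CC2; the pin that sees the source's pull-up reveals
# the plug orientation, and the voltage level reveals the current offer.
# Thresholds are approximate values; real controllers handle more cases
# (Ra/VCONN, debug accessory, debounce) than this sketch does.

def classify_cc(cc1_v: float, cc2_v: float):
    """Return (orientation, advertised_current), or (None, None) if detached."""
    def current_for(v):
        if v > 1.23:   # ~vRd-3.0 threshold
            return "3.0A"
        if v > 0.66:   # ~vRd-1.5 threshold
            return "1.5A"
        if v > 0.20:   # ~connect threshold, default USB power
            return "default (500/900 mA)"
        return None

    c1, c2 = current_for(cc1_v), current_for(cc2_v)
    if c1 and not c2:
        return "normal", c1    # CC1 active: set the SS mux one way
    if c2 and not c1:
        return "flipped", c2   # CC2 active: set the SS mux the other way
    return None, None          # nothing attached (or cases ignored here)

print(classify_cc(0.9, 0.0))  # ('normal', '1.5A')
```

All of that logic (plus the actual high-speed mux it drives) is circuitry a Type-A port simply never needed.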
Want to support USB Power delivery (transmitting or receiving)? That'll be another set of chips to put onto the board. Plus software to deal with the PD negotiation.
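To give a flavor of that PD software: a source advertises its offers as 32-bit Power Data Objects, and the sink has to unpack them before it can request one. Here's a minimal decoder for the Fixed Supply PDO type, using the field layout from the USB PD spec (just a sketch of one small piece of a PD stack, not a complete implementation):

```python
# Decode a Fixed Supply Power Data Object (PDO) from a USB PD source
# capabilities message. Field layout per the USB PD specification:
#   bits 31:30 = PDO type (00 = Fixed Supply)
#   bits 19:10 = voltage in 50 mV units
#   bits  9:0  = maximum current in 10 mA units

def decode_fixed_pdo(pdo: int):
    if (pdo >> 30) & 0b11 != 0b00:
        raise ValueError("not a Fixed Supply PDO")
    voltage_mv = ((pdo >> 10) & 0x3FF) * 50
    current_ma = (pdo & 0x3FF) * 10
    return voltage_mv, current_ma

# A typical 5 V / 3 A offer: voltage field = 100 (x50 mV), current = 300 (x10 mA)
pdo = (100 << 10) | 300
print(decode_fixed_pdo(pdo))  # (5000, 3000)
```

And that's only decoding; a real design also needs the state machines, timers, and (on the transmit side) the BMC signalling on the CC wire — hence the dedicated PD controller chip.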
Want to support Displayport? That'll be _another_ mux chip. Plus more software to advertise the functionality.
The cables are also much harder to manufacture, and this is reflected in their average price after you chop off one or two outliers.
Is your item a USB host? A device? Does it accept or provide power over USB? Does it support Displayport? Thunderbolt? Analog audio? Something else entirely? There's no way to know from looking at the connector. And no standardized markings that I've ever seen.
I'm completely ok with it being complicated to design and cost 10 euros more if it delivers.
And despite all its flaws, USB-C does deliver. It's small. It's reversible. I can charge all my devices with it: my Samsung tablet, my OnePlus phone, my Logitech mouse, my Jabra headphones, my Switch, and my XPS 13 laptop, which I can also dock to my screen/ethernet. It's amazing.
So yes, I had to take the time to find the proper cable, and I bought a few before finding brands I can trust to work with all my machines... and not fry them.
It sucks, but it will improve with time, like it did with early USB and HDMI (too many people forgot how terrible THAT was). It already has, in fact, improved a lot, and it will keep it up.
Meanwhile, I have only one cable and one dongle in my bag that do everything. I don't even have to get them out most of the time.
This is imperfect. This needs fixing.
But this is better than what we had before. Way better.
There's also a middle-ground that would have made people happier across the spectrum. Standard power cable, for phone, laptop, etc. Standard data cable, for your ereader, camera. Standard video-audio cable.
I think it's a good rule of thumb that if the engineers suffer, that suffering will run downhill to consumers too. I think every engineer can find examples of that.
I like plugging my laptop into the USB-C hub at home, and that giving me keyboard, mouse, display, ethernet and power without having to plug in multiple cables.
I probably misused the "across the spectrum" idiom. What I meant is that it would make developers much happier than USB-C and it would make consumers much happier than USB 3. My idea was that it's a more wholesome situation than content consumer and frustrated developer.
As a consumer I want one cable to do all of it. Sure, like anything else, it will be difficult and slow progress to get there. As a developer you should see it as a challenge.
It's good that you know your ideal user-experience (developers are aware of it too), but the trade-offs will depend on what's happening on the developer's end, so dismissing developer difficulties is not conducive to a good consumer experience.
Developer difficulty is a symptom of a more systemic problem, and it will usually affect the end-user through: higher-cost, lower-reliability, accumulation of technical debt, and/or resource-inefficiency.
Not dismissing; I understand it is a hard problem. I'm simply saying that it is the developer's job to solve the issue.
A similar thing happens in web development: there are tons of devices, screen sizes, JavaScript frameworks, browsers, and OSes, all with their own quirks. Does it suck? Yes, but it is my job as a competent developer to manage all that complexity and solve the issue. From the user's perspective, all they need to know is that the site works with whatever device they use; that is the goal.
Yes, but because of "the suck" of web it's in some ways in a dismal state and that affects users. We can have only a small number of browsers that are well maintained; browsing is resource-intensive; web is extremely inaccessible to people with disabilities; web is notorious for security problems; fingerprinting can be seen as a side-effect of bad design; web requires a lot of man-hours to develop and maintain, like you mentioned; etc. The list seems endless.
Sure, I wish the state of things were whatever I prefer, but that's unrealistic. As a developer you have to be ready for whatever state you are in.
Sure, if there were only a small number of browsers, or none of my users were disabled, I would prefer that and my job would be easier, but that is not realistic.
I have to be ready for whatever situation I've been dealt. As a developer my job is to solve problems and manage complexity. Dealing with complex and hard problems is a given.
In the end it is the developer's job to handle those issues so that, from the perspective of the user, everything just works.
I can sympathize with wanting to solve as many problems as possible and as well as possible and to provide a great user experience; however, our ability as devs is necessarily limited and that should be accounted for. Both the developer and the user are in this together.
Where I work many people have a docking station with a USB-C plug and a power plug molded into one connector. One cable to plug in, and it can safely deliver more power than USB-C can alone. With a bit of work, engineers could have standardized an expandable cable. You'd have your basic USB-C cable with one fast (DisplayPort?) lane and one USB 3 data lane — the basic cable in the article, enough for phones and light use. Then a larger connector that can supply more power for laptops and has a few more high-speed data lanes for more monitors. It is a bit of work to make the connector work in either mode, but it is a solvable problem.
What happens when you get a laptop with a slightly different port arrangement? Just throw out all the old docks? Seems like quite a solution from a manufacturer's perspective!
The reality is every time someone gets a new laptop they get a new docking station. Docking stations seem to last about 3 years. It is very nice when traveling to get to a remote office and hear "Joe is out this week, you can use his office - he even has a docking station that will fit your laptop". However it is questionable if his docking station will fit your laptop.
If connector N (usb-c or whatever) has enough bandwidth and a standard connector it would be really nice if we could give everybody that docking station. Even if we have to upgrade docking stations every few years, so long as I can mix and match I'd be happy.
I just use a MacBook charger and USB-C cable, even though I don't have any Apple devices.
The reason is that I move a lot and forget my chargers once or twice a year in a customer's office, a hotel, or at the airport.
The MacBook pro charger is very expensive, but charges everything I have and I can buy it pretty much everywhere, including at the last minute in an airport.
Anker. They're consistently reliable, and accessories are their whole business, so a problem would hurt their reputation a lot more than one of the big hardware companies.
What's worse, while there are spec sheets with examples for the mux chips, good luck finding a datasheet with a full example including USB-C PD in source and sink capability, DisplayPort, Thunderbolt and "classic" USB. And don't get me started on Thunderbolt controllers themselves, Intel doesn't even release datasheets for these.
I get that Intel may want to prevent people selling Thunderbolt stuff without certification etc. but this shitty behavior really sucks for advanced tinkerers willing to get their hands dirty on some modern technology.
If someone does have an application example that includes all necessary chips including an USB3 controller, a Thunderbolt controller, a MUX chip for DisplayPort and a decent PD controller supporting source and sink operation mode, I'd be happy about any hints.
Oh and if someone from the Raspberry Pi foundation reads this, please for the love of everything that's holy update the Compute Module and expose PCI-E there!
Why should customers care about making electronics designers happy?
Customers have wanted to charge, transfer data, and transfer display information on one cable. That’s what we want, and you can see that many folks are willing to pay for it.
Everything dealing with electronics is complicated and expensive until one day it isn’t.
One could see this as a push to incentivize the development of a very configurable analog front end for device to device communication. If it were intentional, that is.
Great article. I was recently wading through this problem for my work setup. I was going crazy wondering what the actual technical limitations were with USB-C vs DisplayPort over USB-C vs Thunderbolt. I ended up getting this WAVLINK dock for $150 with Thunderbolt 3. If you want dual 4K monitors and USB 3 output plus power for your laptop, I recommend it.
The first dongle is driving 2 Dell 23" 1080 screens, my 27" 4K screen is driven directly by a thunderbolt to displayport cable, and all my USB stuff goes through the second dongle. It works really well so far, and I have an upgrade path for 2 more 4K screens with the Cable Matters dongle.
Yours is an all-in-one that really looks nice though. How warm does it get?
Thank you, that's my experience with the USB-C hub that I'm using. I kind of wish it wasn't tucked out of the way behind monitors, then I could use it as a hand warmer :)
As much as people complain about USB-C, I absolutely love it. I have a 2018 MacBook Pro, and I use the Caldigit TS3 hub (referenced in the article) as my docking station at work. I plug a single cable in, and I immediately have 2 x 27" 4k monitors, power, USB peripherals, headphones, Ethernet, and much more if I need it.
I travel with a single dongle that cost $15: SD card, HDMI, 2xUSB 2.
I have not had to buy any extra cables, besides what came with the computer.
When you find a computer/dock/cable combination that works, it's fantastic. But, in my experience, it's extremely hit or miss whether a different monitor, or laptop, or cable will work with other components. Particularly with 4k@60Hz.
Last night, I was ready to pull my hair out trying to figure out what dock + cable I need, so this article is quite the blessing. However, I’m still left confused on whether I need a USB3.1 usb-c cable or Thunderbolt 3 cable.
In light of the MBP16 announcement, I wanted to plan my purchase, including the peripherals. I currently have 2 4K monitors on a PC. I’d still like to use that PC from time to time, but it’d be convenient to do so through a dock. Enter the Targus dock that I happen to have received through work (https://www.amazon.com/dp/B07BPQKP8V/ref=cm_sw_r_cp_api_i_0b...) which claims to be Thunderbolt 3 “compatible.” However, none of its ports are specced as such. Furthermore, the included USB-C host cable is only capable of transferring 5 Gbps instead of the 40 Gbps of Thunderbolt 3.
Now, this article explains DisplayLink nicely, but what happens when I use the Targus dock with and without a Thunderbolt 3 cable? Does it bypass DisplayLink and drive the dual 4K monitors at 60Hz directly? Or does it still fall victim to the CPU overhead of processing DisplayLink?
Interesting! Was wondering about this recently when shopping for a USB-C hub, and couldn't find many that supported 4K@60hz. Ended up getting one that does 30hz because my display at home is only 1440p.
I have used Displaylink adapters before (have one floating around somewhere), and was not super impressed (my machine was a bit flaky when using it), although I did have 2 external displays going on my Macbook at one point (this was before the days of multiple Displayports on Macbooks), which was kind of fun.
I went through 3 of them before finding one that worked (I bought the first before I knew the issue existed; with the second I was just impatient). I'm using this one on a MacBook Air (https://www.hypershop.com/products/hyperdrive-duo-hub-for-us...). I'd prefer one that used a single USB-C connection (assuming that's possible) with a cable, so it would be useful for more than just the couple of devices that have 2 USB-C ports exactly that far apart, but a year ago that was one of the few choices. The good thing is it does work, and I have it running 4k@60hz.
I wish I had come across this article when I was looking for a dock for my MBP (connected to a pair of 4k monitors). It was a pain to try and figure out if I'd be able to get 60hz video from a particular docking station. I went with the Caldigit one and luckily it worked. It's nice to learn why.
Although a hub is useful, it's worth dedicating one of your USB-C / thunderbolt ports to running your external screen, especially if you can't afford a good hub at the moment. I use a DP to USB-C cable on my MacBook Pro.
As the article points out, it's not worth bothering with HDMI as you'll easily end up with 30Hz at 4k. 30Hz is absolutely unbearable, and I think most developers buying a new screen now will invest in a good 4K monitor.
Note about my setup:
I then run 3008x1692 (2560x1440 is also good) scaled resolution on my 4K monitor as the full 3840x2160 resolution will make everything extremely small. Everything is crisp and running at a beautiful 60Hz.
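For anyone wondering why the scaled resolutions stay crisp: as I understand macOS HiDPI rendering (this is my mental model, not anything from the article), the chosen "looks like" size is rendered at 2x into a backing store and then downsampled to the panel's native 3840x2160, so the GPU pushes more pixels than the panel actually has:

```python
# Sketch of the HiDPI scaling arithmetic, assuming the usual 2x
# backing-store model. Numbers are the ones from the setup above.

NATIVE = (3840, 2160)  # physical panel resolution

def backing_store(looks_like: tuple[int, int]) -> tuple[int, int]:
    """Resolution the GPU actually renders for a 'looks like' choice."""
    w, h = looks_like
    return (w * 2, h * 2)

for choice in [(3008, 1692), (2560, 1440)]:
    bw, bh = backing_store(choice)
    print(f"looks like {choice}: renders {bw}x{bh}, downsampled to {NATIVE}")

# (3008, 1692) means a 6016x3384 backing store, roughly 2.4x the
# native pixel count -- hence crisp text, at some GPU cost.
```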
I personally believe VirtualLink is dead. Its use case (four-lane DisplayPort and USB 3.x at the same time) is also met by USB4, which I believe will be more widely implemented.
However, I also believe that the VirtualLink trick of repurposing the USB 2.0 pins will end up being used by a future USB standard for two more lanes of USB4, gaining 50% more speed without reducing the maximum cable length even further.
Interesting that you say short-lived; I had a look, and it's on the Founders Edition NVIDIA RTX 2000-series cards, though not mandatory [1], and Valve recently cancelled support for it on the Index headset. [2]
Do you think it will be cancelled on the next series of cards?
The need for yet another specialised variant of USB-C cable does look like a downside to me, versus DisplayPort getting fast enough to need only two lanes for VR, so that standard cables can be used.
> Unlike most alt-mode; this will also remap A7, A6, B6, B7 to carry USB 3.0 signal, instead of the usual passive USB 2.0 signal. This means that one will not be able to extend the cable using a standard USB-C 3.0 cable, which has these pins mapped only for unshielded USB 2.0 signals. Also this will require the VirtualLink port to also detect the correct orientation of the USB-C plug to ensure that the USB 3.0 TX and RX lanes are correctly connected.
A fun/interesting hack! But breaking the connector's reversibility is a huge dealbreaker. Imagine seeing notices in software saying to flip your cable upside down...
Why couldn't the connections be swappable electronically inside the connector? The way that modern Ethernet ports adapt to the correct Rx/Tx without needing a crossover cable.
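The Ethernet analogy is apt, and USB-C actually does something similar for the SuperSpeed pairs already: the host watches which CC pin sees the sink's pull-down (Rd) and uses that to steer a lane mux. A hedged sketch, with illustrative names and thresholds that don't come from any real PD controller API:

```python
# Sketch: orientation resolution in a USB-C receptacle. The host drives
# pull-ups (Rp) on both CC pins; the cable routes its single CC wire to
# CC1 or CC2 depending on plug orientation, so the sink's pull-down (Rd)
# appears on only one of them. The threshold value here is illustrative.

RD_DETECT_THRESHOLD_V = 1.6  # CC voltage sags below this when Rd is present

def resolve_orientation(cc1_voltage: float, cc2_voltage: float) -> str:
    """Decide which lane set the SuperSpeed mux should select."""
    cc1_attached = cc1_voltage < RD_DETECT_THRESHOLD_V
    cc2_attached = cc2_voltage < RD_DETECT_THRESHOLD_V
    if cc1_attached and not cc2_attached:
        return "normal"    # route the TX1/RX1 pairs to the PHY
    if cc2_attached and not cc1_attached:
        return "flipped"   # route the TX2/RX2 pairs to the PHY
    return "unattached"    # no sink detected on either CC pin
```

So the swap is done electronically, just with a mux ahead of the PHY rather than inside it. The quoted VirtualLink wrinkle is that it also puts USB 3.0 on the repurposed USB 2.0 pins, which a standard cable doesn't wire for orientation-independence, hence the extra cable variant.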
Entertaining read. What a blunder the USB group has made with v3.x. It'll all work out and eventually be awesome, when every port and every cable just works and things charge fast, but almost every major feature of USB has had issues going back to the first release. More reviews and articles should have a liars section.
> They are very likely DisplayPort Alternate Mode designs using four lanes.
If a hub claims to support 4K @ 60Hz and USB 3.0, that doesn't necessarily mean it supports both at the same time. It can be convenient to have a flexible hub: connected to a monitor where two lanes are enough, it provides USB 3.0; go above that and it drops to USB 2.0. At least one of the hubs linked from the article (a very small one with one power port, one USB 3, one HDMI, and a non-captive cable) works this way; one of the sellers confirmed it, and the device is available under many, many labels.
The article also omits that many, many notebooks with a single Thunderbolt 3 port will not support multiple 4K @ 60Hz displays because they feed it only four DisplayPort lanes, using a cheaper, lower-power controller called the Alpine Ridge LP. And yes, that pos is still in use, even though I hoped Titan Ridge would deprecate it. Dual-port Thunderbolt controllers are all full bandwidth; single-port ones can be half or full. Hard to tell without looking up the TB controller used in the device on social media / stackexchange etc -- the manufacturers are often quite tight-lipped on the topic.
Another fun fact: while the Thunderbolt 3 bus is indeed 40 Gbps, that's only filled when video is in use; the data speed is nerfed to 22 Gbps by all existing controllers. Dell admits this in one article https://www.dell.com/support/article/us/en/19/sln307875/thun...
> Thunderbolt 3 may only reach a maximum data transfer rate of around 7 to 22Gbps even though it is advertised with 40Gbps
but this is so rare; every other article, even the one linked from this rare gem of truth, claims some baloney like "Systems with 4 lane design will support up to 40gbps of data transfer over PCIe", which is obviously not true: even without the nerfing, PCIe 3.0 x4 is only 32 Gbps, and it is nerfed on top of that. The official TB3 brief also tells you this, but only obliquely: it is never stated in the text, yet Figure 7 shows what's up: https://thunderbolttechnology.net/sites/default/files/Thunde... And make no mistake, there are only PCIe and DisplayPort signals on the TB3 bus; there are no USB signals, and the TB3 controller in the dock provides a USB root hub. Someone should bring a class action lawsuit against Intel and practically all laptop makers for false advertising here... (Side note: USB 4 will be different in the signals; if my understanding is correct, the Thunderbolt 3-like bus, which is an optional mode in USB 4, will finally carry PCIe, DP, and USB signals.)
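The arithmetic behind those numbers is easy to check. PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so even a clean x4 link can't come close to the advertised 40 Gbps:

```python
# Back-of-envelope check of the bandwidth claims above. The 22 Gbps
# figure is the controller cap Dell reports; the rest is line-rate math.

PCIE3_GT_PER_LANE = 8.0      # GT/s per PCIe 3.0 lane
PCIE3_ENCODING = 128 / 130   # 128b/130b encoding efficiency

def pcie3_payload_gbps(lanes: int) -> float:
    """Usable payload bandwidth of a PCIe 3.0 link, in Gbps."""
    return lanes * PCIE3_GT_PER_LANE * PCIE3_ENCODING

tb3_link = 40.0                   # advertised Thunderbolt 3 link rate
pcie_x4 = pcie3_payload_gbps(4)   # ~31.5 Gbps, i.e. "32 Gbps" raw
controller_cap = 22.0             # observed PCIe data ceiling (per Dell)

# "40 Gbps of data over PCIe" is impossible on its face: a x4 link tops
# out near 32 Gbps, and shipping controllers cap it lower still.
assert controller_cap < pcie_x4 < tb3_link
```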
There's a lot of complaining from the engineering side about USB-C, but for the customer it is a godsend. Now, instead of a million different little ports, they can use one, which in well-engineered (emphasis here) products will cover most of their use cases.
Yes, I know it is a pain to manufacture and design software for, but the user doesn't care about that. They care about convenience.
The USB-C PD charger of my Nintendo Switch does not charge my phone, but it does charge my Surface. My phone's USB-C charger does charge the phone and Surface, but not the Switch. My USB-C battery pack charges my phone, but is charged by my Surface. Yeah, it just works.
How do, say, a battery pack and a Surface negotiate who is charging who over a symmetric USB-C-to-C cable?
The power adapter that comes with the Switch supports only parts of the USB-PD standard. That's why it didn't work with your phone.
You’re absolutely right about the protocol being unreasonable with two batteries connected together. FireWire has the same problem. You could charge a FireWire iPod from its wall adapter, and you could charge one from an iBook. If you accidentally plugged the AC adapter into the FireWire port of the iBook it would destroy the port.
Buses where every participant is an equal peer are expensive and complicated.
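To the question of who charges whom: in Type-C, a dual-role port (DRP) alternates between presenting a pull-up (Rp, "I'd be a source") and a pull-down (Rd, "I'd be a sink"). When two DRPs meet, the first complementary overlap of their toggle phases fixes the initial roles, and USB-PD can later flip them with an explicit PR_Swap. A toy model of that, purely illustrative:

```python
# Hedged sketch of how two dual-role USB-C ports (say, a battery pack
# and a Surface) settle who powers whom. Real DRPs toggle their CC
# termination on a timer; this model replaces the timing with a seeded
# RNG, so the structure (not the exact mechanism) is what's shown.

import random

def drp_resolve(seed: int) -> tuple[str, str]:
    """Return (role_of_A, role_of_B) once DRP toggling settles."""
    rng = random.Random(seed)
    while True:
        a_presents = rng.choice(["Rp", "Rd"])  # A's current toggle phase
        b_presents = rng.choice(["Rp", "Rd"])  # B's current toggle phase
        if a_presents == "Rp" and b_presents == "Rd":
            return ("source", "sink")   # A powers B (a PR_Swap can flip this)
        if a_presents == "Rd" and b_presents == "Rp":
            return ("sink", "source")   # B powers A
        # both presented the same termination: no attach yet, keep toggling
```

Which way it lands initially is essentially a coin flip broken by policy: a battery pack typically applies the Try.SRC policy so it prefers to end up as the charger, which is why sensible products usually do the right thing and careless ones charge the wrong way.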
On the topic of power, I really don't understand how USB-C is supposed to replace charging cables. USB-C is limited to 100W. That's enough for a Switch, a phone, or a low-power laptop, but if this[0] is anything to go by, the 6-core Intel laptop CPUs already eat 80W. Add an extra 2 cores and you might not even be able to sustain a CPU load without draining the battery. Not to mention much higher-power-draw GPUs.
My real question here is what's the plan? Because it's not possible to push 200-300W down a USB-C cable.
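The 100 W ceiling is just the top of the USB-PD table: fixed voltage rails up to 20 V, and at most 5 A (and only with an e-marked cable; plain cables are limited to 3 A):

```python
# Where the 100 W limit comes from. The rails below are the standard
# USB-PD fixed-supply voltages; current limits depend on the cable.

PD_RAILS_V = [5, 9, 15, 20]
MAX_CURRENT_A = {"standard_cable": 3, "emarked_cable": 5}

def max_power_w(cable: str) -> int:
    """Best case power for a given cable class, in watts."""
    return max(PD_RAILS_V) * MAX_CURRENT_A[cable]

assert max_power_w("standard_cable") == 60   # 20 V x 3 A
assert max_power_w("emarked_cable") == 100   # 20 V x 5 A

# Pushing 200-300 W would need either more current (thicker conductors,
# more heat in the connector) or a rail higher than 20 V.
```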
Big gaming laptops will always require big power supplies, probably using barrel jacks. I've seen really gigantic ones that use two!
But pretty much every other laptop can get away with well under 100 watts. I use a 30 watt USB C charger at home, and it's fine for using my 15" i7/560X MBP at my desk, because I'm not running it at 100% CPU/GPU for extended periods.
It will drain the battery at full load; the i9 has a 45-watt TDP and the RX 5500M an 85-watt TDP (plus display, RAM, disk, etc).
However, there's no way the cooling on the laptop can support that much power draw, so I'd expect it to throttle itself way before you'd drain the battery while plugged in.
I used to think USB-C was awesome and would lead to a one-wire-for-everything world.
Now I need 4 different kinds of USB-C wire depending on the use case:
passive USB-C for charging my headphones (since they didn't implement a resistor), active USB-C for charging the laptop, USB-C for fast-charging my phone, and USB-C for data transfer/Thunderbolt/video display.
The spec isn't a disaster. Vendors implementing it is.
The main problem is that OEMs don't give a sh*t about the spec and cut corners whenever they can.
The spec should require clear labelling of SOME kind to distinguish the different types of wires. I feel like that's been a common issue with the USB spec, and one they refuse to address.
I saw this headline and was briefly excited for a moment that a USB-C hub finally existed. That is, a device like an ethernet hub or a USB hub, that would turn one port into several other ports of the same kind.
This is an article about USB-C port replicators. Four years into this and not a single real USB-C hub exists. I’ve never seen such a botched rollout.
It's quite a strange design choice: I can plug in the stuff I need when working at my desk (USB 2 keyboard, a couple of external drives, USB-C SSD, monitor) into my Mac only because I have a USB-C to USB 3 hub; but surely down the track more and more of my devices will be native USB-C... at which point I won't have enough ports to plug them all in?
I've looked at the USB-C spec, and it does allow for hubs like that. The USB 2.0 and USB 3.x parts of the hub are the same as for non-USB-C hubs, and the USB-PD standard even has an example with a 3-port USB-C hub at the end.
Where in the spec is it written that USB-C hubs are not allowed? The USB-PD spec even has an example with a 3-port USB-C hub, showing how the power delivery negotiation with the downstream ports should work.
Power, most likely. You physically couldn't connect 4-5 fully powered devices to a hub. There is an expectation that any USB-C port can supply power to any compatible device.
re "Driver availability and compatibility for Mac/Linux is spotty to non-existent":
Is it true that DisplayLink doesn't work well with Linux? Some years ago people were saying that the usb-to-monitor dongles worked well. A casual web search also gives this impression, with good support provided by the DisplayLink company.
You can make DisplayLink (DL) work with Linux; there are drivers from the vendor and an adaptation script for Debian-based distros [1] (and variants for others, IIRC). You may have to fiddle with the configuration to avoid issues like minor display corruption or cursor glitches. I made it work cleanly once, but in the end I returned the dock. DL is, to me, a bad idea except in some specific cases.
The way DL works is that a driver on your PC creates a virtual framebuffer per screen, composed on your PC. The screen content is then compressed and sent to the dock's DL chip using a proprietary driver. The chip on the dock reconstructs the full images and sends them to the locally attached screen(s) over local HDMI or DisplayPort interfaces. Now yes, those local interfaces may run at 4K and 60 Hz. But the bottleneck is not there; it is in the compression and transmission over the USB link. In my experience, any use case with large amounts of changing content (e.g. full-screen video) will give a bad experience on a laptop (which is what I used, and is the natural pairing with a dock). The compression ramps up the CPU load on the laptop, the fan gets noisy, and the display content was choppy too. It's definitely not the smooth experience you expect from 60 Hz.
I guess DL can provide very good results with a laptop for use cases with mostly static content: reading documents, spreadsheets, Word, etc. Just don't expect a nice experience when most of the screen content changes quickly.
So I returned my DL USB-C dock and bought a super cheap one instead; since I only have a 1440p external screen, any cheap dock will support true 60 Hz, and I have a nice experience. 4K at 60 Hz will have to wait, and it won't be with DL as far as I'm concerned.
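The pipeline described above can be modeled in a few lines. Real DisplayLink uses a proprietary codec, not zlib; the point of this sketch is only that the host CPU pays for every changed frame, and that the cost depends on how compressible the content is:

```python
# Illustrative model of the DL pipeline: the host renders into a virtual
# framebuffer, compresses it on the CPU, ships it over USB; the dock-side
# chip decompresses and scans it out over local HDMI/DP. zlib stands in
# for the proprietary codec purely for illustration.

import zlib

def host_encode(framebuffer: bytes) -> bytes:
    """Runs on the laptop CPU for every updated frame."""
    return zlib.compress(framebuffer)

def dock_decode(payload: bytes) -> bytes:
    """Runs on the dock's DL chip, feeding the local display output."""
    return zlib.decompress(payload)

static_frame = bytes(6000)                 # mostly-uniform desktop content
busy_frame = bytes(range(256)) * 60        # rapidly changing, video-like

# Static content compresses well (cheap link, low CPU); busy content
# does not, which is where the choppiness and fan noise come from.
assert len(host_encode(static_frame)) < len(static_frame)
assert dock_decode(host_encode(busy_frame)) == busy_frame
```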
I was curious about that compression and found an analysis[1] of the DL protocol. The first thing I saw was that it doesn't just render to a framebuffer on the PC; it renders twice, once at 16bpp and again at 8bpp (which it combines to achieve 24bpp). The use of Huffman encoding seems consistent with the state of compression in 2003 [2]. If doing it today you'd probably go with x264 lossless, but that would have been too slow 15 years ago. In conclusion, I'd guess the lag is caused by the split framebuffers. Or an inefficient proprietary driver.
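One plausible reading of that "16bpp + 8bpp = 24bpp" split (this is my guess at the scheme, not DisplayLink's documented format): an RGB565 base layer carries the high bits of each channel, and a refinement byte packs the remaining 3+2+3 low bits:

```python
# Guessed reconstruction of a 16bpp base + 8bpp refinement split that
# adds up to lossless 24bpp. Not confirmed to be DL's actual encoding.

def split_rgb888(r: int, g: int, b: int) -> tuple[int, int]:
    """24bpp pixel -> (RGB565 base word, 3+2+3 refinement byte)."""
    base = ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)
    refine = ((r & 0b111) << 5) | ((g & 0b11) << 3) | (b & 0b111)
    return base, refine

def join_rgb888(base: int, refine: int) -> tuple[int, int, int]:
    """Recombine the two layers back into full 24bpp."""
    r = ((base >> 11) << 3) | (refine >> 5)
    g = (((base >> 5) & 0x3F) << 2) | ((refine >> 3) & 0b11)
    b = ((base & 0x1F) << 3) | (refine & 0b111)
    return r, g, b

# Round-trip is lossless: the layers together reconstruct every pixel.
assert join_rgb888(*split_rgb888(200, 123, 45)) == (200, 123, 45)
```

A split like this would let a client with only the 16bpp layer still show a usable image, with the refinement layer sent separately, which could explain why the host renders twice.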
The problem is that the driver situation is a horrible mess, and also that there are three totally incompatible generations of the technology.
The driver that works best is udl, which integrates with KMS/DRI and presents the dongle like additional CRTC and thus allows the existing GPU to be used for acceleration.
There is also udlfb (IIRC both udl and udlfb are in the kernel source tree), which tends to be what most distributions use by default. udlfb exposes the dongle as an additional /dev/fbX device. That means no acceleration to speak of, and handling hot-plug is, shall we say, interesting.
Both of the open-source drivers support only USB2 DL dongles.
Then there is the binary driver from DisplayLink itself, which I suppose (I haven't used it) comes with its own set of issues, probably starting with the typical issues of binary-only proprietary drivers.
I use a DisplayLink dongle as part of a USB-C docking station. It plugs and unplugs well, but the driver uses a lot of CPU and there is no graphics acceleration. Not sure if it would handle 4k video. Great for an IDE.
Uh, yes. Avoid that one. We have ZBooks and these docks at work, and we all have all kinds of trouble. The funny thing is, we all have different troubles: mostly the keyboard or one or two monitors do not work...
I'm actually a bit confused by why it matters at this point. HDMI is ubiquitous, can handle a high enough resolution and refresh rate that most human eyes can't distinguish better, and is relatively cheap. It's an optimized, widely available format, and does a great job. USB is a downgrade for this particular task. Other than novelty, I don't see a reason to ever do this.
What do you mean? Plenty of computers now ship with no connectors other than USB-C. In other words, HDMI is not ubiquitous.
What I love about USB-C is I plug in a single cable to my laptop and I get keyboard, mouse, wifi, and monitor (4k 60hz)
At work I have a USB-C hub for my one cable to my computer. The hub has Displayport so I can use HDMI on a different device. At home my monitor does USB-C natively so my keyboard and mouse are plugged into that.
Oh, you sweet summer child. If you want to buy a laptop these days, you have to shell out $300 for what is basically half of the motherboard, and connect it externally over USB-C.