
What the author means: if you've ever worked with actual Bluetooth, you'll know it's full of insane quirks and vendor bugs. This opens up the risk that your product will be seen as bad because the user has a crappy BT chipset in their computer. Just see the debate about Bluetooth headphone latency here on HN: even engineers don't realise that the experience is very much dependent on what kind of chipset they have on BOTH sides.

This is why manufacturers build their own 2.4GHz dongles using proprietary protocols - it ensures a consistent experience for users.



Supposedly the drivers are also full of hacks: blacklists and whitelists for which devices actually support certain features. A device might claim it supports a specific codec, but then when the phone starts streaming audio with it, the headset locks up.

It's a bunch of hacks all hacking around the other hacks.


The BT protocols are quite complicated, so the different implementations will inevitably be full of quirks IMHO. Try debugging a BT audio issue on a phone and you will see the amount of complexity.

Interoperability is tested, but not in an organized way and not repeatably - AFAIK three different implementations are supposed to be checked against each other for a new protocol. The individual vendors verify their own BT implementations, but not against each other's. That would be expensive, I guess.

The firmware in BT devices can also use a number of hacks which don't break the spirit of the specs as such... and this is not tested. On top of that, BT and Wi-Fi often have to share an antenna, which means BT doesn't get all the airtime it needs, only a percentage that depends on what Wi-Fi is doing at the time.

I try to use Ethernet every time I get the chance.


> it's full of insane quirks and vendor bugs

As someone not familiar with the low-level details, can somebody explain how this can still be happening today? Are there not libraries that help ensure a proper implementation? I really don't understand.


There are multiple effects going on. Firstly, the Bluetooth spec itself is quite complex: the 'core specification' document is nearly 3000 pages of very dense technical detail, because it's aiming to allow anything to connect to anything. Secondly, many Bluetooth devices are both heavily power- and cost-optimised: you don't have a lot of silicon, and you don't get to use it very often if you want your product to have a reasonable battery life. This is especially true of the radio in a Bluetooth device, adds a whole other level of complexity to your implementation, and is one area where bugs are very easy to introduce. It also requires a fairly complex dance between hardware and software (most of the Bluetooth spec is implemented in software, usually C; the hardware mostly consists of a radio capable of transmitting the right 2.4GHz signals with the right modulation scheme, but when you add power management to that it gets gnarly).

Thirdly, hardware companies are generally not great at software. A lot of effort goes into physical testing and validation, but the base level of quality of the code isn't great. The quality of a given Bluetooth device also often depends on the quality of the custom code written for it: there's a general trend of running application code on the same CPU as the Bluetooth code, and in an embedded context there isn't generally great separation between the two (Nordic's approach is probably the best here). A third party making a cheap pair of Bluetooth headphones is unlikely to produce a quality result, even if using a library which is relatively bug-free.

Finally, there's a vicious spiral effect: because you need to interact with a wide variety of other implementations, which may all be buggy or quirky in their own ways, you wind up needing to test with a wide array of other devices (expensive, and it may not always happen) and play whack-a-mole with the remaining issues that appear, which in turn creates more quirks and more opportunity for bugs.


Hardware makers basically can't write good software.


That doesn't explain why Bluetooth is the protocol causing the most trouble. Proprietary 2.4GHz devices appear to work just fine.


A Bluetooth device is much more complex than a proprietary wireless device. The complexity is largely because Bluetooth hardware must support many different types of devices, all with differing requirements. A proprietary keyboard dongle doesn't have to deal with audio traffic coming from a microphone or going to a headset. Furthermore, it doesn't have to support proper identification and authorization of each individual device; many of them will accept input from any compatible keyboard that happens to be nearby.


Not an expert in this kind of low-level stuff, but Bluetooth is mostly implemented in hardware. I don't imagine you could get a compact, lightweight headset that lasts 10 hours on a charge if it needed to run even something as lightweight as a C library. And obviously, you can forget something like AirPods.


No, Bluetooth is pretty much entirely in software. I say that as someone who has worked on Bluetooth devices at CSR/Qualcomm. Embedded software quite routinely runs for hours on batteries. Indeed, some entirely-software devices last many years.


Are you basing this on an assumption, or do you have a source?



