>Apple Watch nowadays is using Apple (codenamed Marconi) Wi-Fi + BT chips, hopefully they'll switch all their product lineup to those.
Apple has renewed their WiFi commitment to Broadcom for at least another 3 years. Making WiFi, or in fact any wireless controller, is hard; making one that is also a moving target (both WiFi 6E / WiFi 7 and 5G / 3GPP Rel 17) is even harder.
But if Apple do intend to make their own WiFi (which makes perfect sense from a BOM cost reduction perspective), please also bring back the AirPort wireless router.
One of the recent phones (X or 11?) used Intel for part of its builds (as far as I remember, you couldn't tell from the serial number). Apple had a lot of problems and swore off the Intel parts, which killed Intel's wireless business.
I assumed Apple bought the Intel product line for IP, especially after that episode.
Apple used Intel modems for many generations - the iPhone 7, X/8, XS, 11. For some generations they used Qualcomm chips in the models for CDMA regions since Intel didn't make a CDMA modem. So you could tell which chip you'd get from the iPhone model number.
Apple ditched Intel due to Intel being at least a year behind on 5G.
These modems were all LTE modems though. They've always used separate Broadcom chips for WiFi/Bluetooth.
> "For some generations they used Qualcomm chips in the models for CDMA regions since Intel didn't make a CDMA modem."
Yeah. The Qualcomm-equipped international version of the iPhone X notoriously got significantly better LTE speeds compared to the Intel-modem version sold in UK/Europe.
The iPhone 7 used the Intel wireless chip. It was a major downgrade for me from my iPhone 6 at the time: speeds were lower for data transfers, and reception was weaker (could be attributed to an antenna change?).
Overall, I was very happy to get a phone that didn't use the Intel chips.
I was under the impression that modern laptops ship with Intel WiFi cards (I think I saw some Lenovo laptop use one). Do you mean it killed their modems?
Broadcom Bluetooth/WiFi chips ran out of firmware hot-patch RAM slots a long time ago. The company seems to be too cheap to respin the ROM mask with all the fixes baked in. From what I remember, even a brand new iPhone X ships with no room for BT firmware patching.
Broadcom chips aren't very well regarded in the PC gaming or server enthusiast markets either; they're a step above Realtek. Oddly, Intel is well regarded though...
It mostly comes down to drivers from what I understand but I'm not entirely familiar with all the objections.
> Broadcom chips aren't very well regarded in PC Gaming or Server enthusiast markets either
I always thought that it was funny that their NICs leave much to be desired, yet their ASICs power a wide variety of enterprise network switches capable of moving terabits of traffic at low latency.
Even Broadcom's wired NICs have different generations that came from different acquisitions: Tigon3 (tg3) came from the Alteon AceNIC, I forget where NetXtreme II (bnx2) came from, and NetXtreme-E (bnxt) is an in-house effort that is somewhat derived from the StrataXGS switches.
So yeah, Broadcom is a rollup of two dozen different companies so the quality of one product tells you nothing about others.
I would much rather have a Realtek or Intel controller than a Broadcom one. My ThinkPad has a Realtek, my last laptop (an XPS 13) had an Intel controller, and they just worked without issue.
Admittedly it's been a while since I had a broadcom controller, but I had all sorts of problems with it. Random disconnects etc.
I long ago realized that just buying an Intel wifi card and replacing whatever crap my laptop ships with is totally worth it. (At least many 15" laptops ship with replaceable cards, don't know about smaller ones).
Realteks also have documentation and thus open-source drivers, something that has long been (and very much still is) a problem with Broadcom and its notorious closedness.
In my experience, Intel has higher performance and more features (which might be a downside from the point of complexity and configuration) at a higher cost, while Realtek is slower, but cheap, simple and reliable.
I don't think Broadcom's WiFi chips are inherently bad. If anything I'd argue ALL WiFi is bad.
The only reason Realtek dominates is that they are the cheaper alternative. The only reason Intel dominates WiFi on PCs is that their chips are bundled as part of CPU sales for laptops.
Most of the half-decent WiFi APs still use Broadcom, although Qualcomm is increasingly making inroads. So Broadcom is stuck in the middle of nowhere in the consumer market: they don't have WiFi in PC laptops, nor do they have WiFi in Android smartphones, and Apple is the largest customer for their consumer-grade WiFi. That is why every 2-3 years you may read Broadcom's intentional leaks about wanting to sell their WiFi chipset division: they don't want to play ball with Apple's WiFi asking price.
You can't compare these two markets. Whether or not these chips work well comes down to firmware and drivers, and every customer gets their own custom datasheet, firmware, and driver. If you hack that into something good, your product will be good. (You're going to need to fix a lot of code and of course have a lot of meetings. Worth it if you're selling a million units, I suppose.) If you just buy some chips off the back of a truck in Shenzhen, solder them onto some circuit board you built in your garage, and use the three-years-out-of-date reference drivers written in a day by some random intern, the results will not be as good. Apple, I assume, uses the first approach when working with hardware vendors. The WiFi card you got for $3 on Amazon probably doesn't.
Don't take this as an endorsement of Broadcom or anything.
Dell is also following the first route, and they shipped a PC (Alienware) with shitty Broadcom drivers (that was 4 years ago). I have avoided Broadcom ever since.
Sure there are, it's called brcmfmac. If the AsahiLinux team hasn't already got it working, certainly Corellium did a few months ago in their much hyped public demo image.
Broadcom Wi-Fi languished for something around a decade under the "b43" driver before Broadcom showed up with their brcmfmac driver. It caused significant anger within the kernel community that they decided to write a new driver instead of just augmenting the existing one.
They have various product families and more than 2 drivers.
I need the bcmwl driver (dkms) for some 4-5 year old HP laptop and it has severe problems with some APs and slight problems with others. Overall it's just bad.
Debugging it showed that it's only a thin Linux wrapper around some huge binary blob they use. So not really FOSS, but of course that's what we are used to with firmware. This wasn't really firmware though; they run the blob on the CPU, IIRC. (It's been years since I debugged this; ever since, Broadcom is just something that raises a red flag for me.)
Marconi was the name of my kitty. A person at the veterinary clinic called him Macaroni by mistake (she’d read it but not heard it). The vet was rather adamant about making sure she got it right when I was just gonna let it go.
He turned to me to check that my guy was indeed named for the inventor of radio. Yep, that’s how he got his name. I wanted him to have a name with dignity.
Another Apple internal project was dubbed Sagan or Carl Sagan. It was later changed to BHA.[0] The story behind it is amusing. [1]
I've always been more of a Caroso fan, especially his second treatise. I've even learned Alegrezza d'amore from both his first and second works and did them back to back :)
Then again, Petit Riens is my favorite dance of Pesaro's... I call it mosh pit riens given how rowdy it can be!
A patent has to disclose the invention in order to be valid. An invention, however, can comprise a new and non-obvious combination of pre-existing components. Most inventions, in fact, comprise new and non-obvious arrangements of familiar components, some of which may be patented by others. If a component is patented by another, that just means the inventor won’t be able to actually build his invention without buying or licensing the component from the other inventor. I’m not familiar with the Marconi case, but that may be the situation.
So, I’m dumb about this stuff, but I have a wireless keyboard that uses a WiFi dongle to connect. I’ve lost connection with it a few times on my new m1 Mac. I had never lost connection with it in the previous 5 years.
Could this be related to the phenomenon the authors are describing?
Dongles are usually not Bluetooth or WiFi.
But they sometimes share the same frequencies.
The M1 MacBooks have been having Bluetooth issues.
Not applicable to your dongle issue, but for others:
One weird trick that works without fail for me is disabling WiFi, connecting your Bluetooth devices, and reenabling WiFi. Don’t know why it works.
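For what it's worth, here is a minimal sketch of automating that toggle - my own guess at scripting it, not something from the comment above. It assumes Python, macOS's built-in networksetup tool, and that the Wi-Fi interface is en0 (check networksetup -listallhardwareports if yours differs):

    # Rough sketch: toggle Wi-Fi around Bluetooth pairing on macOS.
    # "en0" is an assumption; your Wi-Fi interface name may differ.
    import subprocess

    def set_wifi(power):
        subprocess.run(["networksetup", "-setairportpower", "en0", power], check=True)

    set_wifi("off")
    input("Wi-Fi is off - connect your Bluetooth devices, then press Enter...")
    set_wifi("on")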
To elaborate on this for GP, if a keyboard has its own wireless dongle, it is usually in the 2.4GHz frequency band, same as used by microwave ovens, Bluetooth, and 2.4GHz Wi-Fi. The keyboard and/or dongle should list the frequency band it uses.
The Wi-Fi chipset in the computer is not used at all here. The dongle presents itself to the computer as a USB human input device, just like any other USB keyboard or mouse.
This is why you can use a keyboard like this to access the BIOS settings at boot time on a PC, which you can't do with a Bluetooth keyboard.
That said, the wireless connection between the dongle and the keyboard is subject to interference by other 2.4GHz devices.
So one thing to make sure of is to connect to a 5GHz Wi-Fi network, not 2.4GHz. And as you mentioned, Bluetooth can also interfere.
I had an interesting case of this some time ago. I was using a ThinkPad TrackPoint Wireless Keyboard II with a desktop computer on the floor several feet away. This keyboard can either use Bluetooth or its own USB dongle; I used the latter so I could access BIOS settings.
It worked great - except when I used my Bluetooth Apple AirPods with a phone (either iPhone or Android). Then the keyboard would start missing characters or repeating them wildly.
The solution was to use a USB extension cable to put the keyboard dongle on my desk near the keyboard itself. That improved the signal between the two and eliminated the Bluetooth interference.
Yeah same
I thought my laptop broke because my AirPods and my mouse weren't working. Turned out it was the other laptop connected to 2.4GHz across the room.
It's not so much the Type C connector causing the problem but rather the high data rates of SuperSpeed USB (or whichever one it is that causes the trouble; I can't keep track of the USB-IF's recent garbage). So Type A (9-pin) connectors are vulnerable as well. But given that a BT or BTLE dongle can't really run very fast, I doubt they're using anything above USB HS (480Mbps), so they're unlikely to interfere at 2.4GHz no matter what the connector.
Yeah, the dongles themselves work fine, it's other USB 3 devices that cause the problem (eg. hard drives or USB 3 hubs).
Bluetooth is also affected by the interference. The advantage of a dongle is that you can use an extension cable and move the transmitter away from whatever is causing the interference.
They always seem to come with a dongle just in case though, and sometimes people use the dongle despite having Bluetooth, or plug them in and then connect to the computer's built-in Bluetooth.
The problem with Bluetooth is that it just doesn't work well when you want to use one keyboard or mouse with multiple computers.
For this reason I prefer the dongles. Want to use your mouse on a different Mac? Just plug the dongle in a different computer.
I don't know how often I ended up at the password prompt with no way to type the password because the keyboard didn't connect, or was connected to a different Mac, and I had no idea how to get it to connect to the Mac I was trying to start.
Logitech also has a Bluetooth variant of this where the mouse will have 3 different "personalities" that you can associate with different devices and then switch with a button. As far as the host devices are concerned it's 3 different mice so there's no switching involved.
It works great and it's the best of both worlds. Convenient and no dongles (Bluetooth on my MacBook was very reliable for a mouse).
The Snow Leopard days of the Mac made me want one. Everything since then has been a disappointment. I miss the big little details like the Welcome intros and beautiful usage of CoreAnimation. Now it seems like nothing uses it, and macOS feels far, far less magical than it used to.
I could never see Apple coming up with the original Time Machine design anymore.
QuartzComposer was really impressive and enabled a bunch of really useful applications (as far as data-driven graphic displays go). And now it's just… gone.
Honestly, it's easy to hate on Apple, but you start to appreciate macOS once you have to use Windows or Linux for anything serious. Yes, there is no good desktop OS, but macOS sucks the least, and by far.
I think that is just your opinion, and not objective truth. I am a happy Debian user, currently forced to use macOS at my current company, and can't say I have a great time developing on a Mac. As a web developer, Linux is great: native Docker, native Linux terminal experience...
I do think this is a personal opinion, though, and I won't argue that Linux is better than macOS. If I had been a macOS user from the start I would maybe have been equally unhappy to be forced to use Linux.
What I'm trying to say is, we are all different people with different tastes and priorities, and I don't think we can objectively say that one OS is better (or "sucks the least").
macOS has a native BSD terminal experience, more or less. Docker is definitely slower. I like using Linux on the desktop, but I don't like maintaining it and my installations always seem to break eventually. To me, Windows is worse than Linux in every way except compatibility and stability.
I use Macs for programming, music (DAW), and video (NLE), and don't seem to have all these big problems people have. Several Macs over the years. And those are quite demanding applications (for programming, resource-wise; I didn't use Docker until recently, but did use Vagrant, VMware, and VirtualBox for ages).
My major gripes are swollen batteries (2 times in 20 years) and the crappy 2015-2019 keyboards (plus, whoever idiot designed the usb-at-the-bottom mouse).
But as far as the OS is concerned there are some bugs I see here and there, but I don't see the big fuss. I don't have peripherals disconnecting at random, crashes, data loss, and so on.
I've had way worse bugs and issues on Linux and Linux software over the years (which I also use since 1997), and even used as my main pro desktop for years (1999-2003, Debian) in far more optimistic times.
One key differentiator might be that I don't install any kind of crap (I have a no-haxies policy, for example, and aside from Little Snitch I avoided kexts too. A vendor has some special e.g. keyboard driver? Fuck it, I'll use a keyboard with class-compliant drivers), but I do install a lot of apps still.
Another might be that I usually re-install fresh every 2 years or so (I like to get rid of accumulated junk, including personal documents and downloaded files, and install only what I really need at the moment, sort of like spring cleaning).
> plus, whoever idiot designed the usb-at-the-bottom mouse
The batteries last so long in those mice, plus you can get an almost full day's charge in about two minutes. It takes a real idiot to run into issues with it. :)
I must be a real idiot then (which is very possibly the case, but I'm pretty sure not for disliking this design).
On top of the batteries starting to die much faster later on (the "last so long" thing is only true for the first 1-2 years), the warning is shown and dismissed far too easily, and far too late - and then they just die.
High Sierra was meh for me. They rewrote the window server to use Metal, but the Nvidia GPU driver was so terrible at it that it sometimes crashed and it was always sluggish. You literally got a performance boost when you switched to the integrated GPU. They fixed most of the issues with Nvidia in Mojave tho. It still leaks memory but at least it's usable.
And I agree that you skipped Mavericks (10.9), it was one of the best releases. I used it for like 3 years but had to update to whatever was current at the end of 2016 because I got a new job and needed the latest Xcode for it.
The Broadcom SoCs used in the RPi's are not "good stuff" at all - they are one of the worst pieces of junk I have ever had the displeasure of using.
They are extremely unstable, and if you want any kind of moderately reliable deployment of an RPi, you need to use a watchdog timer to reset the device if it freezes. Except - guess what - the watchdog timer itself is broken on the RPi! (As of the last time I tried this, IIRC with an RPi 4.) Either it's busted in hardware or the driver is broken (the user-space side is the trivial part - see the sketch below). Most semi-high-reliability devices built using RPis have elected to use external watchdogs, which is super annoying and would not be necessary on a correctly designed SoC.
Their clocking and boot schemes are also totally insane garbage, with random peripherals breaking if you change the GPU clock, and with the GPU silicon running the bootloader or something ridiculous like that.
I would prefer almost any other ARM SoC to the BCM used in the RPi.
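For reference, here is what the user-space side of a watchdog is supposed to look like when the driver works - a minimal sketch, assuming a Linux board that exposes /dev/watchdog (on the Pi that would be the bcm2835_wdt driver, which is exactly the part claimed to be unreliable above):

    # Minimal sketch of petting a Linux hardware watchdog from user space.
    # If this process ever stops writing, the hardware should reset the board.
    import time

    with open("/dev/watchdog", "wb", buffering=0) as wdt:
        while True:
            wdt.write(b"\0")   # any write "pets" the watchdog
            time.sleep(10)     # must be shorter than the configured timeout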
And the lack of 3D-accelerated video on the RPi 4. I wish I had known about these issues prior to purchasing the RPi. There is a good market for a small-form-factor general-purpose computer with 4GB RAM + a decent GPU. I am still not sure if Intel's NUC would be worth a try in this category.
Broadcom "enabled" the Pi to be created because the SoC for the original RPi 1 was a flop. They wanted to use it for Set Top Box (Cable TV, Google TV etc) devices but couldn't find any buyers
One of the engineers who worked on it worked out a deal with Broadcom to simply reuse the design and market it toward education instead
And it boots with the GPU which is not 3D accelerated in version 4 because the whole project was done by one person who just left. This is one of most popular SBCs out there. I wish it was a bit better.
Are there any high performance single-board computers on the market currently? I use a lot of Rpi's in personal projects, and I've been thinking about a home file server in a super small form-factor. What I would like is an SBC somewhere on the size scale of a smartphone, with performance in the range of a current flagship phone.
If WiFi packet capturing is broken, then this affects web browsing too, right? If most packets received are garbage, then this cuts down the bandwidth/download speed etc.?
Big Sur has been a significant drop in quality, even worse than Catalina. Some are bugs such as this one and some are just deliberately stupid design decisions like the changes to the notifications. I’m worried for the future of the Mac.
As a Linux laptop user, I'm bemused when the competition gets worse.
Maybe Linux on Apple Silicon will be the best of both worlds in a few years --- I think the dedicated porting effort will more than succeed and will get something more polished than Linux on Intel macs ever was.
I don't share your optimism. If proprietary firmware and dkms blobs remain _de rigueur_, I can't see the situation on linux improving w.r.t. things like graphics cards and wifi chips.
WiFi is literally the only blob Linux will have to deal with as far as we can tell, and I'll make sure our installer pulls it from the system firmware partition (which will always exist on a functional Mac) so nobody will have to worry about it. All the other blobs are loaded by system firmware before Linux even gets to run, so they might as well be invisible to us (UEFI on PCs does much the same thing).
No DKMS, ever. All our drivers are headed to mainline. I'm not even going to bother with DKMS most likely; people who want to use bleeding edge drivers before they're merged can use our kernel trees (and we'll have builds available for those folks).
I wouldn't call it optimism per se, not in absolute terms at least. That will certainly continue to be a problem. But Apple messing up their own software, either by accident or on purpose, is a nice boon. And a single piece of consumer hardware being so good that there's a Patreon for it is also interesting and good.
For the reasons you say, I personally would rather buy e.g. some Pine64 thing that might just benefit from this porting work. (Probably more in the userland than the kernel itself, tbh.)
The one good path I see for hardware is that in the system-on-chip era, open designs can gain a foothold by saving companies money, just as no one could compete with free Linux for server software two decades prior. There was no hope of getting a foothold when 1 design = 1 chip, but a world where people can mooch some free IP for their doohickey might do the trick.
Still nowhere near as bad as Linux on a laptop was the last time I tried it. I haven't used Linux on a laptop in a decade because it was so bad I swore it off.
I have Fedora 33-34 with GNOME (initially with KDE, as I was a fan back in my Ubuntu/Arch days) and I only keep it to install updates from time to time to see if it is as usable for general use as macOS.
A lot has changed in the last decade, particularly with TLP and now suspend / hibernate being set up by default on most distributions. You might give it a try again.
It worked in my case. The issues with Catalina and the mac hardware pushed me to switch to Linux for my personal dev machine a few years ago. The transition was tough at first but now I love that machine. It makes me wayyy happier than the MacBook I have to use at work.
How does Linux handle scaled displays? MacOS handles it best, Windows has issues and inconsistencies, and last time I checked Linux support was abysmal.
I run macOS almost entirely because multi-DPI runs so seamlessly.
Windows support is kinda funny: it works fairly well normally but has all kinds of strange bugs. For example, if you switch in/out of full screen, windows will just randomly move between displays. There are quite a few inconsistent scaling bugs.
I haven't played with xrandr multidpi, although I hear a lot of complaints about it. I've tried wayland, but I wasn't keen. I was very frustrated with sway (twm), which advertises itself as a drop in replacement for i3, but does many things differently. The fractional scaling on wayland can be blurry. There are still loads of bugs.
Linux on desktop users shouldn’t be bemused until Linux desktop experience isn’t a pile of UX trash.
Linux on the desktop is still so frustrating, ugly, and only marginally more useful than a Mac for extremely specific use cases, that I would rather use Windows + WSL2.
Whenever I read these comments I really wonder what do you do with your desktop environment. I spend most of the time inside an app, mostly the browser, the terminal or the editor. As long as the desktop env doesn't get in the way it's good. The only annoyance I experience when I switch back and forth from GNOME to MacOS is really the different keyboard shortcuts and that I have to install an additional tool (Rectangle) to get basic window tiling on mac. I can even use Windows these days, as long as WSL is installed
I've become a WSL convert now, as I can trust Windows to deal with sleep/wakeup and WiFi much better than Linux. If macOS experiences a developer exodus I think it's likely many will follow suit, especially with things like WSLg coming out.
These past 6 months with WSL2 are the most productive I've been since I've spent 2 years on the Apple ecosystem or 15 years on Linux.
I don't have to debug OS wide breakages or weird Wifi/BT issues or unstable GUIs and it's running Linux, not BSD on a weird kernel and file system with... interesting permissions and security settings.
And I can play games and use Docker at native speed.
> Linux desktop experience isn’t a pile of UX trash.
Desktop Linux is the only bearable desktop UX. Anyone who actually prefers the UX of desktop Windows is mentally deranged and should not be allowed to make user facing software.
Hey, I make no promises as to my level of sanity.
But yes, desktop Linux UX is trash.
Trash being defined by incomprehensible design and inconsistent implementation of proper design lessons learned in the last 20 years.
Early versions of Catalina were also not great. (Heck, early versions of any major macos release tend to be relatively buggy!)
I think people tend to forget things like that, and only remember the later releases of an OS since that is the version they are likely switching off of it to the newer one.
I personally have found Big Sur no more buggy than most releases. Better than some[1], worse than others. About middling. Not bad for it including new hardware (M1)!
[1]: ye gods, remember how bad 10.10 was with discoveryd before they rolled back to mDNSResponder! yikes!
Later versions of Catalina remain not great. The problem is that each version of macOS compounds onto the next, since there's no time for the "tock" after the "tick" anymore. Sure, catastrophic bugs may get fixed, but some stuff just seems to degrade forever (I can't remember the last time drag and drop worked like I expected it to... even in Apple apps). The OS usually launches around November, and by WWDC we're already being told about macOS N + 1 and getting betas of that, so you basically get half a year of "bug fixes" before focus shifts to the next release. There's no "stable" period anymore.
Tiger through Lion had a 2-year release cycle. Ever since Mountain Lion, it's been a 1-year cycle. Further complicating this is that these have included unintuitively "ambitious" (or at least "labor intensive") changes. Big Sur was a UI redesign that touched the entire system. But let's not forget that Mojave was also a huge undertaking to get everything working with Dark Mode (both for Apple developers and third party developers). Meanwhile entire suites of apps get thrown out and the rewrites are used to dogfood Catalyst. It's just infinite churn. The core of the technology has taken on the properties of fashion. And that is just not a great recipe for stability.
I've personally chosen to try to approximate the missing "tock" cycle by upgrading to every other macOS. Clearly this is nowhere near the same benefit of Apple actually dedicating a full year and a half to a release, but at least I don't have to go through the (largely arbitrary) UI changes every year or so, and I have the "bugs I know" to deal with vs. the ones I don't yet know about.
I just moved on to Big Sur since I got an M1... the previous MBP was on High Sierra, so you can imagine my culture shock! Big Sur reminds me of nothing more than Windows XP, and that's not a compliment.
I’ve taken a big liking to Big Sur. It is a graceful evolution of the UI that needs a lot more precision work (like Preferences, the menu bar shortcuts and Notifications) but it’s a solid base for the next couple of years.
I really, really wish they'd ditch the yearly cycle for macOS. It's a mature operating system—older now than classic Mac OS was when OS X launched. It's not missing huge chunks of functionality like iOS, nor is it in lockstep with yearly hardware like the iPhone.
At this point all I do with my Mac is work. I would much rather it be rock solid and reliable than be crammed with new features and change for the sake of change every year. Most of the features that get added for iOS parity/compatibility could probably be delivered in point updates anyway.
The Apple Silicon transition is arguably something they have handled extremely well, with the only unknown being what the pro-level chip will be like. Hopefully it can all be over sooner rather than later and they can get back to work on general software quality.
You would think that with each release being so much worse than previous versions that eventually OSX would quit booting up and become nothing more than a proof of concept.
So much hyperbole in this thread. A solid subset of users declares that the sky is falling and this is the end for Apple. Rinse and repeat with each iteration. I am not saying there haven't been missteps, but it ain't all bad. My personal experience with Big Sur has largely been a non-event, and I have tons of boutique USB synthesizers, audio interfaces, and software. I use Windows, Mac, and dabble in a few Linux distros. None are perfect, but they're all pretty good.
Does Big Sur let you loopback audio out yet, or are third-party extensions still required? Do Soundflower & BlackHole even work on M1?
The fact there hasn't been native loopback for so long is mind-boggling. I'm a Mac user and use BlackHole.
I've lost count of the number of times I've been on a call and someone with a Mac has gone to share screen and been confused as to why their audio isn't being shared. COVID/WFH has made this a regular occurrence.
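For reference, once BlackHole (or any loopback device) is installed, capturing that audio from code is simple. Here's a rough sketch using the third-party sounddevice and soundfile Python libraries; the device name "BlackHole 2ch" and a pre-configured multi-output device (so you still hear the audio) are assumptions:

    # Rough sketch: record whatever is routed into BlackHole to a WAV file.
    # Requires `pip install sounddevice soundfile`.
    import sounddevice as sd
    import soundfile as sf

    samplerate = 48000
    seconds = 10
    audio = sd.rec(int(seconds * samplerate), samplerate=samplerate,
                   channels=2, device="BlackHole 2ch")
    sd.wait()                              # block until recording finishes
    sf.write("loopback.wav", audio, samplerate)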
Agreed, this isn't a datapoint. And I have an app called Loopback by Rogue Amoeba that does this job very well. There are plenty of reasons you could assume OSX doesn't include this natively. The biggest I can think of is that if you can capture streaming audio with no quality loss, then you kinda blow up some business models.
I don't hear any detailed descriptions of how the software quality has declined. What I generally see is bugs being reported, which is then called evidence that Apple software is bad. I think the UMN story has shown that all OS software is large, complex, and bug-ridden.
I personally don't love how OSX is becoming more of a walled garden, but I also think that, given the fact that bad people exist to make sure we can't have anything nice, it's a reasonable reaction to try and make the OS more secure. The days of the OS being your playground are ending. I wish that weren't true, but there are crappy folks out there that want to steal your data and identity, scam you, and commit computer crimes against you.
I will reiterate that I think Big Sur was probably the smoothest big release of macOS I've been through (since Tiger). Others may have different experiences, but I can only speak to mine.
Software quality is a moving target. iOS 2.0 was exceptional quality in 2008, if a competitor released something akin to iOS 2.0 now they'd be laughed at.
Additionally, the comment was in response to a comment about audio interfaces, so it's relevant on multiple accounts.
This is a popular trope but I’d really want to see some supporting data. It’s notoriously hard to correct for only hearing from people who found something they don’t like, people assuming their experience is universally shared, and for people to tend to forget old problems.
Anecdotally, I think Big Sur is better than either Mojave or Catalina: in my experience, it solves a lot of the bugs I was trying to avoid by sticking with High Sierra.
Big Sur made it worse. For probably the greater part of the last 6-7 years I've felt Windows performs better on equivalent hardware (even after Mac OS fresh installs). By Big Sur it's so bad the latest Intel MacBooks chug on it.
That said, the M1 changes things, IME. It's just sad it takes that much.
I disagree.
As a 20-year Mac user, Catalina was waaay worse.
Big Sur has been painless. So much so that we've rolled it out enterprise-wide at [Giant Internet Company].
With the two previous transitions, Apple went through all sorts of weird pains. This time is no different, but unfortunately they still have to release yearly and implement the macOS version of each new service.