Since Catalina I have been getting daily freezes, where the entire operating system locks up. I can still move the cursor around and audio keeps playing as well, but everything else is completely frozen.
Once in a while the freeze lasts so long that the computer crashes, resulting in a crash report:
Termination Reason: WATCHDOG, [0x1] monitoring timed out for service
Termination Details: WATCHDOG, checkin with service: WindowServer returned not alive with context:
unresponsive work processor(s): WindowServer main thread
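For anyone who wants to compare notes, this is roughly how I dig the reports out afterwards (a hedged sketch; these are the standard macOS report locations and the unified-log query as I understand it, so adjust to taste):

    # List recent diagnostic reports and find the ones mentioning WindowServer
    ls -lt /Library/Logs/DiagnosticReports | head
    grep -l WindowServer /Library/Logs/DiagnosticReports/* 2>/dev/null

    # Pull the last half hour of WindowServer messages from the unified log
    log show --last 30m --predicate 'process == "WindowServer"' | tail -n 100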
I am at my wits' end as to what could be causing this. Is anyone else experiencing similar freezes?
There's like a dozen threads on MacRumors and the Apple discussion forums on various crashes, mostly related to sleep, but also to the watchdog timing out for WindowServer due to GPU lockups.
> After waking from sleep and running a process using hardware accelerated video decoding / encoding, the UI will freeze for a couple of seconds, then starts working again, then freezes again, and so on. The mouse can still be moved but the UI is non-responsive. Eventually, after waiting for long enough, a kernel panic occurs and the machine reboots.
How to reproduce (a rough script for cycling the sleep/wake steps is sketched after the list):
1. Use any Navi10 or Navi14 graphics card inside a Mac Pro 2019 or an eGPU enclosure, or use a MacBook Pro 16.
2. Open Video Proc or any other app utilizing hardware video acceleration (e.g. Safari playing YouTube, exporting with FCPX / iMovie).
3. Run the video acceleration test in Video Proc, start playing an H.264 / H.265 video in Safari, or export an H.264 / H.265 file from FCPX / iMovie.
4. Put the machine to sleep.
5. Wake it again and re-test video acceleration.
6. If the machine doesn't freeze, repeat the cycle from step 4.
7. Repeat this until the UI starts freezing / video acceleration stops working.
8. Eventually the machine will kernel panic after several minutes.
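If you want to cycle steps 4–7 without babysitting the machine, something along these lines should work. This is an untested sketch: pmset needs root, the relative-wake syntax may differ between macOS versions (pmset schedule wake with an absolute time is the documented alternative), and the video re-test itself is left as a placeholder.

    # Repeatedly sleep the machine and wake it ~2 minutes later.
    while true; do
        sudo pmset relative wake 120   # schedule an automatic wake in ~120 seconds
        sudo pmset sleepnow            # sleep immediately
        sleep 180                      # wait until it has woken back up
        # ...re-run the hardware-decode test here and watch for UI freezes...
    done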
I’ve had this happen on my work MBP (13” non-Touch Bar, Bluetooth Magic Keyboard and Trackpad, and an HDMI monitor connected via a cheap USB-C hub) a handful of times. I really have no idea what triggers it.
I don't even let my screens turn off anymore. These days I get this weird thing where, after I wake the monitors, the external monitor doesn't get a signal, and while it's plugged in the mouse jumps around at like 2 fps on the main display, as if there's some weird interrupt denial of service going on.
To fix this, I have to restart just to use my monitor again.
I have that problem too and I tried that, but then the OS forgot the setting during an update, and now I'm back to unplugging and replugging my adapter every time I grab a coffee or take a phone call.
External display wakeup problems have been an issue on many generations of MacBook Pro going back to PowerBooks (even with Apple displays). Very annoying!
I experienced this issue using an HDMI to DVI adapter from my 2018 Mac mini to a Dell Ultrasharp display. After purchasing a replacement cable that didn't remedy the problem, I discovered that it's a known issue.
The fix was to use a USB-C -> DisplayPort cable (using DisplayPort via USB-C alternate mode). It appears that the HDMI port can't be used with a DVI converter. DVI has been superseded by HDMI and DisplayPort for some time now, but it was still a disappointment.
Genuine question: How is GPU accelerated rendering more prone to crashes than just doing it on the CPU?
Also, on macOS, Apple practically controls the whole stack. Putting macOS on something that isn’t their hardware is a challenge.
Side note: there’s errata on the Intel CPUs, but I’m sure there’s also some (confidential) errata on the “A” series processors in iDevices. If iOS can work around the errata, why couldn’t macOS?
On Mac and Windows, you are relying on drivers to do the work, and you don't have their source code, while your code runs on top of OpenGL, DirectX, or Metal. And on top of all that there is a composition layer.
Modern GPU drivers are an insane piece of optimisation / engineering, but the complexity means you will never quite grasp the problem when things go wrong. In the early days of Firefox and Thunderbird it was recommended that you turn off hardware acceleration whenever you saw a font or graphics rendering problem. While a lot of these have been fixed (or migrated), they still pop up from time to time.
There was a recent thread on HN (I can't find it, sorry) where Firefox asked for help with a GPU rendering bug they were having a hard time figuring out.
On iOS, Apple owns the whole thing, so they can figure out what's wrong and fix it, since Apple writes their own GPU drivers.
The GPU is a whole separate computer, running code independently, with its own clock scaling and power management. Having two separate things that can break is more complexity than one thing that can break. And having to have the two communicate in order to do delicate procedures like waking from sleep increases complexity dramatically.
Also, on many Macs, it's actually three separate things. There is the CPU, the integrated GPU, and the discrete GPU. The system switches between the two GPUs dynamically based on load and whether the system is plugged in.
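For what it's worth, on dual-GPU MacBook Pros you can at least see which GPUs the system knows about, and on some models pin the switching behaviour, from the command line. This is a hedged sketch; the gpuswitch key isn't present on single-GPU machines and its availability varies by model.

    # List the GPUs macOS sees and which displays they drive
    system_profiler SPDisplaysDataType

    # On some dual-GPU MacBook Pros: 0 = integrated only, 1 = discrete only,
    # 2 = automatic switching (the default). Treat as a sketch, not gospel.
    sudo pmset -a gpuswitch 2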
> May be it is better on iOS since Apple owns the whole stack from Metal to Drivers to actual GPU
This goes the other way than what you would think.
Owning the whole stack lets you reach through abstraction layers. This can be good for, say, efficiency. You don't have to assume the implementation of another component could work in multiple different ways, you can look inside it and do something that only works with the specific implementation you're actually using.
It tends to be bad for reliability. Having clean interfaces between things makes them easier to reason about, because they're independent components instead of a giant ball of spaghetti. Each component only has to consider the other component's published interface and not have to worry about every detail of its internals. It's "do one thing and do it well" vs. "nothing is ever guaranteed not to change in any way at any time."
The latter gives you greater flexibility at the cost of greater instability and fragility.
Why do you guys put up with this? Do you like pain?
Edit: I'm not trying to troll, I genuinely want to know why people put up with this. I bounce between Linux and Windows and have no loyalty to either of them. I've settled on Windows with WSL2 on my desktop but I'm tempted again to install the new Fedora on my Thinkpad. Is it the lock in? With my hardware I can run anything I want except for macOS. Once you buy Apple hardware it seems incredibly difficult to go elsewhere.
I've got a brand-new 2019 16" and am dealing with the widespread crashes on wake from sleep. Honestly, it's a harder decision to send it back than you might think, for several reasons:
1) The hardware quality of other OEMs is bad too. Lenovo has really gone downhill. I sold my maxed-out X1 Carbon 7th Gen, which in less than a year: a) came with an LTE card that never worked for more than an hour at a time; b) had the display bezel start peeling off (it's a sticker!); c) developed a bunch of dead pixels; d) started randomly locking up; e) had weird issues with power usage, where some service related to the touchpad would go crazy and start using all the CPU.
2) Other laptop makers make design choices and trade-offs that I find unacceptable. For me, the 16" MBP is the perfect balance of power, weight, and battery life. It compromises by using a 3K screen instead of a 4K screen, and gets 11+ hours of light usage. Alternatives like the XPS 15 or Lenovo X1 Extreme force you to choose between an FHD panel and good battery life, or a 4K panel and substantially shorter battery life. The Surface Laptop 15" uses a 15W ultra-mobile processor in a machine that runs $2,800 when configured with 32GB of RAM, and has a battery half the size of the one in the MBP 16". Many 15" laptop vendors include a numeric keypad, which is a deal-breaker for me because it forces you to type at an angle to keep your hands centered in front of you when in the home position.
A guy at work installed Windows on his MacBook. He claimed it was the best Windows machine around. I'm not sure if his tune changed after Retina, because I really do have to hand it to Apple there: their high-DPI scaling is flawless.
I bootcamp'd Windows LTSC on my old late 2013 MBP. Your friend's experience is similar to mine. Everything is smooth and the magic trackpad 2 works flawlessly, which used to be an issue with Windows.
I won't switch back to Windows, but if I ever need to, I'd definitely consider using a MBP for it.
I had similar issues with OEM laptop quality years ago with both HP and Dell. System76 laptops have been a pleasure to use.
The new Lemur Pro has a 10th gen CPU, does not have a numpad, and does have an FHD panel. They claim up to 14 hours of battery life on Linux. (21 hours coding on VIM.)
I’m often away from a power outlet, so battery life is paramount. The battery life in Bootcamp is unacceptably short (3 hours) because the DGPU is always on.
I was recently using Bootcamp on a relative's 2020 Macbook Air, and I was generally impressed with how well everything worked. Especially the Trackpad, which I remembered being a problem under Bootcamp many years ago.
If I didn't dislike Windows so much, I could see myself using that setup full time.
Even my old 2012 MacBook Pro is very poor under BootCamp. Getting the right drivers for it is the worst bit, because who knows whether I should be using Bootcamp 5 or 6? The display brightness and volume keys no longer work. The discrete GPU is used all the time, so battery life is very bad.
What magic do I need to perform to do this? It worked and then stopped working after a Windows 10 update and Bootcamp 5 (for my 2012 mac) doesn't know about Windows 10 drivers or something...
If you're going to run Windows you'd likely be better off getting a Razer or one of the other systems that, at least for GPU work, is like an order of magnitude faster than a MacBook Pro.
X1 Carbon, yeah, good, but the T-series are better. With any thin laptop you're gonna run into the same MacBook problems, due to the design constraints of keeping the laptop thin and lightweight. I have a T490, hardware solid all around, though Lenovo put clips on the bottom panel. Don't know why. I know previous models (T480 etc.) didn't have them.
Yes, this is reality. But for some reason some people decide that they're absolutely certain about the huge scale of issues with Apple products while being absolutely certain about the lack of scale for issues everywhere else. It couldn't _possibly_ be their bubble.
I'm seriously considering the new XPS 15. It hits much of the same great balance as the 16" MBP, and like the Macbook Pro, it has a 16:10 display. I'm gun shy, because of potential coil whine: https://xps-15.fandom.com/wiki/Coil_Whine
> As coil whine is a big problem on this model, we will try to find out which parts are causing these noises. First we need to disassemble this notebook and identify the noisy parts.
> I immediately noticed I felt slightly 'uncomfortable' when it was on, and within about 30 seconds realised that it was emitting coil whine at a volume/pitch high enough to physically bother me. It's like an ever-present noise that I can only imagine must be how low levels of tinnitus feels to people who have to deal with that type of thing. I am pretty disappointed to say the least.
I might just assume that this is random internet griping, but I had this exact same problem on my last Dell laptop (admittedly, it was almost 20 years ago) and it drove me insane when working late at night. It's one of those little "quality of life" things I'm willing to pay extra for. (Not to say that Apple doesn't have quality-of-life issues, like randomly removing ports. But for the most part I find I can solve those issues just by throwing money at the problem. I can't do that with coil whine, or a non-centered keyboard, or a laptop that only offers either an FHD or a 4K display and no power-efficient compromise.)
I switched to a XPS-15 9570 from a MBP of roughly equiv spec, but previous gen, after the last round of laptop shuffles at work - I went from MacOS to Windows at the same time in order to feel the pain of our environment under Windows.
I'm not saying that it's a bad laptop. But there are just so many little annoying things about it after coming from the MBP and macOS:
* Trackpad is subjectively worse, but I can't tell you why beyond that it's just smaller - but I am objectively much less accurate with it and trigger tap-clicks when I don't mean to. The trackpad positioning is also slightly uncomfortable to use.
* Fan noise - the fan curve has it ramping up earlier than a MBP
* Windows was hosed out of the box (fine, fixable, but it wasn't a good start)
* The "soft" covering results in so many fingerprints
* Webcam on the bottom bezel is an awful angle
Pros:
* Great screen, touch is a nice add-on but found I rarely use it now
* Keyboard isn't terrible
* Power adapter isn't a brick
Would I do it again? Honestly, not sure. My daily driver isn't the laptop and I just deal. If I had to use the laptop daily, I'd probably reconsider.
I tried the XPS 15 7590 late last year before I switched back to Mac when the 16" MBP came out.
It felt like absolute garbage in comparison. Trackpad was so bad I had to use the touchscreen to scroll. Internet stopped working intermittently. Display felt like far inferior quality (and, like you said, your choices are too few pixels or too many).
It was honestly a confusing experience to me, like reviewers or other people who treat these machines as comparable were living on a different planet or something. Of course I don't expect everyone to like Macs and there's nothing wrong with preferring the Dell but I don't understand why the quality differences are not more widely acknowledged.
I've been using Macintoshes since 1986. I've used Mac OS X since 10.1. I've used dozens of different models of Apple computer.
There was only a single combination of hardware/OS X release that I ever recall experiencing kernel panics often enough to bother me. It was one of the first generations of Intel iMacs on release 10.4 or 10.5.
I'm currently using 10.15 (Catalina) on a work-provided 2018 13" MBP hooked to an external display via a Thunderbolt 2 dock. I use it for compiling Android and iOS mobile apps, web browsing, accessing work via VPN, Slack, Google Meet, all the normal things you'd expect of a developer machine.
I also have a personal 2017 non-retina MBA running Catalina. My wife has the same laptop running Mojave. Both kids have 11" MBA's running Mojave. There's a 2013 iMac in the house running Mojave.
I can't recall having seen a kernel panic on a single one of these machines ever. I'm not denying they happen to other folks, but if this is so common an issue, I'm surprised I haven't seen it across any of my machines.
Are there other issues with macOS and iOS that occasionally drive me crazy? Sure, let me tell you about iCloud drive syncing issues I've had to deal with. Or the bug in Bootcamp I just experienced yesterday where it doesn't properly account for the SSD portion of a Fusion drive when partitioning the drive for Mac/Windows. Or how screen rotation sometimes just gets stuck on iOS in the wrong orientation or at the wrong aspect ratio.
But having just installed Windows 10 yesterday... OMG, I'll take macOS over Windows any day of the week.
And no, Linux is still not an alternative. (I'm a recovering Sys Admin and used Linux as my desktop for about 5 years.) I want a single OS I can run on all of my devices, and Linux just isn't an acceptable consumer OS for myself and certainly not for my wife and kids. I want an OS that just connects to WiFi, just works with my printer, can sync photos and music with my phone, allows me to rent movies, sleeps and wakes properly, that I don't have to futz with, etc. I want an excellent tablet experience for my artist daughter. As much as possible, I don't want to have to be tech support for my family.
So that's why this guy puts up with Apple, warts and all. It's the best set of tradeoffs for me and my family.
It's kind of why I still continue to use my MacBook (non-Pro), despite the warts that Catalina brings. The reason is that there's no credible desktop alternative to macOS, or maybe Windows. On macOS I don't find myself wrestling to get things done or googling general OS issues, which is what happens when I try installing a Linux distro these days; and besides, there's WSL2.
TouchID handles authentication for me, so I don't have to repeatedly type in my password unless I restart the machine. Interestingly, unlocking can also be done with an Apple Watch if you're savvy enough, but I wouldn't expect typical consumers to care about anything beyond TouchID. They would probably ask whether this exists on Windows or on a Linux distro, and the answer is 'there are equivalents': for the former it depends on the computer, and for the latter the kernel may support it, but it requires some digging to get it working on your chosen distro, which isn't acceptable for a "consumer OS".
Even with Apple's incompetencies with NVIDIA, Metal requirements, no 32-bit support, and their silly notarisation services snooping my executables, in the end I still use it. Why? Because macOS still 'just works' and gets out of the way. Some Linux distros are getting there, but I have no time to play around with my dotfiles, desktop environment, or window manager when I need to get work done. Which is why I'm now dual-booting with Windows for WSL2 instead of installing a Linux distro and worrying about breaking it by installing a conflicting library or system component.
In general, you're most likely to see issues with the first model of any given Intel microarch, IME. The 2016 MBPs were quite kernel-panic-y for a while, before OS updates smoothed it out. Same for the original Haswells, I think.
Not sure what the 16"'s excuse is, tho; that's still a Skylake++++ (possibly more pluses, but basically still Skylake).
Personally I'm staying the hell away from Catalina, but it's not like Windows is the land of milk & honey. I've lost overnight testing data multiple times thanks to Windows' hostile "we're rebooting right now, whether you want to or not" upgrade strategy, and their mandatory telemetry bullshit is well-known at this point.
Basically: Catalina sucks, but there's nowhere to go that's unequivocally better.
Mojave will be good for a long, long time to come. I run El Capitan on one machine and it still gets security updates. Homebrew sometimes gets confused trying to find an El-Cap specific formula but that's usually fixable.
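When it does get confused, forcing a source build has usually been enough for me; a hedged sketch, since Homebrew only ships prebuilt bottles for recent macOS versions and older systems end up compiling from source anyway (the formula name is a placeholder):

    # No bottle for this macOS release? Build the formula locally instead.
    brew install --build-from-source <formula>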
> I run El Capitan on one machine and it still gets security updates.
No, it doesn’t.
That caveat aside, if you can switch package managers, MacPorts explicitly supports older OS X releases (back to Tiger!) and thus may give you fewer problems.
For non-Home editions of Windows you can change the policy for this (which I've done, and which works well, for exactly the reasons you've mentioned):
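From memory (so double-check the exact names), it's the Windows Update policy "No auto-restart with logged on users for scheduled automatic updates installations" under Computer Configuration > Administrative Templates > Windows Components > Windows Update in gpedit.msc, or equivalently via the registry:

    REM Hedged sketch; run in an elevated command prompt.
    reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v NoAutoRebootWithLoggedOnUsers /t REG_DWORD /d 1 /f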
Because the alternatives are also bad in a bunch of ways and people still prefer these problems to those problems. Let's not pretend like other hardware vendors and operating systems are magical fairy realms of candy and rainbows where nothing goes wrong, everything makes sense, and everyone is happy all the time.
Mojave wasn't perfect (I got a gray screen every few months or so), but Catalina has been measurably less stable on the same hardware, same monitors, same peripherals, etc. to the point where (as stated by others) I can't even safely sleep/hibernate with external monitors connected anymore.
Even if it were a hardware problem that Catalina somehow tickled more often, if it only affected 5% of users it's still a problem. Being one of the lucky 95% doesn't diminish the experience of the 5%, especially if with Mojave the affected user base could have been 1%. We're not talking about $200 Chromebooks. These are $3K+ professional tools.
The common experience is yours - and mine. I’ve had minor problems with Catalina, mostly around slowness and my 2014 iMac 5k randomly sleeping from overheating (which might not be Catalina, might just be dust).
I use it heavily 10+ hours a day and the only problem I’ve had is my windows are sometimes resized when I wake it from sleep. Bug related to using an external monitor.
I'm sure many people haven't noticed any problems when using WD's SMR drives in a RAID either. Does that mean there are no issues with SMR in RAID, or just that the problem isn't common enough for them to have hit it?
I really think Snow Leopard is remembered overly fondly because its direct successor—Lion—was quite bad. But, Apple fixed Lion's problems in 10.8 (Mountain Lion), and avoided introducing any new ones in 10.9 (Mavericks).
Consequently, 10.9/Mavericks is my actual favorite version, and what I'm typing on right now.
When I decided I was going to downgrade my main machines earlier this year, I did a lot of comparisons of Snow Leopard, Mountain Lion, and Mavericks. I could not see a compelling reason to pick Snow Leopard or Mountain Lion.
Mavericks performs well and never seems to crash. When I loaded all three up in VMs and starved them of resources, Mavericks remained the most responsive, I suspect because it supports memory compression. (Interestingly, Snow Leopard lists 1 GB of memory as the minimum required, whereas Mavericks lists 2 GB, but Mavericks still seemingly did better with 1 GB than Snow Leopard.)
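(If you want to watch the compressor at work, vm_stat grew compressor counters in 10.9; a rough check, with the exact line names varying slightly between releases:)

    # Snapshot of VM statistics; the compressor lines first appeared in Mavericks.
    vm_stat | grep -i compressor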
Design-wise, while I prefer Snow Leopard's higher contrast in some places, it has a tendency to look either garish or plain in others. 10.8/9 look more refined, without going all flat. And while 10.8/9 have a lot of stupid defaults, I have a setup script I've been using for years which fixes most of them: https://github.com/Wowfunhappy/Mac-OS-X-Wowfunhappy-Setup-Sc...
The one big advantage of Snow Leopard would have been compatibility with PowerPC apps via Rosetta. But, there are only a handful of PowerPC apps I'd actually want to use, and a lot of Intel apps that support 10.9 but not 10.6.
It's years of fantastic, painless experience, and even now it's relatively good. Before using these laptops I never knew what it meant to just slam the lid, then hours or sometimes days later open it and pick up right where you left off, instantly.
The privacy. The comfort of having two devices (phone and computer) to be in sync in a private and safe manner. Excellent app ecosystem. Timely and long term updates, upgrades, and support.
But as the parent commenter says I am at my wit's end as well, for the first time since I started using Macs a decade ago - crappy recent OSX and iOS releases, iPhones getting bigger and bigger, being trapped in a data/communication silo etc.
So why am I still here? Because there's nowhere else that's better.
Simple answer: MacOS and Apple hardware are still a much better and more productive experience for many developers and everyday computer users. Windows and Linux both have their own problems.
That question presumes that crashing is the only axis upon which computer users experience pain.
Linux users said the same thing in 1996. It didn’t work then, either.
Anyone working on desktop Linux should know they’ve hit saturation with their current approach. What‘s missing or broken that’s causing Mac users to prefer a system that crashes once a day?
As a macos and linux user I’d say linux has the same problems it always had:
- driver support (even my thinkpad has an unsupported fingerprint driver, and a buggy intel wifi driver)
- fragmentation that affects user experience. For example, there is still the gnome / kde divide, and using a combination of apps from both is visually jarring.
- lack of microsoft and adobe apps, lack of high quality productivity apps (e.g. i would love a pixelmator equivalent)
- many paper cuts, like how copy and paste of images only works half of the time
- when things break, you have to drop down to the terminal to fix them, instead of running some automated repair wizard
The post I replied to mentioned macOS locking up and being unusable daily. I can't remember the last time this happened to me on Windows. It's definitely been 4 or more years since the last time I've looked at a Windows BSOD. WSL2 is pre-release but still hasn't given me issues.
So, no snark intended, but where have you been for the last 5 years? Windows and Microsoft are undergoing a renaissance and moving ahead and improving. macOS is regressing.
What kind of '90s hardware do you have that you can't run macOS on it? I've been running macOS in both Hackintosh and virtual machines for over a decade now.
1 - Hardware acceleration can be used if you use the right virtualization tool.
2 - Not really. On paper, perhaps; in real life, not really. I had Apple support live via screen sharing and it was about my app (well, my client's app) on their store. They couldn't care less that I was running High Sierra in a VM. Maybe because the project I was working on was a successful app that was bringing revenue to them (and my client as well)? In the end, money talks.
I don't like pain, but I prefer whichever option gives the least amount of it, and in my case macOS fits the bill best.
I bought an XPS 13 laptop about a year ago, in the hope of finally jumping to the other side.
I have tried to use both Linux and Windows, and the UX just isn't there. Linux is still the most buggy desktop OS (I know, it's not an OS but a kernel; in this context I mean every possible combination of drivers, programs, and desktop environments).
If we start with the XPS 13, which is/was regularly recommended as the Linux laptop, the most annoying things are coil whine and fan loudness. I cannot even watch a movie at night without either of them bothering me. My MBP is fairly quiet, especially when not doing much (hardware-accelerated decoding of H.264 shouldn't produce any noise). It's not just the laptop: even the charger has coil whine (thankfully I can just reuse the MBP charger here).
Moving on, let's talk about Linux. There is still no sane way of having multiple screens with different scaling factors (I can't see anything useful on a 4K 13" display without scaling, and an external 4K monitor shouldn't need it). I have tried Wayland, which sort of fixes the issue, but there are other problems there. Last time I checked, GNOME on Wayland cannot handle video playback and browser scrolling on the same screen at once: when you scroll content in the browser, video frames drop and make the video unwatchable. I forget what the underlying issue was, but it had been open for quite a while. Most things are broken on KDE Wayland, and for the rest you usually need hacks.
Then we have things like bad performance (especially graphics) in web browsers that work fine on Windows on the same machine.
Touchpad issues are well known; I couldn't make it mimic the Windows behaviour (which should be possible, since it's the same hardware), so let's not even get started on the macOS experience here.
Regarding Windows, the general UX is much, much better than Linux, and apart from the touchpad it seems on par with macOS, if we ignore all the telemetry, preinstalled Candy Crush, and other unneeded software. Some things are better than on macOS, and vice versa.
Regarding the development UX, we do have WSL2, Windows Terminal, and stuff like this, which is great. It's a major improvement from the state we had a year or more ago.
However, WSL2 is far from perfect. I have seen issues such as WSL2 not starting for one user while working for the rest, which required a reboot to solve. Sometimes Visual Studio Code couldn't connect to WSL2, and again the machine would occasionally have to be rebooted.
There is no USB support in WSL2, so I had to resort to exposing USB devices over the network on Windows and using a client to attach those devices inside the WSL2 VM (a rough sketch of that setup is below). There is no support for bridged networking, so we cannot easily discover LAN devices, for example when we want to flash ESP8266 boards. Did I mention there is no native USB support, so you cannot flash them that way either?
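To give an idea of what that looks like, the WSL2 side of a USB/IP-style setup is roughly the following. This is a sketch only: it assumes some USB-over-IP server is already running on the Windows host, the address and bus ID are illustrative, and your distro needs the usbip tools (and possibly the vhci-hcd module) available.

    # Inside the WSL2 distro:
    sudo modprobe vhci-hcd                      # may be needed for the virtual host controller
    sudo usbip list -r 192.168.1.10             # list devices exported by the Windows-side server
    sudo usbip attach -r 192.168.1.10 -b 1-2    # attach one device by its bus ID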
Running Windows programs from WSL or vice versa has its own problems, and since there is no nested virtualization support in WSL2 (understandable, of course), using Minikube was far from ideal.
macOS is far from perfect, and even though some apps/libraries need fixes because macOS is not completely compatible with Linux APIs, the majority of libraries do support it. I can easily run most software natively, without any VMs. With Windows, some things work natively, but for the rest you have to resort to a VM or WSL(2).
Using libusb is the same as on Linux, and projects using it can be compiled for macOS with minimal changes.
Any time I get these they come with a bushel of IOAccelerator errors; I'm fairly sure (in my case, at least) it's a problem buried somewhere in the video decoder stack.
I can reproduce it pretty reliably by previewing a bunch of videos in Messages, and quitting the app usually solves it.
Edit: I use a 13" MBP, so that rules out most popular theories e.g. problems with the discrete GPU or graphics switching.
I had similar issues to yours, but it turned out the hardware was bad... I replaced my MacBook Pro and it's been great... But the other hardware issue on the new MacBook Pros is that if you charge on any of the USB-C ports other than the ones on the right side, you'll likely run the laptop too hot... so charge on the right side, not the left, and you should be good... it's crappy, I know... but with the new laptop and charging on the right side, I've nearly forgotten about all of my Apple hate...
I had this happening for a while, and I tracked it down to time-machine backups starting. Every hour the backupd cronjob would trigger and I'd get a minute or so of beachball. Similar WATCHDOG termination message. It crashed a couple times and corrupted the backup database. I disabled time-machine and haven't had any freezes since.
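If anyone wants to run the same experiment without clicking through System Preferences, tmutil can toggle it; a quick sketch (needs admin rights):

    # Turn automatic Time Machine backups off, test for a while, then re-enable.
    sudo tmutil disable
    sudo tmutil enable

    # See where backups were going, in case the destination itself is the problem.
    tmutil destinationinfo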
Is it a remote time-machine backup, i.e. to a Time Capsule (or a NAS pretending to be one)? Time Machine used to use AFP, but since AFP was deprecated, it now exclusively uses SMB for remote backups. (Specifically, it creates a sparsebundle image on the SMB share, and then locally mounts it.)
Since the Apple SMB stack has some decent amount of kernel integration—and Time Machine doesn’t really have any—I’d guess that recent changes to Apple SMB are also the culprit for any freezes where it’s in the critical path.
And Apple SMB seems likely to blame, because there are already other well-known regressions from the recent changes to Apple SMB. Seemingly, there’s a lot of cowboy coding going on in this part of the system right now. For example, macOS can no longer connect to SMB shares via their mDNS names (e.g. smb://example.local). This means that any SMB server that shows up in the Finder sidebar (through an mDNS announcement configured to achieve AFP or Time Machine serving) now just chokes and kills the Finder process when you try to connect to one of its exposed SMB shares through that sidebar connection (see e.g. https://community.synology.com/enu/forum/3/post/129160). You now have to explicitly connect to the share using its IP (e.g. smb://10.0.0.1) from the Finder ⌘K modal.
Weirdly enough, Time Machine itself isn’t broken for the SMB shares it auto-discovers through its configuration modal, so presumably they realized their error in that specific case and made Time Machine pre-resolve the mDNS name in the smb:// URI to an IP address, before attempting the connection. But they forgot to add this same code to the Finder. :/
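If you'd rather script the workaround than keep typing IPs into ⌘K, something like this works as a stopgap; a sketch, with the host name, credentials, share, and mount point all illustrative:

    # Resolve the NAS's mDNS name yourself, then mount the share by IP,
    # sidestepping the broken smb://host.local path in the Finder.
    dns-sd -G v4 nasbox.local                 # prints the IPv4 address (Ctrl-C to stop)
    mkdir -p ~/nas
    mount_smbfs //user@10.0.0.1/share ~/nas   # use the address printed above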
I'm hitting the stupid SMB bug every day, but the symptom seems to be different. I tried to manually connect to `smb://nasbox.local` in Finder and it works, with `nasbox.local` appearing in the sidebar. I can also click `nasbox` (without `.local`; this is advertised by the avahi-daemon on the nasbox, I believe) and it opens the network shares fine.
However, if the Mac goes to sleep for a while and then later wakes up, I can no longer access the shares with the same message "The operation can’t be completed because the original item for “folder_name” can’t be found".
I've given up on Mac-to-Mac networking in Catalina. All my other (older) Macs in the house can talk to each other just fine, but the new Catalina Mac just pretends to connect to them and fails. I have to use thumb drives to move files to/from it.
Apple networking had mostly "just worked" since the 80s. It took a big hit when they abandoned Appletalk, and now with Catalina it doesn't work at all. Apple has fallen a long way since its glory days.
The most annoying issue is trying to connect to a remote system and having the entire desktop and all Finder windows lock up while the network request is outstanding. I can't understand why it wouldn't make the request on a background thread with a callback on success/failure; instead it seems to happen on the primary thread, and therefore hangs the machine until it decides whether or not it can reach the remote network share. Pretty shoddy.
I think that icon was meant to be bait for Microsoft.
The host metadata, as it shows up in the macOS Finder/open dialogs/etc, is configurable by having your host publish an mDNS _device-info._tcp service record. You can publish a display name (as arbitrary unicode), an icon (as a machine model name), etc.
I'm guessing Apple figured that if they made the default icon insulting, that might incense Microsoft into building mDNS support into their SMB stack, just in order to use it to broadcast that _device-info service, to make Microsoft SMB present more professionally on macOS clients.
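For anyone curious what that looks like on a Linux NAS, the usual trick is an avahi service file that publishes the _device-info record with a Mac model string; a sketch (the model value only affects which icon the Finder shows, and the path assumes a stock avahi install):

    # On the NAS (then restart avahi-daemon):
    cat > /etc/avahi/services/device-info.service <<'EOF'
    <?xml version="1.0" standalone='no'?>
    <!DOCTYPE service-group SYSTEM "avahi-service.dtd">
    <service-group>
      <name replace-wildcards="yes">%h</name>
      <service>
        <type>_device-info._tcp</type>
        <port>0</port>
        <txt-record>model=RackMac</txt-record>
      </service>
    </service-group>
    EOF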
(Though, sadly, this exact combination of mDNS-announcing-SMB is what is broken in the newest macOS, so I can't recommend you follow the above guide right now. Maybe we'll get support for this back in a few releases...)
Not all SMB shares. If you use a Mac to host an SMB share, it'll appear on other Macs with a nice Mac icon. ;)
I don't know, I find it funny/charming and I'd be sad if Apple got rid of it. It's something of an easter egg—you can't tell that it's a BSOD unless you open the QuickLook or Get Info window (or set your Sidebar Icon Size to "Large").
> I tried to manually connect to `smb://nasbox.local` in Finder and it works with `nasbox.local` appearing in the sidebar.
Right; that part still works, seemingly because probing for SMB shares doesn’t involve passing an SMB URI through whatever layer of the SMB stack can no longer resolve mDNS origins. It’s only connecting to the shares themselves that generates that arcane error message.
> I can also click `nasbox` (without `.local` and this is advertised by the avahi-daemon on the nasbox I believe) and it opens the network shares fine.
Yup; the Apple SMB stack is seemingly happy to resolve a WINS origin. Which means SMB servers will interoperate fine with macOS clients as long as the SMB server doesn’t run AFP (which nobody has a reason to be running these days anyway) and doesn’t offer Time Machine backup (which... is often the whole point of having a NAS.) If your NAS is configured to offer Time Machine backup, the WINS announcement gets subsumed by/attached to the mDNS host metadata record for the NAS (which is required to make Time Machine work), such that trying to connect to the SMB share via the Networks item (or the sidebar) will try to use the "canonical" mDNS origin for the host, rather than the WINS SMB-service origin—even if mDNS pointed at it.
> However, if the Mac goes to sleep for a while and then later wakes up, I can no longer access the shares
A thing about mDNS is that it gets announced on intervals, and clients are expected to cache it; but like regular DNS, the cached record announcements have TTLs, and you’re not allowed to use a record after its TTL runs out... but unlike regular DNS, you can’t just go re-fetch the mDNS record from the source once it expires; you have to wait for it to be re-announced.
This is why every bonjour/avahi/zeroconf tutorial has a line that says “now wait 15 minutes to see if your changes took effect.”
And this also means that these services inevitably do this thing where their URIs won’t resolve for the first few minutes after your computer wakes up from sleep, until they receive a refreshed announcement of the mDNS peer’s A and SRV records.
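A quick way to see whether that's what's happening is to ask the resolver directly right after waking; a sketch, with the host and service names illustrative:

    dns-sd -G v4 nasbox.local      # does the address record resolve yet? (Ctrl-C to stop)
    dns-sd -B _smb._tcp local.     # is the SMB service currently being announced? (Ctrl-C to stop)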
This has always been an inherent flaw in mDNS, papered over by various pre-resolution or standards-violating caching strategies by things higher-up the stack than the mDNS resolver itself. I’m not surprised that this sort of hacks papering-over is something prone to regressions, in macOS or any OS.
(This is also why Apple gave up on "Back To My Mac." It was dependent on "Wide-Area Bonjour", which was even more fraught and flaky than regular mDNS, with service records frequently disappearing from their domain, leaving you unable to resolve the address of your remote peer, despite it sitting there happily waiting with ports open. It especially didn't play well with laptops sleeping in a Wake-on-LAN state, despite several generations of Power Nap trying to make it work.)
So basically I cannot have both Time Machine and SMB from the same NAS box servicing a Catalina Mac? I just tried disabling Time Machine share in Samba and Catalina Mac still complains "The operation can't be completed because the original item for "Share" can't be found." if I click the sidebar to connect to a share.
The sidebar entry is still coming from mDNS. Play with the NAS’s config until it shows up the same way Windows peers with shares do: as an all-uppercase-named machine only visible under the WINS workgroup name in Networks. (Finder will give it a sidebar entry from then on after the first time you connect to it, IIRC.)
Thanks for the suggestion. I figured it's actually less painful to just cmd+K in Finder to manually connect to NAS via its local host name before Apple got time to fix the bug…
I am! It only happens when I am on multiple monitors (clamshell open or closed) and usually when I am in a video conference (Zoom, Hangouts, and GoToMeeting). I thought it was overheating.
I seem to be having the same issue. 2019 MBP connecting through USB-C. However, I find the issue is less frequent (to non-existent) since I switched to a USB-C to HDMI adapter.
Have you tried the "nuke from orbit" approach of a full system wipe (T2, SMC and NVRAM reset, delete recovery partitions, Pre-boot volume, etc) and doing a fresh install and setup rather than restoring from backup?
I know it shouldn't be needed but sometimes it is the only option.
It isn't much help to you but my experience with Catalina has been very positive on a 2018 15" MacBook Pro. The initial release and first two point updates had some silly UI bugs but since .4 it has been solid as a rock for me.
I just updated to .5 and noticed I haven't had to restart my laptop since the last update I installed 42 days ago. In those 42 days I haven't had a single issue so I am pretty happy about that.
Yes, I have reset t2, smc, nvram, even reinstalled the entire operating system. I have the same problem on both 16 inch laptops I've owned (first one was stolen) suggesting it has to do with the software I use.
I am unable to identify a single application as the culprit, unfortunately.
I have the same issues!! I have a couple of RAID drives plus a third 10 TB drive and a second Dell monitor connected to my 18-core iMac Pro. If I loop a timeline in FCPX, loop audio in Logic, or run the Blackmagic disk test, my system will freeze: it keeps playing audio, then gives up the cursor, then hangs fully and crashes, and in the spindump we get the watchdog timeout!!!! I am on 10.15.5.
Looks like it's some kind of GPU issue.
I get 'watchdog timeout' kernel panics. For a while it was daily, since reinstalling OS now it's random.
The issue is I use a far different system than most reporting the issue. I see mentions of macbooks, sleep, etc, but I'm running an iMac 2013, a program that keeps it awake 24/7, etc.
No idea yet what the underlying cause is, but the panics started with 10.15.1 and got worse with 10.15.4, they were not an issue prior to 10.15.1
I had this Watchdog timer issue on OS X and it was causing major slowdowns. My solution was to stop using OS X; Apple support didn't really seem to have any solutions or suggestions other than reinstalling OS X, which I tried, and after a while I ran into the issue again.
If I'm going to be my own AppleCare, at least with a free OS I can access the source code.
I've had the opposite problem: started getting awful freezes on a MBP 2018 running Mojave on the day this latest Catalina version was released. No immediate explanation: cpu/ram/disk load perfectly normal, machine just incredibly slow (think 10 minutes to start up). Exact same behaviour in Safe Mode and nothing obvious after some common diagnoses.
Bit the bullet and finally updated to Catalina, all problems gone. Not sure what to make of this but it definitely resonates with stories of Apple practically forcing updates.
Still having (at least one of the multiple different) kernel panics during sleep:
panic(cpu 0 caller 0xffffff801e29169c): Sleep transition timed out after 180 seconds while calling power state change callbacks. Suspected bundle: com.apple.iokit.IOGraphicsFamily. Thread 0x2754.
Ever since moving to a brand new 16" MBP there isn't a day that passes without me being welcomed by the crash report dialog…
I managed to eliminate (or at least majorly reduce) kernel panics by disabling Power Nap. I've since re-enabled it to see if this is any more stable.
Kind of a joke really, this isn't a brand new model, nor is it using any cutting edge components. I thought we'd worked out sleep mode a long time ago.
Oh damnit, damnit. I've been having those on 10.15.4 too, was really hoping they'd fixed it on this one. It's really profoundly upsetting to spend $3,500 on a brand new computer by the most reliable major manufacturer in the industry and still have daily damn kernel panics.
I have had too many bad experiences with Lenovo business class desktops and servers to hope that one product line from them is somehow exempt from reliability issues.
Also a product long past its glory days. Ditched my less-than-a-year-old X1 Carbon to get a 16” MacBook Pro because it was randomly dying while purporting to have 25% battery left.
I have the same. Every day. I've done a lot of searching, and some people think that certain external monitors might be "the issue"--which I use. It didn't seem to be an issue before Catalina though.
Hmm, same for me perhaps. New MBP 16 and Catalina connected to an external monitor via USB-C (an HP Z27 4K UHD).
If I leave it to sleep for a long duration, then it panics when waking from sleep and just black screens for a good few minutes, then turns on the fans full blast for like 2 seconds and reboots.
I've taken to just shutting down completely when finishing work rather than letting it sleep.
I have the same monitor and connect over USB-C with a 2018 13" MBP running 10.15.4. I have not experienced any such issue, even though Catalina broke several other things.
In my case, if I use the usb-c port of the monitor to connect the ethernet adaptor, it works for a few minutes and then drops all the packets. This was never an issue before Catalina or with a windows machine.
Another weird issue that I have with this monitor and macOS (not only Catalina) is related to the Microsoft Sculpt ergonomic keyboard receiver when plugged into one of the USB ports of the monitor. If I unplug the MBP and connect a Windows machine to the monitor, then connect the MBP again, there is a >50% chance the receiver will not be recognised. Plugging and unplugging does not help; I have to restart the MBP.
The above makes it very annoying when I have to switch laptops. Sometimes I have to work on 3 machines at the same time, which by coincidence run 3 different operating systems (Windows 10, Ubuntu, and macOS). The reason I got this monitor is mostly for the built-in hub. If it worked properly, I would just plug everything into the monitor and only need to plug the USB-C cable into the laptop to be ready to go.
Sometimes it does work and I feel like I am in the future, using a dock that can support every device, with just one universal cable. As it is, I often have to plug and unplug things directly into the laptops.
One last thing for people who might consider a similar setup: due to USB bandwidth limitations, if you plug any storage device into the monitor while the monitor is at 4K@60Hz, you are limited to USB 2 speeds.
Same here on my late 2013 15". Other than a few weird moments back in the 10.9 days when I tried installing Xcode on an external drive, it's run like a champ, upgrades and all. I was nervous about putting Mojave on it, but it's run surprisingly well, albeit a bit slow, although that's expected at this stage of the game.
The same can't be said for the machine it replaced, which began life on 10.5 but was brought to its knees by the combination of an 8600M GT and the 10.7 upgrade.
Right now, I'm waiting until things settle down - because COVID 19 - before pulling the trigger on a 13" Pro.
Read a user somewhere reporting that leaving his iPhone plugged into his Mac caused this problem to occur almost daily. After charging it with a wall charger, and not having it plugged in, the problem went away. After resetting PRAM and SMC too. Hope the issue gets resolved for you someday!
Glad to hear I'm not the only one. I've started getting these in the last week. It coincided with a screen replacement but sounds like that might be a coincidence based on your report.
For anyone else having this issue, the fix for most people seems to be disabling Power Nap[1], which is enabled by default when you're on AC power. Can confirm that this worked for me personally. Disappointed to see no mention of fixing this bug in these release notes, however.
FWIW I've ran into this too with a new 16" MBP and what largely fixed it for me (happened once in two weeks vs. every time the computer went to sleep) was to disable "Power Nap while plugged into a power adaptor" in the "Energy Saver" system preferences.
(I'm aware that this might not be applicable to your situation, since you might not even use a power adaptor when the crashes happen, just wanted to point this out in case it helps someone else)
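For the terminal-inclined, the same setting can be flipped with pmset; a sketch, where -c applies it only while on the charger and -a to all power sources:

    sudo pmset -c powernap 0     # disable Power Nap on AC power
    pmset -g | grep powernap     # confirm the current value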
Are you using an external TB3 dock? That's what's causing KPs for me. If the MBP goes to sleep, it goes into endless reboot cycles. Currently keeping it running 24/7.
> The battery health management feature in macOS 10.15.5 is designed to improve your battery's lifespan by reducing the rate at which it chemically ages. The feature does this by monitoring your battery's temperature history and its charging patterns.
> Based on the measurements that it collects, battery health management may reduce your battery's maximum charge when in this mode. This happens as needed to ensure that your battery charges to a level that's optimized for your usage—reducing wear on the battery, and slowing its chemical aging.
I just had an MBP fixed (it was quick!) after it swelled up one day. It would not lie flat on a table and the trackpad was useless. Obviously I could not close the screen properly either. How common is this?
I've had this exact thing happen with my 2012 15" retina. Won't lay down flat and clicking the trackpad required pressing really hard. Had the battery replaced, works like new now — on Mojave, I'm not updating to the dumpster fire that is Catalina.
My old mbp swelled up enough to shatter the trackpad glass. During zoom calls it would overheat and swell and disable keyboard and trackpad function until it cooled off.
I'm on my second swollen battery in my 2015 15" MBP.
I'm a Java dev though, so my computer is always near the temperature of the sun's surface, which I think has something to do with it. We purchased 5 of these MBPs on the same date, and the other Java dev's machine is swollen while the testers' machines are still fine.
Very. It has happened to me so many times that now whenever I travel I always take a set of pentalobe screwdrivers with me so I can remove the back plate of my MBP when (not if) this happens to avoid structural damage.
pretty common occurrence, in my experience. if you own a macbook, it is just a matter of time until the battery swells up and needs to be replaced. usual first symptoms are that the trackpad can't be clicked anymore.
Deep discharge and heat are common accelerators of swelling. Avoiding dropping below 10% battery and running aggressive heat dissipation will help prolong its life.
My favorite tool for removing heat is Macs Fan Control[0]. At 40°C the fans spin at mid-range speed, and by 50°C they're at the full 6500 rpm.
I think the goal of running Macs hot is to prevent the fans from kicking in for most use-cases. If you're running twelve virtual machines and compiling Chromium simultaneously, then you might need earplugs.
Below a certain rpm you practically can't hear the fans. Even with them screeching, if you are in a room with an AC unit or outside, it would be tough to notice.
Never got this gripe personally, as I’ve never had a quiet device in my life. Certainly not from apple.
Swollen batteries were a result of design/manufacturing defects. It was certainly a common issue 5-10 years ago but should hopefully be a thing of the past by now.
Modern batteries should never swell, but will still wear out over time. Smarter battery management should be able to extend their lifespan.
I don't think you can conclusively say "Modern batteries should never swell" since there's always the possibility of manufacturing and design defects in new batteries too.
It's the battery equivalent of the halting problem in software.
No manufacturing and design process is perfect; there will always be errors. If you do find a perfect design and manufacturing process, I think you wouldn't be posting here on Hacker News but would be drinking champagne with Bill Gates and Warren Buffett.
Either way, you shouldn't use "should never" when it comes to batteries, which still have bulging and quality issues throughout the entire technology industry: cell phones bulging, phones catching fire, batteries in airplanes being damaged, car batteries catching fire.
It's very well known that modern lithium ion battery tech isn't all that "failsafe" so using the words "should never bulge" in this context is very deceptive.
Any battery using current battery tech in consumer devices is likely to bulge within a certain number of charge cycles (which might be more cycles than the useful lifetime of the device), but the point is that bulging is very likely eventually.
In some cases (like the AppleWatch), a swollen battery upon failure is an intentional design choice, apparently. At least that's what they told me when my Series 0 battery swelled up and caused the face of the watch to pop off. They claimed they do it intentionally so you know it has to be serviced and can't be used anymore. Probably different for laptops, though.
Why does everything have to be automated? Why can't I just specify manually that it should not exceed, say, 80%? This is just another process running and ... using battery.
> Major new releases of macOS are no longer hidden when using the softwareupdate(8) command with the --ignore flag. This change also affects macOS Mojave and macOS High Sierra after installing Security Update 2020-003.
This is an admission that their upgrade rates for Catalina are bad. Between the now-customary list of new bugs and the 32bit massacre, the Vista comparison clearly struck a chord.
It would have been so much easier to just keep shipping the 32bit compat layer.
If MS wanted to kick Apple in the shins they should get the Github Actions runner to support Mojave (right now it only supports Catalina). Would help me stave off upgrading myself and users for a few more months.
Whether this matters is very dependent on what you're doing. For webdev stuff, for instance, you don't need Xcode—much less the latest version—just Apple's "command line tools".
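(If you've never set them up: the standalone tools install with a one-liner, and the second command just shows which developer directory is active.)

    xcode-select --install   # install the standalone Command Line Tools
    xcode-select -p          # show the currently selected developer directory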
This really only matters for iOS builds, meanwhile I have users that can't upgrade to Catalina yet and I need Mojave/High Sierra environments to build and test against.
I was just about to let the 2020-003 Security Update run on Mojave, but after this warning I will leave that well alone.
Apple had a great OS once, but now they are losing me more with each update. Can't buy new Mac HW either any more, since it will be systematically unable to run the 32-bit SW I need. Apple forcing decades-long users off their platform to Windoze and Linux is just bizarre.
But it's a net minus for usability that it can't run 32-bit software any more. That makes it a no-go for some of us, and Apple knows that, so all the nagging to “upgrade” is pointless anyway.
So because 0.5% of people have some dodgy 32-bit software that they cannot or have not updated, Apple should just not bother trying to make the rest 99.5% of people update?
I didn't say that. But taking away the option to explicitly tell the OS to stop nagging me is still a user-hostile move.
So, you are saying that all 32-bit software is by definition “dodgy”? But let's break it off here since I realize that disagreeing with you may not seem smart.
When Catalina dropped, it became obvious that while Apple computers and I have had a good run, we’re now going in different directions. I started migrating to Linux late last year.
If your workflow is amenable to it, and you can stay productive during the (inevitably drawn-out) transition, I highly recommend you consider taking the plunge.
Meaning? Can you elaborate a bit? I’ve just downgraded to Mojave on an iMac (fusion drive) primarily used for Logic Pro.
Do you think Mojave is not getting updates anymore ?
The "ignored" flag caused macOS to NOT offer the update to the new major version every once in a while (a little popup in the top right corner). It also made the Software Update app report that there are no updates available.
After installing this update, the Update app reports the update to Catalina even though I have ignored the update previously on this machine. This does not mean that Mojave is not getting any more updates, it's just that macOS will nag you to update (but you will still be able to dismiss it manually) without the option to tell it to stop.
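For reference, the command that used to do the hiding on Mojave is below; after this update it's reportedly only honored on MDM-managed Macs (treat that detail as second-hand), and --reset-ignored clears any previously ignored updates.

    sudo softwareupdate --ignore "macOS Catalina"   # used to hide the Catalina banner
    sudo softwareupdate --reset-ignored             # clear the ignore list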
Is there a patch somewhere that turns the nag off permanently? I'm definitely not updating several of my machines to Catalina in the next few years, and the nag is just a useless distraction.
My MacBook Pro (16-inch, 2019) was having kernel panics multiple times a day after having been asleep for extended periods. This was extremely annoying, as you can imagine. I’m happy to report that the Catalina 10.15.5 update fixes the kernel panics for me.
I use my MacBook Pro (16-inch, 2019) with a CalDigit TS3 Plus Dock and LG 32” 4K monitor. It’s worth noting that using the computer undocked never gave me any problems.
Thank you for documenting this. I have been getting exactly this kernel panic dump on my 16-inch MacBook Pro, and it's good to know I'm not sitting on a lemon.
(Another lemon. My first one was absolutely a lemon.)
> My MacBook Pro (16-inch, 2019) was having kernel panics multiple times a day after having been asleep for extended periods. This was extremely annoying, as you can imagine. I’m happy to report that the Catalina 10.15.5 update fixes the kernel panics for me.
Will update ASAP. Thank you for beta-testing this!
Yep late 2013 MacBook Pro: Catalina brought some security annoyances but mostly pain-free. IMO this computer is the epitome of function and form; I'm worried nothing will truly replace it when that time comes.
I had a mid 2013 MacBook Pro at work and I completely agree - the epitome of function, form and longevity. I used it daily for almost 7 years and just recently passed it on to a non-dev coworker who needed a laptop at home, and got a 2019 MBP instead. I replaced it because I needed more than 500 GB of disk space. Otherwise it still works perfectly. I think that's very impressive for a 7 year old laptop these days.
Most of this thread is anecdata, which tends to skew heavily negative. As a comment above says, it's difficult to tell whether these are issues at scale, and whenever I read these complaint-heavy anecdata threads I always feel like chipping in that I've had very few problems with my Mid-2014 MacBook Pro 15" on all versions of macOS that have been released since I bought it. I'm not even sure what the purpose of threads like this is, really.
Yes. It's extremely difficult to get an idea of the scale of these problems from these threads because they bring out a lot of +1 piggybacking ("I have this issue too"). From there, it inevitably goes into a thread about the demise of Apple's quality control.
Somebody should make an accurate product-problems site where you can share your issues and people can only vote on them if they legitimately have that issue. Then the actual extent of the problem would be obvious.
Catalina has been fine for me (old macbook and a new iMac), but I do miss some of the old Steam games that no longer work with Catalina (was it a 64-bit thing?).
Is Steam or Apple going to offer some kind of virtualization thing to allow these old apps/games to run? Or did we really just lose them forever unless you use your own virtualization (VMware etc.)?
(Or is there some news about this that I just missed?)
The group FaceTime UI is very strange: why have several overlapping boxes rather than just dividing the screen equally and using all of it? I know it's like this on iOS, but I'm not sure what group video looks like in macOS (though it sounds like it's the same based on the update text here).
It's weird because the easier thing also seems like the obvious thing, and it's clearly better.
> why have several overlapping boxes rather than just dividing the screen equally and using all of it
It's good for smaller screens. I was on a Zoom video conference last week, and all I could think about was how so many people who weren't allowed to/going to speak were taking up valuable screen real estate.
Correct, but if you have multiple people talking at the same time, you still want to see all of those people, without relying on Speaker Mode to switch between speakers for you.
I typically use video chat apps like Facebook, Google Meet, and Zoom. I think the grid is efficient from a space perspective -- there's less wasted space -- but the grid looks very corporate and boring.
I think there's also a practical advantage of placing tiles haphazardly: it might help spatial memory. It's a lot harder to focus on or find a particular rectangle in a sea of rectangles than it is to follow a shape that looks a little different from the rest. However, I have never used FaceTime group calling (because I don't have most of my friends' phone numbers), so I don't know whether this is effective in practice.
FWIW, I don't believe FaceTime requires others' phone numbers. I think it can be a phone number or any email address tied to their Apple ID, similar to iMessage.
I wish there was a way to customize this, though: Thinkpads have (on Linux) allowed setting thresholds for "charge only when less than" or "charge only up to" for many years now.
Given that such primitives must also exist as part of 10.15.5 now, I hope that somebody will figure out how to modify the default behavior.
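For comparison, on Linux those thresholds are just writable sysfs attributes. A minimal sketch, assuming a battery at BAT0 and a kernel/driver (e.g. thinkpad_acpi) that exposes the standard charge_control_* attribute names; it needs root, and the exact names can vary by driver:

    from pathlib import Path

    BAT = Path("/sys/class/power_supply/BAT0")

    def set_threshold(name: str, value: int) -> None:
        # Write a percentage to a power_supply attribute if the driver exposes it.
        attr = BAT / name
        if not attr.exists():
            raise FileNotFoundError(f"{attr} not exposed by this kernel/driver")
        attr.write_text(str(value))

    # "charge only when below 75%", "charge only up to 80%"
    set_threshold("charge_control_start_threshold", 75)
    set_threshold("charge_control_end_threshold", 80)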
There's an app for that called Aldente[0]. I am sceptical of the benefits though; too much micromanagement for a device that's supposed to just work.
I mean, you can be better off if, instead of undercharging the device all the time and having a less-than-optimal charge when you need it today, you have a device that holds just as little charge some time in the future and fix it by replacing the battery if it ever becomes an issue.
Live the high life today worry-free and suffer tomorrow, instead of suffering today for some possible occasion in the future where you might need a few more hours of juice.
I'd rather leave it to the OEM to determine the best setting (knowing their batteries and charge circuitry), but have the option in OSs that run on devices whose OEM hasn't done so.
That's my point - I replied to someone wanting it to be configurable on macOS, and I'm saying why I'd rather leave it to Apple: I assume they can do a better/more accurate job than me. But sure, it's great that you can do that on Linux if there aren't OEM drivers available with better tuning.
I'm glad that sort of thing doesn't exist on macOS by default, though third-party tools have existed for a while.
I'd much rather the system figure out the charge threshold dynamically based on all sorts of variables like temperature, battery degradation level, expected imminent usage (learn from my own past usage on-device -- coreduetd) and other variables they can take into account. Which is what it does.
Yes it has, just not for Apple. Chrome/Chromium do Courgette (binary diff) updates to great effect. Those 8GB Xcode minor release monsters could definitely use some binary diff love.
> Binary patching hasn't been a thing for years unfortunately.
True, and it's fair to say binary diffs aren't worth the complexities they introduce anymore, yet the concept of only pulling updates for individual packages has been a thing for over two decades in the *nix world. It seems like a poor choice for Apple to not bother making updates more discerning while also removing binary diffs... certainly a shitty move for anyone without a 99th percentile internet connection - then again everyone outside of that group probably can't afford Apple products anyway.
For some perspective: 3.6 GiB is significantly larger than the downloads required to install my entire OS from scratch, and 36 times larger than my average weekly run of apt.
From my now-outdated experience on Mac OS, I'm aware they do divide things into installer packages with some kind of receipt for version info that Software Update probably probes. So I guess it must be that these packages are not very granular.
> binary diffs aren't worth the complexities they introduce anymore
Noob question: Why are binary diffs impractical?
Is it because the (compiled) object code layout dances around too much? If true, isn't that fixable? Meaning: make the order more stable, to minimize the size of the diffs?
I recall a recent story/post about boosting runtime performance by optimizing object code layout. Sorry, I can't find that article again.
If true, couldn't the internals of released code be "sorted" to better enable binary diffing? Maybe the layout-optimizer step would minimize the variability enough without requiring a re-sort.
A fun experiment would be to take a series of releases, run that layout optimizer, and then try the binary diffing again.
--
Didn't Google publish some research, maybe 10 years back, about better binary diffing for publishing updates? Apologies, but I sorta assumed it had become the norm.
Yep, and they use Courgette for all of their Chrome/Chromium updates. It's why no one ever sees long "Downloading update..." progress bars in that browser anymore. 100KB binary diffs are typical. Blink (no pun intended) and you'd miss the download.
Not a noob question... I don't know, but I was assuming they were not worth the hassle based on my experience of using them. I lived through an era of binary diffs for old games where applying them seemed very computationally expensive at the time - sometimes to the point that it seemed faster to download a whole release - and I assumed it was even more expensive to generate them. Those experiences may not be valid anymore in light of faster hardware or better algorithms, but the gains may also be too minimal when compared to the highly granular packaging systems that various Linux distributions use these days.
The sibling comment re: Chrome is an interesting one; I can imagine that in the specific case of very large and frequently updated binaries like Chrome, it would still be beneficial to use binary patching.
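To make the trade-off concrete, here is a toy sketch of plain bsdiff-style patching (not Courgette, which additionally disassembles the binary so relocated pointers don't inflate the diff). It assumes the third-party bsdiff4 package and placeholder file names:

    import bsdiff4

    old = open("app-1.0.bin", "rb").read()   # build the user already has
    new = open("app-1.1.bin", "rb").read()   # build being shipped

    patch = bsdiff4.diff(old, new)           # what the vendor would host
    print(f"full build: {len(new)} bytes, delta: {len(patch)} bytes")

    rebuilt = bsdiff4.patch(old, patch)      # what the updater applies locally
    assert rebuilt == new

Generating the diff is the expensive side; applying it is cheap by comparison, which is part of why it can pay off for something shipped as often as a browser.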
Definitely frustrating for anyone not on broadband. There are more of us than you'd think. It's a giant 'fuck you' from the software industry, real world efficiency be damned.
Totally agree. Updated to Catalina recently and on top of the huge patch size the update client doesn't seem to have any retry mechanism built in. Connection drops 9.9G into a 10G download? Too bad, start over. Just a simple wget -c would do. Incredible.
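What "a simple wget -c" amounts to is an HTTP Range request against the partially downloaded file; a rough sketch (URL and filename are placeholders, and the server has to support ranges):

    import os
    import requests

    URL = "https://example.com/macos-update.pkg"   # placeholder
    DEST = "macos-update.pkg"

    done = os.path.getsize(DEST) if os.path.exists(DEST) else 0
    headers = {"Range": f"bytes={done}-"} if done else {}

    with requests.get(URL, headers=headers, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        # 206 = server resumed from `done`; 200 = it ignored the Range header,
        # so start the file over instead of appending to it.
        mode = "ab" if resp.status_code == 206 else "wb"
        with open(DEST, mode) as f:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                f.write(chunk)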
I wonder how many real-world resources are wasted over this. It costs money to serve that data, it costs money to download that data, it congests the network wherever that data is being routed, and it consumes energy however that is generated in the region.
Windows Update does this too: if the transfer fails, it starts from zero. Multiply this by... what, a couple million times for each update?
I assume they run the numbers and it's cheaper to just dump all the data and run things inefficiently, compared to having teams deploy it correctly. It's just a waste though, and it pisses off your customers.
My personal monthly usage is in the 100G region; this particular failed update has cost me 20G of burned bandwidth. Where I live, around 2% of the country doesn't have access to decent broadband. Since these updates are released every 2 months or so, a very back-of-the-envelope estimate puts failed updates at 0.2% of the country's total bandwidth usage. I might be an order of magnitude off, but that is still an incredible waste.
Maybe they did run the numbers, but the amount of engineering effort required to fix it is absolutely minimal (couple of lines...). A significantly worse outcome for society as a whole at the cost of a try-catch block.
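One way to reconstruct that 0.2% figure, using the numbers as given above (all of them rough, as the commenter says):

    # Back-of-the-envelope reconstruction; every number here is the commenter's
    # own rough estimate, not measured data.
    monthly_usage_gb  = 100    # personal monthly traffic
    wasted_gb         = 20     # bandwidth burned by one failed update
    months_per_update = 2      # assumed release cadence
    affected_fraction = 0.02   # share of users on connections bad enough to fail

    waste_share = (wasted_gb / monthly_usage_gb) / months_per_update * affected_fraction
    print(f"{waste_share:.1%} of total bandwidth")   # -> 0.2%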
If only they would use something like rsync or ostree at the least. But if the biggest files are compressed archives, that would require recompression. If it's more like an uncompressed tar archive then it should be easier, but it would still require either a clever algorithm or support for the format in the algorithm.
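For what it's worth, the ostree-style idea is roughly: content-address chunks of the payload and only fetch the chunks the client doesn't already have. A toy sketch (real rsync adds rolling checksums so shifted data still matches, which this skips):

    import hashlib

    CHUNK = 1 << 20  # 1 MiB blocks

    def chunk_hashes(data: bytes) -> list[str]:
        # Hash each fixed-size block of the payload.
        return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
                for i in range(0, len(data), CHUNK)]

    def chunks_to_fetch(local: bytes, remote_manifest: list[str]) -> list[int]:
        # Indices of remote chunks whose hashes we don't already have locally.
        have = set(chunk_hashes(local))
        return [i for i, h in enumerate(remote_manifest) if h not in have]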
Xbox 360 patches were handled differently -- only the game binary would be included in the initial patch; any content updates were released as free DLC add-on packs (or, if a game was releasing new paid DLC, they'd require a new add-on pack that included some subset of the new content for free).
PS4 updates are notably bad, since they do a full copy & write (vs write-in-place) on patches. It's good if there's an issue on updating, since you can delete the partially-updated copy and start over, but bad because it requires more than 2x the disk space for the patch.
I visited my brother last night and he threw on the Xbox. It forced an update. Then it downloaded almost a gig and threw an error. Then it started again and installed and rebooted and installed... About half an hour later we'd moved on to other things and never even played it. Answered my question about whether or not I wanted one, though! lol
I think they’re referring to games. A good example? CoD Modern Warfare (the recent one) _always_ seems to have 50GB updates. It is rather annoying, even with fast internet.
Modern Warfare is a nightmare. 50-80GB updates every month and no online play until you update. That can be over 12 hours for me. It's one thing I've noticed since getting a console recently after not having one since the PS2 era. Huge updates constantly. Even when you buy a disc they have to download a bunch of stuff. Such a horrible user experience.
Usually it's around 30GB. I play on PC and have to remind all of my party to update the night before. Each update seems to only come with small changes, and the total game size only increases by ~5GB usually. So they're either recompressing previous assets or changing stuff that can't be patched in along with new content. These updates usually happen every week or every other week. It's insane. I think all of the previous CoD:MW titles combined aren't as big as this one alone.
My girlfriend and I play with our old roommates most evenings as of late - we've gotten in the habit of turning on my gaming rig and her PS4 around lunch to suss out any updates. We are lucky to have symmetrical gig internet, but even with an Ethernet cable connected, the PS4's download speeds are abysmal. Those CoD patches really are monstrous.
Newer games in general have these humongous updates. I’m somehow pleasantly surprised when checking that GTA V total size is 90GB all in. An open world online game with endless possibilities. Fortnite, a way simpler game, is ~70GB with monstrous patches too. I really wonder what is in those files..
Probably how the stuff is packaged for consoles, where content is continuously streamed to make up for the limited memory and anemic CPUs, so texture maps, world data, and game logic are interleaved.
Patching all of that is difficult, so it is probably easier to replace the content streams in one go.
I can't remember a time I've downloaded a patch that is as big as the base game on a console. Granted, most of the games I play are on physical copies.
I "asked HN" about this recently but unfortunately it has not received much attention. Maybe now there are people around who know what exactly happens during an update?
Not an expert in how the update process works, but the update certainly contains all the changed files, of which there are usually a lot. (I would think this is every project that has had a source code change internally…) There are a number of shared-cache and kext-cache related things that also run afterwards.
Btw, does anyone know how to test-install a newer OS other than with macOS' upgrade option, which burns through 6GB+ downloads and supposedly also replaces the OS rather than installing it onto an external disk? I've got an unused old mini that's still on the OS it came with - way out of date. I even went to an Apple dealer to get Mojave onto an SSD drive last year, only to find out that the mini needs a firmware update for APFS that only comes with Mojave/Catalina, but I can't find a download option for just the firmware update so that I could boot from an external APFS disk.
Exactly what I had hoped to avoid. I only keep the Mac for cross-platform builds, site testing on Safari, and for hypothetically being able to open old 3D assets, but the old apps almost certainly won't work on newer Mac OS anyway.
I used to have a Mac Pro (cheesegrater) but switched back about 2 years ago due to having a kid in college, and Dell and Windows 10 were not too shabby.
I'm probably on my 4th Windows 10 release from the original that Dell shipped with, and it has been a rather stable system.
Anyone using Node.js for work who has updated to Catalina here? How did it go? I heard there were problems with node-gyp, but didn't investigate further. Everything running smoothly now? Any gotchas?
I had to reinstall the Xcode CLI tools, as node-gyp couldn't compile anything after the update.
However, this was pretty straightforward and there weren't any gotchas.
I did not investigate the root cause, so reinstalling Xcode might be a bit of overkill.
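For anyone hitting the same node-gyp breakage, the fix that usually circulates is to reinstall just the Command Line Tools rather than all of Xcode. A sketch of that sequence, wrapped in Python; treat it as a suggestion rather than an official recipe:

    import subprocess

    def run(cmd):
        # Echo then execute; sudo will prompt in the terminal.
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    run(["xcode-select", "--print-path"])   # see which toolchain node-gyp is picking up
    run(["sudo", "rm", "-rf", "/Library/Developer/CommandLineTools"])
    run(["xcode-select", "--install"])      # triggers Apple's GUI installer for the CLI tools
    # If full Xcode is installed, you can instead point the tools at it explicitly:
    # run(["sudo", "xcode-select", "-s", "/Applications/Xcode.app/Contents/Developer"])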
Does anybody know what has happened that multiple manufacturers (at least Dell, HP, Lenovo, ASUS, and now Apple) have released updates to "enhance battery health management" _all within this year_?
Even on devices which were otherwise already out of support.
I wonder what has happened. I haven't heard any recent stories of battery explosions, huge battery recalls, or new regulations, yet all manufacturers are reacting to something.
One manufacturer announces a feature, other manufacturers think "oh crap, we don't want anyone buying a competitor rather than us because of that" so they implement it too ASAP.
I've been in a lot of product decision meetings. In my experience, there's no quicker or more effective way to justify building a feature than "our competitor has it".
Apple got sued for throttling iPhones with overly aged batteries. My understanding is that for all battery-dependent devices, you either sometimes throttle the CPU so it doesn’t demand more voltage than the battery can reliably deliver, or else you risk sudden power loss at what appears to be 20% life.
After the lawsuit, I assume all manufacturers are now leaving it up to the consumer to decide which behavior they want.
There's nothing wrong with throttling in itself. Pretty much anyone will prefer a performance hit to a spontaneous shutdown.
Where Apple got in trouble was that they were doing it secretly. Nowadays they still do the throttling, but you can go into the battery status and it tells you whether or not the battery is delivering peak performance.
It actually gets worse in Apple's case. They were doing it secretly, which is bad enough, but they were also denying warranty repairs to throttled phones (since policy was to only conduct the repairs once an arbitrary "health" metric fell below a certain point, and the throttling kicked in first).
This meant your options were to either deal with the crap performance, or void your warranty by installing a third party battery.
> "your options were to either deal with the crap performance, or void your warranty by installing a third party battery."
Or pay Apple a not-entirely-unreasonable fee to get them to replace the battery ($49 IIRC?), a bit more than the third party price but perhaps worth it for peace of mind.
That price only went into effect after the outrage broke. Additionally, people were turned away at Apple stores since according to the diagnostics, there was nothing wrong with their battery. You couldn't just walk in and demand a battery replacement or a replacement of any other part that passes diagnostics.
When they first rolled out the throttling they were still charging $79 for a battery replacement. They cut the price significantly specifically as a response to the backlash.
This in particular is because so many people have gone from using their laptop battery to keeping their laptop fully charged and plugged in all the time due to WFH. That can be unhealthy if the device keeps the battery topped off.
The exact phrasing Dell used is "modified battery algorithm to prolong lifespan and minimize risk of swelling", as seen for instance at https://fwupd.org/lvfs/devices/com.dell.uefia86a3f07.firmwar... and several others in that site. I looked at a few of the Lenovo ones, but didn't see anything battery-related in the recent changes.
"Changes the default setting of the HP Battery Health Manager in the BIOS Settings from Maximize My Battery Duration to Let HP Manage My Battery Health. The new default setting dynamically changes how the system charges the battery based on usage conditions over time to provide optimal battery health. "
(The HP Battery Manager itself is also a recent feature)
I've been an Apple guy since 1984 when I was 12, and Catalina has finally pushed me to Linux for my daily driver. I'll still keep Apples around due to Apple's mostly excellent ecosystem execution, but... something has gone missing lately
Almost there too. Too many crappy daemons, too much cloud integration. And now, for the past week or two, my Bluetooth cannot connect to my external speaker. It looks like only a fresh install _might_ fix it...
I've already bought a Linux laptop for personal use, but it is just too crappy to use. I might just have to continue my efforts to get a proper Linux distro onto my MBP.
Read the whole thing. It really appears as if there is a virtualization-escaping CPU-crashing bug in Ice Lake (10th generation) Intel chips that was discovered because it exclusively crashes when used with Jetbrains IDEs.
Sounds unlikely, yes, but seriously read the whole thing.
Self-reply: no, it apparently does not. Microcode rev is the same and the issue is still there.
TL;DR: Jetbrains IDEs all crash on the 2020 Air, which has a 10th gen Ice Lake CPU in it. Sometimes it brings down the OS. Doesn't happen at all on older Macs. At first I and everyone else thought it was a MacOS or maybe graphics driver bug.
Then the plot thickened...
I tried it in a Parallels Linux VM, and it crashed and brought down the entire host. Not the VM, the host. This should not be possible.
Then someone reported that it also afflicts a Microsoft Surface model with the same generation CPU.
So given that it escapes VM isolation and occurs on multiple OSes and hardware, it really appears as if this is an Intel CPU bug being triggered by some exotic pattern of instructions being generated by the JVM.
I must also add that nothing else crashes on this machine. Nothing. I've run multiple OSes in concurrent VMs, done heavy work, run GPU stress tests, and it's rock solid... unless I fire up CLion or GoLand... then kaboom.
No I'm not the only reporter. No it's not (insert mundane thing here). Read the whole thread I linked above.
While I haven't had nearly as many bad experiences with Catalina as some other users, I am indeed starting to feel increasingly uneasy with the overall quality of the platform, both as a user and as a developer.
I've been a Mac user since the late 2000s: the first Mac I could afford at the time was a MacMini G4.
When I booted that machine up for the first time I was amazed by the quality and the refinements of the platform. When I started digging into the platform as a developer the feeling was exactly the same: care and refinements were almost omnipresent in every aspect of the platform.
While bugs - even nasty ones - have never been strangers to Mac OS X/macOS, what I've personally started to experience in the past few years is a feeling of neglect toward the platform.
Yes, Catalyst seems to go against this trend: its intent, though, is to ease the porting of iOS applications to the Mac. Certainly a laudable goal.
What would probably be more laudable for Mac users and developers: core platform stability and quality, API stability, and a developer documentation overhaul [1].
Our CI breaks every single time there's a minor macOS update and/or an Xcode update: we're at the point where our macOS UITests (XCTest) require more maintenance to keep working than the application itself.
XCTest on macOS has ridiculous bugs. I lost count of the radars we've opened, and the sample apps for reduction we've sent to Apple.
At every single minor macOS upgrade, a few users complain that our application "doesn't work anymore" in certain cases. And to be exquisitely blunt: there is nothing wrong with the application itself.
Rebooting after a macOS upgrade fixes whatever event tap regression we hit in Quartz.
In general, what I'd personally like to see:
* a less "hysterical" approach to the OS would be so. much. appreciated. (eg: focus on the few important core things, keep them working)
* release when it's ready: this whole marketing driven annual release cycle is unsustainable
I do understand that the vast majority of Apple's income is not Mac related at this point, but WTH...
> When I booted that machine up for the first time I was amazed by the quality and the refinements of the platform.
And the speed of the boot process. I switched to Macs around 2000. One of the things that struck me was how quickly Macs booted compared with Windows machines.
Today, Macs don't seem to have any advantage when it comes to cold booting. Sleep, yes. But not a real boot.
Since the last Catalina update (10.15.4) Firefox is almost unusable for me, whereas it was buttery smooth before. There are constant multi-second freezes and when I switch tabs I often see a gray loading screen before the content shows up a few seconds later. Never saw that happen before. Maybe it's just a coincidence and has nothing to do with the Catalina update, but I am interested whether other people experience the same.
I experience this in a lot of different apps on Catalina - just a brief flash of something. It seems to have gone away in 10.15.5, but I've only had the update for an hour.
My daily driver is a maxed-out (for its time) 2012 MBP15r, and I've been on High Sierra for a long time. I like the keyboard, there are maybe 5 dead pixels, and it's fast enough for my purposes. It's nearly always plugged in, so the mediocre battery life hasn't been a problem. A new contract gig is about to commence, and I'll be using their hardware. It's strange to be feeling apprehension about degraded DX on a brand new dev machine vs my 8 year old warhorse.
Speaking for myself (not the person you asked) I still have machines on El Capitan and don't plan to upgrade them because they just work. High Sierra almost completely bricked them. Every new version of MacOS is more buggy than the previous and it takes dozens of hours of googling to figure out how to work around the bugs, restore features Apple removed, and turn off misfeatures Apple added.
Last time I tried, it was not at all well documented how to upgrade to Catalina when your iMac's boot drive is 1) an external SSD and 2) not (yet) formatted to APFS. Giving it another try now, but unless it suddenly got a whole lot easier I'll probably be passing again.
Have it on my 2016 and 2019 Airs though, no issues.
Did anyone face any issues installing the update? I have been lucky till now, as I have never faced an issue while updating. But then I read news reports saying that people faced issues and MacBooks were bricked while updating. So I thought that from now on I would wait a few days before updating.
Sigh. Now every time I use Speech to Text there is a sound before the dictation. Before Catalina you could disable it in Settings, but now the setting is completely gone.
And Apple wanted Voice Over to replace this dictation. The problem is that most apps don't even support this type of input. For example, I can't use Voice Over for dictation in WhatsApp, but I could do it with the old double-Fn-key dictation.
Not to mention that the voice recognition by default uses Enhanced Dictation, which is offline, and its results are actually worse and slower than online dictation.
Basically the worst of both worlds. It is super annoying.
> Battery health management is on by default when you buy a new Mac notebook with macOS 10.15.5, or after you upgrade to macOS 10.15.5 on a Mac notebook with Thunderbolt 3 ports.
Unfortunately... this update does not help everyone. I have a MacBook Pro (Retina, 15-inch, Late 2013) and these changes are not visible in System Preferences. This quote turned up only when digging further into the "How to control the battery health management feature" instructions.
Catalina decided to remap my paragraph key to "ö" this week (using a 3rd party Swedish Dvorak layout). Remaking the layout to the proper symbol works for about an hour. It is my Emacs prefix for functions I don't consider important enough to warrant their own direct key bindings. Annoying doesn't cut it.
I haven't done a factory reset yet, but I started backups today to be able to.
Catalina is a disaster. I got my Mac for free from a relative, but I sure as hell won't buy another.
So: the error seems to be that the keyboard assistant expects QWERTY. It must have run at some point, and it wanted me to press the button next to the left Shift key, which on svdvorak is "ö". I had to delete the library/preferences/com.keyboardsomethingsomething file, redo the keyboard identification, and press the "<" key (which on svdvorak is about as far away as you can get from the ö key).
It's fairly easy to install multiple versions of macOS on your drive to test [0]. You could create a separate install of Catalina and test your applications with it before making the move (which Apple now thinks is a bit more urgent for some reason).
I have a 2014 MacBook Pro. It was working fine, but when I tried to export from Final Cut it crashed. After a restart I tried to export again, and it said that my battery was dying (it was at 55%), then it powered down.
Finally! They fixed my Image Capture bug! Now I can import pictures from my iPhone AND they will be deleted from the iPhone. I've only imported 7 movies at one time with no problem. So far, so good...
Is there any point in downgrading to Mojave from Catalina? Haven’t looked into it, so I assume it’s a pain. But the semi-daily kernel panics, overheating, and kernel tasks taking up 80%+ CPU are ridiculous for a $3k+ laptop.
I'm still on High Sierra... I wonder if there are really good reasons to upgrade to Mojave? (I only know about disadvantages about upgrading to Catalina.)
I'm on High Sierra on one Mac and Mojave on another.
Other than APFS, I cannot see any real differences in usability - except that opening Quick Look and then pressing Shift and Up to go to the previous file in a list (so you can look over many photographs and select which ones to delete) is now busted in Mojave and will only show the first file selected, making Quick Look useless for this.
In High Sierra and prior (back to Snow Leopard at least?) this would work and show you the preview of the most recently highlighted item, so in a selection of 20 items, if you shift-up to select item 21 you see item 21. Now you only ever see item 1. It's a great annoyance.
I have found High Sierra to be subtly faster on older hardware, and subtly less crash-prone (although neither crashes often). So I would suggest staying put absent a reason not to.
Very. My MBPs are normally plugged in and the battery is almost always degraded by the end of year 1. I’m looking forward to this update just for the battery protection.
(And I know I should try to keep my MBP unplugged as much as I can, but it’s normally docked.)
My 2015 13" MBP has been charged 221 times in 4.75 years. It has been plugged in for the vast majority of the time, yet still shows 88% total capacity and will still last 8+ hours doing standard web dev stuff.
My iPhone gets discharged every day, but its capacity is down to 78% after just 2.5 years.
I try not to but it has been below 10% many times (and occasionally so empty it turns off).
Now WFH, so it has been plugged in most of the time since late March, and I'm really just nursing it through until the newer phones are released so I can decide whether to get the updated SE or the newer model.
I think this is false. Lithium-Ion batteries (in contrast to e.g. NiMH) do not have the memory effect [1]. It's actually charge/discharge cycles that reduce battery life for such batteries [2]. Older MacBooks were only rated for 300 charge cycles, nowadays up to 1000.
This also reflects my experience. In recent years, my MacBooks with the highest remaining capacity have been those that I had pretty much always connected to a charger, only using it on battery during meetings.
Do note, however, that keeping the battery _in_ while the device is plugged in usually results in the battery being kept "floating", which results in many short charging cycles (i.e. discharge to 99%, then top up to 100%, rinse and repeat). This is what this type of "battery management" update is designed to avoid, and many manufacturers allow configuring these cycles in ways that are much softer on the battery (e.g. avoid 100%).
How frequently? Could you please point us to some source?
I also keep my MBP plugged in almost all the time. Long ago I had a MacBook Air and I absolutely wrecked its battery pretty quickly by doing a full cycle every day.
What they should do is allow the user to switch between a standard mode (works like now) and a battery-conservation mode, where the user does not care about charge level and the system does its best to preserve battery health. I'm using my laptop on the charger 99% of the time, and when I need to transport it, I could switch modes a few hours in advance.
Source? If you are letting charge drop in that range and charge again, you are creating charging cycles. Charge cycles reduce the capacity of Li-Ion batteries.
This is an extremely bad UI for battery-management logic. If that is the case, a computer that is always plugged in should discharge and re-charge the individual battery cells one by one to extend their lifespan.
IIRC you should strive to never reach the extremes; i.e. keep the battery in the 20–80% range in order to maximize its lifespan.
Another tip was to avoid rapid energy consumption (such as from intensive use of CPU or discrete GPU usage) while on battery.
My main issue is that whenever you plug in an external monitor, it automatically triggers the discrete GPU, and I'm using an external monitor for working from home (because of my neck), so I'm mostly on the charger.
There are plenty of disruptive Ubuntu upgrade issues too; for example, for almost 5 years (through 12.04 to 16.04!), if you installed on LVM volumes, old vmlinuz and initrd images were never cleaned up from your /boot partition, causing dpkg-reconfigure to fail and leaving dpkg completely broken.