30B images isn't that much in the context of Pokemon Go's playerbase of ~50 million (a conservative estimate based on users today). That's about 600 images per person, in a game that has been out since 2016, which is pretty low adoption, as the previous user said. I don't think the quest has been out since 2016, but considering a large fraction of users are basically daily users, it's still quite a small number of images.
600 images per person is a huge amount, especially considering some amount of people are like the GP and don’t do it. The active picture takers would be taking _more_ than 600 images.
> In the new design system, windows now have a softer, more generous corner radius, which varies based on the style of window. Windows with toolbars now use a larger radius, which is designed to wrap concentrically around the glass toolbar elements, scaling to match the size of the toolbar. Titlebar-only windows retain a smaller corner radius, wrapping compactly around the window controls. These larger corners provide a softer feel and elegant concentricity to the window…
Just a bunch of words that raised no red flags, maybe sounded like a decent idea even, but when you see it how is your reaction not “oh, that’s bad”
I feel like this is the design process. You have ideas, they sound ok, you try them out, and then immediately you revert a lot of them. Having the ideas without the taste to know when not to do something is becoming the new Apple way.
I think what they're saying is that larger radii are for 'real windows' that have toolbars and such but there are 'mini windows' and those get smaller radii. It doesn't seem well enough baked for them to release it like it is but there are other UI problems that I've been annoyed about for a long time (in particular shadows around window boundaries so you can never get a truly flat tiled experience).
Rounded corners (and the utterly massive drag area next to them) are Touch Bar 2.0: features that no one asked for, that have questionable value, and that provide marginal benefit even for their intended audience (touchscreen Macs, no doubt).
Except it kind of fails at that too. The window corners seem to be based either on those squircle things or on some other varying-radius curve that eases into the sides much more gradually than a proper circle. The window buttons (close, minimize) and the round toolbar buttons anchored to the top right corner are based on proper circles. Attempting to center a circle in a varying-curvature corner results in varying spacing between the circle and the corner, which defeats the whole point of giving different windows different corner sizes (not calling it a radius, because they are not circles).
When the top right corner contains a search field instead of a rounded button, that also seems to use varying curvature instead of a capsule with proper circles at the ends. It still results in varying spacing between the window corner and the toolbar content.
And that's just the 2 top corners. Attempts to align the top corners result in an even bigger mismatch with the rest of the window content. For example, Calculator: it has a grid of round buttons. While the window corners might match the top bar (as well as they can, due to the different shapes), the main calculation buttons don't match the corners at all.
Similar problem affects many of the popups which have something like confirmation button anchored to bottom right corner.
The rounded scrollbar handle isn't aligned with the bottom left corner size either; instead it awkwardly gets cut off by a different amount in each program.
Menus also have this disease. The non-circular corner curve of the overall menu shape extends way past the corner of the item highlight, resulting in varying spacing and making it feel almost like the whole menu has bulged out instead of having flat sides.
In macOS 26 it's only weirder, because, as you say, due to the squircle window corners we now have this constantly varying distance to the edge.
EDIT: I "get" Apple's fascination with the squircle, but why did they make the radius so big? Probably no one would've complained if they had just changed the current ~15-20px rounded corners into ~15-20px squircles, but they went 50px+ on toolbared windows.
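The varying spacing is easy to demonstrate numerically. A minimal sketch, assuming the corner is a superellipse ("squircle") with exponent n = 5 — an illustrative choice, since the exact curve Apple uses isn't published — comparing its radial distance from the corner's center against a true circle of the same nominal radius:

```python
import math

def superellipse_dist(r, n, t):
    """Radial distance from the corner's center to the boundary of a
    superellipse |x/r|^n + |y/r|^n = 1, at parameter angle t."""
    x = r * math.cos(t) ** (2 / n)
    y = r * math.sin(t) ** (2 / n)
    return math.hypot(x, y)

r = 50.0  # nominal corner radius in px (illustrative)
n = 5.0   # squircle exponent (assumed, not Apple's published value)
for deg in (0, 15, 30, 45):
    d = superellipse_dist(r, n, math.radians(deg))
    print(f"{deg:2d} deg: squircle {d:5.1f} px vs circle {r:.1f} px "
          f"(gap {d - r:+5.1f} px)")
```

The squircle bulges farthest from the circle at 45 degrees, which is exactly where a circular window button sits, so the gap between a circle and a squircle corner necessarily varies along the curve.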
The idea that "nothing in the natural world has the contrast of a modern display" is backwards and keeps being repeated. The real world has far more contrast than any screen that can be manufactured.
Think about standing outside on a sunny day. The sunlit pavement might be tens of thousands of times brighter than the shadow under a tree. Look up at a bright cloud and then glance into a shaded doorway. Your eyes can still make out detail in both. The dynamic range of the real world (the ratio between the darkest and brightest things present at the same time) is enormous.
Cameras struggle with this. When you take a photo, the camera can't always capture both the bright sky and the dark ground correctly at the same time. Either the sky blows out to white or the shadows become black. That limitation is why HDR photography and exposure bracketing exist.
Even modern digital cameras still capture a far wider range of brightness than most displays can show. That's why we use tone mappers in photography and video. Tone mapping compresses the huge brightness range captured by the camera so it can fit onto a display that only has a tiny slice of that range.
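To illustrate, here's a global tone-mapping operator in the style of Reinhard's classic curve; the scene luminances and the `white` scaling constant below are illustrative assumptions, not measured values. It squeezes several orders of magnitude of scene brightness into the display's [0, 1) range:

```python
def reinhard(l, white=1e4):
    """Reinhard-style global tone map: compresses unbounded scene
    luminance into [0, 1) for display. `white` scales what counts as
    fully bright (an illustrative choice, not a standard value)."""
    l = l / white
    return l / (1.0 + l)

# Scene luminances in cd/m^2 (rough, illustrative): shaded doorway,
# indoor wall, overcast sky, sunlit pavement -- ~5 orders of magnitude.
scene = [1.0, 100.0, 5_000.0, 100_000.0]
print([round(reinhard(l), 3) for l in scene])
```

The huge input range survives as an ordering (darker stays darker) but the absolute ratios are crushed, which is exactly the compression the display bottleneck forces.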
So screens are not "more contrasty" than reality. They're the opposite. Displays are a bottleneck that force a very wide real world brightness range into something much smaller.
Your eyes are also incredibly good at adapting. If you look at a white page of paper in sunlight and then look at black ink on it, the contrast between the two is extremely strong. Snow in sunlight next to a dark rock is another example. Nature is full of intense light and dark differences.
When web developers avoid strong contrast because it feels "unnatural", they're misunderstanding the physics and the biology. High contrast between text and background isn't artificial at all. It's actually closer to how humans evolved to see clearly: dark shapes against bright surfaces or bright shapes against dark ones.
The real ergonomic problem with screens usually isn't contrast between text and background. It's screen brightness relative to the environment. Many people run their monitors far brighter than the room around them. If a screen is glowing like a light source in a dim room, the whole display becomes visually harsh and fatiguing. In ergonomics, the usual advice is to match the screen's brightness to the surrounding lighting so the display feels like part of the environment rather than a flashlight in your face.
When display brightness is set appropriately, strong text contrast simply improves legibility. The discomfort people blame on "too much contrast" is often just a monitor that's set far brighter than it should be. Reality itself contains vastly wider brightness differences than any display, so high contrast text isn't unnatural at all. It's a practical way to make information clear within the limited dynamic range that screens can actually show.
Sorry. I really meant that there's nothing people are reading or viewing that has higher contrast in the natural world. I should've been clearer (and less forthright).
Your points about day/shade contrast and eyes adjusting are correct, as is your point about the brightness levels people run their screens at.
> Look up at a bright cloud and then glance into a shaded doorway.
Yes, and that causes strain on the eyes. Our eyes are very good at adapting (as you said), but it is not pleasant to do rapidly or while trying to concentrate on and interpret text.
Bleached white paper and black ink is (pretty generously) a ~15:1 contrast ratio in a well-lit room, which any half-decent screen from the past 20 years surpasses.
Pure black/white text is harder to read on screens since they produce and push light at you (as opposed to bouncing back ambient light like paper does). We have never seen text printed on paper at the contrast ratios a modern screen can produce, since there is no paper white enough or ink dark enough.
There are many things with typography that are finicky and sometimes counter-intuitive. Making text bigger and all-caps won't always make something more readable (see here: https://www.mentalfloss.com/transportation/roads/why-road-si...), likewise more contrast doesn't always make it more readable for everyone.
Of course there are people who require more contrast and larger type sizes and the great part about reading stuff on screens is we can often accommodate that better. Some websites/apps/etc handle accessibility options well and some really don't.
So I kind of went on a much longer thing than I wanted to... oh well. Sorry if I was too forthright in my initial comment, I guess clarity in intent/meaning can be just as important as readability.
> You can maybe get a cheaper Windows laptop but it will be terrible in almost everything
It will be worse at almost everything, except running my preferred OS (Linux). Being able to upgrade/repair RAM, storage and battery at home is quite a perk too.
I totally get it. I have the M4 Air, grabbed it for $700 on sale. I also have an MSI Creator running Linux (Wayland). Performance-wise, the base Air crunches through everything until lots of things are open and the GPU is roaring (encoding or streaming), and with colima I have a few Incus Linux containers up and running. Battery life is formidable.
Nothing comes close.
My Linux laptop (32GB RAM / beefy GPU) barely withstands 40 minutes on battery, but can handle very demanding tasks, and obviously gaming.
These are 2 different use cases, but right now, for the ultra-portable laptop, the Air is the king, until x64 brings back the efficiency per watt. Even Qualcomm can't compete. That being said, I am a big fan of Apple hardware and not Apple software, so whenever Asahi Linux is ready enough (with good battery life), I am definitely jumping ship.
That beefy GPU is the killer for battery life. There's quite a few PC laptops floating about that get in the 10-16 hour range battery life on lighter workloads (text editors, fast compilers, streaming video, browsing internet). I'm typing this on one right now. I wish it was running linux, but I need windows for work up until we get the last of our antiquated .net platform on core.
Sure, it's got integrated graphics so it won't win any gaming awards, but that's what the laptop with the beefy GPU sitting in the corner is for :) That thing pumps out enough heat to not be too pleasant sitting on a lap anyway.
Exactly. Big GPUs are the #1 reason battery doesn’t last on Linux laptops.
Power management is not done well with the GPU drivers in Linux. If they are not used, they still draw a lot of power, while that’s not really the case on Windows, from what I heard.
I think the best is to get a good Linux laptop, but with an integrated GPU. If you really want to do anything beefy, you can always use an eGPU :)!
Obviously this will never come close in terms of convenience as having an actual M series MacBook…
Many newer Windows laptops are now losing the ability to upgrade RAM and storage as well. I believe the newest Intel architecture introduced this, but my information might be out of date.
There are Intel CPUs which come with bundled RAM, for example the Intel Core Ultra 5 238V. It's like a system-on-module: the RAM is mounted directly on the CPU package, not even soldered to the motherboard. I'm not sure what particular advantages that brings over traditional packaging; maybe the shorter traces allow for faster signaling between CPU and RAM. But there's zero chance of upgrading or replacing the RAM, for sure.
In theory, but that is not the case with Lunar Lake, which nowadays does not have greater bandwidth than current CPUs with external LPDDR memory.
However, at launch, a year and a half ago, it had a bandwidth about 15% higher than competing CPUs.
For a really "massive increase in bandwidth", it would have needed a wider memory interface, like AMD Ryzen AI Max, which has a 256-bit memory interface instead of the 128-bit memory interface of most Intel/AMD laptop CPUs.
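The back-of-the-envelope math here is just bus width times transfer rate. A quick sketch (the LPDDR5X speed grades below are common figures, not official numbers for any particular SKU):

```python
def peak_bandwidth_gbs(bus_width_bits, transfer_mt_s):
    """Peak theoretical memory bandwidth in GB/s: bytes per transfer
    (bus width / 8) times transfers per second (MT/s * 1e6)."""
    return bus_width_bits / 8 * transfer_mt_s * 1e6 / 1e9

print(peak_bandwidth_gbs(128, 8533))  # typical 128-bit laptop CPU
print(peak_bandwidth_gbs(256, 8000))  # a 256-bit interface like Ryzen AI Max
```

At comparable speeds, doubling the bus width roughly doubles the ceiling, which is the "massive increase" a wider interface buys.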
Yes, totally. By "introduced" I didn't mean they were the first in the space, but rather that they have introduced it to the laptops they're shipping now. But yes, it's been a thing for a while on other architectures as well.
Soldered RAM is not necessarily bad, because it is usually more reliable than SODIMMs, so it is less likely to require replacements before other parts fail and it is less likely to suffer from transient hardware errors that are not caught due to the lack of ECC memory in most laptops and mini-PCs.
However I consider soldered SSDs and/or soldered batteries as completely unacceptable, as they limit the lifetime of a computer to low values.
And if you are comparing against an M1 or M2, you can find numerous PC laptops that will beat it in performance and still offer quiet, cool operation and long battery life.
Yes, the MacBook Air is unique-ish for having no fan at all, but a slow running fan that you can barely hear is going to get you more performance with basically zero added cost or compromise.
And for those users who don’t need top performance and just need an affordable office app machine, I’d argue that Snapdragon laptops have the same primary benefits as the MacBook Air.
In terms of competition against x86, Apple is only ahead of competition in their latest two or so generations and only in specific ways.
Want to play games sometimes, like 936 million other PC gamers in the world (the fastest-growing segment of people who buy computers)? You'll pay a lot less for an Omen Transcend 14 than for a MacBook Pro at the same specs, and you'll get a system with a very similar noise and battery-life profile, along with far better graphics performance.
I don’t personally think Windows is so bad compared to Mac in terms of annoyances. Mac nags you about all of Apple’s subscription services and you can’t even uninstall their apps like News and Stocks. Microsoft lets you uninstall everything including Notepad. It’s really not that annoying after about 5 minutes changing settings and uninstalling some things.
If we are talking about buying a used Mac, we are also talking about buying a computer that historically will lose software support before the Windows equivalent. E.g., you buy an M2 MacBook Air and you've got about 7 years or less before you lose major OS versions. Almost guarantee you that won't be the case with any reasonably recent Windows PC that supports 11 today.
Not true. Mac OS does not nag you about subscription services. What are you talking about?
Windows is offensive, insufferable trash. From its CONTINUAL hounding about "your Microsoft account" to its bug-riddled, regressive, and shambolic UI. Things Windows users took for granted 40 years ago are simply gone.
Example: Select three PNGs in Explorer and right-click on them, and look for "Open with..."
Literally the moment you buy a Mac. Apple hardware comes with 3 month trials of various subscriptions, and you get a notification about it. They try to get you to sign up in hopes you’ll forget to cancel.
When I bought my iPhone 17 the sales associate even tried to pitch signing up for the trial in person as he guided me through the purchase process.
When you cancel the trial it ends it immediately instead of ending it at the end of your trial period, a dark pattern designed to encourage you to forget to end your trial.
Apple devices also nag you about buying AppleCare in the system preferences.
I’ve never been hounded about my Microsoft account. Be specific. When does this happen? Yes, you need one to set up Windows 11 (just like a Mac and especially iOS are basically useless without an Apple account anyway), but after that I’ve never been hounded around anything related to it.
Never had problems figuring out how to open stuff in Windows. No idea what you’re saying.
Most of these extreme claims about Windows seem to come from people who don’t even use the OS regularly and have forgotten about the ways in which macOS does many of the same commercial OS practices.
Every time my Windows gaming PC updates it nags me about setting up backups to OneDrive.
I cannot install Windows without a Microsoft account unless I apply work-arounds.
It constantly offers Office 365, even adding dummy icons to the start menu.
There are adverts on the login screen.
To be fair I installed Bazzite there, but for a laptop I cannot find an equivalent device at the same price point even ignoring the need for linux drivers.
Mine does not nag me about OneDrive or backups. OneDrive is not even installed. If I search my start menu for OneDrive, nothing even comes up.
Sure, you can't install Windows without a Microsoft account, but realistically a Mac is far less useful if you don't do the same thing. If you don't sign in with an Apple ID you've got zero iPhone integration, for example. I would imagine that 95% of Mac users are signed in to their Apple ID.
Signing in to an account to use commercial software doesn't seem unreasonable to me. I'd rather sign in to my account than deal with entering a product license key and needing to keep track of it.
I have not been offered Office 365 since after the first install.
There are not adverts on my login screen, it's completely blank. Change your settings.
These sales tactics are not unique to Windows, Mac subscriptions are upsold in the system settings and via notifications. They do go away and stay away but they are there when you buy the system.
Even after saying no to OneDrive and doing all I can to remove it, it tends to randomly come back months later and will automatically start uploading my desktop and documents folder to the cloud.
Edit: I even specifically bought the Pro version hoping to be able to shut some of this off.
It's not worth the hassle, but for the Windows machines in my house I set up Windows Server and have all the machines provisioned to an Active Directory domain where I turn off all the crap via Group Policy. You can get by with just editing Group Policy for a standalone Windows Pro copy, but for more than one machine I really didn't want to fiddle with having to update each machine's policy whenever Microsoft does something stupid.
This literally does not happen. Are you on Windows 10? It doesn't happen in 11. It is fully uninstalled. If I search the Start Menu for OneDrive it doesn't even show up.
It very much does happen. I’m on 11. It seems like every time it updates I get the “let’s finish setting up your computer” screen that asks me to set up OneDrive.
- When you buy a new device you get a few notes in System Settings encouraging you to try the free trials as well as buy AppleCare. You can dismiss these permanently with a couple clicks on each.
- When you open the respective apps they ask if you want to try the free trial.
- Apple once abused the Wallet app to send notifications about a theatrical release.
Other than that I'm not sure what the fuss is about.
Funny you should say that, since my Framework Intel 12th gen just started dropping WiFi/Bluetooth randomly, and one CPU core starts looking for it frantically in a loop, almost bringing the laptop to a crawl (it's very likely a hardware issue and not a Linux issue).
Good to know, as I long hesitated to get my hands on a refurbished M1 before opting for the Framework. On the good side, I was able to replace a bent frame and a broken display for a reasonable price instead of having a useless sitting duck / wondering what to do with a broken laptop.
Honestly for the price you'd have to pay to get an equivalent Windows laptop you can buy two Macbook Airs. I think Apple's higher end machines are overpriced but their entry level laptops are a bargain, for what you get. Unless you need Linux (and the resulting bugginess/short battery life), it's really a no-brainer.
(Don't tell my Linux isn't buggy. I use it, but I regularly run into nonsense like this: https://bugs.kde.org/show_bug.cgi?id=512297 that doesn't happen on Windows or Mac. I still haven't figured out why VSCode freezes for half a second every few minutes on Linux.)
I think pixel phones get 7 years of updates now? That seems about right. If the battery doesn't go by then, the GPS does. It is weird to me that the gps fails first.
No complaints here; I use a Framework Desktop with this chip. 32GB is given to RAM and the rest serves as VRAM. It can run large models like 'gpt-oss:120b' fine. Splurged and got a second SSD for mirroring, hoping to speed up reads/model loads. Haven't tested this for efficacy, but it also gives redundancy. Shrugs!
Haven't paid a subscription in years or even signed up for $EMPLOYER offerings; handles the rare outsourcing well enough.
Do we have to think? Apparently they amassed 30B images. :)