AlphaAndOmega0's comments | Hacker News

I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations. The rumors about an upcoming touchscreen Mac are interesting, perhaps Apple will deign to make their ridiculously overpowered SOCs usable for general purpose computing. A man can dream..

There are a number of interesting creative apps for iPad that can make full use of its capabilities. A good example is Nomad Sculpt. There's also CAD software, many DAWs. I haven't tested Numbers yet but I would assume it's fairly well optimized.

This really reminds me of the 80/20 articles that made the frontpage yesterday. Just because a lot of HN users lament the fact that their 20% needs (can't run an LLM or compile large projects on an iPad) aren't met by an iPad doesn't mean that most people's needs can't be satisfied in a walled garden. The tablet form factor really is superior for a number of creative tasks where you can be both "hands on" with your work and "untethered". Nomad Sculpt in particular just feels like magic to me, with an Apple Pencil it's almost like being back in my high school pottery class without getting my hands dirty. And a lot of the time when you're doing creative work you're not necessarily doing a lot of tabbing back and forth, being able to float reference material over the top of your workspace is enough.

At this point Apple still recognizes that there is a large enough audience to keep selling MacBooks that are still general purpose computing devices to people who need them. Given their recent missteps in software, time will tell if they continue to recognize that need.


I work with Logic Pro X often. I bought an iPad Pro M4 and the Logic version for it is really compelling. Touch faders and the UI are well thought out. The problem is they want me to subscribe to use it. I wish I could just outright purchase it for $300.

They should charge less if offering a one-time purchase. $300 only beats $50/year after 6-7 years, depending on the discount rate you'd assign in a present value calculation. In software it's more typical to calibrate that around 2-3 years. I like the design as well.
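A rough way to see where the 6-7 year figure comes from (a minimal sketch; the $300/$50 prices are from the comments above, while the discount rates and start-of-year payment timing are illustrative assumptions):

    # Break-even sketch: one-time $300 purchase vs. a $50/year subscription.
    def breakeven_years(one_time=300.0, per_year=50.0, rate=0.05, horizon=30):
        """First year in which the discounted cost of the subscription
        reaches the one-time price (payments at the start of each year)."""
        total = 0.0
        for year in range(horizon):
            total += per_year / (1 + rate) ** year
            if total >= one_time:
                return year + 1
        return None

    for r in (0.0, 0.05, 0.10):
        print(f"discount rate {r:.0%}: break-even after {breakeven_years(rate=r)} years")
    # discount rate 0%: break-even after 6 years
    # discount rate 5%: break-even after 7 years
    # discount rate 10%: break-even after 9 years

At a 0% discount rate the crossover is exactly 6 years ($50 x 6 = $300); any positive discount rate pushes it later, which is the "6-7 years, depending" point above.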

I would not want to use CAD software or a DAW without a proper mouse and keyboard, and maybe a 3D mouse too. An interface made for touch really isn't suitable. Even connecting a mouse to an iPad is a pretty shitty experience, since all the UI elements are too big and you have to wait around for animations to finish all the time.

Shapr3D is an interesting 3D design tool which has some CAD capabilities and an interface optimized for use with a stylus; Moment of Inspiration was similarly oriented (I really ought to try it).

How is connecting a Bluetooth mouse to an iPad any different than connecting one to a computer? Especially with iPadOS 26?

That is just one very simple part, connecting the mouse. Literally everything else sucks on iOS. File management, hidden menus, running multiple apps, system management... and the list goes on. Need to convert a STEP file to something else on the iPad? Download 15 apps to see which one works, then try to find the converted file in the abomination of a file system? iOS is hot garbage.

Yes but there is simply no reason to have two devices. There are a large number of Windows tablet-laptop combo machines that work perfectly well and prove touch apps work perfectly well on a desktop OS.

Yeah, it took MS a long time after Windows 8 to make that not suck, but touch and tablet interactions on Windows 10 and Windows 11 work perfectly well.


> There's also CAD software, many DAWs.

Assertions like this are what kill the iPad. Yes, DAWs "exist" but can only load the shitty AUs that Apple supports on the App Store. Professional plugins like Spectrasonics or U-He won't run on the iPad, only the Mac. CAD software "runs" but only supports the most basic parametric modeling. You're going to get your MacBook or Wintel machine to run your engineering workloads if that's your profession. Not because the iPad can't do these things, but because Apple recognizes that they can double their sales by gimping good hardware. No such limitations exist on, say, the Surface lineup. It's wholly artificial.

I'm reminded of Damon Albarn's album The Fall, which he allegedly recorded on an iPad. It's far and away his least professional release, and there's no indication he ever returned to iOS for another album. Much like the iPad itself, The Fall is an enshrined gimmick fighting for recognition in a discography of genuinely important releases. Apple engineers aren't designing the next unibody Mac chassis on an iPad. They're not mixing, mastering and color-grading their advertisements on an iPad. God help them if they're shooting any footage with the dogshit 12MP camera they put on those things. iPads do nothing particularly well, which is acceptable for moseying around the web and playing Angry Birds but literally untenable in any industry with cutting-edge, creative or competitive software demands. Ask the pros.


It's such a shame that the iPad has these limitations. It's such an incredible device: lightweight, very well designed, incredible screen, great speakers, etc. I really do feel that if Apple sold a MacBook in the style of a Surface Book: iPad tablet running macOS which could dock to a keyboard and trackpad with a potential performance boost (graphics card, storage, whatever), that it would be my dream device.

All I want is to put Linux on it. I already own copies of Bitwig et al.; if the iPad Mini didn't lock me into a terrible operating system then I might want to own one. But I'm not spending $300 for the "privilege" of dealing with iPadOS.

> perhaps Apple will deign to make their ridiculously overpowered SOCs usable for general purpose computing

They've been doing exactly this since the first M1 MacBooks came out in 2020.


> I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations.

Literally everything you do gets the full power of the chips. They finish tasks faster using less power than previous chips. They can then use smaller batteries and thinner devices. A higher ceiling on performance is only one aspect of an upgraded CPU. A lower floor on energy consumed per task is typically much more important for mobile devices.


Right but what if I don't notice the difference between rendering a web page taking 100ms and it taking 50ms? What if I don't notice the difference between video playback consuming 20% of the chip's available compute and it consuming 10%?

> but what if I don't notice the difference between rendering a web page taking 100ms and it taking 50ms?

You probably won’t notice this when using the new machine.

For me, it only becomes noticeable when I go back to something slower.

It’s easy to take the new speed as a given.

> What if I don't notice the difference between video playback consuming 20% of the chip's available compute and it consuming 10%?

You would notice it in increased battery life. A CPU that finishes the task faster and more efficiently will get back into low power mode quicker.


I'm pretty sure that users of the announced Blender for iPad port will notice any additional horsepower.

what users?

Faster can also mean more efficient for a lot of tasks, because the cpu can idle sooner so your battery can last longer, or be smaller and lighter.

"Literally everything" doesn't amount to much if I can't actually control the stupid thing.

The iPadOS limitations are largely orthogonal to being able to make use of the available performance, IMO. For example, search in large PDFs could certainly still be faster, and I don’t think it particularly suffers from iPadOS limitations.

> Apple will deign to make their ridiculously overpowered SOCs usable for general purpose computing

Did everyone forget that these chips started in general purpose MacBooks and were later put in the iPad?

If general purpose computing is the goal you can get a cheap Mac Mini


> The rumors about an upcoming touchscreen Mac are interesting

What rumors have you seen? Anytime I've seen speculation, Apple execs seem to shut that idea down. Is there more evidence this is happening? If anything, Apple's recent moves to "macify" iPadOS indicate their strategy is to tempt people over into the locked down ecosystem, rather than bring the (more) open macOS to the iPad.


Current rumors point to the M6 generation of MBPs being a significant redesign and featuring an OLED touch panel screen.

I don't understand the appeal, even a little bit. Reaching up to touch the screen is awkward, and every large touchpanel I've used has had to trade off antiglare coating effectiveness to accommodate oleophobic coating. For me, this would be an objective downgrade — the touch capability would never get used, but poor antiglare would be a constant thorn in my side. I can only hope that it's an option and not mandatory, and I may upgrade once the M5 generation releases (which is supposedly just a spec bump) as insurance.


It's convenient, and it also makes usage of a stylus far easier.

FWIW, I often rotate my Samsung Galaxy Book 3 Pro 360 so that the screen is in portrait mode, then hold the laptop as if it's a book and use a stylus and touch on the display with my right hand, and operate the keyboard for modifiers/shortcuts with my left, or open it flat on a lapdesk.


Smudges are off-putting... but there are times when it would be very convenient to be able to scroll or click on a touchscreen. There are times, when presenting, when a touchscreen would be preferred over a mouse or touchpad. It's not often, but they are nice to have.

And, in regards to smudges, I mean, just don't use the touchscreen unless you have to, and the problem is avoided.

Glare can be a thing, but that can be avoided by avoiding strong lighting behind you.


There's still the issue of accidentally triggering things (when e.g. adjusting the screen) and sometimes you don't have control of your surrounding lighting. I'd still prefer touch to be entirely optional.

https://x.com/mingchikuo/status/1968249865940709538

> @mingchikuo

> MacBook models will feature a touch panel for the first time, further blurring the line with the iPad. This shift appears to reflect Apple’s long-term observation of iPad user behavior, indicating that in certain scenarios, touch controls can enhance both productivity and the overall user experience.

> 1. The OLED MacBook Pro, expected to enter mass production by late 2026, will incorporate a touch panel using on-cell touch technology.

> 2. The more affordable MacBook model powered by an iPhone processor, slated for mass production in 4Q25, will not support a touch panel. Specifications for its second-generation version, anticipated in 2027, remain under discussion and could include touch support.


> Anytime I've seen speculation, Apple execs seem to shut that idea down.

They also said they weren’t merging iOS and macOS, and with every release that becomes more of a lie.

https://www.youtube.com/watch?v=DOYikXbC6Fs


Strategies change. That was 7 years ago, pre-Apple Silicon. It turns out that people want windowing options on their large and expensive tablet, to do long-running tasks in the background, etc.

If that were all they were doing, nobody would be concerned. It's the crapifying of macOS in order to make it work fine with a touch interface that drives everybody bonkers about the slow merge.

I have Tahoe and it's just as good at being a desktop OS as any of the previous OSes. Not sure what you're referring to.

There have been lots of complaints all over the place that contradict your experience.

One article that talks about it: https://osxdaily.com/2025/09/19/why-im-holding-off-on-upgrad...

For less discerning users, maybe the rough edges aren't that noticeable. But the point of choosing Apple products is that you should be a discerning consumer.


This is a pretty insulting comment. I'm sure there is a better word than discerning.

I was actually trying to be more neutral :\ rather than saying something like "consumers with taste".

The point I'm trying to make is that "Apple consumers" are more critical.


> That was 7 years ago, pre-Apple Silicon.

There have been rumours of Apple wanting to shift Macs to ARM chips for 14 years. When they made that announcement, they already knew.

https://en.wikipedia.org/wiki/Mac_transition_to_Apple_silico...

It was obvious it was going to happen. I remember seeing Apple announcing iPads doing tasks my Mac at the time could only dream of and thinking they would surely do the switch.

> It turns out that people want windowing options on their large and expensive tablet, to do long-running tasks in the background

The problem isn’t them making iOS (or iPadOS) more like macOS, it’s them doing the reverse.


> When they made that announcement, they already knew.

Yep, the ongoing convergence made that pretty clear. The emphatic "No" was to reassure 2018's macOS developers that they wouldn't need to rewrite their apps as xOS apps anytime soon, which was (and is) true 7 years later.

This is the same session where Craig said, "There are millions of iOS apps out there. We think some of them would look great on the Mac." and announced that Mojave would include xOS apps. Every developer there understood that, as time went on, they would be using more and more shared APIs and frameworks.

> The problem isn’t them making iOS (or iPadOS) more like macOS, it’s them doing the reverse.

That ship has sailed, but it's also completely overblown.


> but it's also completely overblown.

Speak for yourself. I for one despise the current direction of the Mac and the complete disregard for the (once good) Human Interface Guidelines. It’s everywhere on macOS now.

Simple example: the fugly switches which replaced checkboxes. Not only do they look wrong on the Mac, they're less functional. With checkboxes you can click their text to toggle them; not so with the switches.

I’m not even going to touch on the Liquid Glass bugs, or I’d be writing a comment the length of the Iliad.


My apologies, I thought the "IMHO" was implied.

You'll be happy to know that checkboxes still exist and work like you'd expect. https://imgur.com/a/p2Xe1WL

Apple provides HIG guidance on switch vs. checkbox toggles here: https://developer.apple.com/design/human-interface-guideline... It boils down to, "Use the switch toggle style only in a list row".


The chances that there are both a folding iPhone and a touchscreen Mac somewhere in the skunkworks of Cupertino are 100%.

The Apple Vision Pro was a far more extreme product and was kept pretty well under wraps (though it was a market failure).


The line for market success for a first generation, $3500 VR headset is drawn in different places for different people.

Questions for you:

1. If you don't know what to do with it, why did you buy it?

2. If you wanted a general purpose computer, why did you buy an iPad?

3. Which iPadOS limitations are particularly painful for you?


There are other differences with the iPad Pro lineup unrelated to the SoC. It's just strange to think that a very capable laptop chip is being put into a device with far more limitations.

I'd rather that than an underpowered chip.

It was mentioned, as almost a side comment somewhere, that the M chip is in there for multitasking and higher end image/video editing for "Pros". I could certainly use the M4 in an iPad Pro for iPadOS 26 and its multitasking. I run into occasional slowness when multitasking on my M2 iPad Air.


1. I do know what to do with it. I take notes, a lot, in my work as a doctor. That's been the case since I owned an iPad Air from 2020, which I replaced with an 11-inch M1 iPad Pro (which broke), and I finally caved and bought a 13" iPad Pro to replace it. I ended up getting the M4 model because there just didn't seem to be older ones reasonably available. Even the M1 was more than fast enough for the overwhelming majority of iPadOS applications.

Why an iPad? Android tablets have been... not great for a long time. The pencil is very handy, and the ecosystem has the best apps. Also, I know a few rather handy tricks Safari can do, such as exporting entire webpages as PDF after a full-screen screenshot, that are very useful to my workflow.

2. I already own multiple general purpose computers. They're not as convenient as an iPad. My ridiculously powerful PC or even my decent laptop doesn't allow the same workflow. However, that's not an intentional software limitation, it's a consequence of their form factor, so I can't hold Microsoft to blame. On the other hand, Apple could easily make an iPad equivalent to a MacBook by getting out of the way.

3. The inability/difficulty of side-loading apps, the restriction to a locked down store. Refusing to make an interface that would allow for laptop-equivalent usage with an external/Bluetooth mouse and keyboard. You can use an external monitor, but a 13" screen should already be perfectly good if window management and mouse-and-keyboard usage weren't subpar. Macs and iPads have near identical chips (the differences between an M chip for either are minor), and just being able to run macOS apps on device would be very handy. Apple has allowed for developer opt-out emulation of iOS and iPadOS apps on Mac for a while now; why not the other way around?

If not obvious from the fact that I'm commenting on HN, I would gain utility from terminal access, the ability to compile and run apps on device, a better filesystem etc. Apple doesn't allow x86 emulators, nor can I just install Proton or Wine. If I can't side-load on a whim, it's not a general purpose computer. I can't use a browser that isn't just reskinned Safari, which rules out a great deal of obvious utility. There are a whole host of possible classes of apps, such as a torrent manager, which are allowed on other platforms but not on iPadOS. It's bullshit.

My PC and laptop simply aren't as convenient for the things I need an iPad for, and they can't be. On the other hand, my iPad could easily do many things I rely on a PC for, if Apple would get out of the way. iPadOS 26 is a step in the right direction, but there are dozens of steps left to go.


I buy the higher end Apple products not because I plan to use all their power immediately, but because I keep my devices a very long time and want them to retain usability right to the end.

Same here. My launch-day M1 MBP is starting to show its age finally, M5 with twice the perf will be a nice upgrade.

Is it, though? I feel like everything on my M1 is still as snappy as it was on day 1. My previous MacBook definitely showed its age after 4 years, but I'm happy to use this one for at least another 2-4.

Mine is an M1 Max and gives me no gripes after four years. Like you, I also felt as though past laptops felt their age sooner. I'm typically using Photoshop, Lightroom, Resolve, Docker and other usual stuff at any given time.

I wasn't super informed on the Apple silicon laptops, so I was kind of disappointed when my last job gave me a 2-3 year old M1 Max laptop.

It blew the doors off every other laptop and desktop I've had (before or since).

When I think back to how quickly obsolete things became when I was younger (ie 90s and 00s), today's devices seem to last forever.


For me it was the memory limits more than CPU speed. Discord, Slack, Teams and a browser, and 16gb on my M1 was basically used up.

And here I am struggling with the 32gb version, always need more :P

Since Apple actually makes a significant amount of money selling hardware itself, I really wonder why they wouldn't allow people to install Linux on it, with full support. After all, it's not like this would jeopardize macOS/iPadOS App Store earnings — Linux users would simply buy into Apple hardware they haven't even considered before, and only a fraction of macOS/iPadOS users would switch to using Linux.

Do they disallow it, or just not provide active support? Active support requires paying for employees to keep it working. Ignoring it and having volunteers do it requires nothing.

I think the comment up one was about Linux on the iPad, which is mostly impossible. Well, IIRC there are some projects to get, like, Alpine Linux running inside iOS, but it is emulated or something, and pretty slow, no GUI, quite limited, etc.

You make it sound like these are the only two options; meanwhile, what they _most importantly_ fail to deliver is documentation.

And that's for macOS. For any other platform they actively prohibit any third-party operating systems.


Last I checked, Apple makes more revenue on services than on Mac and iPad combined. With higher profit margins.

It'll get even weirder if the rumoured MacBook Lite with an iPhone processor ends up happening. Incredibly powerful tablets constrained by a dumbed down operating system, sold right next to much weaker laptops running a full fat desktop environment.

Well, the A19 Pro beats the M1 in benchmarks, so while the rumored MacBook might be weaker than mid to high-end iPads, it won't be a slow machine in general.

Is that really rumored? Sounds like kind of a weak rumor to me. The MacBook Air already exists.

Apple already makes low cost versions of those, which are the previous models that they continue to manufacture.


https://www.macrumors.com/2025/06/30/new-macbook-with-a18-ch...

Apparently there are references to it in macOS already.


Emulators, because iPadOS doesn't allow dynamic recompilation (JIT), so you need as much CPU as possible.

> I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower

Look at glassy UIs. Worth it.


AFAICT, lots of "AI"-related stuff runs slow on M1, M2, M3, M4.

I don't know if this already exists but it would be nice to see these added to benchmarks. Maybe it's possible to get Apple devices to do Stable Diffusion and related tech faster, and it just needs some incentives (winning benchmarks) for people to spend some effort. Otherwise though, my Apple silicon is way slower than my consumer-level Nvidia silicon.


No, iPad Pro won't be faster than 4090s or 4070s (or even 5% of the speed of 4090).

But newer chips might contain Neural Accelerator to close the gap a little bit (i.e. 10%??).

(I maintain https://apps.apple.com/us/app/draw-things-ai-generation/id64...)


What improvements did the A19 Pro provide for Draw Things?


That's amazing! Curious how this will translate to the M5 Pro/Max Macs...

> I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations.

It's a nice problem to have, since for most of computing history it's been the other way around. (Meaning the hardware was the constraint, not the OS.)


I disagree. For a lot of the personal computing era, the problems with OSes and hardware were mostly a matter of technical progress. The problem with iPadOS is totally different; it's a problem that was basically manufactured in and of itself, and completely artificial at that. I do not think this is a good problem to have at all.

I don’t think you’re representing the state of iPad accurately.

In iPadOS 26, more extensive multi-window multitasking like Mac was added.

The quantity of windows you can keep open at once depends on your iPad’s SoC.

If you have a newer Pro iPad with more memory you can keep more of them open, and slowdown happens further down the rabbit hole.

The hardware is being pushed and used.

As another example, the iPad has pretty intensive legitimate professional content creation apps now (Logic Pro, Final Cut Pro, and others).

So there are people out there pushing the hardware, although I’ll join those who will say that it’s not for me specifically.


I suppose that's an interesting way of framing it, yet in my gut I feel like I am paying for something that I am locked away from.

Sometimes, though, YouTube will make the iPad uncomfortably hot and consume the battery at an insane pace.

So, I guess there's someone using the performance.


>> I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations.

> It's a nice problem to have, since for most of computing history it's been the other way around. (Meaning the hardware was the constraint, not the OS.)

For anyone who works with (full-size) image or video processing, the hardware is still the constraint... Things like high-ISO noise reduction are a 20-second process for a single image.

I would be happy to have a laptop that was 10x as fast as my MacBook Pro.


I don't think hardware has been a real constraint since the Pentium era. We've been living in a world of CPU surplus for close to 2 and a half decades, now.

I've been RAM limited more than CPU limited for some time. In my personal workflows, 32GB was not enough and I'd receive out of memory errors. I bumped that up to 64GB and the memory errors went away. This was in a Hackintosh so RAM upgrades were possible. I've never tried an M* series chip to see how it would behave with the same workflow with the lower RAM available in affordable machines.

> can't figure out what to do with even a fraction of the horsepower

That's sort of the funny thing here. Apple's situation is almost the perfect inverse of Intel's. Intel fell completely off the wagon[1], but they did so at exactly the moment where the arc of innovation hit a wall and could do the least damage. They're merely bad, but are still selling plenty of chips and their devices work... just fine!

Apple, on the other hand, launched a shocking, world-beating product line that destroys its competition in basically all measurable ways into a market that... just doesn't care that much anymore. All the stuff we want to spend transistors on moved into the cloud. Games live on GPUs and not unified SOCs. A handful of AI nerds does not much of a market make.

And iOS... I mean, as mentioned, what are you even going to do with all that? Even the comparatively-very-disappointing Pixel 10 (I haven't even upgraded my 9!) is still a totally great all-day phone with great features.

[1] As of right now, unless 18A rides in to save them, Intel's best process is almost five YEARS behind the industry leader's.


It’s surprising to me MacBooks have such low market share. I got my first Mac after using Windows all my life and I’m stunned. The laptop:

1. Lasts all day on battery

2. Doesn’t get hot

3. Compiles code twice as fast as my new Windows desktop

I really don’t like macOS but I’ve shifted to recommending Mac to all my friends and family given the battery, portability and, and speed.


Regarding market share and your friends and family recommendations, you’re thinking first world. Rest of the world wants and can only afford sub-$500 laptops.

I’ve found that the $1000 Mac laptop is worth about $500 after 3 years and the $500 laptop is worth $50. The price difference over time really isn’t that big and the Mac is going to have a better trackpad and display and longer battery life.

Yeah but in the longer term the price trends to $0 either way, and Windows will get software support for longer.

My mom is happily using a Lenovo from 2013 and looking to upgrade because it doesn't support Windows 11 and Win10 is running out of support. A contemporary Mac would have been the 2012 Mac Mini which would have received its final OS update with 10.15 Catalina in 2019, and would have received its final security update in 2022. (Desktop, so no difference in peripherals, etc.)

Incidentally, I actually purchased both the Lenovo and a 2012 Mac Mini (refurb) so I have the pricing data - the Lenovo was $540 and the Mac Mini was $730 - and then both took aftermarket memory and SSD upgrades.


If your $1000 MacBook breaks after a year you need $1000 to repair it.

A $500 laptop is probably more repairable, and worst case you pay $500 to get a new one. Not to mention battery replacement etc.

The expected total cost of ownership is very high for a Mac. It’s like owning a Mercedes. Maybe you can afford to buy one, but you cannot afford maintenance.


What maintenance? AppleCare also exists if you worry about such things.

Larger initial purchases are harder on lower-income earners regardless of the long-term value they offer; that's one of the hard parts about being poor: it also makes positive economic decisions harder to accomplish.

That just means that the not-Mac is way more accessible. The high resale value makes Macs more expensive overall for everybody.

Also a lot of people prefer Windows. It’s got a lot more applications than Mac. It has way more historical enterprise support and management capability as well. If you had a Mac at a big company 20 years ago the IT tooling was trash compared to Windows. It’s probably still inferior to this day.


It definitely depends on what circles you run in. When someone I know or is a degree of separation away from me pulls out a PC, it is always a little bit of a surprise.

I won't buy or recommend one just on principle. I've spent way too much of my life advocating for open firmware and user-customizable systems to throw it all in the trash for a few hours of battery. I absolutely tell everyone they're the best, and why, but my daily driver has been a Linux box of some form (OK fine I have a windows rig for gaming too) for decades, and that's not changing.

Also, again, most folks just don't care. And of the remainder:

> Compiles code twice as fast as my new Windows desktop

That's because MS's filesystem layer has been garbage since NT was launched decades ago and they've never managed to catch up. Also, if you're not comparing apples to apples and are measuring native C/C++ builds: VS is an OK optimizer but lags clang badly in build speed. The actual CPU is faster, but not by nearly 2x.


>> Compiles code twice as fast as my new Windows desktop

>That's because MS's filesystem layer has been garbage since NT was launched decades ago [...]

I confess that this kind of excuse drives me batty. End users don't buy CPUs or filesystems. They buy entire systems. "Well, it's not really that much faster, it's just that part of the system is junk. The rest is comparable!" That may be, but the end result for the person you're talking to is that their Windows PC compiles code at half the speed of their Mac. It's not like they bought it and selected the glacial filesystem, or even had a choice in the matter.

That's right up there with "my Intel integrated graphics gets lower FPS than my Nvidia card." "But the CPU is faster!" Possibly true, but still totally irrelevant if the rest of the system can't keep up.


> End users don't buy CPUs or filesystems. They buy entire systems. [...] Possibly true, but still totally irrelevant if the rest of the system can't keep up.

At least historically for hardware components of PCs, this was not irrelevant, but the state of things:

You basically bought some PC as a starting basis. Because of the high speed of improvements, everybody knew that you would soon replace parts as you deemed feasible. If some component was not suitable anymore, you swapped it (upgrade the PC). You bought a new PC if things got insanely outdated, and updating was not worth the money anymore. With this new PC, the cycle of updating components started back from the beginning.


But that still doesn't explain it away: "oh, it's only slow because the filesystem is so slow". Assuming that's true, that's a very integral part of the system that can't readily be swapped out by most people. You can't say "the system is actually really fast, it's just the OS that's slow", because the end result is just plain "the system is slow."

> You can't say "the system is actually really fast, it's just the OS that's slow", because the end result is just plain "the system is slow."

If performance is so critical, people do find ways around this. Just to give an arbitrary example since you mention file systems:

Oracle implemented their own filesystem (ASM Cluster File System (ACFS)):

> https://en.wikipedia.org/w/index.php?title=Oracle_Cloud_File...

"ACFS provides direct I/O for Oracle database I/O workloads. ACFS implements indirect I/O however for general purpose files that typically perform small I/O for better response time."


I'd have liked more explanation of the actual solutions that programmers used at the time.


For checking? Just a lookup on disk (no db, just a large list with a custom index, then binary search in the retrieved block). Decoding anything was slow, and in-core was basically out of the question [1]. Caching was important, though, since just a handful of words make up 50% of the text.

I once built a spell checker plus corrector which had to run in 32kB under a DOS hotkey, interacting with some word processor. On top of that, it had to run from CD ROM, and respond within a second. I could do 4 lookups, in blocks of 8kB, which gave me the option to look up the word in normal order, in reverse order, and a phonetic transcription in both directions. Each 8kB block contained quite a few words, can't remember how many. Then counting the similarities, and returning them as a sorted list. It wasn't perfect, but worked reasonably well.

[1] Adding that for professional spell checking you'd need at least 100k lemmata plus all inflections plus information per word if you have to accept compounds/agglutination.
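A minimal sketch of the block-indexed disk lookup described above (illustrative; the 8 kB block size matches the description, but the file layout, padding so that no entry straddles a block boundary, and UTF-8 encoding are assumptions):

    import bisect

    BLOCK_SIZE = 8 * 1024  # 8 kB blocks, as in the checker described above

    def build_index(path):
        """One pass over the sorted word-list file, recording the first word
        of each block so that a lookup later needs only one block read."""
        index, offsets = [], []
        with open(path, "rb") as f:
            offset = 0
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                index.append(block.split(b"\n", 1)[0].decode("utf-8"))
                offsets.append(offset)
                offset += BLOCK_SIZE
        return index, offsets

    def contains(path, index, offsets, word):
        """Membership test with a single block read plus an in-block search."""
        i = bisect.bisect_right(index, word) - 1
        if i < 0:
            return False
        with open(path, "rb") as f:
            f.seek(offsets[i])
            block = f.read(BLOCK_SIZE).decode("utf-8")
        return word in block.split("\n")

The in-memory index here is only one word per 8 kB block, which keeps RAM usage tiny in the spirit of the constraints described above.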



The article is about fitting large dictionaries into small memory footprints. Writing a 200K word spell checker on a machine with only 256K memory.

When you need to store your dictionary in under 1 byte per word, a trie won't cut it.


The limit given in the article is 360KB (on floppy). At that size, you can't use tries; you need lossy compression. A Bloom filter can get you 1 in 359 false positives with the size of word list given https://hur.st/bloomfilter/?n=234936&p=&m=360KB&k=

The error rate goes up to 1 in 66 for 256KB (in memory only).
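Those figures fall out of the standard Bloom filter false-positive formula; a quick sketch (it assumes the linked calculator's apparent convention of 1 KB = 1000 bytes and a near-optimal integer number of hash functions):

    from math import exp, log

    def bloom_fp(n_words, kb):
        """p = (1 - e^(-k*n/m))^k with m bits available and the best integer k."""
        m = kb * 1000 * 8                        # bits, assuming 1 KB = 1000 bytes
        k = max(1, round(log(2) * m / n_words))  # near-optimal number of hashes
        p = (1 - exp(-k * n_words / m)) ** k
        return k, p

    for kb in (360, 256):
        k, p = bloom_fp(234_936, kb)
        print(f"{kb} KB, k={k}: about 1 in {round(1 / p)}")
    # 360 KB, k=8: about 1 in 358
    # 256 KB, k=6: about 1 in 66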


ispell (1971) already used Levenshtein distance, according to https://en.wikipedia.org/wiki/Ispell (although the article does not state whether this already existed in the original version or was added in later years).


Levenshtein distance up to 1, according to that article. If you have a hierarchical structure (trie or a DAG; in some sense, a DAG is a trie, but stored more efficiently, with the disadvantage that adding or removing words is hard) with valid words, it is not hard to check what words satisfy that. If you only do the inexact search after looking for the exact word and finding it missing, I think it also won’t be too slow when given ‘normal’ text to spell-check.
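A sketch of that distance-1 check, using a plain Python set of valid words as a stand-in for the trie/DAG described above (the lowercase ASCII alphabet and the tiny word list are illustrative):

    import string

    ALPHABET = string.ascii_lowercase  # illustrative; a real checker needs the language's full alphabet

    def suggestions(word, valid_words):
        """Valid words within Levenshtein distance 1 of `word`; the inexact
        search only runs if the exact lookup misses."""
        if word in valid_words:
            return {word}
        candidates = set()
        for i in range(len(word)):
            candidates.add(word[:i] + word[i + 1:])              # deletion
            for c in ALPHABET:
                candidates.add(word[:i] + c + word[i + 1:])      # substitution
        for i in range(len(word) + 1):
            for c in ALPHABET:
                candidates.add(word[:i] + c + word[i:])          # insertion
        return candidates & valid_words

    print(suggestions("ther", {"their", "there", "these"}))  # {'their', 'there'} (set order may vary)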



The first article I read about the techniques used in the spell program was the May 1985 issue of Communications of the ACM (CACM for those who know), https://dl.acm.org/toc/cacm/1985/28/5, in Jon Bentley's Programming Pearls column.

Not as much detail as the blog.codingconfessions.com article mentioned above; maybe some of the other/later techniques were added later on?

Link to the online version of the 1985 May Programming Pearls column: https://dl.acm.org/doi/10.1145/3532.315102

The PDF version of that article: https://dl.acm.org/doi/pdf/10.1145/3532.315102


I never thought I'd end up posted on HN! I was wondering why on earth Substack's analytics were showing visitors from here. If anyone has any questions, I'm happy to answer here.


To me, it seemed to be a visual equivalent of that auditory trick where a note seems to descend or ascend in pitch indefinitely. The outer aura of color seemed to be shrinking constantly.


The Shepard Tone - https://en.wikipedia.org/wiki/Shepard_tone - brilliantly used by Hans Zimmer in the Dunkirk score.


The person who guessed 16% would have a lower Brier score (lower is better) and someone who estimated 100%, beyond being correct, would have the lowest possible value.
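For reference, the Brier score is just the mean squared error between forecast probabilities and the 0/1 outcomes; a quick sketch with the two forecasts mentioned above (the single-event setup is illustrative):

    def brier(forecasts, outcomes):
        """Mean squared error between forecast probabilities and 0/1 outcomes.
        0 is the best possible score, 1 the worst."""
        return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

    # A single event that did in fact happen (outcome = 1):
    print(brier([0.16], [1]))  # 0.7056
    print(brier([1.00], [1]))  # 0.0, the lowest possible value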


I'm not saying there aren't ways to measure this (Bayesian statistics does exist, after all), I'm saying the difference is not worth arguing about who was right. Or even who had a better guess.


Yes. Sleep by itself does nothing unique. In fact, you have a slightly lower rate of metabolism than even the "basal" metabolic rate.


Eh? That isn't right. The metabolic rate while asleep is even lower than the basal average. The only reason there's a net loss after sleep is because you're not regularly consuming food or water as you would when you're awake, not anything special about sleeping by itself. You're burning fewer calories than usual at that time!


Indian, Mexican and Brazilian consumers have far less money to spend than their American counterparts. I would imagine that the costs of the hardware and data collection don't vary significantly enough to outweigh that annoyance.


I'm on semaglutide for weight loss purposes, while having no other health issues relevant here.

I don't think it's made any difference to any addictive tendencies or my bad habits (and with ADHD, those certainly exist). It certainly helps with the appetite of course.

This is definitely anecdotal evidence, but it's wise to wait for more data to come in before advocating for it on those grounds alone.


I’ve heard too many miraculous stories so far: numerous people close to me quitting lifelong addictions cold turkey within months of starting.

I’ve never had an addictive personality but I also found I gave up a couple habits while on it.

I wonder if ADHD specifically isn’t as prone to positive effects there.


I would appreciate citations. I'm a doctor on GLP-1s, who had previously convinced my mother to commence the same. In her case, it was driven clearly by failure of other methods to control her obesity and worsening liver fibrosis, on top of pre-existing diabetes. On my end, no such issues at present, but I consider it safe enough that it's a first-choice approach to robust weight loss, and I personally use it in conjunction with diet and exercise.

"Relatively high levels of significant side effects" is a vague and unhelpful claim:

High compared to what? What counts as a significant side effect here? What actually are the side effects in question? Are those side effects permanent and irreversible? Can they be avoided by adjusting the dose? Dozens of such considerations come into play.

No drug I'm aware of is perfectly safe, and I know many drugs indeed.

To the best of my knowledge, the combined risk of taking semaglutide utterly pales in comparison to the clear and present harms of obesity. The only clear downside is cost, and while I'm lucky enough to have access to cheaper sources, they're not even that expensive when you consider the QOL and health benefits.


https://pubmed.ncbi.nlm.nih.gov/38629387/

> Conclusion: Semaglutide displays potential for weight loss primarily through fat mass reduction. However, concerns arise from notable reductions in lean mass, especially in trials with a larger number of patients.

That's significant long-term damage to health, quite possibly permanent for patients 40 and older.


Sounds scary, doesn't it? It's a shame that the magnitude of lean-muscle loss is entirely comparable to that of going on a strict diet or fasting:

Intermittent/time-restricted fasting

https://jamanetwork.com/journals/jamainternalmedicine/fullar...?

That's simply how the body reacts to a caloric deficit, without additional exercise. If you combine intermittent fasting and resistance exercise, you find no muscle loss at all:

https://pmc.ncbi.nlm.nih.gov/articles/PMC7468742/

That's an apples-to-oranges comparison, because there's nothing preventing someone taking Ozempic from exercising on the side.

And in fact, other trials found that the overall ratio of fat:muscle lost was rather favorable, and that functional strength wasn't compromised:

https://dom-pubs.onlinelibrary.wiley.com/doi/10.1111/dom.157...

>Based on contemporary evidence with the addition of magnetic resonance imaging-based studies, skeletal muscle changes with GLP-1RA treatments appear to be adaptive: *reductions in muscle volume seem to be commensurate with what is expected given ageing, disease status, and weight loss achieved, and the improvement in insulin sensitivity and muscle fat infiltration likely contributes to an adaptive process with improved muscle quality, lowering the probability for loss in strength and function*

Interpreting the risks and benefits of medication isn't a trivial exercise; if you're driven by a handful of studies or ignorant of the wider context, then it's easy to be misled.


> That's an apples-to-oranges comparison, because there's nothing preventing someone taking Ozempic from exercising on the side.

Strongly disagree on this. If there were nothing preventing the patient from changing their diet and physical activity / exercise level, they could lose the fat through diet and exercise without resorting to taking semaglutides in the first place. Withdrawal studies show that there is a clear tendency for the weight to rebound after withdrawal from semaglutide use, therefore it's very hard to argue that it is the weight / fat mass alone blocking patients from adopting a healthier lifestyle.

Semaglutide may help manage sustained weight loss by e.g. reducing the effect of a reduced leptin baseline; however, overall I remain highly skeptical of the possibility of semaglutides being "a first-choice approach to robust weight loss".


That has nothing to do with GLP-1 agonists and everything to do with the fact that rapid weight loss without exercise and sufficient protein intake leads to substantial lean mass reduction.

It's still better unless you were woefully weak, in which case a doctor should have prescribed adequate nutrition and physical activity.

