Don't we just call these out-of-tree patches? Lots of those for Linux lying about, it's not a new idea. I guess the difference is that they have multiple upstreams, so really they're more of a new project that consists of:
- A set of sqlite patches,
- Other upstreams and patches?
- A custom toolchain to build all the above together into one artefact
If a trunk is a series of patches then isn’t maintaining a series of patches against multiple upstreams just maintaining multiple forks? Feels like mostly semantics to try to differentiate at this point. I mean what is the thing that you would get from doing it this other way? Isn’t it fundamentally still going to require pulling upstream changes regularly and resolving conflicts when they occur? Reading the treatise they say “this allows us to address conflicts that normally you would have to resolve manually”. So Git has tools to pick and choose and auto resolve conflicts, you just have to customize the behavior of your tooling a bit.
Seems like they’re just ditching the built-in tools Git/GitHub offer and doing the exact same thing with custom bespoke tooling. Instead of doing that, I’d be more interested in leveraging what Git has baked in with some sort of wrapper tool to perform my automation with a bespoke algorithm. There are merge drivers and strategies end users can implement themselves for this kind of power-user behavior, and they don’t require some weird reinvention of concepts already built into Git.
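For illustration, a custom merge driver is just an executable Git invokes with the ancestor/ours/theirs versions of a conflicting file. A minimal sketch (all names here are hypothetical, and the toy line-wise policy assumes edits that keep line counts equal):

```python
#!/usr/bin/env python3
# Sketch of a custom Git merge driver (hypothetical names throughout).
# Register it once:
#   git config merge.preferupstream.driver "python3 merge_driver.py %O %A %B"
# and route files to it in .gitattributes:
#   vendored/*  merge=preferupstream
# Git then invokes the script with three temp files: %O = common ancestor,
# %A = ours (the merge result must be written back here), %B = theirs.
import sys

def resolve(ancestor: str, ours: str, theirs: str) -> str:
    # Toy policy: take whichever side changed a line; if both sides
    # changed the same line (a real conflict), let upstream win.
    out = []
    for o, a, b in zip(ancestor.splitlines(), ours.splitlines(),
                       theirs.splitlines()):
        if a == o:
            out.append(b)        # we didn't touch it -> take upstream
        elif b == o:
            out.append(a)        # upstream didn't touch it -> keep ours
        else:
            out.append(b)        # true conflict -> policy: upstream wins
    return "\n".join(out) + "\n"

if __name__ == "__main__" and len(sys.argv) == 4:
    o_path, a_path, b_path = sys.argv[1:4]
    texts = [open(p).read() for p in (o_path, a_path, b_path)]
    with open(a_path, "w") as f:
        f.write(resolve(*texts))     # Git reads the merge result from %A
    sys.exit(0)                      # exit status 0 = merged cleanly
```

Anything smarter than this (semantic resolution, per-hunk policies) goes in `resolve`, but the plumbing is just this: Git hands you three files and reads the result back.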
This kind of talent is ... not rare at all? And pretty easy to find too, just wander around the hallways at a major academic systems conference or hang around the kernel mailing lists, you'll meet most of the people working at the cutting edge of these things and get connected to those working on critical systems components. And yes, most of them work for FAANG or are funded by them.
Seriously, book a plane ticket to ATC/OSDI, EuroSys, etc and talk to all the people there. The good ones are already hired by one of the big established players (FAANG, Red Hat, Intel, etc), which is why you need to offer competitive compensation to lure them away.
Engineering is about tradeoffs -- are the resources invested in improving a system worth the return of said improvement? We know full well how to build bridges that will last a thousand years, we just choose not to because it's not an effective use of public funds compared to a fifty year bridge.
The same applies to software engineering -- each additional edge case you handle increases cost but has diminishing returns. At some point you have to say good enough and ship. The cost of perfection is infinite -- you have finite resources, and a great part of engineering is deciding how to allocate them.
> Besides the option to pay in naira, these companies allow Nigerians to store their data within the country — an advantage most of their Western rivals lack. Local servers give businesses the benefits of low latency and data localization at a time when the debate about who has access to a country’s data is heating up
Uhhh... all major cloud providers let you locate data manually? AWS surely has Nigerian datacenters too, that's not an advantage, that's the minimum.
I will grant the argument that random Nigerian startups are not subject to foreign government interference, but saying that low latency is a benefit seems incorrect.
There's been a recent explosion in mesh sizes because of Nanite as well. A lot of devs who don't understand optimization will pull in a 1+ GB movie-quality mesh and just slap Nanite on it, even though they'll never need that level of fidelity.
"I would never buy a $200 ticket, therefore there does not exist a taylor swift ticket worth $200"
As you point out, different people have different price elasticities. Is it so hard to believe that some people really are willing to pay $5000 to attend?
You rightly point out that resale market prices are likely higher than the market clearing price compared to if there were a single-price auction for all the tickets, but saying that they're worth $100 is just flat out wrong.
> MacBook Air with M2 and M3 comes standard with 16GB of unified memory, and is available in midnight, starlight, silver, and space gray, starting at $999 (U.S.) and $899 (U.S.) for education.
At long last, I can safely recommend the base model macbook air to my friends and family again. At $1000 ($900 with edu pricing on the m2 model) it really is an amazing package overall.
I don't work in networking, but seeing as most traffic is encrypted these days, does passing through unfriendly hardware matter as much as back in the days of plaintext everything? Sure they can drop packets, but they can't tamper/read it, or is there something I'm missing?
> […] but they can't tamper/read it, or is there something I'm missing?
You could redirect traffic temporarily to get a Let's Encrypt (ACME) certificate issued and then use it to pretend to be some important site. Redirect attacks have been done in the past:
> Among all the scams and thievery in the bitcoin economy, one recent hack sets a new bar for brazenness: Stealing an entire chunk of raw internet traffic from more than a dozen internet service providers, then shaking it down for as many bitcoins as possible.
This type of attack is why LE does "Multi-Perspective Validation":
> A potential issue with this process is that if a network attacker can hijack or redirect network traffic along the validation path (for the challenge request, or associated DNS queries), then the attacker can trick a CA into incorrectly issuing a certificate. This is precisely what a research team from Princeton demonstrated can be done with an attack on BGP. Such attacks are rare today, but we are concerned that these attacks will become more numerous in the future.
If you direct a CA's traffic through your server, you can answer the HTTP or DNS queries that prove domain ownership. And lots of people click past warnings because an IT disruption isn't a day off if they can work around it.
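The multi-perspective idea is easy to sketch: only count the domain-control challenge as passed if a quorum of independent vantage points all see the expected token. A toy model (not Let's Encrypt's actual implementation):

```python
def challenge_passes(observations: list[str], expected_token: str,
                     quorum: int) -> bool:
    """Toy model of multi-perspective validation.
    `observations` is what each vantage point fetched from
    http://<domain>/.well-known/acme-challenge/<token>.
    A BGP hijack near one vantage point can fake one answer, but has
    to fool `quorum` independent networks at once to fake issuance."""
    matches = sum(1 for body in observations if body == expected_token)
    return matches >= quorum

# A hijacker who only controls the path seen by one vantage point fails:
print(challenge_passes(["evil", "token-abc", "token-abc"],
                       "token-abc", quorum=3))   # False
```

The security argument is purely about correlated routing: an attacker has to win the BGP game against several topologically diverse networks simultaneously, which is much harder than redirecting one path.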
Russia could easily “convince” a CA based in their country to do them a favour to facilitate MITM. Or just gather the right kompromat needed to convince one overseas.
With educational pricing this thing starts at $500, and at 16GB of RAM (finally) I think this easily beats any sort of desktop PC you can buy at that price (let's exclude custom builds, they're not the same market).
I think this just became the go-to recommendation I'll give to anybody wanting an entry-level desktop computer of any kind. In fact I might buy one for my parents right now to replace the old mac mini they have. I really can't think of any reasonable competition for it at that price.
One issue to watch out for: Sub-4K res monitors look surprisingly bad on newer versions of macOS with Apple Silicon Macs. And no, it's not simply a matter of non-Retina obviously not looking as nice as Retina monitors - something like a 1440p monitor will look much worse on macOS than it would on Windows or Linux. This is partly caused by a lack of subpixel rendering for text on macOS, but it doesn't affect just text, with app icon graphics and such seemingly optimized for High-DPI resolutions only and thus looking awful too.
You commonly see people using 3rd party apps such as BetterDisplay to partially work around this problem by tricking the system to treat 1440p displays as 5K displays and then downscale, but it doesn't solve this completely.
So yes, the price for the machine is fantastic, but you may want to budget for a basic 4K display as well.
> Having experienced 4k I feel impoverished having to return to lower resolutions.
That's what they said. I've been using Retina/HiDPI displays at work for close to a decade now. Still can't say I prefer one over the other. I have no problem seeing pixels, especially now that I've switched to Linux (KDE Plasma) at home. In fact I kind of like being able to catch a glimpse of the building blocks of the virtual world.
What actually does matter (for me) is uniformity and color accuracy. And you can't have that for cheap, especially not in 4K.
Is this with newer Apple Silicon Macs? My 2020 M1 Mac Mini looks unremarkably normal on my 1440p display. I'm also going between that and my 14" M1 Pro Macbook Pro, which of course looks beautiful but doesn't really make the 1440p on the Mini 'bad'.
Edit: Adding that both of these machines are now running macOS 15.1 at this time.
In my experience, you can’t do any sort of scaling with sub-4K displays. This is “since M1”. Intel Macs, even on the latest macOS, can do scaling, e.g. 1.5x at, say, 1440p, which last time I bothered with an Intel Mac required a workaround via Terminal to re-enable.
But that workaround is “patched” on Apple Silicon and won’t work.
So yes if you have an Apple Silicon Mac plugged into a 1440p display, it will look bad with any sort of “scaling”- because scaling is disabled on macOS for sub-4K displays. What you’re actually doing when you’re “scaling” on say a 1440p display is running that display at 1920x1080 resolution- hence it looks like ass. Back before Apple Silicon, running that 1440p display at “1920x1080” was actually just scaling the UI elements up to appear as though you had a 1920x1080 display- since it was still utilizing the full …x1440 pixels of the display, “1920x1080” looked nicer than it would now.
So, brass tacks: it’s just about how macOS/OS X obfuscates the true display resolution in the System Preferences -> Displays menu. Now with Apple Silicon Macs, “1920x1080” means “2x scaling” for 4K monitors and literally “we’ll run this higher-res monitor at 1920x1080” for any display under 4K resolution.
> Back before Apple Silicon, running that 1440p display at “1920x1080” was actually just scaling the UI elements up
I’m almost sure that macOS can’t do that. It’s always done scaling by rendering the whole image at 2x the virtual resolution and then just displaying it on whatever screen you had. For example, for “looks like 1080p” on a 1440p screen it would draw onto a 2160p canvas (and do 2160p screenshots).
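The arithmetic of that render-then-downsample pipeline, with illustrative numbers for a 1440p panel set to "looks like 1920x1080":

```python
def hidpi_scaled_mode(looks_like, panel):
    # macOS-style "scaled" mode: draw everything onto a backing store at
    # 2x the virtual resolution, then resample that whole image to the
    # panel's native resolution.
    backing = (looks_like[0] * 2, looks_like[1] * 2)
    downscale = backing[0] / panel[0]   # resampling ratio per axis
    return backing, downscale

# "Looks like 1920x1080" on a 2560x1440 panel:
backing, ratio = hidpi_scaled_mode((1920, 1080), (2560, 1440))
print(backing)  # (3840, 2160) -- a 4K canvas squeezed onto 1440p
print(ratio)    # 1.5 -- a non-integer ratio, hence the softness
```

This is also why screenshots in that mode come out at 3840x2160, and why a true 4K panel at "looks like 1920x1080" is crisp: there the ratio is exactly 1.0 (backing store equals panel), with no fractional resampling step.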
Yeah I was never able to get that to work on M1/M2 Macs. Intel, sure, but none of the workarounds (including BetterDisplay) worked on the ARM Macs. Do they now? I last tried in 2022.
If your 1440p monitor looks “fine” or “good”, it’s because the scale is 1x - for many people, including myself, UI elements are too small at 1x 1440p. I had to buy a 4K monitor so I could have larger UI elements AND crisp UI elements.
You may just not be seeing the visual artifacts on your screen because you don't know what they look like, or mentally adjust to what that screen looks like.
The same way someone might not notice motion smoothing on a TV, or how bad scaling and text rendering looks on a 1366*768 panel, or different colour casts from different display technologies. All three took me a while before I could tell what was wrong without seeing them side by side.
> You may just not be seeing the visual artifacts on your screen because you don't know what they look like, or mentally adjust to what that screen looks like.
Does any of that matter, though? Who bothers with the existence of hypothetical artifacts in their displays they cannot even see?
It matters once you get used to something better. Our brains are really good at tuning out constant noise but once you start consciously recognizing things it’ll remain noticeable. If your vision slips you won’t constantly be walking around saying everything is fuzzy but after using glasses you’ll notice it every time you take them off. Low-res displays work like that for many people – back in the 90s, people were content with 800x600 27” monitors but now that would feel cramped and blocky because we’ve become accustomed to better quality.
This is the biggest issue with Mac hardware at the moment.
All because of a decision to make it easier for their developers (and 3rd party too I guess) to be able to claim they figured out high-DPI before everyone else.
It comes at a large cost now: either spend more money than is reasonable on one of the few compatible displays, or accept a much worse experience. That's just not good for devices at this price.
This is why a big affordable iMac is so necessary, but TC's Apple likes money too much to care about their legacy customers.
After such a long history of Mac OS having better font rendering and a generally better graphics stack (Quartz, everything is basically continuous PDF rendering), this feels like a big letdown.
The problem is going to improve as more high-DPI displays are released for sale but it has taken a lot of time because most customers like to focus on other characteristics that are arguably more important for other use cases.
There are plenty of premium displays that are just good to great, but you really have to think about how they will work if you buy a Mac. Most likely you'll need to compromise, which feels bad considering the price of admission...
Wait, what kind of people are you talking about, and how niche is that target group?
You're saying Macs are expensive, but at the same time the potential buyers can't afford even a cheap 4K monitor? They go for like $200 now.
And even if that group exists... it's not like 2560p is torture on a Mac, especially with BetterDisplay's HiDPI mode; I'd bet many wouldn't even notice the difference.
Unless you want to use your cheap 4K monitor as the equivalent of a 1080p display (which is not a lot of space by today's standards), it's not at all viable.
2560 is actually 1440p, and no, it's not very good, even with BetterDisplay, without even getting into the performance and rendering implications.
The fact is that if you buy an expensive Mac desktop, you need to also buy one of the few expensive displays that can work properly with it, otherwise you get degraded experience or compromise, which is unacceptable for hardware this price.
We are in this situation only because of both engineering and commercial decisions from Apple.
Considering that they sold an entry level 27" iMac for a lower price than what the Studio sells for, well into 2021, the position is indefensible. Even if they wanted to make an external display, they didn't have to make it that overbuilt/expensive without any other choice.
It is a purely profit-motivated move: they want to extract a 50% margin on everything, the old iMac was a low-margin device, and that's really the only reason it doesn't exist anymore. Conveniently, all of that is supported by an unnecessary engineering decision (they didn't have to remove subpixel rendering).
Why do you feel the need to defend a megacorp's terrible choices, made only to milk you as much as possible?
As for the niche thing, having installed/managed quite a few of those old 27" iMacs, I can tell you they were extremely popular, precisely because they were cost effective. I think you largely underestimate how large the customer base for the 27" iMac was.
As far as I'm concerned, it's way less niche than $650 headphones, but the difference is that they can milk a 50% margin on those, while a big iMac at that margin target would be priced in territory where most wouldn't even consider it.
So yes, you can get a cheap display to keep the price of the setup contained, but it's really not a great experience, and it doesn't make a lot of sense to compromise if you are going to get an expensive computer in the first place.
At that price you can get an all-in-one 27" PC for not much more than a standalone display; it's not going to be great, but at least it's cheap. If we are going to compromise, at least do it right.
Can confirm, you absolutely need BetterDisplay and a tiny bit of elbow grease to configure the 5k clone to downscale to your real monitor. Not rocket science, but could be more streamlined.
If you say it looks fine without it, I don't know what to say.
Is there a review that demonstrates and corroborates this issue? Is it a difficult problem if I choose to buy a new display for a Mac mini? My old display is 10 years old, so I would have to get a new one anyway.
It's most visible with the macbooks because you have the retina display and the low dpi display next to each other.
In short: you probably want to get at least a 4k display anyway, but if you want to delay that, you should buy BetterDisplay. The difference is night and day.
My 7 year old QHD monitor pair through a M1 Pro MBP still looks fantastic. Then again, I do spend most of my day in apple Terminal, but I'm not really in want of anything more. Some other sibling comments are saying Windows 10/11 looks crappy, and I agree, as I have to occasionally switch between the two, I just don't like working in Windows anymore, mostly because of the poor display.
I use both OSes on the same display, and Windows looks much better on an "old" non-Hi-DPI display, I can tell you that much.
I used to dislike Windows font rendering, but it's still better than what macOS gives you for "regular" displays. You can fix it somewhat with BetterDisplay but still...
Modern versions of macOS don't support subpixel rendering, what Windows calls ClearType, so that is why macOS will always look worse on low resolution displays.
Using BetterDisplay to force a "2x" resolution will give you better rendering but at the cost of lower usable/effective resolution.
Yeah, sure, I know that, which is why I'm relating my experience, which is the result of those engineering choices.
It's pretty funny that you need special hardware for macOS font rendering to stand comparison with Windows.
Microsoft has a lot of problems but they are way more pragmatic in their choices giving a "good enough" experience on most hardware.
But if you don't follow Apple's choice, your experience can go from great to barely passable in an instant.
Very different approaches, but as I'm getting older, I understand why the one from Microsoft is popular and why they deliver more value.
Basically operating at standard pre-Retina Mac DPI levels. The 27" Apple Cinema Display had exactly this resolution, as well as the 27" iMac before it went to 5K.
I agree, it works… fine. But sadly more and more elements of modern macOS will look blurry / aliased because they are only made with hi-DPI in mind.
For example all SF Symbols, as far as I know, are not defined as pixel graphics but only stored as vectors and rasterized on the fly. Which works great at high res and makes them freely scalable, but on low-DPI displays they certainly look worse than a pixel-perfect icon would.
Came here to echo this. Also, it always amazes me how many people respond to warnings like this (as seen in this thread as well) saying lower-resolution displays look just fine. I returned a M2 Mac Mini solely because it looked so awful on all of my monitors -- I tried 2 different 32" 2k displays, plus a handful of 24" displays. Everything was fuzzy and awful looking. Not something that could be tolerated or ignored... Completely unusable. I feel like this fact is not well known enough.
The fact that so many seem to tolerate "low-res" or "mid-res" displays on the current M-series Macs is really puzzling to me... maybe my eyesight isn't as bad as I thought it was and everyone else's is a lot worse!?
This new M4 mini is tempting enough that I might try a Mac again... but this time I am definitely going to have to budget for a 4k/5k display.
Honestly I am going to say skip 4K and just go to 5K. They are not that much more. I have a 2x5K setup and it is great. The main monitor is in normal orientation and the other is mounted on the left at a 90° rotation, centered on the side of the first. I keep my work on the main and all the documentation, chat, etc. on the vertical one. I hope to be able to ditch the two-monitor setup next year and go to a single 8K display.
There's still good deals in mini PC land. Yes, the M4 is faster but there's loads of mini PCs with decent CPUs, 32GB RAM and a 1TB of SSD storage for under $600. I think for a lot of people for basic usage they'll get more value out of the larger and upgradable SSDs than the faster CPU.
I bought one of these once. The specs on paper look good, but the CPUs are weak. They’re like those U series Intel CPUs where you could get say an i7-7700U, with 4 physical cores and 8 total threads, but at 15W TDP you were never really going to benefit from the 4 cores and 8 threads.
I do love those particular boxes for certain workloads. I have a few Lenovo ThinkCentre small form factor boxes in my office. They’ve replaced all my Raspberry Pis. Unlike the Pi, I was able to purchase these!
Yeah the Pi is way too expensive. NUCs are a better deal (and roughly the same price), and x86 is obviously going to smoke ARM in certain workloads (e.g. running a VPN server that has to encrypt/decrypt all traffic).
The concept of the Raspberry Pi is great, but the price point was never where it needed to be. Even when you could get one for around MSRP, doing anything with it was so much added cost. Yeah $35 for a little computer is great, but you needed a power supply, case, microSD card, and whatever hats (the hats have always been overpriced IMO).
I know a couple of iOS developers who recently switched to a M4 MacBook pro and they swear that in some frequent workloads it feels sluggish and slower than the old Intel MacBook pros. Being RAM-starved might have something to do with it though.
> but there's loads of mini PCs with decent CPUs, 32GB RAM and a 1TB of SSD storage for under $600.
I also add that, unlike Apple hardware, these miniPCs are built with extensibility in mind. For example, most NUCs from the likes of minisforum and Beelink ship with a single SSD but support multiple SSDs, with their cases also having room for SATA drives. They even go as far as selling barebones versions of their NUCs, where customers can then pick and choose which RAM and SSDs to add.
From my experience, TCO on most apple products ends up being roughly the same when you factor in resale value.
You'll be able to sell your M4 mac mini in 5 years for $150 for an instant-cash offer from backmarket or any other reseller, while you'd be lucky to get $30 for the equivalent Beelink or BOSGAME after 6 months on ebay.
> From my experience, TCO on most apple products ends up being roughly the same when you factor in resale value.
This reads like the epitome of Apple's reality distortion field. I mean, you're trying to convince yourself that a product is not overpriced compared to equivalent products, and that customers aren't being subjected to price gouging, by asserting that you might be able to sell it later. That's quite the logical leap.
No that's an accurate TCO calculation.
It's interesting that on this topic, the inventor of the PC also seems to be caught in that supposed "Apple reality distortion field" and can't confirm the "price gouging" that you're trying to convince yourself Apple practices.
I’m curious about your definition of the word waste. If a $600 Mac lasts 5 years and still worth $150 and another machine loses all its value in six months, how is Mac a waste?
600-150 is a bigger number than 30, last time I checked. So even if the $30 machine were to lose all its remaining value instantly, not even scrap metal, you would be far, far ahead.
These are the dollar numbers claimed in the above post.
I do think we should at least use the same measure of time to compare. Even if that means one reaches $0 by the time the other reaches $150.
Macs do generally hold their resell value better than PCs, but that doesn’t necessarily have any correlation to usefulness.
I have bought several ThinkCentre small form factor PCs used for about $200 each, and they’ve each been about 5-7 years old. They’re perfectly fine and I can even get new parts from Lenovo, depending on the part and machine. Fantastic deal. They run loads of services in my home.
> You'll be able to sell your M4 mac mini in 5 years for $150 for an instant-cash offer from backmarket or any other reseller
If you want to put in a bit of elbow grease, you can get a much better deal. M1 Mac Minis in my area are regularly selling for $350+ on FB Marketplace right now.
I just checked out backmarket as I've been shopping for a mini PC with oculink and hadn't thought of them. They have a primary nav across the top of the site which has 5 generic categories (laptops, consoles etc.), one Google product (pixel), 4 Samsung items, and 20 Apple items - more than all the others put together. I guess this very much proves your point.
No need for a black market, there's plenty of public ones (Backmarket, eBay, etc.). That being said $200 seems not terrible given the step change in performance since then (I own a 2019 MBP and think we were very unlucky with our purchase timing). Backmarket seems to sell yours for ~$350-500, so maybe you'll get a little bit more trade-in for it.
I owned a 2014 MBP (~$1200?) for a long time and as late as 2019 it was resellable for $500.
> I think for a lot of people for basic usage they'll get more value out of the larger and upgradable SSDs than the faster CPU
Why exactly?
What are a "lot of people" storing on their computers these days? Photos are in the cloud or on our phones. Videos and music are streaming. Documents take up no space. Programs are in the cloud (for the most part).
None of them have a proper HDMI 2.1 FRL port that is needed to run a 4k 120Hz monitor. Likely because the Iris Xe / AMD equivalent does not support it, and dedicated ITX GPUs are expensive. This isn't a problem with M4.
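Rough back-of-the-envelope for why 4K 120Hz needs FRL (pixel payload only, ignoring blanking intervals and DSC compression, so real requirements are somewhat higher):

```python
def raw_gbit_per_s(width, height, hz, bits_per_pixel=24):
    # Raw pixel payload only; blanking adds roughly 10-20% on top.
    return width * height * hz * bits_per_pixel / 1e9

needed = raw_gbit_per_s(3840, 2160, 120)  # ~23.9 Gbit/s of pixel data
hdmi20 = 18.0 * 8 / 10                    # 18 Gbit/s TMDS less 8b/10b = 14.4
print(needed > hdmi20)                    # True: TMDS can't carry it
```

HDMI 2.0's TMDS signaling tops out at 18 Gbit/s (14.4 effective after 8b/10b coding), so 4K120 at 8-bit RGB simply doesn't fit; FRL's higher link rates (up to 48 Gbit/s in HDMI 2.1) do.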
I would second this! The N100 is super efficient, and can often be found for around $150. I can also recommend looking at used Intel "NUC" mini PCs if you’re budget-conscious. I have a couple of 5th-gen i5 NUCs I got for $60 that run multiple VMs and LXC containers as part of a Proxmox cluster.
Another valid option is a Synology NAS: not only can you build the storage you probably want (I have 12TB with one-drive redundancy and one slot spare, with an SSD read cache), but you can also run containers on them as well.
Not sure what's best to recommend; what I can say is to stay away from GIGABYTE.
I got a BRIX, which gave me nothing but trouble. Its UEFI is very picky with SSD brands; I wasted money on a couple that are now being used as external drives, and in the end it didn't even work with Windows.
It is now collecting dust waiting for the city hall disposal round.
It starts at €599 for a 2(!)-core Celeron. Seems absurd when you can get a Mini for an extra €100 (you can run Linux/Windows in a VM and still get a magnitude or few better perf). Or even a used old NUC or something; you'd need to go back very far to get a crappier CPU...
So the actual starting price seems to be €900-1000 (i.e. if you want an i5..)
The Celeron G6900 has a 46W TDP and seems to be around ~20% slower (multicore) than the <10W N100. Seems absurd that they are pushing garbage like that at such prices (even if it's the base config).
Cirrus7 is expensive because you are paying for a very high quality machined chassis & case that act as a massive fanless heatsink. Those alone are pretty costly. The price cannot be compared with cheap NUC clones and mini PCs, nor with Apple.
I am not endorsing any particular brand, but Cirrus7 is not that expensive within the fanless market and the quality of the entire build is very high. They also sometimes offer nice discounts for students and SMEs. There are quite a few comparable brands and also DIY options with cases from Streacom or Akasa. If you want something cheaper, Minix is pretty inexpensive, especially when you take into consideration they offer a decent fanless enclosure.
The higher-end configs seem fine even if a bit pricey (still, the Mac Mini seems like great value if you're fine with the OS situation and non-upgradable memory).
I still find it weird/confusing why a reasonably high-end brand would be selling configs with such horrible CPUs (especially perf/watt, considering the whole fanless thing).
But I suppose they hardly have any options if they want a socketed motherboard. Laptop chips would probably be a lot better value (both cost- and heat-wise), but then it's no longer modular, and e.g. Lunar Lake doesn't(?) even support non-soldered memory...
That is a good question. They sell those CPUs to industrial clients. Note cases can be configured to be completely sealed for high-dust environments, and it is also possible to get industrial motherboards with connections that no regular user needs. The fanless market has a pretty good niche in factory deployments. There, lots of software is designed to run 24/7 on cheap CPUs.
Mac Minis, and in general most Apple products, tend to offer great value in the lowest configuration. But upgrades are expensive. It is a bit of the opposite situation. I have not used Macs for very long, and I prefer Linux, yet the cheapest Mini looks quite appealing. When Intel Core 2 CPUs entered the laptop market, it was a similar situation. The cheapest MacBooks offered really great value compared to competition.
Install your favorite flavor of Linux then. Beelink devices have a good reputation for being quite happy with a new OS. It's more compatible than the latest Apple devices, that's for certain.
Apple has never put any technical or legal obstacles in the way of installing other operating systems on Mac hardware. Nor do they assist in any way, it's consistent benign neglect.
The old Intel machines made excellent Linux boxes, excepting the TouchBar era because the TouchBar sucked (it was possible to install Linux, it would display the fake function keys, they worked, but not a good experience). I've converted two non-TouchBar Mac laptops into Linux machines, with zero complaints, one of them is in current use (not the laptop I'm typing on this instant however).
Now there's Asahi, which as a sibling comment points out, will surely be supported for M4 eventually. This is a great time to buy the M2 Minis and put Linux on them, if that's what you're into. Or you can wait around for the M4 port, whatever suits your needs.
Yet they made BootCamp.
Do you see how foolish you look trying to defend nonsense?
Apple try to avoid being too heavy handed in the lockdown because they know the outrage it would cause from their legacy customers. Boil the frog slowly.
But they most definitely are trying to make the Mac more like an iPhone and they would rather not you install any other OS on it.
The bootloader not being completely locked is more for legacy reasons and multi-macOS support (dev/debug) but if you have any problem with it, you will need (surprise-surprise), another Mac for a DFU restore, just like an iPhone.
I have a Minisforum mini PC. First thing I did was wipe Windows and put Pop!_OS on it. Super happy with it. That said, getting anyone who isn't used to Linux to use anything other than Windows is as easy as pulling teeth. People go towards what's familiar, even when what's familiar is objectively trash that spies on you.
I don't try to get others to use Linux anymore. "Anyone who isn't used to Linux" can keep doing whatever it is they're already doing. So long as we can use it, I'm happy. I care about Linux usage only as far as it makes it harder for companies to ignore or block us.
> I think this easily beats any sort of desktop PC you can buy at that price (let's exclude custom builds, they're not the same market).
This is squarely in the NUC/SFF/1l-pc territory, and there is plenty of competition here from Beelink and Minisforum.
I just found the Beelink SER7 going for $509, and it has an 8-core/16-thread Ryzen 7 CPU, 32GB DDR4. The 8845 in the beelink is very competitive[1] with M4 (beaten, but not "easily"), and also supports memory upgrades of up to 256GB.
If local LLMs become mainstream then you want as much memory bandwidth as possible. For regular home and office use two channels of DDR4 is more than enough.
It is not more than 60 GB/s for extremely overclocked DDR4-4000, and sometimes much less than 50 GB/s for regular 3200.
DDR5 is reaching 100 GB/s overclocked for Intel, and 50-70 GB/s stock.
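Those peak figures fall straight out of the spec arithmetic: each DDR channel is 64 bits (8 bytes) wide, so theoretical bandwidth is channels × 8 bytes × transfer rate (sustained numbers are lower, as the motherboard/CPU comment below notes):

```python
def peak_gb_per_s(channels, mega_transfers_per_s):
    # Each DDR channel has a 64-bit (8-byte) data bus:
    # GB/s = channels * 8 bytes * MT/s / 1000
    return channels * 8 * mega_transfers_per_s / 1000

print(peak_gb_per_s(2, 3200))  # 51.2  -- dual-channel DDR4-3200
print(peak_gb_per_s(2, 6400))  # 102.4 -- dual-channel DDR5-6400
```

The same formula explains Apple's numbers: they simply run more channels of on-package LPDDR, so the aggregate bus is wider.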
When factoring in motherboard, CPU, etc., then yes. But the max speed is only theoretical, unlike Apple's chips, which actually benchmark at the specified speed.
That's such an Apple fanboy trope.
The bandwidth is shared with the GPU part that actually uses most of it.
You can't starve the CPUs for data in a typical PC with the "standard" DDR5 bandwidth, and discrete GPUs have much higher bandwidth of their own.
It's almost as if PC industry hardware designers are not complete morons.
There's a huge difference there. Those PCs have to be ordered from AliExpress or some other Chinese site, or else from Amazon via third-party resellers that add their own markup on top.
Neither gets you any kind of useful warranty, at least for most people, who are unwilling to deal with overseas companies.
Apple has actual physical stores, and a phone number you can call.
> Those PCs have to be ordered from Aliexpress, or some other Chinese site, or else from Amazon via a third party resellers that adds their own markup on top
I anticipated this concern, the $509 I gave earlier is the Amazon price that includes the mark-up. The Beelink SER7 costs only $320 on AliExpress.
Modern solid-state electronics are very reliable; most reliability issues are related to screens or batteries, which desktop computers lack. There was a bad-capacitor problem over a decade ago, but nothing since. If your risk aversion for a desktop computer is high, you pay the Apple premium (possibly buying AppleCare), or self-insure by buying two SER7s for nearly the same price ($640) as one base M4 Mac Mini and keeping the second as a spare.
If you're ordering them in the context of a larger buying program, like a university or other office, you'd at least get some sort of account rep and Apple support as well. I'm not sure if you could get that from Beelink, could you? I see some benefit in that use case.
But that's aside from the main topic which was the personal and home use case. On that topic you get a decent set of products as well such as Pages/Numbers/etc. and others along with software support for the Mac Mini. I'm guessing the Beelink runs on Linux? That may be hard for some to work with (which is unfortunate since it's really not), or maybe they have to separately buy a Windows license? Something to consider in the comparison.
Long-term, I expect: 1) external storage to become faster and cheaper every year (subject to constraints around interface)
2) more and more digital assets to be cloud-native, e.g. photos stored exclusively on icloud and not on your computer
So I'm less worried about storage than some. If Asahi Linux achieves Proton-like compatibility with games [0], then we're getting closer to the perfect general purpose game console.
With Thunderbolt 5, once external SSD enclosures supporting it exist, there should finally be zero performance penalty for external vs internal storage. Then you can build a 1PB array if you want.
Indeed. Realistically if anything, one should consider the “physical world” hassles with permanent external storage arguably more than performance ones:
• Risk of accidental unplugging.
• Contacts may become wonky over time → see above.
• The need to sacrifice a port (or the portion of one in the case of a dongle).
• Enclosures tend to have annoying IO lights.
• Takes a bit of space.
All of these can be solved, especially when dealing with a desktop that stays in place. Paradoxically, there was never a better time to be modest with internal storage.
Although I will say:
> photos stored exclusively on icloud and not on your computer
Over my dead body :) If there’s one thing I’ll always happily carve out SSD space for, it’s local copies of my photo library!
We can expect different storage solutions by product depending on how fast things need to be. It doesn’t need to be lightning quick to load a frame in a movie, for instance, which is why streaming dominates there.
Yeah exactly my thoughts.
I have been trying all kinds of services since 2010 and it always comes back to that.
Even with very fast fiber, there is no realistic way to get the latency into desirable territory. It makes a large amount of games borderline unplayable and a whole lot extremely annoying.
Basically, the only things that half-work are slow-paced story games and slow strategy games (mostly turn-based), which ironically require few resources most of the time (so why pay for a cloud service?!).
Like most things "cloud" conceptually it is seductive but in practice extremely compromised.
It's a major hassle and if you are going to get a small box to plug in all kinds of other small box around it with a web of cables, you might as well get a bigger box and put it all inside...
This doesn’t seem to be true, and I don’t even get what it would change if it were true. Developers aren’t the target demographic of the base version with low storage.
Docker in macOS (at least the useful one) just runs in a Linux VM, and I don't see why you couldn't run a VM off an image on an external drive. Maybe the UI doesn't let you select that location?
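One commonly suggested workaround is to relocate an app's data directory to the external drive and leave a symlink at the original path. A minimal sketch of that pattern in Python, demonstrated with throwaway temp directories (for Docker Desktop on macOS the data directory is reportedly `~/Library/Containers/com.docker.docker`, but verify the path on your own install before trying this for real):

```python
import os
import pathlib
import shutil
import tempfile

# Illustrative "relocate + symlink" pattern, using temp dirs as stand-ins
# for the real data directory and the external volume.
old = pathlib.Path(tempfile.mkdtemp()) / "data"           # stand-in for the app's data dir
old.mkdir()
(old / "disk.img").write_text("vm image bytes")           # stand-in for the VM disk image

external = pathlib.Path(tempfile.mkdtemp()) / "external"  # stand-in for the external drive
shutil.move(str(old), str(external))                      # move the data out
os.symlink(external, old)                                 # leave a symlink at the old path

# Anything still opening the original path follows the symlink transparently.
print((old / "disk.img").read_text())
```

The app has to be fully quit before moving its data, and symlinked data directories aren't officially supported everywhere, so treat this as an experiment rather than a recommendation.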
Apple offering expensive upgrades for storage and memory pre-dates the existence of iCloud storage by decades. It was entirely standard before MobileMe, or Apple offering any kind of "cloud" services.
Apple just charges a lot of money for upgrades; they did even when it was trivial to do them yourself, and they're not going to change now that they've made internal upgrades impossible.
I’m a mid-30s developer and I use a Mac mini for all my hobby development. I’m planning to get an M4 mini to replace my current M1 mini. I like hooking up my own monitor and peripherals - I don’t like working hunched over a small screen and cramped keyboard. Plus, an M4 Mac mini with 32GB RAM is only $999; the most closely specced MacBook Air (an M3 with 24GB RAM) is $1299, and the M4 MacBook Pro with 32GB RAM is $1999. So on your last point about cost: why should I throw away an extra $1000 for no reason?
IMHO it isn't, as NUC-style mini PCs with x86-64 CPUs from AMD and Intel are really cheap, and the 256GB storage is way too small, making the "real" price $200 higher for any sort of moderate usage.
> I think this easily beats any sort of desktop PC you can buy at that price
Not really. Do a quick search for cheap mini PCs from brands such as Minisforum or Beelink. Years ago they were selling Ryzen 5 and Intel i5 machines with 16GB of RAM for around $300. No "educational software" bullshit either, just straight from Amazon to anyone who bothered to click a button.
Then you have to factor in supporting those systems, because you will be the one they call. This is one of the major upsides to family & friends buying Macs.
Why not? As long as you discount the price of the new product by the perceived value of new vs used, that’s the correct comparison to make. If a used product is the same quality and $100 cheaper, and just having something that’s new is not worth $100 to you, you should pick the used option. The goal is to get the best value per money spent.
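The decision rule described above can be made explicit; a toy sketch in Python (the dollar figures and the "newness premium" are illustrative, not from the thread):

```python
def better_buy(new_price, used_price, newness_value):
    """Return "new" or "used", whichever costs less once the perceived
    value of getting a brand-new unit is subtracted from the new price."""
    effective_new_price = new_price - newness_value
    return "new" if effective_new_price <= used_price else "used"

# Used is $100 cheaper; having a new unit is only worth $60 to this buyer.
print(better_buy(new_price=600, used_price=500, newness_value=60))  # -> used
```

The whole argument hinges on how you price `newness_value`: if warranty and resale considerations make it exceed the discount, the rule picks the new unit.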
mental gymnastics; used is not new. even new isn't necessarily new when it's not sold by an authorized seller because it can invalidate the warranty.
new is new and has legal ramifications, you cannot compare them unless you're throwing in a trustworthy extended warranty that matches -- and pretty much nothing matches apple in that regard
The same authors publish regular research reviews on nutrition and health at https://massresearchreview.com/; the general conclusion remains that exercise is a poor intervention for the purpose of weight loss. But estimating the effects of exercise is incredibly difficult; it varies wildly and is subject to many confounders.