A few things I noticed, as I'm seeing the variety of SKUs becoming more complex.
- Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.
- Just like the low-spec M3 14" has one fewer Thunderbolt port, it also doesn't officially support Thunderbolt 4 (like M1/M2 before it)
- The M3 Pro loses the option for an 8TB SSD. Likely because it was a low volume part for that spec.
- The M3 Pro actually has more E-cores than the Max (6 vs 4). Interesting to see them take this away on a higher-specced part; seems like Intel wouldn't do this
I'm wondering now if Apple tracks this sort of stuff in their released machines. I know my iPad always asks me if I want to send stats to Apple (and I always say no). So let's say that enough people do; then do they have a good idea of how often all the performance cores are used? Max memory B/W consumed? Stuff like that. Back when I was at Intel there were always interesting tradeoffs between available silicon/thermal/margin resources and what made it into the chip. Of course Intel didn't have any way (at that time) to collect statistics, so it was always "... but I think we should ...", not a lot of data.
Apple shares anonymous usage data by default on all their operating systems and users are asked again on every major update.
Given that there have never been any public incidents about it, and given what we know about similar defaults, I would be surprised if Apple is getting less than a 95% opt-in rate.
But I suspect at high-end they only really care about the performance of a few dozen professional apps e.g. Logic or Final Cut. And at the low-end it's likely just efficiency.
> Given that there have never been any public incidents about it, and given what we know about similar defaults, I would be surprised if Apple is getting less than a 95% opt-in rate.
A 95% opt-in rate is INSANELY high for any type of usage-stat opt-in, everything above 50% is usually outstanding.
Apple enjoys a level of consumer trust that essentially no other technology business has, and almost no other business at all. Whether that's justified or not is a matter of opinion.
It honestly doesn’t matter. We’re talking about hundreds of millions of devices sending data in either case. A hundred million more provides no additional value.
Major updates are infrequent, maybe once a year if you always update, so it's not pestering you. And the UI makes it very easy to skip, unlike some designs.
It’s a step in a setup wizard. Whilst it’s explicitly asked, and far from dark pattern territory, it’s designed in such a way that I wouldn’t be surprised by a 95% opt-in rate.
But it's not quite opt-in or opt-out in this case. The user is required to opt for something. Apple literally has 100% attention, because otherwise the user can't move past the screen.
I was actually more genuinely interested to learn about the "similar defaults" mentioned in the OP; the 95% comment was just a side note about a huge overestimation of how easily consent is achieved.
> But it's not quite opt-in or opt-out in this case. The user is required to opt for something. Apple literally has 100% attention, because otherwise the user can't move past the screen.
Thing is, you don't even have 100% of the users' attention in this case. The user wants to use the device, you're just standing in the way.
The scenario is this: you force the user to make a decision between option A and B. Regardless of the decision, they achieve their immediately desired outcome (moving to the next screen / using the device).
Getting 95% to pick 'A' would require some quite aggressive dark patterns, to the point that option 'B' would need to be almost invisible and actively discouraged.
Even if the UI were a pre-checked checkbox and the user just had to select "Next" to continue (= opt-out), your consent rate would not hit 95%. As mentioned, anything beyond 50% is already outstanding.
Or, let's rephrase: if Apple had a 95% opt-in rate, they wouldn't bother chasing consent again on every software update.
Expect something in the ballpark of 20-25%, and that already assumes that Apple's above-average brand reputation translates into above-average consent on data sharing with them.
To add to this, it's not like a mailing list, either. Marketing opt-in is lower because it's annoying. A lot of people don't want emails.
Anonymized stats from your machine? Most normal people (who don't use computers like we do) do not care and just click the most affirmative option so that they can move forward.
This is a deeply misguided opinion about 'normal' people. To normal people, 'anonymous' is a lie.
My dad can't tell Windows and Linux apart, but he makes sure to uncheck any kind of data collection or tracking, and clicks 'reject all' on every cookie warning.
Yeah, I don't think allowing telemetry etc is really a matter of technical literacy, and is more a matter of social trust. High-trusting people will see no problem, low-trusting people will say "no way!". I'd imagine this varies widely but with definite trends based on social/economic class.
Normal people don't even give a second of thought to this. My partner knows the difference between windows and Mac, and is perfectly content to browse the internet without an ad blocker and to read in between all the cookie dialogs. The only time she clicks on one is when it's required to proceed, and she'll click whichever one is the most obvious button.
I think that was kind of the OP's point. "Pro" users are significantly more likely to opt out in this scenario (unless they aren't really Pro users and just want the Pro machine for conspicuous consumption), which makes for a much more dramatic skew in the usage data that is collected.
It's not exactly 'opt-out', they ask you on first boot or after major upgrades, and you either select "Share with Apple" or "Don't Share with Apple". It's just that the "Share" button is coloured blue so looks more default since it's more prominent (at least on iOS, I think it's basically the same on macOS).
It's not like it's enabled by default and you have to know to go and find the setting to turn it off or anything.
It’s opt-out, but it’s not enabled silently. It’s a pre-ticked checkbox on a screen you have to review when you first setup the machine (and when you do a major version OS upgrade).
IMO that’s quite different to something that’s just silently on by default, and requires you to set an environment variable or run a command to opt out.
On a phone there is no box at all. It's two options to select. The opt-in is highlighted, but there is no "next" button -- you have to select an option.
It’s not really opt-out or opt-in: it’s an explicit, informed choice you have to make when you start up your Mac after first purchase or major upgrade.
Not the OP, but I am not watching a random YouTube video from a random guy to help you prove your point. I could confidently link you videos that "prove" the Earth is flat.
I have no idea what information they’re collecting on me, and it seems very few people do (given that nobody was able to answer the above question).
Could be “how much CPU does this user use?” but could also be “when prompted with a notification that a user’s iCloud backup storage is low, how long did they hesitate on the prompt before dismissing? How can we increase their odds of upgrading?”
Also, my willingness to provide information does not correlate to how much I “like” a company’s products. If I buy a sandwich from a deli, and they ask for my email for their newsletter or something, I won’t give it. That doesn’t mean I don’t like their company or their sandwich. Could be the best sandwich in the world, they don’t need my email.
Sure, that's the title, but at least in this PR they immediately show a graph with a comparison to both.
Presumably it makes more marketing sense to compare to the M1 family up front because most people that bought an M2 last year are probably not going to be upgrading to M3. They are speaking to the people most likely to upgrade.
fwiw, i can't remember the last time i saw a company go back more than a generation in their own comparisons. what Apple isn't saying here says as much as what they are. M2->M3 may not be a compelling upgrade story.
The vast majority of Mac users go years between upgrades. For any other vendor it might seem weird to show several comparisons going back multiple generations (M1 and x86), but for the macOS ecosystem it makes perfect sense since only a very tiny slice of M2 users will be upgrading.
Windows has distinct groups: the people who buy whatever costs $700 at Costco every 10 years (or when it breaks) don't care, but there's also a vocal enthusiast community who do upgrade frequently. That group gets more attention since it's a profitable niche and gaming generates a lot of revenue.
I used to buy a $700 Windows laptop every 18 months in the 2000s. Then I got fed up with them just falling apart and switched to MacBooks. My 2013 purchase is still alive and being used by the kids.
In the 2000s, I went through a wide variety of PC laptops (Lenovo, Toshiba, Dell, Alienware, Sony, etc.) all within the range of $1200-$6500 and they all died within 3 years (except for the cheapest one which was a Lenovo with Linux). Some died within a year.
When my first Macbook lasted for more than 3 or 4 years I was surprised that I was upgrading before it died. I went through many upgrades with almost zero issues (one HDD failure, one battery failure). I still have a 2012 Macbook Pro that I've since installed Linux on.
When I bought the first touchbar Macbook (late 2015?) I spent around $6k maxing out the options, and I was surprised at how totally trash it was. Hardware QC issues were shocking: particles under the screen from manufacturing, keys stuck within the first hour of usage, external monitor issues, touchbar issues...
Did you buy a macbook for $700? That was a pretty low price back then which meant you were buying devices made to a price. Buying a Macbook is one solution, another would have been to spend more money on a higher quality Wintel system.
Right, so when you spend twice as much you wind up with a better device. I think this might be only tangentially related to the fact that it was an Apple product; rather, you weren't purchasing the cheapest available device.
Ten years ago Apple was by far the highest-quality laptop manufacturer. There was essentially no other option back in the early 2010s. Even now, laptops with a "retina"-class display are not always easy to find from other manufacturers. In retrospect, that was probably the killer feature which induced me to switch.
Yeah, the quality of PC laptops has improved but that really just means you can get closer to equivalent quality at equivalent pricing. I've heard people claim to have saved a ton but every single time I used one there was some noticeable quality decrease, which I find kind of refreshing as a reminder that the market does actually work pretty well.
Windows users buy whatever, from so many brands, that it doesn't matter how often they upgrade; they're likely not upgrading from the same vendor anyway (so a comparison to that vendor's older generations wouldn't be meaningful in the first place).
> and what makes you think windows users update their devices every single generation?
They don't, but the difference is that Windows users generally don't know or care about processor generations. In contrast, it's common for Mac users to know they have an "old" Intel-based Pro, an M1 Air, etc., and to use that knowledge to help determine when it might be time to upgrade.
You can test this by asking Windows users what CPU they have. For the few who know and who have an Intel CPU, you can ask what their Brand Modifier¹ (i3/i5/i7) is. If they know that, you can ask about the 4- or 5-digit number following the Brand Modifier — the leading one or two digits are the Generation Indicator¹. I'd be surprised if more than 0.01% of Windows users know this.
Intel's CPU naming strategy used to drive me nuts when trying to talk to anyone at work who knew "just enough to be dangerous." Why is X so slow on this machine, it's got a [6-year-old, dual-core] i5! It runs fine on my laptop and that's only a [1-year-old, quad-core] i3!
> it's common for Mac users to know they have an "old" Intel-based Pro, an M1 Air, etc., and to use that knowledge to help determine when it might be time to upgrade.
Not at all. I've worked with FANG developers with brand new M1 MBPs that had no idea what 'm1' meant until something broke.
How are they scummy? The M3 vs. M2 performance improvements they showed looked pretty modest.
My interpretation while watching the event is that this is a company persuading x86 holdouts to upgrade to Apple Silicon, and maybe some M1 users as well.
It’s absolutely not, and that’s fine. The video has statements that the machines are made to “last for years” and that they want to save natural resources by making long-lasting machines.
I’m currently at 4 to 5 years on laptops and 3 to 4 years on phones, and even then I hand them over to kids/friends/family who get a bit more use out of them.
Huh. So they used to do this, but looking at the M series chips it seems like the architecture assumes the CPU-GPU-RAM are all on the same package and hooked into each other, which enables zero copy. Someone better versed in hardware could explain whether this is even possible.
Expandable internal storage would be nice, yeah. But I get the sealed, very tightly packed chassis they’re going for.
> get the sealed, very tightly packed chassis they’re going for
The Dell XPS 17 is only 0.1 inch thicker yet has fully replaceable RAM and two (!) M.2 slots. I’m pretty sure what Apple is going for is maximizing profit margins over anything else.
I have an XPS 15. And while I liked that I could bring my own SSD and RAM, the build quality is nowhere near a Macbook Pro... like not even in the same galaxy. I had to have it serviced multiple times within the first few weeks. It had to be sent to Texas, and when it returned, one WiFi antenna wouldn't plug into the card, and the light on the front was permanently broken. I could have demanded Dell fix it - and I'd have been even more weeks without my main work laptop. So, by pure numbers/specs? Sure. By real world quality, no way would I favor Dell.
The issue is often comparing apples (heh) to oranges.
I understand the desire for slotted RAM, but the major limiting factor for nearly 10 years was CPU support for more than 16GB of RAM. I had 16GB of RAM in 2011, and it was only in 2019 that Intel's 9th-gen laptop CPUs started supporting more.
The Dell XPS 17 itself has so many issues that if it were a Macbook people would be up in arms, including not having reliable suspend and memory issues causing BSODs. The reliability of these devices, at least when it comes to memory, might actually be worse and cause a shorter lifespan than if the RAM had been soldered.
Of course it always feels good to buy an underspecced machine and upgrade it a year later, which is what we're trading off.
But it's interesting that we don't seem to have taken issue with BGA CPU mounts in laptops but we did for memory. I think this might be because Apple was one of the first to do it - and we feel a certain way when Apple limits us but not when other companies do.
There’s a lot of flat-out wrong information in this post. For one, even the low-power (U-series) Intel laptop CPUs have supported 32GB+ of memory since at least the 6th generation[1]. Many machines based on these CPUs unofficially support more than that. I have a Thinkpad with an i7-8550u and 64GB of DDR4, and it runs great.
On top of that, the higher-power laptop SKUs have supported 64GB or more since that time as well.
Secondly, it’s silly to claim that having RAM slots somehow makes a computer inherently more unstable. Typically these types of issues are the result of the manufacturer of the machine having bugs in the BIOS/EFI implementation, which are exacerbated by certain brands/types of memory. If you don’t want to mess around with figuring that stuff out, most manufacturers publish a list of officially-tested RAM modules which are not always the cheapest in absolute terms, but are always night-and-day cheaper than Apple’s ridiculous memory pricing.
Sorry, you're entirely mistaken; at the time there was no business laptop that you could reasonably buy with more than 16GB of RAM. I know because I had to buy a high-end workstation laptop (Dell Precision 5520 FWIW) because no other laptop supported more than 16GB of RAM in a thin chassis.
No Dell Latitude, EliteBook, ThinkPad X/T-series, or even Fujitsu Lifebook shipped a CPU that permitted more than 16GiB of memory.
I know this because it was something I was looking at intently at the time and was very happy when the restrictions were lifted for commercially viable laptop SKUs.
Citing that something exists presupposes availability and practicality. No sane person is going to be rocking into the room with a Precision 7520 and calling it portable. The thing could be used as a weapon and not much else if you had no power source for more than 2hrs.
Also, socketed anything definitely reduces mechanical reliability. I ship desktop PCs internationally pretty often, and the movement of shipping unseats components quite easily even with good packing.
That might read as if I'm against socketed components; I'm not. But don't pretend there are no downsides and infinite upgradability as the upside; that's disingenuous. In my experience there are some minor reliability issues (the XPS 17 being an exceptional case, one I used to illustrate that we sometimes cherry-pick what one manufacturer does and assume there were no trade-offs to get there), and some limitations on the hardware side that cap your upgrade potential even when nothing is soldered.
This is not a behemoth of a laptop; I'm writing this on a T480 right now, which supports 32GB officially and 64GB unofficially, and it weighs 4lbs with the high-capacity battery (the same as the T470).
I can't tell if you're trolling or what, but if you're serious, you clearly didn't look hard enough.
Edit: since you mentioned Latitudes, Elitebooks, and Fujitsu lifebooks:
- For Lifebooks, I couldn't find an older one that supported 32GB, but this U937 uses 7th gen CPUs, and has 4GB soldered and one DIMM slot which supports up to 16GB. This is a total of 20GB, again, breaking the 16GB barrier: https://www.fujitsu.com/tw/Images/ds-LIFEBOOK%20U937.pdf
I believe these are all 14"-class laptops that weigh under 4 pounds.
One more thought: you might be getting confused here with the LPDDR3 limitation, which was a legit thing that existed until the timeframe you're thinking of.
Any laptop which used LPDDR3 (soldered) typically maxed out at 16GB, but as far as I'm aware, this was due to capacity limitations of the RAM chips, not anything to do with the CPUs. For example, the Lenovo X1 Carbon had a 16GB upper limit for a while due to this. I believe the 15" MacBook Pro had the same limitation until moving to DDR4. But this is entirely the result of a design decision on the part of the laptop manufacturer, not the CPU, and as I've shown there were plenty of laptops out there in the ~2014-2016 timeframe which supported 32GB or more.
DDR4 support was introduced with the 6th gen Core (except Core m) in 2015; LPDDR4 support didn't show up until (half of) the 10th gen lineup in 2019. It's just another aspect of their post-Skylake disaster, wherein they kept shipping the same stuff under new names for years on end before finally getting 10nm usable enough for some laptop processors, then a few years later getting it working well enough for desktop processors. In the meantime, they spent years not even trying to design a new memory PHY for the 14nm process that actually worked.
Yeah, this link is helpful, but IMHO doesn’t actually call out the specific problem I was referring to, which is that only laptops that used LPDDR3 had the 16GB limitation. If the laptop used regular DDR3, or DDR4, it could handle 32/64GB. The table lumps everything together per processor model/generation.
They haven't made slotted RAM or storage on their MacBooks since 2012 (the retina MacBooks removed the slotted RAM afaik). It might save on thickness, but I'm not buying the slim-chassis argument being the only reason, since they happily made their devices thicker for the M series CPUs.
It's not soldered to the logic board. It used to be, but ever since the M1 the RAM sits on the CPU package itself (on-package, though not actually part of the die).
Needless to say it has batshit insane implications for memory bandwidth.
I've got an M1, and the load time for apps is absolutely fucking insane by comparison to my iMac; there's at least one AAA game whose loading time dropped from about 5 minutes on my quad-core intel, to 5 seconds on my mac studio.
There's just a shitload of text-processing and compiling going on any time a large game gets launched. It's been incredibly good for compiling C++ and Node apps, as well.
I have no excuse for flash, but memory can't really be slotted anymore since SODIMM is crap. High hopes for CAMM making its way into every other machine in 2024!
Given that there is a legally mandated 2-year warranty period at least in Europe, I would be surprised if any laptops weren’t made to “last for years”.
The problem with Apple, however, is that their hardware will long outlive their software support. So if they really want to save natural resources by making long-lasting machines, they should put much more effort into sustained software support.
Is it a problem, though? The vast majority of people skip generations, and for them the relevant reference point is what they have, which is going to be hardware from a couple of generations ago. M2 -> M3 does not have to be compelling: the people with M2 devices are a tiny fraction of the market anyway.
I find it interesting how people respond to this. On one side, it’s marketing so it should be taken critically. OTOH, if they stress the improvements over the last generation, people say they create artificial demand and things about sheeple; if they compare to generations before people say that it’s deceptive and that they lost their edge. It seems that some vocal people are going to complain regardless.
Given how strongly they emphasised the performance over the Intel base - who have now had their machines for 4 years, are likely to replace them soon, and may be wondering whether to stay with Apple or switch over to PCs - it is pretty obvious that they also want to target that demographic specifically.
> The 12-core CPU design has six performance cores and six efficiency cores, offering single-threaded performance that is up to 30 percent faster than M1 Pro.
Depends who they are marketing to I think is the point. If the biggest group of potential buyers are not M2 users, then it makes sense not to market to them directly with these stats.
I've got an M1 Max 64GB and I'm not even tempted to upgrade yet, maybe they'll still be comparing to M1 when the M10 comes out though.
The devil tends to be in the details. More precisely, in the benchmark details. I think Apple provided none other than the marketing blurb. In the meantime, embarrassingly parallel applications do benefit from having more performant cores.
Heh, I recall seeing many posts arguing against benchmarks when all Macs equipped with an M2/8GB/256GB SSD scored much, much lower than the M1/8GB/256GB SSD. People said the synthetic benchmarks were not representative of real world use and you'd never notice the difference. 'Twas a battle of the optimists, pessimists, and realists. In reality, 'twas just Apple cutting costs in their newer product.
oh absolutely, I can't wait to see the benchmarks. Per the (non-numerical data) benchmarks in the video tho - it is faster. So... until other evidence presents itself, that's what we have to go on.
> Has the CPU industry really managed to pull off its attempt at a bs coup that more cores always === better?
I thought this at first, then I realized the cost-performance benefit gained from adding more cores often outweighs just improving the performance of single cores. Even in gaming. I think this is what led AMD to create their Ryzen 9 line of CPUs with 12 cores in 2019.
That being said, I abhor the deceptive marketing which says 50% more performance when in reality it's at most 50% more performance on perfectly parallel tasks, which is not the general performance that the consumer expects.
M2 Pro was about 20-25% faster than M1 Pro, M3 Pro quotes a similar number. It has faster cores but a weaker distribution of them. Seems like a wash, but we'll see exactly how close when benchmarks are out.
That's only rendering speed, and M3 Max vs M1 Max (not Pro). M3 Pro is only 30 percent faster:
> The 12-core CPU design has six performance cores and six efficiency cores, offering single-threaded performance that is up to 30 percent faster than M1 Pro.
Let me re-write your post with the opposite view. Both are unconvincing.
<<
Depends. Is it faster? Then it's an upgrade.
Has the CPU industry really managed to pull off its attempt at a bs coup that more MHz always === better?
I thought we'd learned our lesson with the silly cores Myth already?
>>
Yes, but the number of cores in similar CPUs does provide a good comparison. For example, with the base M2 Pro at 6 P-cores and the base M3 Pro at 5 P-cores, one would want ~20% faster cores to compensate for the missing core in parallel-processing scenarios that scale well. I don't think the M3 brings that. I'm waiting to see tests to understand what the new M3s are better at (probably battery life).
That makes less sense because the MHz marketing came before the core count marketing.
I agree with GP that we should rely on real measures like "is it faster", but maybe the goal of exchanging performance cores for efficiency was to decrease power consumption, not be faster?
You're not considering the difference in performance between the P and E cores. The math should be something more like:
M2 Pro = 8*3 + 4 = 28 (the *3 representing that the performance cores contribute ~3x more to total system performance than the efficiency cores)
M3 Pro = 6*3*1.15 + 6*1.3 ≈ 28.5 (Apple claims 15% more performance for the P cores, not 20%)
> They do say the system overall is up to 65% faster, and has lower power consumption at the same performance level.
They don't claim either of those things. They claim the performance is 20% faster than the M1 Pro. Interestingly, they made that exact same claim when they announced the M2 Pro.
Energy efficiency might be better, but I'm skeptical till I see tests. I suspect at least some of the performance gains on the P and E cores are driven by running at higher clock rates, less efficiently. That may end up mattering more for total energy consumption than the change in the mix of P/E cores. To put it another way: they have more E cores, but the new E cores may be less efficient due to higher clock speeds, so total energy efficiency could go down. We'll just have to wait and see, but given that Apple isn't claiming an increase in battery life for the M3 Pro products compared to their M2 Pro counterparts, I don't think we should expect an improvement.
If you wanted to be even more accurate, you'd also have to take into account that most tasks are executed on the E cores, so having more of those, or faster, will have a much greater impact than any improvement on the P cores. It's impossible to estimate the impact like this - which is why Apple's performance claims[1] are based on real-world tests using common software for different workloads.
In summary, there is supposedly improvement in all areas, so the reduced P-core count doesn't seem to be the downgrade the OP suggested.
E cores are ~30% faster and P about 15%. So the question would be how much the Es assist when Ps are maxed on each chip. In any other situation, more/better E cores should outperform and extend battery. I’m not saying that means you should want to spend the money.
I love Apple's E cores. It just sucks that the M3 Pro gains so few of them given the reduction in P cores.
Apple's E cores take up ~1/4 the die space of a P core. If the M3 Pro lost 2 performance cores but gained 4-8 efficiency cores it'd be a much more reasonable trade.
Depends on what you consider an upgrade. As M3 cores perform better than M2 cores, I expect the M3 configuration to perform similar to the M2 one, even though it trades performance cores for efficiency cores. Apple apparently believes that its users value improved efficiency for longer lasting battery more than further improved performance.
Functionally, how does this impact observed performance on heavy loads like code compilation or video manipulation? I suspect it's not much, and these are the low/mid-tier priced machines we are talking about.
If you bought a $2k M2 machine and traded it for a $2k M3 machine, you may gain better battery life with no concessions, except for benchmark measurements (that don't affect your daily work).
We all know what is meant by “low/mid-tier”. This is pointless pedantry. Next someone is going to come by with the throwaway comment complaint about how OpenAI isn’t “open”.
Different people have different needs. I certainly need a MacBook Pro for my work, but I use next to no storage. I’ve never purchased beyond the minimum storage for an Apple computer. I did however up the processor on my current MacBook Pro.
Minimum 8GB RAM is more universally egregious but I’m not going to sit here and justify my own exception whilst discounting the possibility that 8GB works for others.
To be fair, while 8GB is annoying, I bought the M1 MacBook Air when it came out and it's remarkably resilient. I've only had it freeze a few times due to too little RAM.
I've also been using many different programs. I just have to be a tad mindful about closing tabs (especially Google tabs) and programs.
This makes going Mac Mini M2 Pro over iMac M3 feel real compelling. The respective prices of these models are in fact the same, so if you happen to have a good monitor already... (also the iMac M3 curiously doesn’t even have a Pro option.)
> Just like the low-spec M3 14" has one fewer Thunderbolt port, it also doesn't officially support Thunderbolt 4 (like M1/M2 before it)
I believe this is due to the TB4 spec requiring support for two external displays on a single port. The base spec M series SoCs only support one external display.
I’d expect the ports to work identically to a TB4 port in all other aspects.
I really, really wish they would fix this silly display scan-out limitation. I gave them a (frustrating) pass on the M1 given it was the first evolution from iPhone/iPad where it wouldn't have mattered. But seems silly to have made it all the way to the M3. Wanting to dock onto two displays even at the low end doesn't seem like such a niche use-case.
I'm sure there is some kind of technical explanation, but both Intel and NVIDIA seemed to manage 3+ scanouts even on low-end parts for a long time.
Is this an actual hardware issue though? One issue is that macOS has never supported DisplayPort MST (Multi-Stream Transport), EVER, as far as I can tell. MST allows multiple display streams to be natively sent over a single connection for docks or daisy-chaining monitors. Back on Intel Macs, if you had a dock with 2 displays or daisy-chained 2 together, you would get mirrored displays. With the exact same Mac and displays in Boot Camp, MST would work perfectly. 1x display per Thunderbolt 4 port is the worst!
You can't do it with a base-model M chip. It's not supported on a Mac unless you go with DisplayLink, and DisplayLink has weird issues on macOS, like no HDCP support and screen recording having to be enabled, that make it a really bad experience compared to native output.
Not with NVIDIA, no; it's always been 4 displays. The NVS 810 8-display card uses two GM107 GPUs.
AMD is 6 displays. You see this rarely on consumer boards, but the ASRock 5700 XT Taichi for some inexplicable reason did expose all six - with four DisplayPorts to boot. I don't think there have been four-DisplayPort or six-output consumer cards since.
Even with fewer ports you can use DisplayPort MST hubs to break out 3 displays from one. (But not on a Mac, even Intel ones; they never added driver support. It works in Windows via Boot Camp though.)
There are a couple of 900-, 10-, 20-, and 30-series NVIDIA cards with 5 outputs. The 700-series and below had up to 4. IIUC it's more like an (x px, y px) maximum with up to N independent clocks without external adapters, or something along those lines.
I was doing 5 for no reason from a GTX 970 at one point. They just work. But for some reason (segmentation?) NVIDIA brochure pages sometimes disagree with or contradict products on the market.
M1/M2 only has 1 native HDMI pixel pipe in any form, I think? Apple uses the HDMI PHY to drive the screen on tablets, and the screen on laptops. Base-tier M1/M2 also only have a single displayport pixel pipe, and Pro/Max get +1/+2 respectively.
The base-tier chips are designed as high-volume tablet chips first and foremost, with ultramobility crossover capability.
Using DisplayLink or certain kinds of Thunderbolt multi-monitor setups is possible by running outside the pixel pipe, or by running multiple monitors on a single pixel pipe (this is not MST, which is multiple pixel pipes over a single stream). But yeah, it's ugly, especially on a base-tier processor, with this eating cycles and dumping heat. You're running the hardware encoder at least.
Discord had this weird error if you tried to enable screen/audio capture: it tries to launch something, fails, and the solution is to manually install "Airfoil", because it's an audio-capture module that Discord licensed. You don't have to fully install it, but the audio driver is the part Discord uses, and that installs first (it has to be allowed as a kext, i.e. non-secure mode). Theoretically a kernel-level capture like that could be a ton faster than userland; I think that's the on-label use of Airfoil.
>I'm sure there is some kind of technical explanation
I'm sure it's a marketing explanation: they make bigger margins on more expensive machines, and they need some feature differentiators to nudge people to move up. 2 extra displays is a poweruser/pro feature.
They make their own silicon; it's not like they're shy about designing hardware. If they wanted to stuff features into the lower-end MacBooks they easily could.
> Wanting to dock onto two displays even at the low end doesn't seem like such a niche use-case.
I mean, it almost certainly is? I would guess a majority of the low-end SKUs are rarely if ever attached to even one external display. Two would be rarer still.
At a ~recent work place the entire floor of developers had (Intel) MacBook Pros with dual 24" monitors.
Some of them were trying out Apple Silicon replacements, though I'm not aware of what they did monitor wise. Probably used the excuse to buy a single large ultrawide each instead, though I don't actually know. ;)
Is a $1,599 laptop a low-end laptop? An M3 Macbook Pro 14" that costs $1,599 can only drive a single external monitor according to the spec. A $1,000 Dell XPS 13 can drive 4 monitors via a single Thunderbolt 4 dock that also charges the laptop!
Honestly, I'm an accountant and everyone in my office uses 2-3 monitors with a $1,200 business ultrabook.
I'm not sure how anybody can compare Intel and Apple. Apple is a vertically integrated system that has a CPU component, with a proven track record of making the right decisions. Intel is a CPU vendor with shrinking market share. As I pointed out, this use case is probably not that important because it represents a very small user segment.
We’re still talking the low end of this product line. If you’re buying two monitors for your employees, I’m not sure you’re skimping on the cost between an M3 and an M3 Pro.
you're saying they're low-end because Intel? if you've got your macbook connected to two monitors, you're not very concerned about battery performance.
So isn't Intel silicon competitive speedwise? I thought the M[0-4]s were OK but sort of hypey as to being better in all regards.
Not a chance. Moving from an Intel MacBook Pro to an Apple Silicon MacBook Pro was absolutely revolutionary for me and my very pedestrian ‘interpreted language in Docker web developer’ workloads.
I’d seriously consider not taking a job if they were still on Intel MacBooks. I appreciate that an arch switch isn’t a piece of cake for many many workloads, and it isn’t just a sign of employers cheaping out. But for me it’s just been such a significant improvement.
Where are you getting that impression from the parent post? Maybe they were on a 2, 3, or 4 year upgrade cycle and still had a bunch of Intel MBPs when Apple Silicon hit the market. That'd be extremely typical.
What dev shop immediately buys all developers the newest and shiniest thing as soon as it's released without trialing it first?
We stuck with Intel MBPs for awhile because people needed machines, but the scientific computing infrastructure for Apple silicon took more than a little bit to get going.
Apple does not compete on checkboxes. If they deemed it necessary to remove something, there’s a reason. Not saying I agree, just that’s how they operate. If there isn’t a need to support 3 displays then they won’t, regardless of whether the “competition” did it years prior.
The longer battery life is genuinely useful to a wide range of people in a way that being able to connect 38 external monitors is not.
I recently went on a 5-day trip and forgot to bring the charger for my M2. The first day I thought I'd have to rush around and find one. By the fourth day I still had 8% and then finally realized I could charge it via USB-C instead of magsafe.
Well the number with two screens would be zero, because you can't do it. That doesn't mean people don't want to do it because 0% of the laptops do it. They're just unable to.
The CEO is a supply chain guy. They've been optimizing their profit margins ruthlessly since he took the helm. I don't think any savings are too small, particularly if comparatively few users are affected and it motivates upselling.
I think it's weird though how far people go to defend Apple. It's objectively making (some) users worse off. Apple clearly doesn't care and the people defending them also clearly don't. But the affected users do care and "but money" isn't really a good excuse for them. It also doesn't solve their problem of not being able to use two external monitors anymore without spending significantly more money.
For the past 3 years, including with the latest laptops, "better chip" has meant a 14" M* Pro starting at $1,999; the $1,299 M1/M2 Air or $1,599 Macbook Pro does not support dual external displays. Meanwhile you can find support for dual external displays on $600 Windows laptops, and on Intel Macbooks since at least 2012. By any standard this is an embarrassment and a regression.
I mean they are physical things and you can look at how big they are. But sure the rest of how that factors into cost and sales is harder to figure out, yes.
Is this a change to the spec, or did they skirt around that previously, because I didn't think they supported more than one screen per port on the M1/2?
I'm running an M1 Max with two Thunderbolt docks, and each drives 2 4k displays, runs great, although it's kinda overkill. But it does require the docks; you can't connect directly.
Wouldn’t do what? Intel has more E-cores than P-cores on most of their range, especially on the higher end: e.g. on Raptor Lake-S the i9s all have 16 E and 8 P, the i7s have 8:8, and only the lower end of the i5s (below the 13500) have more P than E cores. And the i3s have no E cores.
The story is somewhat similar on mobile (H and HX), a minority of SKUs have more P than E, and none of them in P and U.
In fact that was one of the things which surprised me when Intel started releasing its hybrid (P+E) designs: they seemed to bank heavily on E cores, when mobile and Apple had mostly been 1:1 or biased towards P cores.
Although that’s not quite true either, e.g. on Raptor Lake-H the upper i5 (13600H) has 8 E-cores while the low-range i7 (13620H) has 4, but the i7 has 6 P-cores versus 4. The base frequencies also get lower as you move from i5 to i7. And you get fewer GPU EUs (80 -> 64).
The SKUs are becoming more complex probably because Apple is learning why Intel/AMD have so many SKUs. Making complex chips at scale results in a range of less-than-ideal chips. This drives the need to segment and bin chips into different SKUs to reduce losses, rather than trying to sell one SKU and throwing away the anomalies.
The new M3 14" MBP seems like the odd one out - why does it even exist? Why not just refresh the MBA instead?
An obvious rule of thumb is for "Pro"-branded laptops to only use "Pro"-branded chips. It's what they follow for the iPhone lineup, but I suppose iPad Pros also use non-Pro chips. Just seems like a very confusing SKU to create, and definitely something Steve wouldn't approve of.
It replaces the 13 inch macbook pro with m2. Apple always has a “pro” macbook at that price point and it is one of the better selling macbooks, because not all “pro” users have a need for cpu grunt. A lawyer, for example, probably wants a “pro” class of hardware but doesn’t need more than an 8 gb m1. You could argue they should get a macbook air, but this 14 inch macbook pro is effectively that but with a better screen and more ports, which is exactly what that kind of buyer needs.
I personally struggle with the 14". It feels too small to be productive on, at least for coding. Anyone else experience this?
And yet, the MBA's screen in comparison is serviceable and nice, but nothing outstanding. That's the case for choosing the MBP 14 (when the 16 is just too large and bulky).
I find it to be the perfect size actually. Easily in a backpack and is light, and can use it on the couch, etc. comfortably. I’d never buy a 16” laptop.
The old 15” was like the perfect dimensions. It practically had the footprint of the present 14”, maybe even smaller. Apple made a big deal about how their new chips run so cool, yet they made the pro laptops as fat as they were in 2012 again so clearly thermals were an issue.
Aren't the new 16" laptops the same dimensions as the old 15" ones? I thought the 16" display was simply because they were able to shrink the bezels on the display enough to get an extra inch on the diagonal. Other than the rounding on the edges, my M2 16" Pro feels about the same size as my old Intel 15" one.
I feel there is an obvious appeal to the MacBook Pro 14"/16" with M3. It has a good display, lots of battery life, and plenty of performance.
I'm more confused about the "M3 Pro" variant. Its performance either seems to be overkill or not enough. A more sensible lineup to me would be:
M3 - 2 Thunderbolt ports, dual monitor support, memory up to 8-24GB (2x4, 2x6, 2x8, 2x12, 2x16). In the MacBook Pro, always comes equipped with second-tier upgrades.
Llama.cpp is a pretty extreme CPU RAM-bus saturator, but I dunno how close it gets (and it's kind of irrelevant, because why wouldn't you use the Metal backend).
> The M3 Pro loses the option for an 8TB SSD. Likely because it was a low volume part for that spec.
That's not super surprising to me. Apple loves to charge stupid prices for storage and memory. Maybe it's worth it for lots of people to have the convenience of built-in storage at the lower capacities, but I have to imagine that most people who would want 8TB of SSD would rather just get an external solution for... much less.
Yeah, I can imagine that's an incredibly niche setup. Maybe if you were editing on the go or similar, but even then, Thunderbolt drives seem like a more pragmatic choice.
I think what Apple is pushing for is computing efficiency. It still gets faster but with much less power. Focusing on performance solely would be the wrong way to evaluate these chips.
https://www.tomsguide.com/news/apple-m3-chip
There's a reason they didn't just stick an A series chip in their laptops and call it a day - they want more performance even if it comes at the cost of efficiency. It's probably better to say that Apple is pushing performance within a relatively restricted power envelope.
Just to illustrate my point - if the M3 had exactly the same performance as the M1, but with 1/2 the power draw, I don't think many people would have been happy, even if it would have been an amazing increase in computing efficiency.
This drives me crazy. Apple plays the market like Nintendo. Pick something that no one cares about, do it better than anyone else, and make a big deal about it.
I dream of a world where instead of a marketing company becoming a top 3 tech company, a tech company would have. Wonder what they would have done for their laptop...
Or maybe this is just an inevitable outcome of capitalism/human biology where a veblen goods company will become a top player in a market.
So Apple is the most successful company because they prioritize things that no one cares about?
I dunno; if there were a marketing company that could design the likes of the M series chips along with the mobile versions, develop a full technology stack from programming language and compiler through custom chips to shipping whole devices at unimaginable scale, it would make me wonder what the technology companies were doing.
What other “tech” company really compares from a hardware perspective? Samsung? Dell? AMD? Love them or hate them, there’s no denying that Apple has serious technical chops. One day people will only hate Apple for reasonable things, today’s not that day apparently.
Apple develops its own OS.
Apple develops its own development stack, frameworks, etc.
Apple develops its own CPU/GPU architecture.
Apple develops its own battery architecture.
Apple develops its own tooling to manufacture a lot of their products.
Apple develops its own tooling to dispose of their products.
There are very few companies that have as much first party Tech in their products from start to finish.
I think Apple under-prioritizes advanced functionality, but if they're not a tech company then it's hard to see what is.
Who knows... maybe they will be like Google (which I consider a tech/engineering driven org) and they'll throw away (good) products all the time just "because"?
I think Apple plays the "niche" very well, not only regarding marketing but also from a tech point of view.
What a weird take. Literally every "tech" company is chasing Apple's silicon but you are trying to claim that they're not a tech company. Let me guess, iPhones and iPads aren't tech either, right?
Considering that Apple has been significantly more innovative (tech-wise) than pretty much all of their competitors, I'm not quite sure what this tells us about them.
Not really. Or rather, not only. The only two things I hate about Apple's hardware are the lack of repairability and the price gouging for memory/storage upgrades. Otherwise they are objectively miles ahead of their competition.
Of course I have no idea why am I taking the effort to respond to an edgy single word comment…
> Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.
just contrasting this with the recent Threadripper announcements from AMD: apparently their PRO variants top out (theoretically at least) at around 325GB/s (non-PRO versions are half of this), so from that perspective alone the M3 Max might be better?
i always have the naive assumption that keeping the beast (i.e. the CPU) fed with data matters much more for overall performance than clock rates etc.