
A few things I noticed, as I'm seeing the variety of SKUs becoming more complex.

- Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.

- Just like the low-spec M3 14" has one fewer Thunderbolt port, it also doesn't officially support Thunderbolt 4 (like M1/M2 before it)

- The M3 Pro loses the option for an 8TB SSD. Likely because it was a low volume part for that spec.

- The M3 Pro actually has more E-cores than the Max (6 vs 4). Interesting to see them take this away on a higher-specced part; seems like Intel wouldn't do this




I'm wondering now if Apple tracks this sort of stuff in their released machines. I know my iPad always asks me if I want to send stats to Apple (and I always say no). So let's say that enough people do; do they then have a good idea of how often all the performance cores are used? Max memory B/W consumed? Stuff like that. Back when I was at Intel there were always interesting tradeoffs between available silicon/thermal/margin resources and what made it into the chip. Of course Intel didn't have any way (at that time) to collect statistics, so it was always "... but I think we should ..." with not a lot of data behind it.


Apple shares anonymous usage data by default on all their operating systems and users are asked again on every major update.

Given that there have never been any public incidents about it, and given what we know about similar defaults, I would be surprised if Apple is getting less than a 95% opt-in rate.

But I suspect at the high end they only really care about the performance of a few dozen professional apps, e.g. Logic or Final Cut. And at the low end it's likely just efficiency.


> Given that there have never been any public incidents about it, and given what we know about similar defaults, I would be surprised if Apple is getting less than a 95% opt-in rate.

A 95% opt-in rate is INSANELY high for any type of usage-stat opt-in; anything above 50% is usually outstanding.

What is known about "similar defaults"?


Apple enjoys a level of consumer trust that essentially no other technology business, and almost no other business at all, can match. Whether that's justified or not is a matter of opinion.


It seems like the comment above is describing opt-out, and that it pesters you to opt back in if you opt out.


That's not how it works. You get asked the question again on every update, regardless of what you chose the last time.

So there are people who were opted in who change their minds. My friends and family's opt-in rate is <50%. And most of them are non-technical.


It honestly doesn’t matter. We’re talking about hundreds of millions of devices sending data in either case. A hundred million more provides no additional value.


...unless there's a correlation between opt-in choice and usage patterns.


That’s the trade off. You don’t opt-in, then you don’t get customized stuff. Shouldn’t be surprised if Apple doesn’t optimize for your usage.


Major updates are infrequent (maybe once a year if you always update), so it's not pestering you. And the UI makes it very easy to skip, unlike some designs.


Unless there is a flurry of network vulnerability updates; then a bespoke fork in the road is set for them.


Security/minor updates don't prompt for this AFAIK


It’s a step in a setup wizard. Whilst it’s explicitly asked, and far from dark pattern territory, it’s designed in such a way that I wouldn’t be surprised by a 95% opt-in rate.


I would be VERY surprised.

To someone with experience in that area of UX, a 95% opt-IN rate is ridiculously high.

A 95% consent-rate would already be hard to achieve as opt-OUT.

For opt-in a 95% rate would require both attention AND consent from 95% of the audience at this stage in the setup wizard.

I highly doubt that it can achieve 95% attention, let alone 95% consent.


But it's not quite opt-in or opt-out in this case. The user is required to opt for something. Apple literally has 100% attention, because otherwise the user can't move past the screen.


I was actually more genuinely interested to learn about the "similar defaults" mentioned in the OP; the 95% comment was just a side note about a huge overestimation of how easily consent is achieved.

> But it's not quite opt-in or opt-out in this case. The user is required to opt for something. Apple literally has 100% attention, because otherwise the user can't move past the screen.

Thing is, you don't even have 100% of the users' attention in this case. The user wants to use the device, you're just standing in the way.

The scenario is this: You force the user to take a decision between option A and B. Regardless of his decision he will achieve his immediately desired outcome (move to the next screen / use the device).

Getting 95% to vote for 'A' would require some quite aggressive dark pattern, to the point that option 'B' would need to be almost invisible and actively discouraged.

Even if the UI were a pre-checked checkbox and the user just had to select "Next" to continue (i.e. opt-out), your rate of consent would not be 95%. As mentioned, anything beyond 50% is already outstanding.

Or, let's rephrase: if Apple had a 95% opt-in rate, they wouldn't bother chasing consent again on every software update.


Another way of putting it: an option for a $100 iTunes gift card, no strings attached, probably wouldn't hit 95%.


I do agree it's probably not 95%. But 60% wouldn't surprise me.


Expect something in the ballpark of 20-25%, and that already assumes that Apple's above-average brand-reputation translates into above-average consent on data sharing with them.


To add to this, it's not like a mailing list, either. Marketing opt-in is lower because it's annoying. A lot of people don't want emails.

Anonymized stats from your machine? Most normal people (who don't use computers like we do) do not care and just click the most affirmative option so that they can move forward.


This is a deeply misguided opinion about 'normal' people. To normal people 'Anonymous' is a lie.

My dad can't tell Windows and Linux apart, but he makes sure to uncheck any kind of data collection or tracking, and clicks 'reject all' on every cookie warning.


Yeah, I don't think allowing telemetry etc is really a matter of technical literacy, and is more a matter of social trust. High-trusting people will see no problem, low-trusting people will say "no way!". I'd imagine this varies widely but with definite trends based on social/economic class.


> To normal people 'Anonymous' is a lie.

Normal people don't even give a second of thought to this. My partner knows the difference between windows and Mac, and is perfectly content to browse the internet without an ad blocker and to read in between all the cookie dialogs. The only time she clicks on one is when it's required to proceed, and she'll click whichever one is the most obvious button.


I think that was kind of the OP's point. "Pro" users are significantly more likely to opt out in this scenario, unless they are not Pro users but just want the Pro machine for conspicuous consumption, making a much more dramatic swing in the usage data that is collected.


The word Pro in the product name really doesn't separate consumers as well as you might think.

Every college kid has a MacBook Pro, yet they are by definition not Pros.


It’s more like 15% opt in. I know because it controls dev access to analytics on their apps.


Wait telemetry is opt-out?

And I've never heard people complain?

Genuinely surprised as it seems to be quite a commonly controversial thing amongst devs.


It's not exactly 'opt-out'; they ask you on first boot or after major upgrades, and you either select "Share with Apple" or "Don't Share with Apple". It's just that the "Share" button is coloured blue, so it looks more like the default since it's more prominent (at least on iOS; I think it's basically the same on macOS).

It's not like it's enabled by default and you have to know to go and find the setting to turn it off or anything..


It’s opt-out, but it’s not enabled silently. It’s a pre-ticked checkbox on a screen you have to review when you first set up the machine (and when you do a major version OS upgrade).

IMO that’s quite different to something that’s just silently on by default, and requires you to set an environment variable or run a command to opt out.


On a phone there is no box at all. It's two options to select. The opt-in is highlighted, but there is no "next" button -- you have to select an option.


I don't think it's pre-checked, is it? I thought it was Yes/No buttons


No the default action is to do nothing (ie do not install the OS). You have to actively consent or reject.


Yeah, that's kind of surprising, given that Apple is often hailed as a privacy champion.


It’s not really opt-out or opt-in: it’s an explicit, informed choice you have to make when you start up your Mac after first purchase or major upgrade.


Well, Apple generally has so much info about your every step people stopped caring a long time ago.


I think you are talking about Google, not Apple.


No, both of them actually. Don't trust them too much.

This calls out some soft spots that were exposed during the Hong Kong riots: https://www.youtube.com/watch?v=nQ9LR8homt4


Not the OP, but I am not watching a random YouTube video from a random guy to help you prove your point. I can confidently link you some of these that “prove” that the Earth is flat.


It's not a default because users must choose yes or no. So there basically is no default.


> asks me if I want to send stats to Apple (and I always say no)

so you like them enough to pay them thousands for the premium product, but not enough to tell them how much CPU you use?


I have no idea what information they’re collecting on me, and it seems very few people do (given that nobody was able to answer the above question).

Could be “how much CPU does this user use?” but could also be “when prompted with a notification that a user’s iCloud backup storage is low, how long did they hesitate on the prompt before dismissing? How can we increase their odds of upgrading?”

Also, my willingness to provide information does not correlate to how much I “like” a company’s products. If I buy a sandwich from a deli, and they ask for my email for their newsletter or something, I won’t give it. That doesn’t mean I don’t like their company or their sandwich. Could be the best sandwich in the world, they don’t need my email.


In addition to the reduced memory bandwidth, the M3 pro also loses 2 performance cores for only 2 more efficiency cores.

M2 pro: 8 performance cores + 4 efficiency cores.

M3 pro: 6 performance cores + 6 efficiency cores.

Not a great trade... I'm not sure the M3 pro can be considered an upgrade


Depends. Is it faster? Then it's an upgrade.

Has the CPU industry really managed to pull off its attempt at a BS coup that more cores always === better?

I thought we'd learned our lesson with the silly MHz myth already?


I guess we'll have to wait for benchmarks but I did find this interesting:

Apple's PR release for M2 pro: "up to 20 percent greater performance over M1 Pro"

Apple's announcement for M3 pro: "up to 20 percent faster than M1 Pro" (they didn't bother to compare it to M2 pro)


Sure, that's the title, but at least in this PR they immediately show a graph with a comparison to both.

Presumably it makes more marketing sense to compare to the M1 family up front because most people that bought an M2 last year are probably not going to be upgrading to M3. They are speaking to the people most likely to upgrade.


FWIW, I can't remember the last time I saw a company go back more than a generation in their own comparison. Apple is saying as much by what they're not saying here: M2 -> M3 may not be a compelling upgrade story.


The vast majority of Mac users go years between upgrades. For any other vendor it might seem weird to show several comparisons going back multiple generations (M1 and x86), but for the macOS ecosystem it makes perfect sense since only a very tiny slice of M2 users will be upgrading.


and what makes you think windows users update their devices every single generation?


Windows has distinct groups: the people who buy whatever costs $700 at Costco every 10 years (or when it breaks) don't care, but there's also a vocal enthusiast community who do upgrade frequently. That group gets more attention since it's a profitable niche and gaming generates a lot of revenue.


I used to buy a $700 Windows laptop every 18 months in the 2000s. Then I got fed up with them just falling apart and switched to Macbooks. My 2013 purchase is still alive and being used by the kids.


In the 2000s, I went through a wide variety of PC laptops (Lenovo, Toshiba, Dell, Alienware, Sony, etc.) all within the range of $1200-$6500 and they all died within 3 years (except for the cheapest one which was a Lenovo with Linux). Some died within a year.

When my first Macbook lasted for more than 3 or 4 years, I was surprised that I was upgrading before it died. I went through many upgrades with almost zero issues (one HDD failure, one battery failure). I still have a 2012 Macbook Pro that I've since installed Linux on.

When I bought the first touchbar Macbook (late 2015?) I spent around $6k maxing out the options, and I was surprised at how totally trash it was. Hardware QC issues were shocking: particles under the screen from manufacturing, keys stuck within the first hour of usage, external monitor issues, touchbar issues...

I haven't bought a laptop since.


Did you buy a MacBook for $700? That was a pretty low price back then, which meant you were buying devices made to a price. Buying a MacBook is one solution; another would have been to spend more money on a higher-quality Wintel system.


No, it was around $1100 IIRC, maybe as much as $1300.


Right, so when you spend twice as much you wind up with a better device. I think this might be only tangentially related to the fact that it was an Apple product; rather, you weren't purchasing the cheapest available device.


Ten years ago Apple was by far the highest-quality laptop manufacturer. There was essentially no other option back in the early 2010s. Even now laptops with a "retina" display are not always easy to find from other manufacturers. In retrospect, that was probably the killer feature which induced me to switch.


Yeah, the quality of PC laptops has improved but that really just means you can get closer to equivalent quality at equivalent pricing. I've heard people claim to have saved a ton but every single time I used one there was some noticeable quality decrease, which I find kind of refreshing as a reminder that the market does actually work pretty well.


Did you treat the MB differently because you paid more? If so, that may have yielded longer life in addition to quality design, etc.


Not really. The difference in build quality was night and day; metal vs. plastic, keyboard that doesn't flex, etc.


Windows users buy whatever, from so many brands, that it doesn't matter how often they upgrade; they're likely not to upgrade from the same vendor anyway (so a comparison to that vendor's older generations wouldn't be meaningful in the first place).


> and what makes you think windows users update their devices every single generation?

They don't, but the difference is that Windows users generally don't know or care about processor generations. In contrast, it's common for Mac users to know they have an "old" Intel-based Pro, an M1 Air, etc., and to use that knowledge to help determine when it might be time to upgrade.

You can test this by asking Windows users what CPU they have. For the few who know and who have an Intel CPU, you can ask what their Brand Modifier¹ (i3/i5/i7) is. If they know that, you can ask what the 5-digit number following the Brand Modifier is — the first two digits are the Generation Indicator¹. I'd be surprised if more than 0.01% of Windows users know this.

¹ Intel's name
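Not from the comment above, but as a rough illustration of that decoding, here is a small Python sketch (assuming the modern 5-digit numbering; the model string is just an example):

    import re

    # Rough sketch: decode a modern 5-digit Intel Core processor number.
    # Brand Modifier = i3/i5/i7/i9; Generation Indicator = first two digits.
    def decode(cpu_name):
        m = re.match(r"i([3579])-(\d{5})([A-Z]*)$", cpu_name)
        if not m:
            return None
        brand, digits, suffix = m.groups()
        return {
            "brand_modifier": "i" + brand,
            "generation": int(digits[:2]),  # e.g. "13" -> 13th gen
            "sku": digits[2:],
            "suffix": suffix,               # U/H/HX etc.
        }

    print(decode("i7-13700H"))
    # {'brand_modifier': 'i7', 'generation': 13, 'sku': '700', 'suffix': 'H'}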


Intel's CPU naming strategy used to drive me nuts when trying to talk to anyone at work who knew "just enough to be dangerous." Why is X so slow on this machine, it's got a [6-year-old, dual-core] i5! It runs fine on my laptop and that's only a [1-year-old, quad-core] i3!


> it's common for Mac users to know they have an "old" Intel-based Pro, an M1 Air, etc., and to use that knowledge to help determine when it might be time to upgrade.

Not at all. I've worked with FANG developers with brand-new M1 MBPs who had no idea what 'M1' meant until something broke.


like everything you said could apply to nvidia gpus as well


man, that's a whole lot of mental gymnastics to justify scummy benchmark practices from apple.


How are they scummy? The M3 vs. M2 performance improvements they showed looked pretty modest.

My interpretation while watching the event is that this is a company persuading x86 holdouts to upgrade to Apple Silicon, and maybe some M1 users as well.


It’s absolutely not, and that’s fine. The video has statements that the machines are made to “last for years” and that they want to save natural resources by making long-lasting machines.

I’m currently at 4 to 5 years on laptops and 3 to 4 years on phones, and even then I hand them over to kids/friends/family who get a bit more use out of them.


> they want to save natural resources by making long-lasting machines.

Apple always comes from a position of strength. Again, they're saying as much as they're not saying.

Also, if they really cared about long-lasting machines: slotted RAM and flash please, thanks!


Huh. So they used to do this, but looking at the M series chips it seems like the architecture assumes the CPU-GPU-RAM are all on the same chip and hooked into each other, which enables zero copy. Someone more well versed in hardware could explain if this is even possible.

Expandable internal storage would be nice, yeah. But I get the sealed, very tightly packed chassis they’re going for.


> get the sealed, very tightly packed chassis they’re going for

The Dell XPS 17 is only 0.1 inch thicker yet has fully replaceable RAM and 2(!) M.2 slots. I'm pretty sure what Apple is going for is maximizing profit margins over anything else.


I have an XPS 15. And while I liked that I could bring my own SSD and RAM, the build quality is nowhere near a Macbook Pro... like not even in the same galaxy. I had to have it serviced multiple times within the first few weeks. It had to be sent to Texas, and when it returned, one WiFi antenna wouldn't plug into the card, and the light on the front was permanently broken. I could have demanded Dell fix it - and I'd have been even more weeks without my main work laptop. So, by pure numbers/specs? Sure. By real world quality, no way would I favor Dell.


The issue is often comparing apples (heh) to oranges.

I understand the desire for slotted RAM, but the major limiting factor for nearly 10 years was CPU support for more than 16GB of RAM. I had 16GB of RAM in 2011, and it was only in 2019 that Intel's 9th-gen laptop CPUs started supporting more.

The Dell XPS 17 itself has so many issues that if it were a MacBook people would be up in arms, including unreliable suspend and memory issues causing BSODs. The reliability of these devices, at least when it comes to memory, might actually be worse and cause a shorter lifespan than if the RAM had been soldered.

Of course it always feels good to buy an underspecced machine and upgrade it a year later, which is what we're trading off.

But it's interesting that we don't seem to have taken issue with BGA CPU mounts in laptops but we did for memory. I think this might be because Apple was one of the first to do it, and we feel a certain way when Apple limits us but not when other companies do.


There’s a lot of flat-out wrong information in this post. For one, even the low-power (U-series) Intel laptop CPUs have supported 32GB+ of memory since at least the 6th generation[1]. Many machines based on these CPUs unofficially support more than that. I have a Thinkpad with an i7-8550u and 64GB of DDR4, and it runs great.

On top of that, the higher-power laptop SKUs have supported 64GB or more since that time as well.

Secondly, it’s silly to claim that having RAM slots somehow makes a computer inherently more unstable. Typically these types of issues are the result of the manufacturer of the machine having bugs in the BIOS/EFI implementation, which are exacerbated by certain brands/types of memory. If you don’t want to mess around with figuring that stuff out, most manufacturers publish a list of officially-tested RAM modules which are not always the cheapest in absolute terms, but are always night-and-day cheaper than Apple’s ridiculous memory pricing.

[1] https://www.intel.com/content/www/us/en/products/sku/88190/i...


Sorry, you're entirely mistaken; there was no business laptop that you could reasonably buy with more than 16GB of RAM. I know because I had to buy a high-end workstation laptop (Dell Precision 5520 FWIW) because no other laptop supported more than 16GB of RAM in a thin chassis.

No Dell Latitude, Elitebook, Thinkpad X/T-series or even Fujitsu Lifebook supported a CPU that permitted greater than 16GiB of memory.

I know this because it was something I was looking at intently at the time and was very happy when the restrictions were lifted for commercially viable laptop SKUs.

Citing that something exists presupposes availability and functionality. No sane person is going to be rocking into the room with a Precision 7520 and calling it portable. The thing could be used as a weapon and not much else if you had no power source for more than 2 hrs.

Also, socketed anything definitely adds material reliability concerns. I ship desktop PCs internationally pretty often and the movement of shipping unseats components quite easily even with good packing.

I'm talking as if I'm against socketed components; I'm not. But don't pretend there are no downsides and infinite upgrades as an upside; that's disingenuous. In my experience there are some minor reliability issues (the XPS 17 being an exceptional case, and one I was using to illustrate that sometimes we cherry-pick what one manufacturer is doing in the belief that there were no trade-offs to get there) and some limitations on the hardware side that limit your upgrade potential regardless of soldering.


> Sorry, you're entirely mistaken; there was no business laptop that you could reasonably buy with more than 16GB of RAM.

> No Dell Latitude, Elitebook, Thinkpad X/T-series or even Fujitsu Lifebook supported a CPU that permitted greater than 16GiB of memory.

Here are the Lenovo PSRef specs for the Thinkpad T470, which clearly states 32GB as the officially-supported maximum, using a 6th or 7th gen CPU:

https://psref.lenovo.com/syspool/Sys/PDF/ThinkPad/ThinkPad_T...

This is not a behemoth of a laptop; I'm writing this on a T480 right now, which supports 32GB officially and 64GB unofficially, and it weighs 4lbs with the high-capacity battery (the same as the T470).

I can't tell if you're trolling or what, but if you're serious, you clearly didn't look hard enough.

Edit: since you mentioned Latitudes, Elitebooks, and Fujitsu lifebooks:

- Dell Latitude 7480 (6th gen CPUs) officially supports 32GB: https://www.dell.com/support/manuals/en-us/latitude-14-7480-...

- HP Elitebook 840 G3 (6th gen CPUs) officially supports 32GB: https://support.hp.com/us-en/document/c05259054

- For Lifebooks, I couldn't find an older one that supported 32GB, but this U937 uses 7th gen CPUs, and has 4GB soldered and one DIMM slot which supports up to 16GB. This is a total of 20GB, again, breaking the 16GB barrier: https://www.fujitsu.com/tw/Images/ds-LIFEBOOK%20U937.pdf

I believe these are all 14"-class laptops that weigh under 4 pounds.


One more thought: you might be getting confused here with the LPDDR3 limitation, which was a legit thing that existed until the timeframe you're thinking of.

Any laptop which used LPDDR3 (soldered) typically maxed out at 16GB, but as far as I'm aware, this was due to capacity limitations of the RAM chips, not anything to do with the CPUs. For example, the Lenovo X1 Carbon had a 16GB upper limit for a while due to this. I believe the 15" MacBook Pro had the same limitation until moving to DDR4. But this is entirely the result of a design decision on the part of the laptop manufacturer, not the CPU, and as I've shown there were plenty of laptops out there in the ~2014-2016 timeframe which supported 32GB or more.


Intel actually has this documented all on one page: https://www.intel.com/content/www/us/en/support/articles/000...

DDR4 support was introduced with the 6th gen Core (except Core m) in 2016, LPDDR4 support didn't show up until (half of) the 10th gen lineup in 2019. It's just another aspect of their post-Skylake disaster, wherein they kept shipping the same stuff under new names for years on end before finally getting 10nm usable enough for some laptop processors, then a few years later getting it working well enough for desktop processors. In the meantime, they spent years not even trying to design a new memory PHY for the 14nm process that actually worked.


Yeah, this link is helpful, but IMHO doesn’t actually call out the specific problem I was referring to, which is that only laptops that used LPDDR3 had the 16GB limitation. If the laptop used regular DDR3, or DDR4, it could handle 32/64GB. The table lumps everything together per processor model/generation.


They haven't made slotted RAM or storage on their MacBooks since 2012 (the Retina MacBooks removed the slotted RAM AFAIK). It might save on thickness, but I'm not buying that the slim chassis argument is the only reason, since they happily made their devices thicker for the M-series CPUs.


> It might save on thickness, but I'm not buying that the slim chassis argument is the only reason

Soldered memory allows higher bus frequencies much, much more easily. From a high-frequency perspective, the slots are a nightmare.


It's not soldered. It used to be, but ever since the M1, it's in-CPU. The ram is actually part of the CPU die.

Needless to say it has batshit insane implications for memory bandwidth.

I've got an M1, and the load time for apps is absolutely fucking insane by comparison to my iMac; there's at least one AAA game whose loading time dropped from about 5 minutes on my quad-core intel, to 5 seconds on my mac studio.

There's just a shitload of text-processing and compiling going on any time a large game gets launched. It's been incredibly good for compiling C++ and Node apps, as well.


the ram is not on die, and 5 min to 5 sec is obviously due to other things, if legit


Sounds like the iMac had spinning hard disks rather than SSD storage.


Yup. I’ve been looking at the Framework laptop, and it’s barely any thicker than the current MacBook Pro.


I have no excuse for flash, but memory can't really be slotted anymore since SODIMM is crap. High hopes for CAMM making its way into every other machine in 2024!


Given that there is a legally mandated 2-year warranty period at least in Europe, I would be surprised if any laptops weren’t made to “last for years”.

The problem with Apple, however, is that their hardware will long outlive their software support. So if they really want to save natural resources by making long-lasting machines, they should put much more effort into sustained software support.


Yes my MacBook Pro 2010 is still going strong.

But drivers are only available for Windows 7, and macOS High Sierra was the last supported version.

Luckily Linux still works great.


> I can't remember the last time I saw a company go back more than a generation in their own comparison

Apple likes doing that quite frequently while dumping their "up to X% better" stats on you for minutes.


Nvidia did it when they released the RTX 3080 / 3090 because the RTX 2000 series was kind of a dud upgrade from GTX 1060 and 1080 Ti


Apple always does game comparisons like this for their conferences though. The Intel era was even worse with this, IIRC.


In the Intel era there wasn't much to game; they were using the same chips as all the PC competitors. The PowerPC era, on the other hand…


The majority of MacBooks out there are still Intel-based. This presentation was mostly aimed at them & M1 owners.


Is it a problem, though? The vast majority of people skip generation and for them the relevant reference point is what they have, which is going to be hardware from a couple of generations ago. M2 -> M3 does not have to be compelling: the people with M3 devices are a tiny fraction of the market anyway.

I find it interesting how people respond to this. On one side, it's marketing, so it should be taken critically. OTOH, if they stress the improvements over the last generation, people say they create artificial demand and say things about sheeple; if they compare to generations before, people say that it's deceptive and that they lost their edge. It seems that some vocal people are going to complain regardless.


Given how strongly they emphasised the performance over the Intel base (who have now had their machines for 4 years, are likely to replace them soon, and may be wondering whether to stay with Apple or switch over to PCs), it is pretty obvious that they also want to target that demographic specifically.


That’s not what it says. Actual quote:

> The 12-core CPU design has six performance cores and six efficiency cores, offering single-threaded performance that is up to 30 percent faster than M1 Pro.


Ok, so then the M3 pro is up to 1.3/1.2=~8% faster than the M2 pro? I can see why they wouldn't use that for marketing.


Depends who they are marketing to I think is the point. If the biggest group of potential buyers are not M2 users, then it makes sense not to market to them directly with these stats.

I've got an M1 Max 64GB and I'm not even tempted to upgrade yet; maybe they'll still be comparing to the M1 when the M10 comes out, though.


I'm also far from replacing my M1. But if someone from an older generation of Intel Macs considers upgrading, the marketing is off as well.


I was referring to the graphic they showed during the announcement that verbatim said the CPU was "up to 20% faster than M1 Pro".

https://images.macrumors.com/t/wMtonfH5PZT9yjQhYNv0uHbpIlM=/...


Plausibly they thought the market is saturated with M1s and targeted this to entice M1 users to switch.


> Depends. Is it faster?

The devil tends to be in the details. More precisely, in the benchmark details. I think Apple provided none beyond the marketing blurb. In the meantime, embarrassingly parallel applications do benefit from having more performant cores.


Heh, I recall seeing many posts arguing against benchmarks when all Macs equipped with an M2/8GB/256GB SSD scored much, much lower than the M1/8GB/256GB SSD. People said the synthetic benchmarks were not representative of real world use and you'd never notice the difference. 'Twas a battle of the optimists, pessimists, and realists. In reality, 'twas just Apple cutting costs in their newer product.


> Heh, I recall seeing many posts arguing against benchmarks (...)

It's one thing to argue that some real-world data might not be representative all by itself.

It's an entirely different thing to present no proof at all, and just claim "trust me, bro" on marketing brochures.


Oh absolutely, I can't wait to see the benchmarks. Per the (non-numerical) benchmarks in the video though, it is faster. So... until other evidence presents itself, that's what we have to go on.


> Has the CPU industry really managed to pull off its attempt at a BS coup that more cores always === better?

I thought this at first then I realized the cost-performance benefit gained from adding more cores often outweighs just improving the performance of single cores. Even in gaming. I think this is what led AMD to create their Ryzen 9 line of CPUs with 12 cores in 2019.

That being said, I abhor the deceptive marketing which says 50% more performance when in reality, it's at most 50% more performance specifically on perfectly parallel tasks which is not the general performance that the consumer expects.



Few game devs bother optimizing games to take advantage of multiple cores


I find that frustrating with how intel markets its desktop CPUs. Often I find performance enhancements directly turning off efficiency cores...


Faster than what? M1 Pro? Just barely.


Reference should be M2 pro


I suspect it's about equal or perhaps even slower.


Based on what? The event video says it's faster.


M2 Pro was about 20-25% faster than M1 Pro; M3 Pro quotes a similar number. It has faster cores but a weaker distribution of them. Seems like a wash, but we'll see exactly how close when benchmarks are out.


2.5x is "just barely"? lol k.


> 2.5x is "just barely"? lol k.

That's only rendering speed, and M3 Max vs M1 Max (not Pro). M3 Pro is only 30 percent faster:

> The 12-core CPU design has six performance cores and six efficiency cores, offering single-threaded performance that is up to 30 percent faster than M1 Pro.


20%


Let me re-write your post with the opposite view. Both are unconvincing.

<< Depends. Is it faster? Then it's an upgrade. Has the CPU industry really managed to pull off its attempt at a BS coup that more MHz always === better?

I thought we'd learned our lesson with the silly cores myth already? >>


I think you're misreading the comment you're replying to. Both "more cores is always better" and "more MHz is always better" are myths.


Yup, exactly what I was saying.


Yes, but the number of cores in similar CPUs does provide a good comparison. For example, with the base M2 Pro at 6 P cores and the base M3 Pro at 5 P cores, one would want ~20% faster cores to compensate for the lack of one core in parallel-processing scenarios where things scale well. I don't think the M3 brings that. I am waiting to see tests to understand what the new M3s are better for (probably battery life).


That's... the same view, just applied to a different metric. Both would be correct.

Your reading comprehension needs work, no wonder you're unconvinced when you don't even understand what is being said.


That makes less sense because the MHz marketing came before the core count marketing.

I agree with GP that we should rely on real measures like "is it faster", but maybe the goal of exchanging performance cores for efficiency cores was to decrease power consumption, not to be faster?


Probably a balance of both tbh, as it appears to be both faster AND around the same performance per watt.


The new efficiency cores are 30% faster than M2, and the performance ones 20% faster, so let's do the math:

    M2: 8 + 4

    M3: 6*1.2 + 6*1.3 =
        7.2 + 7.8

That’s nearly double the M2’s efficiency cores, a little less on the performance ones.

They do say the system overall is up to 65% faster, and has lower power consumption at the same performance level.


You're not considering the difference in performance between the P and E cores. The math should be something more like:

  M2 pro = 8*3 + 4 =28 (the *3 representing that the performance cores contribute ~3x more to total system performance than the efficiency cores)

  M3 pro = 6*3*1.15 + 6*1.3 = 28.5 (Apple claims 15% more performance for the P cores, not 20%)

> They do say the system overall is up to 65% faster, and has lower power consumption at the same performance level.

They don't claim either of those things. They claim the performance is 20% faster than the M1 pro. Interestingly, they made that exact same claim when they announced the M2 pro.

Energy efficiency might be better, but I'm skeptical till I see tests. I suspect at least some of the performance gains on the p+e cores are driven by running at higher clock rates and less efficiently. That may end up being more significant to total energy consumption than the change in the mix of p/e cores. To put it another way, they have more e cores, but their new e cores may be less efficient due to higher clock speeds. Total energy efficiency could go down. We'll just have to wait and see but given that apple isn't claiming an increase in battery life for the M3 pro products compared to their M2 pro counterparts, I don't think we should expect an improvement.
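Just to put both back-of-envelope estimates side by side, here's a tiny sketch (the uplift figures and the ~3x P-core weighting are assumptions from these two comments, not measurements):

    # Rough relative-throughput estimates; all numbers are assumptions
    # from this thread (P-core uplift ~15-20%, E-core uplift ~30%,
    # one P core weighted as ~3 E cores), not benchmarks.
    def estimate(p, e, p_uplift=1.0, e_uplift=1.0, p_weight=1.0):
        return round(p * p_weight * p_uplift + e * e_uplift, 1)

    # Unweighted (parent comment): M2 Pro 8P+4E vs M3 Pro 6P+6E
    print(estimate(8, 4), estimate(6, 6, 1.2, 1.3))              # 12.0 15.0
    # Weighted (this comment, 1 P core ~ 3 E cores):
    print(estimate(8, 4, p_weight=3.0),
          estimate(6, 6, 1.15, 1.3, p_weight=3.0))               # 28.0 28.5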


If you wanted to be even more accurate, you'd also have to take into account that most tasks are executed on the E cores, so having more of them, or faster ones, will have a much greater impact than any improvement on the P cores. It's impossible to estimate the impact like this, which is why Apple's performance claims[1] are based on real-world tests using common software for different workloads.

In summary, there is supposedly improvement in all areas, so the reduced P-core count doesn't seem to be a downgrade in any form, contrary to what the OP suggested.

[1] https://www.apple.com/nl/macbook-pro/


I wouldn't trust Apple's marketing on that if it's where you got those numbers from


E cores are ~30% faster and P about 15%. So the question would be how much the Es assist when Ps are maxed on each chip. In any other situation, more/better E cores should outperform and extend battery. I’m not saying that means you should want to spend the money.


I love Apple's E cores. It just sucks that the M3 pro gains so few given the reduction in P cores.

Apple's E cores take up ~1/4 the die space of their P core. If the M3 pro lost 2 performance cores but gained 4-8 efficiency cores it'd be a much more reasonable trade.


I’m sure the difference is GPU.


I’d like to see that. Good point about die space.


Could you not resolve these questions with benchmarking?


Depends on what you consider an upgrade. As M3 cores perform better than M2 cores, I expect the M3 configuration to perform similar to the M2 one, even though it trades performance cores for efficiency cores. Apple apparently believes that its users value improved efficiency for longer lasting battery more than further improved performance.


Functionally, how does this impact observed performance on heavy loads like code compilation or video manipulation? I suspect it's not much, and these are the low/mid-tier priced machines we are talking about.

If you bought a $2k M2 machine and traded it for a $2k M3 machine, you may gain better battery life with no concessions, except for benchmark measurements (that don't affect your daily work).


These are not low/mid tier machines when talking about "consumer-grade".


Yeah.

$2K-3K is what my 3090/7800x3D sff desktop cost (depending on whether you include the price of the TV/peripherals I already own).


Within the MacBook Pro lineup, they are objectively the low and mid-grade pricing tiers.


Indeed, but that's a bit of an oxymoron, as any MacBook Pro is not a "low/mid-tier priced machine".


We all know what is meant by “low/mid-tier”. This is pointless pedantry. Next someone is going to come by with the throwaway comment complaint about how OpenAI isn’t “open”.


Fair enough, I was just arguing that even Mac users might not have the cash or the patience to commit to another machine.

We've seen the same with Nvidia's GPUs going from the 10 to 20 series. If people don't perceive higher gains without compromises, they won't buy it.


Then why do they come with (low end) consumer level storage and memory capacity?


Different people have different needs. I certainly need a MacBook Pro for my work, but I use next to no storage. I've never purchased beyond the minimum storage for an Apple computer. I did however up the processor on my current MacBook Pro.

Minimum 8GB RAM is more universally egregious but I’m not going to sit here and justify my own exception whilst discounting the possibility that 8GB works for others.


The cost for adding an extra 8GB would be insignificant for Apple, though. The only reason they don’t is to upsell higher tier models


It would make them less money. /thread

To be fair– While 8GB is annoying– I've bought the M1 MacBook Air when it came out and it's remarkably resilient. I've only had it freeze a few times due to too little RAM.

I've also been using many different programs. I just have to be a tad mindful about closing tabs (especially Google tabs) and programs.


This makes going Mac Mini M2 Pro over iMac M3 feel real compelling. The respective prices of these models are in fact the same, so if you happen to have a good monitor already... (also the iMac M3 curiously doesn’t even have a Pro option.)


> Just like the low-spec M3 14" has one fewer Thunderbolt port, it also doesn't officially support Thunderbolt 4 (like M1/M2 before it)

I believe this is due to the TB4 spec requiring support for two external displays on a single port. The base spec M series SoCs only support one external display.

I’d expect the ports to work identically to a TB4 port in all other aspects.


I really, really wish they would fix this silly display scan-out limitation. I gave them a (frustrating) pass on the M1, given it was the first evolution from iPhone/iPad, where it wouldn't have mattered. But it seems silly to have made it all the way to the M3. Wanting to dock onto two displays even at the low end doesn't seem like such a niche use-case.

I'm sure there is some kind of technical explanation, but both Intel and NVIDIA seemed to manage 3+ scan-outs even on low-end parts for a long time.


The technical explanation is that on the base M1/M2 SoC there is one Thunderbolt bus that supports 2 display outputs.

On the MacBook Air one output is connected to the internal display leaving one output for an external display.

(The Mac Mini that uses the same SoC is limited to 2 external displays for the same reason)

To support more displays they would have to add support for a second Thunderbolt bus to the base SoC.


Is this an actual hardware issue though? One issue is that macOS has never supported DisplayPort MST (Multi-Stream Transport), EVER, as far as I can tell. MST allows multiple display streams to be natively sent over a single connection for docks or daisy-chaining monitors. Back on Intel Macs, if you had a dock with 2 displays or daisy-chained 2 together, you would get mirrored displays. With the exact same Mac/displays in Boot Camp, MST would work perfectly. 1x display per Thunderbolt 4 port is the worst!


You can get multiple displays from a single port, the hubs are just expensive.


You can't do it with a base-model M chip. It's not supported on Mac unless you go with DisplayLink, and DisplayLink has weird issues on Mac, like no HDCP support and screen recording having to be enabled, that make it a really bad experience compared to a native connection.


There's no reason a whole Thunderbolt bus is needed for every two displays. It's just Apple's decision to build their GPU that way.

And to not support industry-standard NVIDIA GPUs on ARM Macs, too. One GPU typically supports 5 outputs over as little bandwidth as PCIe x1.


Not with Nvidia, no; they are 4 displays, always have been. The NVS 810 8x display card uses two GM107 GPUs.

AMD is 6 displays. You see this rarely on consumer boards, but the ASRock 5700 XT Taichi for some inexplicable reason did expose all six, with four DisplayPorts to boot. I do not think there have been four-DisplayPort or six-output consumer cards since.


Even with fewer ports you can use DisplayPort MST hubs to break out 3 displays from one. (But not on a Mac, even Intel; they never added driver support. Works in Windows Boot Camp though.)


There are a couple of 900-, 10-, 20-, and 30-series NVIDIA cards with 5 outputs. The 700-series and below had up to 4. IIUC it's more like up to (x px, y px) max with up to N independent clocks without external adapters, or something along those lines.


Just because there are X outputs on a GPU doesn't mean it will work with all of them at the same time.


I was driving 5 for no reason from a GTX 970 at one point. They just work. But for some reason (segmentation?) NVIDIA brochure pages sometimes disagree with or contradict the products in the market.


Right, but why can't you disable the internal display to run 2 external displays? That wouldn't be an unreasonable compromise, but it seems not to be possible.


M1/M2 only has 1 native HDMI pixel pipe in any form, I think? Apple uses the HDMI PHY to drive the screen on tablets, and the screen on laptops. Base-tier M1/M2 also only have a single displayport pixel pipe, and Pro/Max get +1/+2 respectively.

The base-tier chips are designed as high-volume tablet chips first and foremost, with ultramobility crossover capability.

Using DisplayLink or certain kinds of Thunderbolt multi-monitor setups is possible by running outside the pixel pipe or running multiple monitors on a single pixel pipe (this is not MST, which is multiple pixel pipes on a single stream). But yeah, it's ugly, especially on a base-tier processor, with this eating cycles/dumping heat. You're running the hardware encoder at least.

Discord had this weird error if you tried to enable the screen/audio capture: it tries to launch something and fails, and the solution is that you need to manually install "Airfoil", because it's an audio capture module that Discord licensed. You don't have to fully install it, but the audio driver is the part that Discord uses, and that goes first (it has to be allowed as a kext, i.e. non-secure mode). Theoretically a kernel-level capture like that could be a ton faster than userland; I think that's the on-label use of Airfoil.


Allow the user to turn off the internal display in favor of 2 external displays. That would be a usable docked configuration.


You are right, but Apple won't do this.


independent repair technician demo video to mux MBA internal and external display?


> I'm sure there is some kind of technical explanation

I'm sure it's a marketing explanation: they make bigger margins on more expensive machines, and they need some feature differentiators to nudge people to move up. 2 extra displays is a poweruser/pro feature.

They make their own silicon; it's not like they're shy about designing hardware. If they wanted to stuff features into the lower-end books, they easily could.


> Wanting to dock onto two displays even at the low end doesn't seem like such a niche use-case.

I mean, it almost certainly is? I would guess a majority of the low-end SKUs are rarely if ever attached to one external display. Two would be rarer still.


At a ~recent work place the entire floor of developers had (Intel) MacBook Pros with dual 24" monitors.

Some of them were trying out Apple Silicon replacements, though I'm not aware of what they did monitor wise. Probably used the excuse to buy a single large ultrawide each instead, though I don't actually know. ;)


Which workplaces are these that buy low-end laptops for their employees but shell out for dual monitor workstations?


Is a $1,599 laptop a low-end laptop? An M3 MacBook Pro 14" that costs $1,599 can only drive a single external monitor according to the spec. A $1,000 Dell XPS 13 can drive 4 monitors via a single Thunderbolt 4 dock that also charges the laptop!

Honestly, I'm an accountant and everyone in my office uses 2-3 monitors with a $1,200 business ultrabook.


I think this use case is probably not the majority.


So? Intel doesn't seem to have any issues supporting it regardless of that.


I am not sure how anybody can compare Intel and Apple. Apple is a vertically integrated system that has a CPU component, with a proven track record of making the right decisions. Intel is a CPU vendor with shrinking market share. As I pointed out, this use case is probably not that important because it represents a very small user segment.


External displays can be used for multiple generations of laptop hardware. Unlike CPUs, displays are not improving dramatically each year.

MacBook Air is a world-leading form factor for travel, it's not "low-end".

MBA with extra storage/RAM can exceed revenue of base MBP.


We're still talking about the low end of this product line. If you're buying two monitors for your employees, I'm not sure you're skimping on the cost difference between an M3 and an M3 Pro.


As stated, it's not about cost.

The travel form factor of MBA is not available for MBP, for any price.


What's Apple's high-end laptop product line?


> low-end laptops

Heh, that's not how I would describe MacBook Pros. ;)


I work at Motorola and we get M1 airs unless you specifically request a Linux laptop. I wouldn't call it low end though. Low end is an Intel i3.


> low-end laptops

you're saying they're low-end because Intel? if you've got your macbook connected to two monitors, you're not very concerned about battery performance.

So isn't Intel silicon competitive speedwise? I thought the M[0-4]s were OK but sort of hypey as to being better in all regards.


I have worked on plenty of i5/i7 Windows/Linux laptops before, and a MacBook Air M1 with 16GB of RAM is miles better in everything. Nothing like them.

And even if you do not care about battery, you still care about throttling.


Honestly anyone who calls them hypey hasn’t actually used them and spends too much time arguing about geekbench on forums.

Real world, the M series chips are by far the best I’ve ever used as a software engineer and it’s not even close.


Not a chance. Moving from an Intel MacBook Pro to an Apple Silicon MacBook Pro was absolutely revolutionary for me and my very pedestrian ‘interpreted language in Docker web developer’ workloads.

I’d seriously consider not taking a job if they were still on Intel MacBooks. I appreciate that an arch switch isn’t a piece of cake for many many workloads, and it isn’t just a sign of employers cheaping out. But for me it’s just been such a significant improvement.


More like cheap out on monitors such that devs want two crappy monitors instead of one crappy monitor


What dev shop gives their engineers base model machines?


Doesn't need to be a dev shop. Go into any standard office and most productivity office workers will be running dual monitors now.

But with the general power of the base model Apple Silicon I don't think most dev shops really need the higher end models, honestly.


Where are you getting that impression from the parent post? Maybe they were on a 2, 3, or 4 year upgrade cycle and still had a bunch of Intel MBPs when Apple Silicon hit the market. That'd be extremely typical.

What dev shop immediately buys all developers the newest and shiniest thing as soon as it's released without trialing it first?


We stuck with Intel MBPs for a while because people needed machines, but the scientific computing infrastructure for Apple silicon took more than a little bit to get going.


Yeah, they were running Intel Macbook Pros because that's what everyone was used to, and also because production ran on x86_64 architecture.

At least at the time, things worked a bit easier having the entire pipeline (dev -> prod) use a single architecture.


Yeah, that was my experience. The early M1 adopters at my previous company definitely ran into some growing pains with package availability, etc.

(Overall the transition was super smooth, but it wasn't instant or without bumps)


Huh? He was talking about dual monitor situations being a problem.

If the company bought Pro or Max chips and not base models, it wouldn’t be a problem.


Intel has supported three external displays on integrated graphics since Ivy Bridge in 2012.


I’m not sure what that has to do with it being a niche use-case or not.


Niche or not, being more than a decade behind the competition is gauche.


On one somewhat niche feature, on the lowest SKU in that particular product lineup.

I can pick areas where Apple is beating Intel. Different products have different feature matrices, news at 11.


They also don’t show any signs of catching up to the Raspberry Pi’s on GPIO capabilities.


They did with https://ark.intel.com/content/www/us/en/ark/products/series/... but sadly seem to have killed off that product line.


That was Intel, not Apple.

It does seem like a shame, though—Intel’s IOT department seems to try lots of things, but not get many hits.


Apple does not compete on checkboxes. If they deemed it necessary to remove, there’s a reason. Not saying I agree, just that’s how they operate. If there isn’t a need to support 3 displays then they won’t, regardless of whether the “competition” did it years prior.


> there’s a reason. Not saying I agree, just that’s how they operate.

Almost always it’s maximizing profit margins rather than anything else.


>there’s a reason

They operate 100% on profitability, not on what's technically feasible. They are extremely focused on making money. Yes, there is a reason after all.


Exactly my point. It’s technically feasible to do many things. Apple will do what Apple does. Try to upsell you into the higher tier hardware.


If that were true Apple would have stopped bragging about battery life.


The longer battery life is genuinely useful to a wide range of people in a way that being able to connect 38 external monitors is not.

I recently went on a 5-day trip and forgot to bring the charger for my M2. The first day I thought I'd have to rush around and find one. By the fourth day I still had 8% and then finally realized I could charge it via USB-C instead of magsafe.


> connect 38 external monitors

Just 2 would be enough. Which seems like a basic feature their competitors are capable of supporting at a very low cost.

They in fact are competing on checkboxes; specifically, they are probably using this limitation to upsell their more expensive models.


Can you not connect 2 monitors on a Mac?


Not on those with a non-pro M chip.


Even if you use one of those Thunderbolt/USB-C expansion dongles?


Correct.


It has nothing to do with being a niche use-case or not. This is a regression compared to their own Intel MacBooks.


Well the number with two screens would be zero, because you can't do it. That doesn't mean people don't want to do it because 0% of the laptops do it. They're just unable to.


It’s a bit funny though that their competitors don’t seem to have any issues supporting this on pretty much all of their products.


Display pipelines are expensive and take area.


Easy to say but hard to prove. How much more expensive would an MBP be if they supported it? How many fewer units would they shift?

Those are harder questions to answer. We could assume Apple crunched the numbers. Or perhaps they just stuck to the status quo.

Only an insider or decision maker (maybe that’s you) knows.


The CEO is a supply chain guy. They've been optimizing their profit margins ruthlessly since he took the helm. I don't think any savings are too small, particularly if comparatively few users are affected and it motivates upselling.

I think it's weird though how far people go to defend Apple. It's objectively making (some) users worse off. Apple clearly doesn't care and the people defending them also clearly don't. But the affected users do care and "but money" isn't really a good excuse for them. It also doesn't solve their problem of not being able to use two external monitors anymore without spending significantly more money.


I think their assumption is that if you’re the kind of pro that needs that many monitors, you’ll upgrade to the better chips they sell.

But it’s a frustrating limitation and remains one of the only areas their old intel based laptops were better at.


For the past 3 years, including with the latest laptops, "better chip" has meant the 14" M* Pro starting at $1,999. The $1,299 M1/M2 or $1,599 MacBook Pro does not support that, while you can find support for dual external displays on $600 Windows laptops, or on Intel MacBooks since at least 2012. By any standard this is an embarrassment and a regression.


Having 2 monitors isn’t even that ‘pro’ these days. I see receptionists with three sometimes.


An assumption they are so unsure about, that they kind of force that decision on their users.


It’s a money thing. Apple wants to upsell. The production cost would be negligible, but now you have to buy the next level of the product.


I mean they are physical things and you can look at how big they are. But sure the rest of how that factors into cost and sales is harder to figure out, yes.


Unless you’re Intel?


It's because they don't want to put a Thunderbolt controller on the right side of the computer


Is this a change to the spec, or did they skirt around that previously, because I didn't think they supported more than one screen per port on the M1/2?


I'm running an M1 Max with two Thunderbolt docks, and each drives 2 4K displays; it runs great, although it's kinda overkill. But it does require the docks; you can't connect directly.


> seems like Intel wouldn't do this

Wouldn't do what? Intel has more E-cores than P-cores on most of their range, and especially on the higher end: e.g. on Raptor Lake S the i9s all have 16 E and 8 P, the i7s have 8:8, and only the lower end of the i5s (below the 13500) have more P than E cores. And the i3s have no E cores.

The story is somewhat similar on mobile (H and HX): a minority of SKUs have more P-cores than E-cores, and none do in the P and U series.

In fact, that was one of the things that surprised me when Intel started releasing asymmetric SMT parts: they seemed to bank heavily on E-cores, while mobile and Apple had mostly been 1:1 or biased towards P-cores.


I think you confirmed what you were replying to. Intel makes the numbers get bigger as you go up, regardless of whether that makes the most sense.


Oh yeah I misread the comment.

Although that’s not quite true either, e.g. on Raptor Lake H the upper i5 (13600H) has 8 E-cores while the low-range i7 (13620H) has 4, but the i7 has 6 P-cores versus 4. The base frequencies also get lower as you move from the i5 to the i7, and you get fewer GPU EUs (80 -> 64).


Well, when your P is still quite E, I guess it’s a different equation :).


The SKUs are becoming more complex because Apple is probably learning why Intel/AMD have so many SKUs. Making complex chips at scale results in a range of less-than-ideal chips, which drives the need to segment and bin them into different SKUs to reduce losses, rather than trying to sell one SKU and throwing away the anomalies.


> Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.

Is this because they are not populating all the memory channels, or just using lesser memory ICs?

If it's the former... that's annoying. It just makes their products worse for the sake of artificial segmentation and very little cost savings.
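
For what it's worth, the headline figures line up with a narrower memory bus rather than slower memory ICs, assuming both generations use LPDDR5-6400 and the commonly reported bus widths (assumptions on my part, not official Apple specs). A quick back-of-the-envelope sketch:

    // bandwidth (GB/s) = (bus width in bits / 8) * transfer rate (MT/s) / 1000
    // Bus widths below are reported/assumed, not Apple-published figures.
    func bandwidthGBs(busBits: Double, megaTransfersPerSec: Double) -> Double {
        (busBits / 8.0) * megaTransfersPerSec / 1_000.0
    }

    print(bandwidthGBs(busBits: 256, megaTransfersPerSec: 6400)) // ~204.8 -> M2 Pro's "200 GB/s"
    print(bandwidthGBs(busBits: 192, megaTransfersPerSec: 6400)) // ~153.6 -> M3 Pro's "150 GB/s"
    print(bandwidthGBs(busBits: 512, megaTransfersPerSec: 6400)) // ~409.6 -> M3 Max's "400 GB/s"

If those widths are right, it points to fewer populated channels on the M3 Pro rather than lesser ICs.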


The new M3 14" MBP seems to be a red herring - why does it even exist? Why not just refresh the MBA instead?

An obvious rule of thumb is for "Pro"-branded laptops to only use "Pro"-branded chips. It's what they follow for the iPhone lineup, but I suppose iPad Pros also use non-Pro chips. Just seems like a very confusing SKU to create, and definitely something Steve wouldn't approve of.


It replaces the 13-inch MacBook Pro with M2. Apple always has a "pro" MacBook at that price point, and it is one of the better-selling MacBooks, because not all "pro" users need CPU grunt. A lawyer, for example, probably wants a "pro" class of hardware but doesn't need more than an 8 GB M1. You could argue they should get a MacBook Air, but this 14-inch MacBook Pro is effectively that with a better screen and more ports, which is exactly what that kind of buyer needs.


I personally struggle with the 14". It feels too small to be productive on, at least for coding. Anyone else experience this?

And yet the MBA's screen, in comparison, is serviceable and nice but nothing outstanding. That's the case for the MBP 14 (when the 16 is just too large and bulky).


I find it to be the perfect size actually. Easily in a backpack and is light, and can use it on the couch, etc. comfortably. I’d never buy a 16” laptop.


Absolutely love my 14” M2 pro and use it daily for coding. Perfect size/weight for the backpack, and endless battery at the local coffee shop.


The old 15” was pretty much the perfect size. It practically had the footprint of the present 14”, maybe even smaller. Apple made a big deal about how their new chips run so cool, yet they made the pro laptops as fat as they were in 2012 again, so clearly thermals were an issue.


Aren't the new 16" laptops the same dimensions as the old 15" ones? I thought the 16" display was simply because they were able to shrink the bezels on the display enough to get an extra inch on the diagonal. Other than the rounding on the edges, my M2 16" Pro feels about the same size as my old Intel 15" one.


> I personally struggle with the 14". It feels too small to be productive on, at least for coding. Anyone else experience this?

Absolutely not... I've been working on 13"/14" machines for 10 years and never _felt_ that way. But I get that this is personal ;)


I find the 14" perfect, but I also find a tiling window manager (universally) vital.


I feel there is an obvious appeal to the MacBook Pro 14"/16" with M3. It has a good display, lots of battery life, and plenty of performance.

I'm more confused about the "M3 Pro" variant. Its performance seems to be either overkill or not enough. A more sensible lineup to me would be:

M3 - 2 Thunderbolt ports, dual monitor support, memory up to 8-24 GB (2x4, 2x6, 2x8, 2x12, 2x16). In the MacBook Pro, always comes equipped with second-tier upgrades.

M3 Max - 3 Thunderbolt ports, quad monitor support, 32-128 GB (8x4, 8x6, 8x8, 8x12, 8x16).

Then again this wouldn't let Apple upsell people on basic functionality like dual monitor support so they'll never do this.


About the M3 Pro, I’ve heard a theory that it’s most likely due to lower yields at TSMC and the M2 Pro and Max being too similar.

Now it’s clear: if you really need performance, you get an M3 Max.


The most popular Macbook Pro?

Look, I'm a 16" guy myself, I even carried one of the 17" cafeteria trays back in the day… but it's clearly the sweet spot for _most_ people.


It was pretty hard to saturate the memory bandwidth on the M2 on the CPU side (not sure about the GPU).


The GPU can saturate it for sure.

Llama.cpp is a pretty extreme CPU RAM bus saturator, but I dunno how close it gets (and it's kind of irrelevant, because why wouldn't you use the Metal backend?).


Well, Metal can only allocate a smaller portion of “VRAM” to the GPU — about 70% or so; see: https://developer.apple.com/videos/play/tech-talks/10580

If you want to run larger models, then CPU inference is your only choice.
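
If you're curious about the exact cap on a given machine, Metal will report it; a minimal sketch (macOS on Apple Silicon assumed) comparing total unified memory with the GPU's recommended working set:

    import Foundation
    import Metal

    // Minimal sketch: compare physical RAM with the working-set size Metal
    // recommends for the GPU on this machine. The gap is why very large
    // models end up on CPU inference.
    if let device = MTLCreateSystemDefaultDevice() {
        let ramGiB = Double(ProcessInfo.processInfo.physicalMemory) / Double(1 << 30)
        let gpuGiB = Double(device.recommendedMaxWorkingSetSize) / Double(1 << 30)
        print(String(format: "RAM: %.1f GiB, GPU working set: %.1f GiB (~%.0f%%)",
                     ramGiB, gpuGiB, 100 * gpuGiB / ramGiB))
    }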


Aren't these things supposed to have cores dedicated to ML?


You’re thinking of the neural engine. I’m not sure that llama.cpp makes use of this. They’d have to turn it into a CoreML model to do so.


They are not as fast as the GPU (but much lower power).

Also, not many implementations can even use it.


> The M3 Pro loses the option for an 8TB SSD. Likely because it was a low volume part for that spec.

That's not super surprising to me. Apple loves to charge stupid prices for storage and memory. Maybe it's worth it for lots of people to have the convenience of built-in storage at the lower capacities, but I have to imagine that most people who would want 8TB of SSD would rather just get an external solution for... much less.


Yeah, I can imagine that’s an incredibly niche setup. Maybe if you were editing on the go or similar, but even then, TB drives seem like a more pragmatic choice.


I think what Apple is pushing for is computing efficiency. It still gets faster, but with much less power. Focusing solely on performance would be the wrong way to evaluate these chips. https://www.tomsguide.com/news/apple-m3-chip


I think it's a bit more nuanced than that.

There's a reason they didn't just stick an A series chip in their laptops and call it a day - they want more performance even if it comes at the cost of efficiency. It's probably better to say that Apple is pushing performance within a relatively restricted power envelope.

Just to illustrate my point - if the M3 had exactly the same performance as the M1, but with half the power draw, I don't think many people would have been happy, even though it would have been an amazing increase in computing efficiency.


This drives me crazy. Apple plays the market like Nintendo. Pick something that no one cares about, do it better than anyone else, and make a big deal about it.

I dream of a world where, instead of a marketing company becoming a top-3 tech company, a tech company had. I wonder what they would have done with their laptops...

Or maybe this is just an inevitable outcome of capitalism/human biology, where a Veblen-goods company will become a top player in a market.

(I have my own Google and M$ complaints too)


So Apple is the most successful company because they prioritize things that no one cares about?

I dunno, if there were a marketing company that could design the likes of the M-series chips along with the mobile versions, develop a full technology stack from programming language and compiler through custom chips to shipping whole devices at unimaginable scale, it would make me wonder what technology companies were doing.

What other “tech” company really compares from a hardware perspective? Samsung? Dell? AMD? Love them or hate them, there’s no denying that Apple has serious technical chops. One day people will only hate Apple for reasonable things; today’s not that day, apparently.


Apple develops its own OS. Apple develops its own development stack, frameworks, etc. Apple develops its own CPU/GPU architecture. Apple develops its own battery architecture. Apple develops its own tooling to manufacture a lot of its products. Apple develops its own tooling to dispose of its products.

There are very few companies that have as much first party Tech in their products from start to finish.

I think Apple under-prioritizes advanced functionality, but if they’re not a tech company then it’s hard to see what is.


It's probably fairer to say "Apple builds products focused on things I don't care about."

Obviously, other people care.


Who knows... maybe they would be like Google (which I consider a tech/engineering-driven org) and throw away (good) products all the time just "because"?

I think Apple plays the "niche" very well, not only in terms of marketing but also from a technical point of view.


What a weird take. Literally every "tech" company is chasing Apple's silicon but you are trying to claim that they're not a tech company. Let me guess, iPhones and iPads aren't tech either, right?


No, they aren't lol


Considering that Apple has been significantly more innovative (tech-wise) than pretty much all of their competitors, I’m not quite sure what this tells us about them.


marketing


Not really. Or rather, not only. The only two things I hate about Apple’s hardware are the lack of repairability and the price gouging for memory/storage upgrades. Otherwise they are objectively miles ahead of their competition.

Of course, I have no idea why I am taking the effort to respond to an edgy single-word comment…


> Note that memory bandwidth is down. M2 Pro had 200GB/s, M3 Pro only has 150GB/s. M3 Max only has 400GB/s on the higher binned part.

Just contrasting this with the recent Threadripper announcements from AMD: apparently their Pro variants top out (theoretically at least) at around 325 GB/s (non-Pro versions are half of this), so from that perspective alone the M3 Max might be better?

I always work from the naive assumption that keeping the beast, i.e. the CPU, fed with data matters more for overall performance than clock rates etc.
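
Those figures also fall straight out of channel count and memory speed; a rough sketch, assuming 8-channel vs. 4-channel DDR5-5200 for the Pro and non-Pro parts (my assumption, not from the announcement):

    // channels * 8 bytes per channel per transfer * MT/s / 1000 = GB/s
    // Channel counts and DDR5-5200 are assumptions about the Threadripper parts.
    let mtps = 5200.0
    print(8.0 * 8.0 * mtps / 1_000.0) // ~332.8 GB/s -> roughly the "~325 GB/s" Pro figure
    print(4.0 * 8.0 * mtps / 1_000.0) // ~166.4 GB/s -> roughly half, the non-Pro figure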


The 2x USB/Thunderbolt ports are on the same side. :(


Unfortunately, I don't see Apple doing anything here but price discrimination, mostly of the "number goes up" variety.


The Max is probably only going to be in desktops, so it's better to use the die area for things other than E-cores.


Wish that you could get the 16-core CPU with the smaller Max GPU, but alas I just ordered one anyway.



