iPhone 14 Pro faced 'unprecedented' setback leading to removal of new GPU (macrumors.com)
180 points by tosh on Dec 23, 2022 | 238 comments


This seems like an odd use of 'unprecedented'. It's in the same category as "I made an unprecedented dietary decision today and decided to drink chamomile tea instead of mint". Yes, it might be unprecedented because I've never done that before, but it's also super boring. Who cares? So they tried to put too much graphics power in the iPhone and it didn't work. And it's not exactly a setback either. So they had to have a slightly less powerful phone. Whatever.

Sometimes these article titles are ridiculous and I'm quite tired of this hyperbole, especially in tech. What about "iPhone engineers tried a powerful GPU but had to use a less powerful one".


Seems quite reasonable to me: we've got a world-leading chip team that has been doing great work now being responsible for a pretty major cock-up. Sounds like they're not used to these kinds of failures in Apple silicon development.

> And it's not exactly a setback either. So they had to have a slightly less powerful phone. Whatever.

Having a roadmap with a more powerful GPU, with senior management being told it's deliverable right up until very close to the end, is a pretty major setback. Yes, Apple is so dominant and well respected in their market that it's unlikely to have significant impact, but it may produce fewer iPhone 14 Pro sales. Also, Apple have got to that position specifically because they've been doing a great job executing ambitious roadmaps without things like this. So short-term, looking at this product, yes, maybe not that big a deal; long-term they need to ask themselves how they're ensuring this doesn't happen again. Given they're restructuring the team and getting rid of managers, it sounds like Apple leadership takes this pretty seriously.


I wish more companies would "cock-up" by shipping functional products.


And prioritise battery life over faster processors.


“Our product can last all day!”

Doing what? Sitting on the counter? I want a phone that can stream data all day (music and maps) while occasionally taking photos and sending messages.


Get an iPad. I made the mistake of buying one a year or so ago and it completely replaced my personal laptop. It’s basically a massive battery, so it lasts all day and then some.

Also pick up an Apple Pencil and download Procreate, if only for the “well that’s pretty cool” experience.


Even iPads are shit now. I have two 11-inch iPad Pros and they cannot last a week just sitting there. My iPad 2 and 4s? I charge them once every 3-4 months on super light duty. Apple is slowly but surely going down the gutter.

The iPhone 14 Pro is a pretty sad upgrade too, with a regression in battery life from the 13. I've owned almost every iPhone since the 4, and the 14 is the one I skipped because it doesn't even have a SIM slot! Let's see if they can bring it around with the 15.


> I have two iPad 11 Pros and they can not last a week just sitting there.

To be fair to Apple I don't think any company is too concerned with features aimed at people who don't use the product.


He’s saying an iPad has replaced his laptop, and you complain about the battery going empty if it's not used for a week?


They will never have a SIM slot again unless you get the China model, and you don’t need one. eSIM is better.


Physical SIM cards don't make me contact my carrier when I get a new device. What does eSIM offer that beats that?


You can get a data-only eSIM in a couple of clicks for $4.50, for example. I will be doing this in a couple of days. Very useful when traveling to a country where crazy roaming charges apply.


You have to contact the carrier every time you use a cell tower.

Or are you saying you have to contact them to transfer eSIMs to new phones? You don’t have to do that anymore.


> What does eSIM offer that beats that?

"shareholder value" for the carrier.


Perhaps not with this phone, but with other devices it's not uncommon for a manufacturer to work with other companies to make sure there's something to show off on the device.

For a GPU on an oversized phone, you could have been looking at additional video or picture editing operations that did not get released.


> Seems quite reasonable to me, we've got a world-leading chip team who have been doing great work responsible for a pretty major cock-up.

You mean we had a world-beating chip team. I heard that the M1 team pretty much all left.


There is this weird thing where, when mundane things happen but Apple or Tesla is involved, they are suddenly unprecedented and newsworthy.


Stories about popular topics will have a natural leg-up in becoming viral. However, while a mundane story could become popular, it will be hurt by people wondering why they are seeing it, other than simply because it's about a popular topic. So it is common for such stories to also bake in a facade reason for their popularity. No, no, no. You are not seeing this story on your feed simply because it's about Apple. You are seeing it because something 'unprecedented' happened at Apple. And surely the more engagement you see on it, the more unprecedented the thing was! This effect has been weaponized in politics and crypto, where you get a huge base of proponents upvoting anything related to their pet topic.


Exactly. “Engineer Blows Budget” is also known as a day that ends in “y”.


Any hardware product company has to be at least one cycle/model ahead in prototyping and development. If the 13 is launched today, then the 14 should have already finished prototyping and development.

The "unprecedented" is because they found this after the 13 launch, and it's not an integration issue – it's a core issue. It's not possible to go back to architectural drawing boards in the current product iteration. So they had to do a hail mary and revert back to the previous launch's GPU.

That does count as a major goof-up (if it happened, of course). We have many popular YouTube reviewers saying there is no reason to buy the 14 series, that the 13 series is equally as good _(except the Pro Max's Dynamic Island)_. Apple just happens to be in a comfortable position. These kinds of things could kill smaller hardware product companies.


HN should run all headlines through the following GPT prompt: "please rewrite the following headline without hyperbole: $HEADLINE"
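
As a rough sketch of how little glue that would take (assuming the OpenAI Python SDK; the model name and prompt wording here are just placeholders, not anything HN actually runs):

    # Hypothetical headline de-hyper: assumes `pip install openai` and an
    # OPENAI_API_KEY in the environment. Model name is a placeholder.
    from openai import OpenAI

    client = OpenAI()

    def dehype(headline: str) -> str:
        # Ask the model to rewrite the headline without hyperbole.
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{
                "role": "user",
                "content": f"Please rewrite the following headline without hyperbole: {headline}",
            }],
        )
        return response.choices[0].message.content.strip()

    print(dehype("iPhone 14 Pro faced 'unprecedented' setback leading to removal of new GPU"))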


The key thing being that they apparently had a fallback plan, which I assume is always the case for every generation of iPhone. That means that while it might be unprecedented, it apparently wasn't unanticipated.


It wasn't clear from the article that there was a fallback plan; perhaps they decided to use a lightly modified previous-gen GPU because that was the only way to get working silicon in the timeframe?


I suppose you’re right, although I find it hard to believe that wasn’t the explicit backup plan long before they even started investigating the new stuff that ended up not working out.


They discovered the flaw late in their process. Since they discovered it so late, they had to base the iPhone 14’s GPU on the iPhone 13’s. Because of that, the 14 only delivered small graphical improvements. According to benchmarking firms, prior generations showed major leaps over their predecessors.

That’s the unprecedented part - the 14 only showed small gains whereas previous generations all showed big leaps.

In a case like that, unprecedented fits. It’s not hyperbole.


The iPhone is the best smartphone in the world. Its mobile CPU is the best mobile CPU in the world. People are used to Apple pushing the envelope with its mobile SoCs. If they step back and decide to include last gen's GPU on this gen's chip, that is unprecedented and it is significant.


    The incident is reportedly unprecedented in Apple's chip design history
Did ya read the article?


Yes this is the engineering equivalent of putting too much ham on your plate at Christmas dinner. Unprecedented amounts of ham go uneaten!

Early specs of engineering project don’t match final specs, oh the humanity!, more at 11.

News content is frequently very stupid.


I would rather hear headlines about Apple failing at a stretch goal than them just falling behind.


Agreed. When they wrote “missteps”, I figured that there were errors found in the GPU design. Instead, it seems like the design was fine, but had an unexpectedly high power draw. The GPU worked, it just needed more power than they expected.

That doesn’t seem “unprecedented” to me.


Unless they had previously perfectly modeled and predicted power, until this moment. (But that seems unlikely)


Attention to power should be given importance in the design process, especially in mobile devices.


Apparently, every post we write here is unprecedented too.

...except if writing two consecutive identical posts, the second one being then not unprecedented, but making that pair unprecedented.

Ok, I'll stop now:)


Also, am I wrong, or are Apple products suffering more hardware flaws?

When was "It Just Works" abandoned as a guiding principle?


How did you end up in this thread?


How did you end up replying?


I was interested in the topic.

You think nobody should care, but you cared enough to comment, so I was just curious. If I'm not interested in bananas I don't visit threads about them. HN seems mostly rational, so I'm legit interested in what has driven you to click and comment.


How did any of us come to be?


Lynn Margulis knows the answer.


>This seems like an odd use of 'unprecedented'

so then, what was the precedent in an earlier generation of iPhones where a new set of chips was designed and developed which could be marketed as a leap in performance, but shortly before committing to manufacture they were forced to abandon the plan and roll back to the chips of a previous iPhone generation, leaving marketing with no performance boost to trumpet, especially in the context where the previous generation was also considered a disappointingly marginal improvement over its predecessor?


You're technically correct, which is the best kind of correct.


Odd, not false.


Odd, meaning literal, you mean. Odd use of odd.


I disagree. It is not odd to say that an overly literal use of a word is odd.


It's really quite interesting to see that all of these companies that make chips are having varying degrees of issues with making GPUs. Intel's Arc was rumored to have some issue with its silicon, AMD's RDNA3 is this strange failure after RDNA2 shocked the world (and everyone thought they would easily build on that), and Nvidia, while successfully pulling off an incredible generation-to-generation performance uplift with Lovelace, has Jensen telling people that the silicon for cutting-edge graphics is going to be much more expensive from here on out. I'm wondering if it's issues with trying to miniaturize their designs to 4nm and beyond, and if this recent issue with Apple ties into the issue of SRAM not really being able to be miniaturized further using current industry knowledge.


How is RDNA3 a failure?


(from memory) more than twice the FLOPS, 60% more RAM bandwidth, but only 20% more FPS at best.

There's something strange going on.
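
Back-of-the-envelope with those (from-memory, so rough) numbers, the strangeness is basically that frame rate stopped tracking raw throughput:

    # Rough generational ratios quoted above (treat as approximate).
    flops_ratio = 2.0        # a bit more than 2x the FLOPS
    bandwidth_ratio = 1.6    # ~60% more memory bandwidth
    fps_ratio = 1.2          # ~20% more FPS at best

    # If FPS scaled with raw compute, delivered frames per theoretical FLOP
    # would stay flat; instead it drops to roughly 60% of RDNA2's level.
    print(f"FPS per FLOP vs RDNA2: {fps_ratio / flops_ratio:.0%}")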


At the moment the flagships are being matched by the 6950xt in some benchmarks. Hopefully the ‘fine wine’ effect kicks in and the 7900xtx gains on the 4090. The gpu market desperately needs that to happen.


It's significantly less power efficient than was shown off.


This yearly upgrade cycle is terrible. Their phones are fantastic, and now they're stuck in some sort of spiral to push things every year. They could stand to chill out

I got the new 14 Pro, but only because I wanted the mag charger feature. Before that, the 11 Pro was still a solid phone and did everything I wanted and more. I won't be upgrading for at least 3 years.


I disagree. A consistent yearly upgrade cycle is actually a good thing. It allows staggered upgrades for people who do not want to upgrade yearly, but still creates the necessary incentive for incremental improvement.

A new iPhone every 5 years for example, would be dumb. Ignore the hype and marketing and just pick the one you'd like if and when you upgrade.

A tangent:

Personally, I wish more things had yearly upgrades and the backwards/forward compatibility Apple devices generally have. It's a different model, but it would be nice if game consoles had improved iterations yearly. The once a half decade model creates too much of a boom/bust cycle imho. If the Switch had a new, better version every year that was both forward and backward compatible within a "software generation" a lot of games like Pokemon wouldn't look and perform absolutely terribly.


A new iPhone every five years means you’ll have a harder time getting one, like whenever a new video game console gets released.


4 years of development time, 1 year of production to build inventory projected to last 6 months or something.

Take pre-orders a few weeks in advance, then everyone can have it day 1. No scalpers reselling them on eBay for 10 times the cost. No stampede at Apple stores because they already have a few weeks of inventory.

The accelerated product cycle is why you have awful launches like consoles. The product is barely finished and rushed into production in time for Christmas. There's not time to build the actual units. Slow down a bit and you can solve that problem.


Let's put it into perspective. The A11 had 3x less bandwidth, 5x less FLOPS, and a node that's a lot worse (since then we've had N7, N7P, N6, N5 and its many variations). Such big leaps are problematic in terms of project management. Just ask the C++11 people. Or Intel, with their EUV nodes. Or anybody else who decided to shoot for the moon. Having intermediate deliverables has immense value on its own. I'd argue that we are getting much better phones this way than we would have otherwise.


All of those ‘bad’ things in launches serve a marketing purpose. Scarcity makes consumers think something is in demand. Additionally, every day of stockpiled inventory costs money.


Yes, artificial scarcity and the accompanying artificially high prices are a bad thing for consumers, in all situations.

Inventory in a warehouse does not have the kind of cost you think it does. Most of the "cost" people talk about are hypothetical profits that a different product occupying that space might generate in the same time. The real costs are insurance and warehouse leases. Per unit, it's almost nothing.

A company like Apple could easily stockpile months worth of inventory and never even notice the real physical cost of storing it.


If you think there's no value to present money flows, I'd love to take $1MM from you. I'll pay it back later, so there's literally no cost to you, right?

Opportunity costs, or any other "virtual" cost, are still meaningful figures.


Not if we are talking game consoles. Sony and Microsoft sell them at a loss and then recoup that loss with a cut from game sales.


A few weeks of iPhone inventory is going to be in the tens, if not hundreds, of millions.


Perhaps, but a new better one every two years might be pretty compelling.

They sorta do a tick-tock cycle now; generally, it's form factors on the 'tick' and a significant hardware refresh on the 'tock'.


Apple has terrible forwards or backwards compatibility. It is one reason I abandoned them. I wanted off the required upgrade bandwagon when my apps and such started breaking and their UX paradigms kept changing.


I’m still on an iPhone 8 and it’s running smooth and receives the latest iOS updates.

I couldn’t say the same about an Android phone of similar vintage (not counting lineageOS - there I can even still get the latest Android on my Oneplus One, but it’s more rough around the edges compared to the original OS)


i'm curious, which versions, devices and apps did you experience this on?


I used almost all Apple products from 2004 to 2018, when I decided to get off the bandwagon. I started with getting off the iPhone because I wanted a fingerprint sensor and no notch. (The OnePlus 7 Pro had what I wanted.) I hated the touch bar and wanted real F keys because I use those extensively in my IDEs. I also hated fixing my laptop once or twice a year because keys broke. From an OS perspective, iOS upgrades permanently broke apps for me several times when they were not updated by their providers. For macOS, 10.8 was great and I eventually got to 10.12, but I had devices that worked just fine but were 32-bit only. Then Homebrew even stopped supporting old versions, which was the last straw. All in all, Apple just became more and more hostile to me and I am done with them. I keep an iPad solely for ForeFlight, but I am sampling Garmin Pilot too. (It does not appear to be in the same league.)


> Apple devices

Well, not Macs. They've never managed to settle into any kind of consistent upgrade cycle there. In the past I guess they could blame Intel somehow, but they're not even releasing an M2 iMac at all.


> In the past I guess they could blame Intel somehow,

It was fairly easy. Intel's released products often would not meet the original spec, or once they were throttled to fit into spec they didn't have significant performance gains over the prior generation. Often they would go generations without the performance warranting the increased part cost.

> They've never managed to settle into any kind of consistent upgrade cycle there

They would like an annual incremental upgrade cycle and a five year design cycle.

Even without Intel, they still have supply chain constraints. For instance, even with their own silicon, they are relying on TSMC to be able to meet process and production volume targets.

Releasing chips on a new process will likely lead to really odd effects due to yield issues - for instance, having a new CPU show up first in products with lower sales volume. That includes tiering, either with things like a regular vs "pro" phone, or retaining the older model of computer while charging more for the newer version.

Right now they appear to be on an 18 month cycle for Apple silicon, which is driving their new hardware (either with a new machine serving as the launch platform of some new SoC, or a design revamp of a machine based on existing SoC).

Humorously, the only product which has had an incremental update so far is the MacBook Pro 13".

I suspect they'll eventually have a two or three year "architecture" tick for their Mac and iPad SoCs. Depending on those architectures and the upstream changes at TSMC, there might be a lot of different SoCs released against that tick.

> but they're not even releasing an M2 iMac at all.

I'd be really shocked if they don't release a M2 iMac, most likely in March.


iMacs aren't exactly a big seller. There are new macbooks every year.


The thing is the "yearly upgrade cycle" isn't something people need to follow.

People (usually from the Android camp) complain that Apple never innovates or brings anything new, which is kinda true if you only take the delta between the two latest devices.

But they fail to take into account the person going from a phone that gets dropped by the latest iOS release (5+ years old) to the latest and greatest. The leap in features is staggering.


This was me this year. Went from a Late 2013 MacBook Pro that was an OS behind to an M1 Pro Macbook and also went from a 1st Gen iPad Pro to an M2 iPad Air with Magic Keyboard and Pencil 2.

Everything about the MacBook is better than its predecessor but I was particularly blown away by the iPad Air and the handwriting recognition. I put my 1st gen Apple Pencil in the drawer not long after buying the original iPad Pro and eventually sold it. Now I use my 2nd gen pencil every day and I’m even thinking of buying an iPad mini as a really portable digital notebook.

I get the same pleasure again next year when I finally get to upgrade my iPhone 8 Plus. Battery life on my 8 is terrible now and it will get dropped from OS upgrades next year. I’m probably going to upgrade to a Pro Max because I’ve heard the current generation get 48 hours+ battery. But most exciting of all next year: USB-C!


As a counter-point, I went from an original iPhone SE to the 3rd-gen iPhone SE this year because it started having major problems holding a charge for any length of time.

I much preferred the old form factor and would readily give up the faster processor, etc, for it.


Do you still have the old iPhone SE?

Apparently, and this explains my experience, after an OS update the phone will use a lot of battery while it does post processing of any photos you have and other random stuff.

After a full night’s charge, the next morning my iPhone SE battery plummeted so fast that the numbers were ticking down every few seconds.

After some Googling I read about this post-update battery usage issue. I deleted all the photos from my phone and my battery is now fine.

I wonder how many people have bought new phones thinking their phone battery is on the way out when really it is just post-update activity.


> Apparently, and this explains my experience, after an OS update the phone will use a lot of battery while it does post processing of any photos you have and other random stuff.

Photo processing should only happen when plugged in.


Well the phone wasn’t plugged in and processing of photos is the only thing that really makes sense to use all that power after an update. Maybe this is not as predictable as they think… or maybe was a bug. Maybe it was one of those “sleeping” background apps that shouldn’t need to be terminated.


I wonder if that was Apple's CSAM stuff.

Funnily enough, after transferring everything to the new phone, the first thing I did was go through all my photos and start deleting a lot of them. I didn't realize it, but any time a photo is texted to you it saves it (makes sense, I just never thought about it), so I ended up deleting random photos going back years.

The charge port had gotten so finicky I couldn't get it to charge half the time and it was really a matter of economics. Spend $200-300 (or more) asking someone to fix it or spend a bit more and get a new one, so I decided to spend a bit more and get guaranteed good hardware.

I just really lament the form factor change.


> I wonder if that was Apple's CSAM stuff.

Nah, that never got to production. The internet collectively lost their shit and the project was cancelled.

They just teach the image ML model to detect new kinds of stuff and it takes a while for it to go through your library.


Nah, they do a slower cycle for some of the iPads and it results in a model that’s 3 years old still selling and everyone advises to wait a year for the new one because it will be way better.

While the iPhone is always a good time to buy, because next year's one won't obsolete this year's.


Ed Catmull calls this "feeding the beast" and it's one of the biggest barriers to innovation. https://www.youtube.com/watch?v=LafDex0L7FM


> This yearly upgrade cycle is terrible. Their phones are fantastic, and now they're stuck in some sort of spiral to push things every year. They could stand to chill out

They don't have planning meetings once iOS 16 or iPhone 14 ships to decide what to work on next year - like most companies, the products are divided into components, which are divided into features, and those features have multi-year pipelines and land when they are ready (although they often land incrementally).

Slower releases would likely mean more motivation to aggressively push changes out, e.g. your team's changes either ship in 3 months or have to wait two extra years. That means that a slower release cycle can actually decrease quality, and is IMHO one of the reasons that many longtime macOS users have decided they'll never install an OS until the dot-two release.

Even if that isn't the case, slower release cycles mean that developers are less nimble in correcting the problems that inevitably exist.

A non-Apple example - Java before 9 is a great case study of how a poor release process cost insane amounts of efficiency and caused a language to be too slow to take advantage of market opportunities. Their new process is faster and more predictable, and has resulted in significant gains for the language and VM.

> I got the new 14 Pro, but only because I wanted the mag charger feature. Before that, the 11 Pro was still a solid phone and did everything I wanted and more. Won't be upgrading at least 3 years

Sure, most people don't upgrade their hardware every year. That doesn't mean creating and selling new hardware is a bad idea.


They’ve basically killed the cycle, in all but name. Now it’s just steady progress but it’s not too differentiated. I too upgraded to the 14P from 11P, mostly for the MagSafe. MagSafe has been out since the 12 but I couldn’t be bothered until 3 years on. Now I won’t upgrade until USBC or it breaks.

The iPhone 14 (non-Pro) uses much of the iPhone 13's internals, while the Pro gets upgraded. I think this is basically how the Watch already is. I expect this is the future: the upgrades won't really matter and won't really exist beyond slight aesthetic tweaks. It's like cars: no matter which year you get, they all drive just fine and are mostly the same internally. I've long said that cars are the natural comparison for these sorts of electronics. Cars and phones are probably the two most expensive and utilitarian things people regularly buy. They come out with new ones every year, but don't do major refreshes every year, so you don't really have a reason to time when you buy one. I think it also indicates that we'll see a price-stagnating base-model phone and an ever-increasing top-tier model as the companies explore how much people are willing to spend (a Corolla vs a Mercedes).

The nice thing about the steady new versions is that you can always go to the store and buy the biggest-number iPhone version and not worry about lifespan or if it’s good (contrast with a game console: if you buy the PS4 6mo before the PS5 comes out, you made a mistake).


> I think it also indicates that we’ll see a price stagnating base-model phone and an ever increasing top-tier model as the companies explore how much people are willing to spend (a Corolla vs a Mercedes).

I don't think so.

A Mercedes will get you laid, the latest iPhone won't and likely never will.

Cell phones stopped being status symbols long before smartphones were created by Apple, and I don't see them going back to that.


> A Mercedes will get you laid

Only because it’s a sign you have money, not because cars are cool.

> the latest iphone won't and likely never will.

When Apple has a $5k iPhone it may once again become a status symbol.

Almost 50% of Americans have an iPhone of some version. It's obviously not a status symbol if everyone has one. iPhones, like cars, are mostly undifferentiated.

Apple has a lot of room to build a status symbol, but admittedly they haven’t been too successful when trying with the Apple Watch, so I will admit the odds are low they’ll succeed, but I believe the market exists in the long run.


> Almost 50% of Americans have an iPhone of some version.

Smartphones are present in ~85% of US households, and about 45% of US smartphone users have an iPhone; that’s significantly less than 50% of Americans with an iPhone.


I think the issue is that a phone mostly sits in your pocket, a car is clearly visible to everyone when you "enter the room", as it were.

The other issue is that copying it is too easy, so even if Apple tried, the look of it would very quickly get copied.


> A Mercedes will get you laid, the latest iphone won't and likely never will.

Not having one might harm your chances, though.


I just changed my phone after 6 years, but I did not expect the industry to stand still for all that time just because I don't need a new phone every year. Having new products every year does not mean you should upgrade.


Completely agree. 2-4 years would be wonderful if they actually worked out all of the bugs. Instead we get mostly half baked ideas and products instead of stuff that actually works solid.


yeah, I went from a 12 to a 14, and honestly you could swap it back and I wouldn't even notice or care.

maybe this is what we should expect, though, as the tech matures -- like cars: there's a new model every year, but only significant upgrades every half decade or so.


Why did you feel the need to upgrade?


what can i say im a sucker

but mostly 'cuz work paid for it and i could give my old phone to my brother-in-law (who had recently lost his in a hilarious dating mishap -- a hike, a river, old shorts w/ holes in the pockets...).

to be fair, the camera is better -- but im such a pitiful photographer it's wasted on me.

also, the pill sucks compared to the notch. (notch disappears mentally, pill doesn't.)


> I got the new 14 Pro, but only because I wanted the mag charger feature

As long as people are willing to upgrade for minor features, why wouldn't they release a new phone each year?

I say this as a person who would (will) absolutely upgrade just for USB-C hopefully next year.


I have an iPhone XS. I'll be upgrading for USB-C, Globalstar messaging, and 8GB of RAM. All three are required.

I will not play the nickel-and-diming game on DRAM; 6GB on a flagship ain't enough for a five-year phone.


I'm still holding onto my 11 Pro for now, but if the 15 Pro has USB-C I'll happily upgrade. There's nothing on my 11 Pro I'm unhappy with still, though the battery life is noticeably worse than when new by now.


Even stupider is their OS yearly release cycle. Apple OSes are feature-complete. They have been for quite some time. They do need iterative updates to add new APIs for new hardware, but that's really it.


I like to stay on older OSes because I agree they're feature complete and I don't want to lose battery life to the newest ML hotness I probably won't use and won't run efficiently on my old phone.

I've praised Apple for allowing me to do this by continuing to put out security updates for older phones, but the most recent set of pretty important security updates (15.7.2) has artificially been restricted to only devices that don't support 16. This is incredibly frustrating: all the work has been put into making my older device secure without having to update to the latest OS, yet Apple won't let me have the update and I have to go to 16 instead.


Just buy an android. You won’t have to worry about yearly updates because you won’t get any.


I get updates every week or two from LineageOS. I'm sure you'll say that doesn't count, but I specifically pick devices with LineageOS support.


My main phone is a Pixel


Welcome to publicly traded companies and stock-based executive compensation.


It is also the consumer news cycle.

When a company doesn't have a yearly big flagship product launch, people assume something is very very wrong.

Let's pretend Samsung skips releasing a new Galaxy this year. Mum's the word, no news at all, they just don't announce anything.

Well crap, to the average person who wants a new phone, that is scary, they want a phone that is going to be supported, not a phone from a company that might go out of business tomorrow! Or at least stop making phones.

And who wants to buy a 1-year-old phone? Heck, phones have a huge drop-off in purchases within months after release. 6 months on the market and, unless you need a phone, you might as well wait another 6 months and either get the model you're looking at on serious discount, or get something that'll last longer.

And then the tech press will go crazy with speculation! "OMFG END OF THE WORLD FOR THAT COMPANY!!!"

Ok, another fun fact: profitability of consumer electronics, at least in the US, generally relies upon holiday sales. iPhones pop out in September, which is about optimal timing for reviews to hit en masse, to build mindshare, and to have partners do their initial huge full-price sales before holiday promos kick in.

Start releasing every 18 months instead and you fall off that train, your choices are pretty much 12m or 24m cycles to hit that holiday release.


The main reason I go for the newest Android phone possible is to make sure I get updates. Only in the last year have Google and Samsung increased their update commitments to the point that a year-old phone is acceptable.


This seems more like any company simply trying to maximize revenue and profits rather than something to do with being public.


but only public companies have to maximize revenue to the Nth degree, publicly. If the company isn't public, you have a far smaller audience to impress, and one that can be made to understand nuance. Eg how is Dell doing? Few really know since they went private.


That’s a fine argument if you explain why a yearly release cycle hurts long term total revenue as well. To me it seems to help it.


personally i was pretty stoked about the satellite SOS feature and dual band GPS. The huge jump in camera megapixels and car crash detection is also nice.

but if none of that interests you then yes, i guess it could be considered a terrible cycle.


What a silly statement.

Most people aren’t updating every year.

Imagine if they only updated every second year. You would pay full price for a phone that came out 23 months ago?? Don’t be daft.


I really don't get this implication that it's bad to do minor updates to products every year. They're not expecting you to buy one every year (but they hope you do!). It's like I don't complain if they come out with a 2023 Corolla with refined cupholders.


I'm really curious as to why Apple has been unable to reproduce their leap in CPUs in the GPU space.

It's not exactly surprising when Nvidia parts handily beat the M1/M2, but when both Qualcomm and MediaTek have better GPU performance _and_ efficiency [0], something is up, especially given just how far ahead Apple has been in mobile CPUs.

[0] https://twitter.com/Golden_Reviewer/status/16056046174164295...


They have been designing the CPU since A4; the CPU success didn’t materialize from nothing, the M1 is the 10th gen.

They have only been designing the GPU since A11.


No, they started designing them earlier. A11 was the first one that they publicly claimed to be fully in-house. They were substantially (but not wholly) Apple-designed as early as the A8, and they did significant tweaking to generations prior to that.


I wonder how closely related their GPU is to PowerVR these days as well. With both PowerVR and the Asahi GPU driver it would be interesting to see if any of the design still resembles PowerVR.


> I'm really curious as to why Apple has been unable to reproduce their leap in CPUs in the GPU space.

GPUs are highly parallelized and specialized systems. The workloads are already being optimized for the GPU, rather than having a CPU which is being optimized to deal with more arbitrary workloads (with things like branch prediction, superscalar architecture, etc).

So you could say, without creating new instructions to represent the workflow better, there is a fixed amount of digital logic needed to perform the given work, and that translates to a fixed amount of power draw needed on a particular fabrication process.

So Apple could throw more transistors at the problem (with a memory bus that can support the extra need), but the same amount of work still would take the same amount of power and generate the same amount of heat. It is usually far easier and more efficient to create dedicated logic for particular common problems, such as certain ML operations or toward hardware video encoding/decoding.
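
For intuition, the usual first-order model for switching power is P ≈ α·C·V²·f; here is a rough sketch with made-up numbers (none of these are real A16 figures):

    # First-order dynamic (switching) power: P ≈ alpha * C * V^2 * f.
    # All values are illustrative placeholders, not measurements.
    def dynamic_power_w(alpha, capacitance_f, voltage_v, frequency_hz):
        # alpha: activity factor, capacitance_f: total switched capacitance (farads)
        return alpha * capacitance_f * voltage_v ** 2 * frequency_hz

    baseline = dynamic_power_w(0.1, 2e-9, 0.8, 1.3e9)
    # Same node, same voltage and clock, but ~50% more switched logic
    # (e.g. extra GPU hardware such as ray-tracing units):
    bigger = dynamic_power_w(0.1, 3e-9, 0.8, 1.3e9)
    print(baseline, bigger, bigger / baseline)  # power scales with the added logic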

> It's not exactly surprising when Nvidia parts handily beat the M1/M2, but when both Qualcomm and Mediatek have better GPU performance _and_ efficiency [0]

Benchmarks are highly subjective, so I'd wait for more reviews (preferably by people with more established reputations, and perhaps a website). Reviewers who might try to determine _why_ one platform is doing better than another.

GPU benchmarks are even more so, because again the workloads are targeted toward the GPU, while the GPU is also optimized for handling particular workloads. This means that benchmarks can be apples-to-oranges comparisons - even before you find out that a given benchmark was optimized differently for different platforms.

There is also of course the reality that some vendors will optimize their code for the benchmarks specifically, going as far as to overclock the chip or to skip requested instructions when a particular benchmark is detected.


The thing is that mobile GPUs are hardly utilized unless they end up in something like the Oculus Quest or Nintendo Switch.


Is Apple really that far ahead in mobile CPU, or is it just a node issue?

https://www.notebookcheck.net/Qualcomm-Snapdragon-8-Gen-2-be...


Does Apple need to catch up with Qualcomm and MediaTek in terms of raw GPU performance when Apple can optimize the software and APIs given to developers to work on its hardware? Or am I really out of date, and is there public evidence of Qualcomm and MediaTek outperforming Apple's hardware in real-world workloads?

Nvidia primarily makes add-on GPUs, if I understand their business correctly. Apple integrated a GPU onto its M2 (or whichever chip is used in their Studio) that performs comparably to the 3060, and even beat the 3090 in some benchmarks/workloads. I think that's pretty impressive.


This isn’t at all true despite Apple’s marketing. The M2 gets trounced by the 3060 in any graphical benchmark other than power draw; comparing it to a 3090 is just laughable.

https://nanoreview.net/en/laptop-compare/razer-blade-15-2022...

Like I absolutely love my M2 air, it’s the best laptop I’ve ever owned but it is definitely not a competitive gaming machine.


> comparing it to a 3090 is just laughable.

The idea of trying to fit a 3090 in a laptop is amusing.


That’s the point the comment you’re replying to is making, just with more words.


The original topic of conversation was:

> Nvidia primarily makes add-on GPUs, if I understand their business correctly. Apple integrated a GPU onto its M2 (or whichever chip is used in their Studio) that performs comparably to the 3060, and even beat the 3090 in some benchmarks/workloads. I think that's pretty impressive.

The form-factor of the 3090 isn't relevant.


Kind of like how the form factor of the Space Shuttle doesn't matter when comparing its peak speed and cargo capacity to my pickup truck.


It's more the fact that we're talking about Apple catching up at all. Android SoCs have been generationally behind Apple for a long time (and MediaTek in particular as a "budget" option), but now in the GPU space that is reversed.

The situation on the desktop/laptop is muddied by CUDA and other Nvidia-exclusive tech - while the M1/M2s indeed trade blows with laptop parts like the 3060 in some specific tasks, once CUDA comes into play Nvidia walks it (unfortunately, IMO; even AMD can't compete there and it's holding the industry back)


> beat the the 3090 in some benchmarks/workloads

Did it actually do that or was it in the "performance per watt" comparison?


Nah, it gets 10x fewer fps in anything, if you can even run it. Laughable comparison, really, given the disparity of the two.

This isn't an ARM vs AMD64 competition where Apple has a 40 year instruction set advantage it can exploit. The 3090 is nearly state of the art.


The official marketing comparison was to a mobile 3090, not a desktop 3090. Completely different GPU.


There isn't a 10x performance difference between the desktop and the mobile 3090, but nice try, Tim.


i think m1 was a big boost because of risc--compilers have gotten really good, and cpu pipelining has been well researched/developed, so there was a lot of performance to be harvested by putting everything together.

gpus, on the other hand, are already risc. so where is apple going to improve? not by integrating everything: lots of companies have done this for years and years. if you want to do more with the same transistors, you'll need an even more clever execution model...


This is not correct. M1 is designed to take advantage of being RISC, but that doesn't mean it was fast because it went RISC.


As opposed to x86 processors which are designed for cisc, but just not to take advantage of it?


No, they do. It's just that x86 processors are currently built by people who did a worse job overall.


GPUs are more power-dense. Battery power or thermal envelopes limit what they can pull off.


You know what'd be better than a new GPU? USB-C with full speed USB 3.2/3.1 and Thunderbolt 3.


Why? Most people don't care about this and only use the port for charging.


Most people care about having cables that actually work and don't cost a ton to replace


And they’ve had that continuously since the iPhone came out, so that's a distraction from the real reason the EU acted: incompatibility. Lightning is a better connector, but it's not better enough to be worth so many people having cables which only work with one family of devices, especially since it's not even used on Apple's own laptops.


I found Lightning worse than USB-C. So far none of my USB-C cables have broken; however, I already broke one Lightning cable and my current one only charges from one side. I sometimes plug it in with the wrong side up and it doesn’t charge my phone. Then I have to unplug it, turn it 180 degrees and plug it in again.

Anyway, these issues could be user issues or bad luck (so please don’t comment on this), but I have one nitpick which isn’t:

USB 2.0 speeds for ProRes footage are a joke. Why even praise your camera and the pro resolution format being introduced on the iPhone when I can’t get the footage onto my PC via cable at decent speeds? I only used it once for a side project and had ~30 min of footage. It took about 1.5 hours to copy.
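
The math roughly checks out (assuming Apple's commonly quoted ~6 GB per minute for 4K30 ProRes and a real-world USB 2.0 throughput of ~40 MB/s; both numbers are approximate):

    # Rough estimate of copying ~30 min of ProRes over USB 2.0.
    footage_minutes = 30
    gb_per_minute = 6              # approximate 4K30 ProRes data rate
    usb2_mb_per_s = 40             # realistic USB 2.0 throughput

    total_mb = footage_minutes * gb_per_minute * 1000
    hours = total_mb / usb2_mb_per_s / 3600
    print(f"~{total_mb / 1000:.0f} GB -> about {hours:.1f} hours")  # roughly 1.2 hours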


Lightning does not mandate USB 2.0 speed - there have been lightning ports on iPad Pro models which were capable of 3.0.

Likewise, USB-C ports on a spec-compliant device do not indicate greater than USB 2.0 speeds (e.g. high speed), nor do cables with USB-C ports on both ends indicate greater than high speed.

In fact, there are plenty of devices with USB-C ports which do not support data channels at all (but these tend to have compatibility issues due to not being able to negotiate power delivery)


It’s actually a hassle having to keep two types of cables around.


Interestingly, USB-C is one of the reasons my wife and I switched away from Android.

Don't get me wrong, USB-C is ideal for charging. However, it _always_ became extremely fickle with Android Auto. The connector would develop wiggle over time, resulting in an unstable connection. You'd have to work voodoo to get it positioned correctly - which would often result in even more strain and deterioration. My most recent phone became completely unusable with Android Auto after about 6 months.


My Pixel 2's USB-C still works fine despite my partner constantly sitting on the damn thing when plugged in. Of course my Pixel 5 is also in fine shape, but that one is much younger (2 years), so we will see how it ages. Why not use Android Auto wirelessly? If your car doesn't support it, then get a Motorola wireless adapter.


I have a box at work with, quite literally, probably 50+ different manufacturers' 'USB-C' standard cables. Some will work with certain docks. Some won't, but will work with Phone A and Monitor C, both of which use the exact same level of speed. Cost isn't a good signal. It's a complete cluster.


This will probably happen thanks to the EU.


USB Full Speed is like 12 Mbps and has been supported since the 1998 iMac.

3.2/3.1 aren't speeds, they are USB spec numbers. Those specs don't mandate the speeds you think they do.
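
For reference (the naming really is confusing), the marketing labels map to maximum signalling rates roughly like this:

    # USB spec labels vs. the top signalling rate each mode defines (Mbps).
    # A port labelled "USB 3.2" may still only implement Gen 1.
    usb_speeds_mbps = {
        "USB 1.x Full Speed": 12,
        "USB 2.0 Hi-Speed": 480,
        "USB 3.2 Gen 1 (was USB 3.0 / 3.1 Gen 1)": 5_000,
        "USB 3.2 Gen 2 (was USB 3.1 Gen 2)": 10_000,
        "USB 3.2 Gen 2x2": 20_000,
    }
    for label, mbps in usb_speeds_mbps.items():
        print(f"{label}: {mbps} Mbps")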


I have never once connected my phone to my computer, with a cable, besides for charging. What am I missing out on? I have a large iCloud storage plan, so maybe that’s related?


If you take 4k HDR video the files are huge and take ages to get off the phone via USB 2.0. Same with lots of 48 MP HDR photos.

It’s easier to let the phone upload them to iCloud and then download from there, but that shouldn’t be necessary (and requires a good internet connection).

I don’t think it’s a problem for normal people’s workflows, more if you’re trying to use them for professional video/camera things.


I think your average person, familiar with Apple products, would use AirDrop. There’s no sanity in bouncing off of iCloud. That’ll be many times slower than USB for most asymmetric internet connections.

USB 3 will be a couple of times faster than Wi-Fi 6 used between modern Apple devices, so I see how that would be beneficial for people editing.


Some people don't use a Mac, so AirDrop is not an option.

With gigabit internet, iCloud can be faster than USB 2.0.


There are many apps that allow transferring files over WiFi, including to a windows computer. It never makes sense to bounce things off of iCloud, if time matters. You can’t start the receive until the upload is complete, and if you have Wi-Fi 6, gigabit internet would still be a bottleneck.


> There are many apps that allow transferring files over WiFi, including to a windows computer.

And they're a lot more awkward than using a cable.

They also might not even have a speed advantage over USB 2.0 if the router is in a different room.


Network tethering, scrcpy, and dead-simple very fast file transfers are three things I miss from Android. I think KDE Connect is faster wired too (eg to use your phone as a USB keyboard or mouse.)


You don’t need a cable for network tethering or file transfers on iPhones.


You don’t _need_ to, but it’s better and more reliable.


the phone also charges while tethered which is a plus


Even on Android I’ve never had a good time with file transfers. MTP is just such a flaky protocol.


Android Auto/Apple CarPlay are the only things I ever connect for data transfer.


Hardware development is brutal because there's long lead times. It is a regular occurrence that features ship in a broken state and the software (drivers, firmware, etc) either has to work around it or turn it off. That it was so bad they had to revert to the previous generation design sounds unusual, but it's not unheard of.


Compared to the delays that Intel have seen in recent years it's honestly quite impressive that Apple have managed to get new silicon every year largely on time.


I wonder if this was planned to use the PowerVR ray tracing solution. I believe Apple and Imagination Tech quietly buried the hatchet some time ago.

I also wonder how this will work for the Metal API. Metal currently does have support for accelerating ray tracing via acceleration structures and a few other things that require GPU hardware features not present on older GPUs but not specific to ray tracing. It does not have support for ray-tracing-specific hardware acceleration like in Nvidia's RTX and AMD's RDNA2 chips. This means you can buy a 6000-series AMD GPU, stick it in an Intel Mac Pro, and not actually be able to use the RT acceleration hardware!


Metal does have that support on the API side afaik.

Just that they didn't implement the raytracing support in the AMD driver side...


Metal ray tracing is a thing and it's the officially supported API. I think AMD just didn't bother to write the driver, since their GPUs are pretty much viewed as legacy by Apple.


Metal API. I cannot understand why they did not just go with Vulkan.


Metal predates Vulkan by almost 2 years.


It predates the name change from Mantle to Vulkan by almost two years. Metal was built using Mantle's ideas. When AMD gave it to Khronos, they changed the name to Vulkan.

Here is a comment from 2016 from a person already tired of explaining this to people. https://news.ycombinator.com/item?id=11112078


Mantle beat Metal to the press-release stage, but Metal shipped before Mantle did.

Mantle morphed into Vulkan even later than that.


That has no citations and is pretty ranty. Given the timing, it seems more likely that all of the people in the industry working on similar problems identified the same problems with the previous generation APIs and since they all work with the same major developers they're going to be coming up with similar solutions.


At the time, Apple was using AMD GPUs in their macbooks.


Also their desktops. My point was just that they’re similar but not the same and everything in that space was evolving in the same directions for the same reasons. If Apple was ripping it off I’d expect them to be more similar and further apart in time.


I’m sorry but you’re quite wrong that Mantle was a simple name change to Vulkan.

The Mantle API was a foundation, but what came out was quite different.

Much of the Mantle team also went on to work on DX12 and Metal as well, so they have as much lineage to Mantle as Vulkan does

The comment you link to also ignores all the context of the timing, including the significant amount of time between when Metal was released and when Mantle was donated to Khronos, and how much Khronos was pushing AZDO OpenGL at the time. The post you linked is quite biased and uninformed.


Better still, AMD was the GPU vendor for MacBooks at the time of Mantle's release. Sounds like classic Apple to me.


That's still no reason to keep Metal alive and ignore Vulkan. It's a huge competitive disadvantage by now. Put Metal on life support, implement Vulkan, and transition to sitting on a Vulkan translation layer. Profit!


Because Vulkan didn’t exist when Apple developed Metal.


Anyway, MoltenVK is a thing.


Metal is missing enough features that wrapping from Vulkan will never be ideal.


Calling the addition of ray tracing to an iPhone "overly ambitious" is one way of putting it. On that size of display, is it even impressive to the user? Are devs going to design to that capability? I just don't see the point when a lot of desktop users aren't in possession of nice enough graphics cards to enable ray tracing.


iPhone chips have a long lifecycle and a large user base.

The iPhone is one of the most popular devices in the world, and they have a 5+ year lifespan between yearly upgrades to people who buy secondhand years later. So an iPhone 14 chip will find use through 2027 or later.

The iPhone chips periodically also end up in lower end iPads, Apple TVs, HomePods, etc.

Beyond that, the engineering efforts impact all apple silicon, including Mac, so the efforts can apply to more devices.


Seems kind of assbackwards to start baking in ray tracing to the smallest device. Is there some advantage to this?


One possibility is that it doesn’t make sense financially on any other platform for devs.

macOS games are dead, and while iPad games are doing better, adding a complex feature for a small group of iPad users still might not make financial sense.


Simulation fail? I'm surprised they didn't know exactly how much power it would draw; that seems to be a standard thing in chip design suites. If a company like Synopsys blew it, I'd expect some financial recovery actions.

That said, unprecedented? Pretty much every chip company I know of has made a chip that didn't meet thermals. The Pentium 5 was the poster child for a while of reaching too far.


I wonder if A16 was backported from N3 to N4 which increased power consumption.


Piling on the speculation: the dynamic power draw when ray tracing was enabled, especially the memory access patterns - and the poor performance in spite of the increased power draw - might have been the final blow.

But if an entire h/w project is punted to "the next release", there's more than just one feature that contributed. So maybe the halo feature, ray tracing, plus the other new Metal 3 features all contributed?


The article doesn't say it was an issue with the tools; it sounds like the design was too ambitious.

Leakage power is not hard to quantify, but dynamic power is harder to evaluate, because it's workload-dependent. It's not always completely straightforward to get an answer.


Perhaps it used a new unproven process, making power dissipation difficult to predict?


Sounds like they just discovered it late in the design process, not in silicon.


One would think that the makers of smartphone chips would be particularly good at and experienced with calculating power draw.


I wonder how much TSMC pushing back 3nm plays into this? Weren't they projecting 3nm risk production to start at the tail end of 2021 originally?


Amazing that with all this drama it never made it to the customer facing side or had any impact on the success of the launch. Truly an outlier among high performing companies.


Exactly my thought. Another was: Because Apple shares technologies between platforms, I wonder if this GPU development hiccup had any bearing on the apparent delay of refreshed MacBook Pros and the Apple Silicon Mac Pro.


Mac Pro

I wonder if this was a bigger setback for the Mac Pro than anything else.


This article is about the A16 but the M2 family is based on the A15.


At this point, wouldn't Apple want the Mac Pro to be based on its most cutting-edge work?


Every year there is new cutting-edge work.


Oh that’s an interesting angle.


The M1 was such a groundbreaking chip. I don’t expect Apple to wow us again until the M4 or M5. This is a good time for Apple to pause, look at what Qualcomm, AMD and Intel are doing and see how they can attract the same class of talent that gave birth to the M1 again.


I wouldn't expect another M chip to be groundbreaking in the same way ever again. Instead I just look forward to incremental improvement year after year, the same with the A chips. That's more than enough.

Something similarly groundbreaking would require a similarly radical redesign. Maybe there will be one another couple of decades from now, but I can't even begin to imagine what it would be. Maybe something related to ML, if anything.


> That's more than enough.

Sure, but already the second gen didn't show gains _that_ impressive over the first one.

If M1 successors start showing Intel-2010s-style gains gen over gen, it's not gonna go great.


> Sure but..already the second gen didn't show gains _that_ impressive over the first one.

I don’t think performance is the only metric. For all we know, there could have been some considerable cost reduction.


How well the M series will do over time would be based on its comparative performance. For all the “not that impressive” gains of the M2, there still seem to be plenty of people that historically wouldn’t buy an Apple computer but are ‘begrudgingly’ buying Apple laptops due to M series performance. Have we seen anything that delivers a comparable value proposition, let alone a better one?

It seems natural that M series chips won’t forever be uniquely positioned ahead of the pack. I don’t think that anyone besides Apple actually wants that in the first place though. I have every expectation that they’ll keep up with the pack, or simply move back to third party chips, even if that means changing architectures, which Apple is not scared of doing.


Just got an M1 MBP from work and... I am staggered. It's _so quiet_. There's no heat. The battery is still above 50% and I've been using it for nearly two days!!

Just... staggered. What an upgrade.


Aren’t most hardware teams typically very old in terms of having lots of people with a lot of experience who have been working together for a long time? It feels to me like it’s not as simple as attracting talent.


> Aren’t most hardware teams typically very old in terms of having lots of people with a lot of experience who have been working together for a long time?

Yes, that was my experience with well-established players like AMD, Nvidia and Qualcomm.

Then again, there just aren't that many people in the world with experience designing GPUs or writing systems software for them. What you would often see is a manager and their whole team switching to a competitor. People prefer working with other people they already know and trust.

In addition, GPUs are massively complex and it simply takes a good amount of time to understand a new architecture well enough to make substantial contributions. I spent five years at NVidia and felt that I had only scratched the surface.


I don't expect them to wow us again. Most of their gains were from buying the best TSMC node and putting memory on-package.


Any tech advancement can be reduced to “they just did this extra thing”. If nobody else will or can do that extra thing, it’s a big deal.


It is a big deal, but throwing money at the technology provides limited runway as a strategy for impressive gen over gen uplift


Well, buying the leading node is big $ that not everyone can afford. Putting memory on package runs counter to repairability, modularity, etc.

When you're as vertically integrated and rich as Apple it makes sense.


> and rich as Apple

Being rich isn’t really related; it’s that they can still profit while doing this (some people say “sheep” while others say “good products with great integration”), where nobody else can, which makes them rich. ;)


> Well, buying the leading node is big $ that not everyone can afford.

It isn't just that Apple can afford a particular part. Apple funds some of the R&D for that part and the building of the factory (sometimes multiple factories) that creates it, and will commit to buying 100% of its capacity for years.

They aren't just outbidding others for capacity. They are making long-term strategic partnerships to _build_ that capacity.

> Putting memory on package runs counter to repairability, modularity, etc.

Everything is a trade-off. For traditional OEM integration for desktops and laptops, memory on package would be too limiting. Apple is one of the few companies that can pull it off.


> putting memory on-package.

Sure, but do we expect AMD/Intel to do this for their processors anytime soon?

Apple can decide that their custom processor only needs two memory configurations to meet their product need.

Intel and AMD can't easily make that decision for all the OEMs in the world, and the architectural ramifications pretty much demand deciding one way or the other for a given chip generation.


lol, so that's all it took to more than double battery life? Why doesn't AMD just do that for x64 then?


The M1 is notably possible because of the much more tightly integrated RAM & CPU, providing a huge boost in memory throughput.

It's definitely possible that with stacked chips we'll see a similar kind of process-and-packaging-driven boost.


The M1 is 'just' an evolutionary change from the mobile processors it was based on, which were mostly evolutionary changes from their predecessor.

I'd expect more revolutionary changes to not come from digital logic improvements, but rather:

1. Adding specialized silicon for specific tasks.

2. Manufacturing changes, such as moving to chiplets, allowing for a wider variety of processors.


The M1 Pro and M1 Max were designed in Israel, so it's not certain whether Apple has lost designers there.

"...as well as the integrated circuits that were developed in Israel, and the jewel in the crown: the Israeli team played a central role in developing the premium version of the company's flagship M1 processor, including the M1Pro and M1Max chips designed to support premium Mac computers such as MacBook Pro and MacBook Studio. These chips were built here in Israel while working with other teams worldwide, including at the headquarters in Cupertino. The integration with the verification applications and processes was also carried out here."

https://en.globes.co.il/en/article-apple-to-open-jerusalem-d...


Apple is adding a GPU upgrade to iPhone?

I really want them to compete with Nvidia and release a brand-new, efficient but affordable GPU for the Mac, so that I can do ML training on a Mac and not rely on Nvidia and Windows. It would also help people who want to use a Mac for gaming.


Apple bases the M series off of the A series. So the M1 is based on the A14 while the M2 is based on the A15. If this report is correct, then that means the M3 was supposed to get ray tracing.


I wonder if the M3 could still get raytracing since the power budget is higher...


The M1 Ultra is already "equivalent to a 3090" (lol) so maybe the M2 Ultra will be "equivalent" to a 4090 (lmao).


Did you miss the M1 / M2 series?


Sounds like a great opportunity for a desktop machine where power usage matters much less.


FTA: 'Apple engineers were "too ambitious"'. Yeah, right. I'm sure management had nothing to do with pushing unrealistically time-framed specs down to engineering.


And then later the article says:

> The error resulted in Apple restructuring its graphics processor team and moving some managers away from the project.

Not suggesting this is or isn’t it, but the manager thing cuts both ways. I’ve had managers who push overly ambitious goals, and managers who defer to the overly ambitious engineers because they seem confident enough and the manager has failed to understand the engineer’s capabilities.

I’ve experienced many engineers who are waaaaay too ambitious and really do not comfortably understand the true development cost of everything. They deliver the first 80% on time, and the second 80% puts the project six months behind.


But then they deliver 160%. Sounds good!


Moving experienced team leads out of their (presumed) area of expertise seems like a strange reaction to me. Everybody loses if that's true.


The phone comes out once a year; anything that doesn’t make it can just go in next year.


I'm curious what software features were planned that depended on the high-power GPU, which I guess must have been scrapped.

Usually Apple does not ship a new hardware feature without some software feature that showcases it.

At one point Apple seemed all-in on using dot-matrix projections to scan 3D spaces. I wonder if this GPU would have enhanced that capability somehow. Or perhaps it was simply supposed to be the icebreaker for a future M3 chip of some kind.


For the past few years there have been regular rumors that Apple will launch AR glasses and/or a VR headset "really soon".

AR goggles would definitely need to be lightweight, so they would probably rely on an iPhone to do all the processing.


Question: If even most PC gamers are pretty take-it-or-leave-it on the whole real-time raytracing thing in AAA PC games, it being more of a flex than a must-have, then what's the point of having this tech on a phone for iOS games?

I don't game on my phone, so am I missing something wild here where seeing raytraced reflections on a six inch screen would be the ultimate game changer and have everyone rush out to upgrade their phones?


I wonder if there might be applications for accelerated raytracing in augmented reality? There have been a handful of publications suggesting it for a while, such as https://ieeexplore.ieee.org/document/6297569 .

This demo video is pretty convincing: https://www.youtube.com/watch?v=i2MEwVZzDaA - looks like it was part of a PhD thesis https://www.peterkan.com/download/kan_phd_thesis.pdf


You're missing the fact that Apple uses the same building blocks for their chips across all of their devices.

The M1/M2 are "just" a bunch of the same CPU cores found in A14 and A15.

They do the same with the GPU blocks — the GPUs in M1/M1 Pro/M1 Max/M1 Ultra (and presumably M2/3/4 etc. iterations of those chips) also have the same GPU cores, "just" more of them (and probably clocked differently, not sure on that).

So missing the RT hardware for the iPhone 14 Pro is probably not a huge deal. Missing it for the A16 tape-out, which, if the pattern holds, means it'll also be missing in the M3 generation of chips, is a much bigger one.


I don't think your premise is true. PC games look great on RTX cards. People like it; they just have trouble affording cards that support it. So there is limited appetite for developers to support ray tracing below AAA. Getting the tech on iPhones would change that.


It's a bit more complex than that. Hardware raytracing is still so slow that it requires sophisticated ray allocation and denoising schemes to look good. These are where the real implementation effort goes in all contemporary hardware-accelerated raytracing implementations. Nvidia has DLSS as a very good and rather versatile denoiser, but it's not generally available. You need to go through an application process and be considered worthy (by whatever criteria they have).

So the bottom line is that not a lot of developers can afford to tackle hardware raytracing as a feature at this point. It's too expensive for them to get right.


I agree. Metal's new upscaling is Apple's answer to DLSS, but of course a) it's only just arriving now, b) Apple will never offer optimizations customized to individual games the way Nvidia works with developers. So, we should expect no application process as they keep rolling out better hardware and software, but less drastic results in exchange for that. And the same story will be with improved ray tracing capabilities, except in that case I'm more confident that Apple will release a GPU that can do a certain amount of it usefully for the usual reasons (known hardware to support, lower resolutions.) Nvidia should also be doing that right now, but they've chosen quite a different pricing/availability strategy than Apple has in the last few years of GPU development.


> Apple will never offer optimizations customized to individual games the way Nvidia works with developers.

My understanding is that Metal (and Vulkan) are really meant to be about _not_ doing this, and instead exposing something much closer to the underlying hardware. So it would instead be Nvidia developer relations working with a game studio to optimize the studio's engine code for Nvidia cards.
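
To illustrate what "closer to the hardware" means, here's a rough, hypothetical sketch in Swift of what I believe the explicit Metal 3 ray-tracing setup looks like (buffer contents and intersection shaders omitted); the app allocates and builds the acceleration structure itself rather than relying on driver-side, per-game tuning:

    import Metal

    // Hypothetical fragment: describe triangle geometry explicitly.
    // A real engine would fill in vertexBuffer, vertexStride, triangleCount, etc.
    let geometry = MTLAccelerationStructureTriangleGeometryDescriptor()

    let descriptor = MTLPrimitiveAccelerationStructureDescriptor()
    descriptor.geometryDescriptors = [geometry]

    // The app queries sizes and allocates the acceleration structure itself,
    // then encodes the build on a command buffer it owns.
    let device = MTLCreateSystemDefaultDevice()!
    let sizes = device.accelerationStructureSizes(descriptor: descriptor)
    let accel = device.makeAccelerationStructure(size: sizes.accelerationStructureSize)

The point being that the API hands the app the raw building blocks; any per-title optimization happens in the engine, not in a vendor driver profile.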


>PC games look great on RTX cards

This is highly debatable. LTT did a video of a blind test where people were shown games with RTX on and off and were asked which looked better, and most couldn't tell the RTX one was better looking.

Plus, RTX on a 24"+ screen and on a 6" screen are completely different things. If people can't tell whether RTX is on on a large PC monitor, what are the chances they'd actually see any difference on a tiny phone screen?


Machine learning and AR are far more interesting applications of the GPU than high resolution traditional games on a tiny cell phone screen.


What does this have to do with raytracing, or the lack thereof, on an iPhone GPU?


In my mind it would basically be UI skeuomorphism 2.0, this time with photo-realistically rendered materials, lighting, and shadows.


> The error resulted in Apple restructuring its graphics processor team and moving some managers away from the project, including the exit of key figures that apparently contributed to Apple's emergence as a chip design leader.

This sounds like an IQ test to me.

You have a team with a good track record of making breakthroughs, breakthroughs that made your trillion-dollar market cap possible. They tried to make another one but weren't successful this time. So you remove those key figures from the project after dozens have already walked away in the last few years?

You need to have a <100 IQ to cook up a paywalled story like the above.


Oh dear. I hope humanity and civilization can recover from this "unprecedented" turn of events.


My sadness is unprecedented - I can’t get a ray traced giant bag of gems for $20.


No, you just can't get it in real time. But you can get it in my app for the much lower price of just $0.50/frame. It's a bargain!


I wish HN had a way to mute submissions by certain users. That way I could be blissfully unaware of certain categories of tabloid news stories in here.



