> Amazon, for instance, hired about 780,000 people during the pandemic. Meta, too, had more than doubled its staff in this period, going from about 40,000 to 87,000. Even Microsoft had hired about 77,000 people just before the pandemic.
> Apple, in comparison, is believed to have hired under 20,000 people during the period of the pandemic.
Can't lay off people you didn't hire.
Besides, if they lay anyone off, it'll be through retail (already happening in their third-party channel sales staff, who are employed by Apple but work in stores like Best Buy) and eventually retail closures when/if they can do it without being accused of more union-busting.
Global workforce as of end of 2021: 44.8% women, 55.2% men
Only field and customer support: 48.3% women, 51.7% men
Corporate Employees: 32.8% women, 67.2% men
ty for actually doing the math. Saying Amazon only hired warehouse workers as part of their headcount doubling was complete bs. Their new grad pipeline alone probably accounted for more than 5-10k hires during that period (assuming ~20% corp employees and ~780k new hires).
What I calculated is the absolute ratio. When looking at new hires you need to look at turnover. The turnover rate is significantly higher for field jobs (sometimes intentionally, e.g. Amazon always hires more people before the holidays for a fixed term). Therefore, when looking at new hires, the ratio is skewed even more towards field workers.
At the end of the day, Amazon has around 300K corp employees, and of those, they are laying off 18K+. This 6% ratio is similar to other tech companies.
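The arithmetic in that comment, made explicit (both figures are the commenter's rough estimates, not official numbers):

```python
# Back-of-envelope check of the layoff ratio claimed above.
corp_employees = 300_000   # estimated Amazon corporate headcount
laid_off = 18_000          # announced layoffs

ratio = laid_off / corp_employees
print(f"{ratio:.1%}")  # 6.0%
```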
Cook has made several decisions in the past handful of years that seem to show an ability to resist the groupthink that exists among large tech companies' leadership.
No scandals, no twitter addiction, no sexual assault cases. No bullshit. All this guy does is wake up at 4, work till 10, and sleep. Cuts his own pay voluntarily. Didn’t double his headcount while revenues fell like so many. Tim Cook is such a beast. Jobs left one last blessing for Apple shareholders in his decision to elevate Cook.
> No scandals, no twitter addiction, no sexual assault cases. No bullshit.
It’s unfortunate (and it says a lot about our current situation) that that behaviour is worthy of note.
> Jobs left one last blessing for Apple shareholders in his decision to elevate Cook.
I don’t understand the obsession with shareholders. It’s all about money money money, good products and respect for the customer be damned. Cook was a fantastic choice to fill Apple’s coffers, but an awful one for those who identify with Apple’s (old) values of making something good.
Yes, I agree that hardware-wise they’re on top of their game¹. I was specifically thinking of software and services, like the trend of trying to upsell you to their services when using their apps or releasing half-baked apps which go against their own Human Interface Guidelines and are never fixed.
¹ Let’s see if that continues with Evans Hankey leaving.
Apple is a public company, and public companies are accountable to shareholders, not customers. Shareholders want it to be worth their while, so yes it's about the money. It is possible to do what's best for both, and Apple still does that plenty, but the customer of today isn't the customer of the Apple you're describing. I wish they focused more on the Mac line, and the M1 and M2 are returns to that, but Apple's making money off iPhones, not Macs.
> Apple is a public company, and public companies are accountable to shareholders, not customers.
Tim Cook himself, a decade ago, might have had something to say about that¹.
> It is possible to do what's best for both, and Apple still does that plenty
My objection is the trend more than the current state. A decade ago it would have been unthinkable to me that Apple would ever have ads in any OS. Today, I’m dreading the moment they go “full Windows” and start shoving third-party ads in Spotlight, then Safari, then the Dock².
Sure I think there are senses in which Tim Cook is accountable to customers, or maybe his own moral compass (as might be the case here), but in a specific and direct sense the board that controls whether he keeps his job or not is legally bound to act for the shareholders.
However, I don't think Tim Cook in that moment was necessarily saying that accessibility was non-negotiable, shareholders be damned; it's also possible he was taking a broad view of shareholder interest that is about more than immediate ROI.
> the board that controls whether he keeps his job or not
A job he doesn’t need¹ and which he’s aware he’ll leave in the not-too-distant future.² The only thing stopping Cook from making different decisions is himself.
> Apple isn’t an advertising company. It’s a hardware company that also does software.
They could be a noodle shop. The distinction isn’t relevant to the point: they are shoving ads in our faces, in their own apps, in their own OSs, always pushing you to the subscriptions. Everything Cook does indicates it will only get worse.
They're expanding into advertising as a revenue source. This undercuts google on their platform, but it does mean that advertising revenue will become something they want to protect. Google makes lots of money advertising and it's possible this could become a major (thinking around 30-40%) portion of Apple's revenue.
Link is interesting and a good response from Cook. I was hoping it would be more of a complete rejection of the terribly destructive Friedman doctrine of shareholder supremacy.
Because the shareholders are the owners. And I don't think it's all about money (in the short term). I believe a significant part of Apple shareholders understand the importance of brand reputation and are willing to sacrifice short term gains for sustainable long term gains.
Customers have a dozen companies to choose from that would respect them. Or are you saying Apple customers are specifically those who do not like respect nowadays.
> Or are you saying Apple customers are specifically those who do not like respect nowadays.
That’s such a leading question, I’m genuinely wondering if you’re arguing in good faith.
I’m saying neither of those. I’m talking about how Apple’s leadership transition from Jobs to Cook changed focus from what the company used to do best.
If you look at what Apple has done since, diversifying production out of China, this may have been a trigger. It could also be purely supply chain stability, but Cook appears to be a more noble figure than his peers IMO.
The collision of politics and business is never good, though; in most cases the company has to bow politely, at least in the near term.
I don't completely understand your comment. Do you imply there's something wrong with being autistic? Or that overworking is an exclusively autistic trait? Both interpretations seem off, so I believe I missed your point.
Though I suspect it's a joke that went well over my head.
Stripe, as an example, drastically slowed down hiring when the pandemic hit, expecting an economic slowdown. It then turned out that much commerce moved online rather than stopping, so they ramped up hiring to meet demand.
The bet that companies like Shopify and Stripe [1] made was that COVID created a “new normal”; much of the economic activity that went online would stay there. They were wrong, so there were layoffs to correct it.
I’m not sure I believe that. It honestly sounds like an excuse to fire people. Or leadership has no clue.
The biggest thing I got from the pandemic is flexible work arrangements. But I and everyone I knew all wanted to get back out there. Just not back out on public transport to work. We were doing things online because there was no other option.
But once the borders opened and restrictions eased, everyone did what they said they would. Travel. Eat out with their friends. Play their old sports. Who in their right mind would think they would continue to stay home and do everything online?
I never wanted to commute. Not in busy trains and certainly not getting stuck in traffic jams. I also never wanted to shop or spend time in the supermarket getting spaced out by the weird lighting they install to make you make bad decisions.
I do want to visit friends, go out in nature and once in a while eat at a restaurant. But for me, I am so happy to wfh 80% of the time and do 99% of shopping and groceries online, that has certainly stuck. And while I am a bit more of a recluse than most people I know, almost everybody is happy to work from home at least some of the time and does far more online shopping than before the pandemic.
Maybe this varies between countries. Or social circles. I certainly see a lot of traffic again, so people do like to move around again and get stuck in traffic jams a lot, but something _has_ changed for at least _some_ people. A couple of years ago working from home was unheard of; now a company seems out of touch if they don't allow it at all.
Something that the pandemic has rammed home for my wife and me is just how poor the out-of-home dining experience is in general. The food is lackluster, the ambiance is awful, being surrounded by other people eating is terrible (my wife is a quiet talker and being fat always makes eating in public a shameful experience), so we'd MUCH rather eat at home. The quality of the ingredients is far higher, the food is tastier (my wife is an amazing and experimental cook), and we don't have to yell to be heard. We eat out when we're traveling but that's about it. Eating at home is better in every way.
I've found that eating out varies a lot city by city (estimate: 50%+?). Some cities have excellent base-level food; others are terrible.
As someone who lived in Atlanta during the restaurant revolution, it went from "I could do better than this at home" to most places being able to put together an interesting plate I couldn't/wouldn't do at home. And oddly, pre-pandemic prices didn't change that much when it happened.
My working theory is that restaurants only strive to be a bit better than the average restaurant in an area.
I may be spoiled, but it's extremely rare we eat someplace where I find the ingredients, flavor, and options to match what we have at home. We don't travel a crazy amount (compared to people with traveling jobs or a real wanderlust), but we get around a fair amount and I see the variability you describe, but only up to a point. There are obviously places where the restaurants are truly awful, and there are places where the food is ok. But no place where on average the restaurants are excellent.
I'm probably in the minority, but pasta and frozen pizzas are about the extent of my cooking skill and there are about a dozen excellent restaurants within a 10 minute walk... it isn't good for my wallet.
Whether the average person likes to cook or not, I'm pretty sure that between lack of easy access to good restaurants and or reluctance/inability to spend on eating out, the majority of people make most meals at home.
>I also never wanted to shop or spend time in the supermarket getting spaced out by the weird lighting they install to make you make bad decisions.
I never used grocery pickup during the pandemic myself, but my anecdotal observation is that, at least around where I live, you see very few pickers in stores for curbside pickup at this point. Curbside pickup/home delivery/meal kits/etc. are by all indications extremely niche from what I can see. I'm sure things have changed for some small minority, however.
You’re being very absolutist; people don't have to be doing “everything” online for the shift in habits to have a significant effect.
Yes on the whole most people do go out and travel again, but on average much less frequently than they used to because they retained the desire but not the same habits. They find they can scratch the itch with fewer outings.
All it takes is for people to go out 20% less than they used to, averaged out, and that’s a massive shift in economic terms that can heavily impact businesses.
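A minimal sketch of why a 20% drop can be so damaging for a business with high fixed costs; every figure here is invented for illustration:

```python
# Toy model: a venue with fixed costs (rent, staff) and costs that
# scale with sales. All numbers are made up.
fixed_costs = 60_000  # per month: rent, salaries, equipment

def profit(revenue):
    # 30% of revenue goes to costs that scale with sales
    # (integer math keeps the sketch exact)
    return revenue - fixed_costs - revenue * 30 // 100

print(profit(100_000))  # 10000: a healthy 10% margin
print(profit(80_000))   # -4000: 20% fewer customers flips it to a loss
```

The point is operating leverage: a modest averaged-out drop in outings is amplified into a large swing in profitability.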
You're absolutely correct that relatively small differences at the margin can have fairly large knock-on effects.
That said, I'm pretty sure the overall picture is that things didn't change as much post-pandemic as many predicted. Travel is pretty much back to normal. Dining out is somewhat back to normal--though you need to factor in inflation effects as well. If I drive into the nearest major city, traffic sure seems back to normal--especially mid-week. The shift to remote seen in tech is something of an outlier.
Eyeballing the TSA's checkpoint volumes (https://www.tsa.gov/travel/passenger-volumes), air travel certainly looks to be back at pre-COVID levels. And that's with a lot of companies watching travel expenses fairly closely.
Tech also depended a lot on zero-interest-backed money from investors. Now that interest rates are up, money costs money, so the situation has changed as well.
I hear this a lot lately, and intuitively it is true of startups and businesses that are pre revenue, but I have trouble seeing how it applies to companies with billions in profit.
Google makes billions in profits by selling ads. Many of the buyers of those ads are affected by interest rates. Therefore, Google is affected by those interest rates.
This is fair, but general economic health applies to every company in the economy, and this large cap tech company segment is both profitable and seemingly laying off employees at a higher rate than the economy as a whole. So that blame seems misplaced.
From a theoretical perspective, at this level of scale, the number of engineers in particular does not scale with the ebb and flow of ad sales to the degree of the layoffs. Many were for divisions that were not related to ads, except for being funded by them.
Obviously if there is no funding for these other divisions because the ad business dries up there would be large cuts, but these companies are still very very profitable even on lower ad sales. So perhaps those divisions did not have a positive ROI, and that is fine, but that is a far different reason than waving vaguely at interest rates.
The calculation (for Google and for the advertisers on Google) is not "does this have a positive RoI as compared to zero?" but rather "does this have a more positive RoI than all my other alternatives?"
When the risk-free rate increases, the RoI hurdle for any investment also increases. That's how economic stimulus by lowering interest rates works, by creating incentive to "try things" (by removing the incentive to "park your money").
Advertisers on Google now bid on ads based on an environment where they are more conservative (because they have better alternative investments), where their input costs have increased (labor, raw materials, energy), where some industries (like mortgage refinancing) have been dramatically curtailed directly by the change in rates, and for some companies, buying back their own bonds is more attractive than advertising on Google with a portion of their spend.
Google can look out on the horizon and conclude "OK, we had a really great business when rates were near zero; now we have a pretty damn good business with rates several points higher than that, but it's definitely not as good as it was, so we need to tweak our frugality dial in response."
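The hurdle-rate argument above can be sketched with a toy NPV calculation (the campaign cash flows are invented for illustration):

```python
# Sketch of the hurdle-rate logic: the same ad campaign that beats
# "park the money" at near-zero rates can lose to it at higher rates.
def npv(rate, cashflows):
    """Net present value: cashflows[0] is now, cashflows[t] arrives in year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Spend 100 on ads now, earn 30/year back for four years (made-up numbers).
campaign = [-100, 30, 30, 30, 30]

print(npv(0.01, campaign) > 0)  # True: clears the hurdle at near-zero rates
print(npv(0.10, campaign) > 0)  # False: at 10%, parking the money wins
```

The same cash flows, discounted at a higher risk-free rate, flip from worthwhile to not: that is the mechanism by which rate changes reach advertisers, and through them, Google.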
Because it affects the whole economy (afaik), from housing to banking to any industry. Interest is a compounding factor in a complex modern economy. Even big companies have to factor in the effect of interest rates, even if they haven’t borrowed anything.
I think you need to broaden your thinking horizons here. The thought in the midst of the pandemic was that it brought ecommerce/payments to the masses who would otherwise follow their old routines. Thus the large hiring push. I believe the theory was that once everyone started on this new routine of convenience, it would be difficult to return to old patterns.
Your comment reads as someone who likes to look back on the past with all the knowledge of today as if it was so incredibly obvious.
That said, there will be some significant knock-on effects from the pandemic that will be favorable for those companies, however not as large as originally thought.
The problem with hindsight is people often forget just how chaotic events like the pandemic were.
Even the "experts" were flip-flopping on globe-affecting decisions; we had no idea what the right solution was or how long it would last... 3 months? 6? Permanent?
Long-term decisions had to be made with numerous unknowns. It is clear now they were wrong, but I am not certain I would have made the right choice either.
I wouldn't go so far to say they wanted an excuse to fire, but they definitely bet the wrong odds.
You can however react to changes in circumstances when you get new information. I got my first dose in Sept 2021. Surely people getting vaccines was the signal that things might start reverting a bit. How many employees did these companies add over 2022? Why didn’t they slow down? Because no one else was slowing down?
> It honestly sounds like an excuse to fire people. Or leadership has no clue.
Neither of these are true. The truth lies somewhere in the grey area between these two things.
> Who in their right mind would think they would continue to stay home and do everything online?
Hindsight is a helluva drug. Covid has made several lasting, possibly permanent, marks on society and our habits. Nobody could say what those were going to be.
The world is a lot nicer when you (correctly) assume that everyone is doing their best with the tools and the information they have to work with.
There were a lot of predictions during the pandemic about permanent change. My observation is that a lot of those changes were either largely temporary (e.g. widespread grocery pickup or delivery) or were mostly limited to limited groups (e.g. full remote work).
There was a solid stretch of not knowing if things might get much worse, if the vaccines might fail etc. Some borders have only opened very recently.
It's not like all those people who wanted to travel and go out were actually making plans and booking post pandemic flights back in 2020.
We're lucky that things are solidly back to normal this soon. Not just the restrictions that kept us inside, but the massive paranoia that made many fear the outside, and some new, worse virus.
You lose great people, you take a huge hit to culture and productivity, and you spend a lot of capital that you otherwise wouldn’t have. It creates doubts in leadership and cuts projects that leave maintenance debt behind.
People quit after layoffs, which means attrition isn’t as balanced as planned. Talent isn’t distributed in line with business need, meaning the company will lose great people who were in the wrong place at the wrong time.
Depending on how it’s executed, the company could burn bridges. Even if they do it well, some people would never come back.
It’s not like the people hired were working for free, so I’ve never personally put much stock in the “employees were screwed over” framing.
But again, it depends on some of the items above. If a company is doing well, less-direct revenue teams are fine. In a sketchier environment, not so.
Depends: did you really get that much value out of those new people in the time you had them employed? It usually takes a while for a new tech employee to become productive, and just hiring a person costs a lot of money. If you, for instance, hired 10,000 new engineers and then laid them off after only 1 month, you would definitely have lost a LOT of money, and not have gained anything for it.
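A rough, back-of-envelope version of that scenario; every number here is an invented placeholder, not real company data:

```python
# Toy cost model for "hire 10,000 engineers, cut them after one month".
engineers = 10_000
recruiting_cost = 30_000   # assumed: sourcing + interviewing, per hire
monthly_salary = 15_000    # assumed: fully loaded comp per month
severance_months = 2       # assumed severance on the way out

# One month of pay, plus severance, plus the sunk recruiting spend,
# with roughly zero output during ramp-up.
total_cost = engineers * (recruiting_cost + monthly_salary * (1 + severance_months))
print(f"${total_cost / 1e9:.2f}B")  # $0.75B
```

Even with these modest placeholder figures, the hire-then-cut cycle burns the better part of a billion dollars for essentially no delivered work.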
Who lost? Everyone who got laid off wouldn't have had a job at all if they weren't hired, and they probably got severance packages for more than the average American makes in a year.
> Everyone who got laid off wouldn't have had a job at all if they weren't hired
Maybe I've been lucky, but every time I've been looking for a job I've had AT LEAST two options.
People move cities, sell houses, buy houses, give up apartments, plan their finances, and plan their career around their job all the time. Investing a year learning the ropes at a company, finding ways to fit into the structures, etc. only to have the rug pulled out from under you just as you're starting to get the hang of things...
> and they probably got severance packages for more than the average American makes in a year.
Maybe I've been unlucky, but I've never had a job where my expectation of a severance package was anything more than 2-4 weeks wages.
> Maybe I've been lucky, but every time I've been looking for a job I've had AT LEAST two options.
You got many offers precisely because a lot of companies were (over)hiring. And the fact that you had many options on the table allowed you to get better compensation (either because you negotiated, or because everyone else did, so the industry standard increased). You benefitted from these hiring binges.
When that was the case, I saw a grand total of 0 complaints by tech workers that tech companies were hiring too much; it benefitted them immensely, but that did not translate into any article praising CEOs for taking these risks.
Could these companies keep these workers on? Many can! But a hallmark of good governance is having a budget and taking care of your expenses.
At your household level, you can probably afford to pay for Netflix, Disney+, HBOMax and Amazon prime at the same time. And maybe you did during COVID because you were watching TV more.
But now that you are back to doing more "real life" things, you maybe don't need them all. It's not that you risk being evicted because you can't pay rent; you are probably still saving a bit of money every month. But that is no reason not to ask yourself "do I really need all of these now?", and if the answer is negative, to do something about it, like cutting one or two. What if someone then told you: "But look, you can currently afford all these streaming services! You should keep paying for them all for as long as you can, and since you have them, make sure you reorganise your life to schedule some time to watch them all!"
Can we stop using the household budget analogy for megacorps, governments and economies? It’s overly simplistic and wrong.
(For just one example, a household has a single non-varying income stream and everything else is expense. Large companies have many revenue making depts, lines of business, products, etc)
> People move cities, sell houses, buy houses, give up apartments, plan their finances, and plan their career around their job all the time
Then those people are naive. I never plan my life around an employer. I plan my life around my employability. I wouldn’t move somewhere that there weren’t other jobs in the area. Well, now I wouldn’t work at any company that wasn’t fully remote.
>I wouldn’t move somewhere that there weren’t other jobs in the area.
While a good principle, I'll posit there are a lot of more or less specialized professional jobs--especially at more senior levels--where you can't just walk across the street and slide into a similar role at a different company. Even if it's in the same general area, a 2 hour commute each way is probably not sustainable.
And that's why I have had 25 years of paranoia about being overly specialized.
Yes, I’m self-aware enough to understand the irony that the only reason I fell into my role at $BigTech at 48 years old is that I became overly specialized in enterprise dev + cloud.
It's hard not to be at least somewhat specialized as a very senior person.
If you're an embedded systems programmer, maybe you can hack on some Javascript but no one is probably going to pay you very senior comp to do junior programmer work.
It is certainly true you don't want to be too specialized in general. You didn't want to be the Y2K guru in 2001 or the world's expert in performance optimization for some specific computer architecture that isn't manufactured any longer.
When I took my current job, there were probably a few companies in the general area that would have been somewhat obvious potential matches. But it was sheer coincidence that the one I connected with first through a connection happened to be the closest major tech company to my house.
Is that really true though? How many of the 2.7 million developers in the US are just your generic enterprise CRUD developer writing apps using Java, C#, etc? They are basically interchangeable and for most of my career, I could throw a resume up in the air and get multiple offers for yet another generic enterprise CRUD job.
On the other side, how many jobs are (were?) available to the generic software engineer who could do the DS&A monkey dance (junior/mid) and “design Twitter” and talk about “scope” and “impact” in STAR format (senior).
There are not “millions” of people being laid off. Unemployment is still at a historic low. It’s mostly the tech sector. I’ve been through this type of economy at least three times - 2000, 2008, and for a brief second in 2020.
Even back in 1996 when “having mine” meant making $33K a year as a hybrid computer operator/programmer, I saved aggressively. Again, I was 22 years old and had enough sense to be paranoid about depending on one specific job.
If you feel like this, you shouldn’t take a job that can’t guarantee in writing that they will keep you for X number of years. If you took a calculated risk and took the job anyway, well that’s on you.
> If you feel like this, you shouldn’t take a job that can’t guarantee in writing that they will keep you for X number of years.
Those jobs don’t exist. If they do, there are such a minuscule number of openings that it is not worth thinking about them.
> If you took a calculated risk and took the job anyway, well that’s on you.
Let me break it to you: almost everyone works because otherwise they would go hungry, lose their home and die in sad circumstances. People who don’t are statistical anomalies.
Are you saying that by having the misfortune of not being born independently rich and working basically any job the poster “took a risk” which is “on them”?
> Are you saying that by having the misfortune of not being born independently rich and working basically any job the poster “took a risk” which is “on them”?
He is saying that. It's the Puritan Christian mentality of blaming the poor for not working hard, repurposed into the modern free-market capitalist ideology where it's used to blame the ills of the system on the victims: "You haven't worked hard enough", "You weren't smart enough", etc...
This is nonsense because the people that were laid off at Google (as one example) were not selected by their tenure or having been hired during pandemic or during the remote phase, nor was it based on performance reviews.
People who had been there 12, 13, 16 years; people who had been just promoted. People who were senior management. People who were engineers. People who were on mat leave...
Management there effectively fed people's employee numbers through a random generator. And people who (IMHO naively/foolishly) gave their lives to the company suffered.
Decisions to overhire during the pandemic impacted people who had nothing to do with that decision. And not just because of the layoffs, but also because the company growth during that period was so intense that it led to onboarding and project mgmt difficulties as well.
But my experience when working there is that there are definitely people whose emotional (and physical) engagement with their jobs goes well beyond just what is required to get that compensation.
> Everyone who got laid off wouldn't have had a job at all if they weren't hired,
Incorrect. Many of those people already had a job when they moved to Amazon/Microsoft/wherever.
Since almost no one makes a lateral move, all the roles they vacated when moving were lower roles. And this argument applies recursively as the old company fills those roles.
This means that most people who did not have a job prior to the hiring explosion and had a job after the hiring explosion, had low-paying jobs, in very low roles.
I don’t get it. What did they lose? They got paid good money during the mean time and have the fact that they worked for $BigTech on their resume. What did they spend all of the money on? Coke and strippers?
I was also hired for a remote role at $BigTech in mid 2020. Fortunately I didn’t get laid off. But I wouldn’t exactly be bitter that I got paid 50% more during the interim. I save/invested/paid off debt with 80% of the difference in take home pay.
Yes, I’ve been laid off before when I was making a lot less in 2012. I made sure I had savings then
Textbook capitalism is a society in which (among other things) companies belong to investors. In other words, when a company bets, it is typically with other people's money. In this case, it's also other people's jobs, which extends this textbook case a bit, but not by much.
edit Rewritten to take into account sokoloff's remarks, thanks!
Capitalism defines nothing about whether the money invested in a company is yours or other people’s. In practice, it often is, but “in practice” is not nearly the same as “by definition”.
>They were wrong, so there were layoffs to correct it.
Aren't the cuts smaller in size than the increased head count going into the pandemic, which would indicate there was a shift, but not one as large as they had prepared for?
For some companies, hiring more engineering staff probably hurt productivity. Every team added and every engineer added to a team increases the time/effort/communication/planning anything takes. It's often the wrong decision. This of course depends on how work is divided up.
In Italy, whole industry verticals that were traditionally resistant to change were digitized, kicking and screaming, from general practice to luxury wholesale.
Check Zoom's user growth graph; something like 300% growth was common for the software supporting this transition. Not everything positioned that way did as well, but the pull was there.
Wow. That makes it even more horrible. So they did it to get PPP money, the public's money, and now they are dumping the very people who paid them that money. ~160,000 people laid off means up to ~500,000 people affected when you factor in families, relatives, etc. This feels more like theft than anything else when you count in the PPP: they took taxpayers' money, they burned it on stock buybacks etc., and now they are dumping those taxpayers...
For Amazon, sure - warehouse workers and drivers. They built a lot of new warehouses, brought a lot of delivery in-house, etc. For the others? Maybe a bit; more remote work, general cloud growth, but yah they overhired.
The pandemic shifted business from brick and mortar stores to online, along with all the services you need for an online store (payment, logistics) and made people work from home, where they needed devices like a decent computer and webcam and services like Zoom.
Those businesses boomed during the pandemic and deflated once it was over.
Of course not, it's just a bullshit excuse. Just like currently none of the FAANGs' businesses are hurting to the point where they just have to get that couch-cushion money from layoffs.
Let's be real - Apple outsources the dirty work of hiring/firing loads of workers to Foxconn. If the demand for iPhones slips, there will still be people out of a job, it will simply be factory workers in China/India who likely have far less savings to fall back on.
I don't recall whether this was ever stated explicitly, but I always had the impression that this is one of the elements of Apple's strategy that is still driven by the tribal memory of the late 1990s turmoil. Both Steve Jobs and Tim Cook were there at the time, and especially the former (who also had witnessed a 1980s bust) was determined to never be in this position again.
So as a consequence:
* Apple is hoarding cash to an extent that business analysts dislike (although Tim has reduced this to some extent)
* Teams are being run leaner than they'd like to be (not nearly as much so as in the pre-iOS times, admittedly)
* Conversely, since the company builds good reserves and not too much bloat in good times, it can and will avoid overreacting to bad times.
> not nearly as much so as in the pre-iOS times, admittedly
deserves better than a parenthetical. Zoom out, and Apple's been on a serious hiring binge for the last decade, from 72,800 employees in 2012 to 164,000 in 2022.
(Both numbers include retail; in 2012, 58% were retail, and they don't seem to break it down anymore in 2022.)
The market cap has grown by ~10x in that time and the number of products that the company makes/maintains has also increased a lot in that time (both number of products and total unit volume). Doubling the staff does not seem unreasonable given the obvious increase in business.
So their staffing has increased by a factor of about 2.25 while their revenue has increased by a factor of 2.5 - doesn’t sound so much like a binge as the company steadily scaling up.
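For what it's worth, the arithmetic here checks out. A quick sketch (the headcount numbers are from the thread; the revenue figures are Apple's widely reported fiscal 2012 and 2022 totals, which I'm assuming here, not something stated above):

```python
# Compare Apple's staffing growth to its revenue growth, 2012 -> 2022.
# Headcount is from the thread; revenue totals are my own rough figures.
headcount_2012, headcount_2022 = 72_800, 164_000
revenue_2012, revenue_2022 = 156.5e9, 394.3e9  # USD, approximate FY totals

staff_growth = headcount_2022 / headcount_2012
revenue_growth = revenue_2022 / revenue_2012

# Staffing grew a bit more slowly than revenue over the decade.
print(f"staff grew {staff_growth:.2f}x, revenue grew {revenue_growth:.2f}x")
```

So roughly 2.25x staff against roughly 2.5x revenue, which supports "steady scaling" more than "binge".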
On the other hand, Facebook went from about 4,000 employees to over 70,000 in the same time period, so given how much Apple's business has grown in the last decade (when they released the iPhone 5) it seems pretty impressive that they grew only ~2x.
The pile of cash just reminds me of Carl Icahn and others. It's "fortunate" that Apple is so highly valued that a corporate raider can't purchase a controlling interest and pay out to investors.
They’re also an outlier and a lot of what they do would be judged more harshly by analysts and activist shareholders if they weren’t so profitable. The question should be how we get more companies like Apple.
The answer to that depends on the timeframe and your risk tolerance: a lot of cryptocurrency people said FTX was profitable, too, right up until it wasn't. When you're talking about retirement funds, the timeframe is long enough that you have to think about how stable something is and whether the profits you're seeing will continue. There isn't a recipe for “highly profitable” and you need to consider both the quality of the leadership and whether they share the same goals you have: there are many companies which were bought by a private equity firm who “improved efficiency” by slashing costs and milking the existing customers for as much revenue as possible. If you're the PE guy who cashes out early in that process, this is highly profitable. If you're the retail investor with some shares in your retirement account, you likely had the opposite experience.
Of course you need to take other things into account as well besides current profitability, such as your trust in management’s ability to steer, for long term product/service foothold, etc.
I was juts making the point that all else being equal, you wouldn’t invest in a company that doesn’t try to make handsome profit, which was the suggestion. And Apple makes handsome profit, and then some.
The general expectation for any company is to milk the daylights out of every quarter they can. Either put your money back into growth or hand it back to investors - cash reserves bad even though they obviously allow you to weather things (like we’re discussing here).
Apple is a real company. Amazon, Twitter, even Microsoft to some extent are not.
Apple makes things. The software side works with it, but it's a hardware company. They aren't entirely dependent on conjuring up value out of thin air, they have supply chains and manufacturing and whatnot.
Even Amazon, which deals with physical objects, doesn't really make anything.
I get what you’re saying, but I think what you’re describing as the “realness” of a company is just the sustainability of its competitive advantage. Which is a good metric to understand their long term financial position and susceptibility to market changes and layoffs. Others are pointing out that yes MS and Amazon make physical things, but we all know those aren’t their core business - that’s beside the point.
Tech companies are largely “interface” providers - they simplify the interface between a human and some other thing. Twitter simplifies the interface between one human and all other humans on Twitter. Microsoft enhances the interface between one human and other humans within a company. Amazon simplifies the interface between humans and retail goods, and AWS simplifies the interface between humans and scalable compute and storage.
Apple simplifies the interface between humans and personal compute, which is the entry point to all of the above (although the relationship with Microsoft is obviously more complex than that). So Apple is just further up the tech value chain, where transient effects are softened and delayed, and where competitive advantage is less easily displaced. Twitter is really only protected by network effects. Microsoft is protected by lots of UI/UX implementation moat and strong vendor lock in. Amazon is protected by others’ ability to scale physical logistics, AWS by backend and interface development and also lock-in. Apple is protected by all of the above plus hardware engineering, and the hardware is an especially difficult one to catch up with or copy.
I believe you are undervaluing the demand for cloud, e.g. AWS and azure. Companies that "make things" use these services extensively. AWS and Azure are the quintessential shovels during the gold rush.
But even by your flawed definition of "making things" Amazon counts, because of Kuiper, Fire Devices, and Echo alone.
I would grant that Amazon's Graviton is an actual thing, but that's pretty new. I would put the Fire and Echo devices in the same bin as their branded USB cables, i.e. "we put our logo on some stuff out of Shenzhen."
I get that AWS services are a real thing made possible by other real things, but it's also essentially a branded software layer on top of a lot of open source stuff. Over-simplification, sure, but there are enough third-party services that ape enough of the AWS API that I think of it more like a Dell than an Apple.
Maybe I am missing a point, but both Amazon and Microsoft have devices that they manufacture and distribute. While they may not be profitable at the price they are sold they provide value to customers that isn't conjured from air. Additionally they also sell access to data services, while the data may not by physical, the data centers are physical objects that are designed and built by both AWS and Microsoft.
By that logic Amazon is a vastly superior company because they make more products than Apple. Amazon makes its own hardware and products: Alexa, Kindle, Ring, etc. They have ground and air fleets. Amazon holds majority marketshare with online web services...
I don't think this 'real' sentiment is best used to describe Amazon.
I had a moment of enstartlement (note to auto-correct: I made up the word, okay?) the other day. I get a lot of vintage/retro computing videos in my YouTube feed — I realized how odd it was that Apple was one of the few (only?) 8-bit computer companies still around. And around in an impossible-to-ignore kind of way.
But only just barely missed going under back in the late 1990's.
Why does Apple hoard cash instead of buying back its stock? Five years ago they had $90 billion cash on hand. If they had put that into their stock then, it would be up at least 260% by now even with the losses last year; then if they need cash they could just take a loan against the stock.
Looks like they spent a lot more than that on buybacks in the past decade:
> Since 2012, Apple has been buying back its own shares at an extraordinary rate -- Apple is known for spending more on share repurchases than similar tech giants like Meta or Alphabet. Apple's total share repurchases have totaled $274.5 billion, with just $20.4 billion in the December quarter.
> Apple spent $85.5 billion to repurchase shares in 2021, and issued $14.5 billion in dividends.
Apple's buyback program probably has no equal; it's a sight to behold that demonstrates the power of sustained share repurchases. Sure, there are the Teledyne stories, but to do it at this scale is something else.
Agreed, but that chart is misleading: the number at the bottom should be 0, not 14.
The reduction in outstanding shares is from 26.5 to 16, which is still impressive, but perhaps less than the chart might make you believe at first sight.
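To put a number on it, here's the percent reduction implied by those two share counts (split-adjusted figures, taken from the comment above):

```python
# Percent of shares outstanding retired, from the figures in the thread.
shares_before, shares_after = 26.5, 16.0  # billions, split-adjusted

reduction = (shares_before - shares_after) / shares_before
print(f"about {reduction:.1%} of shares retired over the decade")
```

Roughly a 40% reduction - impressive, but well short of what the misleadingly cropped chart suggests at a glance.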
Setting aside the fact (pointed out by several siblings) that they do buy back stock...
Why should they? Who does it benefit? Not their customers. Not their products. Not their future prospects. Not their employees, for the most part.
...Oh, the shareholders? You mean they should throw massive amounts of money—their hedge against future problems—into enriching people who are, for the most part, already vastly wealthy?
I'm not super enthusiastic about stock buybacks, but to my knowledge, pretty much 100% of Apple employees are eligible for the discounted Employee Stock Purchase Program (which is a no-lose proposition, if you can afford to defer part of your salary for 6 months), and at least in some years, all Apple employees got some outright stock grants, so the interests of "employees" and "shareholders" may be more aligned than you make them out to be.
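To see why a discounted ESPP is "no-lose", here's a minimal sketch. I'm assuming a hypothetical but common structure - a 15% discount applied to the lower of the offering-start and purchase-date prices - not Apple's actual plan terms, which I haven't verified:

```python
def espp_gain(start_price: float, end_price: float, discount: float = 0.15) -> float:
    """Immediate paper gain if ESPP shares are sold at end_price.

    Assumes the discount applies to the lower of the start and end
    prices (a typical 'lookback' structure; real plan terms vary).
    """
    purchase_price = min(start_price, end_price) * (1 - discount)
    return end_price / purchase_price - 1

# Stock flat over the period: the discount alone yields ~17.6%.
print(f"flat:  {espp_gain(100, 100):.1%}")
# Stock down 20%: the lookback means you still pocket the same ~17.6%.
print(f"down:  {espp_gain(100, 80):.1%}")
```

Either way the participant comes out ahead at purchase time, which is why it's hard to lose if you can afford to sell immediately.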
What else should they do with it? Just hold it so that they can hedge against future problems? How much do they need for a hedge? 1 year of expenses? 2 years? 5 years?
What should they do once they reach your hedge number? Just keep hoarding?
What do you do with your salary? Do you just put it in your bank account or do you invest it? If I’m Apple, do I really want to become a company with an investment arm? If I’m an Apple shareholder, do I really want Apple buying a broad market index fund and then paying corporate taxes on that and then spitting off dividends to me rather than just giving me the opportunity to invest in a broad market fund myself?
> then if they need cash they could just take a loan against the stock.
When companies buy back stock, they typically cancel it. That's why the price of the remaining stock held by shareholders goes up so much.
So they wouldn't be able to "just take a loan against the stock" - it no longer exists. All they'd be able to do is ask their investors for more money.
This is such a weird topic. The comparison is between companies that didn't exist 20 years ago, like Meta, and the 50-year-old industry veteran Apple. Of course Apple didn't hire as fast; they were established. That's like saying a 20-year-old college student grew more over the past decade than a 40-year-old engineer. Yeah, that's how life works.
What does the age of the corporations have to do with how responsibly they staff their business(es)? Also, I'd suggest that the assertion that corporations somehow all move towards responsibly managing staffing numbers warrants some supporting documentation. The article we're talking about already indicates a pretty significant example that directly contradicts what you're suggesting: Microsoft -- actually an older business than Apple, for that matter.
With age comes experience, although the variety of experiences may differ.
The "near death" variety that Apple experienced during the 90s led to a conservative fiscal culture (among the top level executives at least) compared with the rest of the tech industry.
> Also, I'd suggest that the assertion that corporations somehow all move towards responsibly managing staffing numbers warrants some supporting documentation.
Definitely. Though it might be difficult to test properly. I expect a lot of survivor bias, with companies having a reasonable long-term strategy faring significantly better in the long run (Microsoft notwithstanding, in this case).
Their business is under severe attack / pressure - from a customer, regulatory, and competition perspective. So they can't just coast along or just optimize what they've got.
With the latest direction (metaverse), they’re hoping to pave a new future, even if it leverages their current power to bootstrap.
The answer is rather pedestrian: Apple is mostly a hardware company.
Apple isn't primarily in e-commerce, streaming or online ads and did not get as big a boost from millions of people suddenly trapped in their homes as other tech companies. Apple was never tempted to overhire, I suspect Apple had the opposite pressure during lockdowns due to its many retail workers seeing fewer customers, while Amazon's warehouses were brisk.
Apple is the only one that doesn’t seem to start up hundreds of obviously useless projects.
Google will hire thousands to work on an obviously doomed streaming service, to build a new OS, and so many other obviously bad ideas and then just fire them all later.
Vanishingly few Apple products seem to be discontinued, and those that are usually had a long life, like the iPod.
You wouldn't know about Apple's failed projects. Like other big tech companies, Apple has teams working on R&D and new products. Unlike the other big tech companies, the bar for publicly releasing a new product is extremely high at Apple, and unreleased products are kept secret.
They do tons of R&D work and sometimes they’re working silently for many years before they release an actual product. This was the case with iPhone for example.
I would say that their push for smart speakers has also yielded mediocre results, at best. Let's see what will happen with their AR headset, as they have been working on AR since at least 2017 [0].
Microsoft paid half a billion dollars to buy Danger, the company Andy Rubin co-founded. They decided to rewrite the platform from Java to .NET, which took so long the product was no longer competitive when they finished. Then they spent a fortune to advertise a product (the Kin) that was canceled after less than two months on the market.
Then Microsoft decided to take another huge pile of cash and set that on fire by purchasing Nokia's mobile phone business.
> Microsoft wasted at least $8 billion on its failed Nokia experiment
Well, to compare apples to apples, we would need to know how much money Apple has spent on many of those efforts, which, before Jobs came back, had almost driven the company to bankruptcy.
For younger generations Apple might feel unstoppable; meanwhile I remember the discussions about migrating away from the surviving Macs in our IT department, and my graduation thesis was porting a particle simulation engine from NeXTSTEP to Windows, as those boxes were to be sent away.
We don't have to wonder with Microsoft. Again and again they have paid billions of dollars for complete failures like the Surface RT and Windows RT. They would have been well advised to cancel the platform before spending a fortune on advertising and unsold inventory.
Hell, how much did they pay for Skype, despite already having several text and video chat clients of their own? Then they completely rewrote Skype before replacing it with Teams.
They have a longstanding habit of taking shareholder value and creating huge bonfires from it.
These were arguably pretty reasonable hedging; it was _far from certain_ that developers would accept Objective C, and then later there was a general sense that dynamic interpreted languages might eat the world. I can't imagine either were particularly expensive; MacRuby in particular was basically someone's side project?
> WebObjects
This was inherited, and was pretty successful in its day (though it did perhaps outstay its welcome).
Most of the others you name are either _very_ old, or were actually quite useful in their day but have aged out.
Quicktime was never cancelled. It was adopted industry wide as the MP4 file format.
> MPEG-4 Part 14 is an instance of the more general ISO/IEC 14496-12:2004 (MPEG-4 Part 12: ISO base media file format) which is directly based upon the QuickTime File Format which was published in 2001.
Titan is another recent big bet that went nowhere, but it happened to be too big to hide. Who knows how many other secret dead-end projects they buried.
It was never announced, so it's not comparable to others dropping projects once they're in the open; and if all we're relying on are rumours and industry analysts, then those still say they're continuing work on Titan, with a view towards a 2025-2028 launch.
But it does change what makes sense to talk about in terms of "Apple" as it exists today.
Otherwise you might as well talk about why IBM is an undefeatable behemoth that dominates the industry...oh, wait, what's that? That hasn't been true for over 30 years now? I thought age didn't change facts?
You mean the company that is usually the champion on patents per year, owns a large portion of Linux, GNOME and GCC development, is the 2nd major Java vendor, and has had quite a good fiscal year, including growth in mainframe and micro sales?
A behemoth indeed, only HN isn't paying attention where it matters.
QuickTime wasn’t a failure. It was the basis of their entire media playback API for over two decades. WebObjects came in with the acquisition of NeXT and from what I gather is still the basis of the backend of iTunes and the App Store. Most of the rest were pre 1997.
While I respect Apple as a whole, "they didn't start up hundreds of obviously useless projects" is an empty claim. Not only did they sink so much money into subpar offerings like iCloud/iWork and neglect some of their competitive advantages like Automator, they also ran some of their once-great stuff into the ground (Final Cut Pro).
I prefer Google's method of open-sourcing their work. It benefits humanity in the long term, whereas Apple's software generally benefits Apple and is only allowed to run on their "blessed" hardware.
Thanks to Google we have patent-free codecs, an open source mobile OS, and they contribute significantly to the Linux kernel, among other projects that are "available to the public".
Not saying that's necessarily always a bad thing for the end user, but they do play the long game.
Chrome was great at the start, did a lot to drive the web forward, and is now suffocating those advancements with Manifest V3 etc (including delaying removal of third party cookies).
It almost feels like embrace, extend, extinguish..
I can't think of a company more blatantly engaged in anti-competitive practices than Apple, but I'm glad to see all that unfairly amassed wealth benefitting even the lowest rungs of its corporate hierarchy through the unusual benefit of not being terminated at the drop of a hat. Bravo!
Amazon, Meta, Google are most certainly cut from the same cloth, only slightly smaller. For Wal-Mart I can't think of anything strictly anti-competitive, but its practice of paying sub-subsistence wages and counting on government relief programs to make up the difference is so disgusting I don't know how to describe it. Time-Warner and Comcast I'm not too familiar with, but as I understand it they are under regulations that prohibit them from dropping your connection when e.g. they find you visiting ycombinator.com, a site where their business is frequently ridiculed, which is certainly a step in the right direction and the model that should be applied broadly and forcefully to all of the above.
These companies you mention, at least the tech ones, are all way more open to third parties interfacing with whatever products they have or data they produce.
They are more like big city gangsters willing to do business with whomever as long as you pay them protection money, while Apple's ecosystem is like a gated community where you get shot at the door if you even so much as look like you can't afford to get in.
Do you have any examples of that? Facebook and Google seems open more in the roach motel sense - they get your customer volume but continually adjust the terms so you get less of the ad revenue (pivot to video, ad words’ declining payout, the need to pay for placement in search or adopt proprietary tools like AMP not to be pushed down the page, etc.).
Yes, I am not OP, but obviously they are more anti-competitive, if ONLY for their killing off support for progressive web apps over the last 3 or 17 releases of iOS/Safari.
I'm a big fan of PWAs, but this really seems like small fries compared to something like Walmart which has an internal planned economy many times larger than the Soviet Union ever achieved.
If you sell your product in Walmart, they basically own everything about it: they dictate how you make your product, the supply chain, the prices you charge, everything.
And they use this incredibly granular level of control to run out any possible competition
In a world where, when Apple tracks you with ads, the dialog they put on your phone asks "allow us to enhance your experience?", whereas when other companies try to track you, it asks "allow <company> to track you across apps and websites?"
In a world where you can't even hint at the existence of a payment mechanism outside the App Store for subscriptions?
It's not the most anticompetitive but it's definitely competing for the title in big tech.
I've thought about this, and I agree, and I'm no lover of any of the FAANGs. It makes Apple look like they know what they're doing, whilst the others are just following others and going in and out with the tide. I think it all boils down to Tim Cook. Tim always seemed analytical and strategic in his thinking, and Apple's behaviour in this matter only serves to underscore that. As an operational guy, I think Tim is the best that any company has ever seen.
You make them sound like saints. They are not, and they have a history of being less scrupulous than some of these other companies. Here's just a small example of what they're capable of when their obsession for secrecy and control makes them flex their corporate muscle: http://www.cnn.com/2011/TECH/mobile/09/07/iphone.5.probe/ind...
And we didn't even talk about their sweatshops in China which has always had persistent labor-abuse issues[0], the Chinese government boosting iPhone production with child slave labor, and all the many other scandals they've been involved with.
I really have no idea how anyone would get the notion that they are any better than the rest of the pack. Perhaps their upbeat pristine presentations, live from Cupertino. They should broadcast one from their sweatshops in Shenzhen. Some of the highschool kids they pressed into 11 hour shifts to assemble their iPhones could sing the praise of Tim Cook: https://www.cnbc.com/2017/11/21/apple-iphone-x-reportedly-as...
> And we didn't even talk about their sweatshops in China where employees kill themselves an awful lot, the Chinese government boosting iPhone production with child slave labor, and all the many other scandals they've been involved with.
This tired argument again.
Apple has always led the industry in their auditing of their supply chain and taking proactive steps to address illegal and unethical behaviour. There will always be mistakes but it's how you deal with them that counts.
If you're going to call Foxconn a sweatshop then arguably Amazon, Tesla etc should be called them as well.
Five years ago it was the massive child-slave-labor scandal, which I suppose is what you refer to as "mistakes happen". Apple chose to continue using Foxconn for production, even though they had similar scandals before and after that one, right up to the present day.
That's not a "mistake". Apple is deliberately choosing to use a producer which has had persistent issues with labor abuse, some of which were as serious as pressing schoolchildren into 11 hour shifts of slave labor.
Let's face reality here: Apple is using Foxconn because they produce their products quickly and cheaply. They don't care about the abuse as long as it makes more cheap gadgets with fat profit margins. They aren't saints, nor better than other companies in their position.
> If you're going to call Foxconn a sweatshop then arguably Amazon, Tesla etc should be called them as well.
I never said Apple is worse than the rest of the industry. I refuted the notion that they are far better, and the overall halo of sainthood hung over them in this thread.
I clarified the comment, since suicide rates aren't the only factor by which to judge or point out the issues at Foxconn. Foxconn had a long list of labor abuse[0], and apparently they still do given large-scale worker protests last November[1].
Yup, it's just that they have become the largest seller of status-signalling Veblen goods in history, in times where inequality is at its highest.
It would be nice, instead, if they used their cash hoard to stop wasting time upgrading the iPhone with more superficial shit. Bring prices of the phones down to 50 bucks. Put it in the hands of everyone. And kill the toxic advertising-supported attention economy that has caused chaos all over the world.
Now it's just another group of unimaginative, optimizing corporate robots, with no actual compass heading beyond hoarding cash.
>Bring prices of the phones down to 50 bucks. Put it in the hands of everyone.
60% of Americans already have them, and are perfectly happy to spend $1300 on them every few years, even if it means not paying rent or having to starve themselves. Why would they want to bring the price down?
I have an iPhone, but I'm not willing to spend $1300 on one. I bought my last one refurbished for about $250, an iPhone 7 Plus, about three years ago, and I still have no issues with it.
And Boost Mobile (where I got mine) is currently advertising a deal for an iPhone 8 for $80, or an iPhone 11 for $350.
Not all of that 60% are paying the full price for a brand new iPhone, probably not even the vast majority of them. iPhones tend to last (and continue to receive updates -- I'm still getting updates on mine) for quite a while.
That being said, I agree with you that they don't need to bring the price down. And I disagree with the parent, because there's already ways to get (older) iPhones pretty cheap, almost as cheap as they were calling for.
>60% of Americans already have them, and are perfectly happy to spend $1300 on them every few years, even if it means not paying rent or having to starve themselves. Why would they want to bring the price down?
Source? This comment sounds hyperbolic when many very capable iPhones versions are available for $700+ and 90% of people would not know the difference between a $700 and $1,300 iPhone.
In my family, we have an iPhone 6, X, XS, XR and 2020 SE still working.
They were never $1,300. And there has always been a perfectly viable non-top-of-the-line option for around $800 that was still very future-proof.
Some people like to claim that everyone is buying a new maxed out iPhone every couple years, but that is nowhere near the truth. If you do not need the latest and greatest camera, people can and do buy the much cheaper models.
I didn't say they were $1,300. I said they were very expensive for their time, relative to the competition. In fact, they started at $649. Also, the specs are just not comparable to the latest - from battery life, to screen quality, to storage.
I disagree they were expensive relative to their competition, if you define competition as phones that have a decade+ long history of lasting at least 4 years (including software updates).
It has been many years since 90% of people needed the "latest" specs. For the purposes of messaging, taking photos, browsing the internet, and watching media, an iPhone 13 serves the purpose just as well as a 14 Pro.
Even a $500 SE will do everything most require, aside from its worse battery life, but that is fine for people who are not out in the field, like retired people.
> Bring prices of the phones down to 50 bucks. Put it in the hands of everyone.
A $50 reduction on a $1000 price tag won’t “put it in the hands of everyone”. People who cannot afford it still won’t and it won’t change anything to people who can. $50 over the life of a device is nothing.
Besides, that’s what previous generations are for: usability is pretty much identical to the latest and greatest, and the discount is much more than $50.
> Yup its just that they have become the largest seller of status signalling veblan goods in history, in times where inequality is at its highest.
This assertion falls apart under even basic sober analysis. It may sound tautological, but the entire point of status signaling is that other people notice it. iPhones look very similar to their equivalents, and most people aren't going to be able to tell which model or year you have without a close examination – contrast with a luxury sports car, which is audibly and visibly distinct from a fair distance. If you look at what actual rich people do, you can really understand the point: there's no better phone available at any price due to how the product segment works, so they buy the same phone as everyone else, but they get things like high-end designer cases because that's where there's room to demonstrate how much money you have. Tim Cook has essentially the same phone as half the people in line at your local coffee shop; that's decidedly not true of actual luxury goods.
It also has two other fundamental flaws: the first is the assertion that there's a substantial price difference when even a bit of research would show that equivalent phones cost roughly the same amount even before you adjust for the extra years of service an iPhone will provide. These comparisons can also be complicated because, for example, if you care about battery life or CPU performance the comparison for a Pixel 7 isn't the iPhone 14 but a much cheaper iPhone 11 but the same probably isn't true if your primary buying criteria is camera quality.
The second is trying to look at this in isolation: the price differential between one phone and another just isn't that much compared to other things people spend money on — a phone costs significantly less than what most people will spend on cell service over the same timeframe, and for perspective the total lifetime cost for that phone is likely to be 1-2 months worth of rent. For something which people derive heavy value from throughout the day, that's definitely not conspicuous consumption.
The U.S. median income is something like $50k, so even if you're buying the most expensive model sold you're looking at less than one percent of median income over the average 40 months that Americans keep their phones. Contrast that with, say, cars, where the average new-vehicle buyer is spending the equivalent of that purchase price _every month_ on something they use on average less than one hour per day, and most of them are paying significant premiums for models which aren't more useful for the things they actually do, just to present an aesthetic style. If you want to talk about Veblen goods, ask why so many people are commuting to office jobs in $60-90k trucks in showroom condition.
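The "less than one percent" claim is easy to verify. A back-of-envelope sketch (the $1,300 price and 40-month cycle are from this thread; the $50k median income is the rough figure I'm assuming above):

```python
# Monthly cost of a top-end phone vs. monthly median income.
phone_price = 1_300      # most expensive model, per the thread
months_kept = 40         # average US replacement cycle cited above
median_income = 50_000   # rough US median annual income (assumed)

monthly_phone_cost = phone_price / months_kept
monthly_income = median_income / 12
share = monthly_phone_cost / monthly_income

print(f"${monthly_phone_cost:.2f}/month, {share:.2%} of monthly income")
```

That works out to roughly $32.50 a month, under one percent of income - a long way from conspicuous consumption.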
Google has the focus of a crack-addled flea and has unprofitable products with people still working on them - including probably 4 new messaging apps simultaneously.
Facebook predicted that it would lose billions in potential revenue because of Apple making tracking optional.
Microsoft is basically just cutting employees that are on products that it doesn’t care about anymore.
You can say what you will about Apple. But no one can accuse Apple of a lack of focus on profitable product lines.
Google was doing Fuchsia. SUPER cool... but what is the business case there?
They are also hamstrung by just not listening to customers, it feels like - so I think that impacts traction. You get an 80% solution with simple glaring rough edges. Thousands ask "please fix this edge," but whoever is driving the product has moved on to something else, it feels like, sometimes.
Apple's main product lines all seem to make money and have positive margins. Their R&D is speculative as well, but at least to me it seems to have a clearer path to product. I would love an Apple interface in my car vs whatever the OEM ships in most cases. Even Tesla, with the most advanced interface, just can't keep it current / smooth / familiar etc. the way Apple does (music / playlist / etc. integrations alone).
A bunch of Google’s lack of focus is their internal management culture, and nothing will change (and Google/Alphabet will not be worth being a customer of, or investing in) until the culture problem is fixed.
Google only promotes engineers when they are part of a program launch. No credit is ever given for feature launch or program maintenance. If you are not a part of a program launch within a few years at google/alphabet you are managed out.
This doesn’t create a company that I’d find worth investing my limited engineering hours in long term. A GCE salesperson called me last week and I couldn’t help but laugh when I told them that I’d just agreed to five-year terms on an AWS private pricing agreement the previous week. Google’s recent actions made me unsure GCE would still be a product in five years.
Fuchsia is (was?) a long term investment, so would be expected to not be directly profitable for a long time.
The core problem, I guess, is that it suffers from an inability to partially migrate to it. This is a common problem that many people fail to understand actually matters. For example, we know Rust is a safer language than C and C++, so people say "everything should be rewritten in Rust" - but Rust was designed with no intent or plan to support partial migration. What that means is that even Mozilla - who created Rust specifically for Gecko - has been unable to replace the bulk of their code with Rust: they can only replace entire large subsections at a time, which is harder, riskier, and as a result slower.
Fuchsia's (I guess Zircon's) adoption is hindered because the objectively superior security model also meant that none of the infrastructure around, say, the Linux kernel could work with it, so feature parity required at best significant porting, and frequently complete rewrites of large swathes of the surrounding infra. E.g. porting Android to Zircon (the Fuchsia kernel) would require significant rewriting of the existing code bases, and the time spent doing that is not time spent adding shiny new features and animations (or chat programs?).
Apple is generally much more pragmatic, which is why Swift had such good Obj-C and C interop from day 1, and why real C++ interop is something it's working on: if fixing C++ projects can only be done by all-at-once complete rewrites, you basically ensure the C++ will live forever - rewriting a file at a time, however, is much more achievable, which is personally what I want (as for Swift vs Rust, I'm on the fence; I like different parts of both, and also C++ :D).
This is very false. Fuchsia is built around the idea of making it easy to port software. It just doesn't attempt to achieve that by providing a POSIX interface. POSIX doesn't specify the vast majority of interfaces a modern OS needs to define anyway. For instance, you won't see a POSIX interface that provides a signal for when memory on the system is low and you should free some up.
Instead Fuchsia tries to target porting runtimes which applications are built against. For instance, porting chrome as a runtime unlocks web applications. Porting flutter unlocks flutter applications. Porting Android as a runtime is challenging but still achievable. It is not necessary to rewrite all of Android to accomplish this either.
At the extreme, it's possible to simply implement runtime support by implementing various virtio interfaces and running a full OS in a VM, similar to how other OS like ChromeOS and Windows achieve Android app support today. Fuchsia is written in a way that you don't need to do that, but it's always an option as well.
I think it might be worth pointing out that Apple is less inclined than other companies to talk about pre-product R&D, so it may not be as evident when certain lines of investigation don't make money.
My understanding with Fuchsia was to do better than Linux/Android in terms of security, scalability, and adaptability across devices. Why wouldn't they want to own a great OS full-stack? It's part of their bread & butter.
> My understanding with Fuchsia was to do better than Linux/Android in terms of security, scalability, and adaptability across devices.
That's the theory and the marketing copy. Since Fuchsia has only been tried on a few closed-world embedded devices (Nest hub, Nest camera), it is really too soon to declare it more secure, more scalable, or more adaptable. [By "closed-world" I mean the devices run only a limited set of apps, with functionality known ahead of time and from known developers.]
> Why wouldn't they want to own a great OS full-stack? It's part of their bread & butter.
It's not their bread and butter. Their bread and butter is web advertisement. 80-90% of all money they make comes from web advertisement.
Someone in the company said "we need vertical integration like Apple has and we don't like Android", and for a while they managed to run with it. Now they lost the internal power struggle. Oh well. It never had any tangible value to Google as a company anyway.
Android is a hugely important piece for Google. That’s pretty clearly the case for a wide variety of reasons, one of which is that they want to control the browser/app experience on mobile as much as on desktop in order to fuel those ads.
Android itself of course makes billions alone, which is enough to justify Fuchsia. They need a better setup to compete with Apple, specifically on decoupling the OS from the drivers so third-party manufacturers can ship updates, in addition to security. I think there’s even some thought it could run in the data center - and the data center drives everything Google does. They already employ entire teams that just work on the Linux kernel, compilers, etc. All of that is bread and butter.
> Android itself of course makes billions alone, which is enough to justify Fuchsia.
Ads are 80% of Google's revenue. Literally nothing else is Google's bread and butter. There's even speculation that Google makes more money from the search deal with Apple than from the entire Android ecosystem.
Android is a vehicle for Google's ads, and suffers from the company's rollercoaster of "interested/not interested" over the years.
Fuchsia by this time is just a 6-year-old money sink with no revenue story.
Apple does a much better job than most big tech companies of canceling or rebooting projects they don’t think will be successful before they ship rather than after. See for example the car project and AR/MR, both of which have been restarted more than once.
Oh yeah, the reason these other companies are laying off thousands is that they've spent years hiring indiscriminately, running many boondoggle projects, and shipping numerous loss-leader products ("we'll be profitable selling this at a loss, because we can spy on people more" - people complained about the cost of HomePod vs Alexa, etc. and were told "speaker quality!" was the reason, but all of those assistants are sold at or below cost).
The problem is that many are now doing a "we need to fire x% of the company so I can get my stock reward... I mean be profitable", and are doing it in arbitrary and short-sighted ways (stack ranking is an incredibly stupid way to cut people). So while cutting N-1 of N chat teams at Google might be reasonable, cutting teams working on potentially high-value future tech _just_ because it's not currently profitable is not. Take Fuchsia: I worked on it during my brief Google stint, and even if they did decide it didn't have a future (and I think that was not the rationale here), there were incredible engineers they've apparently discarded that would still have been good to keep working on other things at Google.
Others have said FB and G were also mass hiring to stop people being hired by each other. I'm not sure how true that is - I was never involved in interviewing people at G, and the interview system there seemed very strange to me, so even if I had been interviewing people I don't know if I'd have been aware of hiring-to-stop-FB nonsense.
OTOH the butterfly keyboard was one of their biggest stinkers, and it was not killed nearly fast enough; the new keyboards still don't feel as good as the 2012s.
Slightly off topic, but when I encounter the phrase butterfly keyboard I think of that really cool sliding keyboard that IBM came up with for the Thinkpad 701c... and not a failing keyboard switch.
Apple is full of lack of focus these days. The laptops use USB-C to charge like the rest of the world, thankfully, but the phone still uses Lightning. The new iPad supports the old Apple Pencil and not the new one for some reason. Can't let them off too easy about the butterfly keyboards either.
>Microsoft, Google, Apple all have enormous amounts of cash stored thanks to their fantastic earnings for so many years.
Your earnings are fantastic until they suddenly aren't, and by then it's too late to turn the ship around on all the mistakes that led to it. Just ask Nokia or Kodak.
And it is managed by a CEO who is choosing not to lay off workers. Are you saying that Apple is wrong not to do layoffs? You can go express with your votes at the investor meeting that you don’t want Apple to be the star of these PR pieces about being a great employer who doesn’t lay off employees.
I think you’re both seeing this discussion as way too adversarial and/or political, as well as assuming the GP was insinuating some underlying point you disagree with rather than just being pedantic.
I'm implying that Apple's cash pile belongs to investors and shareholders. Part of that money is used to pay employees. However, that cash pile is not a charity for employing workers that is no longer needed at the company.
There’s another possibility, which is that the group of executives who built the largest money printing machine in human history have a better understanding of how to maximize long term value than whoever wrote the Forbes article complaining about Apple’s cash reserves.
I don't get the point that the money should belong to the customers. They gave it to Apple freely in exchange for a product. Your argument kind of sounds like products should be sold at cost.
The recent push by Apple to promote privacy controls has attracted and renewed trust in the brand. That's my theory as a partial explanation of recent success.
They correctly identified privacy controls as something people actually want for real; even just knowing they're there is a nice feeling. Knowing my front door is reinforced with 7 optional locks is much better than a fly-screen door with a privacy policy attached.
Meanwhile, MS and Google and Facebook remove options and rely on the opposite of privacy: over-sharing by default. Telemetry by default. Ads, suggestions, ads pretending to be suggestions, bloatware.
Apple is the one commercial entity that could completely disrupt surveillance capitalism and reboot the digital space to an era reminiscent of the Microsoft monopoly - but now with their huge attached and interoperable mobile user base.
What they simply need to do is make self-hosting (using Mac devices) trivially easy and support a number of new/updated protocols for decentralized online interactions (messaging, blogs, search, social etc).
Monopolies are never optimal, but given the dismal moral basis of the other "big tech" I'd take it any day...
> make self-hosting (using Mac devices) trivially easy
They did this for years: every Mac starting with the first release of Mac OS X had a built-in Apache webserver, and activating it was just the click of a button in the Sharing preference pane.
Problem is, the segment of people who a) are willing and able to create web content, and b) only want the very basics as provided by Apple (including whatever versions of Perl and other server-side languages happened to ship with the OS) is a fairly slim one.
The other problem is, almost no one has an ISP that's friendly to self-hosting from your home.
I believe Apple removed Web Sharing from the Sharing preference pane a few years ago now.
I did not know about the built-in Apache (cool). The last place I saw this was in a NAS server. I was never into the Apple ecosystem, but I recall hearing about e.g. AppleTalk [0]
What you describe are both real hindrances, and yet somehow not fundamental ones. E.g. people migrated en masse to centralized social media even for simple personalized websites (which in retrospect might not have been in their best interests).
My sense is that as time goes on, it will take an active act of suppression for self-hosting not to become much more widespread, and of course if a major and credible player can remove friction points for non-technical people, this will only open the floodgates.
As for ISP's, yes, they are a big part of the problem.
Don't they simply need the people to actually get stuff out the door? Almost every other year teams are being shoved into another department to the detriment of a product, so they can get something else out because otherwise it won't happen. Sounds like they actually need more people, not less. To me, Apple always sounds like it's spread way too thin or running on the absolute minimum.
Meanwhile I'm still looking at bugs in my OS that have survived a decade.
The Apple brand has some serious cachet. They can attract top-shelf people, and keep the salaries reasonable.
I worked for a company that had similar "cachet," and they were cheap bastards, but they did keep people, and their posture resulted in remarkably few "non-serious" applicants. The people who applied really wanted to work there.
As a hiring manager, I appreciated not having to sort through a pile of totally unsuited résumés, and, as a bonus, the ones that were unsuited, on paper, were often the high-achieving, ambitious "diamond in the rough" types that I looked for.
I don’t think Apple went on a hiring spree like the rest of the tech giants during the pandemic, but there is still time for even Apple to start cutting jobs.
Well, Tim Cook started by halving his compensation.
I didn't hear anything like that from his fellow CEOs who all "took full responsibility" for the choice of going for mass firing instead.
It's always insane to hear about CEOs saying that they take full responsibility, but never resign, donate their money to keep employees around, or anything of the like.
I’m waiting for Zuckerberg and Pichai to say bye bye themselves or keep staff and pay them with their salary + bonuses because they took “full responsibility.”
The CEO is ultimately following the direction of the board though, right? That kind of makes it the board's responsibility. The CEO is just the public whipping boy who goes on TV and is stuck making keynotes.
It was Cook’s suggestion to the board. From the proxy statement:
“The Compensation Committee evaluates and makes compensation decisions prior to the start of each fiscal year. The results of the 2022 Say on Pay advisory vote led to broader shareholder engagement on executive compensation in 2022 of approximately 53% of institutional shares held. The Compensation Committee balanced shareholder feedback, Apple’s exceptional performance, and a recommendation from Mr. Cook to adjust his compensation in light of the feedback received.”
And yet, it's still better than all the other big tech companies, that cut thousands of jobs and did not reduce executive pay in any way.
If cutting the CEO's compensation lets them keep even one regular employee who would otherwise have been laid off, then that's made the world a materially better place, if only for that employee and their family.
I thought I saw an article recently about a memo that Nvidia’s CEO had sent everyone about how strongly he opposes layoffs in general. So Apple would not be the only one?
Beyond hiring and into product design and even marketing, I see Apple’s approach as generative. They’re willing to step into new territory, which precludes following others and frequent reactivity. This generative, self-directed style stands in contrast to the reactive approach of Google, Amazon, and most other huge megacorps.
I’m not sure what’s going on. Obviously Apple has a stronger vision than most tech companies. Obviously most tech companies chase returns and live in fear of missed opportunities. Maybe it is simply this combination of factors that sets Apple apart from the crowd.
Tim Apple was the one who brought the company from being a financial war zone to an extremely profitable company - by overhauling their logistics. He is ultimately a logistics and supply chain man, and he hates waste.
By the way, this is a good example of how vision and ideas (Jobs) will not build a successful company alone without execution. Apple always struggled with that until Jobs brought in Cook.
Of course, Steve Jobs would never allow the Weather App in the new MacOS to be so bad compared to Dark Sky ;)
Not surprising honestly. New iphones start at like $800 before tax now and only incrementally improve on the previous couple of models. Inflation is also high for housing along with a lot of other consumer goods, and my gas company just doubled the rate for January. I bet this is already priced in.
Who would Apple lay off? They don’t hire people for the sake of keeping people around. Apple teams are lean, and manager count is held to the minimum necessary.
The situation changed, and in a market economy you can adjust quickly. Not sure what the problem is.
This is unlike Twitter, where a firm was destroyed and people fired not because of the situation but because someone had money to burn. And instead of creating a new firm to compete, he just fired the people of the firm he didn’t like.
That is the “good” and the “bad” of a market system. Sorry, but both involve firing people.
I have developed strong opinions about tech companies over the years yet these discussions look more and more like sports fans arguing over whose team is better.
People try to find reasons to justify why their team is the goat and why the rest are simply worse.
Our brains have a way of "sports teamifying" just about everything. Worded properly, you could probably kick off a firestorm argument on HN with 1k+ comments about apples vs oranges.
Could it be that Apple is the only one amongst those that is more of a hardware than a software company? It's also not running a social network and not reliant on online ad sales.
Amazon has led the way to a philosophy of hire early and fire early. This is why they have run out of people who would like to work for them under the same conditions that they offered before.
Not everything is a conspiracy. It's very obvious that Musk cutting jobs at Twitter had them/him take the brunt of the news cycles about the job cut because it happened first. Each subsequent company has faced less and less blowback. I'm sure they've all been wanting to do this for a while but didn't want the headlines.
I see a lot of people quick to dismiss and down vote this comment. I understand why, people don't like unsupported supposition and conspiracy theories when they in theory pride themselves on being logical.
I'd like to point out how similar replies and derision were directed at people who suspected there was collusion to fix engineers' salaries, and that there was a "gentleman's agreement" among these very same companies to prevent hiring engineers away from each other. That happened in 2014, and it proved these same companies were all in close communication with each other in regard to staffing and salaries. They were caught and convicted, and paid as a fine a small portion of the money saved by colluding.
Do I believe we have any evidence anyone is currently colluding in order to lay off enough people to cool down a rapidly heating up labor market, and suppress the demand to work remotely and be paid the same as their peers in high cost of living areas like the Bay?
No.
Do I believe they would do so if it increased profits or manipulated the cost center they hate the most (salaries and headcount) in a beneficial way, and that they have the means to communicate with each other unofficially to coordinate such actions? Yeah, I do. They've already proven they were willing and able to do so in the past.
So while baseless accusation might be useless conspiracy fodder, cynicism is warranted when a seemingly coordinated action is being taken. Cynicism is warranted in all dealings with all of these companies. They are amoral, and exist at a level of income in which the legal system has only a tenuous hold. They have also demonstrated in the past they are willing to collude in order to hurt engineers to their benefit.
Just food for thought and discussion. I would prefer to think about how they could be caught if that's what they were doing, and how the issue could be dealt with, rather than dismissing it out of hand. However, labor conditions for people like us have so far not descended to the point where embracing the concept of unionization is appealing, and I don't see any other way engineers could fight for job security and wage increases commensurate with inflation or pinned to profit margins.
It's funny: there are a lot of brilliant people in this industry, but we are still being taken advantage of by the same old hucksters and con men. I've seen it for 20 years. You would think at some point we would find a way to fight back, even if it didn't involve unionizing, but it's the same pattern of the "business" guys robbing the piggy bank, or turning good places to work into dystopian nightmares, for one more percent of return that quarter. Then we just pull up stakes and move on to another company, make a bit more, until they do the same thing there. At some point it's all going to suck - then what?
What positivity? This is just a bunch of facts showing Apple hasn’t laid off folks in the past and didn’t over hire recently so it probably won’t lay off folks.
I get your point completely, but to me, yes there are “small” tech giants. They are the tech companies that aren’t remotely FAANG scale, but are still common household names.
Working at Apple is the equivalent of working for Target in the US.
Good brand image, everything appears great on the surface, employees all appear homogeneous - however, the internal feuds, dislike of leadership, and cost-cutting measures make it completely miserable to work at.
I’m still of the belief Apple probably should do layoffs, but won’t. The pandemic has not been kind to their profits either, but Tim will take the salary cut just to get the good PR.
> the internal feuds, dislike for leadership, and cost cutting measures make it completely miserable to work at.
Guess I am lucky. Haven't seen anyone dislike Tim or his team. I have access to all the hardware I could ever need, and I am not miserable at work. I do not think any of my teammates are either.
Having first hand seen the waste - in the form of non productive employees - I am a bit surprised by the flak that companies are getting for firing a relatively small percentage of people at the same time, especially when they are fearing a downturn...