It’s easy to build high-performance custom components, for example with Canvas, which is excellent. At the top level of an app, though, you have to do things the standard way.
That may be a good thing for usability across apps but it feels like a low code platform sometimes.
Know what you mean. SwiftUI can feel similar – great for standard patterns, but you hit walls when you want something custom. Ended up mixing in UIKit for some edge cases.
Trade-off I'm willing to make though. For a solo project, fast iteration beats pixel-perfect control.
Can understand – depends on the designer though! ;)
I've had both experiences. Good designers push you to make things better. But some insist on details no user will ever notice. Finding the right balance is key.
There's an immediate solution: local-first software.
Keeping app data purely server-side is no longer viable for customers with data sovereignty requirements, and having a toggle button saying 'Keep my data in Europe' isn't enough either because it places too much trust in the SaaS provider.
With network monitoring verifying that local applications access only user-verified endpoints, privacy reduces to OS-level security.
Data used to be emailed around, and when you explained to people that "encrypted" email usually exposes your plaintext to relays, they'd shrug. That's if they bothered with encryption at all, which most people and providers didn't until big tech started pushing the issue a decade ago.
How is that relevant to data storage, locality and access now? Secure endpoints don’t have to be managed by huge companies running data lakes which could be anywhere.
The current best security practices can be used by any organisation. I respect the engineering that Google have done: gRPC is excellent, and local-first software can absolutely use it to access endpoints with verified data locality.
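The locality-verification idea being discussed can be sketched in a few lines: before a local-first app talks to an endpoint, resolve it and check every address against networks the user has approved. This is a hypothetical illustration (the function name and approved ranges are made up, not from any real product):

```python
import ipaddress
import socket

# Hypothetical user-approved networks, e.g. the internal LAN and loopback.
APPROVED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("127.0.0.0/8"),
]

def endpoint_is_approved(host, networks=APPROVED_NETWORKS):
    """Return True only if every address the host resolves to falls
    inside an approved network. Fails closed on resolution errors."""
    try:
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return False
    addrs = {ipaddress.ip_address(info[4][0]) for info in infos}
    return bool(addrs) and all(
        any(addr in net for net in networks) for addr in addrs
    )
```

In practice this check would sit inside a monitoring layer rather than the app itself, so a misbehaving application can't skip it.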
In my experience, the best practice for sharing "health data? Government data? Corporate data? Financial data?" within an organization is to use a secure cloud platform with native data sharing functionality. The original comment's suggestion for "local-first software" doesn't work, because organizations frequently need to forward private data between individual workstations and the staff are going to do it using email if you don't give them something better.
What about it? My workplace (a university) also stores its data locally (internal network/storage) because that is where it is needed 99.99% of the time, and bandwidth costs money. On the off chance that someone needs to access something from outside, we have a host of ways to do that.
We could also have everything on a cloud in a foreign country with a mad king, but what would be the benefits of that?
Health data can reside within your hospital's network. Government data within your government's network. Etc,...
I think the point is that your doctor or civil servant or local sushi shop shouldn't have to reach to AWS/GCP/Oracle each time they want to look up an MRI or building permit or loyalty points card status.
The data should reside exactly where they’re needed and nowhere else. For the UK NHS that’s probably in a UK data centre run by a UK company. Not AWS.
The fundamental problem with SaaS and pure server side applications is we do not know where the data are. With local first we can verify data locality.
Unfortunately the American companies are using their monopolies to price out everyone else. You're now in a situation where it's harder and harder to find people in the UK who can operate data centre services at the speed and quality of the cloud providers. The UK/EU needs its own GCP/AWS/Azure alternatives. Unfortunately there's not really anyone close.
Sounds like you've already capitulated to big tech.
Governments could and absolutely should be subsidising home-grown data centres. And taxing the hell out of every square metre of AWS and Google data centres. Why not have a data tax for foreign companies?
Sure! I'm just talking about data residence. They can transfer data over the internet (or some inter-hospital network) no problem. It's just a matter of "local-first".
This should be a wake-up call to a lot of SaaS companies.
It's becoming fairly clear that keeping app data purely server-side is no longer viable for customers with data sovereignty requirements.
Having a toggle button saying 'Keep my data in Europe' won't be enough either if local-first apps can actually guarantee data location, and allow users to specify which API endpoints they use.
With network monitoring verifying application behavior, privacy reduces to OS-level security.
It still says AD in big blue letters on the new version. Not really a big deal from my point of view.
Still there does seem to be a pattern of ignoring their hardcore fanbase: using Gemini, making ads less obvious, making free apps part of paid bundles. I suspect Apple are getting a lot of pressure from shareholders, given their recent growth has been far lower than e.g. Google.
This is not a trend I like and I'm definitely looking for a Linux boat to jump on, to future proof app distribution, but there just doesn't seem to be an obvious candidate right now.
> It still says AD in big blue letters on the new version. Not really a big deal from my point of view.
Sure, but next it'll be small letters. Then small grey letters. Then small grey letters on the details page. Then small grey letters in an accordion on the details page. Then ...
It's not. Just glancing at the top of the article will reveal this is a patently false statement.
Something I don't get about Apple haters is they just spout absolute bllx for no apparent reason. I don't feel the need to defend Apple, I just want a reasoned discussion. I just don't get this attitude.
I share your frustration. I’m an Apple user and absolutely dislike everything about their direction regarding software. I have nothing but criticism for Tim Cook. Yet I see myself having to correct batshit lies people make about Apple to have a proper discussion. There’s no need to make shit up (and doing so gives Apple the opportunity to outright dismiss those as haters), Apple makes a lot of crappy decisions we can criticise in good faith.
That's a good point. Hadoop may not be the most efficient way, but when a deliverable is required, Hadoop is a known quantity and really works.
I did some interesting work ten years ago, building pipelines to create global raster images of the entire Open Street Map road network [1]. I was able to process the planet in 25 minutes on a $50k cluster.
I think I had the opposite problem: Hadoop wasn't shiny enough and Java had a terrible reputation in academic tech circles. I wish I'd known about mrjob because that would have kept the Python maximalists happy.
I had lengthy arguments with people who wanted to use Spark which simply did not have the chops for this. With Spark, attempting to process OSM for a small country failed.
Another interesting side effect of using the map-reduce paradigm came with processing vector datasets. PostGIS took multiple days to process the million-vertex Norwegian national parks; by splitting the planet into data-density-sensitive tiles (~2,000 vertices each), I could process the whole planet in less than an hour.
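The density-sensitive tiling could be sketched roughly like a quadtree split that recurses until each tile holds at most ~2,000 vertices. This is a hypothetical illustration of the idea, not the original pipeline's code:

```python
def split_tile(points, bbox, max_vertices=2000, depth=16):
    """Quadtree-style split. points is a list of (x, y) vertices,
    bbox is (minx, miny, maxx, maxy). Returns [(bbox, points), ...]
    where each tile holds at most max_vertices points (up to a
    depth limit, which guards against pathological duplicates)."""
    if len(points) <= max_vertices or depth == 0:
        return [(bbox, points)]
    minx, miny, maxx, maxy = bbox
    midx, midy = (minx + maxx) / 2, (miny + maxy) / 2
    # Four child quadrants, each with its own bounding box.
    quads = {
        (minx, miny, midx, midy): [],
        (midx, miny, maxx, midy): [],
        (minx, midy, midx, maxy): [],
        (midx, midy, maxx, maxy): [],
    }
    for x, y in points:
        key = (minx if x < midx else midx,
               miny if y < midy else midy,
               midx if x < midx else maxx,
               midy if y < midy else maxy)
        quads[key].append((x, y))
    tiles = []
    for sub_bbox, sub_points in quads.items():
        tiles.extend(split_tile(sub_points, sub_bbox, max_vertices, depth - 1))
    return tiles
```

The payoff is that each tile becomes an independent map task with bounded work, so dense areas (cities) get many small tiles while oceans get one big empty one.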
Then Google Earth Engine came along and I had to either use that, or change career. Somewhat ironically GEE was built in Java.
> a developer ecosystem tipping the incentive scales such that companies like the Googl/Msft/OpenAI/Anthropics of the world WANT to contribute/participate
I think Apache Arrow has achieved exactly that [1]. It's also very file-friendly, in that Arrow IPC files are self describing, zero-copy, and capable of representing almost any data structure.