But how can it be a problem? Working on a project just for fun is totally valid. Is this not «Hacker News»?
For the record, this is the definition of being a hacker by Stallman: «Being a programmer doesn't mean being a hacker: it means appreciating playful cleverness. Now, you can program without being playfully clever and you can be playfully clever in other fields without programming.»
1. I'd say it's about 50/50. I say SSL instinctively, and TLS when I think about it and remember we don't say SSL anymore. It's been like that for around 10 years now; before that I'd only say SSL.
2. I started programming professionally in 1998 and I'm in my early 50s.
I have a similar experience with providing users with Excel files, but I'd also add that in a lot of businesses, the number one competitor for a web application is the good old Excel file (or its modern cloud version), and it's sometimes a challenge to beat.
I really hate how much work happens out of Excel/Google Sheets, but there's no denying that spreadsheets do a lot of heavy lifting without having to fuss with putting a DB together. Especially nowadays when two people can simultaneously work in a spreadsheet.
You are so right about computers; apart from the Raspberry Pi (UK), not much has been done in Europe.
Regarding the EU cloud, it is definitely about sovereignty, especially since the CLOUD Act (H.R. 4943). In the context of the global trade war initiated by the US of A, it also makes sense from a European Union perspective.
No he couldn't. That site uses a non-geographical definition of Europe that excludes Britain. "EU, EEA, EFTA, or DCFTA member country" also excludes Serbia, Turkey, Belarus and Russia but includes several countries that aren't able to join the EU at all due to corruption or misaligned legal systems.
This really shows the incoherence of the whole Euro project. There's no such thing as European-ness: when the sort of people who wave the blue flag use the word Europe they are imagining an ideological construct subject to inconsistent and ever-shifting definitions. They don't even agree with each other what European means. One minute Britain is in Europe and Ukraine isn't, events happen, and suddenly Ukraine is European via DCFTA and Britain isn't. Switzerland is a similar case: sometimes it's considered to be European by these types, and other times not.
Why should I, a man born in Britain now living in Switzerland who has worked on two different US clouds, want to apply that experience to a Eurocloud given this history? This sort of thing is why it will never inspire much loyalty, and why Bert instantly gives up on the idea of a European cloud being used because it's actually good. The resort to force of law underpins the entire project, because the European identity is a sort of social engineering programme, not something organically developed.
To be fair, the non-geographical definitions that exclude Britain only do so because Britain excluded itself from the European Union in 2020.
And yet that website doesn't use the EU as a definition of European, so it clearly doesn't matter in this case. That's what I'm getting at: the word European doesn't seem to mean anything because the people who use the word the most are relying on definitions that yield unintuitive and self-defeating outcomes, like deliberately excluding one of the countries in Europe that actually does have a bit of a tech ecosystem.
OVH (French) is very well known and I like them a lot. I've used them a lot for domains, because they are very cheap and their management interface is nice.
I also like Scaleway (also French) very much for price and quality of service; I used them for years at my startup and can highly recommend them.
I've also heard a lot of good things about Infomaniak (Swiss), but have never used them myself.
Would love to hear about other European cloud providers, with comments from users.
Not primarily; at first it was just "we're the V in MVC", and people were using classes. The hooks came when the React team got on the "functional programming" wagon, and suddenly everything was about immutability, side effects, etc., and very little about giving developers control over their components' life cycle.
React was literally first created in ML by an author who disliked MVC and preferred functional programming and immutability[0]. For a long time, React aspired for its future to be in ReasonML.
Hooks came as no surprise to anyone who paid attention, as the recommended way to write components since at least 2016 was in the stateless functional style whenever possible, and many of us used recompose[1] to simulate hooks long before their introduction.
Maybe I was not paying attention, that is of course a possibility, but until at least early 2019 the React website's main page only mentioned the class-based "stateful component" as the way to write components.
Although function components were mentioned in the documentation at the time, I can't say how mainstream that was. Hooks were introduced in Feb. 2019.
I migrated most of my self-hosted services from multiple RPis (I'd been using several of them for years) to a single cheap Intel N100 NUC that I purchased last year: 16GB RAM / 512GB SSD for 156€, and I've been very pleased with it.
You lose some resilience when you do that: if the NUC fails, you lose everything, whereas if you distribute your services across multiple RPis, the failure of one RPi is not catastrophic.
venv + requirements.txt has worked for every single Python project I've made over the last 2 years (I'm new to Python). The only issue I've had was using a newish Python version before a specific library had been released for it, but downgrading Python solved that.
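For anyone else new to the ecosystem, that whole workflow is only a handful of commands. This is a minimal sketch; the `requests` dependency named in the comments is just an illustrative placeholder:

```shell
# Create an isolated environment inside the project directory
python3 -m venv .venv

# Install dependencies with the environment's own pip, e.g.:
#   .venv/bin/pip install requests

# Pin the exact versions of everything installed
.venv/bin/pip freeze > requirements.txt

# On another machine, recreate the environment from the pin file:
#   python3 -m venv .venv
#   .venv/bin/pip install -r requirements.txt
```

Committing requirements.txt alongside the code is what makes the environment reproducible.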
Being new to the ecosystem I have no clue why people would use Conda and why it matters. I tried it, but was left bewildered, not understanding the benefits.
The big thing to realise is that when Conda was first released, it was the only packaging solution that truly treated Windows as a first-class citizen, and for a long time it was really the only way to easily install Python packages on Windows. This earned it a huge following in the scientific community, where many people don't have a solid programming/computer background and generally still ran Windows on their desktops.
Conda not only manages your Python interpreter and Python libraries, it manages your entire dependency chain down to the C level in a cross-platform way. If a Python library is a wrapper around a C library, then pip generally won't also install the C library; Conda (often) will. If you have two different projects that need two different versions of GDAL, or one that needs OpenBLAS and one that needs MKL, or two different versions of CUDA, then Conda (attempts to) solve that in a way that transparently works on Windows, Linux, and macOS. With venv + requirements.txt you're out of luck and will have to fall back on doing everything in its own Docker container.
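As a rough sketch of what that looks like in practice, an environment.yml can pin native libraries right next to Python ones. The names and versions below are hypothetical illustrations, not recommendations:

```yaml
# environment.yml -- hypothetical pins for illustration
name: geo-project
channels:
  - conda-forge
dependencies:
  - python=3.11
  - gdal=3.8        # pulls in the GDAL C library and its Python bindings together
  - openblas        # native BLAS implementation, no compiler needed
  - pip
  - pip:
      - some-pypi-only-package   # placeholder for anything only on PyPI
```

`conda env create -f environment.yml` then resolves the whole chain, C libraries included, on each platform.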
Conda lets you mix private and public repos, as well as mirror public packages on-prem, in a way that's much smoother than pip, and it has tools for things like audit logging, fine-grained access control, package signing, and centralised controls and policy management.
Conda also has support for managing multi-language projects. Does your Python project need Node.js installed to build the front-end? Conda can manage your Node.js install too. Using R for some statistical analysis in part of your data pipeline? Conda will manage your R install. Using a Java library for something? Conda will make sure everybody has the right version of Java installed.
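A hedged sketch of such a multi-language environment, with purely illustrative version pins:

```yaml
# environment.yml -- hypothetical multi-language environment
name: data-pipeline
channels:
  - conda-forge
dependencies:
  - python=3.11
  - nodejs=20       # for the front-end build
  - r-base=4.3      # R for the statistical analysis step
  - openjdk=17      # everyone gets the same Java runtime
```

One file, and every teammate's Python, Node.js, R, and Java versions match.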
Also, it at least used to be common for people writing numeric and scientific libraries to release Conda packages first and only eventually publish on PyPI once the library was 'done' (which could very well be never). So if you wanted the latest cutting-edge packages in many fields, you needed Conda.
Now, there is obviously a huge class of projects where none of these features are needed. If you don't need Conda, then Conda is no longer the best answer. But there are still a lot of niche things Conda does better than any other tool.
> it manages your entire dependency chain down to the C level in a cross platform way.
I love conda, but this isn't entirely true. You need to opt in to a bunch of optional compiler flags to get a portable yml file, and even then it can often fail on different OSes/versions anyway.
I haven't done too much of this since 2021 (gave up and used containers instead) but it was a nightmare getting windows/mac builds to work correctly with conda back then.
> it was a nightmare getting windows/mac builds to work correctly
I think both statements can be true. Yes, getting cross-platform Windows/macOS/Linux builds to work using Conda could definitely be a nightmare, as you say. At the same time, it was still easier with Conda than with any other tool I've tried.