Who's in the better position? Me, completing a Masters degree in the EU for anywhere between 10k and 20k EUR, landing a job that pays 30k EUR/year, and living the pipe dream of reaching a 100k EUR salary, and only perhaps after 10-15 years of continuous effort (and luck)? Or someone in the States who needs to spend as much as 5x more for the same level of degree (but not the same education and opportunities!), yet in the very first year of employment gets 100k per year and has vastly more opportunities to build their career?
My perspective: I'm a (Technical) Information Security Officer in the Netherlands and I'm at €120k if I include pension contributions (which Dutch people almost never include in their 'gross' salary).
And this is with all the employee protections you get when being salaried in the Netherlands.
If you elect to go for a more American-style contract with your employer, where you have little to no protection, you can easily make €200k or more. Two of my colleagues get €20k a month. The downside is that they're the first to go, since they're more expensive and the company can let them go for essentially any reason whenever their contract is up for renewal.
These are called 'independent contractors' in my country but essentially they are just employed 'at will' as Americans call it.
I feel like most people online that complain about €30k salaries in Europe (which is minimum wage here by the way) are just parroting online meme talking points.
This is a great way of discrediting somebody else's opinion, and an incredibly rude one as well.
I earn more than you do, but that's beside the point, since I'm most likely in the 95th percentile of the market. The point you're missing, and that I'm trying to reinforce regardless of the very good position I'm in, is that the 50th percentile of the market is nowhere near the 100k+ EUR figure. I've worked with hundreds of engineers across different domains in Europe, so the sample size should be pretty representative. I've also hired many engineers, so I'm familiar with the market value as well.
Finally, you're also missing the point that your sample of the Dutch market isn't representative of all European markets.
I'm sorry for the rude remark, but claiming software engineers and other IT professionals are working at the legal minimum wage came across as if you were mocking us.
My apologies.
And yes, perhaps the Dutch market is not representative of all of the EU. But then again, Silicon Valley isn't representative of all of the US, yet those salaries are usually posted as the benchmark.
Still, your argument doesn't apply, since I wasn't comparing salaries of experienced professionals but out-of-university starting salaries.
Considering the European market as a whole, I think I was even being optimistic with the 30k figure, since there are many professionals with years of experience still not managing to earn much more than that. And I'm talking about people building smart stuff: algorithmic design.
The 100k USD figure, OTOH, was pessimistic, and it didn't include SV starting salaries, which would obviously be much higher than that.
I'm talking about the EU countries and not third-world countries, but since you're being ignorant and disingenuous, there's no point in continuing this discussion. A more graceful interpretation would be that you're limited in your capability to understand distributions, percentiles, and some freshman-year statistics, in which case it should be easy to fix.
There are 44 countries in Europe. What makes you think that your experience of getting the most out of your government/taxation system, or whatever you want to call it, matches the experience of any other European country's government/system?
If you want to attract the best people in the EU, it's a fact, not even up for discussion, that you have to pay them good $$$. Those people are found in the places where the most complex infrastructure software is being built. And that, for better or for worse, is only found in FAANG, HFT, or similar very large-scale software, where even saving a few cents per line of code brings big profits to the company. How many such EU-based products are out there?
Your perspective of "delivering more work in less time" is cute. Put yourself in a pool of 95th-percentile engineers on the market and you will soon find that your experience is not representative.
I do not. Even with those skills you're still looking at a lot of hours to put in. It just happens that those skills are more or less a prerequisite for a better chance of success.
It's not false. Those are the people who have already proven they have the necessary experience to build those products, so the rate of success is simply much higher than when seeking the talent elsewhere. Startups and US companies apply this strategy all the time: you want success, you poach the good people from competing companies.
I think people outside the domain often underestimate the complexity of Internet-scale infrastructure software and hardware. You don't find such challenges easily elsewhere, so simply paying more money to a random engineer doesn't stand a better chance than paying the same money to an engineer already established in that domain.
> so the rate of success is simply much higher rather than seeking the talent elsewhere
This is true to some extent, but I don't want the riffraff that built AWS building the EU infrastructure. It works to some extent, but some consistency would be a very welcome addition.
I don't understand what your concern is. What consistency? If you need engineers with domain expertise in storage engines at scale, or in query-execution challenges, where else do you find those people if not at the existing businesses that have already proven they have a workforce delivering exactly that type of work?
The recipe is simple but nobody supposedly wants to do it.
1. Find (e.g. on LinkedIn) all the EU-based engineers working or having worked for American cloud companies.
2. Offer them a proportional salary to join and start building the EU cloud product.
3. Incentivize the engineers to top-performance by giving them bonuses, stocks, whatever ...
4. Go back to garage engineering and cut off the bureaucratic BS with endless PMs, milestones, JIRA boards, scrum masters, chapter masters, architects, and all other similar BS.
But the EU doesn't want to do it - it would rather spend money in ways where "EU funds", which are supposedly public, end up running directly into their pockets. There are dozens of examples out there, with "AI factories" being the latest embodiment of that.
You describe a plan for building it. That is but one piece of the equation. How will you sell it? Why should people pick you over the established AWS/Azure/GC? Will you target new projects, or get people to migrate? How do the strategies differ between the two? What is the timeline to becoming profitable? How will you fund it in the meantime?
I'm aware of that. Please see my other comment, which already addressed your concern. My point, rather, was that the EU strategy won't even get us that far, because it doesn't have, or doesn't want to have, a plan.
The EU is run by lawyers and political science majors. They have absolutely zero idea how to set up proper incentives. Any hope of the EU getting its act together is wasted. Much like our tax euros.
Even if you manage to build the product (R&D), you then have to incentivize the private sector and governments to switch to the EU product. There are only a few options for making this happen, and tariffs could be one of them. A natural transition won't happen, for many reasons.
If I derive my work from multiple sources, do all the copyright holders of those sources have an exclusive right to my work? How else would people build knowledge on a topic and then apply that knowledge to build a product, if not by reading a bunch of (book) material and studying other, similar products?
Rust is rather heavy on its copy/clone-imposed semantics, making it potentially less suitable for low-latency or large-data-volume processing workloads. Picking Rust for its performance potential alone means that you're going to have a harder time beating other native, performance-oriented stream processing engines written in either C or C++, if that is your goal, of course.
This logic
> written in rust will have better performance, lower latency, ..., lower memory footprint
is flawed, cargo-cult programming, unless you say what you are objectively comparing it against and how you intend to achieve those goals. Picking the right™ language just for the sake of these goals won't get you far.
> Rust is rather heavy on its copy/clone-imposed semantics, making it potentially less suitable for low-latency or large-data-volume processing workloads. Picking Rust for its performance potential alone means that you're going to have a harder time beating other native, performance-oriented stream processing engines written in either C or C++, if that is your goal, of course.
There is absolutely nothing in Rust's semantics preventing you from writing high-performance data processing workloads in it; in fact, it's one of the best languages for that purpose. Beyond that, the usual barrier to entry for working on a product like this written in C++ is incredibly high, in part because stability and safety are so critical for these products, which is one of the reasons that in practice they are often written in memory-safe languages, where C++ is not even an option. Have you worked on any nontrivial Rust data processing product where "copy/clone imposed semantics" somehow prevented you from getting big performance wins? I'd be very curious to hear about it if so.
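To make the point concrete: Rust doesn't force copies or clones on you for bulk data work; shared and mutable borrows let you scan and mutate a large buffer entirely in place. A minimal sketch (function names are made up for illustration, not from any real engine):

```rust
// Checksum and transform a large buffer with zero copies and zero clones:
// both functions borrow the data, so the allocation stays with the caller.
fn checksum(data: &[u64]) -> u64 {
    // Iterates over the borrowed slice in place; no clone needed.
    data.iter().fold(0u64, |acc, &x| acc.wrapping_add(x))
}

fn scale_in_place(data: &mut [u64], factor: u64) {
    // Mutable borrow: modifies each element where it lives in memory.
    for x in data.iter_mut() {
        *x = x.wrapping_mul(factor);
    }
}

fn main() {
    let mut buf: Vec<u64> = (0..1_000_000).collect();
    let before = checksum(&buf);  // shared borrow, no copy of the buffer
    scale_in_place(&mut buf, 3);  // mutable borrow, transforms in place
    let after = checksum(&buf);
    println!("{} {}", before, after);
}
```

The borrow checker enforces at compile time that the mutable borrow doesn't overlap the shared ones; `.clone()` only ever happens if you write it explicitly.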
Stability and safety are the least of the concerns in data-processing and database workloads. That's absolutely not the reason we saw an increase in these systems written in Java and similar alternative languages during the 90s and early 00s. It was ease of use, a low entry bar into the ecosystem, and general developer-pool accessibility. Otherwise, cost is the main driver in infrastructure software, and the reason we see many of these rewritten in exactly C++. Rust is just another contender here, usually because of its performance and a lot of recent hype, which is fair.
> Stability and safety are the least of the concerns in data processing and database workloads. That's totally not the reason why we saw an increase of these systems during the 90s and early 00s written in Java or similar alternative languages.
not_sure_if_serious.jpg
To be extra clear about it (and to avoid pure snark, which is frowned upon here on HN): that's the kind of software (alongside a lot of general enterprise code) that got rewritten from C++ to Java, not the other way around. The increased safety of Java was absolutely a consideration. Java was the 'Rust' of the mid-to-late 1990s and 2000s, only a whole lot slower and clunkier than the actual Rust of today.
I am serious. C is a simple language but rather complicated to wrap your head around, since it requires familiarity with low-level machine concepts. C++ ditto, with the difference that it is a rather complicated language built on rather advanced programming-language concepts - something that did not really exist at the time. The net result was a very high entry barrier, and this, not "safety" as you say, was the main reason many people were running away from C and C++ to Java/C#, since those were the only alternatives we had at the time. I don't remember "safety" being mentioned at all during the past 20 years or so, up until Rust came out. "Segfaults" were the 90s and 00s "safety" vocabulary but, as I said, that was a skill issue.
The frenzy around "safety" is IMO way overhyped, and when you and the OP say that "safety" plays a huge role in data-processing and database-kernel development: no, it is literally not even 1% of the time a developer in that domain spends. C and C++ are still used in those domains, full on.
> that's the kind of software (alongside a lot of general enterprise code) that got rewritten from C++ to Java, not the other way around
So you agree that many people were absolutely "running away from C and C++ to Java/C#", but somehow this didn't involve any data-processing code, even though arguably the main thing internally developed enterprise code does is data processing of some kind? OK, I guess.
> Which C or C++ engines exactly got rewritten to Java?
It's difficult to give names, precisely because private enterprise development was involved. But essentially every non-trivial Java project from the mid-1990s onward would've been written in C++ had it been started in the late 1980s or earlier in the 1990s. It's just not very sensible to suppose that "data processing" as a broad area was somehow exempted from this. And if writing segfault-free code in C/C++ could be dismissed as a mere "skill issue", we wouldn't need Rust either. It's a wrong take today, and it was just as wrong back then.
(And yes, Java took significant steps forward in safety, including adding a GC - which means no wild pointers or double-free issues - and converting "null pointer" dereferences into a properly managed failure, with backtraces and all that. Just because the "safety" vocabulary wasn't around back then except for programming-theory experts, doesn't imply that people wouldn't care just as much about a guarantee of code being free from the old segfault errors.)
You're a servant to the business needs, so whatever the business needs are at that moment. It's a vague answer, probably not appealing to many engineers, but that's what it really is. You're solving problems for your business stakeholder and for your business stakeholder's clients.
In other words, the programming language is usually not the focus of daily development, given that there's always a much bigger fish to fry in this domain. But if Rust provides such an undisputed benefit to your business model, while keeping its cost and risk viable for the business, then it's going to be a no-brainer. The chances of that being the case are very, very low.
So my advice would rather be: use whichever language you prefer, but don't dwell on it. Put your focus instead on innovating workload-specific optimizations that solve real-world issues, ones that are palpable and easily proven/demonstrated. Study the challenges of storage engines, data-processing engines, or vectorized query-execution algorithms. Depending on the domain problem you're trying to solve, make sure that your language of choice does not get in your way.
Why do you have to beat a native performance-oriented streaming engine written in C or C++?
Currently, most of the mainstream stream processing engines are written in Java. Sorry, I should have added qualifiers to avoid misunderstandings.
Software has no silver bullets, and neither do programming languages; each has its own strengths. I also like using Go and Java to develop software.
So if you don't want to beat the native engines in performance, what is it that you're trying to solve that the Java-based engines don't already? I think it's pretty important to set a vision upfront; otherwise you're setting yourself a trap for a quick failure.
That's a very simple and unbiased model view. In reality, many people might read your profile as "so, you claim to have the skills, but how come you don't have a job already?", aka "there's something wrong with this guy".