The horrible naming thing is epitomised by the Frankenstein that is "ASP.NET Core 2.3", which is a re-release of "ASP.NET Core 2.1", which is actually a .NET Framework thing.
Re-released because ASP.NET Core 2.1 had a better support lifecycle than ASP.NET Core 2.2, and people "upgraded" to 2.2 not realising this.
So people got stuck on unsupported 2.2, rather than the supported 2.1.
So they re-released 2.1 as 2.3, to give people an "upgrade" path from 2.2 to 2.1.
All named "ASP.NET Core" which, confusingly, is a separate thing from ASP.NET running on .NET Core.
(.NET Core which is now just ".NET", of course.)
I swear .NET has had the best technology focus for the past 5 or so years, but has had absolutely the worst branding and a complete lack of developer relations / brand ambassador type work.
They need someone who can turn around and tell them to stop this madness, and focus on getting the word out that .NET is a fantastic ecosystem that performs damn well, and that modern C# is a pleasure to write in.
But instead we get stuff like "EF Core 9" that targets .NET 8, and all the 9.0.x system libraries, where it isn't clear whether you should be using them if you want to stick to the LTS and target .NET 8.
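For what it's worth, the combination does build and run: a net8.0 project can pull the 9.0.x packages, since EF Core 9 targets net8.0. The confusion is that the support windows no longer line up, because the 9.0.x packages follow the .NET 9 lifecycle as far as I can tell. Something like this (versions purely illustrative):

    <PropertyGroup>
      <!-- LTS runtime target -->
      <TargetFramework>net8.0</TargetFramework>
    </PropertyGroup>
    <ItemGroup>
      <!-- EF Core 9 runs on net8.0, but is a shorter-lived (STS-cadence) package -->
      <PackageReference Include="Microsoft.EntityFrameworkCore" Version="9.0.0" />
    </ItemGroup>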
Porting .NET framework to .NET core / standard / 6+ really isn't as smooth as MS would like you to believe.
And the support lifecycle for .NET 4.8 is ironically better than it is for .NET 6 (already dead) and .NET 8, because of the shipping-with-Windows thing.
The odd-numbered .NET releases aren't even worth looking at from an enterprise perspective.
So yeah, there'll be a bunch of .net framework code around for a while.
That said, it's frustrating, because in some places the performance just isn't there, and there are so many nuanced differences between them that there's a weird mental overhead when trying to switch between them.
You not only have to recall which APIs are available in each, but also the performance differences between otherwise identical (or near-identical) APIs.
> Porting .NET framework to .NET core / standard / 6+ really isn't as smooth as MS would like you to believe.
Having just spent the better part of 3 months porting a lot of legacy projects, you're absolutely right, but I am not sure where MS said this would be smooth. There are some processes that are standard enough, but certainly there is a lot of hands-on work involved, and actually for good reason: there are performance and security issues that need to be addressed (in my brief, and hopefully final, experience).
That said, porting between even-numbered .NET versions (the stable LTS versions), such as 6 -> 8, has so far been flawless; I think over the last year we ran into maybe some minor semantic issues.
It's not perfect, but I have to defend MS unfortunately, they are taking things in the right direction while being tied to a LOT of legacy enterprise clients and systems.
I'd add that WPF was in "don't bother" territory until fairly recently, when WPF support was added. (Though I don't know if that means stock WPF or that it supports the breadth of WPF controls out there, and whether it requires WPF component vendors to make changes.)
We just migrated our massive WPF application to .NET 8 and there are a handful of libraries that are not supported or had to be replaced but overall it was surprisingly smooth. The biggest issue we have is that garbage collection seems to have taken a huge hit and there are bugs in the Microsoft WPF components (any sort of list view in particular) but it's all relatively easy to work around.
Yes, it's partial parity, and it's server-only. It's also not easy to migrate; the upgrade-assistant tool typically just gives up.
The client libraries only came separately, and much later[1], and in typical "fuck your migration path" fashion, don't have any .NET Standard support.
Support for .NET Framework 4.8 is essentially forever, while .NET Core requires you to jump one version up almost every year, which has a much higher maintenance cost.
This is completely false and goes against the experience of the vast majority of the teams.
Using .NET Framework carries significant risk and opportunity cost. Updating LTS every 2 years usually comes with simply bumping up versions and rebuilding.
It’s not “completely” false. Otherwise this sentiment wouldn’t exist and .NET 4 would be a nice memory like Windows 2000.
Opportunity cost? Sure, but you can’t deny that small teams that just wanted some stuff to scratch a business itch were able to put the shitty code on whatever Win server they had at the time and forget about it. They didn’t have to ask IT to evaluate any installs, it was right there. And today, on Win 10 servers, that code still runs, returning ugly aspx.
We have some equivalent 2.1 stuff laying around and no one wants to go through the trouble of updating that.
You’re probably going to say something like “well that’s what you get when you just let your code rot and no one cared to document it and maintain it”. Yes, true, but that’s not the point here: one of those constantly comes up in vulnerability reports and one doesn’t.
Yep, this is correct. Not to mention the maintenance cost of .NET Framework going up every year because the library ecosystem moves onwards, eventually, from legacy behaviors.
And also, bumping the .NET version from 8 to 9 or 9 to 10 is typically a very small effort for most. I think what the OP is describing is working for a software cost center where they don't value the software system itself.
But you will only be able to hire the employees willing to tolerate working with it. Especially if there's no plan for any form of modernization.
And as phillipcarter noted, more and more libraries are dropping the 'netstandard2.0' target; depending on the workload the performance difference is going to be anything between 50% and multiple orders of magnitude (e.g. the Regex engine has gotten tens of thousands of percent faster for some patterns).
Working with .NET Framework from a non-Windows environment requires using Parallels / UTM on macOS, or QEMU or any other VM environment on Linux - there's Mono, but it's not a replacement, and I don't think Visual Studio will work under Wine.
There are all sorts of hidden costs that you don't plan for, that come with using .NET Framework.
One of the bigger problems with staying on .NET Framework is that it limits you to C# 7.3, which lacks a lot of new language features (e.g., most pattern matching, records, nullable reference types, etc.). It can be both frustrating and even potentially career limiting to be stuck on that for too long.
You can use newer LangVersions with older targets, including .NET Framework. A good deal of features can be made to work by just setting LangVersion to 13 and doing dotnet add package PolySharp, along the lines of the sketch below. But of course this won't bring missing span-based APIs, new ASP.NET Core and EF Core, etc.
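Roughly like this (project file contents are illustrative, and the PolySharp version is deliberately a wildcard):

    <PropertyGroup>
      <TargetFramework>net48</TargetFramework>
      <!-- the compiler version is independent of the runtime target -->
      <LangVersion>13</LangVersion>
      <Nullable>enable</Nullable>
    </PropertyGroup>
    <ItemGroup>
      <!-- source-generates the compiler-visible types (IsExternalInit etc.)
           that records, init-only setters and friends need on old TFMs -->
      <PackageReference Include="PolySharp" Version="1.*" PrivateAssets="all" />
    </ItemGroup>

With that in place, records, newer pattern matching and nullable annotations compile fine against net48; it's the runtime/BCL-dependent stuff (span-based APIs, new ASP.NET Core / EF Core) that stays out of reach.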
However, the workplaces that adamantly refuse to upgrade also select for developers who may not be familiar with pattern matching or other features at all, and never read release notes.
Honestly, if a project in 2025 is on .NET Framework, I am simply not going to believe any statements about pending modernisation. Simply because if it was going to happen, it would have happened already.
Not sure why this is so aggressively downvoted. I work in legacy software and it's true. A firm that's still on .NET Framework, PHP 5.6, or Java 8 is likely to stay on those versions for the near future.
We just moved from 4.6.1 to 4.8 this year. We maintain a frontend(++) for a major enterprise software package, and follow them. They just moved to 4.8 so we did as well. Not even the point release. The contractor I work for is pretty conservative, he likes his 1+ million dollar home and worldwide vacations. He won't be changing anything without being forced by Microsoft.
Even if Microsoft puts in a huge effort to make a migration tool seamless, we wouldn't use it. It'll only happen when either the larger enterprise corporation migrates, or when Microsoft forces people off 4.8.
Modern .NET as well. If you do a framework-dependent deployment, then in many cases you only need to run Windows Update, just like with .NET Framework.
Yes... if you do self-contained deployments or build containers, then of course you're tied to the version you've built with - but that is your choice and therefore your responsibility.
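Concretely, the split is just a publish-time choice (the runtime identifier below is only an example):

    # framework-dependent: uses the machine-wide runtime,
    # which Windows Update / Microsoft Update can patch underneath the app
    dotnet publish -c Release --self-contained false

    # self-contained: the runtime is copied next to the app,
    # so patching it means rebuilding and redeploying
    dotnet publish -c Release --self-contained true -r win-x64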
You are now blaming the user for the fact that some deployment choices cannot be updated that way. It is the wrong mindset.
> .NET Core requires you to jump one version up almost every year,
Close.
* The requirement is to jump every second year, when an LTS version comes out. The last one was .NET 6 to .NET 8. And next, .NET 10 will come in November 2025. You can jump one version every year, but it's not a "requirement".
* It's just called ".NET" now, not ".NET Core". There is no ambiguity in e.g. ".NET 8" - it's the modern version formerly known as ".NET Core".
* Releases are every year in November, and every second one is LTS. There's no "almost", it's every time. IDK if this will still be so in 10 years' time, but it's been consistent.
* Maintenance cost? Yes, you have to review project files and build pipelines in order to update, so there's some detailed work, but the actual breaking changes and puzzling issues are few to non-existent. It typically goes very smoothly after you change some numbers, along the lines of the sketch below.
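For a typical service the "change some numbers" part is usually just this (file name and versions are only an example):

    --- MyService.csproj
    -  <TargetFramework>net6.0</TargetFramework>
    +  <TargetFramework>net8.0</TargetFramework>

plus bumping the SDK version in the build image / global.json, then rebuilding and re-running the tests.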
For what it's worth, migrating from Java 8 to 11 is much smoother than .NET 4.x to .NET Core / Standard / 6+. In all honesty, across all the projects we've migrated it was only ever so slightly more painful than migrating 11 -> 17 or 17 -> 21. You typically have to upgrade some libraries and your version of Gradle (assuming you're using it).
There is a much stronger case for enterprise apps being stuck on .NET 4.x.
> The odd-numbered .NET releases aren't even worth looking at from an enterprise perspective.
Not hard to argue the opposite either; since you have to upgrade every two-ish years anyway, is the LTS track really that valuable compared to updating every March-ish, once the latest release has been proven? 6 -> 7 -> 8 -> 9 was trivial too.
Meanwhile a friend of mine still has 4.6 going strong and I’d struggle to argue a reason to migrate.
In my experience porting from .NET Framework to .NET Core isn't that difficult per se -- it is the frameworks that we are stuck on that cause the difficulty, namely if you have a large project using ASP.NET pages and Entity Framework classic, which don't really have compatible .NET Core versions. The .NET Core replacements are objectively better, but have significant conceptual and API differences that prevent a simple move.
There are some community projects that try to fill this void, by porting Entity Framework classic to .NET Core, for example.
What surprises me more is how young Subversion is in comparison to git; it's barely older.
I guess I started software dev at a magic moment, pre-git but after SVN was basically everywhere, so it felt even more like it had been around forever vs the upstart git.
Any version control where you had to manually (and globally) "check out" (lock) files for editing was terrible and near unusable above about 3 people.
Version control systems where you didn't have shallow branches (and thus each "branch" took a full copy / the full disk space of the files) were awful.
Version control systems which would corrupt their databases (here's to you, Visual SourceSafe) were awful.
Subversion managed to do better on all those issues, but it still didn't adequately solve distributed working issues.
It also didn't help that people often configured SVN to run with the option to add global locks back in, because they didn't understand the benefit of letting two people edit the same file at the same time.
I have a soft spot for SVN. It was a lot better than it got credit for, but git very much took the wind out of its sails by solving distributed (and critically, disconnected/offline) workflows just that much better, so that developers could overlook the much worse UX, which remains bad to this day.
>It also didn't help that people often configured SVN to run with the option to add global locks back in, because they didn't understand the benefit of letting two people edit the same file at the same time.
I think it was more that they were afraid that a merge might some day be non-trivial. Amazing how that fear goes away once you've actually had the experience.
(I had to check because of this thread. SVN and Git initial releases were apparently about 4 and a half years apart. I think it was probably about 6 years between the time I first used SVN and the time I first used Git.)
It's always hard to describe the minutiae of things happening in the span of just a couple of years, but I think you're overly broad here.
Wikipedia tells me the initial release of Subversion was in late 2000, and for git it was 2005 - but although those were kinda just smack in the middle of my first years online, learning to code, starting with FLOSS work, and so on - I think those years were pretty important with the shift to the WWW and then web 2.0.
I basically don't remember a world without SVN, but that's probably because I just missed the cutoff and projects and companies were migrating from CVS from 2002 on or so, because the model was very similar and while it wasn't drop in, it made sense.
For git I want to say it took just a little longer, and the decentralized model was so different that people were hesitant, and before GitHub in 2009 (I know it was founded in 2008, but my user id is below 50000 and it felt very much new and not at all widespread in non-Rails circles before that) I would have called it a bit niche, actually - so it's more like a 7-year span. But of course I was living in my bubble of university, and working for 2 small companies and as a freelancer in that time. I think bigger FLOSS projects only started migrating in droves after 2010/2011. But of course my timeline could be just as wrong :D
Yeah, odd to learn. I remember dipping my toes into source control, playing around with CVS and SVN right around when git was originally announced and it felt so "modern" and "fresh" compared to these legacy systems I was learning.
There were far, far worse things out there than Subversion. VSS, ClearCase, an obscure commercial one written in Java whose name escapes me now..
Subversion was basically a better CVS. My recollection is that plenty of people were more than happy to switch to CVS or Subversion (even on Windows) if it meant they could escape from something as legitimately awful as VSS. Whereas the switch from Subversion to Git or Mercurial had more to do with the additional powers of the newer tools than the problems of the older ones.
No, but at this point there should be mass protests. Deporting innocent people to El Salvadoran prison for life without due process? If people aren't (at least figuratively) up in arms about that, then what?
Protests in blue cities will do nothing right now. We need to field candidates in primaries against complacent democrats. And we need protests in red districts (and apparently at Tesla dealerships, given that’s setting Musk off).
Maybe it was just the way I was taught percentages ("%" means "/100", "of" means multiply), but it seems too trivial a result to even be something worth commenting on.
> but it seems too trivial a result to even be something worth commenting on.
Yes, but I also think it is memetic and lovely enough to remember if you've encountered it. Calculating 52% of 25kg of something as 52/4 = 13 sparks joy.
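For anyone who hasn't run into it: the trick is that "a% of b" and "b% of a" are the same number, since both equal a*b/100. So:

    52% of 25 = 25% of 52 = 52/4 = 13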
No, just putting the results into a set of objects as boxes and moving them along. I don't use ORM layers. I just put the data coming back from the DB directly into their neat boxes, string them along in a vector, and pass that along.
I also write my templated queries myself and talk with the server directly. No need to go "FizzBuzz Enterprise Edition" for a simple task.
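Something like this, as a minimal C# sketch of that pattern (the table, the record and the connection string are made up, and I'm assuming SQL Server via Microsoft.Data.SqlClient; swap in whichever provider you actually talk to):

    using System.Collections.Generic;
    using Microsoft.Data.SqlClient;

    public sealed record Widget(int Id, string Name);

    public static class WidgetStore
    {
        public static List<Widget> LoadWidgets(string connectionString, int minId)
        {
            var results = new List<Widget>();

            using var conn = new SqlConnection(connectionString);
            conn.Open();

            // hand-written, parameterised query - no ORM in between
            using var cmd = new SqlCommand(
                "SELECT Id, Name FROM Widgets WHERE Id >= @minId", conn);
            cmd.Parameters.AddWithValue("@minId", minId);

            // each row goes straight into its neat box, boxes go into the list
            using var reader = cmd.ExecuteReader();
            while (reader.Read())
                results.Add(new Widget(reader.GetInt32(0), reader.GetString(1)));

            return results;
        }
    }

The query text can come from whatever templating you already do; the point is that the results land in plain objects and get passed along as-is.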
When the data you need isn't entirely contained w/in those "neat boxes" you realize quickly the error was trying to force those "neat boxes" onto your data model.
Before starting to code any application, I tend to verify the ideas by first doing the design in my mind, then spending some time with a pen and paper, and maybe with some diagramming software. If I can't see the whole machinery in front of me, I don't start coding.
In your case, if the data is not fitting into neat boxes, it's not being forced into that in the first place. I select tools/features according to the problem at hand, not try to fit what I have into the problem at hand. If that requires a new language to learn, I'm down with that too. This is why I learnt Go, for example.
Sometimes the design stretches to its limits. If that happens, I stop, refactor and continue development. I call this outgrow/expand model.
It's not the fastest way to develop software, but it consistently gives me the simplest architecture to implement the requirements, with some wiggle room for the future.
For example, the latest tool I have written has broken the design I made in the beginning, because I outgrew what I designed (i.e. added more functionality than I anticipated). Now I'm refactoring it, and will continue adding things after refactoring.
Every tool, every iteration brings lessons learnt, which are stored in a knowledge base.
What I am saying is that objects are not rows. And we sometimes try to force the object model onto a data schema that is ultimately richer than a chain of objects.
Which is not using the most appropriate tool for the job, per your example.
An ORM is _fine_ for stuff that has a fairly standard shape, like blog posts, user accounts, things like that. Lots of relational questions against persisted data end up not being those exact shapes and patterns, yet folks generally reach for the ORM anyway.
Consuming this article feels like empty calories. Satisfying, but there's nothing there. It doesn't even define, beyond a single example of "watching someone do a tutorial", what the author thinks constitutes "programmer junk food".
Is this about all youtube / stream content? Or about tutorials and tech's fascination with chasing new languages / paradigms over honing skills? Or specifically about a single niche of watching programmers do tutorials?
One that has made its way into LLMs because they've ingested a lot of actual books that were professionally typeset.
Humans typing online don't tend to use em-dash; LLMs often make the choice to do so.
That makes it a very strong signal.