Microsoft is busy rewriting core Windows library code in memory-safe Rust (theregister.com)
219 points by mikece on April 27, 2023 | 173 comments


It would be amazing if there were a Manhattan Project for eliminating memory-unsafe code from modern operating systems and browser engines. This would probably increase the cost of pwning VIPs and critical infrastructure by several orders of magnitude.


It wouldn't noticeably change the unit economics. Very few attacks rely on memory exploits.

Edit: This appears to be confusing to people, so here are some examples you'll recall from the past few years: SolarWinds, the Colonial Pipeline hack, the Okta breach, the Uber breach. Most attacks don't rely on memory exploits.


The majority of CVEs are due to memory exploits. For C/C++ code, that holds true across companies and OSs.

Microsoft says 70% https://www.zdnet.com/article/microsoft-70-percent-of-all-se...

Google says 70% https://www.chromium.org/Home/chromium-security/memory-safet...


A majority of CVEs are memory exploits. A majority of attacks don't use CVEs. It's a common misconception among people on HN who don't work in the field.


I work in the field and I'm not entirely sure about the cardinality of types of attacks. On one hand, there are password spraying, RDP brute-forcing, email attachments, social engineering, etc. On the other we have BlueKeep, ZeroLogon, and the tons of RCEs present in VPNs (looking at you, PulseSecure), routers, and firewalls.

I would say that breaches are often related to RCEs that ultimately derive from buffer exploitation. They are notoriously difficult to detect with forensics techniques, so they might not be discovered and tracked.


You're guessing, I think. Phishing of some sort is by far the most reliable and most used method. CVEs that get exploited rarely involve memory exploits, but they do happen and, for the most part, affect companies and people that refuse to update their stuff. There is just rarely a need to spend time developing memory exploits, because every consumer OS has some sort of memory-safety protection, at least DEP or ASLR, unless you get lucky and the software or shared libs have all that disabled, or reliable ROP gadgets are found.


I'm not making general claims about the use of memory exploitation - only questioning the statement that they are not widely used.

With more than 500 forensics cases with my name on them, and a substantial number of them being RCE-based, I'd say it is more than just guessing.

There is no need to spend time developing an exploit when you can find hundreds of new ones every month on GitHub. DEP and ASLR are also not used in embedded devices, where memory management in the firmware is atrocious.


Well, I didn't claim that memory exploits were not used. They're just rarely used when compromising end-user workstations these days. Ten years ago you had rampant exploit kits, for example; none these days. You still see memory exploitation of internet-facing stuff, or even of internal devices for lateral movement.

The comment you were replying to is talking about the majority of compromises. Citing your case stats to argue against that is a bit weird.


Your experience is valid. I'm absolutely not saying memory exploitation doesn't happen, only that it's so comparatively infrequent in the 2020s that magically eliminating it wouldn't change the economics of attacks.

As a point of comparison, 10-15 years ago exploits in general were much more prevalent. Flash was still around, people read PDFs in Acrobat instead of PDF.js, Internet Explorer hadn't been displaced by Chrome, macros were just starting to make a comeback after signing restrictions from the early 2000s were lifted, crown jewels hadn't yet moved to the cloud via SaaS, and things just weren't commoditized like they are now with pentest frameworks, LOLBins, etc. In fact the most commoditized element in those days was exploit kits targeting IE memory vulnerabilities. The landscape has changed a lot since then.

I'm vendor-side research, which gives me pretty broad visibility here.


ASLR and other hardening practices are also not used in old machines on your network that everyone forgot about.


I don’t work in the field, but do you know for a fact that companies like NSO don't use memory exploits for their attacks? "The majority of published attacks" is probably a better assertion.


NSO absolutely uses memory exploits. I think the person you’re responding to is saying that weaponized exploits of the form that NSO builds are a minority of overall attacks (which is both true, and also not a sufficient reason to discount the severity of memory corruption).


> A majority of attacks don't use CVEs.

Depends what your definition of "attacks" is, to be precise: is an event where an adversary places a malicious ad with code exploiting a browser 0day counted as one attack or as X attacks with X being the number of infected machines?

Additionally, the same kind of split applies to whether you only count large-scale hacks against organizations as attacks, or count infected machines of everyday common people as attacks as well. Basically, if you're counting attacks on organizations, you're correct: the majority entrypoint there is social engineering and outdated, exploitable software/appliances reachable from the public internet, or a compromised partner connected to the victim's network.


This is an excellent point. At the end of the day, rewriting is time and resource intensive. If there isn't a very good business case to back up the change, it's very difficult to justify the project.

This is why you see so many whitepapers trying to quantify things like consumer trust, reputational damage, regulatory impact, etc. If there is a true cost to the damage, the investment in prevention can be made and compared with other requests, like new features, scope, etc.


No need to be rude about splitting hairs.


Tell me again that dirtycow wasn't used in the field.


Oh, I've upset the logic cart a little more.


I don't know if this is pedantic, but OP said "attacks", not "vulnerabilities". I would not be surprised if statistics on vulnerabilities differ from statistics on realized attacks.


If there's a difference I'm open to someone citing a source quantifying it, but I won't quite be convinced by unsourced blanket generalizations that go against common wisdom


This page is informative: https://www.oaic.gov.au/privacy/notifiable-data-breaches/not...

> Just over half (54%) of cyber incidents involved malicious actors gaining access to accounts using compromised or stolen credentials.

My experience has been that most attacks are not that sophisticated and tend to target poor practices within organisations.


Aren't memory bugs in I/O layers the most common source of vulnerabilities?


Careful about pointing out dirtycow, because you'll be downvoted.


Ironically, Mozilla began sponsoring Rust as part of their Servo project, before they laid everyone off and made Servo a volunteer-run project.

I do feel they should have persevered. Although, yeah, writing a new rendering engine from scratch is a big job.


As far as I know, Servo wasn't ever really intended to be a production browser. But significant parts of Firefox have been rewritten in Rust since, and those parts have included bits from Servo.


Oh, thank you, I wasn’t aware.

They even have a decent graph linked from [1]: https://4e6.github.io/firefox-lang-stats/

[1] https://wiki.mozilla.org/Oxidation


How is that ironic?


Where were you 25 years ago?


In high school, using Netscape.


Then perhaps you understand the irony of Microsoft adopting—in the Windows kernel—a technology created by, of all organizations, Mozilla.


Microsoft is also a member of the Linux Foundation and has their own kernel. As someone who grew up in the 90s, it was quite a surreal thing to live to see.


I still have trouble believing there's a build of SQL Server for Linux (also available as a Docker image).


For me the “wtf” thing is that you can get a newer bash with Windows than macOS.

Also since Dec 2022, you can run GUI apps on WSL, so you can install Firefox, VS Code etc on Linux and have them show up on the Start Menu.

I can’t say why exactly but that was... weird.


Sadly, humans would remain the weakest link (in terms of PWNing defenses).


Remain? Humans /may become/ the weakest link after remote code execution exploits cease to be discovered, but certain hacks require no human in the loop.


I don't think so. Outside of labs, I can't remember needing memory safety exploitation or hearing about a major compromise that used such an exploit.

I am not saying the security improvements are not real, just not orders of magnitude. For example some post exploitation stuff like BYOVD I am sure would be much harder.

A lot of vulns are just mistakes people made. Perhaps LLMs can help with that in the near future, but right now you hardly even need a CVE to compromise even well-protected networks. A little phish here, a little abuse of legit software there, patiently look for services and credentials that aren't well protected, and voila. Few actors use 0-days, and auto-updates have made 1-days hard to rely on.


> I don't think so. Outside of labs, I can't remember needing memory safety exploitation or hearing about a major compromise that used such an exploit.

NSO group. Stuxnet.


I said "I" as in myself who is neither of those things.

I also said the following because of in part the point you are trying to make:

> I am not saying the security improvements are not real


You say that like you know for certain that this isn't part of such a thing!


If you know that there is such an initiative already in progress, speak up!


P.S. Subtle reference, love the product placement.


[flagged]


Good developers tend to embrace tools and automated constraints which prevent whole classes of bugs, and treat those “bumpers” as sensible defaults which can be opted out of for well-justified reasons. I’m not a rustacean, but that sounds like an apt description of Rust’s value proposition to such developers working in the problem domains it targets. And when good developers choose to work “closer to the metal”, whichever “metal” that might be, they tend to develop similar tooling and automation (“bumpers”) for themselves.

IME, the developers who eschew these kinds of abstractions usually have a higher estimation of their ability than the work demonstrates. And that’s to be expected, because software of any reasonable complexity is more than a human can reasonably fit into their brain to establish a mental model. If that wasn’t the case, we’d presumably see the most talented among us flipping bits directly.


What exactly leads you to believe these “better” developers aren’t already working for employers who benefit from these skills?

If they are already working elsewhere, what do you think happens when they go to Microsoft and leave their current employer?


"Get better" has never worked in the history of programming. Systemic problems require systemic solutions (although using value semantics in modern C++ has practically eliminated memory bugs for me)


Good Devs apply constraints to themselves.


Heh in the ~10min I spent typing up my response you condensed it to a six word sentence. Well done.


time has shown repeatedly that eventually any given programmer will make mistakes with memory management, no matter how diligent they are. it cannot be avoided; no developer has a flawless day every single day.

it's not a failing of the developer when it is known and well understood that humans make errors continually, and that this cannot be prevented.

computers exist entirely to help us with things we aren't good at, like memory management.


Where exactly are these better developers? Even DJB is known to produce faulty C code. And I struggle to come up with someone who surpasses him in C.


I am not qualified to say whether he surpasses him, but I am pretty sure that Fabrice Bellard is at least his equal.


Good example - and if we look at the ffmpeg security page[0], we can see dozens of CVEs going back to 2008, many related to manual memory management.

I know it wasn't you saying this, but "just be a better programmer" is a dumb response to improvements on the language side.

[0] https://ffmpeg.org/security.html


If Microsoft is getting value out of Rust, I hope they add Rust as a supported language in Visual Studio.


I don't know what it's like now, but back in the day devdiv (owners of Visual Studio, .Net, etc.) and Windows were highly siloed and it was hard to get them to align on a common way forward. Rust seems like the sort of thing devdiv would naturally pick up though.


They still are; that is how we got C++/CX replaced with C++/WinRT, and to this day there is zero tooling for it in Visual Studio, beyond the COM stuff that goes back to the early COM days.

Or like UWP was dropped for WinUI 3.0/WinAppSDK and there is no designer, or Native AOT doesn't do WinUI (.NET Native and C++/CX were under WinDev umbrella).


From an outsider’s perspective, it seems like MS (meaning the “new” MS under Nadella) is more aligned under VS Code than their traditional Visual Studio product.

I don’t doubt that Visual Studio will continue to be developed, but it would surprise me to see MS direct Rust resources specifically to it when they’ve already dedicated a lot of work towards Rust in VS Code.

(This is coming from someone who feels mild resentment at how much better the Rust editing experience is in VS Code than his preferred editors.)


VS is still very much a flagship product for Microsoft.


I didn’t say it wasn’t!

However, if I was MS, I would look at the prevailing winds: it’s good to have one flagship, and even better to have another that doesn’t have the historical (technical, sentimental) baggage of your older one.


> I didn’t say it wasn’t!

Sorry, I got the impression that you felt it was being slowly deprecated in favor of VS Code.

I don't think that's the case, is all.


> but it would surprise me to see MS direct Rust resources specifically to it when they’ve already dedicated a lot of work towards Rust in VS Code.

Embrace, or are we up to Extend?

But seriously, don't MS already have form on the .Net LSP? I can remember the furore but not the details, something about pulling back some VSCode functionality for Visual Studio.


Has Microsoft actually dedicated a lot of work to Rust in vsc? They haven't contributed significantly to the rust-analyzer project, which owns the vsc extension.

> The following companies contributed significantly towards rust-analyzer development: Ferrous Systems Mozilla Embark Studios freiheit.com

https://rust-analyzer.github.io/


I made a quick SDK for an API I built in C#, not too long ago. VSCode was surprisingly a much better experience.


Have you tried it on VS Code? Have found the Rust experience in VS Code to be pretty good.


Rust with rust-analyzer is IMHO very good nowadays, probably on par if not better than the experience with C++ on paid IDEs. The fact Rust grammar is context-free makes writing tools way easier because there's way less room for breakage, unlike when you write an extra > in C++ and parsing burns down in flames.
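
A tiny sketch of what that looks like in practice (example mine): in expression position Rust requires the turbofish for generic arguments, so `<` never has two possible readings the way `foo<bar>(baz)` does in C++:

    fn main() {
        let nums = ["1", "2", "3"];
        // Unambiguous: ::<Vec<i32>> is clearly a generic-argument list,
        // not a chain of comparisons.
        let parsed = nums
            .iter()
            .map(|s| s.parse::<i32>().unwrap())
            .collect::<Vec<i32>>();
        // A plain `<` in an expression is therefore always a comparison.
        println!("{:?} {}", parsed, parsed[0] < parsed[1]);
    }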


(Rust’s grammar is technically not context free but the part that’s context sensitive is rarely used and even more rarely at a depth greater than one, so you’re right in spirit but I feel the need to be a bit pedantic about it for fairness’ sake)


(I know, but if you take raw literals out of the equation it more or less is; thankfully you can implement raw literals as one big lexer hack and ignore them during parsing. There are also macros, which are a separate pickle, but then the user more or less expects they may cause some magic. Still, it's vastly better than `bob<flop>(blaz)`, which is not sane by any metric I can think of.)


Visual Studio is on its way out, and I say that as someone that uses it professionally every single workday. They've been monkey-patching in features now for years, and it remains slow and bloated.

VSCode is the future and Microsoft's future. The current limited .Net support on VSCode feels like it is intentional to give Visual Studio another few years.

PS - Obviously Visual Studio will never ACTUALLY go away. I am talking about its status as the flagship product for current/future development; it will remain for things like Windows Forms and other dead-end tech.


I think VS Code is great for "light coding", especially Web development. I find it not so great for large projects. For example, there is no comparison at all between Java in VS Code versus IntelliJ.


It very much depends on how you like to work. If you are very terminal-oriented, IDEs are actually detrimental to productivity. I'm basically forced to use Rider at work now and I just can't stand it; I hate how it tends to hide critical details of how stuff works.

Visual Studio is very much the same for me; it also tries too hard to dress up what basically boils down to "run shell tasks". I've seen people get accustomed to those fancy features way too often, only to then lose sight of how the underlying stuff works and end up relying on others for support.


VS code is a hodgepodge of plugins badly tied together which constantly spam you with change logs each time you open the app. Hopefully that’s not the future of IDEs and a Visual Studio replacement

Luckily Rider exists


I don't understand how Rider is superior to VSCode. I tried Rider a while ago and switched back to VSCode because it gave me the impression that I was acting as a beta tester. There were, or still are, silly bugs that forced me to reinstall it completely several times. You just can't have such bugs in your software product if its codebase is covered with tests and if there is a dedicated team of software testers. Additionally, it costs $149 for the first year. However, I should note that I'm not a power user, so perhaps Rider's built-in Resharper is actually a must-have feature for someone.


From a solution organization perspective, Visual Studio is still king in the .NET world. I’ve tried matching up my work patterns in VS Code and no matter how I twist it, VS Code always feels like a NodeJS web dev editor.

As for performance, I recently upgraded from an old Yoga 2 (VS became unusable) to the new Surface Laptop 5 and Visual Studio is smoking fast now.

I’ve heard great things about Rider but haven’t tried it.

I don’t see MS dropping Visual Studio any time soon. There are still a lot of us older coders that’d raise hell if they tried.

I also use the free Community version and it has everything I need.


Visual Studio is not going anywhere for Windows developers, or console game developers for that matter.

Even for .NET, what VS4Mac and VSCode can do is a tiny subset of VS capabilities for .NET development.

Naturally you can argue that anything MS, or game consoles are dead-end tech.


My experience with hobbyist C programming in VSCode (with C/C++ extension) is yet to match the experience of hobbyist C++ programming in VS. I'll try out CLion someday.

The biggest issues I noticed are:

- Lack of error indication, e.g. when using an undeclared variable

- Intellisense meh


There is an IntelliSense alternative for VS Code called clangd (made by LLVM developers) you might want to try; I use it heavily. You don't need to use clang as your compiler. It scans my CMake output to perform indexing.


Thanks! Gonna try it out. Does it conflict with the official C/C++ extension?


You need to disable the C/C++ extension's IntelliSense (there is a popup).


The C/C++ extension actually uses the same underlying engine as VS's C++ IntelliSense. If you're missing things like error indicators then it's likely you just need to configure it a bit more precisely- VS Code has a bunch of ways to do that, but they're all sort of DIY, while VS's is set up out of the box if you're using its project format or CMake support.


As another comment said, try clangd; it's IMHO superior to IntelliSense on VS Code and supports anything that can emit `compile_commands.json` files.


Thanks man. TBH I never dug deep into those json files. I need to take a look.


CLion is pretty good! If you like JetBrains IDEs, you'll like CLion.


Thanks, I do! I use Pycharm and Datagrip a lot in my current life, both are excellent.


One of the worst takes I've ever seen on this site


Or it could be that .NET is on its way out too.


Very unlikely. Microsoft’s business clients would scream bloody murder, and many of their own internal products use .NET extensively. Azure, Bing, and SharePoint are all built on .NET.

But for things like Azure, I’m hopeful to see more node and rust support built in. The interesting part is figuring out which direction Microsoft goes for desktop software. I’m seeing more Electron based applications coming out now.


I very much worry about how invested MS is in .net. After programming in .net for 10 years I'm switching to node. Much to my coworkers dismay.


Why? As an ASP.NET (Core) developer who has to use Node (NestJS) in my job, I don't think it is any better.


I don't think MS sees growth for .NET at this point.

1. Back in .NET 6, mere *days* before the final release, they yanked a hot reloading feature from the dotnet CLI (typically used by the massively more popular, but free, VS Code), making the feature exclusive to the flagship VS. To me this spoke of a leadership that didn't care about growing .NET with new developers, only about propping up their old creaky $50/mo VS with an established customer base locked into C# by legacy code. They rolled that mistake back, but it was a big warning sign / wake-up call for me.

2. Speaking of hot reloading, their much touted hot reloading feature SUCKS. It has nothing on the JS guys. Hot reload is a massive deal. How is Blazor ever going to compete with the JS frameworks with a horrible hot reload story. Not to mention Blazor is way heavier in terms of KB and also is way slower rendering performance to boot. Can you even use tailwindcss with Blazor since the compiler relies on your framework being able to hot reload the css changes?

3. .NET MAUI by all accounts is just a massive disaster at this point.

4. Their latest Teams 2.0 UI rewrite is in *drum roll* react.

I dunno it all just speaks of an underfunded / understaffed group at this point whose top brass doesn't care about them beyond the meager, dwindling, VS pro subscriptions they get from them.


> 4. Their latest Teams 2.0 UI rewrite is in drum roll react.

React is also used for the PowerApps UI, and even though you can write your own components with the PowerApps Component Framework, you can actually leverage platform libraries for performance if you do it in React: https://powerapps.microsoft.com/en-us/blog/virtual-code-comp...

Overall I don't like your comment but I won't downvote because you bring up concerns that matter to YOU :)


Lost interest in NET MAUI without Linux platform support.


Now if only they could make Search work inside directories like it did up until Windows XP. They somehow broke it in Vista and have never fixed it since.


It's really silly how much faster tiny 3rd-party programs like Everything search compared to Windows Search.


I personally gave up on GUI file searching long ago; it's one thing that sucks everywhere IMHO. I'll take CLI tools like locate or `fd` any day; they are quicker to use even on Windows.


Just use Everything (https://www.voidtools.com/), so much better than Windows Search.


The latest version supports searching file contents now, though it's slow because the contents aren't indexed.


My favorite FOSS go-to for searching contents in Windows is AstroGrep - https://astrogrep.sourceforge.net


It doesn’t seem to index either, though?

I normally use Notepad++’s Find in Files feature, which covers most cases, and also supports replacing.


How many years and counting, and MS still can't make search work? So many comments about how search is screwed up. For me, I can't stand that it sometimes doesn't find things I know are there. Or sometimes I don't really know whether something is there or not, because I can't trust the search. And sometimes it just won't search, or does so really slowly.


https://www.youtube.com/watch?v=8T6ClX-y2AE

This is the announcement video. It's about an hour long, but much of it is very interesting.

"Adminless windows" sounds very interesting.


Ugh, that whole talk announces a bunch of things that make it much harder for smaller devs to get started. They want a signing chain that will inevitably cost thousands of dollars just to get approved, with some sort of annual fee tacked on as well. So as an independent dev you can either go through that rigmarole or just not target Windows, or you need to walk your users through disabling the smart filter thing first...


adminless and passwordless windows: population 9001 (spyware telemetry entities)


> Beyond the presumed safety improvement, performance is said to be 5 to 15 percent faster for Shaping (substituting) glyphs with OTLS (OpenType Library Services). That's all available to developers now.

I'd be interested to know why a simple rewrite (from C++, no less) would give you this kind of upgrade.


I wonder if somehow someday Microsoft Windows can be rerooted as something like wine running in user space of a rust os like https://github.com/theseus-os/Theseus


The kernel space of Windows is not a problem, it has been fit-for-purpose and even good for the last 20-odd years. The problem is the deterioration in the UI and UX side of the OS (adware, spyware, hodgepodge of UI paradigms, spotty GUI settings apps, reset of preferences after any serious update, bad built-in tooling forced on the user etc).


It's the same shit at Meta. It's developer job security. Random group decides to rewrite in Rust but then breaks their shipped codebase for 3 weeks. They don't care. They have a new toy and don't care if they break things other people use.


Listen to them talking about it. This isn't the sound of a rogue group of developers focused on creating job security for themselves.

https://youtu.be/8T6ClX-y2AE?t=2607


Developing in Rust is so much easier than other languages I've used. It took a while to get the hang of it, but it's honestly a wonderful experience. You can blindly follow types, or home in on performance-critical areas without sketchy bugs.


[flagged]


Well, if you watched the video you would know. They are removing whole attack surfaces, which means fewer exploits, which means more security, which at Windows scale means millions of dollars saved in prevented security issues for companies across the world.


The way Microsoft wrote code for 40 years was an attack surface. The absurdity and fragility of DCOM and ActiveX without meaningful security or development rigor.


Ok? What's the point? Do you want them to invent a time machine and not do that? Is it a bad idea to fix the issues?


Closing the barn door after the cows left. Too little, too late.


Yesterday would be better, but there’ll never be a better day than today.


How is it too late when they are continuously producing new products?


No, I want them to write better stuff going forward, which this seems to be.


It's like them finally admitting that the emperor has no clothes.

Frankly, I don't care because Microsoft's ethos is one of hostility, neglect, and irresponsibility.


“Microsoft has a history of hostile neglect!”

[MS moves to responsibly maintain their codebase and patch previously overlooked vulnerabilities]

“Ugh why are they wasting all this tiiiime”


I actually kind of agree with you, but my god man. Nothing may change today except the ability to innovate forward without living in fear of breaking shit. This is huge.

We are on a site called Hacker News. What the hack is with the ignorant "business value" framing? The business view has always been bad at being a hacker, but don't rub it in! Your degradation is just more heaped abuse from the know-nots. Have some decorum, so we smart, knowing-shit hackers can bother to put up with your presence without being in pain.

Building a good technical foundation is almost never a clear business case. But I dare you to try not doing it.


> I actually kind of agree with you [...]

Then it escalated really quickly.


Microsoft spends a lot of money on software security, particularly on a large body of software written by them in unsafe programming languages; spending less money is good business.

Microsoft makes a great deal of money selling software to companies and governments that are increasingly convinced that memory safe software is in their best interests; maintaining those business ties is also worth a great deal of money.


…the roles in question at Microsoft are literal business value since that’s their jobs.

Chill out and read the article.


“…these kinds of bugs were at the heart of about 70 percent of the CVE-listed security vulnerabilities patched by the Windows maker in its own products since 2006.”

Associate a $ number to the average CVE then associate a reduction in total amount of CVEs.


On its own, absolutely none, which is true of everything that's not in service of a business solution. In general, how often are engineers held accountable for delivering direct business value anywhere rather than being subject to arbitrary favoritism performance ratings?


I can’t disagree more. I hesitate to hitch my cart to Rust, but rewriting all the old memory unsafe code is an inevitable need. This status quo of just patching things up every time an exploit appears is farcical when we have the tools to address these problems today.


I don't think it is so much about rewriting old code, rather new code is written in Rust.


I don't think it is so much about rewriting old or new code, rather code where security and stability are important is written in Rust.


This is how I feel about devs forcing rust onto web UI development (https://news.ycombinator.com/item?id=35722681). JS exists and people already make new toys (frameworks) for it every month. That's not enough and now we need rust on the frontend


I often advocate for the use of Rust, but it's not the right tool for every job. If one of these conditions is met, I'll happily use Rust:

  - Cannot have garbage collection
  - Need the raw speed of C
  - Need to provide a C interface
  - Already have a lot of Rust code in the project
Having said that, I very much prefer statically typed languages, and ones with a focus on correctness and safety. Rust and Haskell both fit that bill very well.
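
For the "provide a C interface" case above, the boundary looks roughly like this (a minimal sketch with made-up names, assuming the crate is built as a cdylib or staticlib):

    // Hypothetical example: a Rust function exported with C linkage.
    #[no_mangle]
    pub extern "C" fn add_saturating(a: u32, b: u32) -> u32 {
        // The body is ordinary safe Rust; only the ABI boundary is C-shaped.
        a.saturating_add(b)
    }

    // A C caller would declare it as:
    //   uint32_t add_saturating(uint32_t a, uint32_t b);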


I care if it's written to be better and cheaper to maintain. I don't care about a new Javascript, Rust, or (latest religion) framework, this or that, unless the approach is better than what came before and it's part of sound governance. Code should not be religious, it should be functional and focused on its purpose.


Thankfully most of that seems self-contained to Reddit/HN in pet projects. My god what a disaster for the web that would be, shoving everything into WASM despite zero performance reason to.


Nobody's forcing anything. This is hacker culture, and it's what drives progress. Get over it.


Hacker culture is great. Hacker culture is less great at work when I have to deal with over-engineered over-abstracted code that's hard to debug and solves simple problems in seemingly simple ways until inevitably you have to dive in and dig deep to figure out wtf is actually happening and good luck (this is how i imagine debugging rust UIs on the web will be)

I'm just ranting. I know in reality not every abstraction is correct and often they introduce more problems than they actually solve, but every once in a while the hackers get it right and that's when progress is made. In the meanwhile us normies have to deal with the experiments of complex bs to solve simple problems


Rust is actually pretty kind about most abstractions. Sure you can write creepy rust, but it requires a lot more effort to write creepy rust than it does in say C. More chances that a code reviewer will say "ugh this is too fancy", whereas in C some pretty crazy stuff can seem so innocent.


If you aren't smart enough to understand it, that doesn't mean it is "over-engineered, over-abstracted".


I think that "let people rewrite stuff in the new cool language/framework" is a deliberate strategy to retain midrange talent.


Because when I think of what would give me the most job security, it's to reimplement something that we already have and works perfectly fine on its own.


> Microsoft is busy rewriting core Windows library code in memory-safe Rust

They should rather be busy fixing the bugs in Windows 10.


People still use GDI+? That brings me back.


I actually used raw GDI+ for the GUI of an app recently. My personal requirement was that the executable had to be completely stand-alone and only a few KB in size. GDI+ helped fit the bill for creating a simple GUI and eliminating dependencies on large frameworks, so that it could easily be copied around and guaranteed to work everywhere.

Wouldn't use raw GDI+ for any UI of moderate complexity, but it was perfect for the ghetto app I made myself.


Even in C#, the default image handling libraries still use GDI+, even in .NET 7. There was some sort of attempt to make it work on Linux (via some sort of shim or compatibility layer, I assume), but MS abandoned those efforts after .NET Core 3.1.


Windows Remote desktop (display redirection) used a GDI xpddm driver up till Windows 19h1 (and Windows Server 2022). They switched to IddCx which is pretty cool to play with.


Yes, it is much easier to use than Direct2D.


I can now get personalized ads in my operating system in a memory safe way! Great


It's not fair to reduce the role of Windows to an advertising tool. It's also spyware.


It's interesting that we got to the point where I would gladly pay Microsoft a yearly fee for not introducing any (of what they call) innovations. I just need stability and security updates. Instead, in W11 I got all windows grouped with no way to ungroup them, and the default browser's start page set to their MSN page, which actually serves fraud links as news. Could they go lower than that?


LTSC is basically that, but Microsoft won't sell it to you if you buy single licenses.

There are legal resellers, that sell them for about 200-300 USD per license.

There are shady resellers, that sell them very cheaply, but I always suspect they are pirated.


> sell them very cheaply, but I always suspect they are pirated.

You can't buy something for $100 and sell it for $20. Well, you can, but this is not a sound business model.

It's always amusing for me when people assume they can buy a 'legal' copy/key for $5-10 'somewhere on the net'.


Could be reselling from bankrupt companies though?


No, because these licenses are named, i.e. the contract is valid only between Microsoft and the $company. Any other entity is not a subject of the licensing contract.

The only way to resell something from a bankrupt company is to have bought a retail boxed version of the software in the first place, and LTSC is not available at retail.


It's a bit more complex than that. Windows licenses come in various variants with one common trait: they have very specific conditions regarding license transfer. Basically, Microsoft tries to define a very limited set of conditions that let you transfer the license. The problem is, not all of these are enforceable in all jurisdictions. One example:

> If you acquired the software from a retailer as stand-alone software, you may transfer the software to another device that belongs to you, but not more than one time every 90 days (except due to hardware failure, in which case you may transfer sooner).

It is highly probable that, if anybody cared, you could win such a case in a European court but I'm not aware if anyone bothered to try.


I suppose in the US the question is whether it is indeed considered a license or a sale, i.e. whether or not the first sale doctrine applies.

I thought this had recently been ruled in favour of re-sale, but the newest thing I could find was Vernor v. Autodesk, which goes the other way...

https://dunnerlaw.com/buyer-beware-the-threat-to-the-first-s...


[flagged]


Even Microsoft doesn't want the type of ads that memory safety vulnerabilities let in.


They need to re-do the entire Windows UI from the ground up. It's a disgrace today.


Uh-oh! I offended some Windows apologist!


Or alternatively, your comment is being downvoted for being random and off topic.


I downvote anyone complaining about downvotes.


Yeah, wouldn’t want anyone with a backbone on here.


lol


Rust is the best option for writing memory safe code.

However, it is far from a good option.

Rust is overly complex and this is going to bring its own issues to Windows.


What kind of issues?


Complexity is an issue.


Accidental complexity is an issue. If your problem is complex you can only have a chance in the battle by using some good abstraction — this is true in both math and CS.


Indeed. Perhaps some examples of how Rust is? Would be interested why you think that.


Because of the minimal standard library, the ecosystem is fractured into crates (similar to npm).

Also, Rust forces you to actually model the complexity of the real world. I think the best example of this is the various string types: there is one for interop with C, one for paths, one that is UTF-8, etc. Your code explicitly states what to do with edge cases. Edge cases that existed anyway but are now visible.
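
As a rough illustration (a made-up snippet, not from any real codebase), the distinction looks like this:

    use std::ffi::CString;
    use std::path::PathBuf;

    fn main() {
        // String: guaranteed valid UTF-8, for ordinary text.
        let text = String::from("héllo");

        // PathBuf: a filesystem path, which on some platforms is not
        // required to be valid UTF-8 at all.
        let path = PathBuf::from("C:\\Windows\\System32");

        // CString: NUL-terminated with no interior NULs, for C interop.
        let c_text = CString::new("hello").expect("no interior NUL");

        println!("{} {:?} {:?}", text, path, c_text);
    }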

Don't get me wrong, I don't dislike Rust. It's an awesome language.


Compared to C/C++, I’d say the Rust standard library is quite full-featured. Regarding crates, they are super easy to pull in compared with random C/C++ libraries, and the crates are generally fairly high quality.


I would say that having methods like ‘and_then_if_finally()’ or ‘then_unwrap_or_none_and_map()’ (I’m barely exaggerating) is less of a feature than, say, JSON parsing and hashing/crypto, which are sorely missing.

Tokio provides a bit more features, but is a damn huge runtime to depend on, and is creating fragmentation.

I feel Rust needs to be taken out of the hands of former C++ devs / language theorists and put straight under the responsibility of library design experts (rip off Go’s stdlib, for god’s sake!).


JSON is handled well enough by serde, which is extremely popular. Rust also has various string manipulation utilities and an insanely fast, high-quality regexp implementation.
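
For what it's worth, "handled well enough by serde" looks roughly like this in practice (a sketch assuming serde with the derive feature plus serde_json in Cargo.toml):

    use serde::{Deserialize, Serialize};

    #[derive(Serialize, Deserialize, Debug)]
    struct Report {
        id: String,
        severity: u8,
    }

    fn main() -> Result<(), serde_json::Error> {
        // Deserialize JSON straight into a typed struct...
        let r: Report = serde_json::from_str(r#"{"id":"example","severity":7}"#)?;
        // ...and serialize it back out.
        println!("{}", serde_json::to_string(&r)?);
        Ok(())
    }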

Crypto isn’t a core competency for the standard library authors, and thus it’s better to offload that responsibility. Maybe once a library becomes de facto standard in the community (ring?) it might make more sense.

The other reason to prefer a smaller standard library is that it can often be difficult to strip away the unused bits from the resultant binary. Rust statically links which hopefully fixes most of this, but I suspect possibly not all.

I agree they need to standardize an effect system and unify async so that there’s a way to plug in arbitrary runtimes.


> Crypto isn’t a core competency for the standard library authors and this it’s better to offload that responsibility.

Ring is mostly C/assembly. Goodbye safety! Offloading core security routines is such a bad design.

> The other reason to prefer a smaller standard library is that it can often be difficult to strip away the unused bits from the resultant binary.

It’s actually already handled quite well, and could also be behind feature flags.

Really, a standard library with feature flags and editions would make rust ridiculously much more productive… such a waste.


> Ring is mostly C/Assembly

Crypto needs to be written in Assembly to ensure that operations take a constant time, regardless of input. Writing it in a high level language like C or Rust opens you up to the compiler "optimising" routines and making them no longer constant time.
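
To make the concern concrete, here's a sketch (mine, not from ring) of the kind of routine that matters; note that even writing it this way doesn't guarantee constant time, since the compiler may still transform it, which is exactly why vetted assembly gets used:

    // Short-circuiting comparison: returns as soon as bytes differ,
    // so the run time leaks where the first mismatch is.
    fn leaky_eq(a: &[u8], b: &[u8]) -> bool {
        a == b
    }

    // Constant-time-style comparison: folds every byte into an
    // accumulator so the work done doesn't depend on the data.
    fn ct_eq(a: &[u8], b: &[u8]) -> bool {
        if a.len() != b.len() {
            return false;
        }
        let mut diff: u8 = 0;
        for (x, y) in a.iter().zip(b.iter()) {
            diff |= x ^ y;
        }
        diff == 0
    }

    fn main() {
        assert!(!leaky_eq(b"secret", b"guess!"));
        assert!(ct_eq(b"secret", b"secret"));
    }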

But you already knew this. And you also knew that the security audit (https://github.com/rustls/rustls/blob/master/audit/TLS-01-re...) of ring was favourable

> No issues were found with regards to the cryptographic engineering of rustls or its underlying ring library. A recommendation is provided in TLS-01-001 to optionally supplement the already solid cryptographic library with another cryptographic provider (EverCrypt) with an added benefit of formally verified cryptographic primitives. Overall, it is very clear that the developers of rustls have an extensive knowledge on how to correctly implement the TLS stack whilst avoiding the common pitfalls that surround the TLS ecosystem. This knowledge has translated reliably into an implementation of exceptional quality.

As a bonus, rustls is under active development with sponsorship from Google, AWS and fly.io, with the aim of making a high quality, memory-safe TLS stack - https://www.memorysafety.org/initiative/rustls/

You said

> a standard library with feature flags and editions would make rust ridiculously much more productive

What's the difference between opting into a library with a feature flag and opting in with a line in Cargo.toml? Let's say you want to use the de-facto regex library. Would it really be ridiculously productive if you said you wanted the "regex" feature flag instead of the "regex" crate?
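
For concreteness, the "line in Cargo.toml" version involves about this much ceremony (example mine):

    // Cargo.toml:  regex = "1"
    // (A hypothetical std feature flag would replace only that one line.)
    use regex::Regex;

    fn main() {
        let re = Regex::new(r"^\d{4}-\d{2}-\d{2}$").unwrap();
        assert!(re.is_match("2023-04-27"));
    }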

I do agree that the standard library does need a versioning story so they can remove long deprecated functions. Where it gets complicated is if a new method is reintroduced using the same name in a later edition.


> What's the difference between opting into a library with a feature flag and opting in with a line in Cargo.toml? Let's say you want to use the de-facto regex library. Would it really be ridiculously productive if you said you wanted the "regex" feature flag instead of the "regex" crate?

If you are all alone developing and can audit the crate properly, sure.

However, when there’s a team, whose favorite lib should we pick? How many of them shall we audit? What’s their license? How’s code attribution managed? Etc., etc.


Rust has a small standard library and that’s ok (https://blog.nindalf.com/posts/rust-stdlib/)

Your concerns are valid, but Rust has chosen a different set of trade offs. The link outlines the pros and cons.


For crypto, isn't assembly used to help prevent timing attacks? I don't know if that can be done in pure Rust.


At least let’s get rid of C and put this in the stdlib, shall we?


Crypto MUST NOT become part of the Rust standard library, because Rust's compatibility guarantees would have to be broken when any underlying cryptographic primitive or protocol is broken (or worryingly weakened) and needs to be removed. While that could be done with the "unsoundness fix" escape hatch, it's better to minimize such changes.

I'd like to see an "extension" library managed like the standard library (and shipped with Rust, included in the prelude, etc) but with SemVer, so that breaking changes could be made more easily.


Go and Rust are not the same thing. Go is fine for what it is, if you agree with its opinions. But a lot of people are reaching for Rust for different reasons than someone would reach for Go. It makes sense that their std libs are different... especially knowing the differences in the languages themselves.

You aren't forced to use the more obscure methods in std. You can almost always recreate them with simple code and suffer no real performance or readability hits.

Former c++ devs are smart people. I work with some, and value their input specifically when designing libraries...


If I were a Rust user, I'd be really afraid of what will happen to Rust when Microsoft gets hold of it in any way. Just look at what happened to C++ after it got popular because of Windows/Visual C++ and the strong ties between Microsoft and WG21 in the 90s/00s.

To quote Steve Jobs: "They just have no taste."


This is FUD.

Microsoft doesn't "get a hold of" Rust by using Rust, any more than you get a hold of it by writing Rust programs.

With C++ they had their own compiler implementation, which obviously diverged. This is common for independent code bases. Whereas with Rust they're contributing positively to the main compiler implementation. Just one example - see all of Ryan Levick's (@rylev) contributions to Rust. Not just with code, or with tutorials, but in regular unglamorous work like running weekly compiler performance meetings.

Microsoft is also a platinum sponsor to the Rust foundation. The foundation uses this money to directly sponsor folks working on the language and library authors.



