Hacker News | tiberius_p's comments

An HDL simulator written in Common Lisp.


I'm not up to date with the latest developments in C++, but wouldn't it be straightforward to do something like "#pragma pointer_safety strong", which would force the compiler to only accept the use of smart pointers, or something along those lines? Has anything like this been proposed so far?


You might be interested in this talk[0] by a WebKit engineer on how they're implementing similar approaches using libTooling and their own smart pointer types.

For example, their tooling prevents code like this:

    if (m_weakMember) { m_weakMember->doThing(); }
from compiling, forcing you to explicitly create an owning local reference, like so:

    if (RefPtr strongLocal = m_weakMember.get()) { strongLocal->doThing(); }
unless it's a trivial inlined function, like a simple getter.

[0] https://www.youtube.com/watch?v=RLw13wLM5Ko


I was going to link to this.

My interpretation of Geoff's presentation is that some version of profiles might work, at least in the sense of making it possible to write C++ code that is substantially safer than what we have today.


Geoff's stuff is mostly about heuristics. For his purpose that makes sense. If Apple are spending say $1Bn on security problems and Geoff spends $1M cutting such problems by 90% that's money well spent. The current direction of profiles is big on heuristics. Easy quick wins to maybe get C++ under the "Radioactively unsafe" radar even if it can't pass for safe.

The most hopeful thing I saw in Geoff's talk was cultural. It sounds like Geoff's team wanted to get to safer code. Things went faster than expected, people landed "me too" unsolicited patches, that sort of thing. Of course this is self-reported, but assuming Geoff wasn't showing us a very flattering portrait of a grim reality, which I can't see any incentive for, this team sounds like while they'd get value from Rust they're delivering many of the same security benefits in C++ anyway.

Bureaucrats don't like culture because it's hard to measure. "Make sure you hire programmers with a good culture" is hard to chart. You're probably going to end up running some awful quiz your team hates "Answer D, A, B, B, E, B to get 100%". Whereas "Use Rust not C++" is measurable, team A has 93% of code in Rust, but team B scored 94.5% so that's more Rust, they win.


> Geoff's stuff is mostly about heuristics.

That's not true at all.

- The bounds safety part of it prevents those C operations that Fil-C or something like it would dynamically check. You have to use hardened APIs instead.

- The cast safety part of it prevents C casts except when they're obviously safe.

- The lifetime safety part of it forces you to use WebKit's smart pointers except when you have an overlooking root.

Those are type safety rules. It's disingenuous to call them heuristics.

It is true, however, that Geoff's rules don't go to 100% because:

- There are nasty corners of C that aren't covered by any of those rules.

- WebKit still has <10% ish code that isn't opted into those rules.

- WebKit has JITs.


I can't rationalize how "prevents... except" isn't still just heuristics.

r/cpp is full of people with such heuristics, ways that they personally have fewer safety bugs in their software. That's how C++ got its "core guidelines", and it is clearly the foundation of Herb's profiles. You can't get to safety this way, you can get closer than you were in a typical C++ codebase and for Geoff that was important.


> I can't rationalize how "prevents... except" isn't still just heuristics.

“Prevent something unless obviously safe” is a core pattern of rules in type systems. For example variable assignment in Java. If it’s possibly unsafe (RHS is not a subtype of LHS) then it’s prevented.

Are you saying Java and all of classic type theory is just heuristics?


I don't think that really accomplishes anything. If you interpret it broadly enough to meaningfully improve safety you have to ban so much stuff that no codebase will ever turn it on. It's a pretty straightforward locally-verifiable property as well, so people who really want it don't need a pragma to enforce it.


The problem with this would probably be that you usually have to use some libraries with C APIs and regular pointers.

You could compile your program with AddressSanitizer; then it at least crashes in a defined way at runtime when memory corruption would happen. TCC (the Tiny C Compiler, initially written by Fabrice Bellard) also has such a feature, I think.

This of course makes it significantly slower.


> "#pragma pointer_safety strong" which would force the compiler to only accept the use of smart pointers

You’d possibly just be trading one problem for another though - ask anyone who’s had to debug a shared ownership issue.


That's what the "Profiles" feature is. The problem is that any nontrivial real world program in a non-GC language needs non-owning reference types to perform well, and you can't express the rules for safe use of non-owning references without augmenting the language. People have tried. You need something more sophisticated than using smart pointers for everything. In the limit, smart pointers for everything is just called "Python".

What infuriates me about the C++ safety situation is that C++ is by and large a better, more expressive language than Rust is, particularly with respect to compile-time type-level metaprogramming. And I am being walked, hands handcuffed behind my back, alongside everyone else, into the Rust world with its comparatively anemic proc macro shit because the C++ committee can't be bothered to care about memory safety.

Because of the C++ standards committee's misfeasance, I'm going to have to live in a world where I don't get to use some of my favorite programming techniques.


> You need something more sophisticated than using smart pointers for everything. In the limit, smart pointers for everything is just called "Python".

I don't see how that follows at all. What makes Python Python (and slow!) is dynamic dispatch everywhere, down to the most primitive things. Refcounted smart pointers are a very minor thing in the big picture, which is why we've seen Python implementations without them (Jython, IronPython). Performance-wise, yes, refcounting certainly isn't cheap, but if you just do that and keep everything else C++-like, the overall performance profile of such a language is still much closer to C++ than to something like Python.

You can also have refcounting + something like `ref` types in modern C# (which are essentially restricted-lifetime zero-overhead pointers with inferred or very simplistic lifetimes):

https://learn.microsoft.com/en-us/dotnet/csharp/language-ref...

It doesn't cover all the cases that a full-fledged borrow checker with explicit lifetime annotations can, but it does cover quite a few; perhaps enough to adopt the position that refcounting is "good enough" for the rest.


Python doesn't have lvalues in the way that C++ and Rust do. You can't refcount everything and still pass lvalues to subobjects. If lvalues to subobjects are important, you need borrow checking.


Yes, which is why I mentioned C# "ref" which is not refcounting but zero-overhead pointers with a simplistic borrow checker that covers most practical scenarios.


> Jython, IronPython

Both of which have modern, concurrent, parallel, and generational garbage collectors.


When you wrote "smart pointers" I naturally assumed refcounting specifically. If you include traced GC references in this category, then sure. Although even then languages like Java and Go are much closer in their performance to C++ than to Python, so even with this definition it's an odd comparison.


> In the limit, smart pointers for everything is just called "Python".

To be more precise, it's old Python. Recent versions of Python use a GC.

> And I am being walked hands handcuffed behind my back, alongside everyone else, into the Rust world with its comparatively anemic proc macro shit because the C++ committee can't be bothered to care about memory safety.

Out of curiosity (as someone working on static analysis), what properties would you like your compiler to check?


I've been thinking for a while now about using dependent typing to enforce good numerics in numeric kernels. Wouldn't it be nice if we could propagate value bounds and make catastrophic cancellation a type error?

Have you worked much with SAL and MIDL from Microsoft? Using SAL (an aesthetically hideous but conceptually beautiful macro-based gradual typing system for C and C++), you can overlay guarantees about not only reference safety but also sign-comparison restrictions, maximum buffer sizes, and so on.


Please do this.

But first: we need to take step zero and introduce a type "r64": an "f64" that is never NaN/inf.

Rust has its uint-thats-not-zero - why not the same for floating point numbers??


You can write your "r64" type today. You would need a perma-unstable compiler-only feature to give your type a huge niche where the missing bit patterns would go, but otherwise there's no problem that I can see. If you don't care about the niche, it's just another crate; there is something similar called noisy_float.


I can do it, and I do similar such things in C++ - but the biggest benefit of "safe defaults" is the standardization of such behaviors, and the resultant expectations/ecosystem.


> Rust has its uint-thats-not-zero

Why do we need to single out a specific value? It would be way better if we could also use uint-without-5-and-42. What I would wish for is type attributes that really belong to the type.

    typedef unsigned int __attribute__ ((constraint (X != 5 && X != 42))) my_type;


Proper union types would get you there. If you have them, then each specific integer constant is basically its own type, and e.g. uint8 is just (0|1|2|...|255). So long as your type algebra has an operator that excludes one of the variants from the union to produce a new one, it's trivial to exclude whatever, and it's still easy for the compiler to reason about such types and to provide syntactic sugar for them like 0..255 etc.


Those are the unstable attributes that your sibling is talking about.


Yeah, of course I can put what I want in my toy compiler. My statement was about standard C. I think that's what Contracts really are, and I hope this will be included in C.


Oh sure, I wouldn’t call rustc a “toy compiler” but yeah, they’d be cool in C as well.


Dependent types in well-behaved, well-defined snippets of C++ dedicated to numeric kernels?

While I think it's a great idea, this also sounds like it would require fairly major rewrites (and possibly specialized libraries?), which suggests that it would be hard to get much buy-in.


To be even more precise:

> Reference counting is the primary mechanism of garbage collection, however, it doesn’t work when the references have cycles between them and for handling such cases it employs the GC.


I remember the anti-nuclear fever went viral in 2011 after the Fukushima nuclear accident caused by the Tōhoku earthquake and tsunami. I think the correct lesson to be learned from that experience is not to build nuclear power plants in places where they can be damaged by natural disasters...and not to call for all nuclear power plants around the world to be shut down.


Or if you build them there, build them so they can withstand that disaster.

There was another similar plant even closer to the epicenter, and it was hit with a (slightly) higher tsunami crest. It survived basically undamaged and even served as shelter for tsunami refugees. Because they had built the tsunami-wall to spec. And didn't partially dismantle it to make access easier like what was done in Fukushima.

Oh, and for example all the German plants would also have survived essentially unscathed had they been placed in the exact same spot, for a bunch of different reasons.


> Because they had built the tsunami-wall to spec.

If you're referring to the Onagawa plant, one engineer (Yanosuke Hirai) pushed for the height of the wall to be increased beyond the original spec:

> A nuclear plant in a neighboring area, meanwhile, had been built to withstand the tsunamis. A solitary civil engineer employed by the Tohoku Electric Power Company knew the story of the massive Jogan tsunami of the year 869, because it had flooded the Shinto shrine in his hometown. In the 1960s, the engineer, Yanosuke Hirai, had insisted that the Onagawa Nuclear Power Station be built farther back from the sea and at higher elevation than initially proposed—ultimately nearly fifty feet above sea level. He argued for a seawall to surpass the original plan of thirty-nine feet. He did not live to see what happened in 2011, when forty-foot waves destroyed much of the fishing town of Onagawa, seventy-five miles north of Fukushima. The nuclear power station—the closest one in Japan to the earthquake’s epicenter—was left intact. Displaced residents even took refuge in the power plant’s gym.

https://www.economist.com/open-future/2019/12/06/were-design...

https://en.wikipedia.org/wiki/Onagawa_Nuclear_Power_Plant#20...

https://en.wikipedia.org/wiki/Yanosuke_Hirai


Yes. And in Fukushima, they apparently actually lowered an existing natural barrier.

https://oilprice.com/Latest-Energy-News/World-News/Tepco-Rem...

In addition, they didn't have hydrogen recombinators, which for example are/were standard in all German plants. Those plants also had special requirements for bunkers for the Diesel backup generators so they couldn't be knocked out by water.


The point is not about "someone may not err" but about "someone may err", or more precisely "someone WILL err", coupled with the effects of such mistakes.

Failing to correctly design, build, operate or maintain a wind turbine or solar panel isn't a big deal. Failing to do so on a nuclear reactor can become a huge and lasting disaster for many.


Wind turbines cause more deaths than nuclear reactors.

Fact.


It depends on which estimation of nuclear accident victims one chooses to consider.

https://en.wikipedia.org/wiki/Chernobyl:_Consequences_of_the...

Moreover, pretending that the worst nuclear accident is not more dangerous than the worst wind turbine accident will be difficult.


You are making the very common "mistake" of comparing 1 nuclear accident with 1 wind turbine accident.

And are completely missing that you need a LOT more wind turbines, and these have a lot more accidents.

For example, wind turbine accidents killed 14 people just in one year, 2011. How many people were killed in the UK in nuclear accidents that year? That decade.

Ladder accidents kill ~80 people per year in Germany.

Google "availability bias".


The various estimations of "victims of nuclear" also neglect victims of such accidents. In 2011, two workers died while working to build the new EPR in Flamanville, and they aren't officially (nor, AFAIK, anywhere else) counted as nuclear victims.


> Or if you build them there, build them so they can withstand that disaster.

You can't build to withstand human ignorance. You can always argue to do this or that, but if the responsible managers won't approve it, it's all just theory and good hopes. Even worse if the ignorance grows over time: because it worked out for the last decades, surely it will work another decade or two...

That's why things like nuclear are so problematic: small acts of negligence can explode into cataclysmic events.


I read that Germans watched their local nuclear experts explain on TV what was happening while Japanese authorities were still in denial.

They had a stereotype of Japanese hypercompetence and seeing them fuck up and then try to cover it up in the middle of a disaster had an impact even on traditional nuclear supporters.


> in places where they can be damaged by natural disasters.

And places where they can be damaged by human actions as well.

That leaves so many places to build reactors, right?


I think human actions are easier to predict and prevent than natural disasters. Earthquakes are the biggest deal breakers.


Current Ukraine would beg to differ.


Can it be made to work on Android from Termux or Userland?


This I'm unable to answer because I don't have the means to try, but I'd love to know if it's doable. Edge TTS is surely light enough to run since all the processing happens in the cloud. The basic setup is just Python and ffmpeg. Let me know if you get it running!


I work in hardware design and verification. I've seen many AI-based EDA tools proposed at conferences, but in the team I'm working in now I haven't seen AI being adopted at all. Among the proposed tools that caught my attention: generating SystemVerilog assertions from natural-language prompts, generating code fixes from lint errors, generating requirements, vplans and verification metrics from specifications written in natural language, and using LLMs inside IDEs as coding agents and chatbots to query the code.

I think the hardware industry will be harder for AI to penetrate because hardware companies are more secretive about their HDL code and go to great lengths to avoid leaks. That's why most of them run an in-house IT infrastructure and avoid the cloud as much as possible, especially when it comes to storing HDL code, running HDL simulations, formal verification tools and synthesis. Even if they were to employ locally hosted AI solutions, that would require big investments in expensive GPUs and expensive electricity bills: the industry giants will afford it while the little players won't.

The ultimate goal is to tape out bug-free chips, and AI can be a great source of bugs if not properly supervised. So humans are still the main cogs in the machine here. LLMs and coding agents can make our jobs a whole lot easier and more pleasant by taking care of the boring tasks and leaving us with the higher-level decisions, but they won't replace us any time soon.


How can you count on someone who can't count?


Have you not seen Sam Altman on a well polished stage? Did he not look confident? That's your answer. Stop asking questions and learn to trust ChatGPT 5 because Sam Altman says it is now PhD level and he is scared. It's not like he says that every single time his company releases something that's no more than an iterative improvement.

ChatGPT 2.5 scared Sam Altman so much a few years ago. But he got over it, now he calls it a toddler level intelligence and is scared about this current thing.

Get onboard the AI train.


Will we have to share our ID when we connect to Tor too?


I expect Tor will just be completely blocked in UK/EU/Australia.


That’ll come as part of a VPN ban…


Librewolf works fine for me. Comes with uBlock Origin installed.


I still suck at this game, even with all the help.


This is a great way to gather labeled training data for a neural network that can guess in which year a photo was taken.


A photo archive with actual correct date labels is even better (the thing which this game is built on.)

