
https://rybarix.com

New year, new website to keep it simple.


Use of a gun should be the last resort. Can you imagine shooting somebody like that?

This is the way American law enforcement acts all the time, usually around black people. I guess it's just shocking when it happens to a white person.



If I were the officer in front of a revving car, my instinct would be to move out of the way, not pull my gun and fire at the driver.



You are wrong on several points. It is policy not to shoot at a car like that even if it is driving at the officer (which this vehicle was not doing, as can clearly be seen from multiple camera angles). Shooting the driver will not stop the car; it will just kill the driver.

And in this case, shooting her made the situation much more dangerous. It caused her foot to stomp on the gas pedal, accelerating the car uncontrollably until it crashed at the side of the road.

An officer yes, the person you're replying to, no.

Minnesota is a duty to retreat state (in public).

Some animals are more equal than others.


IANAL, but I imagine that does not apply to a traffic stop.

To be clear, I was saying it applies to "an officer," but not to a civilian like you or me, since MN is a rare state with a duty to retreat.

Presumably a traffic stop is always performed by an officer.


The guy who was involved in an identical incident a couple of months ago? Standing in front of a car in a tense situation against all logic and procedure, again?

It's surely "last resort" just as it's "last resort" for me to pull out a gun as a "last resort" in a fight I caused.

Either he is severely mentally incapable or he was looking for a reason to murder someone. The good ol' shoulder bump on the dancefloor.

What's crazier is that a clear swerve to drive away is being sold as an attack and, worse, as justifying four point-blank shots to the face.


He didn't even limp away; far from last-resort territory.

I'm in the exact same situation as the author.

It's not productive, but hell, it is rewarding to dive deep into the lower levels.

No regrets so far.


What was your approach? Where does one start with such a project?


Early boot and early page table setup, logging single characters to a simulated serial console.

Then keep adding tests/features/tests/features/...
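
For illustration, here's a minimal sketch of that first "log a character to the serial console" step, assuming x86_64 under QEMU (which emulates a 16550 UART at COM1, port 0x3F8) with polled I/O. It's written in Rust for a no_std kernel and is not the author's actual code:

    // Illustrative sketch only, not the parent commenter's code: x86_64 under
    // QEMU, 16550 UART at COM1 (port 0x3F8), polled rather than interrupt-driven.
    use core::arch::asm;

    const COM1: u16 = 0x3F8;

    /// Write one byte to an x86 I/O port.
    unsafe fn outb(port: u16, val: u8) {
        unsafe { asm!("out dx, al", in("dx") port, in("al") val, options(nomem, nostack)) };
    }

    /// Read one byte from an x86 I/O port.
    unsafe fn inb(port: u16) -> u8 {
        let val: u8;
        unsafe { asm!("in al, dx", out("al") val, in("dx") port, options(nomem, nostack)) };
        val
    }

    /// Busy-wait until the transmit holding register is empty (line status
    /// register, bit 5), then send a single character.
    pub fn serial_putc(c: u8) {
        unsafe {
            while inb(COM1 + 5) & 0x20 == 0 {}
            outb(COM1, c);
        }
    }

Once something like this works, you can build string/printf-style logging on top of it and grow the test/feature loop from there.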


I missed the last train due to delays, and there was a group of us in the same situation. One nice person offered to let me sleep on their couch, and they were kind enough to give me a ride to the station the next day.

I was so angry at first when I found out that I had missed the last train, but it turned out to be a great story I can tell :)

Thank you, strangers. I'll pay it forward to somebody in the future.


I'm betting against wasm and going with containers instead.

I have a warm pool of lightweight containers that can be reused between runs, and that's the crucial detail that makes or breaks it. The good news is that you can lock them down with seccomp while still allowing normal execution. This gives you 10-30 ms starts with pre-compiled Python packages inside the container. A cold start is only as fast as spinning up a new container, around 200 ms. If you run this setup close to your data, you get fast access to your files, which is huge for data-related tasks.
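
For concreteness, here's a rough Rust sketch of the warm-pool idea, driving the Docker CLI. The image name ("sandbox:latest"), the seccomp profile path ("seccomp.json"), and the pool size are hypothetical placeholders, not the actual setup:

    // Rough sketch of a warm container pool, not the commenter's real infra.
    // Assumes the Docker CLI is installed; image name, seccomp profile path,
    // and pool size are made up for illustration.
    use std::collections::VecDeque;
    use std::process::Command;

    struct WarmPool {
        /// Names of idle, already-running containers ready for reuse.
        idle: VecDeque<String>,
    }

    impl WarmPool {
        /// Pre-create `n` seccomp-confined containers so each run pays only the
        /// "exec into a running container" cost instead of a cold `docker run`.
        fn new(n: usize) -> WarmPool {
            let mut idle = VecDeque::new();
            for i in 0..n {
                let name = format!("warm-{i}");
                Command::new("docker")
                    .args([
                        "run", "-d", "--name", name.as_str(),
                        "--security-opt", "seccomp=seccomp.json",
                        "sandbox:latest", "sleep", "infinity",
                    ])
                    .status()
                    .expect("failed to start warm container");
                idle.push_back(name);
            }
            WarmPool { idle }
        }

        /// Run a command inside a warm container, then return it to the pool.
        fn run(&mut self, cmd: &[&str]) {
            let name = self.idle.pop_front().expect("pool exhausted");
            Command::new("docker")
                .arg("exec")
                .arg(&name)
                .args(cmd)
                .status()
                .expect("docker exec failed");
            self.idle.push_back(name);
        }
    }

    fn main() {
        let mut pool = WarmPool::new(4);
        pool.run(&["python3", "-c", "print('hello from a warm container')"]);
    }

The point of the sketch is the design choice: exec-ing into an already-running, locked-down container is the 10-30 ms fast path, while `docker run` is the ~200 ms cold path.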

But this is not suitable for the type of deployment Cloudflare is doing. The question is whether you even want that global availability, because you trade it for performance. At the end of the day, they are trying to reuse their isolates infra, which is very smart and opens the door to other wasm-based deployments.


It's back on.

But wow, it must be stressful to deal with this.


The ongoing issue is the maintenance.

This can't be solved without fully trusting the LLM, period.

Just don't autopilot on important code you want to own. That's a good start.


Related submission to the mentioned No AI December: https://news.ycombinator.com/item?id=46098433


One of the guys running the challenge here.

No cloud-based AI is the hardcore version of it, for sure.

I hope local models will fill 80% of use cases so we are not tied to the big guys.

How long have you been using your setup?

