impulsivepuppet's comments | Hacker News

While I circumstantially agree, I hold it to be self-evident that the "optimal amount of grift is nonzero". I leave it to politicians to decide whether increased oversight, decentralization, or "solution X" is the right call to make.


A little grift is expected. The real problem for us is when it's grift all the way down, and all the way up, to the extent even the President is grifting. Leaving it to the politicians in that case just means enabling maximum, economy-scale grift.


I can't help but think that this is an "I have nothing to hide" argument. Keeping accounts perfectly segregated is quite Sisyphean, so there's always a chance that personal information can be traced back and pieced together, which in turn has "boring-old security" implications: now someone possibly knows your habits and the times when you are at work.


my "personal" information there is as personal as my profile here


Related, and likely an inspiration for this line of thought: https://www.mattmahoney.net/dc/


Specifically 1.4 "Compression is an Artificial Intelligence Problem" https://www.mattmahoney.net/dc/dce.html#Section_14


On the topic of working hours, flexitime is highly addictive and I cannot imagine anything better for a software developer. Clock in, have meetings, write code, commit, clock out. Overtime? Just leave early without asking your boss. It just makes sense. Plus, the negotiated working hours per week, working days, and mandatory hours can be set to whatever values make sense.

Nobody is paying you to sit, people care about the working product.


I find it quite intriguing to introduce "language-native" matrices and 2d blocks (which I still find difficult to wrap my head around).

The reason why most people would more intuitively consider a music score as multidimensional has to do with parallelism or concurrency.

In theory, nothing is stopping you from creating a hyperarray language a la BQN++ (or dare I say QRP). Maybe I glossed over an example, but having proper pointwise application to hyperscalars feels like a must-have.

A second idea is to introduce process parallelism, which could actually turn this form of syntax into an execution graph of sorts; that could be quite promising!


Looking at software development today, it is as if the pioneers failed to pass the torch on to the next generation of developers.

While I see strict safety/reliability/maintainability concerns as a net positive for the ecosystem, I also find that we are dragged down by deprecated concepts at every step of the way.

There's an ever-growing disconnect. On one side we have the ways hardware offers to achieve top performance, be it specialized instruction sets or a completely different type of chip, such as TPUs and the like. On the other side live the denizens of the peak of software architecture, to whom all of it sounds like wizard talk.

Time and time again, what is lauded as convention over configuration ironically becomes the very maintenance nightmare it tries to solve, as these conventions come with configurations for systems that do not actually exist. All the while, these conventions breed an incompetent generation of people who are not capable of understanding the underlying contracts and constraints within systems, myself included. It became clear that, for example, there isn't much sense in learning a SQL engine's specifics when your job forces you to use Hibernate, which puts a lot of intellectual strain into following OOP, a movement characterized by deliberately departing from performance in favor of being more intuitive, at least in theory.

As limited as my years of experience are, I can't help but feel complacent with the status quo unless I take deliberate action to continuously deepen my knowledge and work on my social skills, so as to gain whatever agency and proficiency I can get my hands on.


People forget how hostile and small the old Internet felt at times.

Developers of the past weren't afraid to tell a noob (remember that term?) to go read a few books before joining the adults at the table.

Nowadays it seems like devs have swung the other way and are much friendlier to newbs (remember when that distinction itself marked a shift?).


Org mode offers so much more than just syntax. You can use org files as a calendar, a todo/issue tracker with time accounting, a diary/knowledge base (zettelkasten, org-roam), a literate programming tool (think Jupyter code notebooks, but for practically any programming language, via org-babel), or a publishing tool (static site generator, LaTeX/PDF export), all at the same time.

To be quite frank, Org mode is a lifestyle which existed long before Notion or Obsidian did. Saying that it has a barrier to entry is a bit of an understatement.

Having said all that, quite ironically, I've migrated over to Obsidian because I started using IntelliJ more for work, meaning that I don't need Emacs for its other capabilities all that much.


Since I don't often write raw SQL, I can only assume the author named their CTE `deleted_tasks` to elucidate that the query might delete multiple items. Otherwise, it makes little sense, for they intended to "pop" a single row, and yet their aptly named `deleted_tasks` ended up removing more than one!

The query reads to me like a conceptual mish-mash. Without understanding what the innermost `SELECT` was meant to accomplish, I'd naturally interpret the `WHERE id IN (...)` as operating on a set. But the most sacrilegious aspect is the inclusion of `FOR UPDATE SKIP LOCKED`. It assumes a very specific execution order that the query syntax doesn't actually enforce.
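For context, here is a minimal sketch of the shape of query I understand is being discussed, assuming PostgreSQL and a `tasks` table; the column names and the ordering are purely illustrative, not the author's actual schema:

```sql
-- Hypothetical reconstruction of the "pop one task" query under discussion.
-- Intent: claim and delete exactly one unlocked row, returning it to the caller.
WITH deleted_tasks AS (
    DELETE FROM tasks
    WHERE id IN (
        SELECT id
        FROM tasks
        ORDER BY created_at       -- illustrative ordering
        LIMIT 1
        FOR UPDATE SKIP LOCKED    -- skip rows already locked by other workers
    )
    RETURNING *
)
SELECT * FROM deleted_tasks;
```

Written out like this, the `LIMIT 1` only constrains the inner subselect, while the outer `DELETE ... WHERE id IN (...)` remains set-oriented, which is exactly the mismatch I mean.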

Am I right to think that not avoiding lock contention, i.e. omitting `SKIP LOCKED` would have actually produced the intended result?


DELETE with an overly broad operator in the WHERE clause and no explicit limit: check. Non-trivial subquery in the WHERE: check. This should not have passed code review, let alone made it far enough to be caught in production.

I will give OP the benefit of the doubt and say that automated testing did not catch this because the optimisations depend on table statistics, and not because it was not appropriately covered.
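For what it's worth, a rough sketch of how I would expect the single-row intent to be expressed more defensively (again assuming PostgreSQL 12+ and an illustrative `tasks` schema; this is one possible variant, not a claim about what the author should have shipped):

```sql
-- One defensive variant: materialize the candidate id so the subselect is
-- evaluated once, and target it with equality instead of IN.
WITH candidate AS MATERIALIZED (
    SELECT id
    FROM tasks
    ORDER BY created_at
    LIMIT 1
    FOR UPDATE SKIP LOCKED
)
DELETE FROM tasks
WHERE id = (SELECT id FROM candidate)
RETURNING *;
```

If `candidate` comes back empty because everything is locked, the scalar subquery yields NULL and nothing is deleted, which is the behaviour I would want a reviewer to be able to confirm at a glance.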


I admire your deontological zealotry. That said, I think there is an implied virtuous aspect of "internet vigilantism" that feels ignored (i.e. disabling a malicious bot means it does not visit other sites). While I do not absolve anyone from taking full responsibility for their actions, I have a suspicion that terrorists do a bit more than just avert a greater wrong; otherwise, please sign me up!


I find it hard to believe that vigilance can be opted out of, unless—sarcastically speaking—you leave the company before the lack of vigilance becomes a problem.

I somewhat agree with your take, but that "free lunch" is paid for by a disciplined use of lifetimes, somewhat contradicting the claim of "removing vigilance and discipline by using Rust's type system/borrow checker". In my worldview, type systems are in fact a compiler-enforced discipline, but I see how productivity can be boosted once problem areas become more visible / less implicit. Problems don't really disappear; they only become easier to scan through.

