RVuRnvbM2e's comments

Notion is horrendous for this. Hiding every control behind an invisible hover target. No, I don't want my company documentation to have a minimalist aesthetic. I just want to use it.


With Cloudflare inserting themselves as an "HTTP market maker", is this step one in the enshittification of the web?


First step? I think we're a few thousand steps along that way already, and rapidly approaching our final destination.


This is a good story and something everyone should experience in their career, even if just for the lesson in humility. That said:

> Here's the technical takeaway: Never use CASCADE deletes on critical foreign keys. Set them to NULL or use soft deletes instead. It's fine for UPDATE operations, but it's too dangerous for DELETE ones. The convenience of automatic cleanup isn't worth the existential risk of chain reactions.

What? The point of cascading foreign keys is referential integrity. If you just leave dangling references everywhere your data will either be horribly dirty or require inconsistent manual cleanup.

As I'm sure others have said: just use a test/staging environment. It isn't hard to set up even if you are in startup mode.


> The point of cascading foreign keys is referential integrity.

Not quite. Databases can enforce referential integrity through foreign keys, without cascading deletes being enabled.

“On delete restrict”, as opposed to “on delete cascade”, still enforces referential integrity, and is typically a better way to avoid the OP’s issue.
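
A minimal sketch of the difference, using SQLite from Python (the customers/orders schema is made up purely for illustration): the foreign key guarantees no dangling rows either way, but RESTRICT turns a parent delete into an error instead of a chain reaction.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("PRAGMA foreign_keys = ON")  # SQLite only enforces FKs when this is on

    con.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY)")
    con.execute("""
        CREATE TABLE orders (
            id INTEGER PRIMARY KEY,
            customer_id INTEGER NOT NULL
                REFERENCES customers(id) ON DELETE RESTRICT
        )""")
    con.execute("INSERT INTO customers (id) VALUES (1)")
    con.execute("INSERT INTO orders (id, customer_id) VALUES (10, 1)")

    try:
        con.execute("DELETE FROM customers WHERE id = 1")
    except sqlite3.IntegrityError as e:
        # With ON DELETE CASCADE this delete would silently take the order with it;
        # with ON DELETE SET NULL (and no NOT NULL) it would leave an orphaned order.
        print("delete blocked:", e)

Postgres and MySQL behave the same way here: RESTRICT (or the default NO ACTION) just makes you delete or reassign the children deliberately before the parent can go.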


Thanks for your takeaway. Yes the dev environment is definitely a must as soon as you start growing!


Reduces the vertical space by vastly increasing the horizontal space and inserting line noise into the syntax. I don't even understand what that code would do.


NVD != CVE


NIST owns the budget for both NVD and CVE, contracting MITRE to operate the CVE program.

NIST budget was cut 12% in FY 2024 (Oct 2023 - Sep 2024).

An earlier bill to supplement NIST funding has been reintroduced in 2025, https://fedscoop.com/public-private-partnerships-bill-nist-h...


There is nothing in that article mentioning funding reductions.

That article is about how the volume of software vulnerabilities is increasing, making it difficult for the CVE and NVD projects to keep up.

Please stop spamming this thread with political spin.


Both CVE (MITRE contract) and NVD are funded by NIST, https://www.securitymagazine.com/articles/100795-understandi...

> Since February 2024, the National Institute of Standards and Technology’s (NIST) National Vulnerability Database (NVD) has encountered delays in processing vulnerabilities.. caused by factors such as software proliferation, budget cuts and changes in support.. NIST, an agency within the United States Commerce Department, saw its budget cut by nearly 12% this year.


Reading that article closely, it says nothing about an NVD budget cut, only a NIST one. They were tracking the changes after NIST's budget was cut, not NVD's. As pointed out below, CISA announced a cut, and then NIST more than made up for it by reallocating funds, resulting in an NVD funding increase even though NIST's overall budget was cut.


One of your references has budget numbers that are two orders (?!) of magnitude higher than the CISA number. Hopefully someone can chime in with granular historical data for NIST NVD and MITRE-via-NIST CVE funding.


Wow I never knew this!


Yeah, if you just load normal epubs it defaults to an old version of Adobe Digital Editions unfortunately.


Yes, though I understand Kobo is working on correcting these issues with the epub format.


Are they? Where have you heard that?

Recently Calibre was updated to convert things to kepub when loading to Kobo devices - see https://www.omgubuntu.co.uk/2025/03/calibre-update-convert-k... - but I haven't heard anything about Kobo itself doing anything to improve this.


Were the walls you hit caused by Fastly's bot detection? I've found it to be quite accurate.

On the other hand CloudFlare and Akamai mistakenly block me all the damn time.


It's not like the block pages say which vendor it is, but it was at least three different implementations, and I don't think any were Cloudflare, because I've been running into Cloudflare's pages for years and those have captchas (functional or not). One of them was Akamai I think, indeed.


Yeah, I definitely don't want to pivot this thread into a product pitch, as the important thing is helping the open-source projects, but we can work with the maintainers to tune the systems to be as strict/lax as preferred. I'm sure the other services can too, to be fair.


The underlying issue is that many sites aren't going to get feedback from the real people they've blocked, so their operators won't actually know that tuning is required (also, the stricter the system, the higher the percentage of requests that will be marked as bots, which might lead an operator to want things to be even stricter...)


I will say -- a higher-end bot detection service should provide paper trails on the block actions they take (this may not be available for freemium tiers, depending on the vendor).

But to your point, the real kicker is the "many sites aren't going to get feedback from the real people they've blocked" since those tools inherently decided that the traffic was not human. You start getting into Westworld "doesn't look like anything to me" territory.


I'm not into Westworld so I can't speak to the latter paragraph, but as for "high-end" vendors' paper trail: how do log files help uncover false blocks? Any vendor will be able to look up these request IDs printed on the blocking page, but how does that help?

You don't know whether each entry in the log is a real customer until they buy products at some rate proportional to their page loads, or a real person until they submit useful content or whatever your site is about. Many people just read without contributing to the site itself, and that's okay too. A list of blocked systems won't help: I run a server myself, and I see legit-looking user agent strings doing hundreds of thousands of requests, crawling past every page in sequence; but if there weren't this inhuman request pattern and I just saw that user agent, IP address and other metadata in a list of blocked access attempts, I'd have no clue whether the ban was legit.

With these protection services, you can't know how much frustration is hiding in that paper trail, so I'm not blocking anyone from my sites; I'm making the system stand up to crawling. You have to do that anyway for search engines and traffic spikes like the ones from HN.
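
To illustrate that last bit, a toy sketch with nothing but Python's standard library (not my actual setup): cache the rendered pages and send Cache-Control headers, and repeat crawler hits or an HN traffic spike become cheap to serve.

    from functools import lru_cache
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    @lru_cache(maxsize=4096)
    def render_page(path: str) -> bytes:
        # Stand-in for the expensive part: database queries, templating, etc.
        return ("<html><body>content for %s</body></html>" % path).encode()

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = render_page(self.path)  # repeat hits on the same URL come from the cache
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("Cache-Control", "public, max-age=300")  # a proxy/CDN can reuse it too
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    ThreadingHTTPServer(("", 8080), Handler).serve_forever()

The point isn't this exact code; it's that a read-mostly site can usually absorb crawlers far more cheaply than it can reliably judge who's human.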


In which HN deduces the existence of Institutional Racism from first principles.


Stoked to see that ICANN reference implementations are now being written in Rust!

https://github.com/icann/icann-rdap

