Hacker News | Fice's comments

> We need a way to embed project metadata into .git itself, so source code commits don't mess up with wikis and issues.

Fossil (https://fossil-scm.org) embeds issues, a wiki, etc. into the project repository.


Also Radicle, evidently


They can't distribute firmware blobs simply because the FSF and GNU do not, in principle, participate in the distribution of any non-free programs.

Also consider that if a manufacturer can distribute opaque firmware updates to your system, it practically has remote control over it, e.g. Intel can activate a backdoor in specific CPUs when needed by publishing a microcode update.


What is more risky to you: leaving known vulnerabilities such as Spectre unpatched, or the possibility of Intel adding a backdoor for some unknown purpose that wasn't present in the shipped hardware?


The former is more risky from the security point of view. The latter is more risky from the freedom point of view. (And, while an FSF supporter, I choose to be more secure.)


Vulnerabilities such as Spectre are only relevant if you run untrusted non-free software. These vulnerabilities also show that sandboxing is not effective on current CPUs, and specific mitigations do not solve the problem in general.


As a Russian, I'd rather call it Putin's war.


What would you call the other wars then?

- Against Georgia (2008)

- against Ichkeria, twice (1994, 1999)

- against Afghanistan (Dec 24, 1979 – Feb 15, 1989)

Just to name a few. ‘As a Russian, I’d rather avoid any responsibility for the actions of my government and play a victim card, telling everyone I didn’t vote for it, so it’s everyone else’s problem.’ Yeah, sounds very Russian to me. Very Russian indeed.


From my own experience, JPEG quality and compression efficiency can differ a lot depending on the encoder implementation. It would make more sense to compare specific encoders rather than formats in general.

In 2014 (WebP was released in 2010) Mozilla claimed that the standard JPEG format was not being used to its full potential [1] and introduced the mozjpeg project, which is still being updated [2]. I wonder how it compares today with current WebP implementations.

[1] https://research.mozilla.org/2014/03/05/introducing-the-mozj... [2] https://github.com/mozilla/mozjpeg


Of course, no one will make cars intentionally unsafe, but the law is also a risk factor: the risks of getting fined, jailed, or having a licence revoked count too, and these risks can be raised.


Advertising does not simply suggest something you might need; it often tries to manipulate you into needing something, and with the amount of personal data being collected and advancements in machine learning, this manipulation becomes dangerously effective.


Telling computers and humans apart is a wrong goal. Every request comes from a computer that is commanded by some human. And why shouldn't users be allowed to use automated user agents when they don't do it for spamming or anything malicious?

CAPTCHA is essentially a proof-of-work variant where challenges are designed to be solved by humans rather than computers, and like any PoW it works by consuming some limited resource (human time, processor time, energy).
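The PoW framing can be sketched with a hashcash-style challenge (a minimal, hypothetical illustration; no real CAPTCHA service works exactly this way):

```python
import hashlib
import itertools

def solve(challenge: bytes, difficulty: int) -> int:
    """Burn CPU time finding a nonce whose SHA-256 hash has `difficulty`
    leading zero hex digits -- the machine analogue of burning human
    attention on a CAPTCHA."""
    for nonce in itertools.count():
        digest = hashlib.sha256(challenge + str(nonce).encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce

def verify(challenge: bytes, nonce: int, difficulty: int) -> bool:
    """Checking a solution is cheap; producing one is not."""
    digest = hashlib.sha256(challenge + str(nonce).encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

Raising `difficulty` by one multiplies the expected solving cost by 16 while verification stays a single hash, which is the asymmetry any PoW scheme relies on.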


A lot of the time the purpose is rate limiting rather than disallowing bot access entirely. Telling them apart works on the premise that humans are a lot slower than bots.


In our SaaS we have usage limits and rate limits. We have never needed to implement "bot detection" for this reason.
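For what it's worth, a per-client rate limit like the one described is often just a token bucket; a minimal sketch (hypothetical, not this commenter's actual setup):

```python
import time

class TokenBucket:
    """Per-client rate limiter: refills `rate` tokens per second,
    bursts up to `capacity`. Each request spends one token."""

    def __init__(self, rate: float, capacity: float):
        self.rate, self.capacity = rate, capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

Keep one bucket per API key or client IP; `capacity` sets the allowed burst and `rate` the sustained request rate.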


How do you rate limit a botnet coming from tens of thousands of different IP addresses?


For anonymous/free users we have very strict usage limits, and the functionality is limited to operations that cost us less money. A very targeted attack would do damage, but that is true of basically any system, and we could flip on bot blocking in Cloudflare if needed and if that would help.


Cloudflare's bot blocking uses CAPTCHA... By your own admission, the only reason you don't have a CAPTCHA is that you haven't needed one yet.


Again, we have rate limits and usage limits in place. You know that you can pay to have CAPTCHAs automatically solved, right? It's not the solution to all problems. Obviously, if a targeted DDoS happens, then some changes would be required.

Also, it is no longer the case that Cloudflare uses CAPTCHAs for bot blocking. That's the legacy mode.


The fact that you can pay for both doesn't make them equivalent. To have a similar cost for spammers, you would need to request a challenge that takes many minutes to solve, which you just can't do. There is a strict limit on how long a user will wait for your security check and you can't pretend otherwise.

Let's stop pretending that all things are in the same bucket because "you can pay to have it solved". That's such a weird claim. For the right price you can have someone rob a bank for you, that doesn't mean it's as safe as your $2 padlock.


Way to completely miss the point

At this point you are just arguing for the sake of it. What is it you are even trying to debate at this point?


The point is way upthread, it's literally the top comment on this submission. I don't know where you got lost on the way.


We already do rate limiting. We don't need a captcha that can be automated away for that.


I always figured that CAPTCHAs worked because they limited on a resource that was harder to steal - human attention.

Rate limit by IP, and you get attacked by a botnet that "steals" IP addresses with malware.

Rate limit by PoW and you get people stealing AWS accounts, or using aforementioned botnet. See bitcoin mining.

Rate limit by CAPTCHA and you have to get a lot more clever (see things like setting up porn sites and proxying CAPTCHAs there)

So while you can pay to have CAPTCHAs solved, you actually DO have to pay and can't just steal your way in, so it means your target has to be more valuable.


> So while you can pay to have CAPTCHAs solved, you actually DO have to pay and can't just steal your way in, so it means your target has to be more valuable.

None of these things you listed above are available for free. They all require either effort to obtain or paying someone to do the work.


Someone did the math down thread: https://news.ycombinator.com/item?id=37056504

Unless you set your challenge to many minutes of work, you are not competitive with the human-centric solutions.


Can you steal AWS accounts with no effort?

And keep stealing them after you get blocked on the first ones?


The main goal is usually something like anti-spam or anti-scraping.

Some shops (for example, concert ticket sellers) have very limited supply and high demand, and don't want automation in buying.


I see you don’t understand why people make websites or systems. Or why people make bread.

I don’t make applications so that users benefit or to make them happy. I make applications so that I can earn money.

Earning money requires having a human on the other side. Just like you are not making bread only to throw it into a shredder.

If someone has a scheme where automation is beneficial, they will create an API for their system. You should use the API if I provide one. But when I create a UI, I create it for people to use.


> I don’t make applications so that users benefit or to make them happy. I make applications so that I can earn money.

This is why most commercial software is so bad.


And open source maintainers are burning out or writing rants about how no one wants to pay.

There is no "non-commercial software" that is better; even if commercial software is bad, it is still better than software that doesn't exist.


Why not both, make money and benefit people. I think that’s what earning money means. Otherwise you’re just making money at someone else’s cost.


You always have to make software in a way that people will benefit, because otherwise they will not pay.

Reread my downvoted post and think about that sentence in the context of the post where "Fice" wrote: "Telling computers and humans apart is a wrong goal."

Then add the topic of CAPTCHAs: a CAPTCHA is annoying for users, so adding one is not beneficial for them. It's a specific case, discussed in context.


There is also Simutrans-Extended (https://bridgewater-brunel.me.uk/) — a fork that adds even more detail and realism to the simulation and tries to achieve better game balance.


> Human-sized computing: a reasonable level of complexity for a computing system is that it can be entirely understood by a single person (from the low-level hardware details to the application-level quirks).

"Personal Mastery: If a system is to serve the creative spirit, it must be entirely comprehensible to a single individual."

Dan Ingalls. Design Principles Behind Smalltalk, 1981. http://worrydream.com/refs/Ingalls%20-%20Design%20Principles...


Software vendors could deliberately leak "pirated" copies of their programs, with backdoors and other malicious code added, to Russian torrent sites. That could be an effective countermeasure.


Are you aware that torrent tracker sites are heavily invested in uploader reputation? It doesn't matter if you create a new user to upload a torrent with a backdoor. Nobody is going to download it, because it's coming from an untrusted uploader.


How do you prevent them from spreading to other, non-Russian torrents?


You don't. That's a way to punish all users of pirated software.


That's still a crime.

You can't rob someone's house even if they are a burglar.


Can a vendor be held responsible for anything an unauthorized pirated copy of their software does?


Can a car manufacturer be held responsible for built-in features designed to maim or kill unauthorised users?


If he put the malware in, yes.

