> Some controllers are originally painted with a rubber-like cover that, unfortunately, degrades with time and becomes a sticky goo. I usually deal with it with the help of methanol; it removes it nicely.
I have some products like that and I despise them. Maybe I should try methanol.
I was going to comment on this too. I notice this happens with what feels like more traditional plastics - what exactly is going on with these? It feels like over time they are breaking down and liquefying, or releasing their oils?
Same thing happens with a somewhat expensive musical instrument brand that keeps using that plastic for their buttons.
As far as I can tell, it breaks down more slowly the more you use it; it must be interacting with oils or something else from human fingers. I've only had it happen to things that sit in storage for months or years at a time, while the gear with that sort of plastic that I use every day or week doesn't have that problem.
Yeah, I had an official silicone iPhone case that I used for about 8 months. I replaced it with a third-party leather one about a month ago, and already in that time the original one has gone all slimy, just like those old plastics. There must be something about day-to-day use that keeps it from breaking down.
With rubber products, it’s usually the plasticizers leaching out over the years. I learned this the painful way (massive migration of plasticizers from the underside of my mousepad onto other things), and now I actively avoid rubber products, usually in favour of silicone instead.
AIUI, they generally do all of that at the beginning.
Another approach, I suppose, could be to have it generate a second pass? Though that would probably ~double the inference cost.
> I think the microbes are still trying to figure this one out.
They mostly figured it out a couple billion years ago. Cyanobacteria oxidized Earth's surface until the atmosphere was flooded with molecular oxygen, which gets turned into ozone in the stratosphere, filtering out most UV. Pretty large engineering feat for a bunch of microbes.
You are correct; however, most of the harmful rays get filtered out in the upper atmosphere. Far-UV doesn't reach Earth, only UV-A and small amounts of UV-B (if the ozone layer is more or less intact, that is!).
The problem here is that there might be a bug fix or even a security fix that is not backported to old versions, and you suddenly have to update to a much newer version on short notice.
> It configured also non-existent drivers, and for some reason it enabled monkey test support (but not test support).
If it doesn't have the underlying base data, it tends to hallucinate. (It's getting a bit difficult to tell when it has the underlying data, because some models autonomously search the web.) The models are good at transforming data, however, so give it access to whatever data it needs.
Also let it work in a feedback loop: tell it to compile and fix the compile errors. You have to monitor it because it will sometimes just silence warnings and use invalid casts.
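For what it's worth, that feedback loop is straightforward to automate. Here's a minimal sketch in Python; `ask_model` is a hypothetical wrapper around whatever LLM API you use, and the build command and iteration cap are assumptions, not anything standard:

```python
# Sketch of the compile-and-fix feedback loop described above.
import subprocess

def ask_model(prompt: str) -> str:
    """Placeholder: call your LLM of choice and return the patched source."""
    raise NotImplementedError

def compile_and_fix(source_path: str, max_rounds: int = 5) -> bool:
    for _ in range(max_rounds):
        result = subprocess.run(
            ["cc", "-Wall", "-Werror", source_path],  # assumed build command
            capture_output=True, text=True,
        )
        if result.returncode == 0:
            return True  # clean build, stop iterating
        # Feed the compiler errors back to the model and ask for a fix.
        with open(source_path) as f:
            source = f.read()
        patched = ask_model(
            "Fix these compile errors without silencing warnings or using "
            f"invalid casts:\n{result.stderr}\n\nSource:\n{source}"
        )
        with open(source_path, "w") as f:
            f.write(patched)
    return False  # still failing after max_rounds; needs human review
```

The cap on rounds is the "monitor it" part: if it hasn't converged after a few passes, it's probably silencing warnings rather than fixing them, and a human should look.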
> What am I doing wrong? Or is this really the state of the art?
It may sound silly, but it's simply not good at 2D.
> It may sound silly, but it's simply not good at 2D.
It's not silly at all. It's not very good at layouts either: it can generally produce them, but there's a high chance of subtle errors, element overlaps, text overflows, etc.
Mostly because it's a language model, i.e. it doesn't generally see what it makes. Apparently you can send screenshots and it will use its embedded vision model, but I haven't tried that.
Anubis doesn't target crawlers which run JS (or those which use a headless browser, etc.) It's meant to block the low-effort crawlers that tend to make up large swaths of spam traffic. One can argue about the efficacy of this approach, but those higher-effort crawlers are out of scope for the project.
Wait, but then why bother with this PoW system at all? If they're just trying to block anyone without JS, that's way easier and doesn't require slowing things down for end users on old devices.
Reminds me of how Wikipedia literally makes all its data available, even in a nice format just for scrapers (I think), and even THEN some scrapers still scraped Wikipedia directly and cost it enough money that, I'm pretty sure, an official statement had to be made (or they disclosed it without one).
Even then, man, I feel like so many resources could be saved (both the scrapers' and Wikipedia's) if scrapers had the sense not to scrape Wikipedia and instead follow Wikipedia's rules.
If we're presupposing an adversary with infinite money then there's no solution. One may as well just take the site offline. The point is to spend effort in such a way that the adversary has to spend much more effort, hopefully so much it's impractical.
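That asymmetry is the core of hash-based proof of work. A minimal sketch in Python, assuming a SHA-256 leading-zero-bits puzzle (an illustration of the cost asymmetry, not necessarily Anubis's exact scheme):

```python
# Proof-of-work sketch: solving is expensive, verifying is one hash.
import hashlib
from itertools import count

def solve(challenge: str, difficulty: int) -> int:
    """Client side: try nonces until the hash has `difficulty` leading zero bits."""
    target = 1 << (256 - difficulty)
    for nonce in count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: a single hash to check the submitted nonce."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty))

# A legitimate visitor solves one challenge per session; a mass crawler has to
# solve one per request across millions of pages, which is where the cost bites.
print(verify("example-challenge", solve("example-challenge", 16), 16))  # True
```

The difficulty parameter is the knob: low enough that an old phone finishes in a second or two, high enough that hammering a site with millions of requests becomes expensive.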