fnands's comments

Yeah, I understand it's probably part of their fraud protection, but it feels weird that they get my GPU info when making a payment.

Seems very unrelated.

Can anyone who works on fraud protection explain how this info is used?


Not exactly on the backend, but I worked on the frontend (SDKs) at a previous employer whose product offering was literally fraud detection. Over those years, I realised the team's approach was "get whatever you can", then keep it and use it as needed. A few things I recall: heuristics, matches against datasets of known fraudulent actors, et cetera. This was around the time when "AI" as we know it was just picking up, and that company was actually calling these systems ML-backed. They pivoted to "AI" as soon as the term became more commonplace; at first it was just a name change, but I'm sure they changed the systems as well eventually, or at least I hope so.

I can tell you that any kind of "abnormal" combination of system metadata (basically sysinfo) was treated as suspicious, and of course the system was designed by that team. So, say you had a rooted Android (we had solutions for pretty much all devices out there): naughty boy. The system suspected you of spoofing GPS: instant reject. Disabling GPS: it was not a mandatory permission in the app (and we asked for it only for some clients), but the system didn't like it. You had changed the default resolution of the system: suspicious. We also captured typing/tapping speed (not only for text entry but also for interacting with the interface); too fast was considered weird, because you were not supposed to already know our interface (it was an interact-once-or-twice-in-a-lifetime kind of thing).

I am speaking mostly from memory of new-joiner intros and rare discussions with the team. The team was kinda "different", so other teams just wanted to avoid them and to be left alone by them in turn. So a lot of this might not sound exciting and might not be accurate either; these are not technical observations anyway.

Another aspect I just remembered: the app list (oh, we read that too). Say your installed apps matched known fraudulent-actor datasets, or you had app(s) that suggested you were not well off (we served a lot of instant-loan providers around the world), or you had an old phone, or a very old OS – all of these were taken into account, along with your PII (which was of course mandatory), when the backend received the data and we returned the final recommendation/score to the client's system in the API response.
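
To give a feel for it, here is a rough, hypothetical sketch of how that kind of signal-based scoring tends to work. This is not the actual system; all signal names, weights and thresholds are invented for illustration, and real systems fold in far more data (device fingerprints, fraud datasets, ML models, etc.):

    # Hypothetical sketch of heuristic risk scoring from device signals.
    # All names, weights and thresholds below are made up for illustration.

    RISK_WEIGHTS = {
        "rooted_device": 30,
        "gps_spoofing_suspected": 100,  # described above as an instant reject
        "gps_disabled": 10,
        "non_default_resolution": 5,
        "abnormally_fast_input": 15,
        "app_list_matches_fraud_dataset": 50,
        "very_old_os": 10,
    }

    def risk_score(signals: dict[str, bool]) -> int:
        """Sum the weights of all signals that fired."""
        return sum(w for name, w in RISK_WEIGHTS.items() if signals.get(name))

    def recommendation(signals: dict[str, bool], reject_threshold: int = 60) -> str:
        """Turn a bag of boolean signals into a reco for the client's API response."""
        if signals.get("gps_spoofing_suspected"):
            return "reject"  # hard rule, regardless of total score
        score = risk_score(signals)
        if score >= reject_threshold:
            return "reject"
        return "review" if score >= 30 else "accept"

    # Example: rooted phone plus suspiciously fast input -> "review"
    print(recommendation({"rooted_device": True, "abnormally_fast_input": True}))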


Thanks!

The app list one for loans is wild (but I can see it).


The problem is they have the ability to get it to begin with. The browser or OS should prevent this.

Very likely looking for VMs or other weird signals. Doesn't make it right for a regular user doing nothing wrong.

Can confirm. I was running through a forest in Berlin and saw a raccoon in a tree.

Was a little confused, but apparently there are quite a few around here.


Blast from the past.

I remember landing on this site when studying for my undergrad solid state physics exams.


It's an interesting result, but yeah, not a room temperature superconductor.

For that matter, we've had superconductors for decades that work at much higher temperatures than this one.

It seems the breakthrough is that you could use familiar semiconductor manufacturing processes. However, the temperature is still going to be a major issue. I don't want a computer that requires liquid helium cooling.

> I don't want a computer that requires liquid helium cooling.

True, but I /can/ see someone, such as Sandia National Labs, being very much willing to install a liquid-helium-cooled computer if it provides a significant performance increase over their existing supercomputer installations.


> you could use familiar semiconductor manufacturing processes.

Unclear to me why that's helpful. Materials that superconduct at a higher temperature than this one aren't hard to come by, or obscure:

> In 1913, lead was found to superconduct at 7 K,


Probably because they don’t behave well with normal lithography techniques? The high-temperature superconductors I know of are weird ceramic materials, and good luck getting them to exist in chip form at all.

Maybe. It could still go up, but it's a surprisingly conservative move for SoftBank.

But hey, maybe the market crashes tomorrow and this is seen as the best-timed sale ever; or maybe it doubles in the next year.

In any case, you never lose by taking profit.


I’m not calling the top, but I think it’s incredibly likely we’ll see NVDA with a lower market cap than this in the next 3 years.

SoftBank is likely in trouble and needs the cash or something.

SoftBank, according to its stock, is in the best shape ever.

https://finance.yahoo.com/quote/SFTBY/


And per the stock price, Tesla is worth more than any other car manufacturer has ever been, as well.

Until it wasn’t. Then it was again.

Either way, cash flow != stock price; and note that SoftBank is in a ‘business’ that likely needs a lot of cash.

Maybe they just needed a new yacht?

Or need to bail out one of their investments?


Interesting - is buying SoftBank a proxy for owning some OpenAI at this point?

I thought Nvidia was the proxy!

It's proxies all the way down

... to reverse proxy

That's incredible and completely unhinged. I love it.

Yeah, I think diminishing returns kick in at some point.

Going from 1080p to 1440p feels like a huge improvement. Going from 1440p to 4k (aka 2160p) is a little bit sharper. I don't think the jump from 4k to 8k will improve things that much.


I can tell the difference between 1080p (or upscaled 1080p) and 4k on a 50" screen at "living room" distances, but it's nowhere near as obvious as SD to DVD was.

At "laptop" screen distances the difference between my Retina display and non-retina external monitors is quite noticeable; so much so that I run 4k in 1080p mode more and more.

8k is going to require those curved monitors because you'll have to be that close to it to get the advantage.


> I can tell the difference between 1080p (or upscaled 1080p) and 4k

Are you talking about the resolution of the video or of the screen itself? Lower-resolution video also looks worse because of the compression. I saw a bigger difference from video compression than from screen resolution: e.g. good 1080p video looked better than bad 1080p video on any screen and at any resolution.

I have a 43" 4k monitor at ~1m distance, and when I use a computer set up with 100% scaling it looks bad: I see pixelation... and "subpixelation". On a different computer connected to the same screen but with something like 150% scaling, it's a night-and-day difference. Everything looks smooth, with perfect antialiasing.

This is the money picture [0]. Above a certain distance any improvement is imperceptible. But don't judge by compressed video on a screen; it adds quality issues that influence your perception.

[0] https://media.springernature.com/full/springer-static/image/...
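
A back-of-the-envelope version of what those charts show, assuming roughly 1 arcminute of visual acuity (the standard 20/20 figure); the 50" size and 16:9 aspect ratio are just example numbers:

    # Distance beyond which a single pixel subtends less than 1 arcminute,
    # i.e. where extra resolution stops being perceptible. Assumes ~1 arcmin
    # acuity and a 16:9 panel; screen size below is just an example.
    import math

    ARCMIN = math.radians(1 / 60)  # 1 arcminute in radians

    def pixel_pitch_mm(diagonal_inches, horizontal_pixels, aspect=(16, 9)):
        """Physical width of one pixel for a given screen size and resolution."""
        w, h = aspect
        width_in = diagonal_inches * w / math.hypot(w, h)
        return width_in * 25.4 / horizontal_pixels

    def max_useful_distance_m(diagonal_inches, horizontal_pixels):
        """Viewing distance at which one pixel shrinks to 1 arcminute."""
        pitch_m = pixel_pitch_mm(diagonal_inches, horizontal_pixels) / 1000
        return pitch_m / ARCMIN

    for name, px in [("1080p", 1920), ("4k", 3840), ("8k", 7680)]:
        print(f'50" {name}: resolvable up to ~{max_useful_distance_m(50, px):.1f} m')
    # 1080p ~2.0 m, 4k ~1.0 m, 8k ~0.5 m: at typical living-room distances,
    # 8k on a 50" screen buys you essentially nothing.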


It’s most noticeable with animation that has text-like lines, in my experience.

For plain video, 1080p at high enough bitrate is fine.


DVDs are SD, with 480 lines of resolution for NTSC.


Yeah, I meant VHS → DVD (which was noticeable), and also SD → HD (1080p), which was again noticeable.


Probably not for a 32” monitor, but I think 8k would be noticeably better for a 43”.


Nice!

It's a bit similar to Grammatisch, although that focuses just on grammar.


No RAID?


If you are only going to have one of the two, choose backups over RAID: preferably off-site, even better soft-offline.

Of course, both is best, if you don't consider the cost of doubling up your storage (assuming RAID 1/10) plus backup services to be a problem.


RAID isn't a backup, it only handles certain/specific failure scenarios.


Yes, it covers exactly the "then a drive fails spectacularly" case. Unless you were hit by some subtle silent data corruption across the RAID (but that's pretty rare compared to classic drive failure, with its buzzing and clicking sounds).


True, it does cover that specific case.

But it doesn't cover your RAID controller dying, your house burning down, burglary, tornado, tsunami, earthquake and other "acts of god", etc.

"A backup is a copy of the information that is not attached to the system where the original information is."

[0] https://www.reddit.com/r/storage/comments/hflzkm/raid_is_not...


> But it doesn't cover your RAID controller dying

One of the reasons some people ditch hardware RAID controllers and do everything in software. If you're at the point of pulling the drives from a dead enclosure and sticking them in something new, it's really nice to not have to worry about hardware differences.


I agree, RAID is not a backup (and nobody in this thread said it is). But if you self-host a lot of data, even as a hobby, it will make your life easier in case of disk failure.


I thought it was implied by "No RAID?" in response to data loss (wherein they mentioned that they had a backup :)

I'm personally very skeptical, as I have used RAID for 20+ years and I have lost data due to:

- crappy/faulty RAID controllers: who actually spends money on a good hardware controller when a cheap version is included in most motherboards built in the last 15+ years? In one case (a build for a friend), the onboard controller was writing corrupt data to BOTH drives in a RAID-0, so when we tried to recover, the data on both drives was corrupt.

- the Windows 8 beta, which nuked my 8-drive partition during install


It's actually in the name: R = Redundant, i.e. availability.


Just because it's in the name doesn't mean it should be treated as fact or best practice. I think this[0] reddit post frames it in the simplest way possible: "A backup is a copy of the information that is not attached to the system where the original information is."

There are many[1], many[2], many[3] articles about why "RAID is not a backup". If you google the phrase, many more people who are considerably more intelligent and wise than me can tell you why; it is a mantra that has saved me, friends, colleagues and strangers alike a lot of pain.

[0] https://www.reddit.com/r/storage/comments/hflzkm/raid_is_not...

[1] https://www.raidisnotabackup.com/

[2] https://serverfault.com/questions/2888/why-is-raid-not-a-bac...

[3] https://www.diskinternals.com/raid-recovery/raid-is-not-back...

edit: formatting


The I used to stand for "inexpensive" too, until RAID drives turned out to be anything but. It has since been made a backronym for "independent", although the drives really aren't independent either.


Try it yourself here: https://start.boldvoice.com/accent-oracle

It wrongly pegged me as Swedish.

Its second choice was the place I live, and third place was where I'm from, so not too bad overall. I have been told I have a very ambiguous accent, though.


I tried again and this time it got me. Second place is still Swedish. Looking at the UMAP visualisation, there is a South African cluster overlapping with a Swedish cluster, so it makes sense, I guess.

