
Just email the author five dollars and then grab their book from AA


I like this idea. I do it when the author has a Patreon. How do you email random people five dollars?


Check if there's a PayPal account tied to that email. Often there is. PayPal is a shit company, but no worse than Patreon.


AA?


Anna's Archive


His approval rating says otherwise


Have you considered they didn’t teach meme coins at his university?


He has access to the same resources that everyone here does, probably more, actually. Instead of retweeting the first "interesting" thing that pops up on his feed, hear me out on this, maybe he could have quickly Googled/DuckDuckGo'd/Kagi'd/Yandex'd it before he retweeted it.

No, they don't teach "meme coins" at university, but I don't think that's really relevant.


It’s not unreasonable to backpedal even harder


Sure, it's better that he backpedaled than doubling down, but that doesn't mean that he cannot be criticized for the initial transgression.

If I drive drunk and then hit someone with my car, I'm not immediately forgiven just because I called 911 and offered to pay the hospital bill.


A 12 V power supply … an SDR that jams ELRS: they write like they don't even know what ELRS is or how it works. An SDR that could jam that wide a frequency range all at once would be very, very expensive.

Also, you can just buy a purpose-made 300 W jammer on AliExpress.


A FNIRSI three-in-one oscilloscope is 50 bucks on AliExpress.


So use your CPU


A lot of us have Ryzen/Nvidia combos... hopefully support for those comes soon, though.


OpenVINO runs fine on AMD, last I checked.


Maybe it does, but the system requirements page makes it look like it supports everything BUT AMD.

https://docs.openvino.ai/2024/about-openvino/release-notes-o...


It supports AMD CPUs because, if I understand correctly, AMD licenses x86 from Intel, so AMD chips have the same bits needed to run OpenVINO that Intel's CPUs do.

Go look at CPU benchmarks on Phoronix; AMD Ryzen CPUs regularly trounce Intel CPUs in OpenVINO inference.
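You can sanity-check this yourself: the CPU plugin doesn't care who made the processor. A minimal sketch, assuming a recent openvino Python package, with the model path and input as placeholders:

    import numpy as np
    import openvino as ov  # pip install openvino

    core = ov.Core()
    print(core.available_devices)  # lists "CPU" on a Ryzen box just like on an Intel one

    # "model.xml" stands in for whatever OpenVINO IR you exported; the "CPU"
    # device just needs an x86-64 processor, not specifically an Intel one.
    model = core.read_model("model.xml")
    compiled = core.compile_model(model, device_name="CPU")

    # assumes the model has a single input with a static shape
    dummy = np.zeros(compiled.input(0).shape, dtype=np.float32)
    result = compiled(dummy)[compiled.output(0)]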


Or use the underlying open-source models directly; this is just several existing open models packaged by an Intel-specific deployment framework and wrapped as Audacity plugins.


This is a great suggestion and all, but don’t you need a frontend/pipeline to run data through these models?


There are existing frontends for these models that aren't tied to Intel hardware. It may be somewhat less convenient than having them packaged as Audacity plugins, but they certainly exist for people who want to use the models without being limited to Intel hardware.
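For example, the transcription piece is (if I remember right) Whisper, and the plain open-source openai-whisper package runs it on a CPU or any PyTorch-supported GPU in a few lines; this is just one possible route, and the file path here is a placeholder:

    import whisper  # pip install openai-whisper

    model = whisper.load_model("base")       # weights download on first use
    result = model.transcribe("speech.wav")  # "speech.wav" is a placeholder path
    print(result["text"])

Similar standalone frontends exist for the other models too, e.g. the demucs CLI for music separation.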


Obviously


This is the fault of the regulators. There's no reason new discoveries couldn't be put in a queue for training a new AI; once there are enough to make a run worthwhile, you do the run, then give doctors both the old model and the new one so they can run both and compare the results.


And what was it about before?

