What if there were a law in place, like a FOISA for AI, whereby I can request the actual code/data that caused the AI to come to its conclusion?

So if an AI-generated bill for a service says I owe $N, I should be allowed to see all the code and logic that arrived at that decision.
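To make the idea concrete, here is a minimal sketch of the kind of per-decision audit record such a request might return. Everything here is hypothetical: the DecisionAuditRecord class, its fields, and the example bill are illustrations I'm assuming, not a schema proposed anywhere in this thread.

    # Hypothetical audit record for one automated decision (a "FOIA for AI" response).
    # All names and fields are assumptions made for illustration only.
    import json
    import hashlib
    from dataclasses import dataclass, field, asdict
    from datetime import datetime, timezone

    @dataclass
    class DecisionAuditRecord:
        decision_id: str    # e.g. the bill or case number being disputed
        model_version: str  # exact model/weights that produced the decision
        code_commit: str    # revision of the decision pipeline code
        inputs: dict        # the data the model actually saw
        output: dict        # the resulting decision (the $N owed, etc.)
        timestamp: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat()
        )

        def fingerprint(self) -> str:
            # Hash the record so a disclosed copy can be checked against the original.
            payload = json.dumps(asdict(self), sort_keys=True).encode()
            return hashlib.sha256(payload).hexdigest()

    # Example: the record a customer could request for a disputed AI-generated bill.
    record = DecisionAuditRecord(
        decision_id="BILL-2024-0042",
        model_version="billing-model-1.3.0",
        code_commit="9f2c1ab",
        inputs={"usage_kwh": 1240, "tariff": "residential"},
        output={"amount_owed_usd": 312.50},
    )
    print(json.dumps(asdict(record), indent=2))
    print("fingerprint:", record.fingerprint())

The point is only that "show me the code and logic" is answerable if the provider logs the model version, code revision, and inputs at decision time; without that record there is nothing to disclose.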



That’s not the same as giving someone the model and letting them build tools powered by it, or develop alternative models (which is what they’re trying to stifle). It’s less about transparency and more about putting the tools in as many hands as possible.


It’s funny, this is the second time in a few days I’ve come across your /u/ and comments. THIS WAS COMMENTED IN RED OR GREEN

Anyway, yeah, I think models need a way to self-register whoever uses them (a rough sketch of what I mean is below). Yes, creepy AF, but also needed AF. Disagree?
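Rough sketch of the self-registration idea, assuming nothing beyond what the comment says: a thin wrapper that records who invoked the model before returning a prediction. The RegisteringModel wrapper, ToyModel, and the logging "registry" are all made up for illustration; no real registry or API is implied.

    # Hypothetical wrapper: the model "self-registers" each caller on every use.
    import logging
    from datetime import datetime, timezone

    logging.basicConfig(level=logging.INFO, format="%(message)s")
    usage_log = logging.getLogger("model_usage_registry")

    class RegisteringModel:
        def __init__(self, model, registry=usage_log):
            self._model = model        # any object with a .predict() method
            self._registry = registry  # where usage records get written

        def predict(self, inputs, *, user_id: str):
            # Record who used the model and when, before answering.
            self._registry.info(
                "user=%s invoked model at %s",
                user_id,
                datetime.now(timezone.utc).isoformat(),
            )
            return self._model.predict(inputs)

    # Toy stand-in model so the sketch runs end to end.
    class ToyModel:
        def predict(self, inputs):
            return sum(inputs)

    wrapped = RegisteringModel(ToyModel())
    print(wrapped.predict([1, 2, 3], user_id="alice"))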


No, I think these models should be treated for what they are - a bunch of numbers. It’s like trying to treat encryption as special. It’s just code.

There may be some specific applications that require regulation or control. That’s ok. But the underlying fundamental technologies should be open and free.


Just a question: did you get my RED/GREEN comment?



