>We are escaping the woes of AI and radicalization, I guess..
You sure? I started a project not 24 hours ago and quickly noticed suggestion prompts being advertised across multiple LLM chat interfaces, with things like "Surprise me" and "Play a quiz"..
..but but... I've never had the opportunity or a supportive environment to focus on learning languages, so I focused on other skillsets instead.
Other efforts to coordinate the time, finances, and a team to accomplish the projects I have in mind also failed miserably..
Am I (for example) so wrong to believe that I could accomplish some of my dreams with the help of LLMs, as another attempt at being an accomplished human being?
>The more powerful the tool, the more responsible its wielder should behave.
I will argue that this is a false premise, in part because, as you say, it's impossible to enforce, but also because it simply does not happen in reality.
Anyone with the will to reach an objective will utilize any tools at their disposal; only an observer from another perspective will judge whether this is 'good' or not. To the beholder, it has become the only way to achieve their goals.
There's the anecdote about the lady who used a WW2-era hand grenade to crush spices in her kitchen for decades without incident. Her goals were met and nothing bad happened, even though the general consensus is that this is a bad idea for many reasons.
Maybe it's not only responsibility, but the capability for one to understand the situation one is in and what is at their disposal. ..and a hint of 'don't be evil' that leads to good outcomes despite what everyone thinks.
> Maybe it's not only responsibility, but the capability for one to understand the situation one is in and what is at their disposal. ..and a hint of 'don't be evil' that leads to good outcomes despite what everyone thinks.
This understanding, and hint of broader/benevolent perspective, is what I meant by responsibility.
I'm not so naive as to expect it in general but I have known it to exist, that there are people who respect the responsibility implicit in proper use of their tools. The world is a labyrinth of prisoners' dilemmas so I get that there's a reasonable argument for being "irresponsible" whatever that means in the context.
For the greater good, would you go as far as saying that responsibility comes from the top, from the people who lead and those in the public spotlight? Or would this ideology need to be instilled through educational systems? Or do we have to hope and pray that each and every human born goes through the same process of learning and understanding to reach this level of responsibility?
I was thinking along the same lines: create artificial fireflies to try to lure the others back into the neighborhood..
But then again, fireflies use their lights to attract mates, and I'm thinking now that their bioluminescent communication might be more complex than just a randomly blinking diode.
I can picture it now... "You're into WHAT?!? Nope, I'm out of here!"
I (unfortunately) updated SwiftKey to the latest version in an attempt to troubleshoot an issue I had, and lo and behold, a new Copilot button has appeared!
(Ironically, I was trying to remove an @ key that had popped up in one app and reduced the length of my space key by a third.)
Edit: Disabling the Copilot options in settings is necessary as they are on by default; note that disabling them does not remove the button. Thanks to this post for reminding me to review the settings after an update.. *unhappyface
Edit2: I just discovered that you can obtain several types of debug info by triple-tapping the version number in the About screen... That's nifty!
Neat video. It seems to capture the middle ground between a "super high coastal wave" and a "shallow but long-duration wave in deep water". That's what you'd get with coastal waters several tens of meters deep.
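For a rough sense of why a few tens of meters of depth sits in that middle ground: the standard shallow-water relations give a wave speed of c = sqrt(g·h), and Green's law says the amplitude grows roughly as h^(-1/4) as the water shallows. Here's a back-of-envelope sketch; the depth values are just illustrative assumptions on my part, not taken from the video:

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def wave_speed(depth_m: float) -> float:
    """Shallow-water (long-wave) phase speed: c = sqrt(g * h)."""
    return math.sqrt(g * depth_m)

def greens_law_amplification(deep_h: float, shallow_h: float) -> float:
    """Green's law: amplitude scales as h^(-1/4), so A2/A1 = (h1/h2)^(1/4)."""
    return (deep_h / shallow_h) ** 0.25

open_ocean = 4000.0  # assumed open-ocean depth, m
coastal = 50.0       # assumed coastal shelf depth, m

print(f"speed at {open_ocean:.0f} m depth: {wave_speed(open_ocean):.0f} m/s")  # ~198 m/s
print(f"speed at {coastal:.0f} m depth:   {wave_speed(coastal):.0f} m/s")      # ~22 m/s
print(f"amplitude growth factor: {greens_law_amplification(open_ocean, coastal):.1f}x")  # ~3x
```

So at several tens of meters of depth the wave has already slowed and steepened noticeably relative to the open ocean, but hasn't yet piled up into the towering coastal bore.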
>But Alsup split a very fine hair. In the same ruling, he found that Anthropic’s wholesale downloading and storage of millions of pirated books — via infamous “pirate libraries” like LibGen and PiLiMi — was not covered by fair use at all. In other words: training on lawfully acquired books is one thing, but stockpiling a central library of stolen copies is classic copyright infringement.
I am not actively following this trend, but didn't Meta do the exact same thing and successfully argue that it was fair use because.. they didn't seed or upload any data?