A law like that would probably be unconstitutional if it applied broadly to speech in general. Compare United States v. Alvarez, where the Supreme Court held that the First Amendment gives you the right to lie about having received military medals.
It might work in more limited contexts, like commercial speech.
> I wouldn't want humans pretending to be bots, for a variety of reasons.
I don't have an opinion yet, but I can't think of a specific reason to object to that (other than a default preference for honesty). Could you give an example or two?
Probably the main risk is people trusting what they believe to be an automated system to act only for a specific purpose, or saying things to it that they assume will stay private. I’m not saying it’s actually safe to interact with bots this way, but the trust expectations are different enough.
What you'll find is that most people form a knee-jerk opinion first, most often in opposition to change, then seek reasons to justify that opinion after the fact.
In other words, people generally cherry-pick evidence for their opinions rather than picking opinions for their evidence.
A good sign this is occurring is when the reasons provided are vague, when the negative outcome in question is rare, or when they rest on hypothetical scenarios that rely on companies or people behaving in unlikely and unnatural ways (like ignoring broader incentives).
The result is Luddism, and proposals exactly like this one, where regulation is demanded even in the absence of meaningful and demonstrable harm.
So compel speech from a person? "Congress shall make no law..." Really, the most basic civics education would benefit us all; I think there are some YouTube videos about this.
I wouldn't want humans pretending to be bots, for a variety of reasons.