Hacker News

Even human-written content has become so bad it's sometimes hard to tell it apart from a Mark V. Shaney or Racter implementation. I mean, aren't our brains' language capabilities just Markov chains on steroids, with billions and billions of inputs and weights? Why are we still using text-based content-generation algorithms? Can't we train some kind of AI to generate "good" fiction content? Like, for example, this: "Oh," said Lucy, peeping into the parlour, "is that our old friend, the fairy?" "Yes," said Mr. Bennet, looking up from his newspaper, "my sister has been entertaining you in the best way she can." "She has been telling me," said Mrs. Bennet, with a solemn laugh, "all about her doll's tea-party. I wish I had been there." "My dear," replied her husband, "I am certain that it has been her greatest enjoyment. And the little girl really seems to have had a very good time of it." "Not more so, I am afraid, than the other ladies. _They_ had none of the advantages of hearing the charming story.


> I mean, aren't our brains' language capabilities just Markov chains on steroids, with billions and billions of inputs and weights?

The word "just" is doing a lot of heavy lifting here. Yes, our language capabilities are just Markov chains - the way engineering is just constrained nonlinear optimization and playing games is just tree traversal.
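For reference, the Mark V. Shaney-style generator mentioned above is just this: record which words follow which, then do a random walk. A minimal word-level sketch in Python (the function names and tiny corpus are mine, for illustration, not from any particular implementation):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed following it."""
    words = text.split()
    chain = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length=10, seed=0):
    """Random-walk the chain from `start`, sampling a successor each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:  # dead end: no observed successor
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the fairy said the doll said the tea party pleased the fairy"
chain = build_chain(corpus)
print(generate(chain, "the", length=8))
```

Grammatically plausible locally, meaningless globally, which is exactly the failure mode the thread is describing; a billion-parameter model is this plus a learned, compressed successor distribution instead of a raw lookup table.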


The generated content is all nonsensical babble when it’s asked to do more than give a definition for a word or concept.

It’s grammatically correct but has no deeper meaning. And sure, some of the human-written crap posted here is also meaningless blogspam.



