
>No, these things are not, "emergent," they are just rearranging numbers.

This is a bad take. Most ways to "rearrange numbers" produce noise. That a very small subset of permutations produces meaningful content, and that the system consistently lands on such permutations, is a substantial result. The question of novelty is whether these particular permutations have been seen before, or are merely simple interpolations of what has been seen before. I think it's pretty obvious that the space of possible meaningful permutations is much larger than what is present in the training set. The question of novelty, then, is whether the model can produce meaningful output (i.e. grammatically correct, sensible, plausible) in a space far larger than what was present in the training corpus. I strongly suspect the answer is yes, but this is ultimately an empirical question.
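
To put rough numbers on that last point, here is a back-of-envelope sketch in Python. The vocabulary size, sequence length, corpus size, and the one-in-10^60 "meaningful fraction" are all illustrative assumptions, not measured values:

    import math

    vocab_size = 50_000     # assumed vocabulary size (GPT-2 scale)
    seq_len = 20            # length of the token sequences considered
    corpus_tokens = 10**13  # assumed training corpus size, in tokens

    # Every possible 20-token sequence over the vocabulary.
    total_sequences = vocab_size ** seq_len
    print(f"possible sequences:           ~10^{math.log10(total_sequences):.0f}")  # ~10^94

    # Upper bound on distinct 20-grams in the corpus: one per starting position.
    print(f"sequences seen in training:  <=10^{math.log10(corpus_tokens):.0f}")    # 10^13

    # Suppose only one sequence in 10^60 is grammatical, sensible, and plausible.
    meaningful = total_sequences * 10**-60  # purely illustrative fraction
    print(f"hypothetical meaningful ones: ~10^{math.log10(meaningful):.0f}")       # ~10^34

Even under these deliberately pessimistic assumptions, the meaningful subset outnumbers anything the training set could contain by roughly twenty orders of magnitude; that gap is what the empirical question turns on.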



I would love to read anything you have written about the topic at length. Thanks for your contribution.


I haven't written anything substantial on the subject, unfortunately. I do have some ideas rattling around, so maybe this will motivate me to get them down.



