Hacker News
ipsum2 | 3 hours ago | on: Something weird is happening with LLMs and chess
The more obvious alternative, which I believe is the case, is that CoT is making up for deficiencies in tokenization.
aithrowawaycomm | 2 hours ago
I think the more obvious explanation has to do with computational complexity: counting is an O(n) problem, but transformer LLMs can’t solve O(n) problems unless you use CoT prompting:
https://arxiv.org/abs/2310.07923
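A toy sketch of the point both comments are making: counting characters requires an O(n) sequential scan, and a model that sees text as multi-character tokens must first "spell out" each token before it can tally. The token segmentation below is hypothetical, for illustration only, not any real tokenizer's output.

```python
def count_char(tokens, ch):
    """Chain-of-thought-style counting: expand each token into its
    characters, then tally one character at a time (an O(n) scan)."""
    count = 0
    for tok in tokens:       # step 1: spell out each token
        for c in tok:        # step 2: check characters one by one
            if c == ch:
                count += 1
    return count

# Hypothetical segmentation of "strawberry" into subword tokens.
tokens = ["str", "aw", "berry"]
print(count_char(tokens, "r"))  # 3
```

A model that tries to answer in a single step never performs the character-level expansion; CoT prompting makes the intermediate scan explicit.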
ipsum2 | 1 hour ago
What you're saying is an explanation of what I said, so I agree with you ;)