Hacker News

There's a difference between "supports the syntax for joins" and "does joins efficiently enough that they are useful."

My experience with ClickHouse is that its joins are not performant enough to be useful, so the best practice in most cases is to denormalize. I should have been more specific in my earlier comment.



Ack, that anonymous user on the internet said he couldn't make ClickHouse joins perform well in his case, which he didn't describe.


Not "that anonymous user." In my experience, avoiding JOIN statements is a common best practice for ClickHouse users seeking performant queries on large datasets. A couple of examples: https://medium.com/datadenys/optimizing-star-schema-queries-... https://posthog.com/blog/secrets-of-posthog-query-performanc...


> avoiding JOIN statements is a common best practice for ClickHouse users seeking performant queries on large datasets

It's a common best practice on any database: if the joined tables don't fit in memory, a merge join is an O(n log n) operation, which is indeed many times slower than querying a denormalized schema, which runs in linear time.
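A minimal sketch of the cost difference being described, using a toy sort-merge join in Python (this is an illustration of the complexity argument, not ClickHouse internals; the table names and fields are made up for the example):

```python
def merge_join(left, right):
    """Join two lists of (key, value) pairs on key via sort-merge."""
    left = sorted(left)    # O(n log n) sort of the left input
    right = sorted(right)  # O(m log m) sort of the right input
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        lk, lv = left[i]
        rk, _ = right[j]
        if lk < rk:
            i += 1
        elif lk > rk:
            j += 1
        else:
            # Emit all pairings for this key (handles duplicate right keys).
            jj = j
            while jj < len(right) and right[jj][0] == lk:
                out.append((lk, lv, right[jj][1]))
                jj += 1
            i += 1
    return out

# Normalized: events reference users by id and are joined at query time.
users = [(2, "bob"), (1, "alice")]
events = [(1, "click"), (2, "view"), (1, "scroll")]
joined = merge_join(events, users)

# Denormalized: the user name is stored on every event row, so the same
# question is answered in a single O(n) scan with no sort and no second table.
denorm = [(1, "click", "alice"), (2, "view", "bob"), (1, "scroll", "alice")]
alice_events = sum(1 for _, _, name in denorm if name == "alice")
```

The sorts dominate the join's cost; the denormalized scan skips them entirely, which is the gap the comment above is pointing at.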



