Fascinating. Well done! There is a certain eeriness to seeing this actually work. Although Borges lived to see computers become mainstream, I am sure he never imagined his artistic vision being implemented in such a strangely convenient and easy way.
Like many uses of AI, it's very good as a "seed", especially if it comes up with nuggets like grouping on a range instead of a single value. But as with almost any database, the devil is in the details. Different products have different interpretations of "quantity" (e.g. box vs. unit), coupons and discounts are modeled in weird ways, weights are assumed to be pounds or kg and are mixed without assigning units, etc.
Good point. However, AI may also train on the database metadata, including DDL comments, locale, etc., and perform data sampling to glean the nuances that are "understood".
JSON[B] works perfectly fine with decimal and float inside Postgres and PL/pgSQL, e.g. `SELECT (data->>'amount')::numeric`. The issue is moving that data in and out of Postgres (or really, any other data manager) with precision. If the JSON[B] is `{"amount": 12.09}`, you have no guarantee that someone picking it up outside of Postgres won't apply a float conversion, which could lead to 12.0899999. With BSON, amount is a decimal128 and stays that way, `fromBytes()` and `toBytes()`.
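To make the failure mode concrete, here is a minimal Python sketch (stdlib only, standing in for a hypothetical downstream consumer) of what a default float decode does to that payload:

```python
import json
from decimal import Decimal

raw = '{"amount": 12.09}'

# A consumer using the default JSON parser gets an IEEE-754 double;
# 12.09 has no exact binary representation, so the value actually
# stored is slightly off.
as_float = json.loads(raw)["amount"]
print(Decimal(as_float))  # -> 12.0899999999999998578914528...
```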
> If the JSON[B] is `{"amount": 12.09}`, you have no guarantee that someone picking it up outside of Postgres won't apply a float conversion, which could lead to 12.0899999. With BSON, amount is a decimal128 and stays that way, `fromBytes()` and `toBytes()`.
But how is that relevant? With BSON as the transfer format you still don't have a guarantee that the client won't misinterpret the data it receives by using a 64-bit float as its own internal data format after deserialization.
All languages I know have libraries that can decode the numbers from JSON strings into their relevant arbitrary-precision decimal type. If the user doesn't use that, why would it be a failure of JSON[B]/PostgreSQL?
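In Python, for instance, the stdlib `json` module can hand the raw digit string straight to `Decimal`, so the decode side is lossless without any extra library:

```python
import json
from decimal import Decimal

# parse_float receives the literal text "12.09" before any float
# conversion happens, so every digit survives.
doc = json.loads('{"amount": 12.09}', parse_float=Decimal)
print(doc["amount"])        # -> 12.09
print(type(doc["amount"]))  # -> <class 'decimal.Decimal'>
```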
With BSON I am guaranteed that, for example, in Java, when I do `Document d = BSON.fromBytes()`, then `BigDecimal amount = d.getDecimal128("amount")` is non-lossy. `amount` is typed; `double amount = d.getDouble("amount")` is an error.
Again, it's not a failure of JSON[B]/Postgres. It's what happens when the data carrier (JSON) is just a string with a small inferred type suite. How many times have we run into issues where `{"v": 1.3}` works and `{"v": 1.0}` works but `{"v": 1}` gets turned into an `int` by accident?
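That inference gotcha is easy to reproduce with, e.g., Python's stdlib `json` module:

```python
import json

# The decoded type depends on how the number was spelled,
# not on any declared schema.
print(type(json.loads('{"v": 1.3}')["v"]))  # -> <class 'float'>
print(type(json.loads('{"v": 1.0}')["v"]))  # -> <class 'float'>
print(type(json.loads('{"v": 1}')["v"]))    # -> <class 'int'>
```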
Consistent navigation and keystroke commands for content/context-appropriate basic operations have really been the foundation of my use of Emacs. Yes, there's Elisp integration for everything, and Magit is (perhaps...?) the best Git client, period, but boy... For example, I have gotten so much mileage out of `M-x comment-region` across Java, bash, C++, and XML, and out of using `C-s` to incrementally search a dired buffer the same way I incrementally search text.
And I was delighted, when I first used Eclipse, to discover that 90% of the keystroke commands that had become muscle memory (navigation, save, multilevel undo, block copy and paste...) are the same. Same with bash, of course.