
It would be neat to have a fundamental "money" primitive alongside int, float, string, etc.

It might not be simple to implement, though. Off the top of my head,

* Can you confidently say that $100>¥10 in an offline environment?

* You'd need to support different bases for splitting units to deal with eldritch values such as farthings.

* Where is the line drawn? Is a Bitcoin a valid currency? A gold Krugerrand? A bundle of stock shares? A bushel of apples?

All of that said, I would love to see a world where I could write code like:

    money balance = money.lumber.grade.kilograms(8);


> Can you confidently say that $100>¥10 in an offline environment?

Money is never converted. It is exchanged. Trying to solve this is like trying to answer the question: is "100 USD" > "1 LAPTOP"?

When you turn 100 USD into 90 EUR, you didn't convert it. You exchanged it. You bought EUR, at a price given to you by someone or something exchanging it. This could be a bank, a well-established currency office, or some dude on the street. There is no real difference between all three of those: The third party gave you a price, and now has more USD and less EUR, whereas you have more EUR and less USD.

There are various entities publishing standardized average rates which are calculated after the day closes, based on a variety of datapoints they have access to. Those are often used in eg. accounting, to establish the "real" value of something you bought in a currency you don't often use, but it's not true conversion.

If you have, as a datatype, an operation where one currency becomes another, there is ALWAYS a "rate" attached to it. So the question "$100>¥10" you asked above requires more data; it should be "$100>¥10 @ 144.28". ANYTHING else is a terrible leaky abstraction. Don't do it. Source your rates automatically from a single source if you like, but make it explicit.

Anyway, a "Money" object really is just this: A precise decimal object, with an ISO currency code. The latter simply being a short string among an included, limited set.
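A minimal sketch of that idea in Python (the class, the trimmed-down ISO set, and the `exchange` method are all hypothetical, not any real library's API): a precise decimal amount paired with an ISO 4217 code, where turning one currency into another always requires an explicit rate.

```python
from dataclasses import dataclass
from decimal import Decimal

# A real type would ship the full ISO 4217 list; three codes suffice here.
_ISO_CODES = {"USD", "EUR", "JPY"}

@dataclass(frozen=True)
class Money:
    amount: Decimal
    currency: str

    def __post_init__(self):
        if self.currency not in _ISO_CODES:
            raise ValueError(f"unknown currency: {self.currency}")

    def exchange(self, rate: Decimal, target: str) -> "Money":
        # An exchange is explicit: it always carries a rate.
        return Money(self.amount * rate, target)

price = Money(Decimal("100"), "USD")
in_eur = price.exchange(Decimal("0.90"), "EUR")  # 100 USD @ 0.90 -> 90 EUR
```

Nothing here ever converts implicitly; the rate is data the caller must supply.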


Currency conversions are a transaction, not an operation: the conversion rate fluctuates constantly, and conversions typically involve fees and tax liabilities. For a money type, I’d go so far as to want it to disallow, or throw an exception on, any operation on two monies in different currencies.
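A sketch of that behavior, assuming a hypothetical `Money` class and `CurrencyMismatch` exception: arithmetic is only defined within a single currency, and mixing two currencies fails loudly.

```python
from decimal import Decimal

class CurrencyMismatch(Exception):
    pass

class Money:
    def __init__(self, amount, currency):
        self.amount = Decimal(amount)
        self.currency = currency

    def __add__(self, other):
        # Adding two monies only makes sense in the same currency;
        # anything else is a transaction, not an operation.
        if self.currency != other.currency:
            raise CurrencyMismatch(f"{self.currency} + {other.currency}")
        return Money(self.amount + other.amount, self.currency)

total = Money("10", "USD") + Money("5", "USD")   # fine: 15 USD
# Money("10", "USD") + Money("5", "JPY")         # raises CurrencyMismatch
```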


I don't think it's as big a lift. First, there are standards bodies that publish a "default" set of currencies, much like the ISO standard country codes. No one really complains that Narnia isn't a country for ISO locales, or Disneyland, or the Austro-Hungarian Empire.

At a bare minimum, it should be a reasonable fixed point type that correctly handles rounding and intermediate values. So a dollar amount like 123.45 times a rate like 0.3450 doesn't exceed 4 decimal places but intermediate values are extended so we get correct rounding. The destination should probably determine the number of places. That bare minimum wouldn't stop you from comparing yen to dollars, any more than a floating point representing mph stops you from comparing it to a value representing kph.

But there are times where we need to track prices to the nearest tenth or hundredth of a cent. So it should be extensible, so that 123.456 dollars * 0.3450 still ends up rounded to the correct number of decimal places.
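One way to sketch that rounding discipline with Python's standard `decimal` module (the function name and the choice of banker's rounding are my assumptions): compute the product exactly, then round once, to the destination's number of places.

```python
from decimal import Decimal, ROUND_HALF_EVEN

def multiply_to_places(amount: str, rate: str, places: int) -> Decimal:
    # Keep the intermediate product exact, then round a single time
    # to the destination's number of decimal places.
    exact = Decimal(amount) * Decimal(rate)   # e.g. 42.59025, never truncated
    quantum = Decimal(1).scaleb(-places)      # places=2 -> Decimal("0.01")
    return exact.quantize(quantum, rounding=ROUND_HALF_EVEN)

multiply_to_places("123.45", "0.3450", 2)    # exact 42.59025 -> 42.59
multiply_to_places("123.456", "0.3450", 3)   # exact 42.59232 -> 42.592
```

Letting the destination choose `places` is what makes the same code work for two-place dollars and sub-cent prices alike.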

You also don't need always-on, real time currency conversion. You could have a conversion type, operator, or method that does safe conversion based on the value I give it. So if I estimate that Yen are about 130 to the dollar, I can just use that. If I happen to write an application that queries a data provider and can populate that in 'real time,' that's up to me.

If you really wanted, you could find a way to create new types that represent currencies that aren't part of the basic implementation. That might mean you need to specify some things like the representation for different locales, or the default number of digits.
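A toy sketch of such an extension point (the registry, its field names, and `register_currency` are all made up for illustration): each currency carries a default number of digits and a display symbol, and new ones can be registered alongside the built-in set.

```python
from decimal import Decimal

# Hypothetical registry of currencies and their formatting details.
CURRENCIES = {
    "USD": {"places": 2, "symbol": "$"},
    "JPY": {"places": 0, "symbol": "¥"},
}

def register_currency(code: str, places: int, symbol: str) -> None:
    CURRENCIES[code] = {"places": places, "symbol": symbol}

def format_amount(amount: Decimal, code: str) -> str:
    spec = CURRENCIES[code]
    quantized = round(amount, spec["places"])
    return f"{spec['symbol']}{quantized}"

register_currency("BTC", places=8, symbol="₿")
format_amount(Decimal("0.12345678"), "BTC")   # "₿0.12345678"
```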


"Can you confidently say that $100>¥10 in an offline environment?"

A major problem money has is that it isn't a unit in the sense we usually take the term to mean. We expect, for instance, that a value translated from one unit to another at a suitable level of precision can be translated back to the original unit without loss, but that's not true for money, even ignoring transaction costs. If "US dollar" is a unit, it is a unit that stands alone in its own universe, not truly convertible to anything else, not even other currencies. All conversions are transient events with no repeatability. That is very inconvenient to deal with, though, and as long as all the relevant values are sufficiently stable, it's often a good enough approximation to just pretend it's a unit. But if you zoom in enough, the approximation breaks down.

For that and similar reasons, while you could theoretically write that line of code, it would be implicitly depending on a huge pile of what would in most languages be global state. It would be a dubious line of code.


It looks like math but really it is describing an exchange of goods.


Here's the implementation we use at work. You might find some interesting ideas there. It's nicely documented.

https://hackage.haskell.org/package/safe-money-0.9.1/docs/Mo...


> Can you confidently say that $100>¥10 in an offline environment?

It’s not even clear what the second currency is. US$100 is over 14,000 JP¥, or 725 CN¥.

If you’re offering me a wager on whether $100 is more than ¥10, I’ll take the side that it is.


> Can you confidently say that $100>¥10 in an offline environment?

Yeah, one more reason for it to be a different type.

> Where is the line drawn?

It's not.

As a rule, measurement units have absurdly bad support in computers.



