Hacker News | orolle's comments

The higher the temperature differential, the higher the efficiency of heat-to-electricity conversion. If I remember correctly, there's a big efficiency gain between 200°C and 300°C. From the economic side of things, more bang for the buck.
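As a rough sketch of why the differential matters (assuming, hypothetically, a ~25°C cold sink), the Carnot limit gives an upper bound on the conversion efficiency:

    ;; Carnot efficiency: an upper bound, not what a real plant achieves.
    (defn carnot-efficiency [t-hot-c t-cold-c]
      (- 1.0 (/ (+ t-cold-c 273.15) (+ t-hot-c 273.15))))

    (carnot-efficiency 200 25) ;; => ~0.37
    (carnot-efficiency 300 25) ;; => ~0.48, roughly a 30% relative gain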


It's a different discussion, quite off topic. When the boomers go into retirement, our current economic system with its GDP-growth focus has a big problem. Our system is based on producing more stuff and buying more stuff. But when the boomers retire, they will buy less stuff. Who will buy the supply overhang? Nobody. Even worse, who will pay for the debt incurred to buy overly inflated assets, like housing and factories, when boomers start downsizing and selling?

Look at Japan. They are 10 years ahead of us. It will be tough and depressing until the economic system has adapted and prices have normalized. On an optimistic note, humanity will progress, and when an economy cannot sell more stuff, it has to sell better stuff.


Japan is not an apt demographic analogue for the USA. The census recently came out and found that the number of Asian and Hispanic residents of the USA has increased hugely (~20-30%) over the past 10 years. These are younger populations with higher birth rates than "White" people, whose share of the US population is heading downwards.

Japan, on the other hand, is xenophobic and has discouraged immigration very heavily. Combine that xenophobia with historical matters: sneak-attacking an industrial powerhouse in 1941 in one of the most ill-advised, terrible wars, and losing repute by massacring hundreds of thousands of civilians. Meanwhile thousands upon thousands of their own best young men and civilians were killed by the vastly superior manpower and industrial might of the US. Japan was hobbled by WW2 and has never fully recovered; consider that the greatest catastrophes of their history were still only 80 years ago: losing a generation of youth, their cities being fire-bombed, their savings being depleted for phony war bonds, and being the only country ever to be nuked. Japan is simply not a good demographic comparator for the USA.


You're trying to explain current Japanese demographics with a weird rant about WW2 without mentioning the subsequent economic miracle and population growth? The population of Japan was not far off doubling from 1945 to 2010, and in case you somehow haven't noticed, they became a major first-world economy, eclipsing many many nations which were not nuked.

80 years is really quite a long time. Germany also bounced back rapidly in the second half of the twentieth century to become a major industrial power.


I don't disagree that it could definitely become a problem for the US, but so far, with immigration, it hasn't been. As long as our immigration numbers stay up, the US should be fine.

So far it's been a steady state, but it's unknown if that will continue into the future.


I agree with both you and the GP, and also see immigration as moving the hard problems from one field to another.

To solve the economic issue of maintaining growth, the US/EU moved to the issue of how to maintain a peaceful but diverse and divergent society.

I see Japan's partial bailing on immigration as a sign they don't see any good way to get through it, and we can see a lot of today's US internal fight as the result of not paying enough attention to how hard it is to adapt a society to the new challenges.

I wonder if there could be a third way, with economic powerhouses moving their "growth" to other countries without the stigma of stagnation or exploitation.


In the not too distant future, competition for immigration is going to be tough. I wonder if the US has the political atmosphere to offer competitive packages to win over the immigrants we'll need.


Why do you think that? It feels like the mid-term (e.g. 20-40 years) outlook for migration would include a large increase in migrant supply due to e.g. climate change issues in the "global south", so instead of competition for immigration it seems likely that places like the US would be able to pick whatever kind of migrants they'd prefer to allow.


Yes, the supply of immigrants will increase because of climate change, but I think it's important to understand the current structure of age distributions in the world. The US had a pretty large Millennial generation. Most other countries did not. Which means we're not REALLY going to feel the need to take on immigrants for a while. But as boomers retire and Millennials move to replace them, the countries that didn't have a sizable Millennial generation are going to have much higher demand. Those countries are going to be more desperate than the US and will likely start developing very sizable offers. The US is going to catch up, though; the US does appear to be inverting its demographic distribution as well, but we're 20 years behind. That's actually a pretty big advantage in a bunch of ways, but in terms of competing for immigrants, it's a disadvantage.


Thanks, good points!


The real question is: are we interested in quantity or quality?


I think the author's wording is a little bit confusing. I think he wants to point out that narrow AI is used successfully only in (niche) applications with a narrow purpose. Niche here is not about total market size.


You need to be very careful applying the Kelly Criterion to the stock market, as you cannot precisely calculate the p of your investments. If you assume too high a p, you will overbet, and it's only a matter of time until you go bust (see N. N. Taleb's MOOCs on Kelly). Thus the Kelly Criterion should be your UPPER bound for real-life investments with uncertain p; stay well below the betting amount the Kelly Criterion would suggest, so that you stay in the game longer.
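A toy sketch of the overbetting risk (made-up numbers, not investment advice): the expected log-growth of your bankroll turns negative as soon as your assumed p is too optimistic.

    ;; Kelly fraction for a bet paying b:1 with assumed win probability p.
    (defn kelly-fraction [p b]
      (- p (/ (- 1.0 p) b)))

    ;; Expected log-growth per bet when wagering a fraction f of the bankroll.
    (defn log-growth [p b f]
      (+ (* p (Math/log (+ 1.0 (* f b))))
         (* (- 1.0 p) (Math/log (- 1.0 f)))))

    ;; True p is 0.52, but you assume 0.60 and bet the full Kelly fraction:
    (kelly-fraction 0.60 1.0)     ;; => ~0.2 of the bankroll, far too much
    (log-growth 0.52 1.0 0.2)     ;; negative => ruin in the long run
    ;; Betting half of the correctly estimated Kelly fraction stays positive:
    (log-growth 0.52 1.0 (* 0.5 (kelly-fraction 0.52 1.0))) ;; slightly positive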

Related to this, the best investors in the world are quite old guys. Why? Because they lived long enough to accumulate enough wealth to be of public interest.


This is nothing new. The independent media wrote about this a year ago and the mainstream media organized an anti-fake-news campaign against it. Now the mainstream presents itself as the saint - BS! A year ago the mainstream did not check the facts or at least compare natural versus lab origin. No - they steamrolled independent media for publishing "fake" news, got them kicked off of social media, and destroyed their income stream. Now it's factual news and they present themselves as saints, without apologizing for their wrongdoing. I do not trust these mainstream media outlets anymore. I spend my money on independent news.


We have anti-trust laws. Still, the anti-trust laws are not enforced.


A vast majority of laws are not enforced, especially not consistently.


Because it would be political suicide to do so. Many people in this country love big companies and build their entire identities and personalities around brands like Tesla or Apple.


Many small and medium businesses also have been hurt by the monopolies. I don’t think it’s political suicide, and the winds seem to be shifting back towards anti monopoly.


I disagree. As long as you use prepared statements and bound parameters, your application is safe from SQL injections. NEVER use string concatenation to generate any SQL queries - not in your app and not in your database! It's unsafe and slow. https://security.stackexchange.com/questions/15214/are-prepa...
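For example, a minimal sketch in Clojure using the next.jdbc library (ds is assumed to be an existing javax.sql.DataSource); the same idea applies to prepared statements in any language:

    (require '[next.jdbc :as jdbc])

    ;; The SQL text is fixed; user input travels only as a bound parameter,
    ;; so it can never be parsed as SQL.
    (defn find-user [ds username]
      (jdbc/execute! ds ["SELECT id, name FROM users WHERE name = ?" username]))

    ;; NEVER do this - the input becomes part of the SQL text itself:
    ;; (jdbc/execute! ds [(str "SELECT id, name FROM users WHERE name = '" username "'")])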


That's easy enough to say, but time and time again I see codebases, even ones making extensive use of prepared statements, falling back to doing string concatenation from time to time. Prepared statements etc are an example of "opt-in security", which is a good band-aid to have for quickly fixing up old code, but it still allows for some pretty egregious errors.

Again, with the seat-belt analogy. As long as you're safe and careful all the time, seat-belts are worthless. Therefore seat-belts are only for dumb, reckless people.


Then again, prepared statements (and SQL injection) are a solved problem. Imagine what people who can't bother to use prepared statements would do with an ORM in non-trivial cases.


Funny enough, I reverse-engineered your numbers a few days ago. I guesstimated at least 3 million ARR with 2-10 FTEs. From a revenue-per-FTE perspective, you have the most impressive numbers I have ever seen!


I used the library in a hobby project, where I used SQL outer joins to produce maps for dativity. Dativity controlled the SQL transactions that changed the process instances. It was simple to implement, and it controlled different business processes. The library itself is stateless, so it has no thread pool or anything that can execute scheduled tasks; the trigger for change needs to come from the outside. Edit: I do not think dativity tries to replace BPMN. It's an alternative approach with its own pros and cons.


It is the nature of the problem space that requires you to produce high-quality code. If you have a small error in your program, the financial market participants will use it against you to profit. What you lose, others win! Google "fat finger" for examples. In banking you have to keep track of every transaction, see "double-entry accounting". You never delete a transaction, you only retract! Mutability can cause you a lot of trouble there. SQL DELETE and UPDATE are extremely dangerous! Clojure and Datomic solve this through immutability. Lastly, time is relativistic, meaning that every IT system has a slightly different time. Normally you never notice this, but it causes tricky race conditions and costs you real money. Think about bank transfers, where you have a transaction date (the date you send money) and a value date (the date your friend receives the money). One transaction, two different dates, depending on which perspective you take (perspective is relativistic, Einstein is right even in IT!). Datomic linearizes transactions, so this problem does not occur at the database level.
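As a toy illustration of the append-only idea (a hypothetical data model, not any real bank's schema):

    ;; Transactions are never deleted or updated; a mistake is corrected by
    ;; appending a reversal plus a new entry, so the history stays auditable.
    (def ledger (atom []))

    (defn post! [tx] (swap! ledger conj tx))

    (defn reversal [{:keys [id amount] :as tx}]
      (assoc tx :id (str id "-reversal") :reverses id :amount (- amount)))

    (def t1 {:id "t1" :amount 100M
             :booking-date #inst "2021-06-01"    ;; date the money was sent
             :value-date   #inst "2021-06-03"})  ;; date the money arrives

    (post! t1)
    ;; Wrong amount? No UPDATE or DELETE - append a reversal and a correction.
    (post! (reversal t1))
    (post! (assoc t1 :id "t2" :amount 90M))

    (reduce + 0M (map :amount @ledger)) ;; => 90M, with the full history kept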


I love immutability and Datomic as much as the next Clojure programmer. But this sounds quite a lot like post-hoc reasoning.

For example, there's quite a few situations where you require mutability in bank data (e.g. with government requirements, or when you want to rollback fraudulent trading). Sure, both are solvable with Datomic, but they don't scream "huge advantage with an immutable datastore".

And there are plenty of highly competitive industries where code quality matters, but they're not flocking to functional programming.

Don't get me wrong, I think functional programming is an absolutely great way to structure and think about programming (and have coded professionally in Clojure for the last 5 years or so). But I suspect the prevalence of FP in some industries is as much down to chance/social reasons as it is to some inherent superiority in the approach.


Small note: banks do not want to forget fraud; the rollback would be akin to a git revert. Even if the history presented to the customer makes the fraud vanish, it won't be forgotten by the bank.

With government requirements I'm assuming you mean something like GDPR. Datomic supports actually removing data via excision; it's just not the default behaviour. I personally prefer a system that doesn't forget by default whilst preserving the option to do so.

You can also support destruction of personal data via other means, such as key shredding.


Aren’t “rollbacks” done as separate transactions that return the account amount to its previous value?


Depends on the legal requirements of that rollback


Not really a Clojure user, but I made a small API with it once and it blew my mind, especially the design patterns involved. An interesting one is ports-and-adapters (a.k.a. hexagonal architecture) [1][2]. Basically, all the business logic is kept in one layer, and all of the functions there should be pure (i.e. they always return the same output for the same input and cause no side effects [3]). Then you have layers where you can plug in databases and REST handling.
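A tiny sketch of that split (hypothetical names, not taken from the linked example; the "database" here is just an atom standing in for a real adapter):

    ;; Business logic ("inside the hexagon"): pure functions only.
    (defn apply-deposit [account amount]
      (update account :balance + amount))

    ;; Adapters ("outside"): all side effects live at the edges.
    (defonce db (atom {1 {:id 1 :balance 0M}}))

    (defn load-account  [account-id] (get @db account-id))
    (defn save-account! [account]    (swap! db assoc (:id account) account))

    (defn deposit! [account-id amount]
      (-> (load-account account-id)   ;; side effect: read
          (apply-deposit amount)      ;; pure core - trivial to test in the REPL
          (save-account!)))           ;; side effect: write

    (deposit! 1 100M)
    (load-account 1) ;; => {:id 1, :balance 100M}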

And Nubank takes testing really seriously. The REPL and pure functions make it very easy to use TDD.

[1] https://github.com/nubank/basic-microservice-example#ports-a...

[2] http://wiki.c2.com/?PortsAndAdaptersArchitecture

[3] https://practicalli.github.io/clojure/thinking-functionally/...


> It is the nature of the problem space that requires you to produce high-quality code.

Wouldn’t a strongly-typed language be a better choice here?


It takes the will and discipline to use it, but what Clojure Spec (and other schema libs) offer is, in a lot of ways, more powerful and flexible than traditional type systems. A spec can be thought of more as a contract with the data, one with enough detail that it can even be used to auto-generate conformant example data. If, for instance, you have a function that needs to work on integers, numeric strings, and textual fractions like '1/5', enforcing this constraint on input and getting informative exceptions on bad data is easy with a spec, and the function code does not need to contain all of the noise to validate or coerce data. Sure, you don't see the problem at compile time, but if you can auto-generate test data, the tests that you should already have become easier to write.
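For example, a sketch of such a spec (the spec names are made up): the input may be an integer, a numeric string, or a fraction string like '1/5'.

    (require '[clojure.spec.alpha :as s])

    (s/def ::numeric-string  (s/and string? #(re-matches #"-?\d+(\.\d+)?" %)))
    (s/def ::fraction-string (s/and string? #(re-matches #"-?\d+/\d+" %)))
    (s/def ::flexible-number (s/or :int      int?
                                   :num-str  ::numeric-string
                                   :frac-str ::fraction-string))

    (s/valid?  ::flexible-number 42)      ;; => true
    (s/valid?  ::flexible-number "1/5")   ;; => true
    (s/conform ::flexible-number "1/5")   ;; => [:frac-str "1/5"]
    (s/explain ::flexible-number :nope)   ;; prints why the value fails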


this is perhaps controversial but i actually think datomic is so powerful, it's worth using clojure for. in other words, the database is driving the choice of programming language.

the idea of viewing a database as an immutable state of the world at a given instant t0, and time becoming a parameter on that state of the world (in order to show changes as time goes forwards [or backwards!]), is extremely, extremely attractive for things like finance, whose first class citizens are among others:

- capability for audits e.g. show me the history of transactions from any particular account. since datomic is basically a collection of immutable facts over time, this is "free" (see the sketch below)

- distributed computing - datomic runs nicely across your own internal compute (often needed for financial stuff)

- transactions are no longer strings, but are actual data structures - this makes the gnarly steps of things like transferring assets across instruments a lot easier (i'd imagine). think about how you'd implement a shopping cart with transactions in postgres vs. how you'd do it with access to raw data structures
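a rough sketch of the audit/time-travel part (assumes an existing connection conn, an entity id account-id and an :account/balance attribute in the schema):

    (require '[datomic.api :as d])

    (def db (d/db conn))

    ;; the database value as of some past instant - reads are reproducible
    (def db-2020 (d/as-of db #inst "2020-01-01"))
    (d/q '[:find ?balance
           :in $ ?account
           :where [?account :account/balance ?balance]]
         db-2020 account-id)

    ;; audit: the full balance history of the account comes "for free"
    (d/q '[:find ?balance ?tx ?added
           :in $ ?account
           :where [?account :account/balance ?balance ?tx ?added]]
         (d/history db) account-id)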


Moreover, transactions are reified and can have arbitrary metadata, so you can query the transaction log itself (for example, "show me all transactions issued by person X").
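A sketch of that pattern (the :audit/issued-by attribute is made up and would have to exist in the schema; conn and account-id are assumed to exist):

    (require '[datomic.api :as d])

    ;; "datomic.tx" refers to the transaction entity being created, so you can
    ;; attach arbitrary metadata to the transaction itself.
    @(d/transact conn [{:db/id "datomic.tx" :audit/issued-by "person-x"}
                       [:db/add account-id :account/balance 100M]])

    ;; Later: find every datom asserted by transactions issued by person X.
    (d/q '[:find ?e ?a ?v ?tx
           :in $ ?user
           :where
           [?tx :audit/issued-by ?user]
           [?e ?a ?v ?tx]]
         (d/db conn) "person-x")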


I do not think so. Regulation is constantly changing and the meaning of names changes frequently. Thus a "Verified Account" can mean different things over the years. The problem with types and object orientation is that the names used in the domain diverge from the names used in source code (class names, types). Think about a class diagram with class names relating to each other. To represent the domain language better, you need to change a lot in a class diagram. Dynamic languages reduce the problem, as far fewer names are needed. Clojure spec is used to specify data instead of types, but there is also core.typed (an optional static type system for Clojure).


Standard Chartered, Mercury, Tsuru Capital (Haskell), Jane Street (OCaml) don't seem to have a problem with using a statically typed language in the financial space.


maybe they do have a problem if their net worth is so low in comparison :D


Unlikely. At least not for domain modeling.

Especially in that industry where your domain is changing all the time, where regulations are changing all the time, where the ability to reason about your domain at different points in time is essential.

Having more flexible types like maps is one of the building blocks for avoiding complexity (there are more, and more important, ones). It sounds counter-intuitive, but it certainly is working out for companies embracing Clojure.


I wouldn't think that strong types are an advantage. They may be in classic software with long compile times and complex builds. However, the current landscape for financial institutions doesn't require that. Immutability and functional paradigms seem to be much more in line with the needs of the business.


> It is the nature of the problem space that requires you to produce high-quality code.

Would love to see any actual data or studies showing that functional programming implicitly produces "high quality code".



I have quite a few gripes with this article, and overall question the validity of its assertions, but interesting nonetheless.


There's a ton of problems with empirical studies about software in general. Very hard to conduct reliable experiments. In particular, I think analyzing public GitHub projects is pretty much the worst corpus possible.

For example, almost all of the projects in this study are infrastructure projects (I'm not familiar with all of them so I can't say that it's definitely all). I'm much more interested in application projects, and even if you (general you) aren't, you have to admit that an infrastructure project has a totally different set of characteristics than your average business application.

I think anything we can do to get more empirical data related to software is for the better, as we have devolved into strong personalities and conviction making pretty much all of the major decisions in our industry, which is really deeply sad. But we have to do better than just mining open-source GitHub projects.


Yeah, I wasn't necessarily trying to get into an argument with the person who responded, but there's so many factors not being accounted for in that study that it might as well be measuring nothing at all.

Choosing Github projects and measuring defects on them has almost nothing to do with the quality of closed-source code (as we were discussing, functional programming languages in the wild). I also briefly started down the path of mapping that study's measurements to overall language popularity (as I think they're related - more C++ code available = more bugs), but gave up as I remembered that A.) Nobody's opinion changes as a result of a reasonable argument on the internet, and B.) Convincing this person nets me nothing at all.

You get higher quality code by hiring people capable of producing higher quality code. A great way to find people that are capable of producing higher quality code is by hiring for languages with extremely small talent pools - nobody got there because it was easy or the odds of getting hired were good. Functional programming might seem popular on HN, but in the wild is really not popular at all. Clojure developers probably care a great deal more about the quality of their work than, say, some random J2EE developer, as the Java dev might not care at all about anything other than staying employed.

I guess my argument summarizes down to "Functional programmers care more about their work", which, to my original point, has nothing at all to do with the languages they're using (as the person I responded to was saying). To assert that "Functional programming languages produce higher quality code" is like saying "This brand of hammer hammers better." It's just nonsense.


That's an oversimplification. Good programmers care about their work, no matter which language.

Also, use the right tool for the job.

A language that forces you to think about values rather than mutable objects will produce higher-quality code, as the number of ways you can shoot yourself in the foot is drastically reduced.

Clojure will make you run your code constantly in the REPL. Paired with a dead-simple testing library and the desire to keep the majority of your functions pure, it is not hard to see why Clojure code has higher quality.


> Good programmers care about their work, doesn't matter which language.

Yes, but the whole purpose of my post is to point out the difficulty in _finding_ the good programmers.

> Also, use the right tool for the job.

Not sure how this applies. You can build banking software in literally any programming language.

> A language that forces you to think about values rather than mutable objects, will produce higher quality code as the number of ways you can shoot yourself in the foot are drastically reduced.

Again, this is just totally unsubstantiated. You're using "higher quality code" to mean "code I like more".

> Clojure will make you run your code constantly in the REPL. Paired with a dead-simple testing library, the desire to keep your majority of functions pure, it is not hard to see why Clojure code has higher quality.

Where have we demonstrated that clojure has implicit higher quality? Also, where have we demonstrated that all clojure devs keep their functions pure? Or test them? And how does clojure enforce that?

Your last bit kinda proves my whole point, I think. You like a language a lot, and you make good use of its tools. But again, when you're looking at large groups of programmers, they're going to drastically lower the bar on what you "should" do - talking about "strong desire" and "dead simple" is totally unrelated to whether people will _actually do_ something in the wild. And what's the best way to keep a codebase good? Well first, tooling, but second, not letting bad devs get ahold of it.


That’s unfortunately hard to find, since hardly any customers require that these days. Fast and cheap is where it’s at.


Or, as I'm asserting, it's just totally bogus.

