
Nope. Dijkstra was and still is right.

The newfangled ways of teaching how to code have managed to produce a generation of really poor-quality software "engineers" producing atrocious, slow, and buggy bloatware.

You cannot be a good engineer if you reason with analogies and metaphors instead of understanding how things actually work, at multiple levels - knowing what can be abstracted and when, and which lower-level details are important and should not be ignored.

So... if you want to become a decent software engineer, start with learning digital electronics and assembly. This way you won't need "metaphors" for arrays and such - because you will know that indexing a static array is just multiplication by a constant, addition of a constant, and a dereference of the resulting address.
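To make that concrete, a minimal C sketch (array contents and index invented for illustration): the bracket syntax and the raw address arithmetic are the same operation.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        int a[8] = {10, 20, 30, 40, 50, 60, 70, 80};
        size_t i = 3;

        /* a[i] is literally: base address + i * sizeof(int), then a load.
           A compiler typically emits something like: mov eax, [base + i*4] */
        int via_arithmetic = *(int *)((uintptr_t)a + i * sizeof(int));

        printf("%d == %d\n", a[i], via_arithmetic); /* 40 == 40 */
        return 0;
    }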


Although I agree with the core of your argument, I think this position is a little reductionist. Yes, you really do need to understand what's going on under the hood (though, choose your hood). But you're also almost-but-not-explicitly saying that analogies and metaphors are not valuable tools for engineering. I really strongly disagree with that.

As one example, I find analogies downright necessary when working with a complex domain -- you have to find ways to speak domainese in the software world, or speak programese in the domain world, and analogies are precisely the tool for that job. It's easy to think that domain concepts should have a 1-1 representation in the software world, but in my experience, that's neither true nor productive. You do need to understand (and document for others!) the mappings between those concepts, but they shouldn't be so rigid.

On a related note, I'm reminded of Terence Tao's thoughts on rigor in mathematics [0]. He divides expertise into pre-rigor, rigor, and post-rigor phases. It's necessary to understand the rigorous elements, but those (a) build on intuitions built up over time in the pre-rigor phase and (b) support expert, formalizable intuitions in the post-rigor phase.

[0] https://terrytao.wordpress.com/career-advice/theres-more-to-...

> So... if you want to become a decent software engineer, start with learning digital electronics and assembly.

I disagree here too, though less strongly. I think it's important to have approximate intuitions that can be refined, rather than trying to manufacture a precise understanding out of nothing. Those intuitions can be built up by attempting, failing at, and accomplishing projects that you're motivated by. Even if Dijkstra is right about "radical novelty", you obtain a certain amount of mechanical sympathy just by fighting with the computer and learning its needs.

The "Purpose-First Programming" research [1] that this article's submitter also submitted today seems relevant.

[1] https://news.ycombinator.com/item?id=25256874


I believe you misunderstand how the process of learning occurs. If you really want to deeply understand something, it's a whole process of connecting up existing facts and schemas in your brain. That happens more smoothly over time by processes like seeing metaphors. A good teaching metaphor always involves a description of how the metaphor fails. Learning from metaphors is not a weakness, nor is it the only plank to build a house of knowledge on. Eventually, you will need to understand the mechanics too, but there's nothing wrong with starting with metaphor, or examples, or visualizations, or interactive simulations, or any other teaching scaffold.

Also, don't gatekeep the field, unless you have a citation that proves that modern teaching techniques have really led to a decrease in quality of software engineers. I really doubt you have any concrete measure you can point to that won't be anecdotal, but if you have one, I am interested in hearing why you believe that folks have gotten worse over time because we got better at teaching.


> Also, don't gatekeep the field, unless you have a citation that proves that modern teaching techniques have really led to a decrease in quality of software engineers.

I don't think modern teaching techniques have. I think the profession has done admirably (though not well enough) at cultivating the basics in more people than ever managed it before.

But when you multiply the size of the profession by so much, even if teaching improves greatly, odds are you're including a lot more volatility in the amount of ability/intrinsic interest/talent.

There are people making it today who'd never have been good enough 20-40 years ago. We need a bunch more programmers and software engineers now, so we tolerate it. And thankfully, improvements in tooling, debugging, and instrumentation, along with a bunch more available performance and storage, limit the harms.


From your comment history, you appear to be shadow banned. A perusal of your dead comments does not immediately reveal why.



Is it really the method of teaching? Or is it that good-enough, quickest-to-market, and diverse programs often win in a capitalistic market?

Capitalism is all about finding new opportunities. Nothing is static. Pedagogy or otherwise.


.


Thank you, I guess. I've been a practicing software engineer for over 40 years now, and watching the decline of my profession over the decades is quite painful.


This kind of confusion happens in newspapers and other corporate media so often that one is left to wonder if it is intentional. After all, fear porn sells.


The economy is NOT a social construct, fabricated or not. Economic laws are direct consequences of physical laws (e.g. scarcity is the result of Fermi statistics, which makes material things take up space, and of the no-cloning theorem, which means material things cannot just pop into existence because somebody wished for them), or otherwise rather tautological mathematical statements (for example, the first welfare theorem or the law of comparative advantage). The economy doesn't require humanity or consciousness to exist; it's a resource allocation and optimization algorithm.

So are the political means of acquiring wealth - this is known as intra-species aggression and is common to all social species; it is another, older, and less efficient algorithm for resource allocation.

Basically, every time politicians and their enforcers barge in to "fix" things, they break the efficient algorithm (aka "market economy") and replace it with the older one (aka "the biggest ape gets to boss others around"). And, yes, it doesn't make any economic sense. "Bailouts" are nothing more than taking from many people by force and giving to a few politically-connected ones. It's just an indirect form of force: instead of simply looting, they print more money - but everybody else is prevented by force from doing the same, so they have no choice but to watch the value of their money decline and be transferred to the recipients of the new money.
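To make the "tautological mathematical statements" point concrete, here is a toy comparative-advantage calculation (all goods and numbers invented for illustration):

    #include <stdio.h>

    /* Toy comparative-advantage arithmetic; all numbers invented.
       A can make 8 fish or 4 coconuts per day; B can make 2 fish or 4 coconuts. */
    int main(void) {
        /* Self-sufficiency: each splits the day evenly between the two goods. */
        double fish_self = 8 * 0.5 + 2 * 0.5;  /* 5.0 fish     */
        double coco_self = 4 * 0.5 + 4 * 0.5;  /* 4.0 coconuts */

        /* Specialization: A fishes (opportunity cost 0.5 coconut per fish),
           B gathers coconuts (opportunity cost 0.5 fish per coconut). */
        double fish_spec = 8.0;
        double coco_spec = 4.0;

        printf("self-sufficiency: %.1f fish, %.1f coconuts\n", fish_self, coco_self);
        printf("specialization:   %.1f fish, %.1f coconuts\n", fish_spec, coco_spec);
        /* Same coconuts, three extra fish for trade to divide between A and B. */
        return 0;
    }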


A whole lot of wild claims in that first paragraph without so much as a mote of evidence to back them up; some of it boils down to "because we can't magic things, thus scarcity."


I don't think you are prevented by force from printing your own money; you're just prevented from printing dollars.


I have to disagree. The economy is the interface between producers and consumers of resources. It is very much a social construct, and therefore a subject of political action.

Importantly, the idea that state intervention is bad for the economy as a universal rule is simply wrong. In the long run, history clearly teaches that mixed economies work (e.g. a la Keynes), while laissez-faire (e.g. a la Friedman) or centralized economies (e.g. a la Communist) don't. Striking the right mix requires society's input in the form of political debate.

We desperately need this to happen now in the West. While I have nothing against Jeff Bezos as a person, billionaires simply should not be allowed to exist. They are the symptom of a « Laissez Faire » system in a terminal state.


Jeff Bezos became a billionaire by making the lives of Amazon’s customers better. Sam Walton did more to improve the lot of poor families than most government programs.

(Those guys at least built something. Why does no one complain about America's grandpa Warren Buffett? He just shuffles money around, plays with warrants and options (while telling others not to), and makes a couple hundred million by going to Congress and talking his book.)


Shuffling money around is absolutely an important part of the economy. It moves resources from unproductive or low-productivity companies to more productive ones. The name for this activity is "capitalism".

One could argue that Warren Buffett should pay higher taxes. And he would agree! But unless you find capitalism itself unethical, there is nothing wrong or underhanded with searching for undervalued firms and investing in them.


I agree in general.

My biggest complaint is that instead of paying higher taxes, Warren Buffett could maybe not jawbone Congress into bailing out AIG. AIG should have gone bankrupt, and Goldman Sachs (and their other counterparties) should have borne that pain, and possibly gone bankrupt themselves. Instead, Warren Buffett refuses to come to the rescue of AIG, comes to the rescue of GS, and then tells Congress that the economy will freeze up if AIG and other banks aren't bailed out. That is not efficient allocation, that's crony capitalism.


Any private citizen has a right to jawbone Congress. By itself, that isn't a problem.

What is a problem is that congressional representatives are enormously influenced by donations to political campaigns. Even if they lose reelection, there's a whole industry in place to put them into cushy jobs if they've followed their party line. And the party line is to do what the campaign donors want, not necessarily what the constituents want. That is a problem.


> In the long run, history clearly teaches that mixed economies work (e.g. a la Keynes)

History teaches no such thing. History teaches that governments messing with the money supply, which is the centerpiece of Keynesian economics, leads to civilizational collapse. The Roman empire being a prime example.

> centralized economies (e.g. a la Communist) don’t

Government messing with the economy by manipulating the money supply is just a different form of central planning, and doesn't work for the same basic reason that Communist central planning in the Soviet Union didn't work.


> History teaches that governments messing with the money supply ... leads to civilizational collapse.

I always had an impression that all the money everywhere was always something issued by the government and its supply effectively controlled by the governments: even many centuries ago there were laws that forbade forging the money, even at the times when the money was made of gold. The way I understand it, the worth of money was seldom, even in old times, the weight of the metal. Otherwise nobody would have cared about the forgeries?

So the question is, is there any historical example of money disconnected from the issuer's "messing" for long enough to even be able to argue there is such a thing?

If I understand the more modern thinking of economists, it's not that the money in older times functioned differently (that there wasn't a "trust" component), it's just that there were limits due to the dependence on metals availability. But the economy was never "just a sum of the issued money".


> the economy was never "just a sum of the issued money"

Of course not; the economy is all economic transactions, not just the quantity of money.


> I always had an impression that all the money everywhere was always something issued by the government

No. Government having a monopoly on the issuance of money is a relatively recent development.

> even many centuries ago there were laws that forbade forging the money

That forbade forging/counterfeiting money issued by the government, yes. But not issuing different money altogether.

Historically, government-issued money has not always been available everywhere people wanted money, so other forms of money would be created to fill the gap. It is only in recent history, with the advent of paper (and later electronic) fiat money, which removes all practical limits on how much of it governments can create, that government issued money has become available everywhere to the point where the incentive to create private forms of money to fill a gap is basically gone.

> The way I understand it, the worth of money was seldom, even in old times, the weight of the metal.

The worth of the money was always supposed to be based on the weight of the precious metal, such as gold, that it contained.

However, once you've gotten everybody to believe that, say, all your gold coins contain some standard quantity of gold, which determines their value, the temptation is irresistible, judging by the historical record, to then secretly debase the coins by substituting a much cheaper metal for gold for part of them, so the actual weight of gold in them is less than the standard amount, but still telling everybody that they're the standard gold coins so they will continue to be accepted at the same value, based on the standard weight of gold instead of the debased weight.

Of course, such a secret never actually keeps for very long, and once word gets out that you're debasing your coins, their actual value in the marketplace goes down. Or, to put it another way, prices in terms of your coins go up--it takes more of your coins to buy the same goods and services. This is why monetary debasement always leads to price inflation.

> Otherwise nobody would have cared about the forgeries?

Governments always care about forgeries because they don't like competition in the business of issuing money, since that business is a source of revenue (look up the term "seigniorage").

> is there any historical example of money disconnected from the issuer's "messing" for long enough to even be able to argue there is such a thing?

The incentive to debase the money you are issuing, once you've gotten everyone to accept it at a certain value, is certainly not limited to governments. Fractional reserve banking comes from the same incentive: if you've gotten everybody to believe that, say, every gold note your bank issues is backed by a standard quantity of gold in your bank's vault, the temptation is irresistible, judging by the historical record, to then start secretly issuing more notes than you have gold, but still telling everybody that your notes are all 100% backed by gold, so they will continue to be accepted at the same value. Of course this never works for too long either, and since it's easier to print paper notes than to mint debased coins, inflation with debased paper money tends to happen faster and be more of a problem than inflation with debased coins.

So the difference between a private issuer of money and the government issuing money is not that the latter will debase money but the former won't. The difference is that if one private issuer's money gets debased, but there are multiple different private issuers, such as banks, people can simply stop using the money from the bank that debases it. So in a private, competitive situation, there is actually an incentive to not debase your money, which can counteract the incentive I described above towards debasement. In short, there is a limit to how bad things can get if private money issuance is allowed.

But if the government debases the money, while it's also enforcing a monopoly on money issuance (and also, most likely, requiring things like taxes to be paid in the money it issues), there is no other money people can choose, so an entire economy can be ruined.
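A toy version of the note-issuance arithmetic described above (all numbers invented): how many notes can circulate against a fixed stock of gold as the reserve ratio falls.

    #include <stdio.h>

    /* Toy fractional-reserve arithmetic; all numbers invented. */
    int main(void) {
        double gold_oz = 100.0; /* what is actually in the vault */
        double reserve_ratios[] = {1.0, 0.5, 0.1};

        for (int i = 0; i < 3; i++) {
            /* Notes redeemable for this many ounces can be put in circulation. */
            double notes_oz = gold_oz / reserve_ratios[i];
            printf("reserve ratio %3.0f%% -> notes for %5.0f oz circulate\n",
                   reserve_ratios[i] * 100.0, notes_oz);
        }
        return 0;
    }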


> if one private issuer's money gets debased, but there are multiple different private issuers, such as banks, people can simply stop using the money from the bank that debases it.

The problem is, a big part of the value in the economy never gets to be an explicitly "issued" money, and that was so even many centuries ago. So your belief in the magic of the "hard" issued money appears to me to be just wishful thinking, even without the historical support since the earliest times. Different forms of debts are older than any precisely defined issued money, and all the forms of debts are what actually drove the economies.

Most famously, the period of democracy in Athens started only after Solon abolished the accumulated debts, which says a lot about the dynamics of their economy in all the time before that point was reached.

It appears to me that sooner or later the current world will have to accept that we need a Solon-like act to end the period we're in.


> a big part of the value in the economy never gets to be an explicitly "issued" money

For an appropriate definition of "value", yes, I'll agree with this.

However, I don't see how it conflicts with anything I said.

> Different forms of debts are older than any precisely defined issued money, and all the forms of debts are what actually drove the economies.

Debt certainly plays a role in how economies develop. However, again, I don't see how that conflicts with anything I said.

> the period of democracy in Athens started only after Solon abolished the accumulated debts

First, I don't think historians are in agreement about exactly what Solon did and whether it amounted to abolishing all accumulated debts.

Second, while many historians credit Solon with laying the foundations for Athenian democracy, it did not actually start until well after Solon's death, after a considerable period of autocratic rule following the seizure of power by Peisistratos. So even if we grant that Solon abolished all accumulated debts, I don't see any clear connection between that and democracy.

> sooner or later the current world will have to accept that we need a Solon-like act to end the period we're in

Meaning, abolish the accumulated debts? That's basically what the US government is doing to its debt by continuing to print money; since all of the US government's debt, and pretty much all of US private sector debt, is denominated in dollars, printing more dollars dilutes the impact of all that debt on the debtors. It's not clear to me that this is a good thing. Of course, it's also not clear to me that continuing to accumulate debt is a good thing. But while the current financial and economic system certainly has plenty of issues, I don't see how "abolish all accumulated debts" is a good way to fix them.


Err... Keynes' « centerpiece » is not manipulation of the money supply. This extreme monetarism you describe is... Friedman's!!! And boy do I agree it does not work!

Ultra-liberal economics leads to the accumulation of wealth by a few super-rich. This happens to be one of the reasons the Roman Empire fell: all this money could not be spent on keeping the army in working condition. The super-rich of the time hoarded the necessary gold for themselves.

But don’t put that on Keynes’ tab. It’s twistedly wrong.


> Keynes' « centerpiece » is not manipulation of the money supply.

Yes, it is. Keynes said at one point that the government printing dollar bills and burying them for people to dig up would be a valid form of economic stimulus.

> This extreme monetarism you describe is... Friedman’s!!!

Milton Friedman did not advocate anything like the arbitrary manipulation of the money supply by the government that is practiced by the Federal Reserve. (He was in fact a consistent opponent of the Federal Reserve; he thought it should be abolished.) He advocated tying the growth rate of the money supply to the growth rate in real productivity, with no discretionary element at all.

> Ultra-liberal economics leads to the accumulation of wealth by a few super-rich

No, the government printing money and giving it to a few favored parties, which is exactly what the Federal Reserve does and always has done (the favored parties have almost always been financial institutions, no surprise given the background of most Fed members), leads to accumulation of wealth by a few super riches.

> This happens to be one of the reasons the Roman Empire fell: all this money could not be spent on keeping the army in working condition.

I don't know where you're getting your history from, but it's wrong. The reason the empire had trouble paying for a competent army was that it had debased the coinage so much that people were refusing to take Roman coins as payment. It even got so bad that the Roman government itself, knowing that its coins were worthless, stopped accepting them in payment of taxes and insisted on payment in kind instead.

The emperor Constantine managed to get hold of a large supply of gold bullion and used it to mint new gold coins, which were accepted in payment of taxes. That helped for a while as the empire could use these to pay the army and the civil servants; but many people could not afford to buy the new gold coins and therefore could not pay their taxes, so the improvement didn't last too long.

It also didn't help that the size of the army and civil service kept increasing, for no tangible reason, which just increased the tax burden.


> Yes, it is. Keynes said at one point that the government printing dollar bills and burying them for people to dig up would be a valid form of economic stimulus.

This does not make it the centerpiece. The centerpiece of Keynesian economics is that state intervention is necessary to moderate the booms and busts in economic activity.

> Milton Friedman did not advocate anything like the arbitrary manipulation of the money supply by the government that is practiced by the Federal Reserve.

Sorry, but clamping the growth of money supply with a k-rule, irrespective of the market cycle, is the arbitrariest manipulation I can think of.

> No, the government printing money and giving it to a few favored parties (.../...)

We can agree there that governments giving money to the already super wealthy is a bad thing. But why do they do that? Because those same super wealthy have become so wealthy they can influence government.

Which brings me to your case on Rome. The reason why Roman emperors debased their coins was that they had emptied their coffers on other things yet /needed/ to keep the army strong for what was essentially a looting regime to survive. At the end of the day, Emperors are not governments; they are another form of super-rich protecting their short-term positions at the expense of longer-term stability.


> At the end of the day, Emperors are not governments

Ah, our old friend No True Scotsman. Sorry, not buying it.


> The centerpiece of Keynesian economics is that state intervention is necessary to moderate the booms and busts in economic activity.

Not just "state intervention", but state intervention of a particular kind, namely, manipulation of the money supply. Keynes' basic theory was that in a depression like the Great Depression, the problem was that there wasn't enough money in circulation, because people would not simply let both prices and wages fall in order to establish a new equilibrium between supply and demand. So, he said, the solution is to simply print more money.

> clamping the growth of money supply with a k-rule, irrespective of the market cycle, is the arbitrariest manipulation I can think of

I personally don't favor Friedman's solution either; I don't think the government should be manipulating the money supply at all. But to call determining the money supply by a known objective rule "more arbitrary" than determining it by the whim of regulators does not strike me as a sound use of language.

> We can agree there that governments giving money to the already super wealthy is a bad thing. But why do they do that? Because those same super wealthy have become so wealthy they can influence government.

No, you have it backwards. The super wealthy influencing government comes before the government giving money to the super wealthy, not after.

In the case of the Federal Reserve, it came into being because the super wealthy got tired of the government coming to them for loans to bail it out every time there was a financial panic. Why did the government come to the super wealthy for loans? Because the super wealthy had already gotten favors from the government through influence--for example, monopoly privileges over transcontinental railroad routes in order to outlaw free market competition. So naturally the government would come to the super wealthy expecting a quid pro quo.

Once this had happened a few times, the super wealthy figured out a better (from their perspective) solution: create a system of central banks that would allow them to transfer wealth to themselves directly, stealthily, by manipulating the money supply, instead of having to openly go to the government for favors. And sell this system to the public as a means of "preventing" future financial panics. The Panic of 1907 presented a perfect opportunity to put this system in place, and by 1913, the Federal Reserve system was law.

> The reason why Roman emperors debased their coins was that they had emptied their coffers on other things yet /needed/ to keep the army strong for what was essentially a looting regime to survive.

Governments always spend more money than they have, on white elephant schemes that do not benefit society. So of course governments are always looking for ways to avoid having to face the consequences.

But that is a very different claim from the claim you originally made, and I rebutted, that the Roman empire fell because the ultra rich were hoarding all the wealth. In fact the Roman emperors appropriated the wealth of the ultra rich pretty much the same way they appropriated the wealth of everybody else. The reason for the shortage of real wealth in the later Roman empire was not that it was being hoarded; it was that it was being squandered and destroyed by ruinous public policy.


Of course, people DID build ternary computers.

The most notable of these is Setun' (I had a chance to play with it at Moscow State University).

https://en.wikipedia.org/wiki/Setun
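For context on what a machine like Setun computes with: balanced ternary uses the digits {-1, 0, 1}. A minimal conversion sketch, illustrative only ('T' stands for the digit -1):

    #include <stdio.h>

    /* Print n in balanced ternary, most significant digit first.
       'T' stands for the digit -1. Illustrative sketch only. */
    static void print_balanced_ternary(int n) {
        char buf[40];
        int len = 0;
        if (n == 0) { printf("0\n"); return; }
        while (n != 0) {
            int r = n % 3; /* in C, r takes the sign of n */
            n /= 3;
            if (r == 2)  { r = -1; n += 1; } /* 2 = 3 - 1, carry up  */
            if (r == -2) { r = 1;  n -= 1; } /* -2 = -3 + 1, borrow  */
            buf[len++] = (r == -1) ? 'T' : (char)('0' + r);
        }
        while (len--) putchar(buf[len]);
        putchar('\n');
    }

    int main(void) {
        print_balanced_ternary(8);  /* 10T = 9 - 1      */
        print_balanced_ternary(-5); /* T11 = -9 + 3 + 1 */
        return 0;
    }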


Setun is the only example, and it had a binary memory.


There was also the Canadian QTC-1. [0]

[0] https://jglobal.jst.go.jp/en/detail?JGLOBAL_ID=2009020829793...


This is ROM only, not a fully functioning computer.


Head to the end of the paper about the ROM [0], and you'll find the QTC-1 was the computer said ROM was designed for, not the ROM design itself.

[0] https://wwwee.ee.bgu.ac.il/~kushnero/ternary/Using%20CMOS%20...


AFAIK it was never built, only partially designed.


You still don't get accurate data if you just count bodies.

The issue here is every death is attributed to COVID-19 if the virus is present even if the actual reason is some other underlying health condition. Given that most victims are elderly and had serious pre-existing conditions, this leads to massive over-estimation of the mortality rate.

You also cannot reliably compare EU/US data with Chinese data: first, most deaths and cases of pneumonia in China were initially attributed to other illnesses. Secondly, the Chinese government is notorious for flat-out lying in its official stats.


> The issue here is every death is attributed to COVID-19 if the virus is present even if the actual reason is some other underlying health condition.

What you claim provably doesn't happen. Even old people simply don't die that fast otherwise, and we also have the data on people who can't breathe and have to be admitted to hospitals: there were never that many cases before, happening that fast:

https://en.wikipedia.org/wiki/File:Is_COVID-19_like_a_flu%3F...

(the numbers were compared by the Italian National Biotechnology Association, each bar is one week of time)


Not really. When programming was done by real engineers (rather than "coders", "developers", or horribly misnamed "software engineers") - with proper engineering discipline, which involved deliberate design and documentation rather than "agile" hacking - it was more reliable. By far.

My personal recent experience of actually doing software the old-fashioned way involved writing correctness-critical high-performance code for a major data warehouse vendor, which mostly stayed with zero known bugs in production - greatly contributing to the vendor's reputation as a reliable and dependable place to keep your data.

And how do I know what the old-fashioned way to write code is? Well, I've been writing code professionally for nearly 40 years.


> it was more reliable. By far

[talking about non-software engineering] There's a huge difference between reading books about proper engineering and how it is actually practiced. Much of the sloppiness is covered by simply over-engineering it. Designs are constantly improved over time based on service experience. Heck, lots of times the first machine off the line cannot even be assembled because the design dimensions on the drawings are wrong.

The idea that non-software engineering is done by careful professionals following best practice and not making lots of mistakes is painfully wrong.


As an ex-machinist, I can confirm that bad drawings are a thing.

But then, non-software engineering is a wide field. There are products that you can afford to iterate on, and then there are very expensive (and potentially very dangerous) projects where you generally can't afford many slip ups.

If your engineers make lots of mistakes (which aren't caught in time) in a project that costs millions and can't be replaced by another unit off an assembly line, that's kind of a big deal. Thankfully, we don't hear about bridges, skyscrapers, heavy industrial lifts or nuclear power plants failing all that often.


The things you mention are over-engineered by wide margins to cover for mistakes.

The first nuke plants are pretty dangerous by modern standards. We know how to fix their flaws, but because the first ones are dangerous we are not allowed to build the fixed ones.

The Fukushima plant, for example, had numerous engineering faults leading to the accident - faults that would have been easily and inexpensively corrected if we were iterating.

Airplanes are a good example of how good things can get if you're allowed to iterate and fix the mistakes.


The more I learn about modern software practices, the more I come to think that software is worse today because programmers are poorly disciplined and badly trained. Of course, that's likely because the barrier to entry is so much lower today than it used to be.


The barrier to entry for software has been zero ever since the 8080 was introduced.


Back then there were a lot fewer Stack Overflow copy & paste mistakes.


There were also far fewer users to really discover the nasty edge case bugs.


> There were also far fewer users to really discover the nasty edge case bugs.

If you consider why a company makes something reliable or not, it's a relatively simple formula:

    Expected Value = Number of Users x ((Benefit of Getting it Right x Probability of Getting it Right) - (Cost of Getting it Wrong x (1 - Probability of Getting it Right)))
As the number of users in any system increases, the overall cost of getting it wrong also increases. You can then devote more fixed-cost resources to improving the probability of getting it right.
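Plugging invented numbers into that formula, as a sketch of how the trade-off scales with user count:

    #include <stdio.h>

    /* Expected value of shipping, per the formula above; all numbers invented. */
    int main(void) {
        double benefit    = 2.0;   /* value per user when it works    */
        double cost_wrong = 50.0;  /* cost per user when it fails     */
        double p_right    = 0.999; /* probability of getting it right */

        for (double users = 1e3; users <= 1e9; users *= 1e3) {
            double ev = users * (benefit * p_right
                                 - cost_wrong * (1.0 - p_right));
            printf("%12.0f users -> expected value %15.0f\n", users, ev);
        }
        return 0;
    }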


Old fashioned way?

You mean all that software written in the '90s with no security in mind?

I bet you're talking about outliers. Nowadays average developer practices are levels above what they were decades ago, and they're supported by great tooling.


I feel like there are two separate things going on here.

You can have an extremely reliable piece of software running, say, an industrial lathe or a stamper or a book printer or whatever - software which can run 24/7 for years if not decades, software which will never leak memory, enter some unknown state, or put anyone in harm's way - and yet have zero "security", because if you plug in a USB keyboard you can just change whatever and break it entirely. Software which has no user authentication of any kind, because if you are on the factory floor that means you already have access anyway, because the authorization step happens elsewhere (at employee gates etc).

It's like people making fun of old ATMs still running Windows XP because they're "not secure". If the machine isn't connected to the internet, reliability is far more important - who cares that Windows XP is not "secure" if the ATM can run constantly for years and reliably dispense money as instructed, and there isn't a remote way to exploit it.

I feel like that first kind of software (the reliable kind) is far rarer today - people just throw together a few python packages and rely on them being stable, without any kind of deeper understanding of how the system actually works, and they call themselves software engineers. The "security" part usually also comes as a side effect of using libraries or tools which are just built with "security" in mind, but without a deeper understanding of what truly secure software entails.


It really depends on the scenario. When I wrote software for load balancing phone calls, it was minimal, had a well-defined state machine, passed all the static testing I could throw at it, etc. At the same time, I wrote some crappy web service code which could fail and get retried later, because nobody would see that. If the worst thing that can happen is that one in a million visitors will get a white page, it doesn't economically make sense to do better. Even if you know how and have the tools.
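For flavor, a minimal sketch of what "a well-defined state machine" means here; the states, events, and transitions are invented for illustration, not the actual telephony code:

    #include <stdio.h>

    typedef enum { IDLE, RINGING, CONNECTED, HUNG_UP } State;
    typedef enum { INCOMING, ANSWER, HANGUP } Event;

    /* Every (state, event) pair has a defined result; anything else is a no-op. */
    static State step(State s, Event e) {
        switch (s) {
            case IDLE:      return (e == INCOMING) ? RINGING   : IDLE;
            case RINGING:   return (e == ANSWER)   ? CONNECTED :
                                   (e == HANGUP)   ? HUNG_UP   : RINGING;
            case CONNECTED: return (e == HANGUP)   ? HUNG_UP   : CONNECTED;
            case HUNG_UP:   return IDLE; /* always recover to a known state */
        }
        return IDLE;
    }

    int main(void) {
        State s = IDLE;
        Event script[] = { INCOMING, ANSWER, HANGUP };
        for (int i = 0; i < 3; i++) s = step(s, script[i]);
        printf("final state: %d\n", s); /* 3, i.e. HUNG_UP */
        return 0;
    }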


I would find it very hard to bring myself to do that. I won't knowingly write incorrect code even if the chance of failure is very small. Luckily I don't have a boss breathing down my neck telling me not to waste time.


I don't think anyone knowingly writes incorrect code. But you can spend between zero and infinite time thinking about whether the code is correct. At infinity you never release, so even without a boss you need some reasonable time limit. If this is your non-hobby work, you need to decide when to stop looking and accept the potential issues.


Just don't mention the elephant in the room: the Cantillon effect, which is the primary reason for the wealth flowing from working classes and savers to the bankers and the managerial class. Entirely courtesy of artificially low interest rates created by central banks and lax controls on monetary emission (i.e. fractional reserve shenanigans) by private banks.


The Cantillon effect describes a phenomenon of relative inflation due to the uneven distribution of new money and access to credit.

This doesn't really translate to "a flow of wealth from working classes and savers to the bankers and the managerial class". Rather, the impact on inequality is that it reduces the purchasing power of those not benefiting from the increased supply of money and credit. As these tend to be the poorest individuals in society, inequality is made worse.


>artificially low interest

Interest rates are driven by the supply and demand of credit. Supply outstrips demand now.

There are two sides to every transaction; low rates are good for borrowers and bad for lenders. What makes you think the lenders are entitled to a greater return on their savings? Do you think we should force people to borrow at higher rates for this purpose?

>wealth flowing from working classes and savers to the bankers and the managerial class.

The working class in America are debtors and have no savings. Outside of low rates contributing to driving housing prices higher in some communities, how are the working class harmed by lower payments on their debt?


> Interest rates are driven by the supply and demand of credit. Supply outstrips demand now.

While that's somewhat true, it's also largely dictated / controlled / heavily influenced by government. This means the overnight lending rate, U.S. bond rate, etc.


You're both right. The Fed is a lender (to banks only) and carries out what they call "open-market operations" with the goal of enacting policy and not making money. They throw around enough money to skew the market.


>This means the overnight lending rate, U.S. bond rate, etc.

The overnight lending rate is set by the Fed, yes.

Treasuries are sold in the market. Although an initial auction price is set, the rates will fluctuate based on demand for the bonds.

I don't deny the Fed are a major influence on rates, as it's a major component of their mandate now. However, the market can "agree" or "disagree" with those rates and set corresponding rates however they choose.


But you're missing the key part. Sometimes, if the Fed sets rates too low and there's not enough demand for the bonds, the Fed buys the bonds itself, thus keeping interest rates artificially low.


>Sometimes, if the Fed sets rates too low and there's not enough demand for the bonds, the Fed buys the bonds itself, thus keeping interest rates artificially low.

Yes, it's how the Fed conducts monetary policy. Can you name the last time that US treasuries were under-subscribed? Greek bonds have lower rates than US treasuries; which would you rather own? On a relative basis, how can one claim that US interest rates are "too low"?


> Can you name the last time that US treasuries were under-subscribed?

Yes, a couple of months ago.


> The working class in America are debtors and have no savings.

There are plenty of working-class people who avoid debt and save money. Why should those people, who are acting responsibly, lose out on savings interest? We should be encouraging people to save, not making it cheaper to go into more debt.


Interest rates are set by the Fed, which can print an arbitrary amount of money out of thin air; there is no supply/demand mechanism involved in setting them. Basically, every rate change is an experiment testing whatever monetary theory is currently popular among Fed board members.


>Interest rates are set by the Fed, which can print an arbitrary amount of money out of thin air

One interest rate is set by the Fed, which serves as a benchmark for other market rates.

But it's a simple question: if I can borrow money at 3%, why would I borrow your money at 7% so you can earn a return? And if someone wants to lend me money at 3%, why is that "artificial"?

>which can print an arbitrary amount of money out of thin air

How else should money be created? Should we do pretend mining, like Bitcoin?


> One interest rate is set by the Fed, which serves as a benchmark for other market rates.

This used to be true, but lately CBs are also buying bonds. That affects their supply/demand balance, which affects their price, which is another way of expressing the interest rate.

Source: spent time trading bonds.
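A toy illustration of that last point - price and yield being the same fact stated two ways (all numbers invented): for a one-year zero-coupon bond, paying more now means earning less.

    #include <stdio.h>

    /* Toy one-year zero-coupon bond; numbers invented. */
    int main(void) {
        double face = 100.0;
        for (double price = 90.0; price <= 99.0; price += 3.0) {
            double yield = face / price - 1.0; /* pay price now, get face in a year */
            printf("price %.2f -> yield %.2f%%\n", price, yield * 100.0);
        }
        /* A central bank buying bonds bids the price up, which pushes the yield down. */
        return 0;
    }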


>but lately CBs are also buying bonds.

Monetary policy, and the setting of rates, is accomplished by the buying and selling of bonds in the open market by the Fed. They buy bonds and create money, or sell them to destroy it. This affects the amount of money "available" in the system, which affects interest rates.

Of course, this transmission mechanism isn't perfect.


>Should we do pretend mining, like Bitcoin?

Real mining seemed to work okay in the past. American GDP grew faster in the 1800s under the gold standard (avg. 4%+) than any time after the creation of the federal reserve.


> American GDP grew faster in the 1800s under the gold standard (avg. 4%+) than any time after the creation of the federal reserve.

Yes... during industrialization. Basically all countries experience rapid GDP growth during their industrialization. Even developing countries today get 4%+ GDP growth. Look at China's GDP growth in the last 50 years for a recent example.


You would still have to prove to me that the rate of gold mining is the perfect tool to prevent inflation and grow the economy. Who knows whether it would have grown faster or slower during that period without the gold standard.


>Real mining seemed to work okay in the past.

For the purpose of "creating money", it's a waste of resources.

>American GDP grew faster in the 1800s under the gold standard (avg. 4%+)

I don't want to go back to that period.


Why stop at plants? If someone chooses to take whatever substance, it's no one else's business (unless others have to pay for the medical treatment afterwards... but that is easily solved by NOT making others pay). The actually dangerous drugs - the ones (SSRIs, notably) which have suicide as a side effect, sometimes resulting in people going on murderous rampages - are quite legal and widely prescribed. Go figure.


How do you stop paying for medical outcomes? And why should we care?

As part of insurance pools, today I pay for extreme sports injuries, alcohol and tobacco related chronic problems, outright stupidity, and obesity related issues.

No one seems to balk at that; why should THC or magic mushrooms be fundamentally different? Even if they “should” be, how could they be?


I agree with you in the sense that refined sugar in processed foods should be highly regulated, as society pays an enormous cost for it.


No, LinkedIn works to establish a network of professional connections which stays with you even after you've got a job.

Now, I'm not sure if these are of any worth (I never got anything useful out of LinkedIn... either people trying to sell me something, or colleagues I know already).

I've heard of people (of the gold-digger kind) using LinkedIn as a dating site.


Because if homosexual behavior is genetically determined (or inalterably biologically determined), the gays can argue that they are being discriminated against for what they ARE, rather than for what they CHOSE to be. There's a large number of laws which prohibit voluntary behavior of one sort or another even if there are no direct victims (drug use would be one example), so the proponents of restricting homosexuality would find it easier to argue that there's an overriding societal concern which justifies restricting this voluntary behavior. (The existence of laws creating "victimless crimes" is an accepted part of modern legal systems, sadly, and no one other than hard-core libertarians actually opposes the concept.)


If we use that as a reason, shouldn't we study why some people are attracted to people of another race, since there are many religious people who still think interracial marriage is wrong and it was illegal in certain states as late as the '60s?

We decided as a society to ignore those people because they were wrong. We shouldn't need "studies" to get the hell out of people's bedrooms. Slowly, we are changing as a society. But studies won't convince people who believe that not only are "the gays" going to hell but that they are the cause of all of society's ills and natural disasters.

