Hacker News | ReflectedImage's comments

Nah, that's just a lack of understanding of the role of unit tests in dynamically typed languages.


Elsewhere in this thread, dynamic typing advocates malign the hassle of maintaining types, and it is always coupled with strong advocacy for an entire class of unit tests I don't have to write in statically typed languages.


And that's the problem: if you want your code to actually work, you do need to write those unit tests. A program not crashing doesn't mean it does the right thing.

With experience you will learn to either write unit tests or spend the same amount of time doing manual testing.

Once you start doing that, the unit tests just replace the static typing and you start shipping better code to your customers.


This always feels like a bad faith argument. Nobody says that with static types, you don't need any unit tests.

And your suggestion that people who like static types "don't know how to write unit tests" is further bad faith.

Perhaps it's dynamic typing programmers who don't know how to write sound programs? Except I'm not making that claim, because I'm giving you all some benefit of the doubt, a degree of respect you are not giving others.


Static typing doesn't have much value if there are proper unit tests. So it's fairly obvious that if people think there is value in static typing then they are shipping broken code to their customers.

It's called ratting yourself out.


> Static typing doesn't have much value if there are proper unit tests

Wasteful unit tests that assert your types are right don't have much value if there is a proper type system.

> It's called ratting yourself out.

Quit being childish.


"Wasteful unit tests that assert your types are right"

You don't test whether the types are right, you test if your code actually does the right thing. That's what's important to your customers.

The types getting tested is incidental.
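The claim above can be sketched with a small hypothetical example: a plain behavioral test in Python that never mentions types, yet would still fail if the function returned the wrong kind of value. (`apply_discount` is an invented function for illustration.)

```python
def apply_discount(price, percent):
    """Return the price after a percentage discount, rounded to cents."""
    return round(price * (1 - percent / 100), 2)

def test_ten_percent_off():
    # Asserts behavior, not types -- but if apply_discount returned
    # a string or None instead of a number, this would fail anyway.
    assert apply_discount(100.0, 10) == 90.0

test_ten_percent_off()
```

The type check falls out as a side effect of checking the value.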


Gradual typing is the worst of both worlds.

You get the complexity and slower development times of using statically typed languages along with the bad performance of using dynamically typed languages.


Is this based on your experience or is it just an assumption? I only have anecdotes, but it does not reflect your claims, rather the exact opposite. A lot of the boilerplate code doesn’t need to be type annotated, but annotating the main business logic doesn’t take more time and is not more complicated, but instead type annotations help write code that is more clear, more obvious, and it adds some kind of documentation.


It really depends on how you tackle gradual typing at the project level. The easiest way to sabotage it is an "any new code must be fully type checked" requirement, because it often means you also need to add type hints to any code you call, which leads to Optional[Union[Any]] nonsense if you let juniors (or coding assistants) go wild.
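A sketch of the failure mode being described, with invented function names: an over-broad hint that satisfies a checker while carrying no information, next to one that actually says something.

```python
from typing import Any, Optional, Union

# Over-annotated "gradual" typing that appeases the checker
# without telling the reader anything useful:
def load_config_bad(path: Optional[Union[str, Any]]) -> Any:
    ...

# The same signature with a hint that carries real information:
def load_config(path: str) -> dict:
    ...
```

The second version is what well-managed gradual typing aims for; the first is what a blanket "everything must be annotated" rule tends to produce.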

As always, no fancy new tech is a substitute for competent project management.


Have you actually used this in a real codebase? Because it is the opposite of my experience in gradually adding types to a large python codebase. There's no extra complexity or slower development. It's not like you need to suddenly move to a different coding paradigm with FactoryBeanFactoryBeans... You just keep writing python like you did, but add types here and there to help clarify things and make your LSP (like ty) work better.

If anything, it speeds up development. Plus it helps give more confidence in the correctness of your code.
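A minimal sketch of what "adding types here and there" looks like in practice (hypothetical function names): the body is unchanged, only the signature gains hints, and a checker or LSP can now flag misuse before the code runs.

```python
# Before: works, but callers have to guess what it accepts.
def total(items):
    return sum(item["price"] for item in items)

# After: same body, same behavior, but now mypy or an LSP can
# reject total_typed("oops") at check time instead of runtime.
# (list[dict[str, float]] is one plausible choice of hint.)
def total_typed(items: list[dict[str, float]]) -> float:
    return sum(item["price"] for item in items)
```

This is the incremental style: no paradigm shift, just annotations layered onto existing Python.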


Yep, the software development slows down to a crawl. Yes, you can still code at the same speed as you were coding in a language like Java or C#, but that is considerably slower than what's possible in languages like Ruby and Python.

To give you a rough idea, you should always expect a 3x slowdown when using static typing.

A recent example is Turborepo, which went from basic types in Go to proper static typing in Rust. Just adding the proper typing caused the codebase to grow from 20,000 lines to 80,000 lines and from 3 developer-months to 14 developer-months.

The stronger your typing system, the slower you will develop code. Whether you realise it or not.


Nonsense. You get the simplification and faster development times of knowing some variable types statically, plus the performance improvements for the compiler which can move the type checks from runtime to compile-time. Plus all the new optimization possibilities.

Common Lisp showed you the way. But almost none looked at it. Only PHP did.


Absolutely not. Duck type based development results in working code out of the door 3x faster than static type based development. It always has since ancient times.

If performance wasn't an issue, then the static type based developers would all be fired. Either directly or by the businesses who relied on them getting driven into bankruptcy by their competitors. You would still get some niche jobs in it where they like to do formal verification of the code.

Your problem is just that your development skills from static-type-based development don't transfer to duck-type-based development. Different styles of codebase need to be handled completely differently.


I am talking about gradual typing here. Types are optional, if not given or implied they default to any. No need to annotate anything. If given they are enforced, and lead to optimized op codes and errors if violated. Some at compile-time, some at run-time. If fully typed, all errors are caught at compile-time already.

Duck typing as done with python is the worst of both worlds. No optimizations, no enforcement. Just optional external typechecks.

Of course untyped code (i.e. runtime types in each var) is faster to write. You only need to add types to some vars or args, and gradually improve from there. E.g. ints only, because they are the easiest to optimize: no need to check for strings, bigints, floats, ... Or arrays, to check for overflows at compile time and restrict elements to ints or strings. Massive improvements are possible, in size and runtime.

Or object fields. Hash lookups vs static offsets.
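Python itself has a limited version of the "static offsets" idea: `__slots__` replaces the per-instance `__dict__` with fixed storage, which saves memory and makes attribute access slightly faster. A rough sketch, with invented class names:

```python
class PointDict:
    """Attributes live in a per-instance hash table (__dict__)."""
    def __init__(self, x, y):
        self.x, self.y = x, y

class PointSlots:
    """Attributes live in fixed slots; no __dict__ is allocated."""
    __slots__ = ("x", "y")
    def __init__(self, x, y):
        self.x, self.y = x, y

# Slotted instances have no __dict__ at all:
assert not hasattr(PointSlots(1, 2), "__dict__")
assert hasattr(PointDict(1, 2), "__dict__")
```

It's opt-in and per-class rather than compiler-driven, but it illustrates the hash-lookup vs fixed-offset trade-off.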


If JITs hadn't been invented you would be completely right but JITs have been invented.

There are deeper optimizations that JITs can do, such as knowing at runtime that the value of a variable is always 2, which typing information simply can't express.

Duck-typed Python is optimal for development speed, the only thing that matters in startup environments. It has its niche.

You aren't gradually improving, you are gradually deteriorating the codebase to make it look more familiar to you.


At least CPython and CRuby (MRI), the most common implementations of each language, ignore all type hints and they are not able to use them for anything during compile or runtime. So the performance argument is complete nonsense for at least these two languages.

Both Python and Ruby (the languages themselves) only specify the type hint syntax, but neither specifies anything about checking the actual types. That exercise is left for the implementations of third party type checkers.
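For Python this is easy to demonstrate: CPython records annotations but never enforces them, so a call that violates the hint runs without complaint.

```python
def double(n: int) -> int:
    return n * 2

# CPython stores the annotations for introspection...
assert double.__annotations__["n"] is int

# ...but does not check them: passing a string just "works".
assert double("ha") == "haha"   # no error, hint ignored at runtime
```

Enforcement, if you want it, comes entirely from external tools like mypy or pyright, run as a separate step.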


Because the anti-types crew showed up and sabotaged it. Similar with perl and lua.

But languages with stronger and more intelligent leadership showed what's possible.

You cannot implement all the compiler optimizations for const and types in extensions. You need to fork it.


The problem is there are a lot of developers who have only coded with static typing and have no idea about the terrible drawbacks of static typing.

They don't understand what static typing does to code verbosity and development times.

Take Turborepo going from Go's typing light system (designed to emulate duck typing) to Rust's heavy typing system (true static typing). Originally the code was 20,000 lines and was coded by 1 developer in 3 months. When moved into the typing style you like so much, the same code is now 80,000 lines and was coded by a team of developers in 14 months.


Type free languages like Lisp, Python and Ruby have faster software development times than languages that use types.

The developers who are using the statically typed languages, which are slower to develop in, are being pushed to use the faster languages.

But those developers don't know how to code in type free languages. So they attempt to add the types back in.

This of course reduces the software development speeds back to their previous speeds.

This means the whole thing is basically folly.

If you want a real example you can take a look at Turborepo, which in weakly typed Go took 1 developer 3 months to develop and has 20,000 lines of code. The direct port to Rust took a team of developers 14 months to develop and has 80,000 lines of code.

Exact same program but the development costs went up proportionally to the increase in the strength of the type system.

There are plenty of developers out there who have only used static typing and don't understand that it comes with massive software development costs compared to its alternatives.

If you are developing a SaaS and you use duck typing, unit tests and micro-services, you will get to market long before your competitors who don't.


I have experience that I think most don't. My experience says you are very, very incorrect.

In the past couple of decades I have been through a couple IPOs, a couple of acquisitions, and have been in engineering leadership roles and slinging code in half a dozen different shaped eng/dev cultures.

In every case, static typing makes teams faster and gradual typing was a pain with potential payoffs that were muddy. Gradual typing is a shitty bandaid and so are type annotations.

I have migrated no less than 30 systems from various languages to Go across different companies, divisions, and teams. Mostly PHP, ruby, perl, python. Didn't migrate the elixir but I would have if given the opportunity.

In every single case, the team started delivering software faster. Prototypes became faster with the sole exception of prototype admin crud panels which we have needed like twice out of the nearly three dozen services I have worked on migrating. And super dynamic json can be a pain (which I blame not on problem spaces but on less thought out dynamic typed solutions offloading their lack of design onto customers via randomish response bodies).

When programs/applications get larger, the complexity tries to combinatorially expand. It can quickly outgrow what newer team members can juggle in their head. Type systems take some of that away. They also take away tests that are there due to lacking types. "What if this is a string, or list, or number" isn't a question you ask, nor is it a test you write and maintain.

When everything fits in your head, dynamic types are freeing. When it doesn't fit in your head, tooling helps.

Even smaller programs benefit. The dozens of teams I have personally witnessed don't find adding a type as a slowdown - they see whole test cases they can ignore as impossible to compile.


This is junk. Writing a type annotation takes basically zero time, then saves you time by preventing the runtime error because you forgot which variable was which, then saves you more time by autocompleting the correct list of valid methods when you hit dot.
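A sketch of the failure mode being described, using invented function names: with no hints, swapping two arguments only blows up (or silently corrupts data) at runtime; with hints, a checker flags the call site before the program ever runs.

```python
def send_invoice(customer_email, amount):
    # Nothing stops a caller from writing send_invoice(19.99, "bob@...");
    # the mistake surfaces later as a confusing runtime error.
    return f"Billing {customer_email} for ${amount:.2f}"

# Annotated version: mypy/pyright would reject
# send_invoice_typed(19.99, "bob@example.com") at check time,
# and an editor can autocomplete string methods on customer_email.
def send_invoice_typed(customer_email: str, amount: float) -> str:
    return f"Billing {customer_email} for ${amount:.2f}"
```

Same runtime behavior in the happy path; the annotations only change when the mistake is caught.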

Acting like Go is comparable to JS is ridiculous; Go's type system is the only kind of type system needed in Ruby. Rust is a staggering outlier in complexity. And the Turborepo port took a long time specifically because they tried to port one module at a time with C interop between the old and new codebases, which massively slows down development in any language, especially Go. This is just about the most dishonest 'example' you could have picked.

Either that or you are saying 'weakly typed' to mean type inference in `var := value`, in which case (a) Rust has that too and (b) that's not what the debate is about, nobody is against that


Making the type annotations pass restricts you to writing more bloated and verbose programs in general.

Stating that A is an integer isn't much of an issue, but once you get a reasonably complex program and A now has a compound type made of 5 parts, it really does slow you down and it really does make you write considerably worse programs for the sake of passing a type checker.

Any commercial code will need to be unit tested, so there is no time saving from finding runtime errors earlier, and any good IDE will detect the same errors and provide you with the same autocomplete automatically, for free, without any type annotations at all. These are problems which exist solely in your head.

1 developer vs a whole team of developers. I think you need to face the facts.

There are studies comparing old dynamically typed languages against statically typed languages. They consistently show approximately 1/3 of the lines of code and 3x faster development times in the dynamically typed languages. This isn't some new discovery.

Well, even Python is strongly typed, but for the sake of this discussion we are talking about type complexity.


It seems like your main gripe is that writing the type annotations slows you down, so I'd be interested to know what you think of languages like OCaml, Elm, Gleam or Roc. These are languages which never (or almost never) require any type annotations because the compiler can always infer all the types. Most people using these languages tend to add type annotations to top-level functions anyway though.

It seems to me that this is equivalent to a language without a type checker that automatically generates a unit test for every line of your program that tests its type.
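That analogy can be made concrete with a toy sketch (a hypothetical `checked` decorator, handling only plain-class annotations): asserting each annotated argument's type on every call is roughly the "auto-generated unit test" a static checker gives you for free, except the checker does it before the program runs.

```python
import functools
import inspect

def checked(fn):
    """Toy decorator: assert each annotated argument's type at call
    time, like an auto-generated test attached to every call site."""
    sig = inspect.signature(fn)

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            expected = fn.__annotations__.get(name)
            if isinstance(expected, type):
                assert isinstance(value, expected), (
                    f"{name} should be {expected.__name__}, "
                    f"got {type(value).__name__}"
                )
        return fn(*args, **kwargs)
    return wrapper

@checked
def shout(message: str) -> str:
    return message.upper() + "!"
```

Here `shout("hi")` works while `shout(3)` fails the generated assertion; a real type checker reaches the same verdict statically, with zero runtime cost.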


The default type is always any (or dynamic, as some call it). No need to type everything. Usually you type just some args. Not even locals.

And some types can be inferred by the compiler, as e.g. for new instantiators. Or array, int, str convertors.


You are comparing apples to oranges, and Go is pretty strongly typed.


I'm comparing a program with itself.

Go only has basic types and interfaces to emulate duck typing (structural typing). The type complexity in Go is rather on the low side of things.


And this is different from outsourcing the work to India for programmers who work for $6000 a year in what way exactly?

You can go back to the 1960s and COBOL was making the exact same claims as Gen AI today.


Gen AI taking programmers' jobs is 20 years away.

At the moment, it's just for taking money from gullible investors.

It's eating into business letters, essays and indie art generation, but programming is a really tough cookie to crack.


It's taking away programmers' jobs today. I know of multiple small companies where hires were not made or contractors were not engaged simply because of the additional productivity gained by using Gen AI. This is the mundane, "trivial" work needed to glue stuff together in the fields those small companies operate in.

It's like how "burger flippers" didn't go extinct due to automation. The burger joint simply mechanised and automated the parts that made sense, and now a lunch shift is handled by 5 employees instead of 20.

They will not replace the calibre of folks like Rob Pike in quite some time, perhaps (and I'd bet on) never.

I will grant you that the hype does not live up to the reality. The vast majority of jobs being taken from US developers are simply being offshored with AI as an excuse - but it is an actual real phenomenon I've personally witnessed.


But is it meaningfully different from the outsource to India craze?

That certainly in the short term took some programmers jobs away. That doesn't mean it pans out in the long term.


All creative types train on other creative's work. People don't create award winning novels or art pieces from scratch. They steal ideas and concepts from other people's work.

The idea that they are coming up with all this stuff from scratch is Public Relations bs. Like Arnold Schwarzenegger never taking steroids, only believable if you know nothing about body building.


The central difference is scale.

If a person "trains" on other creatives' works, they can produce output at the rate of one person. This presents a natural ceiling for the potential impact on those creatives' works, both regarding the amount of competing works, and the number of creatives whose works are impacted (since one person can't "train" on the output of all creatives).

That's not the case with AI models. They can be infinitely replicated AND train on the output of all creatives. A comparable situation isn't one human learning from another human, it's millions of humans learning from every human. Only those humans don't even have to get paid, all their payment is funneled upwards.

It's not one artist vs. another artist, it's one artist against an army of infinitely replicable artists.


So this essentially boils down to an efficiency argument, and honestly it doesn't really address the core issue of whether it's 'stealing' or not.


What kind of creative types exist outside of living organisms? People can create award-winning novels, but a table does not. Water does not. A paper with some math does not.

What is the basis that an LLM should be included as a "creative type"?


Well, a creative type can be defined as an entity that takes other people's work, recombines it and then hides its sources.

LLMs seem to match.


Precisely. Nothing is truly original. To talk as though there's an abstract ownership over even an observation of a thing, one that forces people to pay rent to use it... well, artists definitely don't pay whoever invented perspective drawing, programmers don't pay the programming language's creator, and people don't pay Newton and his descendants for making something that makes use of gravity. Copyright has always been counterproductive in many ways.

To go into details though: under copyright law there's a "fair use" clause with a "transformative" criterion. This allows things like satire and reaction videos to exist. So long as you don't replicate the original 1-to-1 in product and purpose, IMO it qualifies as tasteful use.


What the fuck? People also need to pay to access that creative work if the rights owner charges for it, and they are committing an illegal act if they don't. The LLM makers are doing this illegal act billions of times over, for something approximating all creative work in existence. I'm not arguing that creatives make things in a vacuum; that is completely beside the point.


I've never heard of anything like what you're talking about. There isn't a charge for using tropes, plot points, character designs, etc. from other people's works if they are sufficiently changed.

If an LLM reads a free Wikipedia article on Aladdin and adds a genie to its story, what copyright law do you think has been broken?


Meta and Anthropic at least fed entire copyrighted books into the training. Not the Wikipedia page, not a plot summary or some tropes: they fed the entire original book into training. They used at least the entirety of LibGen, which is a pirated dataset of books.


Speaking as a Python programmer, no. Using types in a prototyping language is madness.

The point is you drop things such as types to enable rapid iteration, which lets you converge on the unknowable business requirements faster.

If you want slow development with types, why not Java?


Have you written any Go code? It's the closest I've come to actually enjoying a type system - it gets out of your way, and loosely enforces stuff. It could do with some more convenience methods, but overall I'd say it's my most _efficient_ type system. (not necessarily the best)


If you can’t do fast prototypes with types, you need to get better at using types. It’s very fast to throw stuff together in TypeScript.


No one can do fast prototypes with types, all they can do is convince themselves they are faster than they really are.

Having worked in both dynamically typed and statically typed software development shops, the statically typed programmers are considerably slower in general. Usually they have only 1/3 of the output of programmers who use dynamic typing. Statically typed programmers also tend to be much less ambitious in their projects in general.

They still think they are "fast programmers" but it's complete fiction.


Because I want fast development with types.


Yeah, but that doesn't exist. Types and fast development are directly opposing goals.

This goes all the way back to Lisp vs C in the 1980s with C programs having triple the development time as Lisp programs.

To modern day, with Turborepo taking 3 months to write in structurally typed Go vs 14 months in statically typed Rust.


> Using types in a prototyping language is madness.

It's not a prototyping language or a scripting language or whatever. It's just a language. And types are useful, especially when you can opt out of type checking when you need to. Most of the time you don't want to be reassigning variables to be different types anyway, even though occasionally an escape hatch is nice.
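In Python, the opt-out being described is `typing.Any` (or simply leaving things unannotated): a checker treats `Any` as compatible with everything, so one loosely typed value can flow through otherwise annotated code. A sketch with an invented function:

```python
import json
from typing import Any

def parse_payload(raw: str) -> Any:
    # json.loads can return dicts, lists, strings, numbers, ...
    # Any is the honest escape hatch; callers narrow it as needed.
    return json.loads(raw)

payload = parse_payload('{"retries": 3}')
assert payload["retries"] == 3   # checkers allow this: payload is Any
```

The rest of the codebase stays typed; `Any` confines the dynamism to the boundary where it's genuinely needed.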


Types are not always useful: they increase the line count per delivered feature by 3x to 4x, which results in a corresponding increase in bugs in the delivered code and a corresponding increase in overall software development costs.

It's very foolish to just use types in all programming projects.


But don't forget OOP was an attempt in the 90s to bring micro-services to single user computers.


This is a conservative problem.

Conservatives are split into 2 groups. Conservatives who are in it for the money and conservatives who are in it because they don't know any better.

College professor is not a well-paying job for the level of skill required, nor is it a job that someone who isn't very knowledgeable could do. That excludes most conservatives from the position.


> Conservatives are split into 2 groups. Conservatives who are in it for the money and conservatives who are in it because they don't know any better.

An inane assertion made without evidence.

> nor is it a job that someone who isn't very knowledgeable could do

So why is the bias the worst in the least rigorous fields?


Ahh, the bad high-school maths take, which doesn't account for risk.

People don't just build 1 thing for a house nor can they afford a $20k failure.

Take Fred, who saves up $10k for a major purchase for his house each year.

If there is a 10% chance of failure each time, then Fred has roughly a 52% chance of a bankrupting failure within 7 years.

You can't run an economy where everyone is bankrupt.
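The arithmetic being invoked: with an independent 10% failure chance per year, the probability of at least one failure over 7 years is 1 - 0.9^7, about 52%.

```python
p_fail = 0.10   # assumed chance of an unaffordable failure in a year
years = 7

# P(at least one failure) = 1 - P(no failures in any year)
p_at_least_one = 1 - (1 - p_fail) ** years
print(round(p_at_least_one, 3))  # 0.522
```

This assumes failures are independent year to year, which is the simplest model consistent with the comment.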


Calling out people for “bad high school math” when you can’t even write a coherent sentence is pathetic.

A man saving 10K per year goes bankrupt if he doesn’t spend it productively? That’s what you’re trying to say?

