Problems I Have with Python (darkf.github.io)
212 points by ploggingdev on Jan 25, 2017 | 231 comments


Some of the author's points are valid. However, many are subjective preferences, some are gripes without solutions, and others make it difficult to understand the author's underlying philosophy.

My main critique is that the author added this statement that puts a negative, entitled, and naive tone on the whole article:

>>> to which no real improvements are being made for some reason. (Incompetence? Politics? Both? Who knows.)

The author does not acknowledge that some of his points were left unaddressed for reasons other than incompetence and/or politics. I imagine this statement could offend some of the smart and hard-working people who are working on improving the Python language.

Reasons the author does not acknowledge:

- the community does not agree with the author's subjective idea of what Python should look like

- the solutions to a problem (I'm thinking GIL) come with a lot of consequences which are not readily acceptable

- solving some of the issues would exacerbate backwards compatibility. This would increase the author's problems even more, because, as he states "This is particularly a pain for libraries where I expect to pip install them and have them "Just Work"."


>However, many are subjective preferences

Certainly, it is titled "Problems I Have" for a reason. :-) I do not expect everyone to agree with me, but these are the things I personally feel are lacking when using it, which I do quite a lot.

> I imagine this statement could offend some of the smart and hard-working people who are working on improving the python language.

That was certainly not my intention -- as stated, I do love the language and appreciate all work going into it. I do not intend to undermine their efforts, just point out some of my perceived design flaws.

>- the community does not agree with the author's subjective idea of what Python should look like

I think we all agree there should be a good solution to concurrency (and "stackless" variants which power eventlet, etc. have been used for ages; as has Twisted, of which asyncio is not a sufficient clone.), parallelism, etc.

The standard library in general encourages use of higher-order functions and concepts borrowed primarily from FPLs (see: comprehensions, map/reduce, sort, etc.) I could not imagine seeing them backtracking on this -- it only helps them to go further in that direction.

>- the solutions to a problem (I'm thinking GIL) come with a lot of consequences which are not readily acceptable

I did not propose a solution because there are many, as you note; there are, however, implementations with decent solutions like AFAIK Jython.

>- solving some of the issues would exacerbate backwards compatibility.

Such as what?


> I did not propose a solution because there are many, as you note; there are, however, implementations with decent solutions like AFAIK Jython.

There are no solutions that satisfy everyone that I am aware of yet. Guido has said in the past that he'd be happy to get rid of the GIL, and would merge a patch that solves it, as long as:

* It does not reduce the performance of single-threaded Python code.

* It stays compatible with all existing pure Python code and C extensions.

But in practice, the GIL is not that much of an issue for many types of applications where Python is popular.

* It's not an issue for web apps, because these are typically served from multiple physical servers each running multiple python processes. These do not share a GIL anyway, and "thread safety" is pushed to database transactions.

* It is not an issue for apps which spend most of the time doing I/O. Most IO libraries release the GIL, and other threads can run while you're waiting for results from the database.

* It is not an issue for data science doing heavy number crunching with numpy and everything built on top of numpy. Numpy releases the GIL while doing large computations in C.

* It is not an issue for small scripts used as a "better Bash".

The GIL is only an issue for apps that do heavy computation in pure Python code, and need parallelism within a single process (socket servers? text data processing?). As a result, many Python users just don't find it a big enough problem to be worth solving, if the solution comes with downsides for their use cases.
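To make that concrete, here's a rough sketch (illustrative only, timings vary by machine) of the one case where the GIL does bite: CPU-bound pure-Python work doesn't get faster with threads, but does with processes, since each process has its own interpreter and its own GIL.

    import time
    from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

    def crunch(n):
        # CPU-bound pure-Python work: holds the GIL the whole time
        return sum(i * i for i in range(n))

    def timed(executor_cls):
        start = time.time()
        with executor_cls(max_workers=4) as ex:
            list(ex.map(crunch, [2000000] * 4))
        return time.time() - start

    if __name__ == '__main__':
        print('threads:  ', timed(ThreadPoolExecutor))    # roughly serial: the GIL serializes the work
        print('processes:', timed(ProcessPoolExecutor))   # roughly parallel: one GIL per process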


The GIL is only a problem because there's "no free lunch" -- no single strategy that is best in all cases.


> It is not an issue for apps which spend most of the time doing I/O.

This is a common misconception that doesn't seem to be backed up by any data.

Dave Beazley did a number of performance tests with profiling, looking at GIL contention in a multi-core scenario: http://www.dabeaz.com/python/GIL.pdf

The results were that even IO-bound workloads still suffered because of the poor implementation of the GIL (details on slide 35 or so). This was an issue up until Python 3.2 (!) when a new GIL implementation was added, which he also profiled: http://www.dabeaz.com/python/NewGIL.pdf


Not backed up by data? Maybe because it's so easy to document that no one bothers to write about it?

Multithreaded IO-bound tasks don't care about the GIL.

Yeah the old implementations of the language were not as good as the latest. It doesn't seem right to criticize the language for problems that have already been fixed.


"IO-bound multithreading is fine" has been the Python mantra for the last 20 years. Lo and behold, someone actually gathers some data and comes to find out that is absolutely wrong. For the last few years they've had a revamped version of the GIL, but that still has a burden of proof that can only be validated by profiling real-world applications.

A community can't make flat-out invalid claims for two decades and then expect everyone to take them at their word that everything is fine now.


>>- the solutions to a problem (I'm thinking GIL) come with a lot of consequences which are not readily acceptable

> I did not propose a solution because there are many, as you note; there are, however, implementations with decent solutions like AFAIK Jython.

>>- solving some of the issues would exacerbate backwards compatibility.

> Such as what?

Let's use the GIL as an example. The reason it is still present is not that it is hard to remove; it has already been removed in the past. The problem is that when the GIL is replaced with smaller locks, Python becomes much slower, because of some features and behavior that people got used to.

One could make python faster by changing behavior, but then it would break existing code and C extensions.

There's no easy way to do this without sacrificing something else. Larry Hastings has worked on removing the GIL and has an interesting talk about it [1].

[1] https://www.youtube.com/watch?v=fgWUwQVoLHo


That talk was a great overview of the issue.

Most people who are ignorant of the subject assume the GIL is stupid and useless, but the GIL allows Python to be extremely fast in single-threaded scenarios, and any attempt to remove it has introduced slowdowns of at least 20%.

And C extension support is also a huge factor as you mention. All of Python's scientific modules would be lost if they broke that.

The author assumes these are simple problems that can be solved but aren't because of politics or incompetence, but some of the smartest minds have attempted and failed.

I'd like the author to try to come up with solutions, or at least draft ideas for how each of his points can be fixed. A lot of those are easy to state but difficult to solve without breaking more stuff.


> with decent solutions like AFAIK Jython.

>>- solving some of the issues would exacerbate backwards compatibility. Such as what?

Jython can't load native C extensions, which have to be GIL-aware. Most programs, and the Python interpreter itself, aren't thread-safe, so suddenly removing the GIL would break a lot of programs.

I agree with you that the concurrency story for python sucks, but claiming solutions could exist without breaking back-compat is just not right.


FWIW, the folks working on TruffleRuby have done some amazing things on this front - essentially making an interpreter for Ruby C extensions that JITs to code that's more-or-less identical in performance to the original native version.


Is Jython good though?

Last I used it, it had issues keeping pace, e.g. a demonstrable memory issue that took a long time to fix, and it lagged considerably behind Python 2/3 versions.

It's also worth noting that, as an essentially transcompiled language, it requires a good appreciation of Java machinery, in which case languages like Groovy provide good competition.


When talking about JVM languages suitable for building systems, Jython isn't generally mentioned for the reasons you give, nor is Apache Groovy. Those two are good for scripting, e.g. testing Java classes, build scripts, glue code. Besides Java, languages like Clojure, Scala, and Kotlin are usually considered as systems languages on the JVM.


True, but I believe Groovy and Jython compete for the same space, if not system-building.


>but claiming solutions could exist without breaking back-compat is just not right.

I mean, you have a good point on C extensions but code relying on them (without a really portable API) is almost never going to be forward compatible anyway. (There are still quite a few C extensions not up to CPython 3 yet.)


I was taken aback by this rather harsh treatment of Python.

Is it really realistic to 'have it all'? I'm fully aware that I'd have to go to crazier languages if I want parallelism or speed. For what Python is, it offers me reasonable tradeoffs (mostly slanted towards productivity).

Regarding the FP comments, since it lacks TCO, my takeaway has always been that Python can only ever become a quasi-functional language. It's hard to be more than that in its current state.

Anyways. These questions made me want to ask you - what languages do you think are better in comparison?


> I was taken aback by this rather harsh treatment of Python.

I am taken aback by the evangelical tone of Python enthusiasts, when it has warts intentionally maintained by the creator in the form of missing features.

If you want speed you go to any other scripting language (other than Ruby). I agree Python is mostly sane and naively productive. That being said, it's a result of the syntax. Transpiling it to another language, like Google did, shows that the underlying technology is not worth much.

> what languages do you think are better in comparison

Better in what way? PHP, Go, Pony, Javascript all have these features and the problems with the languages are not that people don't understand when they come across a switch or map.


> If you want speed you go to any other scripting language (other than Ruby).

Ruby has historically had the same issues. Most Pythonistas I know aren't so evangelical. It's mostly a question of how to go about integrating C/C++ code.

Many people complaining about the GIL (and the like) have some naive microbenchmark, don't understand the trade-offs/limitations of their runtime etc. That doesn't mean critique isn't important and required, but it's going to be better when it's properly researched and improves on the body of work out there (https://www.youtube.com/watch?v=Obt-vMVdM8s).

> Transpiling it to another language like Google did, shows that the underlying technology is not worth much.

How is this any general indicator of the worth of the language?

It shows that, for some cases, Google thought this was a worthwhile investment. Google has experimented for a long time with ways to improve how Python code can be run. They ran the Unladen Swallow project, but spent more time on LLVM issues at the time, making it infeasible to continue the project.

They'll discontinue one path and try another. None of this is really a commentary from Google on CPython, the community, or the value that it has for most people. The people working on this stuff interact on a pretty friendly basis.


> If you want speed you go to any other scripting language (other than Ruby)

Which one? PHP? Perl? Bash? Scheme? VBScript? Windows PowerShell?

Python is in fact one of the fastest scripting languages that exist, especially JIT'ed.

The notable exception is JS, and oh, that has a GIL too :P


The lack of tail-call optimization makes the CPython interpreter simpler and debugging easier by preserving the call stack. It was a choice, not an oversight.


From a debugging viewpoint, this does not make sense: there is usually no interesting information in the in-between frames.

TCE can also make debugging easier: how useful is a stack trace of 1000 lines consisting of

  ...
  File "bla.py", line 4, in fib
    return fib(n - 1) + fib(n - 2)
  File "bla.py", line 4, in fib
    return fib(n - 1) + fib(n - 2)
  File "bla.py", line 4, in fib
    return fib(n - 1) + fib(n - 2)
...

Not so much I think.


I like how you picked an example that is explicitly not TCO-able.

In any case, this particular problem is no longer an issue as of Python 3.6, as that now collapses repeated stacktrace lines (see https://bugs.python.org/issue26823). Although this doesn't work for mutual tail calls, it does solve the debug noise issue in the most common case.
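The collapsed output looks roughly like this (frame count invented for illustration):

  File "bla.py", line 4, in fib
    return fib(n - 1) + fib(n - 2)
  [Previous line repeated 996 more times]
RecursionError: maximum recursion depth exceeded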


Ah yes, that is a bit stupid, I just wanted an example of a traceback :)


The notion that not having proper tail calls aids debugging always seemed like a post-hoc justification. The stack trace of an iterative function will lack exactly the same intermediate evaluation frames as a tail-recursive implementation.


The thing is, tail calls aren't _just_ about emulating iteration via recursion:

  def foo():
      raise ValueError

  def bar():
      return foo()

  bar()
With TCO, the stack trace would contain `main` and `foo`, as `bar`'s frame would be overwritten by `foo`. This example is simple, but `bar` could be a 50 line long if-else chain of tail calls and when debugging you won't necessarily know which condition was evaluated.


> The thing is, tail calls aren't _just_ about emulating iteration via recursion:

I completely agree, but there is also no need to perform TCO to make code like this safely runnable. TCO only becomes necessary/useful when implementing an iterative process where we can't statically know that the call stack won't be exhausted. That said, TCO is usually an all or nothing transformation, and it would be difficult to accurately avoid eliminating trivial tail calls like in your example.

A reasonable compromise might be for the Python VM to implement a TAIL_CALL bytecode op and require the programmer to decorate functions which rely on TCO. This wouldn't be any more onerous than manually trampolining large portions of code, which is the current method of getting around the lack of TCO.


A decorator that enabled TCO makes sense to me. Kind-of like the Numba project, it'd be a specialized JIT-compiler invoked only on some functions.

What's stopping that from being a 3rd-party library like Numba?


You can find simple decorators which try to provide space-efficient tail recursion. Usually they work by trampolining the function. I've seen one example where a decorator rewrites the bytecodes to perform a goto in the case of self-recursion. The problem is that all of these solutions are rather limited, easy to break, or have a pretty high runtime overhead. The general solution would be for a decorator to rewrite all CALL opcodes in tail position to TAIL_CALL opcodes, but such an opcode currently does not exist. The actual implementation of a TAIL_CALL opcode would be almost identical to the CALL opcode, so adding it would probably be straightforward, but I'm speculating here.
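As a rough illustration (not any particular library's API), a trampolining decorator of the kind described above can look like this: the decorated function returns a marker describing the next self-call instead of actually recursing, and a driver loop replays it with a flat stack.

    import functools

    class _TailCall:
        """Marker holding the arguments of a deferred self-call."""
        def __init__(self, args, kwargs):
            self.args, self.kwargs = args, kwargs

    def tail_recursive(func):
        @functools.wraps(func)
        def trampoline(*args, **kwargs):
            result = func(*args, **kwargs)
            while isinstance(result, _TailCall):   # loop instead of growing the stack
                result = func(*result.args, **result.kwargs)
            return result
        trampoline.call = lambda *a, **kw: _TailCall(a, kw)
        return trampoline

    @tail_recursive
    def countdown(n):
        if n == 0:
            return 'done'
        return countdown.call(n - 1)   # deferred; no new stack frame

    print(countdown(10 ** 6))          # no RecursionError

As the comment above says, this kind of decorator only covers direct self-recursion; mutual tail calls would still grow the stack.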


Why not just make it a dev/production flag, then?


Probably because many tail-recursive functions _rely_ on tail-call elimination working reliably. Without also having an unbounded call stack, disabling tail-call elimination will likely just cause your programs to crash.


I never considered it harsh. If anything, it should be a testament to how nice Python is -- if I /didn't/ like it, I would have a much, much longer list of complaints!

People seem to be missing that sentiment -- I do love Python and use it almost daily. This is merely a list of thorns I run into frequently.

>Regarding the FP comments, since it lacks TCO, my take away has always been that Python can only ever become a quasi-functional language. Its hard to be more than that in its current state.

I mean, it could always encourage playing with functions more -- and importantly, providing an stdlib that encourages that.

>These questions made me want to ask you - what languages do you think are better in comparison?

That is a somewhat loaded question: my counter question would be, "In what regards?"

I cannot say a certain language is better than Python in every or most circumstances, but I can in regards to specific points/features, if you'd like to elaborate.


But it was harsh. You started the post with

> These are obvious flaws in design, in my opinion, that warrant re-looking at, but to which no real improvements are being made for some reason. (Incompetence? Politics? Both? Who knows.)

You list a number of things which Python ecosystem should do better or differently, and suggest that the reason why these changes aren't getting built as fast as you'd like is people playing politics or incompetence.

Whereas in reality, the two main reasons are that the changes will take lots of effort and time, which the volunteers don't have next to their day jobs (PyPy), or that the developers have different opinions on the ideal language design, and just don't agree with you (heavy functional programming).

"It is my personal subjective opinion that you are incompetent" isn't less harsh than "You are incompetent".


Once, I was on mailing lists with GVR and other language contributors, and have seen him go off deeply into functional programming. Some of the stuff he wrote went right over my head.

For someone who declares he hates functional programming even at the most basic levels of data stream manipulation, he knows it quite well.

I've always been frustrated with this disconnect, even more acutely than you have, because I know GVR is being disingenuous when he says, "I don't get it." He absolutely does. He thinks other people won't.


If you claim it's "incompetence" (it's not), then publish the patches that don't break anything but significantly speed it up, for example. Because you complain "it's slow."

It's not what you think it is. It's Python, not a toy language used by nobody. First try yourself to "fix" Python while keeping its existing users happy by not breaking anything for them, then write about it.


Cool, I think you completely dodged the point. I never said it was slow because they were incompetent.


Python has made some trade-offs that you dislike. You complain about the negative consequences without comparing those against the benefits.

One of the major factors in speed is efficient memory layout. Contrast a Python list with a NumPy array. To achieve speedier loops and vectorized arithmetic [0], the array gives up dynamic typing and dynamic sizing. In most applications, I would gladly give up some compute speed to gain some programming productivity.
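A rough illustration of that trade-off (timings vary by machine, so treat the numbers as directional only):

    import timeit
    import numpy as np

    xs = list(range(1000000))     # boxed Python ints, pointers scattered around the heap
    arr = np.arange(1000000)      # one contiguous C array of machine ints

    print(timeit.timeit(lambda: [x * 2 for x in xs], number=10))   # pure-Python loop
    print(timeit.timeit(lambda: arr * 2, number=10))               # vectorized; much faster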

I love duck-typing. Formal typing has some impressive examples, but in the projects I've worked on it has reduced my productivity. Perhaps because data are so often serialized to simple formats or written to databases that discard the best tools of formal typing. Anecdotal evidence, for sure.

I've never been bothered by the GIL, but I have benefitted from it [1].

[0] https://docs.continuum.io/mkl-optimizations/

[1] https://www.youtube.com/watch?v=P3AyI_u66Bw


>Contrast a Python list with a NumPy array. To achieve speedier loops and vectorized arithmetic [0], the array gives up dynamic typing and dynamic sizing. In most applications, I would gladly give up some compute speed to gain some programming productivity.

Except numpy arrays have a much richer interface and can still store dynamic objects (dtype=object). So what's your point?

>I love duck-typing

So do I. Where does this come from? I don't believe I ever considered it a contra.


Have you ever tried appending to a NumPy array in a loop? It's a total disaster! And dtype=object arrays are mostly useless; they gain almost none of the benefits of regular NumPy (you may as well run np functions on plain lists) and play poorly with other types. NumPy is great for numerics and structured data - lists are general purpose structures for data manipulation. They are different, have different goals and trade offs, and I don't think it's appropriate to claim that one size should fit all.
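To make the append point concrete (a sketch, not a benchmark): np.append copies the whole array on every call, so building an array element by element is quadratic, while the list-then-convert route is linear.

    import numpy as np

    # Quadratic: every np.append allocates a new array and copies the old contents
    arr = np.array([], dtype=int)
    for i in range(10000):
        arr = np.append(arr, i)

    # The usual fix: accumulate in a list, convert once at the end
    values = []
    for i in range(10000):
        values.append(i)
    arr = np.array(values)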


If you're storing generic objects rather than numbers in a NumPy array, you're discarding its main benefits. Sure, you've got some extra slicing sugar for selecting columns and subsets, but comprehensions are more readable (and faster!) in many of those situations.

Duck-typing is Python's form of dynamic typing and therefore results in the speed penalty. If you want the extra speed, you'll need to give up some dynamism. I say this now, but some of the work the core devs are doing to optimize dicts might let us have our cake and eat it too. Until then, it's a choice: flexible or fast, not both.


One of the major factors in speed is efficient memory layout.

In most applications, I would gladly give up some compute speed to gain some programming productivity.

By Smalltalk standards, Python is pretty profligate. (By 90's C programmer standards, Smalltalk is pretty profligate.) However, Smalltalk still has many of the same high-productivity features as Python. (In fact, the debugging story is far superior.) I suspect, though, that Python is still a far superior environment for the things you use it for.


I enjoyed coding homework assignments in Smalltalk, but for some reason I never tried using it professionally.


Your post quite strongly alludes to it being either due to incompetence, or politics, or both. So I think grandparent has a very valid point, and you might want to change the tone of your post a bit; then it'll produce fewer knee-jerk reactions, and might be taken more seriously.


Nah, just people connecting that sentiment with other statements. It should be cleared up since it's causing some mass confusion.

It's funny because I preface it by saying "Remember that it's a matter of opinion" (and, well, the title alone) and people come out of the woodwork completely disregarding this, or outright misinterpreting sections of it.

I maintain that a large reader base here does not actually... read.


Stating that something is a matter of opinion does not mean that insults do not hurt. Either you don't want people to listen to you, so you don't have to worry about what you say, or you do want people to listen, in which case how you phrase things matters.

Or put another way: I'm a core developer of Python and I found your post somewhat insulting (I've unfortunately seen worse). You claim I'm possibly incompetent and I did a half-assed job with asyncio. You very "audibly" sigh and call my work "nonsense". You ask me to "come on" and accept your view on things when I have apparently helped make a "gimped language". And you end by saying I need to "fix [my] language". None of that phrasing comes off as understanding of the hard work and immeasurable number of hours I have put into making sure Python continues to function well for you over the past 14 years that I have been a core developer. I know you like Python as you stated in the post and in the comments here, but that doesn't wash away the rest of the unnecessary negativity in your post such that I want to take your opinions seriously enough to spend the time to explain why things are the way they are.


Fair enough, I never intended it to be posted to HN or receive this much attention or I'd have taken much more care with the tone.

I retract the "ignorance" sentiment, as I do not actually know what Python developers are considering.

I do apologize for that, and thank you for your contributions. It is still by far one of my favorite languages, and I use it daily. :)


More commenters here, including me, tried to explain that to you even before Brett chimed in. It shouldn't even be necessary for somebody like him to do that. Just imagine how many users there are in the world, and imagine the burden on the people who invest their energy in maintaining that huge project. It's the users and all the already-written programs that make it huge, not the code base alone.

> I do apologize for that

Maybe making your apology visible on your blog too?


Hey, Brett. Thanks for your work :-)


> I maintain that a large reader base here does not actually... read.

I would disagree.

Although there's often a fair number of commenters who clearly read the title of an article, and then just start commenting on that, in this case we can see that people have read (at a minimum) your opening statement and whichever list item they're taking issue with.

I believe that if a large set of people are misinterpreting what I've written, it's a sign that I probably wrote it poorly. Not in the sense of arguing for the wrong thing, but in the sense that I'm not conveying my argument well enough. This is easy to do, because when I'm writing something, I know what I mean. This sounds obvious, but it's hard to read what I've written tabula rasa, without bringing that "of course I mean X" view to it.

So, the feedback you're getting here is that your opening statement colors the rest of your piece. That "Incompetence? Politics? Both?" aside obviously makes a large subset of readers assume that you're saying "all of these complaints in my list must be unaddressed due to incompetence or politics". That's at odds with the later "just an opinion" statement, and people are sticking with the more-inflammatory initial claim.


I'm pretty sure everybody here can read. We are asserting that your sentence:

"These are obvious flaws in design, in my opinion, that warrant re-looking at, but to which no real improvements are being made for some reason. (Incompetence? Politics? Both? Who knows.)"

sets a very negative tone and makes your "list of problems" look like a "list of complaints that these incompetent losers should fix asap". In my opinion you may just not be the best writer for some reason (Lack of education? Incompetence? Both? Who knows).


>In my opinion you may just not be the best writer for some reason (Lack of education? Incompetence? Both? Who knows).

Then right back at you -- that triggers the same problems as the original statement. :D

You could very well have stated your point more constructively with that.


That was exactly my point. I'm glad at least you have some reading comprehension skills :).


Makes me think of something I read about cultural divides. Some cultures think that a speaker can say whatever they feel like, and it's the listener's obligation to figure out how to understand it. Others think that it's the speaker's obligation to structure and phrase things in a way that make it clear to the listener what they meant.

Not looking to make value judgements of whether one is generally better, but I think it's clear that when writing on the internet for general audiences, the second way is more effective in spreading your point.


I've read your piece, and unfortunately the overall tone sounds like a rant. I'm sure it wasn't your intent, but tone is hard to convey in a purely textual medium sometimes. I fall victim to this often, and have been actively working to try to avoid excessively negative tone (even if I feel that way).

The ranty tone of the piece obscures the rest of the points you were trying to make - many are good, but a strong tone will immediately put people on the defensive rather than trying to open up and understand what's being said.


Yes, I'm seeing now that the tone of the post is more talked about than the actual contents. Learning!


tyleraldrich wrote to you, to demonstrate to you that it's more than a "tone":

"In my opinion you may just not be the best writer for some reason (Lack of education? Incompetence? Both? Who knows)."

Do you consider it to be "just the tone of the post" directed to you?


The thing is, by saying that your incompetence/politics remark is just an opinion, and by not going out of your way to offer a factual justification, you have made it clear that it is essentially an information-free statement, while its snide disparagement of people who have put a lot of effort into Python lingers undiminished, if not actually emphasized as such by its lack of information. Then you double-down by taking on all the people who see that this is so. As a consequence, the currently-top issue in the comments is this, and not whatever it is you think someone should be doing to improve Python.


> I never said it was slow because they were incompetent

Your own article appears to have exactly that claim:

> These are obvious flaws in design, in my opinion, that warrant re-looking at, but to which no real improvements are being made for some reason. (Incompetence? Politics? Both? Who knows.)

> Without further ado: The standard interpreter being rather slow; PyPy is nice, but its Python 3 support is very immature.


Oh don't be silly, he's applying multiple possible reasons to the set of frustrations. You're applying all of them to a single frustration. What you've done isn't logical.


When he claimed that property P or Q holds for each of the set A, B, C, D, E, you claim that I can't conclude he said that P or Q holds for A?

I used "A has property P"; substitute "A has property Q" and it's still a wrong claim of his, for the very same reasons:

Python has a huge user base; if he can improve it in any of his points (I just selected one) and keep it working for that user base, please do. I know he won't be able to; his talk is ignorance and provocation.

And those who develop it aren't stupid or playing politics; they keep Python working for their base, improving it as much as they can.

See https://news.ycombinator.com/item?id=13485784


>I know he won't be able to; his talk is ignorance and provocation.

Says the person missing that it's an opinionated list, lol. It was more meant to provoke discussion, not hurt feelings (like yours obviously seem to be).


The standard library in general encourages use of higher-order functions and concepts borrowed primarily from FPLs (see: comprehensions, map/reduce, sort, etc.) I could not imagine seeing them backtracking on this -- it only helps them to go further in that direction.

I don't think the standard library really 'encourages' this in a way that's different from most languages that support first-class (rather than 'higher-order') functions. If you consider Python's evolution, its support for many common programming paradigms was somewhat haphazard and weak and developed over time, mostly pragmatically. OO has become stronger, the 80% use case of common functional idioms is covered by comprehensions, etc. Ill-thought-out features (e.g. terrible lambdas) have become de-emphasised. The choices are extensively documented, even if not everyone's cup of tea, so it seems both glib and inaccurate to say (re: FP) 'it only helps them to go further in that direction'. How does it help them?


Another big issue to keep in mind when critiquing Python, as opposed to e.g. JavaScript or Ruby, is that Python is used for so much more besides webdev/etc. stuff. So making changes that improve life for webdevs could suddenly make life worse for people using Python in high energy physics, or in chemical engineering, or in AI research, or a large number of other fields that a webdev might not even know exist. Every supercomputer on the planet runs Python code; scientific Python usage is a really big field, with enough passionate users to hold multiple conferences every year, and orders of magnitude more casual users.


I would propose there are plenty of better solutions for webdev than Python, and that Python should remain focused on where it has been successful.


Apparently people disagree with this sentiment. Allow me to expand on it:

TL;DR - One language for all purposes is the wrong focus.

- Python's success is largely due to things it has done well: Numeric and scientific computing (NumPy, SciPy), system administration scripting (yum, RHEL), workflow management (Luigi, Airflow, Snakemake).

- The idea that one language can be used in all tiers leads to things like JavaScript. Does all things but none of them well.

- PHP and Ruby are already strong in the webdev area, and there is little need for Python to compete here. Resources spent on making Python a top tier webdev language will always be a game of catch up, and distract from its strengths.

- Yes, Django is a thing, and should continue to evolve, but it's unlikely to catch up to PHP.


> "- solving some of the issues would exacerbate backwards compatibility."

That boat sailed a long time ago with the introduction of Python 3.

(Ok, Ok my comment is slightly tongue-in-cheek. I agree that backwards compatibility is usually a good idea and should not be broken without a great deal of thought.)


the solutions to a problem (I'm thinking GIL) come with a lot of consequences which are not readily acceptable

Are not readily acceptable to Guido. Whether that's a good thing (for the language) or not is subjective. At least Guido has a vision and is sticking to it!


A language as old and as large and as flexible as Python ends up with a few wrinkles, but designing a successful language isn't easy and I really admire the work done by everyone behind Python, especially the vision and invention by Guido van Rossum.

A kind of post-experience review of a language's strengths and weaknesses is a good exercise. For comments and complaints that really were influential in the history of programming languages see:

Knuth, The remaining trouble spots in Algol 60, Communications of the ACM, 10, 10, 1967, pp. 611--617. https://www.cs.virginia.edu/~asb/teaching/cs415-fall05/docs/...

J. Welsh, W. J. Sneeringer, C. A. R. Hoare, Ambiguities and insecurities in Pascal, Software: Practice and Experience, 7, 6, November 1977, pp. 685--696. http://onlinelibrary.wiley.com/doi/10.1002/spe.4380070604/ab...

Brian W. Kernighan, Why Pascal is not my favorite programming language, April 2, 1981, AT&T Bell Laboratories. http://www.cs.virginia.edu/~evans/cs655/readings/bwk-on-pasc...


Very short-sightedly written. It sounds like the author just wants a language with a different philosophy, and instead of realizing this goes on to call the differences "obvious flaws in design" that aren't improved because of "Incompetence? Politics? Who knows." This is especially bad given that Python (in my opinion) has a very well thought-out and transparent change process, with PEPs that usually consider most alternative solutions to a problem and are held to a high standard in order for the BDFL to approve them.

Especially regarding some of the points near the end, it seems that the author just doesn't understand Python. Python doesn't have or want a strong typing system, and it mostly wants an imperative style keeping lines short.

> Well... what's worse, having a slightly goofy looking inline "def", or having a gimped language?

Why is it such a problem to move your closure to its own line and give it a name?


>Why is it such a problem to move your closure to its own line and give it a name?

What if it cannot have a meaningful name? You don't give a name to every single value in your program, so why would first-class functions be any different? Like, there's an idiom in Python where you write a function named "wrapper" in a decorator and then return wrapper. Except you put @wraps on it, so the name of this function isn't even "wrapper" anymore. So what was the point of naming it "wrapper" when you could just do `return wraps(func)(lambda ...)` if lambda wasn't so underpowered.
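For readers who haven't seen it, the idiom in question looks roughly like this (a made-up logging decorator):

    from functools import wraps

    def logged(func):
        @wraps(func)                        # copies func's __name__/__doc__ onto wrapper
        def wrapper(*args, **kwargs):
            print('calling', func.__name__)
            return func(*args, **kwargs)
        return wrapper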


The name of the function when you read the code is still wrapper. The point of naming it is that it allows you to refer back to it, as well as provide the reader with a clue to what it does. This way, Python attempts to reduce nesting, which hurts ease of reading. You see this a lot in JavaScript, with all the anonymous functions being passed around. Besides making your stack trace hard to parse, it also makes it hard to follow whether you're currently reading a named function, or an anonymous function that's being passed as a parameter inside another anonymous function, etc.


Then give it a stupid name. "helper_func", "inline()", or even "_a". If you think the name isn't important, then don't waste your time on it.


> You don't give a name to every single value in your program

There are lots of times I name single-use values because I want to throw it to the logger before I pass it to whatever function is going to use it.


> Why is it such a problem to move your closure to its own line and give it a name?

That's actually one of my bigger beefs with Python: it forces a large naming burden on the programmer. When writing python code I find myself struggling to name intermediate results or stupid functions that should just be lambdas. Very very often those results don't warrant a name, or are unnameable. Also, naming something implies it will be useful in another context, which one-off lambdas rarely are.


> Also, naming something implies it will be useful in another context, which one-off lambdas rarely are.

Not necessarily. I find it helps readability to do this:

  def prune_indices(indices, cutoff):
    def match_old_index(index):
      return index.timestamp < cutoff

    old_indices = filter(match_old_index, indices)
    # ... do things with old_indices
as opposed to using an anonymous lambda. That doesn't mean I have to reuse it outside of this function.
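For comparison, the anonymous version being argued against would be roughly:

  old_indices = filter(lambda index: index.timestamp < cutoff, indices)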


> Also, naming something implies it will be useful in another context, which one-off lambdas rarely are.

Naming something in a local scope doesn't imply that it will be useful other than at least one place on that local scope. In fact, it's a pretty clear indication that you don't expect to need it anywhere outside of that scope.


Whenever I have difficulty thinking of a good name for an intermediate step, I start to worry that my algorithm is unnecessarily complicated.


>Very short-sightedly written. It sounds like the author just wants a language with a different philosophy

Which is not a priori bad to want, especially if a language has a broken philosophy (or partially broken) to begin with.


The only way I see a philosophy as being broken is if it is founded on false premises, or is internally inconsistent (or do you see a different way?). Which is the case with Python, and why?


Instead of philosophy what you say applies more to logical statements (they are broken if they are founded on false premises or are internally inconsistent).

A philosophy is more malleable and ad-hoc than axiomatic logical statements, and it can just be bad because it doesn't offer satisfactory solutions to the problems it claims to have tackled, or because it sidesteps certain things, etc.

But even that is taking the "broken philosophy" allegation too literally. The author simply means that Python would be a better language if it offered more capable closures and more easy access to functional programming.

And why one might argue about the "better", there's no arguing that Python would be more expressive if it did so.


> And why[while?] one might argue about the "better"

This is my point. Python's philosophy isn't "bad" in a universal sense (let alone broken). It's just that the author has a different philosophy for programming languages himself. And I find it short-sighted to regard other philosophies as "bad" or "flawed", simply because they do not match your own.


Yes, I meant to write "while" (my comments are usually hastily written while waiting some build to finish and full of typos).

>And I find it short-sighted to regard other philosophies as "bad" or "flawed", simply because they do not match your own.

Sure, but isn't an "all philosophies are equal" stance also misleading? Some philosophies can be inherently worse than one's own.


A philosophy should also be considered broken if it is not useful for any purpose.

By extension, it could also be considered broken if it is needlessly less useful than it could be for its intended purpose.


Well, perhaps that means the right solution is not to complain about the language but to switch to Ruby (or Crystal) instead.. :P


Wouldn't that come with its own can of worms?

At some point, you try to fix what you think is already closer to what you want, or what you already have legacy code using.


Python's semantics are unlikely to ever be fast and most Python users have already worked around its speed issues.

asyncio is new; python3 porting is happening. It seems unfair to complain that no real improvements are being made and also complain that these new things are immature.

reduce being pushed behind an import is stupid, but it's only one import.

lambda is fine, if you're writing in functional style you're using expressions for everything anyway. And you will probably be more persuasive if you can make your points less offensively.

For "bag of data" classes look at attrs.

(All that said, having found Scala I don't miss Python at all. (Except when writing a desktop GUI - PyQt was really nice))


Here's a thoughtful critique of asyncio, posted some time ago on Hacker News: http://lucumr.pocoo.org/2016/10/30/i-dont-understand-asyncio...

Yes, attrs seems like a good solution for plain-old-data classes. https://attrs.readthedocs.io/en/stable/
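A minimal sketch of what attrs buys you for a plain-old-data class (using its attr.s/attr.ib API):

    import attr

    @attr.s
    class Point:
        x = attr.ib(default=0)
        y = attr.ib(default=0)

    # __init__, __repr__, __eq__ and friends are generated for you:
    assert Point(1, 2) == Point(1, 2)
    print(Point(1, 2))   # Point(x=1, y=2)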


>lambda is fine, if you're writing in functional style you're using expressions for everything anyway.

No, I /really would/ like to be able to write:

foo.on_click(lambda: x += 1)

The language not supporting this (when most others do) is just silly.


> foo.on_click(lambda: x += 1)

Mutation in lambdas is not compatible with your complaint about "inadequate support for high-level functional programming", as it goes against the principles of FP.

You can't do that in Haskell either, and any FP purist would blanch at a statement like that.


> You can't do that in Haskell either, and any FP purist

What about us FP pragmatists? Also in some domains I'd argue being a FP purist is the most pragmatic option.


>You can't do that in Haskell either, and any FP purist would blanch at a statement like that.

Obviously you've never met State and/or lens then.

Yes, you can do it -- and no, I never said it should follow pure FP principles.


But if you write a State equivalent of "x += 1" in Python then you can use it in a lambda too.


Are you saying that a non-mutating variant of the above works? For instance:

   foo.on_click(lambda:x func(x, whatever))


yeah, except that the syntax is

  lambda x: func(x, whatever)
In fact, if x is an object that stores a mutable numeric cell, you could even do a mutating:

  lambda x: x.add(1)
Mutation isn't prohibited in Python lambdas; statements, however, are (and assignments are statements rather than expressions in Python).
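Concretely, sticking with the thread's hypothetical foo.on_click API, the usual workarounds look like:

    class Cell:
        def __init__(self, value=0):
            self.value = value
        def add(self, n):
            self.value += n

    x = Cell()
    foo.on_click(lambda: x.add(1))             # a method call is an expression, so this is fine

    # or lean on a mutating method of a builtin container:
    clicks = []
    foo.on_click(lambda: clicks.append(None))  # len(clicks) is the count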


It is amusing that your lambda usage could be used by the Python community as a classic example of the problems that can occur if they allowed what you want.

Mutating variables in a lambda is something abhorred in many functional languages.


Try using some other language that is functional, like Lisp or Haskell, maybe. Python's goal is not to be a functional language, and that's okay.


That seems like an exceedingly error-prone thing to write. "x +=1" isn't a value and the unmanaged mutation will be surprising when it happens. On a Python implementation with parallelism (e.g. Jython) you could very easily end up losing updates - on CPython the GIL will probably mean your code accidentally doesn't exhibit that problem, but that doesn't seem a very desirable way to code.


What is the proposed alternative? (using a named function or method just seems to be semantically the same, just more characters)


Something like functional reactive programming style, where you explicitly define a pipeline from foo.click to x and explicitly gather together all your pipelines (explicitly defining the interleaving semantics rather than just "whenever an event happens it happens") and run them in one place.


    foo.on_click(partial(q.append, foo))
The nice thing about ``append`` is it's atomic (for builtins).


Where does the update of `x` go in this example? By observing a Queue or Stream `q`?


If it must be mutable state, then ideally you'd have a single consumer thread for counting off that queue to avoid worrying about locks.

Or maybe what's in the queue gets written to a permanent log and ``x`` is a query of that log, a la Datomic.
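A sketch of that single-consumer pattern (again using the thread's hypothetical foo.on_click):

    import queue
    import threading

    events = queue.Queue()
    x = 0

    def consume():
        global x
        while True:
            events.get()     # blocks until a click event arrives
            x += 1           # only this thread ever mutates x

    threading.Thread(target=consume, daemon=True).start()
    foo.on_click(lambda: events.put(None))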


Huh. In perl (and javascript) x += 1 is an expression that returns the new value of x.

I guess this is another "simplification" along with the magic scoping model (I want perl's my/ES6's let damnit sulk) ...


x += 1 should be a statement rather than an expression. It's not just a value, evaluating it has an effect.


Thank you for showing me attrs.

I love namedtuple, but this seems like a better implementation for light-weight classes.


I don't understand the desire to turn python into a high performance language. It's 2017, if you want performance just write some go/cpp/rust. If you want to leverage an old and very mature concurrency framework, use elixir/erlang. If you need a giant data integration framework, use java. If you want a stellar bash replacement, use python. Know your tools, don't bloat them with unnecessary crap. (The list was not intended to be exhaustive or pick the best, just to illustrate that we have tools for each type of job).

The idea of single-language buy-in has always perplexed me.


Because other high-performance languages have been improving their readability and expressiveness.

Personally, my tool of choice is C# right now. I find it quite readable, and every bit as expressive as python - even moreso, plus it has the performance advantages of being designed from square 1 as a compiled language instead of an interpreted one.

It has async/await, it has functional features that Guido hates, it has performance, and it's quite legible ever since C#3 included type inference and you can just write "var" all over the place. It has some warts, but the warts are worth it.

Imho, python has stagnated. It's still a useful, wonderful language and I enjoy working in it when I have to, but I never find myself choosing python for new projects, and I don't see that ever changing.


But then, if you want to write a script to convert a bunch of files from one format to another, will you start a whole C# project?

If you want to quickly do a mathematical calculation with matrices, then plot and visualize it, will you use C#?

"Tool of choice" is misleading. Maybe you are a webdev. Maybe you write GUI applications for a living. I don't think there's clear "Tool of choice" for all jobs.

The job dictates the tool of choice.


For one-offs I use CShell so I can have a REPL (good way to make unit tests, too, imho)

http://cshell.net

For reusable scripts I make a command-line app. I admit the project structure is a bit heavy weight for that but I have visual studio open all the time anyways.


Python has async/await since Python 3.5. I recall Guido acknowledging C# in a Pycon keynote. I don't think Guido hates (or misunderstands) functional features, but rather prefers Python to be something different from what you want.


It's nice that you like C#, but I don't see how that's related to the parent comment. You certainly cannot argue that C# does a better job at all mentioned tasks than the languages the parent listed.


I was using C# as an example of a language that now, as it has developed more features, lets you have your cake and eat it too. Many such languages exist.

Python seems overspecialized and stagnant to me. It prioritizes readability and simplicity, but that often results in weird workarounds that are even less legible and simple than a more expressive language would have.


Is it nil or null or None?

Is it -eq or == or ===?

Can it be undefined?

Is it / or //?

Is it "".format() or "#{}" or "${}"?

And that's before we get to language idioms, common libraries, package managers, build tools, lint tools, compilers or VM configurations, etc.

There's a reason people try to use a language in as many areas as feasible.


A lot of the author's complaints, especially near the end, are personal preferences, which explains why these "improvements" were never added. Not everyone would prefer a move to a significantly more functional style.

Arguably this difference is what caused Coconut to be made in the first place.

There's ample discussion on these topics, too much to simply brush existing decisions off as "Incompetence? Politics? Both? Who knows."

For the fifth point, `list.index` works fine in Python 2 and 3 for me.


>Arguably this difference is what caused Coconut to be made in the first place.

Sure, that and it's far easier to write a new language and transpile than it is to fork and modify existing implementations.

>There's ample discussion on these topics to simply brush existing decisions off as "Incompetence? Politics? Both? Who knows."

If you would like to link to such discussions I would not hesitate to add them as footnotes/amendments.

>For the fifth point, `list.index` works fine in Python 2 and 3 for me.

Sorry, mixed up `index` and `find`. `[].find` is not a function, but `[].index` is. Thanks for catching that, I keep confusing them (another minor annoyance of having both).
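For reference, the asymmetry that causes the mix-up:

  >>> [10, 20, 30].index(20)    # lists only have index(); raises ValueError if absent
  1
  >>> 'abc'.find('b')           # strings have find(), which returns -1 if absent...
  1
  >>> 'abc'.index('b')          # ...and index() too, which raises instead
  1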


Oops, I didn't expect this to blow up, thanks for responding.

Here's a discussion about switch case (I was also looking for one the other day, but in my case, it was purely for optimization).

https://www.python.org/dev/peps/pep-3103/

And the wiki, for example, talks about the GIL

https://wiki.python.org/moin/GlobalInterpreterLock

I remember reading articles from here about that a few times, but can't find them now. If you are still interested, I can try to dig it up.


It is hard to accept input from people who don't appreciate that any technical decision requires understanding the tradeoffs. It is true that Python is not perfect, but perfection was never a goal. Python has made some tradeoffs, just like every other technical system. Also, just like any other technical system, people are working towards improving it in a specific direction. The only way to change or evolve that direction is to be part of the community, understanding it, and then influencing it. Abusing the community and calling it silly is plain stupid. The author's goal is clearly not influencing a change but self-aggrandising and proving himself right.


Influencing the Pythons of this world is generally a waste of time; just go use that which, today, works the way you want, or make it yourself if it doesn't exist.

Anyway, are you saying that knowledgeable people who think Python is junk shouldn't say anything? Only get involved in Python development or shut up?

If someone's words can save just one person from using Python, that's worthwhile.


Nah, I would love people to use Python -- just to help improve it as well (either through libraries, alternative languages or submitting changes through official channels.)

Saying I am "self aggrandising" is disingenuous and misleading at best.


It's definitely not perfect, though ultimately most trade-offs in Python come down to readability. I once went down the rabbit hole (3 library iterations) of overloading operators to enable a very shell and pipe oriented syntax, only to later realize how much harder my 6-month old code was to read even for me. So I've come to appreciate Guido's experience for the trade-off between expressive power and readability.

For instance, in the things you suggest, reduce used in its most straightforward manner is readable, but it can also be used to enable some of the nastiest, most head-scratching one-liners. Lambda is useful for small things like callbacks, but readability should lead you to a real function with a descriptive name sooner rather than later.
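An illustrative example of that trade-off, counting words over a hypothetical `words` list:

    from functools import reduce

    # terse, but takes a minute to decode (and rebuilds the dict on every step)
    counts = reduce(lambda acc, w: {**acc, w: acc.get(w, 0) + 1}, words, {})

    # boring, readable, and faster
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1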


Agreed. I came from Perl to Python.

Perl does feel more powerful and expressive, but it often gives you enough rope to hang yourself whereas Python doesn't.


Though I likely haven't completely understood each of the author's gripes, each problem to me seems to have a notable solution provided by Clojure (with the exception of tail-call optimization).

Clojure:

is compiled to JVM byte-code and is fast.

has a good parallelism story (parallel map, parallel fold, channels)

is almost completely backwards compatible.

has a sequence abstraction that leverages the same operations over many different types (string, vector, list, set, map, etc.).

has a standard compose function.

has reduce, map and filter in the standard library. Transducers (also first class) further extend their usefulness.

's closures allow statements and there's even sugar for anonymous functions.

has macros to reduce boilerplate.

has a conditional macro.

I'm sure this list isn't unique to Clojure but I'm most familiar with it.


All of the above is good except for transducers. They are a necessary hack in Clojure: because the data is immutable, running a large number of functions across changing data is rife with overhead. So the hack is to mutate the code many times, so you only have to mutate the data once.


I see what you mean and although the definition of a from-scratch transducer looks a bit ugly to me, the idea of composing existing transducers together seems rather elegant.


I find myself nodding to everything on the list.

I use python all the time. The lack of first-class anonymous functions is plain irritating.

I would add to the list that nonlocal is horridly limited, that I want to be able to do do-while loops, and that I want to be able to exit from nested loops more cleanly, and such.
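For the nested-loop case, the usual workaround (short of flag variables or exceptions) is to move the loops into a function so a return exits all of them at once, e.g.:

    def find_first(grid, target):
        for row in grid:
            for value in row:
                if value == target:
                    return value    # leaves both loops in one step
        return None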


I like this list a lot. I also find myself raging at awful lambda and the lack of switch. No, it wouldn't make the language any less "pythonic" to make them useful.


IMHO switch is a horrible construct; I'd rather see ML-style pattern matching.


Pattern matching without TCO would leave me feeling deceived


Switch makes sense in really low level languages like C where it becomes a branch table (and allows things like Duff's device), but it has no place in higher-level languages. Fallthrough, while very occasionally useful, is bug prone and weird (breaks the "principle of least surprise" in a major way). Python made the right call not including it.


So do it like "match" in Rust. Fallthrough certainly seems un-Pythonic. It seems worthwhile to ditch it in favor of pattern matching.

It also seems un-Pythonic to have to implement something in a less-obvious way. Choosing an action based on one item from a set of possible inputs means either a dict that maps to lambdas or function variables, or a big chain of if-else. Neither of those options is optimal.
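A sketch of the dict-dispatch workaround mentioned above (hypothetical handler names):

    def handle_get(request): ...
    def handle_post(request): ...

    HANDLERS = {
        'GET': handle_get,
        'POST': handle_post,
    }

    def dispatch(request):
        try:
            handler = HANDLERS[request.method]
        except KeyError:
            raise ValueError('unsupported method: %s' % request.method)
        return handler(request)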


I'd love to have all those problems fixed. I constantly bump into exactly those things myself, and I believe a majority of developers would find the language better with remedies than without. Especially the trivial things (like flatten, moving reduce back out from functools, lambda state and so forth) take no time - it's just politics.


The one problem I have with Python and would like to solve is to be able, from within a request-rendering function/method of my web application (think Flask), to run something like:

    handle1 = call_webservice_1(args)
    handle2 = call_webservice_2(args)
    handle3 = call_webservice_3(args)
    (realres1, realres2, realres3) = wait_until_timeout(500, handle1, handle2, handle3) 
    # here, I have my results in realresX or None if timeout
    # the call_webservice_X would be non blocking
This way I can dispatch my requests to my backends and degrade gracefully if one request fails within the time. I was doing that in PHP using zeromq to send the requests and listening to the answers with a unique id on each request, but now I would prefer to stay with an HTTP based protocol to communicate with my backends.


    import asyncio

    from aiohttp import web


    async def my_handler(request):
        handle1 = call_webservice_1(args)
        handle2 = call_webservice_2(args)
        handle3 = call_webservice_3(args)

        # wait() takes an iterable of coroutines/futures; the timeout is in seconds
        done, pending = await asyncio.wait([handle1, handle2, handle3], timeout=0.5)

        return web.json_response({'finished': True})


    app = web.Application()
    app.router.add_route('GET', '/test/', my_handler)

    loop = asyncio.get_event_loop()
    handler = app.make_handler()
    loop.run_until_complete(loop.create_server(handler, '0.0.0.0', 8080))
    loop.run_forever()
Done. Not flask, but if you need to make lots of parallel network calls during a web request why the hell are you using flask?


That's really helpful. Can you make an example which doesn't require making the my_handler function 'async'? For cases when you don't want to make an entire async stack, you just want to slot some async code into your existing code.

If the answer is "to get some async goodness, just use this easy code, plus rewrite your entire project to use a different framework and set of libraries", then we are only fooling ourselves.


You need to use the asyncio (or equivalent) event loop if you want to use the asyncio module. loop.run_until_complete() is synchronous though, so you would simply call that and it will block the control flow despite that function being async. You can definitely mix it with legacy code.

I would recommend against it, but if you had an existing framework, you could just make the endpoints lambdas that are something like:

    app.route("/whatever", lambda: loop.run_until_complete(async_handler_function()))


^ this, but be aware that it's only worth it if you do > 1 external call in parallel. I.e. this is pointless:

    res = await get('https://somesite.com')
    return Response(res['data'])
As you'd get the same thing if you just did it synchronously (without the await). But when you want to fetch 2 or more pages in parallel, that's when it really pays off.
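
For instance, a rough sketch of the parallel case (assuming aiohttp on the client side; the URLs and function names here are just placeholders):

    import asyncio
    import aiohttp

    async def fetch(session, url):
        async with session.get(url) as resp:
            return await resp.text()

    async def fetch_both():
        async with aiohttp.ClientSession() as session:
            # both requests are in flight at the same time
            return await asyncio.gather(
                fetch(session, 'https://example.com/a'),
                fetch(session, 'https://example.com/b'),
            )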


Yeah, absolutely. I am not a fan of mixing synchronous and asynchronous code, but the design of asyncio makes it very easy to do. I think that most people struggling with the concept don't realize that asyncio is inherently blocking when it's being used (well, with the caveat of run_in_executor, but that's best left ignored for the purposes here).


Sure, you can make an async function and call `loop.run_until_complete` in your handler.


I am using Flask because, you know, a long string of development with Flask over the years; your application grows and then you hit some walls. I do not want to rewrite everything just for a couple of views within my application.

Flask with gunicorn has its own event loop and asyncio is another one, and in the end it is hard to be sure whether something is working because I am lucky or because it is the right way to do things.

Anyway, thank you for your example, it is really clear and easy to understand! Maybe we need a Flask-like framework which is asyncio based. Which means I will need to upgrade the code to Python 3 :)


If you haven't already, check out sanic [1] which is powered by uvloop [2] which itself is pretty amazing.

[1] https://github.com/channelcat/sanic

[2] https://github.com/MagicStack/uvloop



Quick two minute implementation:

    import multiprocessing
    
    import time
    
    
    NULL = object()
    
    
    def call_webservice(fail=False):
        if fail:
            time.sleep(10)
            return False
        else:
            return True
    
    
    def wait_until_timeout(timeout, *async_results):
        results = [NULL] * len(async_results)
        end = time.time() + timeout
        while time.time() < end:
            for index, result in enumerate(async_results):
                if results[index] is NULL and result.ready():
                    results[index] = result.get()
            if not results.count(NULL): break
        return tuple(None if result is NULL else result for result in results)
    
    
    def handler():
        with multiprocessing.Pool() as pool:
            handle1 = pool.apply_async(call_webservice, (True,))
            handle2 = pool.apply_async(call_webservice, (False,))
            start = time.time()
            realres1, realres2 = wait_until_timeout(0.5, handle1, handle2)
            duration = time.time() - start
            print(f'Took {duration:.3f} seconds')
        print(realres1, realres2)
    
    
    if __name__ == '__main__':
        handler()


You could probably do that fine with coroutines and/or asyncio.


I think there are many people out there who use Python because it's practical and useful to them, but they don't love it, and that is completely fine. Anything good is a compromise between different groups, and Python is no exception to that rule. I think Python does a decent job at appeasing both functional zealots and object-oriented fanatics.


>Parallelism is very bad on CPython and PyPy;

The GIL was added because the early Python libraries were not written to be thread-safe. This was a terrible oversight and frankly should have been corrected at some point. The underlying implementations use pthreads. It's actually worse, because as multi-core devices came out, the "lock-thrashing" behavior of the GIL got worse. Rather than fixing the problem we have 'multiprocessing'. I still don't understand why this hasn't been tackled.

> Quite a few legacy projects are written in Python 2, and it can take some work to port them. This is particularly a pain for libraries where I expect to pip install them and have them "Just Work".

Python 3 was DOA. I don't want to be overly critical here, but there was no compelling reason to switch because Python 3 didn't have anything fundamentally more interesting than Python 2. It didn't really fix any of the serious language issues (like the GIL). It was almost like the Python version of Windows Vista or IPv6. People want you to switch, but meh.

>The BDFL himself, Guido van Rossum, has infamously declared that he does not like functional programming

And now you've come to the heart of the matter. There have been some amazing tweaks of Python (stackless, pypy, twisted greenlets), some of which people have attempted to merge into the greater Python. Most of those attempts were rejected. At some point people give up and walk away. For better or worse, Python is Guido's language. Take it or leave it. No switch statement for you, buddy.

I think Python is a fantastic language. It has become ubiquitous. For all its warts, its lack of change has probably helped its adoption. Literally EVERYONE writes Python code. Network guys, sysadmins, even your manager (or your manager's manager) probably has some Python code stashed away somewhere.

However, I feel that Python is not doing enough to catch up. Print as a function does nothing for me. There are so many, many, many, many warts (which I won't get into), and yet Python seems to be polishing the chrome rather than fundamentally fixing its core problems.

I want to use and love python but it's become "the devil you know" so to speak. I've lost hope that python will adapt to the future and have put my bets elsewhere.


Where are your bets now? @.@


The biggest problems with Python are packaging and distribution.

It would be nice to create a single file and be able to send it to someone, like golang which even has cross compilation.

Mobile support, you can't easily write a mobile application in Python.


>Mobile support, you can't easily write a mobile application in Python.

Depends on your needs, but there is at least Kivy.


My perspective on this is that this is a remarkably short list of complaints for a programming language, all things considered.

Python is certainly not perfect, but a similar list of pet peeves and grievances for, say PHP or Java would easily be 10 times longer even under the most charitable interpretations.


The problem I have with Python is that for loops don't have their own scope; only functions do.

Add that to the lack of variable declarations (even optional ones, a la my in perl, var in javascript), and it gets hard to work out what the scope of any given variable actually is.

Surprising example:

  fns = []

  for n in [1,2,3,4]:
    def fn():
      print(n)
    fns.append(fn)

  for fn in fns:
    fn()


But doesn't this just happen because n is a pointer? What would you expect it to print? 1,2,3,4?


In Lua it prints 1,2,3,4. It has to do with each loop iteration behaving as if it declared a different variable instead of sharing the same variable across the loop.

Anyway, the problem they were talking about is clearer when you are closing over stuff other than the loop variable:

    fns = []
    for n in [1,2,3,4]:
        x = n*10
        def fn():
            print(x)
        fns.append(fn)


Yeah... so I agree that's confusing. It's akin to:

    def x(y=[]):
        y.append(1)
        return y

    for z in range(4):
        print(x())

For your case, I recommend using partial functions, which were created for this type of issue. I think it's also cleaner than closures where x depends on an outside context.
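
A minimal sketch of that suggestion (hypothetical names; functools.partial binds the value at definition time instead of closing over a variable that keeps changing):

    from functools import partial

    def show(x):
        print(x)

    fns = []
    for n in [1, 2, 3, 4]:
        # bind the computed value now; no late-bound closure over x
        fns.append(partial(show, n * 10))

    for fn in fns:
        fn()  # prints 10, 20, 30, 40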


The obvious solution to that is to break your code into smaller functions.


> Quite to the point, lambdas (anonymous closures) in Python are gimped. They are single-expression functions, which means no statements, even global/nonlocal qualifiers.

I remember once on rosettacode I wanted to write a Runge-Kutta function in Python with a lambda. I was stopped by the lack of variable assignment, until I remembered that they can be emulated by nesting function calls:

    def RK4(f):
        return lambda t, y, dt: (
            lambda dy1: (
            lambda dy2: (
            lambda dy3: (
            lambda dy4: (dy1 + 2*dy2 + 2*dy3 + dy4)/6
            )( dt * f( t + dt  , y + dy3   ) )
            )( dt * f( t + dt/2, y + dy2/2 ) )
            )( dt * f( t + dt/2, y + dy1/2 ) )
            )( dt * f( t       , y         ) )
https://rosettacode.org/wiki/Runge-Kutta_method#using_lambda


... Yeah, that's gnarly. :D That is emulating `let` using lambdas, though, and not mutable assignment. Still useful if you really want to nest them, but still immutable.


I don't know, isn't it possible to do the equivalent of mutable assignment if I use the same variable name several times?

For instance for the equivalent of x = 3; x = x + 1; print(x):

    (lambda x: (lambda x: print(x))(x+1))(3);


Personally I'd love to see pattern matching / de-structuring of dicts and strings:

    foobar = "foo{}".format("bar")
    foo = "{}bar".unformat(foobar)

    def dict_returner():
        return {'foo': 1, 'bar': 2, 'foobar': 3}

    {'foo': newvar1, 'bar': foo} = dict_returner()
I find it strange tuple unpacking exists, but not anything equivalent for dicts - I can't see that it would be horribly inefficient? Especially considering that one would be unlikely to use it with more than a few keys.

An extension to that providing a set of keys would be nice, too:

    foo_and_bar = dict_returner(){'foo', 'bar'}
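
For what it's worth, a rough approximation that works today (just a sketch with operator.itemgetter, not the proposed syntax):

    from operator import itemgetter

    def dict_returner():
        return {'foo': 1, 'bar': 2, 'foobar': 3}

    # pull several keys out in one go; the result order matches the getter
    newvar1, foo = itemgetter('foo', 'bar')(dict_returner())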


> The standard interpreter being rather slow;

I'm tired of this one.

In the last 13 years, 97% of the projects I worked on didn't need Python to be any faster; it was not the bottleneck. The remaining ones could leverage some solution to bypass the problem. Python's speed is indeed an issue for a few people, but it's not the red flag I keep reading about here and there.

I've been hearing this argument forever. PHP is slow. Java is slow. The first one powered the Web for 10 years; the second one is the most used language in the world. A lot of the time this argument is like hearing "I want a pony".

Actually, the rare people I've met who really needed speed never complained. They are usually hardcore professionals, and are already working on solutions.

Let's now talk about solutions.

Python is an interpreted and very dynamic language. It's tough to speed up. If you look at the C code, you'll see the Python VM is quite well optimized already.

Now the author says:

> no real improvements are being made for some reason

But there have been:

- psyco

- unladen swallow

- stackless

- numpy and a lot of compiled extensions

- pypy

- pyston

- pyjion

- nuitka

- cython

- numba

People ARE actively working on the problem. It's a HARD problem, which is why we don't yet have a definitive solution. And a lot of the people working on it are not paid for this.

Yeah, JS became faster. You know how? Google spent millions and hired a bunch of geniuses just to do it.

In 2011, the Python Software Foundation had $750,000 to spend for the whole operation, including maintaining PyPI, the documentation, the official website, the conferences they run and the various grants they provide. Even the few devs that are paid to work on Python (e.g. Guido) have to do it only part time.

So the author worked with Python for 10 years. He made a living out of exceptional free software and complains about a problem he may not even have, while people are working their asses off to solve it. And he writes an aggressive rant about it.

> Parallelism is very bad on CPython and PyPy

Yes, again, this is a HARD problem. Python is very old. Older than Java. We only got multi-core recently. We can't destroy single-core performance to get multi-core, and we have to maintain a legacy code base.

We also have:

- a good multiprocessing story;

- 2 good async stories;

- tooling to pre-spawn, manage and scale processes;

- tooling to create task queues.

So while the community is trying, for free, to solve the problem, we have solutions. It's not perfect. But again, what's the point of complaining like a hungry child at 4 o'clock, unhappy that it's not yet dinner time?

> asyncio does not seem very well integrated, and does not seem as useful as libraries like eventlet. They seem to have wanted to reinvent Twisted, but did so half-assed and did not include useful protocols (Twisted has line-based protocols, HTTP, etc. built in and easily subclassable.)

What is he talking about? We just got it. How do you expect it to be well integrated yet?

And Twisted is a framework (a very hard to use one) while asyncio is a low level lib.

eventlet doesn't let you choose where to switch context, it's basically like threads. We already have threads.

> Quite a few legacy projects are written in Python 2, and it can take some work to port them. This is particularly a pain for libraries where I expect to pip install them and have them "Just Work".

When was the last time that didn't happen for anybody?

Seriously:

http://py3readiness.org/

I've been coding in Python 3 for the last 2 years. It happened twice. Both times I was able to convert the code base in a few minutes. I said minutes. Not hours.

> There is an official tool 2to3 which does not work in all cases.

And the brake in your car doesn't work in all cases either. Still, it's a nice brake.

Plus you've got six and python-future. Converting any pure-Python code base is not hard. Compiled extensions are harder, but my guess is the author never needed to write one.

And I'll say it again...

People have had 15 bloody years to migrate. It's not like JS tools breaking every 3 months. It's not like PHP skipping version 6 or Perl taking 10 years to get to v6.

No. Python 3 arrived quickly after many warnings. Then tools, tutorials and a looooooooooooot of time have been provided.

This is nowhere Python's fault. It's the best damn migration story I've ever witnessed in my life.

My only grudge on Python 3 is that it didn't break ENOUGH. I wished for stuff to have changed more.

> The standard library is sometimes inconsistent

One of my pet peeves as well.

> The BDFL himself, Guido van Rossum, has infamously declared that he does not like functional programming (odd, considering the language is built around FP concepts), and that map/reduce/filter should not be in the language. Well -- in my opinion that is a grave mistake, but more importantly the language suffers.

The author doesn't like the style of the language. So it's a matter of taste.

Well I like it that way.

Now what ?

> reduce is now tucked away inside the functools module (as of Python 3), even though it is the only one of map/filter that is not replaceable by list/set/dict comprehensions! Yet map and filter are still in the base global environment. What sense does that make?

Yes it does, because reduce is seldom used. Grep GitHub and you'll see. map and filter are still in the built-ins because people like the author complained a lot on the mailing list.

Yet the majority of code bases I read, including most of the libs I use (I spend a lot of time reading the contents of my site-packages), don't use map/filter since we got comprehensions.

> I often find myself reimplementing flatten as flatten = lambda xs: itertools.chain.from_iterable(*xs)

Use comprehensions to flatten. Learn your language, for van Rossum's sake!

(y for x in xs for y in x)
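
For reference, that generator expression flattens one level:

    >>> xs = [[1, 2], [3], [4, 5]]
    >>> list(y for x in xs for y in x)
    [1, 2, 3, 4, 5]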

> The lack of tail call optimization in most implementations makes writing tail recursive algorithms rather pointless, unfortunately, even when they may be more legible than their iterative counterparts.

> There is no standard way (even in functools) to compose functions. There is partial application via functools.partial, at least...

Again, it's because recursion is not encouraged in Python. It's the philosophy of the language. One can dislike it, but it's not a Python problem, it's a Python decision.

I stay with Python precisely for this. Every time I go read functional-heavy code, it's hard to read. I'm an expert coder and trainer. I'm paid up to 900€/day. Most code should be easy to understand given my experience. When it's not, I consider that a bug.

Functional lovers write smart code. I hate reading smart code. I want code that is easy to debug.

If you really need TCO, like when implementing a state machine, there are solutions:

http://neopythonic.blogspot.fr/2009/04/final-words-on-tail-c...

Not as elegant, but good enough, since it's a rare occurrence when you do need it. Again. Rare.

The language is optimized for regular use cases and readability, not smart formulas.

> Lambda is awful

Lambda is wonderful. It keeps people from writing huge inline callbacks like they do everywhere else. It's the best decision Guido ever took.

With lambda + decorators + list comprehensions, the need for multi-line callbacks is not huge.

You want more ? Write a regular function. How hard is it ?

It's not hard. So eventually it's a matter of...

... wait for it ...

taste.

I would have liked a shorter keyword though. But I can live with it.

> Inadequate data modelling facilities

"Inadequate data modelling facilities" because classes are verboses ? Overkill title much ?

Besides, if you just need a container, you use a dict in Python. Not a class. At most you use SimpleNamespace:

    >>> from types import SimpleNamespace
    >>> SimpleNamespace(a=1, b=True)
    namespace(a=1, b=True)

But again this is "pony"-worthy complaining.

I do think classes are too verbose in Python (I use the attrs lib because of this). But this is childish.

Algebraic data types and the whole dunder-methods-vs-interfaces question are more interesting debates.

> Lack of switch (or match)

> No, dicts with lambdas (see above) are not a replacement. No, long if-else chains are not a replacement. I want a nice way to match on data (preferably richly -- as with ADTs, ranges, ...) and associate matches with logic

Yes they are, for switch. Switch is the most overrated statement after goto. It's unneeded, as you can express its logic perfectly well without it. And again, "rare use case". There is nothing wrong with a bunch of ifs or a dict.
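
(For the record, the dict version looks something like this; just a sketch with made-up handler names:)

    def handle_get():
        return 'read'

    def handle_post():
        return 'write'

    def handle_unknown():
        return 'unsupported'

    # dispatch table instead of a switch statement
    handlers = {'GET': handle_get, 'POST': handle_post}

    method = 'GET'
    result = handlers.get(method, handle_unknown)()  # 'read'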

Now for match, it's a different story. Pattern matching would be a nice addition for Python IMO. But again things like:

> Please do not suggest awful hacks to do this, and fix your language instead.

Is arrogant and ignorant.

The mailing list has been discussing it for years. It hasn't happened because there is no such thing as a magic way to make everybody agree, then implement it and maintain it for free.

Things have costs. People have tastes. Code bases have legacy requirements.


"No. Python 3 arrived quickly after many warnings. Then tools, tutorials and a looooooooooooot of time have been provided. This is nowhere Python's fault. It's the best damn migration story I've ever witnessed in my life. My only grudge on Python 3 is that it didn't break ENOUGH. I wished for stuff to have changed more."

I think you wrote a good comment, thanks! But regarding 2 to 3, I think you got this wrong. I think a more gradual transition would have helped - people ended up putting off the porting work, which was easy because Python 3 was installable in parallel and nobody was really using it, and then it really went dead for some years, which I think was counter-productive for everyone.

I think in general it's better to keep some compatibility glue code around until most people have migrated instead of letting a let's-clean-this-shit-up! frenzy prevail.

(Now hindsight is everything, etc. etc.)


The problem I have with this theory is that the JS community and Ruby community had big breaking changes, and the devs told them to fuck off. The community adapted quickly.

Python took care of the community, giving time, tools and docs. And nobody moved. But they surely complained a lot.

What does that say ?


In the case of Ruby, it's because the vast majority of the Ruby community is tied to a single framework. Ruby developers go where Rails goes.


But this just highlights the case with JavaScript even more. In JavaScript nobody is tied to any framework, and they can easily leave for another browser at any time. Heck, with jQuery, it doesn't even matter if you're writing ECMAScript 3 or 6, everything still pretty much works the same out of the box.

At the end of the day there's a lot to be said about the transition from 2 to 3, but I think in general the Python community got off easy compared to some of the breaking changes in other language communities.


I think the point many are making here is, if you're going to break compatibility as was done with Python 3, then fix the underlying issues with the implementation.

- Python 2 calls out to C code a lot? OK, rebuild the interpreter in such a way that allows for a JIT. Prevent callouts to C or severely limit them. JITs are mature technology now; many dynamic languages have one.

- Remove the GIL, make it fan out for multicore.

- Add a generational garbage collector; remove the reference-counted one with its timely finalization.

- Add back all the missing features for FP that people are asking for.

If you're going to break compatibility, don't take half-measures, go all the way.


Python is an interpreted and very dynamic language. It's tough to speed up.

It was designed in a way that makes it particularly hard to speed up. It shares some of these flaws with Ruby.


Great answer!

I also wish I were paid to dev in Python :p


now count back from 10 slowly.


One of the biggest Python issues I see is the inability to hide or protect Python source code.

'Compiling' into byte code is easily reversible using pip packages like uncompyle2. Various pip packages offer code obfuscation but from my tests cause problems when running the code. Encrypted bytecode seems to always be decryptable due to the very nature of having an interpreter. Moving Python code into modules implemented in C somewhat works but is time consuming and makes me consider just rewriting everything in C/C++ :(

I would be curious to know how other folks hide/protect Python code? I see this issue as a major barrier to getting Python adopted in paranoid tech companies!


In the last 15 years, I haven't encountered many coders or organizations producing code who think machine code or bytecode is significantly "secret".

The sentiment I mostly encounter is:

Secrets are things that are encrypted.

Compiling (to bytecode or machine code) is just a way to let different kinds of "machines" read the code.

(Paranoia seems to amplify that attitude)


Naively compiled C/C++ is fairly easy to reverse engineer (I say this from a lot of experience!).

If you want to "protect your source code" you need to apply obfuscation techniques to slow down a reverse engineer - but keep in mind that everything ultimately can be reversed and understood given enough time. Plus, many obfuscation techniques can be made applicable to Python code too (e.g. encrypting, obfuscating or mangling Python bytecodes).

The real question is: what are you protecting that is so secret? If it's details about a protocol (network messages, file format or external API calls) those are fairly easy to dissect externally. If it's a proprietary algorithm, someone could blackbox the relevant parts of your code to use in their own application, without even reversing it. If it's proprietary data, client-held keys, etc. there are ways to get at it. Assume that everything you hand a client is no longer secure or private - if you really need to keep secret sauce close to home, make it server-side.


> Naively compiled C/C++ is fairly easy to reverse engineer (I say this from a lot of experience!).

So I assume your position on reverse engineering Python bytecode is that it's trivial.


It can be, since Python is a higher-level language, but it isn't necessarily easier. For one, the state of decompilation technology is much more primitive for Python - the decompilers I've used are more like pattern matchers and break if you even slightly tweak the bytecode or use fancy constructs.

Second, although Python by default outputs plenty of symbolic data to assist a reverse engineer, these can be stripped (just like a C/C++ binary can be stripped), leaving you with a bunch of duck-typed method calls and operations.


> One of the biggest Python issues I see is the inability to hide or protect Python source code.

As with other code (source or object), it is protected by means of law.

If you want to hide it, run a service on a computer you control and sell access to the service.


There's probably not that much code that is so unique and difficult that it could not be reimplemented quickly from a spec. On the other hand, if the code happens to be part of a large system, it gets increasingly difficult to understand and use, from just a code dump without author support.

I submit that those two sets don't intersect much, and that's probably why this issue does not get much attention. Cython might be a solution.


There are decompilers and disassemblers.

Code is not hidden or protected by being transformed into a binary executable format. Companies that believe/rely on that are deluded, not paranoid.


>>I would be curious to know how other folks hide/protect Python code?

With lawyers.

Most of the commercial products that are written in Python come with a EULA that says "don't touch".


I've seen most of the points discussed several times already on the python-ideas and python-dev lists -- that is, at the very least some of the points have merit, and if the author has anything new to add then these lists might also be the place to do it.

> Even weirder, str and list both have find, but list does not have index (a related method).

It is the reverse: both str and list have index() methods; only str has find(). To find out whether an item is in a list in Python:

  if item in your_list:
      ...
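
And for reference, the difference between the two lookups:

    >>> 'hello'.find('x')    # find returns -1 when the substring is missing
    -1
    >>> 'hello'.index('x')   # index raises ValueError instead
    Traceback (most recent call last):
      ...
    ValueError: substring not found
    >>> [1, 2, 3].index(2)
    1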

> class FooNode

There is the attrs package [1], to avoid boilerplate for a mutable analog of collections.namedtuple ("case classes" [2]):

  import attr

  @attr.s
  class C(object):
      x = attr.ib(default=42)
      y = attr.ib(default=attr.Factory(list))

> Lack of switch

It is hard to discuss it without a specific code example from an existing popular codebase that shows the advantages of "switch" statement (to compare the current code and how it looks like with a suggested "switch" syntax). To justify a new syntax you should be able to find dozens of applicable examples easily.

[1] https://pypi.python.org/pypi/attrs

[2] http://www.codecommit.com/blog/scala/case-classes-are-cool


The point about "Inadequate data modelling facilities" is why I wrote Maps: https://github.com/pcattori/maps . Specifically, the "Named Maps" variants provide the same interface as `namedtuple` but for different levels of immutability/mutability.

Feedback/suggestions welcome!


I use the beta version of this library all the time in my code! It was really simple and my code was cleaner, easier to read and write! (link to beta version of library: https://github.com/pcattori/namespaces ). Thanks and I am looking forward to try out Maps now!


This is really cool. NamedDict is a useful thing indeed!


could someone explain to me why

    >>> ranges = [range(i) for i in range(5)]
    >>> [*item for item in ranges]
    [0, 0, 1, 0, 1, 2, 0, 1, 2, 3]
would be better than:

    >>> ranges = [range(i) for i in range(5)]
    >>> [item for subrange in ranges for item in subrange]
    [0, 0, 1, 0, 1, 2, 0, 1, 2, 3]


To be honest, I have a hard time parsing the second one.

I would expect that ranges is on the end, but it is somewhere in the middle.

   [item for item in subrange for subrange in ranges] 
is clearer to me. Then I can scan from left to right. It would read like a pipeline.

Now I need to start in the middle (ranges), scan to the left (for subrange), then to the end (for item in subrange) and then back to the beginning (item). Or something like that; it is hard to follow your own eye movements. :)

Note that I seldom use python and the * pattern is something I recognize from another language, so I am biased. I imagine a seasoned python dev has no problems with the second case.

Btw, does python let you overload those comprehensions? That would be nice.


Less verbose.

Also, what happens if you have even more nested lists you want to flatten?

Personally I think it should just be

    flatten(range(i) for i in range(5))
With flatten in the global namespace


Given that flatten is surprisingly tricky to get right, it really should be built-in. The naive recursive variant will crash python if you nest lists beyond the stack limit, which is very no bueno. A list that's nested 10,000 layers deep is not especially hard to create or store in memory, and a flatten implementation should be able to handle it without crashing the interpreter.

In fact, it's not a bad little programming exercise: making a flatten that performs well and never crashes because of stack overflow.
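
One possible shape of the exercise (a sketch, not a definitive answer: an explicit stack instead of recursion, so deeply nested lists never hit the recursion limit):

    def flatten(xs):
        """Yield the leaf elements of arbitrarily nested lists, iteratively."""
        stack = [iter(xs)]
        while stack:
            for item in stack[-1]:
                if isinstance(item, list):
                    # descend into the nested list without recursing
                    stack.append(iter(item))
                    break
                yield item
            else:
                # iterator exhausted with no nested list found: go back up a level
                stack.pop()

    # a list nested 10,000 deep flattens without blowing the stack
    deep = [1]
    for _ in range(10000):
        deep = [deep]
    assert list(flatten(deep)) == [1]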


I've not tried, but you can probably get that by recursively mixing standard constructs and itertools.chain.from_iterable().


You should try. It's harder than it seems.

(I mean, it's not the most challenging problem ever, but most programmers look at it and go "that's trivial, just do X!", and it's a bit trickier than that).


I agree on lambdas, and on the gist that things must look Pythonic to be accepted, as that holds the language back from iterating or evolving. The rest of his points read like scope creep, in that he seems to want the language to be something it's not.


Serious question: is the whole deal with the Python GIL solvable if some BigCo decides to throw a ton of money and engineers at it? Like Google with V8, for instance. Or is it a truly hard problem that will take something special to solve?


For a while, Python was one of the 4 approved languages at Google. The answer is "probably no." If it were easily solvable, Google would already have thrown money and engineers at it.


See my sibling comment to yours - but care to explain your thoughts more?

Jython already fixed the GIL problem. The problem is the legacy codebase built on assumptions of non-concurrency, which is just a matter of engineer time i.e. throwing money at it, plus getting GVR to sign off on it.


Jython already fixed the GIL problem. The problem is the legacy codebase built on assumptions of non-concurrency

Which is to say that Jython didn't completely fix the GIL problem, from the POV of a lot of people.


Sure. There's already variants like Jython that have removed the GIL.

The larger problem is the existing codebase, which is based around the assumption of non-concurrency. You also have a lot of libraries that use C extensions for performance, and a lot of those are going to break horribly if you suddenly throw them into a concurrent environment where the Python state is mutating underneath them.

Again though, "rewriting code for concurrency" is not a fundamentally unsolvable problem, it just takes a lot of engineer time to change everything over. Moving global/static state into instances, adding locks, marking atomic/critical segments, that kind of thing. There's just a lot of things built with Python that would need to be gone over.

I really think you could do a switchover on the fly by adding a Java-style "synchronized" attribute. Before you can enter a synchronized method, you set a flag and all other threads must yield at their next return, function call, loop iteration, the end of their atomic segment, or at safe points marked by a "yield" statement (pick some combination of reasonable behavior). While a synchronized method is on the call stack, no other thread may execute. All existing code is marked synchronized - perhaps any code in a .py file is assumed unsafe by default, while any code in a .jy file is assumed safe. Boom, start converting code.

The thing that really gets me is that Python 3 is already pushing breaking changes that necessitate a complete overhaul anyway. The failure to thread the interpreter/remove the GIL at the same time is a stunningly idiotic decision. How about since we are making everyone review their code anyway, we have them look at thread safety too?

Which leads to the other problem - Python is GVR's baby, and at the end of the day the reason Python 3 has a GIL is because he says so. By all means, Google could go ahead and rewrite everything, but he'd never let them call it Python. It's hard to build momentum for a serious fork like that. And you don't want to spend a bunch of engineer time and end up with an unsupported "toy" that nobody uses.

It's a shame, Python hits real close to the mark but concurrency is its Achilles' Heel. I love the language but I am gunshy about using it because you never know if your project will go from "toy" to "real product that may need to adapt/scale" and you need concurrency.


The biggest challenge of removing GIL is to do it without breaking existing code.

There's work on that. Here's a technical talk about why is it difficult[1].

[1] https://www.youtube.com/watch?v=fgWUwQVoLHo


This is a tired, trolling post. Most of these issues have long been addressed as non-problems or personal preferences; when the author says "Incompetence? Politics?" what I hear is "people don't listen to me, probably because I don't know what I'm talking about". The attitude is confirmed by his/her conflating of stdlib gripes and language gripes - two very different sets of problems - and mixing requests for speed with requests for more lambda support, two things that are notoriously unlikely to go hand-in-hand.


You seem confused. Please explain how lambda and performance are "notoriously" unlikely to go hand-in-hand. They're orthogonal, yes. "Notorious"... what does that actually mean?

Also I will disagree that stdlib and core are "very different problems". Exhibit A: Go delivers stdlib and core language together, hand-and-glove style, with out-of-the-box huge functionality. It's one reason why it's killing Python. Stdlib is a key part of language functionality and is intricately linked to uptake. Just ask Ocaml.


[flagged]


You are just all over this thread with the snark today.


Thanks!


Writing self all the time. And the lack of switch. Everything else is great.


I honestly don't get why they made the big compatibility-breaking move to Python 3 without using that opportunity to change things for better performance and no GIL.


this is the essence of the problem for Python's long term future. They've been so burned by the 2-to-3 mess that nobody will ever dare touch the fundamentals again. As you say, some of this stuff (performance, multicore) should have been slotted into 3 since it was breaking-change already, even if delaying it by a few years. Then everybody would have moved, pronto. Now, even if 3 finally snuffs 2 out, we'll be stuck with the fairly unsatisfactory 3 underlying architecture essentially forever.


Please don't encourage them. I'm fairly certain the CPython core dev team will take almost any suggestion like this as a challenge and break everyone's code again in Python 4. They see it as stabbing back at those corporate freeloaders. Or at least that's the public front. I'd just like them to take lessons from Go and actually get unicode right. I'm far more interested in Grumpy. Python 2 and Grumpy seem like more of an ace in the hole than Python 3.


Well said. Google hired Guido, and from being a big Python shop was so disenchanted with Unladen Swallow's abject failure that they invented a whole new language to replace it, and Guido became surplus to requirements. And the last vestiges of Python are now to be piped through Grumpy. Not a single mention of 3.x and Google in the same breath. Py27+Grumpy looks great.


As disruptive as it was, Py3 had fairly minor backwards compatibility breakage. Dropping the GIL would be many times more disruptive (specifically for C extensions, presumably it could be done with no effect on Python-code compatibility.)

Aside from keeping the GIL, performance improvements have been made throughout the 3.x line, AFAIK, and should be expected to continue.


1) it was looked at and took too much effort.

2) Performance and GIL are things people who don't actually use Python complain about. In practice they are non-issues or have workable solutions.


Google was a major user of Python, and their solution to those issues was "build a whole new language". While I agree that on those points Python is good enough for many uses, it's inaccurate to say that all criticism of Python on those points is from people who haven't spent considerable time with Python on production.


For flatten, use:

    flattened = sum(list_of_lists, ())


This only flattens one level. Consider the following snippets:

    ; CHICKEN Scheme
    #;1> (flatten '((1 2 3) ((4 5) 6) (7 (8) (((((9))))))))
    (1 2 3 4 5 6 7 8 9)
    #;2> (apply append '((1 2 3) ((4 5) 6) (7 (8) (((((9))))))))
    (1 2 3 (4 5) 6 7 (8) (((((9))))))
vs.

    # Python 3
    >>> sum([[1,2,3], [[4,5],6], [7, [8], [[[[[9]]]]]]], [])
    [1, 2, 3, [4, 5], 6, 7, [8], [[[[[9]]]]]]
There's a big difference here. Flattening a list to just the elements inside isn't terribly hard, especially in a language like Scheme with tail-recursion, but flatten is definitely something that should be in the standard library. The "flatten" you propose is really just appending the elements of the first level of the list.


See https://bugs.python.org/issue27852 for a discussion as to why there isn't a more general flatten().


No! Never do this in Python.

You are making the flattened list by continually concatenating the smaller lists. Each concatenation creates the new bigger list from scratch; the flattened list does not grow dynamically. This is quadratic-performance bad.

Use `list(itertools.chain.from_iterable(...))` instead.


Huh, that's clever! (You meant `[]` though, yes?)


It works, how come no one knows about it??


I realize that after reading the article, most people (including me, unfortunately) read the article as 'Problems WE have with Python'. Maybe a line by author at the top or bottom of the article, reiterating that it's the problem 'he' has with Python -- I know nobody would think such a second clarification would be necessary, but hey, we're humans! -- would help.


Yeah, I'll keep that in mind -- some people, even programmers, apparently don't like to read carefully. :)


http://coconut-lang.org/ looks really interesting - it seems it is a patch for exactly the parts of Python I am missing.

(Though I'm not sure I want to use another language just for that case. Vide CoffeeScript and JavaScript; in that case JS absorbed the best parts of CS.)


> in this case JS absorbed the best pars of CS.

Actually my favorite part of CS is the instance var initialization, e.g.:

    constructor(@x, @y) ->
would initialize @x and @y to the arguments. It makes writing records much nicer.


Surprised nobody mentioned the "Python unicide"

http://lucumr.pocoo.org/2014/5/12/everything-about-unicode/


> Even weirder, str and list both have find, but list does not have index (a related method).

I believe you meant to write "str and list both have index, but list does not have find".


My data structure wish list:

- heapq to support max heaps better. (You can negate the values or use heapq._heapify_max; neither is ideal. See the sketch after this list.)

- Tree map.
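
For the heapq item above, the negation workaround is roughly:

    import heapq

    values = [5, 1, 9, 3]
    heap = [-v for v in values]      # store negated values to fake a max heap
    heapq.heapify(heap)
    largest = -heapq.heappop(heap)   # 9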


Does any language have split on a list?


I've spent soo much time struggling with Python over the years, traveled across Europe for PyCon and tried to tune into the community. I really wanted it to be the good enough Lisp that Norvig claims it is. But in the end I always come out of it swearing to never touch the inconsistent, arbitrary, pile of exceptions again. Conceptually, it's C++ in scripting language clothes.


I love how so many people focus on the GIL and multithreading, when the GIL is much more a solution to make un-threadsafe libs safe to use, and most people don't see POSIX threads are an inherently broken abstraction. [1] http://www.daemonology.net/blog/2011-12-17-POSIX-close-is-br...

In fact it pretty much boils down to signals being broken on unices [2] https://lwn.net/Articles/683118/

Which, even though I have a hatred for systemd, systemd is trying to fix by leaving the status quo. However, POSIX signals are still a problem for systemd [3] https://github.com/systemd/systemd/issues/1615

Having played with signals in Python+C, my experience is that Python has some holes around signals: no mask can be set. I initially thought Python sucked because of the lowest-common-denominator problem of system languages (having to support only small subsets of features). But I am now thinking POSIX signals are just a broken OS-level software interrupt implementation.

So, going down the rabbit hole after reading Stevens on unix/POSIX programming (a must read), I am pretty much questioning Fred Brooks' (and hence K&R&T's) biggest failure: OS/360 followed by unices.

What if our quest for a multitasking portable OS that does not care about the HW is doomed?

It does a darn good job for 99.999% of cases. The 0.001% remaining being the signals.

Look at it, what is a process meant to be?

A container running code.

A thread? Cooperative code sharing data. But how do you cooperate? You send signals.

The problem is that in cases of heavy signal use, the OS gets "signal bound" in a way we cannot measure.

Signals are like a huge software bus that is not easily measurable and, unlike a lot of primitives, cannot be hardware-bound. It is basically a software bus that tries to convey the concept of HW interrupts, which are normally handled with dedicated chips. Look at the MC2828 brochure and you can recognize the features signals are trying to provide [4] https://upload.wikimedia.org/wikipedia/commons/3/31/Motorola...

So to wrap up, we may have a problem of HW architecture that results in a buggy implementation of a common API. Like trying to emulate an MMU on an MMU-less CPU.

And I would say that it is thanks to my experiments in Python that I discovered that signals are an unreliable 1-bit message delivery protocol. Python made it easy to experiment.

Python has problems (mostly a weird mix of conservatism, progress on concerns I don't share, and politics). But overall it is a good system language that deals with the problem. And poor support for signals and threading is not a bug in Python.

A good system language does not try to fix system glitches; it lets them stay obvious. All the hate against the GIL/threading/signals/weird async IO may be better directed at the quest for a portable multitasking generic OS.

Threads (and implicitly signals) on the other hand are convenient fantasies that we would like to exist but are actually just fantasies. And to solve the problem, we invented the containers... based .... on cooperative multitasking system ... based ... on threads and signals.


I think the hate is directed at the GIL because everybody else has figured out reasonably safe ways to thread and share data. Java doesn't have this problem, and I think all the other languages have a solution.

So what, maybe some C implementation of threads has problems; that doesn't mean people haven't figured out models that work.

What's worse, the distrust of functional-style programming means that, if threads are really doomed, Python has no way of being a system-level language, since the other option is using functional islands that signal or queue to each other (which is directly representable by hardware). In this case, Python becomes little more than a DSL relegated mostly to imperative programming for tiny scripts, not a system-level language.

I for one hope and believe that won't happen, with options like PyPy, Twisted, and Jython. Ironically, the only way Python will survive is for it to become unpythonic, something that has already happened as the original maintainers of the language insist on their own pig-headed agendas that most don't care for. I think CPython will ultimately fail, and will be practically replaced by the community with something... less toxic.


The GIL doesn't magically make un-thread-safe code thread safe. It makes Python's reference counting implementation thread safe.


The core dev who worked on the new GIL (py3.2) explained this to me.

I never said it was magic.

But he said the GIL is a tool to achieve thread safety in Python when calling non-thread-safe code.

I am not him, I will not take on any argument of how it works.

But since Ruby's GIL is inspired by Python's GIL, let's hear from Ruby coders: http://www.rubyinside.com/does-the-gil-make-your-ruby-code-t...

Oh, yes, it seems some people are seeing it my way and that it is a controversial point that can be argued. So I agree to disagree.

Btw, I don't multithread and share states, I multiprocess with 0MQ and communication patterns such as PUB/SUB PUSH/PULL for the obvious reasons that I really think multi-threading is an over-valued and wrong abstraction.


> I really think multi-threading is an over-valued and wrong abstraction.

That's as wrong-headed as thinking it's the only good abstraction. Each style has its place—it really depends on what the code needs to do.


I'd tend to agree with the parent that multi-pthreading is a poor abstraction but I'd add an important note: only systems programmers should be explicitly spinning up pthreads. Explicit pthreading is a poor abstraction for everyone else. Application programmers are probably operating at the 'wrong' level and greenthreads make more sense.


Real threads are great for implementing things like clojure's core.async or task systems like Intel's Threading Building Blocks. Both of these are super useful and don't work as well with processes instead of threads. I definitely agree that most programmers shouldn't be using threads directly, but they should be available as a building block for higher level abstractions.


  > I love how so many people focus on the GIL
Do you really? I don't see much love in your comment.


>most people don't see POSIX threads are an inherently broken abstraction.

o.O

> A thread? Cooperative code sharing data. But how do you cooperate? You send signals.

O.o

> Threads (and implicitly signals) on the other hand are convenient fantasies that we would like to exist but are actually just fantasies.

O.O

I'm not sure how this relates to python but it was entertaining at least.


Why can't the author submit some of these changes? It's easy to rant.


Same reason you're complaining in a comment instead of doing things.


He's complaining at you complaining :P



