The benefit is that all the energy wasted on the useless 3.x changes could have been spent advancing Python's speed, multicore support, or GPU programming capabilities. Instead, in the single use case where Python is clearly the dominant language (for pure network-effect reasons), namely scientific programming, we have been at a standstill for years. In other words, under the current stewardship, Python is going down a path where I don't see a long-term future for it in my domain, and I am therefore already looking elsewhere.
I am also almost 100% certain that if scientific programmers leave Python, the language will stall, and the current 3.x pushers are dangerously looking a gift horse in the mouth. This is particularly true given that Go is rapidly eating Python's lunch in all non-science use cases.
> I am also almost 100% certain that if scientific programmers leave Python, the language will stall, and the current 3.x pushers are dangerously looking a gift horse in the mouth.
This sounds like the same sour grapes as the systemd escapades: a huge initial outcry as certain things happened, followed by a gradual diminishing, then [mostly] acceptance.
This is what is happening with Python 3 and a smaller, doom-and-gloom subset of the userbase. Most of us are thankful that Python is striving to improve, and that we continue to pay nothing to use it. The alarmists are far noisier than those who are happily continuing to build stuff.
Python will remain massively used in the sciences. It's simple, easy to learn, expressive, and has an excellent ecosystem of modules (which now mostly work on Python 3).
>This sounds like such similar sour grapes to the systemd escapades. Huge initial outroar as certain things happened, followed by a gradual diminishing, then [mostly] acceptance.
Only here we don't have mere service scripts, but millions of lines of code people have written in perfectly fine 2.x Python.
And also here we don't have any significant uptake -- Python 2.x is still over 60% of what's used (according to PyPI stats and everything we've seen), and that's after 6+ years in which Python 3 has had its chance.
PyPI stats are massively inflated by automated deployments, and even they show a clear rising trend for 3 anyway. Also, 3.4+ (from 2014) is a very different beast from 3.0, and it's probably not a coincidence that porting accelerated significantly after its release (about 3 years ago).
With numerous libraries phasing out support for 2.x in current or upcoming versions, I think we actually do have significant uptake.
As pointed out by others, those PyPI stats are way off. Better to look at what's going on out in the community and with the most popular packages (like Django).
Except that personally I see the benefit of systemd every time I boot my Linux computer. It just works: fast and clean. I don't partake in the philosophical arguments; I just want it to work. Py3, by contrast, simply throws cruft curveballs at me, with no tangible benefit. This is not a systemd-style issue.
And yes, Python will likely continue to be massively used in the sciences, 3.x bears like me notwithstanding, so it would be good if the current stewards recognized that this is their core base of users and focused on them instead of the web people, who are much more fickle and already moving on.
> core base of users and please could they focus on them instead of the web people who are much more fickle and moving already.
Ahhh, I see what's going on now. You may be vastly underestimating the size and variety of the Python userbase. This is one of the absolute most popular languages on the planet. Your science subset is but one of many, and it's not even the biggest if we're talking sheer user counts.
Python must be steered for the good of the majority of the userbase, not just for vegabook. The fact that you described these changes as "useless" just speaks, to me, of someone being impatient and dismissive of a ton of excellent work by the contributors. These changes weren't made just for the hell of it, particularly the Unicode changes you brought up above.
If you want help understanding the rationale behind some of these changes, feel free to ask here. Someone with more familiarity will chime in and help clear up the confusion or angst.
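To make the Unicode rationale concrete, here is a minimal sketch (my own illustration, not from the thread) of the Python 3 text model: `str` and `bytes` are distinct types, so encoding mistakes surface loudly at the boundary instead of silently corrupting data the way implicit ASCII coercion could in Python 2.

```python
# Python 3: text (str) and binary data (bytes) are separate types.
text = "naïve"                 # str: a sequence of Unicode code points
data = text.encode("utf-8")    # bytes: produced by an explicit encode step

assert isinstance(text, str)
assert isinstance(data, bytes)
assert data.decode("utf-8") == text  # round-trips cleanly

# In Python 2, mixing the two kinds of string triggered an implicit
# ASCII decode that could blow up far from the real bug. In Python 3,
# mixing them fails immediately with a TypeError:
try:
    text + data
except TypeError:
    print("str + bytes raises TypeError, as intended")
```

The design choice is that the one-time porting pain buys you errors at the point where text meets the outside world, rather than wherever the mixed string happens to be used later.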
Lol. Before Numpy and friends even existed, Python was used mostly by "the web people" and sysadmins. Scientists are one of a number of Python constituencies, and not even the best-paying nor most visible one.
Interesting that you're looking for those improvements in CPython. From your list (IMO): speed and multicore are going to stay where they are in CPython. The GIL is unlikely to be removed, because it's effectively part of the language's behaviour at this point. There were a few attempts, but it seems nobody even tries anymore (and I'm fine with that). Maybe PyPy, Grumpy, or others will do this instead.
I'm not sure why you expect GPU support from Python itself, though. That's entirely up to libraries, and those can exist for either 2.x or 3.x. Was there ever a CPython GPU-related project?
The GIL is encoded into practically every C API of the interpreter. It's not going away in CPython, and at the very least any Python implementation supporting CPython extensions will have a global lock for calls into extensions.
And even if it weren't (patches removing the GIL have been around since the 90s): all of the approaches and patches shown so far significantly degraded single-threaded performance, which matters to far more applications than the GIL does. And the GIL is typically not a significant obstacle to using multiple cores/processors anyway.
"Teh GIL" is a very, very overblown issue, and is -- I don't want to be condescending here, but well -- usually brought up by people that have little experience writing software that makes effective use of multiple-many cores.
Yeah, you are a bit condescending here. Part of the topic was multicore. If you're interested in actually cutting down on data transfers without resorting to explicit shared memory, the GIL and threads are very much on topic. Whether it's an overblown issue depends entirely on your workload and on why you're still using Python if you care about multicore.
I attended this talk, and it was really great. PyPy is pursuing software transactional memory, which is massively difficult to implement. The Gilectomy approach is much more community-oriented, and focuses on the transition.
Put a bit differently: it's easy to remove the GIL safely, but doing so incurs a non-trivial performance penalty and breaks C extensions. The Gilectomy effort is therefore about removing or avoiding that penalty when you're not multi-threading, combined with smoothly transitioning C extensions. After that talk, removing the GIL looks inevitable.