- PyPy core dev here. If anyone is interested in helping out, either financially or with coding, we can be reached various ways. See https://pypy.org/contact.html
- The website should have a prominent Donate section, maybe have some tiers of donation like the Ladybird browser does.
I wanted to put a little £ towards the project but couldn't see a place to do it.
- I don’t disagree about prominence, but for people here: the donation links are under the About section.
- Thanks for sharing this. I just donated. Been using them for many many years.
- Donated. Thank you and everyone else on the PyPy team.
I use PyPy regularly on an app of mine, and very often when I need to handle some compute-heavy load. It's typically over 5x faster than CPython. Stuff that takes impossibly long with CPython (nobody wants to wait 5 minutes...) returns a response in a few seconds.
- Another suggestion for you all (IDK how helpful): when I look at PyPy I see that it's faster for CPU-bound work, but I'm thinking there is also I/O-bound work that would see significant increases in the load it can handle. You could host a page that benchmarks common tasks like HTTP req/s (different types) with asyncio vs CPython. You could even have an automated tool that lets projects benchmark performance from a web page using PyPy without having to install or measure anything.
- Benchmarks are tricky. Do you have a specific use case you want sped up?
- I have to say the speed comparison on the front page seems hard to read / backwards
I feel like you should either put absolute numbers side by side or how much faster pypy is (instead of how much time it takes)
- Also big notice that it is unmaintained
- And that the corporations using their work should donate if they actually want it maintained.
- Donating is for individuals. If a corporation wants something done, they can hire or contract someone to do that thing.
- PyPy isn't unmaintained. We are certainly fixing bugs and are occasionally improving the jit. However, the remaining core devs (me among them) don't have the capacity to keep up with cpython. So for supporting new cpython versions we'll need new people to step up. For 3.12 this has started, we have a new contributor who is pushing this along.
- The text merged to the documentation is more concise than the PR title:
> not actively developed anymore
- Which is just as wrong.
- I think the most significant boundary is given by the question: "is there a plan to support new minor versions of Python?" It sounds like there is not.
There may be non-zero maintenance work happening, but a project that only maintains support for old versions and will never adopt new ones is functionally one that the ecosystem will eventually forget about. Maybe you call that "under active development" but my response is "ok, then I don't care whether it's under active development, I (and 99.9% of other people) should care about whether it's going to support new minor versions."
On the other hand, if you don't support new minor versions day one, but you eventually support them, that's quite different.
- More specifically, the Scientific Python community through SPEC 0[0] recommends that support for Python versions is dropped three years after their release. Python 3.12 was released in October 2023[1], so that community is going to drop support for it in October 2026.
Considering that PyPy is only just now starting to seriously work on supporting 3.12, there's a pretty high chance that it won't even be ready for use before becoming obsolete. At that point it doesn't even matter whether you want to call it "in active development", it is simply too far behind to be relevant.
- What's the point of a three year window? It seems like a weird middle-point. Either you are in a position to choose/install your own interpreter and libraries or you are not.
If you can choose your own versions and care at all about new releases, you can track latest and greatest with at the very most a few months of lag. Six months of "support" is luxurious in this scenario.
If you can't choose your own versions, you are most likely stuck on some sort of LTS Linux and will need to make do with what they provide. In that case three years is a cruel joke, because almost everything will be more than three years old when it is first deployed in your environment.
- I guess the point of a three year window is to be able as an ecosystem to at some point adopt new language features.
When you have some kind of ecosystem rule for that, you can make these upgrade decisions with a lot more confidence.
For example in my project I have a dependency on zstandard. In 3.14 zstandard was added to the standard library. With this ecosystem wide 3 year support cycle I can in good confidence drop the dependency in three years and use the standard lib from then on.
I feel like it just prevents the ecosystem from going stale because some important core library is still supporting a really old version, thus preventing other smaller libraries from using new language features as well, to not exclude a large user base still on an old version.
- They appear to be talking about CPython versions, taking into account when those versions stop being supported (in the sense of security updates). That's irrelevant for PyPy, which clearly numbers its versions on a different schedule.
- It's not irrelevant, because if SPEC 0 says that a particular Python version is no longer supported, then libraries that follow it won't avoid language or standard library features that that version doesn't have. And then those libraries won't work in the corresponding PyPy version. If there isn't a newer PyPy version to upgrade to, then they won't work in PyPy at all.
- You might make a different decision if you were targeting PyPy.
- This is silly, there's no killer feature for scientific computing being added to python that would make an existing pypy codebase drop that dependency; getting code validated takes a long time and dropping something like pypy would require re-validating the entire thing.
- Unfortunately python does add features in a drip-drip kind of way that makes being behind an experience with a lot of niggles. This is particularly the case for the type annotation system, which is retrofit to a language that obviously didn't have one originally. So it's being added slowly in a very conservative way, and there are a lot of limitations and pain points that are gradually being improved (or at least progressed on). The upcoming lazy module loading will also immediately become a sticking point.
- The phenomenon you're describing is why COBOL programmers still exist, and simultaneously why it's increasingly irrelevant to most programmers.
The killer feature is ecosystem: easily and reliably reusing other libraries and tools that work out of the box with other Python code written in the last few years. There are individually neato features motivating the effort involved in upgrading a widely-used language & engine as well, but that kind of thinking misses the forest for the trees, unfortunately.
It's a bit surprising to me, in the age of AI coding, for this to be a problem. Most features seem friendly to bootstrapping with automation (ex: f-strings that support ' not just "), and it's interesting if any don't fall in that camp. The main discussion seems to still be framed by the 2024 comments, before Claude Code etc became widespread: https://github.com/orgs/pypy/discussions/5145 .
- The alternative is when you run a script that you last used a few years ago and now need it again for some reason (very common in research) and you might end up spending way too much time making it work with your now upgraded stack.
Sure, you could (and should) have pinned dependencies, but that's a lot of overhead for a random script...
- > I think the most significant boundary is given by the question: "is there a plan to support new minor versions of Python?" It sounds like there is not.
There is literally a Python 3.12 milestone in the bug tracker.
> my response is "ok, then I don't care whether it's under active development, I (and 99.9% of other people) should care about whether it's going to support new minor versions."
It sounds a lot more like your actual response is "I don't care about pypy".
Which is fine, most people don't to start with. You don't have to pretend just to concern-troll the project.
- CPython has turned into a commercial enterprise where a small number of developers chase away everyone and periodically get useless projects funded by corporations that go nowhere after five years. Intelligent people have all left.
The 150th rewrite of unicodeobject.c is relatively benign (except that it probably costs RedHat money) but the other things are impossible to keep up with.
- PyPy is a fantastic achievement and deserves far more support than it gets. Microsoft’s “Faster CPython” team tried to make Python 5x faster but only achieved ~1.5x in four years - meanwhile PyPy has been running at over 5x faster for decades.
On the other hand, I always got the impression that the main goal of PyPy is to be a research project (on meta-tracing, STM etc) rather than a replacement for CPython in production.
Maybe that, plus the core Python team’s indifference towards non-CPython implementations, is why it doesn’t get the recognition it deserves.
- Third party libraries like SciPy, scikit-learn, pandas, tensorflow and pytorch have been critical to python’s success. Since CPython is written in C and exposes a nice C API, those libraries can leverage it to quickly move from (slow) python to (fast) C/C++, hitting an optimum between speed of development and speed of runtime.
PyPy’s alternative, CFFI, was not attractive enough for the big players to adopt. And HPy, another alternative that would have played better with Cython and friends came too late in the game, by that time PyPy development had lost momentum.
- PyPy on numpy heavy code is often a lot slower than CPython
- Yes. The C API those libraries use is a good fit to CPython, a bad fit to PyPy. Hence CFFI and HPy. Actually, many of the lessons from HPy are making their way into CPython, since their JIT and speedups face the same problems as PyPy. See https://github.com/py-ni
- Sorry can you explain more the connection between PyPy and CFFI (which generates compiled extension modules to wrap an existing C library)? I have never used PyPy, but I use CFFI all the time (to wrap C libraries unrelated to Python so that I can use them from Python)
- CFFI is fast on PyPy. The JIT still cannot peer into the compiled C/C++ code, but it can generate efficient interface code since there is a dedicated _cffi_backend module built into PyPy. Originally that was the motivation for the PyPy developers to create CFFI.
- Thank you for the background info, and sorry for me explaining CFFI (I just wanted to be sure we were talking about the same thing). Being ignorant about PyPy, I honestly had no idea until now that there was a personnel or purpose overlap between CFFI and PyPy. I am very grateful for CFFI (though I only use it in API mode).
- I rather like Python and have used the C API extensively, "nice" is not the word I'd choose ...
- Python was already widely deployed before them, thanks to Zope, and being a saner alternative to Perl.
- The Faster Python project would’ve got further if Microsoft hadn’t let the entire team go when they made large numbers of their programming languages teams redundant last year. All in the name of “AI”. Microsoft basically gave up on core computer science to go chase the hype wave.
- You’re right, of course: even Guido seems to have been moved off working on CPython and onto some tangentially-related AI technology.
However, Faster CPython was supposed to be a 4-year project, delivering a 1.5x speedup each year. AFAIK they had the full 4 years at Microsoft, and only achieved what they originally planned to do in 1 year.
- To be fair, they suffered a bit from scope creep: mid-project, a second major effort was started to remove the GIL. So the codebase was undergoing two major surgeries at the same time. Hard to believe they could stick to the original schedule under those conditions. Also, GIL removal decreases performance for sequential execution. I imagine some gains from Faster CPython were/will be spent compensating for this hit on GIL-less single-thread performance.
- We have been using PyPy on a core system component in production for like 10 years
- > PyPy is a fantastic achievement and deserves far more support than it gets
PyPy is a toy for getting great numbers in benchmarks and demos, is incompatible in a zillion critical ways, and is basically useless for large-scale development for anything that has to interoperate with "real" Python.
Literally everyone who's ever tried it has the experience that you mock up a trial for your performance code, drop your jaw in amazement, and then run your whole app and it fails. Until there's a serious attempt at real 100% compatibility, none of this is going to change.
Also none of the deltas are well-documented. My personal journey with PyPy hit a wall when I realized that its GC is lazy instead of greedy. So a loop that relies on the interpreter to free stuff up (e.g. file descriptors needing to be closed) rapidly runs into resource exhaustion in PyPy. This is huge, easy to trip over, extremely hard to audit, and... it's like it's hidden lore or something. No one tells you this, when it needs to be at the top of their front page before you start the port.
- "Ask HN: Is anyone using PyPy for real work?" from 2023 contradicts you about PyPy being a toy. The replies are noticeably biased towards batch jobs (data analysis, ETL, CI), where GC and any other issues affecting long-running processes are less likely to bite, but a few replies talk about sped-up servers as well.
https://news.ycombinator.com/item?id=36940871 (573 points, 181 comments)
- Timely management of external resources is what the `with` statement has been for since 2006, added in Python 2.5 or so. To debug these problems Python has `ResourceWarning`.
Additionally, CPython's gc is also only eager in a best effort kind of way. If cycles are involved it can take long to release memory. This will become even more the case in future versions of CPython, in the free threading variants.
- Sorry, the with statement is non-responsive. The question isn't whether you "can" write PyPy-friendly code. Obviously you can.
The question isn't even whether or not you "should" write PyPy-friendly code, it's whether YOU DID, or your predecessors did. And the answer is "No, they didn't". I mean, duh, as it were.
PyPy isn't compatible. In this way and a thousand tiny others. It's not really "Python" in a measurable and important way. And projects that are making new decisions for what to pick as an implementation language for the evolution of their Python code have, let's be blunt, much better options than PyPy anyway.
- I've run into similar resource-limit exhaustion due to the GC not keeping up with CPython as well
- If anyone else is also barely aware and confused by the similar names, PyPI is the Python Package Index, which is up and maintained. PyPy is "A fast, compliant alternative implementation of Python." which doesn't have enough devs to release a version for 3.12[0].
- Thanks for the clarification. On top of that, being an issue in the 'uv' GitHub repo (uv installs packages from PyPI) made my brain easily cross the letters.
- Reminds me of Cython vs CPython
- What is cpython? I don't think I've heard of this one before.
Edit: it's just python. People are pretending like other attempts to implement this are on equal footing
- CPython (the interpreter) is the most popular implementation of Python (the language), like GCC, Clang, and MSVC (compilers) are implementations of C (the language). Other Python implementations include PyPy, Jython, and IronPython.
Nobody is "pretending" anything. These have all been around for 15+ years at this point. Your ignorance does not imply intent to deceive on others part.
- Saying "the most popular" hides the actual reason why it is popular, though: it is the original Python implementation. It defines the standard and functions as a reference for all others. For better or for worse, other implementations have to be bug-compatible with it, and that is what puts them not on equal footing.
For C compilers no reference implementation exists; the C standard was created out of multiple existing implementations.
- PyPy is a JIT-compiled implementation of a language called RPython which is a restricted subset of Python. It does not and has never attempted to implement Python or replace your CPython interpreter for most intents and purposes. CPython is the official reference implementation of the Python language and what you probably use if you write Python code and don't understand the difference between a programming language and its implementations (which is fine)
- This doesn't sound right. PyPy has always been described as an alternative implementation of Python that could in some cases be a drop-in replacement for CPython (AKA standard Python) that could speed up production workloads. Underneath that is the RPython toolchain, but that's not what most people are talking about when they talk about PyPy.
- The project has self described as CPython for many years.
It’s literally the name of the repo [1].
There’s no grounding to feign surprise or concern anymore.
Moreover, I have used PyPy for years to beat the pants off CPython programs.
- and mypy is "an optional static type checker for Python" [0]
Given that both pypy (through RPython) and mypy deal with static type checks in some sense, I kept confusing the two projects until recently.
Also, I just learnt (from another comment in this post) about mypyc [1], which seems to complete the circle somehow in my mind.
[0] https://www.mypy-lang.org/ [1] https://github.com/mypyc/mypyc
- Don't forget about RPy https://pypi.org/project/rpy2/2.2.7/
- Don't forget about dmypy, the daemon version of mypy.
- pypy existed long before type annotations were a thing
- And JITs often don't care for type specifications as they can generally get better info from the runtime values, need to support that anyway, and for languages like python the type specifications can be complete lies anyway. They also might support (and substitute) optimised versions of types internally (e.g. pypy has supported list specialisation for a long time).
Maybe it's changed since, but last I checked the JVM's JIT did not care at all for java's types.
Which is not to say JITs don't indirectly benefit mind, type annotations tend to encourage monomorphic code, which JITs do like a lot. But unlike most AOT compilers it's not like they mind that annotations are polymorphic as long as the runtime is monomorphic...
- PyPy may not care in principle, but RPython does, being a kind of python dialect designed for static compilation that is intended for writing JIT engines like PyPy.
- Thanks. I knew this already but keep forgetting and getting confused
- The short summary of it being that these people are beyond terrible at giving names to things.
- Programmers and engineers should never be allowed to name things.
I say that as a programmer and engineer.
- "We suck at naming things" -- Bjarne Stroustrup, in a talk about SFINAE
- On one side I agree. On the other side, I look at how marketing people name things and I think we're still better off.
Imagine if next edition of GCC, released in 2026 was named 2027. Then it was GCC One. Then GCC 720. Then GCC XE. Then just plain GCC. Then GCC Teams
- And then finally…GNU 720 AssistantDriver.
(Tip of the hat to Microsoft’s marketing teams.)
- The python community has the habit of giving short names for things
- Somewhat interesting that "volunteer project no longer under active development" got changed to "unmaintained".
- For context, they have 2 to 4 commits per month since October [1]. The last release was July 2025 [2].
- That seems reasonably active to me. You can't really expect more from an open source project without paid full-time developers.
- What euphemism do you prefer then...
- There's a difference between dead (i.e. "unmaintained") and low activity ("not under active development"). From what I can see PyPy is in the latter category (and being in that category does not mean it's going to die soon), so choosing to claim it is unmaintained is notable.
- Being three major versions behind CPython is definitely not a great sign for the long-term viability of it.
- It's always been about that many versions behind.
There is more churn in those versions than you'd think.
- I'd genuinely be curious what fraction of those changes actually require porting to other Python implementations. The free-threading changes are inherently interpreter specific, so we can ignore those. A significant change in Python 3.12 is dropping "dead batteries", so that can be ignored as well. From what I can see, the main language changes are typing-based (so could have parser implications), and the subinterpreter support being exposed at the Python level (I don't know whether that makes sense for PyPy either). I think this hints that while certain areas of Python are undergoing larger changes (e.g. typing, free-threading), there is no obvious missing piece that might drive someone to contribute to PyPy.
Also, looking at the alternate (full) interpreters that have been around a while, PyPy is much more active than either Jython or IronPython. RustPython seems more active than PyPy, but it's not clear how complete it is (and it has gone through similar periods of low activity).
Would I personally use PyPy? I'm not planning to, but given how uv is positioning itself, this gives me vibes of youtube stating it will drop IE 6 at some unspecified time in order to kill IE 6 (see https://benjamintseng.com/2024/02/the-ie6-youtube-conspiracy...).
- The problem is the million small paper cuts. The stdlib changes are not all in pure python, many have implications for compiled modules like _ssl. The interpreter changes, especially compatibility with small interpreter changes that are reflected in the dis module, also require work to figure out
- I'm not sure "major versions" is the most correct term here, but I think your point is spot on
- They are de facto semantic major versions - think of recent-ish additions like f-strings and match-case (3.6 and 3.10), where you'd get a syntax error in an older parser. PyPy targeting 3.9, for example, would support f-strings but not match-case.
Or at runtime, you can import things from the standard library which require a minimum 3.x. - .x releases frequently if not always add things, or even change an existing API.
- For Python, 0.1 increases are major versions and 1.0 increases are cataclysmic shifts.
- I don't know about that. For me, f-strings were the last great quality-of-life improvement that I wouldn't want to live without, and those landed in Python 3.6. Everything after that has not really made much of a difference to me.
- This reads like you think that "major" version bumps should only happen when things make a big difference to you personally. At least that's where you land when you follow the logic of your statement. I think you may overrate the importance of your particular use case, and misunderstand what GP meant by "major".
The gist of what GP meant is that Python does not exactly follow SemVer in their numbering scheme, and they treat the middle number more like what would warrant a major (left-most) number increase in SemVer. For example, things will get deprecated and dropped from the standard library, which is a backwards-incompatible change. Middle number changes is also when new features are released, and they get their own "what's new" pages. So on the whole, these middle-number changes feel like "major" releases.
That being said, the Python docs themselves [0] call the left-most number the "major" one, so GP is not technically correct, though I'd say they're right for practical (if easy-to-misunderstand) purposes.
> A is the major version number – it is only incremented for really major changes in the language.
> B is the minor version number – it is incremented for less earth-shattering changes.
> C is the micro version number – it is incremented for each bugfix release.
The docs do not seem to mention you, though. :P
[0]: https://docs.python.org/3/faq/general.html#how-does-the-pyth...
- Oh, you are right, I forgot that "major version" is a technical term and incorrectly read it as "For Python, 0.1 increases make a big difference". My bad!
- If you want your code to run, you need a python interpreter that supports the newest of your dependencies. You may not use features that came after 3.6 (though you obviously do), but even if just one dependency or sub-dependency used a Python 3.10 specific feature, you now need an interpreter at least that new.
- That is true, and it is also a huge pet peeve of mine. If more library maintainers showed some restraint in using the newest and hottest features, we'd have much less update churn. But on the other hand, this is what keeps half of us employed, so maybe we should keep at it after all.
- That's like saying the last tax that affected you was passed in 2006...
- Undermaintained might be more apt, since it does have life but doesn't appear commercially healthy nor apparently relevant to other communities.
- Underphrased like a pro.
- much respect to the PyPy contributors, but it seems like a pretty fair assessment
- 9 months since the last major release definitely feels like a short time in which to declare time-of-death on an open source project
- It’s been a lot longer than that. There was a reasonable sized effort to provide binaries via conda-forge but the users never came. That said, the PyPy devs were always a pleasure to work with.
- > It’s been a lot longer than that.
pypy 7.3.20, officially supporting python 3.11, was released in july 2025: https://pypy.org/posts/2025/07/pypy-v7320-release.html
We're in March 2026. That's 9 months, which is exactly what GP stated.
> There was a reasonable sized effort to provide binaries via conda-forge but the users never came.
How is that in any way relevant to the maintenance status of pypy?
- But if you set up dependabot and automerge some crap every couple of days your project will be very active!
Meanwhile my projects got marked as abandoned because those scanners are unaware of codeberg being a thing.
- It is also lagging behind in terms of Python releases. They are currently on 3.11, which was released 3.5 years ago for mainline Python.
- > It is also lagging behind in terms of Python releases.
Which it has always been, especially since Python 3, as anyone who's followed the pypy project in the last decade is well aware.
- The problem is that it is lagging behind enough that it is falling out of the support window for a lot of libraries.
Imagine someone releases RustPy tomorrow, which supports Python 2.7. Is it maintained? Technically, yes - it is just lagging behind a few releases. Should tooling give a big fat warning about it being essentially unusable if you try to use it with the 2026 Python ecosystem? Also yes.
- > The problem is that it is lagging behind enough that it is falling out of the support window for a lot of libraries.
Which is a concern for those libraries, I've not seen one thread criticising (or even discussing) numpy's decision.
> Should tooling give a big fat warning about it being essentially unusable if you try to use it with the 2026 Python ecosystem? Also yes.
But it's not, and either way that has nothing to do with uv, it has to do with people who use pypy and the libraries they want to use.
- wow, that would be a big shame. I hope many of the useful learnings are already ported to CPython.
- A few examples:
  - The pure Python REPL started off in PyPy, although a lot of work was done by the CPython core devs to make it ready for prime time
  - The lessons from HPy are slowly making their way into CPython, see https://github.com/py-ni
  - There were many fruitful interactions in fixing subtle bugs in CPython that stemmed from testing the stdlib on an alternative implementation
And more
- Almost none of it will have been ported to CPython, as it's a completely different approach.
- I really like PyPy’s approach of using a Python dialect (RPython) as the implementation language, instead of C. From a conceptual perspective, it is much more elegant. And there are other C-like Python dialects now too - Cython, mypy’s mypyc. It would be a shame if PyPy dies.
- Most pure Python libraries run on PyPy without porting, while incompatibilities come from C extensions written against the CPython C-API such as numpy, lxml and many crypto libraries that either fail or run poorly under PyPy's cpyext compatibility layer.
If you plan to support PyPy, add it to your CI, prefer cffi or pure Python fallbacks over CPython C-API extensions, and be ready to rewrite or vendor performance-critical C extensions because cpyext is slow and incomplete and will waste your debugging time.
- Read as PyPi and almost had heart attack
- At this point it's probably better investing time and money into RustPython[1][2].
- Why would anyone use a python interpreter that is slower than CPython?
- Money is a forcing function for development. Why is there still no way to donate to all devs in the dependency tree? Should we just anticipate expensive problems just like this when the rot finally makes it uncomfortable to continue development?
- Thank you for all the work guys, I’ll see how I can help.
- @kvinogradov (Open Source Endowment), I am pinging you because I think you may be able to help; I remember you describing the Open Source Endowment and its approach to how (and which) open source projects are best funded[0].
I think that PyPy might be of interest to the fund for sponsoring, given it's close to unmaintained. PyPy is really great at speeding up Python in general[1], by orders of magnitude.
Maybe the fund could help pay the maintainers, whose underfunding led to the unmaintained situation in the first place. Pinging you because I am interested to hear your response and, hopefully, to see PyPy get a better funding model for its underfunded maintainers.
[0]: https://endowment.dev/about/#model
[1]: https://benjdd.com/languages2/ (refer to the PyPy vs Python difference being ~15x)
- > @kvinogradov (Open source endowment), I am (Pinging?) you
unfortunately, @-pinging does not work on this site, it does nothing to notify anyone. If you want to get a specific person’s attention, use off-site communication mechanisms
- > unfortunately, @-pinging does not work on this site
I’d call it fortunate, and a feature. Not pinging certainly avoids many discussions becoming too heated too fast between two people and lets other opinions intervene.
- There are systems in place to prevent fast back-and-forth arguments.
Not having a mentions functionality for those who wish to use it doesn't seem to change anything around over-heated discussions.
I'd make @ a page like 'threads' which just includes any comments with @$username.
- > There are systems in place to prevent fast back-and-forth arguments.
Like what? I never saw anything to suggest that is the case.
> Not having a mentions functionality for those who wish to use it doesn't seem to to change anything around over-heated discussions.
Of course it does. If you have to keep checking manually, eventually you’ll get distracted. By the time you come back, if you do, there may already be another reply to the reply and you may no longer feel the need to comment. Nor will you be inclined to respond to a comment made days later in a nested discussion, because you won’t find it. But people just arriving at the thread might, and continue the discussion with new perspectives.
> I'd make @ a page like 'threads' which just includes any comments with @$username.
To each their own, I’m thankful HN doesn’t have that feature.
- HN doesn’t have this sort of pinging behavior :/
- my view/experience is that pypy only speeds up the kind of python code which you absolutely should not write in python if you care about performance
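For illustration, a tight pure-Python loop like this (a made-up micro-benchmark, not from the thread; on CPython you'd normally push this into NumPy or C) is exactly the shape of code PyPy's JIT excels at:

```python
import time

def dot(a, b):
    # Pure-Python inner loop: slow on CPython, JIT-compiled by PyPy.
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

n = 1_000_000
a = [float(i) for i in range(n)]
b = [float(i % 7) for i in range(n)]

start = time.perf_counter()
result = dot(a, b)
elapsed = time.perf_counter() - start
print(f"dot product: {result:.1f} in {elapsed:.3f}s")
```

Run the same script under `python3` and under `pypy3` to see the gap for yourself; actual speedups depend heavily on the workload.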
- knowing pypy has good implementations of a lot of behavior helped me fix multiprocessing in Maya's python interpreter, fixing stuff like torch running inside of Maya.
it's too bad. it is a great project for a million little use cases.
- > This thread is about PyPy, not PyPI.
The hardest things in programming. That and designing a logo for something you cannot touch, smell or see.
- Odd how you still see announcements of this nature if Anthropic's marketing is to be believed.
- Yup.
For me the biggest signifier is Spotify. They claim their (best) devs don't even code anymore, they use an internal AI tool that they just send prompts to which then checks out a personal test build that they can download off of Slack. "A new feature in 10 minutes!"
Okay, if that is the case, why have we only seen like 3-4 minor new QoL improvements in Spotify the last ~12 months, with no new grand features? And why haven't they fired 95% of their devs and let the remaining elite go buckwild with Claude?
The Emperor really has no clothes.
- Everyone here says "if developers are so much faster, why aren't we seeing more features?!" as if the only thing required to release a feature is developers.
My CEO keeps asking me "how can we go faster with AI", and my answer is "we can't, because even if we had developers that would instantly develop any feature perfectly, we'd still be bottlenecked on how slow we are at deciding what to actually release".
- > why have we only seen like 3-4 minor new QoL improvements
You are seeing improvements? From what I can tell, my user experience has only been going downhill over the past years - even pre-AI...
- tbf they have been saying they started doing this in December, so we're only a few months in. And like most software it's an iceberg: 99% of the work is not observable by users, and in Spotify's case listeners are only one of presumably dozens of different kinds of users. For all we know they are shipping massive improvements to e.g. billing
- Also, why isn’t there a native client for all platforms? Could they not just let the AI auto-translate the code?
- Because believe it or not, the majority of users couldn't care less whether it is native or not. I don’t even see Spotify, it’s just something that lives in the background and plays music.
- > They claim their (best) devs don't even code anymore
No, they claimed they didn’t code during a time period. Around year end until early this year. Technically they could have just been on leave.
Also best dev = principal / staff engineers. They rarely code anyway.
AI or no AI anyone could have made that claim.
- Anthropic released a vibe-coded C compiler that doesn't work, so how can their LLM help in maintaining PyPy?
- Strange subthread. I don't see Claude Opus 4.6 changing the tide for PyPy. There is no need to understate AI capabilities for this.
"Anthropic released vibe coded C compiler that doesn't work" sounds like https://github.com/anthropics/claudes-c-compiler/issues/1 passed through a game of telephone. The compiler has some wrong defaults that prevent it from straightforwardly building a "Hello, world!" like GCC and Clang. The compiler works:
> The 100,000-line compiler can build a bootable Linux 6.9 on x86, ARM, and RISC-V. It can also compile QEMU, FFmpeg, SQLite, postgres, redis, and has a 99% pass rate on most compiler test suites including the GCC torture test suite. It also passes the developer's ultimate litmus test: it can compile and run Doom.
- Prompts for this?
The primary objective is to retarget PyPy on top of the Python main branch. A minor objective is to document what of PyPy can be ported to CPython (or RustPython).
Keep a markdown log of issues in order to cluster and close when fixed
Clone PyPy and CPython.
Review the PyPy codebase and docs.
Prepare a devcontainer.json for PyPy to more safely contain coding LLMs and simplify development
Review the backlog of PyPy issues.
Review the CPython whatsnew docs for each version of python (since and including 3.11).
What has changed in CPython since 3.11 which affects PyPy?
Study the differences between PyPy code and CPython code to understand how to optimize like PyPy.
Prepare an AGENTS.md for PyPy.
Prepare an agent skill for upgrading PyPy with these and other methods.
Write tests to verify that everything in PyPy works after updating it to be compatible with the Python main branch (or the latest stable release, CPython 3.14)
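The devcontainer step above might look something like this minimal sketch (the base image, post-create command, and extensions are my assumptions, not anything PyPy ships):

```jsonc
{
  // Hypothetical devcontainer for hacking on PyPy; adjust to the project's actual build requirements.
  "name": "pypy-dev",
  "image": "mcr.microsoft.com/devcontainers/python:3.11",
  "postCreateCommand": "pip install pytest hypothesis",
  "customizations": {
    "vscode": {
      "extensions": ["ms-python.python"]
    }
  }
}
```

Running agents inside a container at least confines their file and network access to the sandbox rather than the host.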
- Strikes me as the worst possible solution if they're struggling to find maintainers in the first place. Who reviews the vibe coded patches?
- > Anthropic released a vibe-coded C compiler that doesn't work, so how can their LLM help in maintaining PyPy?
This is the perfect question to highlight the major players. In my opinion, a rapidly developing language with a clear reference implementation, readily accessible specifications, and a vast number of easily runnable tests would make an ideal benchmark.
- Most maintainers don't have a stack of cash to throw at tokens.
- They don’t need to throw a stack of cash at them, Anthropic and OpenAI have programs for open source maintainers.
https://claude.com/contact-sales/claude-for-oss https://openai.com/form/codex-for-oss/
- I'd say they're less of "programs" as they are "six-month trials". What's the plan after six months?
And for what it's worth, PyPy isn't even eligible for the Claude trial because they have a meager 1700 stars on GitHub.
- > What's the plan after six months?
An unmaintainable mass of AI slop code and the decision to either pay the AI tax or abandon the project.
- Isn't the Claude one only for a few months?
(I haven't checked the OpenAI one, as I have no interest in them)
- Both programs have been announced as granting six months, but neither of them have explicitly said that there won't be options to renew for another six months.
I expect they haven't decided that themselves yet and don't want to commit publicly until they've seen how well the program goes.
- Even if you’re right, no one should be making a decision of enrolling into those programs because maybe, with zero indication they’ll be renewed again in six months.
You know what they could also do? Stop the programs for new enrolments next month. Or, if they renew them like you said, it could be with new conditions which exclude people currently on them.
There are too many unknowns, and giving these companies the benefit of the doubt that they’ll give more instead of taking more goes counter to everything they showed so far.
- Is your argument here that you shouldn't accept the free trial because you might find it useful and then be trapped into paying for more of it later?
- No, my argument is that your “but neither of them have explicitly said that there won't be options to renew for another six months” point is not something anyone should realistically be counting on, and is not a valid counter argument to your parent post of “Isn't the Claude one only for a few months?”.
We should be discussing what is factual now, not be making up scenarios which could maybe happen but have zero indication that they will.
- I didn't say that I thought they would likely extend it, but I stand by my statement that it's a possibility.
Neither company has expressed that the six-month thing is a hard limit.
The fact that OpenAI shipped their version within two weeks of Anthropic's announcement suggests to me that they're competing with each other for credibility with the open source community.
(Obviously if you make decisions based on the assumption that the program will be expanded later you're not acting rationally.)
- If I understand correctly, they are literally giving things away for free for a 6 months period and we are complaining that they don't promise it stays free forever?
- No, you did not understand correctly. They are not “literally giving things away for free”, they are providing a very conditional free trial, which is a business decision and not anything new. Then a commenter speculated they might extend that program because they didn’t say they won’t and I pointed out it doesn’t make sense to assume they will. No one on this immediate thread made any complaint, we’re discussing the facts of the offering.
- "You're completely right. That mushroom is poisonous."
- Is Python dying? /s
- What annoys me is the name. Early morning it took me a moment to realise that PyPy is not PyPi, so at first I thought they referred to PyPi. Really, just for the name confusion alone, one of those two should have to go.
Edit: I understand the underlying issue and the PyPy developer's opinion. I don't disagree on that part; I only refer to the name similarity as a problem.
- There is no PyPi, it's PyPI (py pee eye), the Python package index.
- If you have to insist that a name needs a certain capitalization to properly exist, you're in the territory of brand zealotry and pedantry. The people who don't care for one reason or other vastly outnumber you, and they will invent your disfavored capitalization into existence. The same goes for pronunciation. GIF? Jira?
If your thing can be reached under "pypi.org", you can either accept that people will come up with their own ideas of how to capitalize or pronounce the name, or you can fight against windmills and tell people what ought to exist or not.
- Wikipedia tells me that the package index PyPI (launched in 2003) is about 4 years older than the interpreter PyPy (first released in 2007).
Still, at its core, PyPy is a Python interpreter which is itself written in Python and the name PyPy fittingly describes its technical design.
- No. PyPy development was ongoing long before the first release. The first intact commit in the PyPy repo is from February 2003: https://github.com/pypy/pypy/commit/6434e25b53aa307288e5cd8c.... And that commit indicates there's been development going on for a while already. The commit message is:
"Move the pypy trunk into its own top level directory so the path names stay constant."
PyPy migrated from Subversion to git at some point. Not sure how much of the history survived the migration.
- I think back then PyPI was known as the cheeseshop, so there wouldn't have been the same confusion.