My first patch landed on this one! I wrote a single line to add platform.architecture() on macOS.

It feels so good when one’s PR is accepted and the changes are upstreamed. Good work.
thank you :)
Same here, I did a single-line bugfix in the zipapp module to avoid creating infinitely large zipfiles.

Well done!
thanks :)
Yay! No more exit(), we can just type exit!

In case it’s helpful to you, the python REPL will exit on ctrl+d (i.e. EOF), just like many shells.
Unless you’re on Windows, where you need to hit ctrl+z then enter.
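For context, this is the message that typing a bare exit used to produce (quoted from a 3.12 session; on 3.13 the bare name now just exits):

    >>> exit
    Use exit() or Ctrl-D (i.e. EOF) to exit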
Finally! I HATED THAT ERROR MESSAGE
locals() mutation working consistently is a pretty nice thing for “weirdo” debugging/monkeypatching strategies. Glad to see PEP 667 land.
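A minimal sketch of the consistency PEP 667 brings (assuming Python 3.13, where a frame’s f_locals is a write-through proxy):

    import sys

    def demo():
        x = 1
        frame = sys._getframe()
        # On 3.13, writes through frame.f_locals reliably reach the real local;
        # on earlier versions they could be silently lost.
        frame.f_locals["x"] = 42
        print(x)  # 42 on Python 3.13

    demo()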
Interesting to see the GIL-free stuff.
This is the money in this release. Super exciting to see what will happen here.
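If you install the separate free-threaded build, you can check which variant you’re running (a sketch; sys._is_gil_enabled() is new in 3.13):

    import sys

    # False on the free-threaded ("3.13t") build, True on the default build.
    print(sys._is_gil_enabled())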
Now I just need to wait for GitHub Actions to release it and I can release a couple of libraries’ 3.13 support. GHA having the pre-releases as version 3.13-dev is annoying enough in some cases that I can’t be bothered to release in advance.

I’m proud to say that my name shows up in this document for my contributions related to free-threading support in community packages. Back in 2012, when I was struggling with structuring Python projects - understanding what the heck __init__.py files do - I never would have imagined that my name would show up in the Python docs.

This release broke Datasette due to one of my dependencies not working with Python 3.13 yet - I pushed an update just now that fixes that; details of the change are in this issue.
Huh, why do the Python developers make backwards incompatible changes?
Usually they don’t without several years of warning.
It looks like in this case they were fixing a bug: https://github.com/python/cpython/issues/109409

The dataclass inheritance hierarchy is supposed to require all classes to be either frozen or non-frozen. This works properly for checking that an unfrozen class does not inherit from any frozen classes, but it allowed frozen classes to inherit from unfrozen ones as long as there was at least one frozen class in the MRO.
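Concretely, a minimal sketch of the kind of hierarchy the fix now rejects (class names are mine, not from the issue):

    from dataclasses import dataclass

    @dataclass
    class NotFrozen:
        x: int = 0

    @dataclass(frozen=True)
    class Frozen:
        y: int = 1

    # Before the fix this was accepted because at least one base was frozen;
    # Python 3.13 now raises TypeError ("cannot inherit frozen dataclass from
    # a non-frozen one") at class-definition time.
    @dataclass(frozen=True)
    class Mixed(NotFrozen, Frozen):
        z: int = 2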
Unfortunately my Pint dependency was relying on that bug.
FYI: Python does not use SemVer and never really has. There’s always been a rolling deprecation cycle in Python releases, and it’s always been important to pay attention to deprecation warnings and release notes.
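For anyone who wants to catch those deprecations early, a minimal sketch (the warnings filter is standard library; running it in CI is just one way to use it):

    # Deprecation warnings are hidden by default in many contexts; opt in, e.g.
    #   python -W error::DeprecationWarning your_script.py
    import warnings

    warnings.simplefilter("error", DeprecationWarning)  # turn deprecated-API use into errors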
As to why backwards-incompatible changes get made at all, it’s because the alternative is to be something like Java, where there are decades of accumulated “no, don’t use that API for this, or that one, or that one, or that one” footguns, because an absolute commitment to never making a breaking change prevents the removal of known-bad APIs.
Doesn’t Python have a bunch of known crappy modules in its standard library?
Python’s been working on removing a bunch of outdated modules/APIs in its standard library and replacing them with or directing people to better alternatives.
Of course, doing so breaks backward compatibility, which people then complain about.
Yup, we’re having fun in the trenches.
I think there’s a difference between “this outright doesn’t work” (which I think is mainly a thing in something like datetime.replace just doing the ‘wrong’ thing when you pass in a timezone, for example), and “this is not a nice API”, which is how I would qualify some standard library APIs.

And even then, honestly? The standard library is pretty fine in my experience. The main complaint I’d have is urllib stuff just not really working out of the box on Mac without shenanigans to get certs working first.

Funny you mention datetime… here are a few pitfalls: https://dev.arie.bovenberg.net/blog/python-datetime-pitfalls/
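One classic instance of the replace-vs-convert trap, as a sketch (zone and date are arbitrary):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    dt = datetime(2024, 10, 7, 12, 0)                            # naive
    relabeled = dt.replace(tzinfo=ZoneInfo("America/New_York"))  # attaches a zone, no conversion
    converted = relabeled.astimezone(ZoneInfo("UTC"))            # actual conversion

    print(relabeled)  # 2024-10-07 12:00:00-04:00  (same wall-clock time)
    print(converted)  # 2024-10-07 16:00:00+00:00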
Referenced via whenever: “Do you cross your fingers every time you work with Python’s datetime—hoping that you didn’t mix naive and aware?”

https://github.com/ariebovenberg/whenever
In my experience, everyone gets dates and time wrong, including me. The implicit assumptions list in date and time math is 400 miles long when printed in size 1 font.
I’ve never looked at whenever, but to believe it fully encompasses the date and time problem is probably ridiculous. When countries can’t even agree on what the right date or time is at a given place, and cultures/groups of people sometimes have their own date and time interpretation separate from their countries, there is zero hope anyone can get it right in a library.
Fair enough. I have the luck of both living in places without DST and also knowing to never use naive datetimes, so most of these are non-issues for me. But it would be nice for all of this stuff to just be… better.
datetimes are total footguns in general if you just open up the standard library and use it.
I’m happy at least tzinfo is easier to get a hold of than it used to be.
Removing quite a few of those ancient modules was part of this release!
Could anyone explain what the dbm package is meant to do? The summary of the package uses the acronym, again, but doesn’t define it.

You can consider this a bit of legacy support. Berkeley DB was widely used in the 1990s as a file format for key-value “databases.” There’s a proliferation of implementations, but they’re all basically simple B-tree files. You have a string key and a binary blob value, which is usually a C structure of some kind written blindly.
There isn’t a really compelling reason to use this technology today other than for compatibility with other software that uses it. Some mail servers do, for instance, to handle aliases or other bits of config, for “performance” but I doubt the performance difference is significant today.
Wikipedia has a nice overview of the history and many variants of unix dbm https://en.wikipedia.org/wiki/DBM_(computing)
The API you get from python’s dbm module is like a dict whose keys and values must be byte strings, and it’s persisted to disk.
python’s gdbm module is similar but also exposes “first key” and “next key” methods that make it possible to do range queries.
(I recommend using the sqlite3 module instead, which is also in python’s stdlib.)
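A minimal sketch of that dict-like API (file name is mine; "c" opens read-write and creates the file if needed):

    import dbm

    with dbm.open("example.db", "c") as db:
        db[b"greeting"] = b"hello"   # keys and values are byte strings
        db["count"] = "42"           # str is accepted and encoded for you
        print(db[b"greeting"])       # b'hello'
        print(b"missing" in db)      # False
    # dbm.gnu (gdbm) additionally offers firstkey()/nextkey() for traversal.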
Does anyone know how the size of a Python installation has been changing over time? Or memory usage at run-time?
I assume we’ll have to pay for increasing complexity at some point, but it would be nice to be wrong :)
Edit: (not a great metric, but it’s easy to compare) the gzipped source is ~20% larger than 5 years ago. That’s actually much less bad than I expected! But I don’t really know how much of a full installation that actually covers (or what it’s like at run-time).
Installer size seems steady
Totally unscientific since I’m just looking on my phone, but the installers for the 3.x series don’t seem to be growing. I went to python.org and glanced at the sizes of the x86 Windows installers for a few versions, and they seem to hover around 25-27 MB. The latest release, 3.13.0, actually shrunk a little.
Memory usage at runtime:
This shrunk recently because they landed some PEPs that make strings, dicts, and plain python objects (ones which have __dict__) smaller in memory.

I kind of expect this to go up a little in future because I bet a little bit of JIT will get added, but I’m not sure. It might go down: there’s less low-hanging fruit for shrinking python data structures than there used to be, but there is still some. E.g. maybe someone could implement NaN boxing and fixnums to make ints and floats smaller on architectures that permit it?
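If you want to eyeball this on your own interpreter, sys.getsizeof reports shallow sizes (a rough probe, not a full accounting):

    import sys

    class Point:
        def __init__(self, x, y):
            self.x = x
            self.y = y

    p = Point(1.0, 2.0)
    print(sys.getsizeof(p))           # the instance itself (shallow size)
    print(sys.getsizeof(p.__dict__))  # its per-instance dict, slimmed in recent releases
    print(sys.getsizeof("hello"))     # compact string layout
    print(sys.getsizeof(1), sys.getsizeof(1.0))  # int and float objects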
Runtime CPU perf: tends to get better with most releases.
Time to hello world: I don’t know.
That’s encouraging! I was having flashbacks to how JavaScript runtimes have grown in footprint.