software engineer, consultant, conference speaker, #tech4good, #stacktivism

PyPI is now on Python 3.11 (TikTok + Blog Post)

Yesterday PyPI, the Python Package Index, tweeted a graph with the short caption "Python 3.11 delivers".

Here's what this tweet means and why it’s a big deal:

[Image: the graph from the PyPI tweet]

The newest stable release of Python dropped in October and came with a fancy new logo: green and purple swirling snakes around a negative-space lightning bolt. In the top right of the new logo you can even see "Faster Python" listed along with five other notable improvements.

Python 3.11 Logo

The horizontal axis of this graph is time, and the vertical axis is cores, which is how CPU (Central Processing Unit) usage is measured. The CPU is the brain of the computer, and as you run your Python program it, like any other program on any other computer, local or cloud alike, uses up these execution resources. Each of the spiky lines on this beautifully colored graph represents a different Gunicorn web service worker that the PyPA (the Python Packaging Authority) runs to support PyPI, the package index: the repository of published packages that can be found and installed with pip, as in `pip install django`.
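To make "cores" concrete: a rough sketch of the metric is CPU seconds consumed per second of wall-clock time, so a value near 1.0 means one core is fully busy. This is my own illustration of the idea, not how PyPI's monitoring actually computes it:

```python
import time

def estimate_cores_used():
    """Approximate cores in use by this process: CPU time / wall time."""
    start_wall = time.perf_counter()
    start_cpu = time.process_time()
    total = sum(i * i for i in range(2_000_000))  # CPU-bound busy work
    cpu_seconds = time.process_time() - start_cpu
    wall_seconds = time.perf_counter() - start_wall
    return cpu_seconds / wall_seconds

print(f"approx cores used: {estimate_cores_used():.2f}")
```

For a single-threaded loop like this, the ratio stays at or below 1.0; a Gunicorn service with many workers can push the aggregate well above that, which is what the spiky lines on the graph show.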

Just a moment on how impressive PyPI is. PyPI Download Stats (pypistats.org) shows the downloads of any of its libraries or packages. In the last DAY, the most downloaded package, boto3, the AWS Software Development Kit, was downloaded 18.98 MILLION times. Second to that, urllib3, an HTTP client for Python, was downloaded ELEVEN point SIX million times. All of this runs for free, maintained by a team of volunteers. To learn more about the work the PyPA is doing, check out Python Packaging Authority — PyPA documentation

Client applications, for example a web app, phone app, or console app, make HTTP requests to the server, for example GET, POST, or DELETE, all following the same protocol so servers universally recognize the same approach. When we use the big general term "server," it is really a bunch of different servers running together to handle and translate these requests for smooth, efficient application performance. Within the larger "server," the request from the client hits the HTTP server first, for example NGINX or Apache, which manages incoming traffic and can distribute it to slower upstream servers. Gunicorn receives requests from NGINX, handles them, and returns the results back to NGINX, and NGINX returns a response, and sometimes a payload of data, back to the client. Gunicorn is a WSGI server, a "Web Server Gateway Interface," which executes your Python code as the application server. When we say it "handles" the request from NGINX, we mean running your Python code to build a dynamic response, for example fetching data from the database, while NGINX serves static assets itself. The separation between NGINX and Gunicorn allows for a performant, pure-Python server that never has to worry about reading connections directly from the internet. It does one job and it does it well.
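Here's what that interface actually looks like. A WSGI application is just a Python callable that Gunicorn invokes once per request forwarded from NGINX; this minimal "hello" sketch (the name `application` is the conventional default Gunicorn looks for):

```python
def application(environ, start_response):
    """Minimal WSGI app: called once per request with the request
    environment and a callback for starting the response."""
    body = b"Hello from behind NGINX!\n"
    status = "200 OK"
    headers = [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ]
    start_response(status, headers)
    return [body]  # WSGI apps return an iterable of byte strings
```

If this lived in a file called `myapp.py`, you could serve it with `gunicorn myapp:application` and point NGINX's upstream at it.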

A really great answer to "Why is Gunicorn here?" lives on Server Fault, still bumping a really good answer from 2011. WSGI has been the standard approach for Python web applications and relies on features native to Python. Now, in 2022, Python has native syntax for handling asynchronous operations like network calls, and ASGI, the Asynchronous Server Gateway Interface, allows multiple asynchronous events per application. But just because ASGI can handle both synchronous and asynchronous requests, I don't recommend ripping WSGI out of all your applications and replacing it with ASGI. Although it's very cool and shiny, I abide by the old adage when developing: Keep It Simple, Stupid.
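For contrast with the WSGI callable, here's the same "hello" endpoint sketched as an ASGI application: an async callable that receives events and sends response messages, so one worker can interleave many slow network operations instead of blocking on each:

```python
async def async_application(scope, receive, send):
    """Minimal ASGI app: the response is sent as a series of
    messages rather than returned from a single callable."""
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({
        "type": "http.response.body",
        "body": b"Hello, asynchronously!\n",
    })
```

This would be served by an ASGI server such as Uvicorn rather than plain Gunicorn, which is exactly the kind of swap I'm suggesting you don't rush into.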

Back to the graph. This graph is monitoring those Gunicorn workers. Each spike represents an increase in activity, and at five thirty something, deploying the application upgraded to Python 3.11 quieted some of that noise. VERY COOL. We get an example of Python 3.11 in a giant production use case, out in the wild, being performant like the CPython team promised.

How and why did it get more performant? Python 3.11 is 10-60% faster than 3.10, and in the PyPI use case, it increased performance by about HALF! The language that put functional and dynamic programming first, and optimization of the application layer later, now has the express mission to be more performant from your first line of code. All of these enhancements can be traced through conversations at conferences like EuroPython and PyCon, the updates from the Python Steering Council steering-council/updates at main · python/steering-council (github.com), ideas-repo issues in the Faster CPython organization faster-cpython (github.com), here's a screenshot of the 5 closed issues labeled 3.11, and PEPs (Python Enhancement Proposals) like PEP 659 – Specializing Adaptive Interpreter | peps.python.org. Specialization speeds up the interpreter at runtime by introducing "families" of specialized instructions: a generic bytecode instruction gets replaced with one specialized for a single task once the interpreter sees which types actually flow through it. Remember when we said Gunicorn increases performance by doing one thing and doing it really well? I smell a theme!
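You can feel this yourself with a tiny micro-benchmark. This is just a sketch, and your exact numbers will vary wildly by machine and workload; the point is that naive recursive code is heavy on exactly the interpreter overhead the specializing interpreter targets, so running the same script under `python3.10` and `python3.11` shows the gap:

```python
import timeit

def fib(n):
    # Deliberately naive recursion: lots of function calls and
    # integer operations, which is where 3.11's specialization shines.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

seconds = timeit.timeit("fib(20)", globals=globals(), number=10)
print(f"fib(20) x10 took {seconds:.3f}s")
```

Run it under both interpreters and compare; on my reading of the published benchmarks you should expect a noticeable, though workload-dependent, improvement on 3.11.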

So, why is it a big deal? Well, a "fast" language, although speed isn't always the most important factor when picking one, is the hallmark of a well-maintained programming language: one that has had performance enhancements in the compiler, cross-platform compatibility, security, and careful handling of runtime values, whether dynamically or statically typed. This signals the health of one of the fastest-growing languages in the world. It signifies a healthy ecosystem in which, as we all upgrade, everything can become faster and more performant. So now we're back to the meta of PyPI sharing the results of a more performant Python: the entire community now has the opportunity to benefit from a faster Python.

What are you waiting for?