I didn’t see any bug just now… but maybe something around this doesn’t work in some cases, e.g. GitHub is down or the download is blocked, etc.
Btw. one difference between “install” and “setup” is:
The install flavour runs “make altinstall” at the end, which stores the interpreter into /usr/local… All YunoHost apps (that use this solution) on the same instance will use this “shared” interpreter (if the major Python version is the same).
The setup flavour stores everything in the user directory, so it’s isolated per YunoHost app.
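For readers who haven’t looked at the two scripts, here is a rough sketch of what the two approaches boil down to. The version number and app path are illustrative placeholders, not the scripts’ exact values, and the `run` helper only prints the commands, so the sketch is safe to execute as-is:

```shell
# Dry-run sketch: nothing is actually downloaded, built or installed.
# PYVER and the /var/www/myapp path are made-up placeholders.
run() { echo "+ $*"; }   # print the command instead of executing it

PYVER=3.12.7

# "install" flavour: one shared interpreter for the whole instance.
# `make altinstall` puts a versioned binary (python3.12) into /usr/local
# without touching the system-wide `python3`, so Debian stays unbroken.
run ./Python-$PYVER/configure --enable-optimizations
run make -j"$(nproc)"
run sudo make altinstall   # -> /usr/local/bin/python3.12, shared by apps

# "setup" flavour: a private copy per app, isolated in the app's directory.
run ./Python-$PYVER/configure --prefix="/var/www/myapp/python"
run make -j"$(nproc)"
run make altinstall        # -> everything lands under /var/www/myapp/python
```

The trade-off discussed in this thread follows directly: the shared copy is built once per instance, while the per-app copy costs a full build for every app but cannot conflict with anything else.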
In that case it should be explained to the user… otherwise, if you just ask the user whether they prefer “slow” or “fast”, they’ll always choose “fast”.
Python in Debian is a pain, and unfortunately there is no backport of those packages, nor a safe repo to retrieve them from. And all of that without touching the default Python version, so as not to break the system.
There is also another way that could be interesting: looking at the repo GitHub - pascallj/python3.12-backport: Python 3.12 backport for Debian 12 bookworm, we could imagine creating a fake app that builds all the needed packages via GitHub CI and offers them in a YunoHost community repo… I don’t really know if it’s realistic or compliant with YunoHost policy/strategy… just an idea.
Hi all, I’m glad I found this thread, as I’ve also stumbled upon this issue.
Two arguments I see in this thread are:
People should make sure apps work on Bookworm. => That’s a race we cannot win. Bookworm ships with 3.11, and 3.12 is almost a year old now, so it’s to be expected that some apps will require it.
What’s so great about 3.12 that can’t be implemented in 3.11? => While I have some sympathy for this argument, it’s also a moot point, as we can’t make third-party developers follow our wishes.
So I think if YunoHost wants to be a great home for Python apps, we need a canonical way for apps to be based on any Python version that’s not EOL.
Building from source is a straightforward and secure process that doesn’t depend on any third-party infrastructure. I see the argument that it’s somewhat time-intensive, which makes for bad UX when done as part of an app install. For the time being, though, I think it’s the best solution. App packagers should add warnings about potentially long install times to the app’s info.
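One way to soften the slow-install UX would be to only pay for the build when it’s actually needed. This is a sketch, not an existing YunoHost helper (`find_python` and `build_python_from_source` are made-up names): first look for an already-installed interpreter that satisfies the app’s minimum Python version, and only fall back to building from source when nothing matches.

```shell
# Hypothetical helper: print the newest already-installed python3.x whose
# minor version is at least the app's minimum, and fail if none is found.
# Only on failure would we pay for a full build from source.
find_python() {
    min_minor=$1
    for minor in 13 12 11; do        # candidate interpreters, newest first
        if [ "$minor" -ge "$min_minor" ] \
           && command -v "python3.$minor" >/dev/null 2>&1; then
            echo "python3.$minor"
            return 0
        fi
    done
    return 1                         # nothing suitable found
}

# Hypothetical usage: prefer an existing interpreter, build only as fallback.
# python_bin=$(find_python 12) || build_python_from_source 3.12
```

With something like this in a shared helper, the “slow” path only hits the first app that needs a given version (or no app at all, once the interpreter is shared).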
I wonder if a feasible solution would be to add “Install Python 3.X” as a core YunoHost function that could be run independently of app installations?
To me this is the main question: up to what limit do we accept the constraints set by upstream devs?
Developers should give some sort of a fuck about having their software actually be distributed and used by people, and not just by people who are sysadmin wizards and know their way around Docker.
Imho it’s time to put an end to the current webapp development paradigm, under which many developers assume that resources are somehow infinite and you can throw dozens of GB of RAM at build time or even run time, without even providing a pre-built version of their software.
How the hell a process compiling “text files” (HTML, CSS, JS) can take more than 500 MB to build is beyond my comprehension. And yet some apps do take 2 or 4 GB of RAM to build (typically in the Node.js ecosystem).
It is quite amazing that tech folks around the year 2000, with distributions such as Debian, achieved a consistent ecosystem of packages, with dynamic linking to mutualize libraries and avoid duplicating every piece of software - and yet fifteen years later everybody threw it all away and started slapping on at least one package manager per language, and even several versions of each language. Nowadays you have to install nvm to install node to install corepack to install yarn to download half the internet into a 10 GB node_modules mess. “Build-it-locally-because-you-are-a-sysadmin-with-infinite-ram-arent-you” became the default deployment workflow, along with “slap-all-these-10-microservices-into-a-docker-compose-because-its-impossible-to-replicate-a-working-setup-otherwise”, all of this in the name of “hurr durr herp derp devs want all the brand new things and don’t want to care about distributing software to actual people nor thinking about resources because RAM is so cheap”.
I’m in favor of saying “if an app’s requirements are so crazy that it takes madness to install, then fuck it, it’s not getting packaged”. If we do not draw the line anywhere, then we might as well fully embrace the Docker way of things, duplicating every piece of software because app foo is OK with mariadb x.y.z but app bar wants mysql i.j.k and app baz needs 10 different microservices to run. Then we can say goodbye to running any sort of useful server on an RPi with less than 1 GB of RAM and 64 GB of disk, and at this rate we’ll just further encourage people to buy a brand-new server every 2 years and contribute to the ecological disaster of electronic waste…
I think this is a reasonable stance to take, but I don’t think it applies to the issue we’re discussing here. Requiring a fairly modern Python version isn’t crazy. And if we don’t come up with a default solution for it, we can only end up with ynh-app maintainers devising their own, potentially brittle, solutions, or with apps ending up in a broken state. Neither of those is great from a project perspective.