python-packaging

Table of Contents

1. python-packaging   project

1.1. pipenv: promises a lot, delivers very little.

  # requirements.in
  # essential packages, unpinned
  numpy
  matplotlib
  pandas
  ...
  # dev-requirements.in
  -r requirements.in # installs everything from requirements.in (latest versions, since nothing is pinned)
  # debug/dev packages that aren't shipped in the final release
  black
  ipython
  ...
  pip-compile requirements.in --generate-hashes
  pip-compile dev-requirements.in --output-file dev-requirements.txt

  # Careful: pip-sync uninstalls everything not listed in the given requirements files; it's heavily optimized
  pip-sync requirements.txt
  pip-sync requirements.txt dev-requirements.txt

  # When deploying, to verify hashes we simply run:
  pip install --require-hashes -r requirements.txt
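
For reference, the compiled requirements.txt pins every package and, with --generate-hashes, attaches the archive hashes that pip will verify. An abbreviated, illustrative excerpt (hashes shortened to placeholders):

  # autogenerated by pip-compile; do not edit by hand
  numpy==1.24.2 \
      --hash=sha256:<hash-of-one-wheel> \
      --hash=sha256:<hash-of-the-sdist>
      # via -r requirements.in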

1.1.1. Introducing Pants 2.7: Python tool lockfiles, Yapf, Docker, and easier introspection (Like Poetry, but scalable)

https://www.reddit.com/r/Python/comments/pwkb0q/introducing_pants_27_python_tool_lockfiles_yapf/

Poetry and Pants are similar in that both do dependency management, virtualenv management, and creating distributions. They’re different in that Pants focuses intensely on scale, like supporting many projects in the same repository.
For example, Pants coordinates the tools you might be using like Pytest, MyPy, Black/Yapf/isort, Bandit/Flake8/Pylint, and codegen with Protobufs. You have a consistent interface, like ./pants lint :: to run all your linters, with benefits like running the tools in parallel and with caching.
The key differentiator between Pants and tools like Tox and Poetry (or fancy bash scripts) is that it understands your project and its dependencies at the file level, like that the file app.py depends on util.py, which transitively depends on another_util.py. That happens automatically through “dependency inference”, mapping your imports to the rest of your project. Understanding your project’s dependencies at the file level means that Pants can safely do things like cache your test results at the file level—if none of the transitive dependencies of a test changed, the cache can be used. You can also do things like ./pants dependees --transitive //:Django to find everything that transitively uses Django.
Feel free to DM or stop by on Slack if you want to talk through anything! I’d be happy to help https://www.pantsbuild.org/docs/getting-help

1.2. Python packaging: you're doing it wrong

1.2.1. Notes

1.2.1.1. setuptools

distribute: a fork of setuptools, merged back long ago; no longer used
easy_install: obsolete

1.2.1.2. User-level global environment

~/.local on Linux

1.2.1.3. setup.py vs requirements.txt · caremad

There’s a lot of misunderstanding about setup.py and requirements.txt and their respective roles. Many people have felt they duplicate information and have even created tools to handle this “duplication”.
They don't do the same thing: setup.py declares abstract dependencies, while requirements.txt pins concrete ones (sketched below).
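
A minimal sketch of the split, with a hypothetical package: abstract, loosely versioned dependencies in setup.py; concrete pins in requirements.txt.

# setup.py: the library's abstract dependencies
from setuptools import setup

setup(
    name="myapp",  # hypothetical name
    install_requires=["requests"],  # no pin: any compatible version
)

# requirements.txt: the concrete, pinned environment
requests==2.31.0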

1.2.1.5. Never run sudo pip install

If it conflicts with something you install via apt-get, it breaks your system

1.2.1.6. Standard for binary packages
1.2.1.7. venv vs virtualenv

venv arrived in Python 3.3; virtualenv was dead and was resurrected by Bernat Gabor

1.2.1.8. Alternativas
  1. virtualenvwrapper: no activity since February 2019
  2. pyenv breaks $PATH (pyenv issue 1112)

    It even stops git from running(!!) because of all this

  3. Pipenv
  4. poetry
  5. conda

    it takes longer and longer to resolve an environment (satisfiability); less necessary on Linux

1.2.1.9. Decoupling the backend (setuptools) from the frontend (pip)
1.2.1.11. Clarifying PEP 518 (a.k.a. pyproject.toml)
[build-system]
requires = ["setuptools", "wheel"]  # PEP 518
build-backend = "setuptools.build_meta"  # PEP 517
  1. PEP 518 – Specifying Minimum Build System Requirements for Python Projects | Python.org

    standardizes a way to specify install-time dependencies, which are written in the new pyproject.toml. Not to be confused with runtime dependencies!

  2. PEP 517 – A build-system independent format for source trees | Python.org

    standardizes how the frontend has to call the backend

1.2.1.12. Alternatives to setuptools
  • You need setup.py if you have complicated builds with binary dependencies (scipy), but it's better to use setup.cfg when possible (declarative and static)
  1. Flit 3.2.0 — Flit 3.2.0 documentation

    Simpler, easier to use, more lightweight
    An assistant that asks you for everything

  2. Home - build 0.3.1
  3. pypiserver · PyPI

    For running your own internal package index

  4. PEP 621 – Storing project metadata in pyproject.toml | Python.org

    A way to standardize metadata across setuptools, flit, and poetry, so you don't have to specify it in a completely different way for each (see the sketch below)
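
A minimal PEP 621 sketch (name and version illustrative); the same [project] table works no matter which backend builds the package:

[build-system]
requires = ["flit_core >=3.2,<4"]
build-backend = "flit_core.buildapi"

[project]
name = "myapp"
version = "0.1.0"
dependencies = ["requests"]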

1.2.1.14. Conclusions
  • Long live pip! 🎉 And never ever mention easy_install again 🚫
  • Long live venv! 🎉 And never ever run sudo pip install 🚫
  • Long live wheels! 🎉 And if there isn't one for your platform, you'll have to compile it or find an alternative 🔨
  • Long live flit! 🎉 And, if you need it, you can keep using setuptools ✔

1.3. Reverse-engineering Ian Bicking’s brain: inside pip and virtualenv

1.3.1. carljm/pipvirtualenv-preso: presentation on pip and virtualenv for PyCon 2011

/usr/lib/python3.10 (standard library)
Imports site.py (Lib/site.py in the CPython source)

1.3.2. Hacks to run a python virtualenv

  1. Just running ~/.virtualenvs/<name>/bin/python
  2. Prepend $PATH with ~/.virtualenvs/<name>/bin
  3. Set $VIRTUAL_ENV to ~/.virtualenvs/<name>
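
In shell terms (venv name hypothetical):

~/.virtualenvs/myenv/bin/python script.py    # 1. call the interpreter directly
export PATH=~/.virtualenvs/myenv/bin:$PATH   # 2. shadow python/pip through $PATH
export VIRTUAL_ENV=~/.virtualenvs/myenv      # 3. what activate scripts set (minus PATH and the prompt)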

1.5. Alternatives to virtualenvwrapper

1.5.1. berdario/pew: A tool to manage multiple virtual environments written in pure python

Like virtualenvwrapper but without hooks. It's almost as unmaintained.
It has an “add” command that adds global dependencies that have to be installed in every virtual environment

1.8. pyenv + zpy   project

1.8.1. https://jacobian.org/2019/nov/11/python-environment-2020/

  1. pyenv
    Why? I need to run multiple Python versions, isolated from the system Python. pyenv makes it easy to install, manage, and switch between those multiple Pythons.
  2. pipx
    Why? pipx lets me install Python-based CLI stuff (youtube-dl, awscli, doc2dash, etc.) without those projects’ dependencies messing up my global Python.
  3. Poetry
    Why? Poetry handles dependency- and virtual-environment-management in a way that’s very intuitive (to me), and fits perfectly with my desired workflow.


«I currently do not use Poetry, instead I use zpy, which provides pipz as a drop-in replacement for pipx»

1.8.2. pyenv

Every time a system package changes (like /usr/lib/libssl.so.1.1 becoming /usr/lib/libssl.so.3) you have to run pyenv install <version> again:

pyenv versions | grep -v system | xargs -I _ pyenv install -f _

1.8.3. zpy

1.8.3.1. Flow and other notes

flow.svg

I like the flow

  • Not necessarily bad, but the virtual environments don't get a pretty name: they're named by a hash (md5sum <<< $PWD | awk '{print $1}'; the .zpy_path_hash function). The pretty names belong to the directories, and each directory gets an associated virtual environment
    If you create it with pyenv, it shows the Python version
    There's a link in the virtual environment folder (project), and you can also have several different virtual environments
  • activate (a8)
    1. Activate the venv for the current folder or specified project, if it exists.
    2. Otherwise create, activate, sync.
    3. Pass -i to interactively choose the project.
    4. Pass --py to use another interpreter and named venv.
  • It uses pipacs (add to requirements.in, compile, sync), and you can use subsets: pipa, pipc, pips, pipac, and pipcs
    • You can pass -c <category>, where <category> by default autocompletes to dev, doc, test
  • I don't know how to integrate it well with pyenv; each one uses a different environment variable:
    zpy
    ZPY_VENVS_HOME
    pyenv
    PYENV_ROOT
# This goes in .zshrc
export PYENV_ROOT=$ZPY_VENVS_HOME
eval "$(pyenv init -)"

pyenv install 3.8.2 # Only needed the first time
pyenv shell 3.8.2
# In theory this wouldn't need to be run like this; it's for when you have to maintain multiple versions
a8 --py current

In theory, if I didn't use --py current, the link shouldn't be necessary.
It creates the virtual environments in a different path, but if I symlink them it works fine:

ln -s ~/.local/share/venvs/ffa971eb3f55d0278fbeed430a16adb4/venv-python-3.9.13 ~/.local/share/venvs/ffa971eb3f55d0278fbeed430a16adb4/venv

What I haven't managed to do is change the virtual environment's name in zsh (it shows up as venv)

It uses flit to manage pyproject.toml:

pipacs -c dev flit
flit init

vpy <script.py> runs a script in the virtual environment that corresponds to it
vpyshebang do_thing.py adds a matching shebang to it
zpy mkbin vpy ~/.local/bin lets you use zpy outside zsh

1.8.3.2. Summary

On first setup, install the Python version:

pyenv install 3.8.2 # Only needed the first time
pyenv shell 3.8.2
a8 --py current

ln -s ~/.local/share/venvs/ffa971eb3f55d0278fbeed430a16adb4/venv-python-3.9.13 ~/.local/share/venvs/ffa971eb3f55d0278fbeed430a16adb4/venv

Then, when you enter the folder:

a8
1.8.3.3. Making pip-compile work

In dev-requirements.in:

-c requirements.txt

This way you can run pipcs and have it work. Otherwise you have to run pipcs requirements.in and pipcs dev-requirements.in separately, because the two become inconsistent.
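
A minimal sketch of the resulting dev-requirements.in (dev packages illustrative):

# dev-requirements.in
-c requirements.txt  # constrain dev deps to the pins already compiled for production
black
ipython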

1.8.3.4. What to do when you rename a folder

In this example the change I wanted to handle was a rename from 1_Projects to 1projects (the reverse of what the sed below does)

# First cd into the folder that has the virtual environment and was affected by the rename
# 0. Make sure the commands exist (otherwise the substitutions below could expand to "" and delete all your virtual environments)
command -v md5sum || { echo "Command md5sum not found"; exit 1; }
command -v awk || { echo "Command awk not found"; exit 1; }

# 1. Make sure nothing is at the target path so the mv works
rm -rf ~/.local/share/venvs/$(md5sum <<< "$PWD" | awk '{print $1}')

# 2. Move the venv to its new location
mv ~/.local/share/venvs/$(md5sum <<< $(echo "$PWD" | sed 's/1projects/1_Projects/') | awk '{print $1}') ~/.local/share/venvs/$(md5sum <<< "$PWD" | awk '{print $1}')

# 3. Update the project link
ln -sf "$PWD" ~/.local/share/venvs/$(md5sum <<< "$PWD" | awk '{print $1}')/project
# Use this to see which environments are broken
ls -l ~/.local/share/venvs/*/project
1.8.3.5. Getting the association between project folders and virtual environments
for dir in ~/.local/share/venvs/*; do echo -en "$dir $(readlink $dir/project)\n"; done | grep -v -e shims -e versions
1.8.3.6. Implement a ha8   project someday_20230330
  • hactivate, hierarchical activate (see the sketch below)
    Searches up the folder hierarchy and checks whether a virtual environment exists for a parent folder of the current directory
  • https://andydecleyre.github.io/zpy/new_proj/pips_envin/
    activate uses the current folder by default, but also accepts a project path argument, or -i to select the project folder interactively
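
A minimal zsh sketch of the idea, assuming zpy's layout of ~/.local/share/venvs/<md5 of project path>/venv (the ha8 function itself is hypothetical):

ha8 () {
  local dir=$PWD venv_hash
  while [[ $dir != / ]]; do
    venv_hash=$(md5sum <<< "$dir" | awk '{print $1}')  # same hash as .zpy_path_hash
    if [[ -d ~/.local/share/venvs/$venv_hash/venv ]]; then
      a8 "$dir"  # activate accepts a project path argument
      return
    fi
    dir=${dir:h}  # zsh modifier: parent directory
  done
  echo "no venv found in any parent folder" >&2
  return 1
}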
1.8.3.7. Only add the package if it isn't already there   project someday_20230330

Right now it adds it whether it's already there or not

1.9. pip-tools

1.9.1. pip-tools fails with pip >= 22

1.9.2. pip-tools with hashes doesn't work on Windows

Some packages only exist on Windows, or have a different hash there

1.9.3. pip-tools for Python dependencies management (constraint.txt)

1.9.4. Dependency sizes with pip-tools

I don't think it has this, and it would be a good option to have. Just as it shows where each dependency comes from, it could show the size, so you could sort requirements.txt by size instead of alphabetically.
https://stackoverflow.com/questions/34266159/how-to-see-sizes-of-installed-pip-packages
This doesn't cover packages whose install name differs from their import name, like beautifulsoup4 ↔ bs4:

LANG=C LC_ALL=C pip list \
  | tail -n +3 \
  | awk '{print $1}' \
  | xargs pip show \
  | grep -E 'Location:|Name:' \
  | cut -d ' ' -f 2 \
  | paste -d ' ' - - \
  | awk '{print $2 "/" tolower($1)}' \
  | xargs du -sh 2> /dev/null \
  | sort -hr

A possible improved version:

LANG=C LC_ALL=C pip list \
  | tail -n +3 \
  | awk '{print $1}' \
  | xargs -I {} python -c 'import os; from importlib_metadata import packages_distributions; print(os.environ["VIRTUAL_ENV"]+"/lib/python3.10/site-packages/"+([k for k, v in packages_distributions().items() if "{}" in v and "__dummy__" != k][0]))' \
  | xargs du -sh 2> /dev/null \
  | sort -hr

1.10. Create a system to manage packages shared across several venvs   ARCHIVE

1.10.1. Motivation

  • If several venvs hold the same version of the same packages, it would be nice not to have each package duplicated across all of them
  • The solution would be(?) something like a common directory holding the shared versions of each package, but on upgrade the divergence would have to be detected somehow
    • A good option is to use ~/.local, which is where packages go when you run pip install pandas as a regular user

          ~/.local/lib/python3.8/site-packages/pandas/*
          ~/.local/lib/python3.8/site-packages/pandas-1.2.2.dist-info/*
      
  • Then, for the shared packages, we make a link
    • It could be a hard link to each file, or a soft link to the folder. If you try to upgrade a package through a soft link I don't know how that's handled, but with a hard link, replacing one file with another should be transparent
    • In fact pip's editable installs make links from site-packages to wherever the package lives; you could do an editable install of the shared environment's package into the current venv
      https://www.reddit.com/r/learnpython/comments/ayx7za/how_does_pip_install_e_work_is_there_a_specific/

      Rather than take all the code and put it into the site-packages directory where my python.exe (or just python) is located, make a link in the site-packages directory telling it to look here for the package.

      I don't think this can be done, because there's no setup.py inside site-packages

  • This should probably live in pip itself (when you upgrade a package, it has to notice that the other venvs hold a different version and undo the links, and perhaps warn you about the update when you open the affected venvs), or maybe in virtualenvwrapper
  • A good use case is a “doctor” application that tells you which dependencies are duplicated
    • and even which packages in which environment could be upgraded/downgraded so they become shared dependencies

1.10.2. Flatpak uses OSTree and does the same thing, with hard links

1.10.3. Linux utilities that deduplicate files with hard links

https://superuser.com/questions/909731/how-to-replace-all-duplicate-files-with-hard-links
I know of 4 command-line solutions for linux. My preferred one is the last one listed here, rdfind, because of all the options available.

  • fdupes
    • This appears to be the most recommended/most well-known one.
    • It’s the simplest to use, but its only action is to delete duplicates.
    • To ensure duplicates are actually duplicates (while not taking forever to run), comparisons between files are done first by file size, then md5 hash, then byte-by-byte comparison.
  • hardlink
    • Designed to, as the name indicates, replace found files with hardlinks.
    • Has a --dry-run option.
    • Does not indicate how contents are compared, but unlike all other options, does take into account file mode, owner, and modified time.
  • duff
    • Made to find files that the user then acts upon; has no actions available.
    • Comparisons are done by file size, then sha1 hash.
      • Hash can be changed to sha256, sha384, or sha512.
      • Hash can be disabled to do a byte-by-byte comparison
  • rdfind
    • Options have an unusual syntax (meant to mimic find?).
    • Several options for actions to take on duplicate files (delete, make symlinks, make hardlinks).
    • Has a dry-run mode.
    • Comparisons are done by file size, then first-bytes, then last-bytes, then either md5 (default) or sha1.
    • Ranking of files found makes it predictable which file is considered the original.
1.10.3.1. jdupes (deduplicating python virtual environments)
  • This might already be enough for me to reduce duplication
    • If I install a new package it forgets about the previous one
  • Write a pipg function (sketched after this list) that does:
    1. allvirtualenv pip install <package> # installs it in every venv
      1. Maybe keep a list that excludes certain venvs, so it isn't installed in all virtual environments but only in some ??
        .pipignore
    2. cd ~/.virtualenvs
    3. jdupes -r --linkhard .
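
A rough sketch of pipg under those assumptions (allvirtualenv comes from virtualenvwrapper; the .pipignore exclusion list is left out):

pipg () {
  allvirtualenv pip install "$@"  # 1. install in every venv
  cd ~/.virtualenvs || return 1   # 2. go where all the venvs live
  jdupes -r --linkhard .          # 3. replace duplicate files with hard links
}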

-----

  1. Create two identical virtual environments
  2. Apply a deduplication tool
  3. Uninstall packages from one and see what happens in the other; change package versions…
  4. See if it works for me and, either way, write down the results

1.10.4. Existing packages in JavaScript

1.10.6. Simplify Your Python Developer Environment

  • Three tools (pyenv, pipx, pipenv) make for smooth, isolated, reproducible Python developer and production environments.
    We're starting to lose the plot like in JavaScript: 4000 different tools, each less than a year old :/

1.10.7. Ideas

  • Use pathlib so it's as portable as possible
    https://realpython.com/python-pathlib/
  • Use tox to test different Python versions
  • Hook it into CI/CD so it doesn't break the virtual environment
  • Make the links OS-independent (see the sketch below)
    https://docs.python.org/3/library/os.html#os.symlink

    On newer versions of Windows 10, unprivileged accounts can create symlinks if Developer Mode is enabled. When Developer Mode is not available/enabled, the SeCreateSymbolicLinkPrivilege privilege is required, or the process must be run as an administrator.
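
A minimal sketch under those constraints (function and paths hypothetical):

from pathlib import Path
import os

def link_shared_package(shared_pkg: Path, site_packages: Path) -> None:
    """Expose a shared package inside a venv's site-packages via a symlink."""
    target = site_packages / shared_pkg.name
    if not target.exists():
        # os.symlink is portable; on Windows it needs Developer Mode or admin rights
        os.symlink(shared_pkg, target, target_is_directory=True)

link_shared_package(
    Path.home() / ".local/lib/python3.8/site-packages/pandas",
    Path(".venv/lib/python3.8/site-packages"),
)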

1.11. Upgrade all outdated packages
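
One common approach, assuming jq is available:

pip list --outdated --format=json \
  | jq -r '.[].name' \
  | xargs -n1 pip install --upgrade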

1.12. You don’t really need a virtualenv - DEV Community

Installs everything into a local __pypackages__ folder, like node_modules.
But then, like node_modules, it gets unnecessarily huge if you have a lot of shared dependencies.
https://frostming.com/2021/01-22/introducing-pdm/

1.12.1. PDM - Python Development Master

Opt-in centralized installation cache, like pnpm.
Has a plugin for creating virtual environments associated with a project https://github.com/pdm-project/pdm-venv
https://discuss.python.org/t/pep-582-python-local-packages-directory/963/430

Exclude dependencies?
https://github.com/pdm-project/pdm/issues/1316

1.12.1.2. Usage guide
  • ↓ It complains about things that work fine with requirements.txt; it doesn't allow a “dirty” environment, which works fine with pip
pdm venv create --name virtual-env
pdm init # Creates pyproject.toml and .pdm.toml
pdm import requirements.txt # In production, pinned versions
pdm import --dev requirements-dev.in # In development, unpinned versions
pdm install --prod
pdm install --dev

1.13. https://news.ycombinator.com/item?id=29502715

I started a new Python project last month. I tried both Poetry and PDM but decided to use neither of them. PDM is currently basically a one-man show, and Poetry's docs aren't great: the doc page looks pretty but it only describes command-line usage and doesn't say how to configure metadata. Most importantly, Poetry does not support the standard PEP 621 yet.
So I stick with this setup:

1.15. Pip vs Conda: an in-depth comparison of Python’s two packaging systems

1.16. Reproducible and upgradable Conda environments: dependency management with conda-lock

https://pythonspeed.com/articles/conda-dependency-management/
Focusing just on dependencies:

  1. Reproducibility: If we reinstall the code, we should get the same libraries.
  2. Upgradability: We should be able to change versions of our dependencies without having to fight the packaging system.

1.16.1. Direct dependencies

Direct → Only list packages you use
No version lock → ⇈ upgradability ⇊ reproducibility

1.16.2. Versioned direct dependencies

Direct → Only list packages you use
Version lock → ⇊ upgradability ↑ reproducibility
Reproducibility is not the best since indirect dependencies are not specified

1.16.3. Transitively-pinned dependencies, aka locked dependencies

Full version lock, direct and indirect dependencies → ⇊ upgradability ⇈ reproducibility

1.16.4. Choosing how to specify dependencies

  1. You use the versioned direct file to generate the locked dependency file.
    It lists the packages you use, each with a specific version
  2. When creating an environment, you use the locked dependency file.
    It lists all the packages you use, plus their recursive dependencies, with specific versions

This gives you the best of both worlds: most of the time you are just creating a new, reproducible environment from the locked dependency file. When you want to upgrade, you regenerate the locked file, and since you’re starting with a versioned direct dependency list, the hope is that the changes in dependencies-of-dependencies won’t be too bad.
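
Roughly, following that workflow with conda-lock (package set illustrative; lockfile naming can vary across conda-lock versions):

# environment.yml holds the versioned direct dependencies, e.g.
#   dependencies:
#     - python=3.10
#     - pandas=1.5
conda-lock -f environment.yml -p linux-64             # regenerate the locked file
conda create -n myproject --file conda-linux-64.lock  # reproducible env from the lock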

1.17. Building on solid ground: reproducible Docker builds for Python

1.18. pipx run isolated command line applications
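
For example, with pycowsay (the example pipx's own docs use):

pipx run pycowsay moo  # fetched into a temporary isolated venv, run, nothing installed globally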

1.19. Python: Please stop screwing over Linux distros (Python packaging summary)

https://www.reddit.com/r/Python/comments/qv5zv0/python_please_stop_screwing_over_linux_distros/

> The Python community is obsessed with reinventing the wheel, over and over and over and over and over and over again. distutils, setuptools, pip, pipenv, tox, flit, conda, poetry, virtualenv, requirements.txt, setup.py, setup.cfg, pyproject.toml…

All these things are not equivalent and each has a very specific use case which may or may not be useful. The fact that he doesn’t want to spend time learning about them is his problem, not python’s problem.
Let’s see all of them:

  • distutils: It’s the original, standard library way of creating a python package (or as they call it, distribution). It works, but it’s very limited in features and its release cycle is too slow because it’s part of the stdlib. This prompted the development of
  • setuptools. Much much better, external from the stdlib, and compatible with distutils. Basically an extension of it with a lot of more powerful features that are very useful, especially for complex packages or mixed languages.
  • pip: this is a program that downloads and installs python packages, typically from pypi. It’s completely unrelated to the above, but it does need to build the packages it downloads, so it needs at least to know that it has to run setup.py (more on that later)
  • pipenv: pip in itself installs packages, but when you install packages you install also their dependencies. When you install multiple packages some of their subdependencies may not agree with each other in constraints. so you need to solve a “find the right version of package X for the environment as a whole”, rather than what pip does, which cannot have a full overview because it’s not made for that.
  • tox: this is a utility that allows you to run separate pythons because if you are a developer you might want to check if your package works on different versions of python, and of the library dependencies. Creating different isolated environments for all python versions you want to test and all dependency sets gets old very fast, so you use tox to make it easier.
  • flit: this is a builder. It builds your package, but instead of using plain old setuptools it’s more powerful in driving the process.
  • conda: some python packages, typically those with C dependencies, need specific system libraries (e.g. libpng, libjpeg, VTK, QT) of a specific version installed, as well as the -devel package. This proves to be very annoying to some users, because e.g. they don’t have admin rights to install the devel package. or they have the wrong system library. Python provides no functionality to provide compiled binary versions of these non-python libraries, with the risk that you might have something that does not compile or compiles but crashes, or that you need multiple versions of the same system library. Conda also packages these system libraries, and installs them so that all these use cases just work. It’s their business model. Pay, or suffer through the pain of installing opencv.
  • poetry. Equivalent to pipenv + flit + virtualenv together. Creates a consistent environment, in a separate virtual env, and also helps you build your package. Uses a new standard of pyproject.toml instead of setup.py, which is a good thing.
  • virtualenv: when you develop, you generally don’t have one environment and that’s it. You have multiple projects, multiple versions of the same project, and each of these needs its own dependencies, with their own versions. What are you going to do? stuff them all in your site-packages? good luck. it won’t work, because project A needs a library of a given version, and project B needs the same library of a different version. So virtualenv keeps these separated and you enable each environment depending on the project you are working on. I don’t know any developer that doesn’t handle multiple projects/versions at once.
  • requirements.txt: a poor man’s way of specifying the environment for pip. Today you use poetry or pipenv instead.
  • setup.py the original file and entry point to build your package for release. distutils, and then setuptools, uses this. pip looks for it, and runs it when it downloads a package from pypi. Unfortunately you can paint yourself into a corner if you have complex builds, hence the idea is to move away from setup.py and specify the builder in pyproject.toml. It’s a GOOD THING. trust me.
  • setup.cfg: if your setup.py is mostly declarative, information can go into setup.cfg instead. It’s not mandatory, and you can work with setup.py only.
  • pyproject.toml: a unique file that defines the one-stop entry point for the build and development. It won’t override setup.py, not really. It comes before it. Like a metaclass is a way to inject a different “type” to use in the type() call that creates a class, pyproject.toml allows you to specify what to use to build your package. You can keep using setuptools, and that will then use setup.py/cfg, or use something else. As a consequence. pyproject.toml is a nice, guaranteed one stop file for any other tool that developers use. This is why you see the tool sections in there. It’s just a practical place where to config stuff, instead of having 200 dotfiles for each of your linter, formatter, etc

1.20. Customize Python dependency resolution with machine learning

https://www.reddit.com/r/Python/comments/qxan65/customize_python_dependency_resolution_with/
They use simulated annealing to explore more at first and then exploit

1.21. Zebradil | Pipfile.lock → requirements.txt

As was pointed out in the comments, the v2022.4.8 release added a new command that does exactly this:

pipenv requirements > requirements.txt
pipenv requirements --dev-only > requirements-dev.txt

1.22. Python virtualenvironment across all languages

1.23. How to create a virtual environment

python3 -m venv venv      # create the venv in ./venv
source venv/bin/activate  # activate it
deactivate                # leave it

1.23.1. ensurepip is missing

The virtual environment was not created successfully because ensurepip is not
available.  On Debian/Ubuntu systems, you need to install the python3-venv
package using the following command.
    apt install python3.9-venv
You may need to use sudo with that command.  After installing the python3-venv
package, recreate your virtual environment.
Failing command: ['/home/julian/code/venv/bin/python3', '-Im', 'ensurepip', '--upgrade', '--default-pip']

Author: Julian Lopez Carballal

Created: 2024-11-06 mié 12:53