
[WIP] pyproject.toml #5307


Draft · wants to merge 2 commits into base: dev
Conversation

pordyna
Member

@pordyna pordyna commented Mar 11, 2025

Adds a pyproject.toml to lib/python. With that, you can install the picongpu Python package into any Python environment with pip install -e <picongpu_src>/lib/python. You can also install it without edit mode, but I think the -e makes more sense for picongpu.
With this PR there is no need to edit the PYTHONPATH, which is quite ugly and defeats the whole point of having Python environments.
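For reference, such a file could look roughly like this. This is an illustrative sketch only, not the file from this PR: the package name, version, build backend, and dependency entries are placeholder assumptions.

```toml
# Hypothetical sketch of lib/python/pyproject.toml; all names and
# versions below are placeholders, not the file proposed in this PR.
[build-system]
requires = ["setuptools>=64"]
build-backend = "setuptools.build_meta"

[project]
name = "picongpu"
version = "0.0.1"
description = "Python tooling for PIConGPU (PICMI interface, helpers)"
requires-python = ">=3.9"
# dependencies would be collected from the various requirements.txt files
dependencies = [
    "picmistandard",
]

[tool.setuptools.packages.find]
where = ["."]
```

With such a file in place, `pip install -e lib/python` resolves the listed dependencies and installs the package in editable mode.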

The dependency list is taken from all the requirements.txt files. At some point (with this PR or after?) we should remove the requirements.txt files and use only the pyproject.toml specification.

I also add a default Python .gitignore here, since pip creates some hidden files in lib/python when installing with -e that need to be excluded.

Everything in lib/python/synchrotronRadiationExtension and in lib/python/test is ignored.

TODO:

  • remove requirements.txt (?)
  • edit all mentions of PYTHONPATH and requirements.txt in the docs.
  • remove PYTHONPATH from all *.profile.example
  • probably switch to the new install method in tests / CI, otherwise it will look for the now-missing requirements.txt?

@psychocoderHPC, should we switch completely at once and remove requirements.txt, edit all documentation mentions?
@chillenzer @BrianMarre @SimeonEhrig can you comment on the possibly needed CI / tests changes?

@pordyna changed the title from "[WIP] Topic instal picongpu python" to "[WIP] pyproject.toml" Mar 11, 2025
@pordyna pordyna marked this pull request as draft March 11, 2025 11:48
@SimeonEhrig
Contributor

Current state

For the CI, we need to change how the dependency versions to be tested are configured. At the moment, we have two custom scripts which do the job:

  1. share/ci/pypicongpu_generator.py: Reads a given requirements.txt, determines all dependency versions to be tested (it reads the supported version range and checks on pypi.org which versions exist in that range) and generates a separate CI job for each version combination. Each job has different environment variables which define which dependency versions need to be set.
  2. share/ci/pypicongpu.sh: Is executed in each generated job. It takes the original requirements.txt, replaces the version range of each dependency under test with the specific version given by the environment variables, and installs the modified requirements.txt.
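The replacement step in pypicongpu.sh can be sketched in Python. This is a simplified illustration of the idea, not the actual script; the function and variable names are made up:

```python
import re

def pin_requirements(text, pins):
    """Replace version ranges in requirements.txt-style text with exact pins.

    `pins` maps a package name to the version requested via CI
    environment variables, e.g. {"numpy": "1.26.4"}.
    """
    out = []
    for line in text.splitlines():
        # the package name is everything before the first specifier character
        name = re.split(r"[<>=!~\[; ]", line.strip(), maxsplit=1)[0]
        if name in pins:
            out.append(f"{name}=={pins[name]}")
        else:
            out.append(line)
    return "\n".join(out)

# pins as they might arrive via CI environment variables (made-up values)
pins = {"numpy": "1.26.4", "scipy": "1.11.4"}
requirements = "numpy>=1.24,<2\nscipy>=1.10\npicmistandard==0.30.0"
print(pin_requirements(requirements, pins))
```

The pinned output would then be written back to a file and installed with `pip install -r`.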

Possible Solution

Modify the Python scripts

Instead of reading and modifying requirements.txt, we can read and modify the pyproject.toml. Changing the read and write functions is not a big deal, but I'm not sure whether we will run into new problems, because the separate requirements.txt files are now composed into a single pyproject.toml.

Using an existing test framework

tox and nox should be able to test different dependency versions from a given configuration. Unfortunately, I have no experience with them. After a short review, I'm not sure if this solution works with our CI workflow of generating a CI job for each test combination. But if it works, we save the maintenance of the custom scripts. Maybe somebody at the center has experience with it.
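For orientation, a noxfile parametrising dependency versions could look roughly like the sketch below. This is an assumption-laden illustration (the numpy versions, paths, and test command are placeholders) and has not been tried against our CI:

```python
# noxfile.py -- hypothetical sketch, not part of this PR;
# versions and paths below are placeholders.
import nox

@nox.session
@nox.parametrize("numpy", ["1.24.4", "1.26.4"])
def tests(session, numpy):
    """Run the test suite against one pinned numpy version."""
    session.install(f"numpy=={numpy}")
    session.install("-e", "lib/python")
    session.run("pytest", "lib/python/test")
```

Running `nox` would then create one virtualenv per parameter combination inside a single CI job, instead of spawning a separate runner for each.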

@chillenzer
Contributor

I think this is to be seen in a larger context. We will probably move more towards using the Python interface in general. There have also been discussions about properly modularising and individually packaging the different components. It might even be a way to provide an "installable" version, in the sense of making the PICMI interface available as a package and coordinating the whole build and run process via Python. So, this is an interesting attempt and definitely a step in the right direction. The question is whether we should approach this incrementally, with this draft being a first step, or as a general overhaul completely reorganising our Python approach.

Concretely:

  1. I like your approach of gathering the dependencies, including noting where they come from. We'd likely want to remove those additional comments at some point, but it's certainly nice to have this information at hand for now.
  2. I think we could add some more metadata to the pyproject.toml, but that doesn't need to be in this PR.
  3. I don't like overblown .gitignores. My philosophy is that a .gitignore should reflect the general things to ignore, not individual choices. Example: I'm in favour of ignoring __pycache__ because there is most likely a workflow in which those pop up sooner or later. I'm also in favour of ignoring some of the temporaries for packaging. I'm sternly against ignoring any editor- or tool-specific stuff; that's what global .gitignores are for. Otherwise we'll be busy updating that list whenever the next student arrives with a new and shiny text editor not yet on it. So my approach would be: if you follow the simplest, most standardised workflow to package and import this code, what files get created? Anything else is a liability for the project, I'd say.

@chillenzer
Contributor

Concerning the possible CI solutions: I think that nox looks like a good fit. In fact, for Python-internal dependencies I think spawning a separate runner for each combination is a bit over the top anyway. It will probably significantly reduce the overhead on the runners to have a single job with nox handling the dependencies internally.

@SimeonEhrig
Contributor

> Concerning the possible CI solutions: I think that nox looks like a good fit. In fact, for Python-internal dependencies I think spawning a separate runner for each combination is a bit over the top anyways. It will probably significantly reduce the overhead on the runners to have a single job with nox handling the dependencies internally.

In an offline discussion with @pordyna, we agreed that he will check whether nox/tox can manage dependencies the way we need. Most of the examples only show how to test different Python versions, which is the easiest part. If nox/tox saves work, we will use it and fuse everything into a single CI job. Otherwise I will update the scripts so that they modify the pyproject.toml instead of the requirements.txt.

@pordyna
Member Author

pordyna commented Mar 21, 2025

@SimeonEhrig @chillenzer @psychocoderHPC Since the discussion about dependency testing is still ongoing, I would suggest adding a very minimal pyproject.toml for now that does not list dependencies, and keeping the requirements files. User installation then goes like:

pip install -r requirements.txt
pip install -e .

We can still run our tests just like before, and we could consider already switching to pip install -e . in the CI. In the meantime, we can figure out an elegant solution with something like nox.
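Such a minimal, dependency-free pyproject.toml could be as small as the following sketch (name, version, and backend are placeholders, not proposed contents):

```toml
# Minimal sketch: metadata only, no dependency list.
[build-system]
requires = ["setuptools>=64"]
build-backend = "setuptools.build_meta"

[project]
name = "picongpu"
version = "0.0.1"
# no dependencies listed; users install them separately via
#   pip install -r requirements.txt
```

This keeps the requirements files as the single source of dependency truth while still making `pip install -e .` work.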

@SimeonEhrig
Contributor

@pordyna I will present the problem at the RSE meeting on Thursday. Let's see if somebody can help me.

@pordyna
Member Author

pordyna commented Apr 4, 2025

Just a note so I don't forget it: @SimeonEhrig we should probably add https://pyproject-fmt.readthedocs.io/en/latest/ to our pre-commit once we switch.
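The pre-commit entry could look roughly like this. Treat it as a sketch: the `rev` is a placeholder and the repo URL should be checked against pyproject-fmt's current documentation before use:

```yaml
# sketch of a .pre-commit-config.yaml entry; the rev below is a
# placeholder -- pin an actual release tag when adding it
repos:
  - repo: https://github.com/tox-dev/pyproject-fmt
    rev: "v2.5.0"  # placeholder version
    hooks:
      - id: pyproject-fmt
```

Running `pre-commit run pyproject-fmt --all-files` would then normalise the formatting of the pyproject.toml on every commit.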
