conda environments
As described in the README, one of the options for installing PyFstat is via some flavour of conda. This gives you project-specific environments that are better isolated than Python's standard venvs. (Though they still rely on some system packages such as LaTeX.)
If you don't have conda (or the drop-in replacement mamba) on your system yet, our currently recommended method is to install Miniforge, downloading the right installer for your operating system. The "forge" part refers to the community repository that packages will be pulled from, see for example PyFstat's conda-forge page.
Modern conda versions (since 2023) include the faster solver from mamba internally, but if you prefer it on your system, you can still replace all `conda` commands below with `mamba`.
To avoid having PyFstat interfere with any of your other setups, and to start from a minimal clean environment:
- download pyfstat.yml
- run

```
conda env create -f pyfstat.yml
conda activate pyfstat
```

or, from the main directory of a git clone, run:

```
conda env create -f etc/pyfstat.yml
conda activate pyfstat
```
If you plan to contribute to PyFstat development, you should definitely start from a git clone instead, and then please run:

```
NO_LALSUITE_FROM_PYPI=1 conda env create -f etc/pyfstat-dev.yml
conda activate pyfstat-dev
```
If you get an error like `does not appear to be a Python project: neither 'setup.py' nor 'pyproject.toml' found`, you likely ran the `create` command from a different relative path to the `.yml` file. Please run this command from the repo's main directory as indicated above.
Note that the `NO_LALSUITE_FROM_PYPI` environment variable is set to avoid pulling in the monolithic `lalsuite` dependency from PyPI, since we already have `lalpulsar` and the other specific requirements installed from conda.
And in this case, don't forget to run one more command afterwards to set yourself up for automated code quality checks:

```
pre-commit install
```
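If you want to double-check that the LALSuite bindings in the new environment resolve correctly, here is a small sketch; the `module_origin` helper is just for illustration, not part of PyFstat:

```python
import importlib.util

def module_origin(name):
    """Return the file a module would be loaded from, or None if it is absent."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# With the conda packages installed, this should point inside the
# environment's site-packages rather than a monolithic PyPI lalsuite wheel:
print("lalpulsar found at:", module_origin("lalpulsar"))
```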
Alternatively, if you prefer to manage your own conda environment, you can simply
- add `pyfstat` as a dependency to an existing recipe
- or, to install PyFstat into your currently active environment, just run

```
conda install -c conda-forge pyfstat
```
To update after a new release, it might be smartest to just wipe and recreate your environment. Alternatively, being aware of possible side effects, and especially only if you have not manually installed or upgraded any packages with `pip` in the same environment, you may try at your own risk:

```
conda update -c conda-forge pyfstat
```

(Note that the `-c conda-forge` is important; otherwise a lot of packages will be switched to the default channel.)
The most robust way to run PyFstat in Jupyter notebooks is to install the Jupyter server into the same environment and launch it from a terminal with the environment activated, e.g.:

```
conda activate pyfstat
conda install -c conda-forge jupyter
jupyter notebook
```
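As a quick sanity check (not required), you can confirm inside a notebook cell that the kernel really runs from the activated environment by inspecting `sys.executable`:

```python
import sys

# Inside a notebook cell: the interpreter path should point into the
# conda environment, e.g. something like .../envs/pyfstat/bin/python
print(sys.executable)
```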
Note: many sites recommend installing a kernel for your environment into your user profile, so that you can run the Jupyter server from the system installation and still get all package dependencies from the environment. This turns out to be problematic for PyFstat, because we need to call some LALSuite commandline executables, and apparently loading a Jupyter kernel does not update the path for executables (`$PATH`, or `os.environ["PATH"]` specifically). The method recommended above, used with the default kernel (which will typically show as `Python 3 (ipykernel)`), avoids that problem.
But if you need to run from a Jupyter server outside the environment and hence use a custom kernel, see the official IPython docs; and if you get errors from PyFstat like `RuntimeError: Could not find either lalpulsar or lalapps version of command Makefakedata_v5`, then a workaround could be to put the following at the start of your notebook:

```python
import os
import sys

# Prepend the environment's bin directory to PATH, so that the
# LALSuite commandline executables can be found.
env_bin = os.path.dirname(sys.executable)
if env_bin not in os.environ["PATH"]:
    os.environ["PATH"] = f'{env_bin}:{os.environ["PATH"]}'

import pyfstat
```
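To verify that the executables PyFstat needs are now visible, you can query the search path directly. A small sketch; the executable name below assumes a recent LALSuite where it carries the `lalpulsar_` prefix (older releases used `lalapps_`):

```python
import shutil

def on_path(executable):
    """Return True if the given executable can be found on the current PATH."""
    return shutil.which(executable) is not None

print(on_path("lalpulsar_Makefakedata_v5"))
```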
Most of the dependencies of PyFstat (but not PyFstat itself, so far) are also contained in the igwn environments, which bundle many gravitational-wave data analysis tools. (Warning: huge download size if you're not working on an LVK machine!)
If you want to use the `pycuda` implementation of the transient-CW detection statistics, you also need to add the corresponding package to your `.yml` file, or, in an existing environment, run

```
conda install -c conda-forge pycuda
```
Depending on your operating system, that may already be sufficient, or you may also have to run

```
conda install -c conda-forge compilers cudatoolkit-dev
```

to get a working `nvcc` with a compatible `gcc`. Or better yet, you could add these dependencies to your copy of the `.yml` recipe.
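After installing, a quick way to check whether `pycuda` can actually create a GPU context is a diagnostic sketch like the following (the broad exception handling is there because CUDA initialization can fail with driver errors, not just `ImportError`, on machines without a usable device):

```python
def cuda_status():
    """Report whether pycuda can create a CUDA context on this machine."""
    try:
        import pycuda.autoinit  # noqa: F401  # creates a CUDA context on import
        import pycuda.driver as drv
        return f"CUDA device: {drv.Device(0).name()}"
    except ImportError:
        return "pycuda is not installed in this environment"
    except Exception as e:
        return f"pycuda is installed, but CUDA initialization failed: {e}"

print(cuda_status())
```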
While there are conda installers and other Python environments for Windows that look like they should do the job, to the best of our knowledge you currently cannot run PyFstat on Windows natively, because there are no recent LALSuite versions available for it. Instead, please consider using the Windows Subsystem for Linux and installing Miniforge inside it. This has been tested a few times and seems to work reasonably well. If you want to increase your chances of getting support for this installation mode from the PyFstat maintainers, using a Debian-based distribution is recommended.
If you want to submit jobs on a cluster running the condor scheduler (more properly called HTCondor, and not to be confused with conda, the package/environment manager), like those of the LIGO-Virgo-KAGRA collaboration, advice on how that should optimally interact with conda environments changes frequently. As of July 2023, a robust way seems to be to no longer use `GetEnv`, but instead point the `executable` to the absolute path of the Python executable inside your environment, and let it automagically figure out how to load the dependencies from that environment. The name of the script you want to run then has to be the first entry under `arguments`, before any named argparse arguments. Check the HTCondor documentation for more details.
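Put together, a submit file following this advice could look like the sketch below; the environment path, script name, and arguments are placeholders, not PyFstat conventions:

```
universe   = vanilla
# absolute path to the python executable inside your conda environment:
executable = /home/user/miniforge3/envs/pyfstat/bin/python
# the script to run comes first, before any named argparse arguments:
arguments  = my_search.py --outdir results
output     = my_search.out
error      = my_search.err
log        = my_search.log
queue
```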