
# FeNNol: Force-field-enhanced Neural Networks optimized library

FeNNol is a library for building, training and running neural network potentials for molecular simulations. It is based on the JAX library and is designed to be fast and flexible.

FeNNol's documentation is available online; the article describing the library can be found at https://doi.org/10.1063/5.0217688

An Active Learning tutorial is available as a Colab notebook.

## Installation

### From PyPI

```bash
# CPU version
pip install fennol

# GPU version
pip install "fennol[cuda]"
```
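
After installing, a quick sanity check is to verify that the package is visible to your interpreter. This is a minimal sketch using only the standard library; it assumes nothing about FeNNol's internals:

```python
import importlib.util

# Check whether the fennol package can be found on the current
# interpreter's import path (without actually importing it).
installed = importlib.util.find_spec("fennol") is not None
print("fennol importable:", installed)
```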

### Latest version from the GitHub repo

You can start with a fresh environment, for example using venv:

```bash
python -m venv fennol
source fennol/bin/activate
```

The first step is to install JAX (see details at: https://jax.readthedocs.io/en/latest/installation.html). For example, to install the latest version using pip:

```bash
# CPU version
pip install -U jax

# GPU version
pip install -U "jax[cuda12]"
```
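
To confirm which backend your JAX install picked up (i.e. whether the GPU build is actually being used), you can query `jax.default_backend()`. A small sketch, guarded so it also runs when JAX is absent:

```python
import importlib.util

# Report which backend JAX selected ("cpu", "gpu", or "tpu"),
# or None if JAX is not installed in this environment.
backend = None
if importlib.util.find_spec("jax") is not None:
    import jax
    backend = jax.default_backend()
print("jax backend:", backend)
```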

Then, you can clone the repo and install FeNNol using pip:

```bash
git clone https://github.com/thomasple/FeNNol.git
cd FeNNol
pip install .
```

### Optional dependencies

- Some modules require the e3nn-jax library:

  ```bash
  pip install --upgrade e3nn-jax
  ```

- The provided training script requires PyTorch (at least the CPU version) for its dataloaders:

  ```bash
  pip install torch --index-url https://download.pytorch.org/whl/cpu
  ```

- For the Deep-HP interface, cffi and pycuda are required:

  ```bash
  pip install cffi pycuda
  ```
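
A small sketch to see which of these optional dependencies are present in the current environment. The import names are assumptions here: e3nn-jax installs under the import name `e3nn_jax`, and the others import under their pip names:

```python
import importlib.util

# Optional extras listed above, by (assumed) import name.
optional = ["e3nn_jax", "torch", "cffi", "pycuda"]
available = {name: importlib.util.find_spec(name) is not None for name in optional}
for name, ok in available.items():
    print(f"{name}: {'found' if ok else 'missing'}")
```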

## Examples

To learn how to train a FeNNol model, you can check the examples in the examples/training directory. The README.md file in that directory contains instructions on how to train a model on the aspirin revMD17 dataset.

To learn how to run molecular dynamics simulations with FeNNol models, you can check the examples in the examples/md directory. The README.md file in that directory contains instructions on how to run simulations with the provided ANI-2x model.
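As a minimal sketch, loading a saved model in your own script goes through `FENNIX.load` (re-exported at package level as `fennol.load`). The model path below is hypothetical, and the block degrades gracefully when FeNNol or the file is absent:

```python
# Hedged sketch: load a trained FeNNol model from a file.
# "mymodel.fnx" is a hypothetical path, not a file shipped with the library.
try:
    from fennol import FENNIX
    model = FENNIX.load("mymodel.fnx")
    status = "loaded"
except ImportError:
    model, status = None, "fennol not installed"
except FileNotFoundError:
    model, status = None, "model file not found"
print(status)
```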

## Citation

Please cite this paper if you use the library.

T. Plé, O. Adjoua, L. Lagardère and J-P. Piquemal. FeNNol: an Efficient and Flexible Library for Building Force-field-enhanced Neural Network Potentials. J. Chem. Phys. 161, 042502 (2024)

```bibtex
@article{ple2024fennol,
    author = {Plé, Thomas and Adjoua, Olivier and Lagardère, Louis and Piquemal, Jean-Philip},
    title = {FeNNol: An efficient and flexible library for building force-field-enhanced neural network potentials},
    journal = {The Journal of Chemical Physics},
    volume = {161},
    number = {4},
    pages = {042502},
    year = {2024},
    month = {07},
    doi = {10.1063/5.0217688},
    url = {https://doi.org/10.1063/5.0217688},
}
```

## License

This project is licensed under the terms of the GNU LGPLv3 license. See LICENSE for additional details.

The package's `__init__.py` embeds this README as the module docstring and re-exports the main entry points:

```python
"""
.. include:: ../../README.md
"""

from .models import (
    FENNIX,
    register_fennix_module,
    get_modules_documentation,
    available_fennix_modules,
)

load = FENNIX.load
```

`load` is a classmethod of `FENNIX` that restores a model from a saved file:

```python
@classmethod
def load(
    cls,
    filename,
    use_atom_padding=False,
    graph_config={},
):
    """load a model from a file"""
    with open(filename, "rb") as f:
        state_dict = serialization.msgpack_restore(f.read())
    # the serialized mappings come back as sequences of (key, value)
    # pairs; rebuild plain dicts before passing them to the constructor
    state_dict["preprocessing"] = {k: v for k, v in state_dict["preprocessing"]}
    state_dict["modules"] = {k: v for k, v in state_dict["modules"]}
    return cls(
        **state_dict,
        graph_config=graph_config,
        use_atom_padding=use_atom_padding,
    )