Create large-scale ML-driven multiscale simulation ensembles to study the interactions of RAS proteins and RAS-RAF protein complexes with lipid plasma membranes

Overview

MuMMI RAS v0.1

Released: Nov 16, 2021

MuMMI RAS is the application component of the MuMMI framework developed to create large-scale ML-driven multiscale simulation ensembles to study the interactions of RAS proteins and RAS-RAF protein complexes with lipid plasma membranes.

The MuMMI framework was developed as part of the Pilot2 project of the Joint Design of Advanced Computing Solutions for Cancer, funded jointly by the Department of Energy (DOE) and the National Cancer Institute (NCI).

The Pilot2 project focuses on developing multiscale simulation models for understanding the interactions of the lipid plasma membrane with the RAS and RAF proteins. The broad computational tool-development aims of this pilot are:

  • Developing scalable multiscale molecular dynamics code that will automatically switch between phase-field, coarse-grained, and all-atom simulations.
  • Developing scalable machine learning and predictive models of molecular simulations to:
    • identify and quantify states from simulations
    • identify events from simulations that can automatically signal a change of resolution between phase-field, coarse-grained, and all-atom simulations
    • aggregate information from the multi-resolution simulations for efficient feedback to/from machine learning tools
  • Integrating sparse information from experiments with simulation data

MuMMI RAS defines the specific functionalities needed for the various components and scales of a target multiscale simulation. The application components need to define the scales, how to read the corresponding data, how to perform ML-based selection, how to run the simulations, how to perform analysis, and how to perform feedback. This code uses several utilities made available through "MuMMI Core".
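
For illustration only, the sketch below outlines the kind of interface such an application component might expose. The class and method names are hypothetical and are not the actual MuMMI Core API; they simply mirror the responsibilities listed above.

# Hypothetical sketch of an application component for one simulation scale.
# Names are illustrative only and do NOT reflect the actual MuMMI API.
from abc import ABC, abstractmethod

class ScaleComponent(ABC):
    """One scale (e.g., coarse-grained) of a target multiscale simulation."""

    @abstractmethod
    def read_data(self, path):
        """Read the simulation data produced at this scale."""

    @abstractmethod
    def select_candidates(self, candidates, n):
        """Use an ML model to pick the n most relevant candidates to promote."""

    @abstractmethod
    def run_simulation(self, config):
        """Launch a simulation job for a selected candidate."""

    @abstractmethod
    def analyze(self, frame):
        """Analyze a simulation frame and extract features."""

    @abstractmethod
    def feedback(self, results):
        """Aggregate analysis results and feed them back to the coarser scale."""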

Publications

The MuMMI framework is described in the following publications.

  1. Bhatia et al. Generalizable Coordination of Large Multiscale Ensembles: Challenges and Learnings at Scale. In Proceedings of the ACM/IEEE International Conference for High Performance Computing, Networking, Storage and Analysis, SC '21, Article No. 10, November 2021. doi:10.1145/3458817.3476210.

  2. Di Natale et al. A Massively Parallel Infrastructure for Adaptive Multiscale Simulations: Modeling RAS Initiation Pathway for Cancer. In Proceedings of the ACM/IEEE International Conference for High Performance Computing, Networking, Storage and Analysis, SC '19, Article No. 57, November 2019. doi:10.1145/3295500.3356197.
    Best Paper at SC 2019.

  3. Ingólfsson et al. Machine Learning-driven Multiscale Modeling Reveals Lipid-Dependent Dynamics of RAS Signaling Protein. Proceedings of the National Academy of Sciences (PNAS), accepted, 2021 (preprint available).

  4. Reciprocal Coupling of Coarse-Grained and All-Atom scales. In preparation.

Installation

# clone and install the MuMMI RAS application
git clone https://github.com/mummi-framework/mummi-ras
cd mummi-ras
pip3 install .

# environment variables expected by MuMMI at runtime
export MUMMI_ROOT=/path/to/outputs
export MUMMI_CORE=/path/to/core/repo
export MUMMI_APP=/path/to/app/repo
export MUMMI_RESOURCES=/path/to/resources

The installation process described above installs the MuMMI framework itself; the simulation codes (gridsim2d, ddcMD, AMBER, GROMACS) are not included and must be installed separately.

Spack installation: we are also working toward releasing the option of installing MuMMI and its dependencies through Spack.
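
As a minimal sanity check (not part of MuMMI itself), a small snippet along these lines can confirm that the environment variables above are set before launching a run; only the variable names are taken from the commands above.

# Illustrative check that the MuMMI environment variables are defined.
import os
import sys

REQUIRED = ["MUMMI_ROOT", "MUMMI_CORE", "MUMMI_APP", "MUMMI_RESOURCES"]

missing = [name for name in REQUIRED if name not in os.environ]
if missing:
    sys.exit("Missing environment variables: " + ", ".join(missing))

for name in REQUIRED:
    print(name, "=", os.environ[name])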

Authors and Acknowledgements

MuMMI was developed at Lawrence Livermore National Laboratory (LLNL), in collaboration with Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL), and International Business Machines (IBM). A list of the main contributors is given below.

  • LLNL: Harsh Bhatia, Francesco Di Natale, Helgi I Ingólfsson, Joseph Y Moon, Xiaohua Zhang, Joseph R Chavez, Fikret Aydin, Tomas Oppelstrup, Timothy S Carpenter, Shiv Sundaram (previously LLNL), Gautham Dharuman (previously LLNL), Dong H Ahn, Stephen Herbein, Tom Scogland, Peer-Timo Bremer, and James N Glosli.

  • LANL: Chris Neale and Cesar Lopez

  • ORNL: Chris Stanley

  • IBM: Sara K Schumacher

MuMMI was funded by the Pilot2 project led by Dr. Fred Streitz (DOE) and Dr. Dwight Nissley (NIH). We acknowledge contributions from the entire Pilot 2 team.

This work was performed under the auspices of the U.S. Department of Energy (DOE) by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, Los Alamos National Laboratory (LANL) under Contract DE-AC52-06NA25396, and Oak Ridge National Laboratory under Contract DE-AC05-00OR22725.

Contact: Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA 94550.

Contributing

Contributions may be made through pull requests and/or issues on GitHub.

License

MuMMI RAS is distributed under the terms of the MIT License.

Livermore Release Number: LLNL-CODE-827655

Comments
  • Are the trajectories in your publications publicly available?

    Hi, Congrats on the success, and huge thanks for making it open source. I wonder whether the trajectories in your publications are publicly available. Or are there any demo trajectories?

    I am a Ph.D. student at KAUST, using computer graphics to build and visualize mesoscale biology models, such as SARS-CoV-2 and bacteriophage T4. If possible, I (and my colleagues) would like to perform (multiscale, multi-representation, multi-granularity) visualization research on the trajectories you generated.

    Many thanks, Roden

    opened by RodenLuo 2
  • `flux` vs `slurm`

    Hi,

    As flux is mentioned in the dependencies, is it possible to reproduce MuMMI RAS on a cluster that only has slurm?

    Workflow dependencies (e.g., python, flux, dynim, keras, etc.)

    Quoted from: https://github.com/mummi-framework/mummi-ras/blob/main/INSTALL.md

    Many thanks, Roden

    opened by RodenLuo 0
  • gridsim2d availability

    Hi, I wonder if the following code is available or not.

    gridsim2d: to be released shortly

    Quoted from: https://github.com/mummi-framework/mummi-ras/blob/main/INSTALL.md

    Thanks, Roden

    opened by RodenLuo 0
  • Patch for gromacs availability

    Hi, I wonder if the following patch is available or not.

    Note that we have a patch for gromacs installation for customization. To be open-sourced soon.

    Quoted from: https://github.com/mummi-framework/mummi-ras/blob/main/INSTALL.md

    Thanks, Roden

    opened by RodenLuo 0
  • Small scale test data for local deployment

    Hi, I'm interested in deploying MuMMI on the KAUST IBEX cluster. It is mentioned in the installation doc that there is a small set of test data. Is it now publicly available? If not, is it possible for me to somehow access it so that I can perform a test run?

    Many thanks, Roden

    Again on lassen and on summit, we have created a small set of test data, which can be used to launch MuMMI at small scales. This (and the larger dataset) will be made public through NCI website. Until then, we can make this data available upon request.

    opened by RodenLuo 1