{ "cells": [ { "cell_type": "markdown", "metadata": { "nbsphinx": "hidden" }, "source": [ "Hi there 👋 \n", "\n", "It looks like you're running this notebook in [Google Colab](https://colab.research.google.com/github/jla-gardner/graph-pes/blob/colab/docs/source/tools/lammps.ipynb).\n", "\n", "To get everything below working, please first run the following cell to enable ``conda`` integration. Once this has been run once, the kernel will restart automatically. In the restarted session, run the cell again and keep going from there 😊\n", "\n", "If you want to run this notebook with GPU acceleration, please ensure the correct runtime is selected before running any of the cells below." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "nbsphinx": "hidden" }, "outputs": [], "source": [ "!pip install -q condacolab\n", "import condacolab\n", "condacolab.install()" ] }, { "cell_type": "markdown", "metadata": { "nbsphinx": "hidden" }, "source": [ "Installing ``graph-pes`` here takes an unfortunate amount of time on Colab (c. 4 minutes), since we're installing lots of packages into the base conda environment we set up above.\n", "\n", "Please ignore any messages you get from Colab about wanting to restart the kernel - this is uneccesary here." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "nbsphinx": "hidden" }, "outputs": [], "source": [ "# this takes an unfortunate amount of time on colab, since we're installing \n", "# lots of packages into the base conda environment we set up above\n", "#\n", "!pip install -q graph-pes" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# LAMMPS\n", "\n", "> **FYI**, you can open this documentation as a [Google Colab notebook](https://colab.research.google.com/github/jla-gardner/graph-pes/blob/main/docs/source/tools/lammps.ipynb) to follow along interactively.\n", "\n", "[LAMMPS](https://www.lammps.org/) is a **fast** software package for molecular dynamics simulations written in ``C++``.\n", "\n", "[graph-pes](https://github.com/jla-gardner/graph-pes) provides a ``pair_style`` for interfacing LAMMPS to **any model** that inherits from [GraphPESModel](https://jla-gardner.github.io/graph-pes/models/root.html#graph_pes.GraphPESModel) - that includes all of the architectures implemented in ``graph-pes``, together with any other ones you implement yourself.\n", "\n", "To run MD simulations with a ``graph-pes`` model, you need to do 3 things:\n", "\n", "1. build LAMMPS with support for ``graph_pes``\n", "2. \"deploy\" the model for use with LAMMPS\n", "3. run LAMMPS with the ``pair_style graph_pes`` command\n", "\n", "Let's go through each of these steps in turn.\n", "\n", "## 1. Building LAMMPS\n", "\n", "Compiling LAMMPS can be a pain. \n", "\n", "To combat this, we provide a [script](https://github.com/jla-gardner/graph-pes/blob/main/scripts/build-lammps.sh) that attempts to automatically build LAMMPS with ``graph_pes`` support. We can't guarantee that this will work in all cases, but we've tested it on various **Linux** machines, and in Google Colab, with success. If you have any issues, please [open an issue](https://github.com/jla-gardner/graph-pes/issues).\n" ] }, { "cell_type": "raw", "metadata": { "raw_mimetype": "text/restructuredtext", "vscode": { "languageId": "raw" } }, "source": [ ".. note::\n", "\n", " Our current ``pair_graph_pes.cpp`` implementation assumes that you will be running LAMMPS in a single process (no MPI parallelism), optionally with a single GPU attached. A multi-GPU and MPI parallelised ``pair_style`` is on our roadmap for the near future (watch this space).\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's start by downloading this script and running it with the ``--help`` flag to see what options are available:" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Usage: ../../../scripts/build-lammps.sh [OPTIONS]\n", "\n", "This script builds a LAMMPS executable with support for 'pair_style graph_pes'.\n", "\n", "It performs the following tasks:\n", " 1. Locates the graph-pes installation\n", " 2. Creates a conda environment with necessary dependencies\n", " 3. Clones the LAMMPS repository\n", " 4. Patches LAMMPS with graph-pes source code\n", " 5. Builds LAMMPS with graph-pes support\n", "\n", "The final executable will be located at:\n", " ./graph_pes_lmp_cpu_only (if --cpu-only is used)\n", " ./graph_pes_lmp (default, with GPU support)\n", "\n", "Requirements:\n", " - conda\n", "\n", "Options:\n", " --help Display this help message and exit\n", " --cpu-only Build LAMMPS for CPU only (default: GPU enabled)\n", " --force-rebuild Force rebuilding of conda environment and LAMMPS\n", "\n" ] } ], "source": [ "!wget https://tinyurl.com/graph-pes-build-lammps -O build-lammps.sh\n", "!bash build-lammps.sh --help" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This ``build-lammps.sh`` script creates a custom, throwaway conda environment to ensure a uniform build process. It you don't have conda installed, you can install it from e.g. [here](https://docs.conda.io/en/latest/miniconda.html).\n", "\n", "> **Note:**\n", "> As we discover common problems, we'll document them here:\n", ">\n", "> - ``CMake Error at (...) Failed to detect a default CUDA architecture.``\n", "> This is a common issue when installing on e.g. an HPC system. A simple fix (if available) is to run ``module load cuda`` before running the script.\n", "\n", "Now let's build our LAMMPS executable:" ] }, { "cell_type": "markdown", "metadata": { "nbsphinx": "hidden" }, "source": [ "**NB:** this may take a while to run. On tested local Linux machines, it takes ~3 minutes. On Google Colab, it can take up to 20 minutes! Thankfully, this need only be done once per session. We're looking into ways to speed this up in the future." ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Running build-lammps.sh with the following parameters:\n", " CPU_ONLY: false\n", " FORCE_REBUILD : false\n", "Found graph-pes pair style at (...)/graph_pes/pair_style\n", "Creating conda environment lammps-env-gpu-throwaway\n", "Channels:\n", " - conda-forge\n", " - defaults\n", "Platform: linux-64\n", "Collecting package metadata (repodata.json): ...working... done\n", "Solving environment: ...working... done\n", "\n", "Downloading and Extracting Packages: ...working... done\n", "Preparing transaction: ...working... done\n", "Verifying transaction: ...working... done\n", "Executing transaction: ...working... done\n", "(...)\n", "Installing collected packages: mpmath, typing-extensions, sympy, nvidia-nvtx-cu12, nvidia-nvjitlink-cu12, nvidia-nccl-cu12, nvidia-curand-cu12, nvidia-cufft-cu12, nvidia-cuda-runtime-cu12, nvidia-cuda-nvrtc-cu12, nvidia-cuda-cupti-cu12, nvidia-cublas-cu12, numpy, networkx, MarkupSafe, fsspec, filelock, triton, nvidia-cusparse-cu12, nvidia-cudnn-cu12, jinja2, nvidia-cusolver-cu12, torch\n", "Successfully installed MarkupSafe-3.0.1 filelock-3.16.1 fsspec-2024.9.0 jinja2-3.1.4 mpmath-1.3.0 networkx-3.4.1 numpy-2.1.2 nvidia-cublas-cu12-12.1.3.1 nvidia-cuda-cupti-cu12-12.1.105 nvidia-cuda-nvrtc-cu12-12.1.105 nvidia-cuda-runtime-cu12-12.1.105 nvidia-cudnn-cu12-9.1.0.70 nvidia-cufft-cu12-11.0.2.54 nvidia-curand-cu12-10.3.2.106 nvidia-cusolver-cu12-11.4.5.107 nvidia-cusparse-cu12-12.1.0.106 nvidia-nccl-cu12-2.20.5 nvidia-nvjitlink-cu12-12.6.77 nvidia-nvtx-cu12-12.1.105 sympy-1.13.3 torch-2.4.1 triton-3.0.0 typing-extensions-4.12.2\n", "\n", "done\n", "#\n", "# To activate this environment, use\n", "#\n", "# $ conda activate lammps-env-gpu-throwaway\n", "#\n", "# To deactivate an active environment, use\n", "#\n", "# $ conda deactivate\n", "\n", "Conda environment lammps-env-gpu-throwaway successfully activated\n", "Cloning into 'lammps'...\n", "remote: Enumerating objects: 403276, done.\n", "remote: Counting objects: 100% (3486/3486), done.\n", "remote: Compressing objects: 100% (1504/1504), done.\n", "remote: Total 403276 (delta 2317), reused 3036 (delta 1977), pack-reused 399790 (from 1)\n", "Receiving objects: 100% (403276/403276), 757.23 MiB | 52.42 MiB/s, done.\n", "Resolving deltas: 100% (332564/332564), done.\n", "Updating files: 100% (13334/13334), done.\n", "-- The CXX compiler identification is GNU 12.4.0\n", "-- Detecting CXX compiler ABI info\n", "-- Detecting CXX compiler ABI info - done\n", "-- Check for working CXX compiler: (..)/miniconda3/envs/lammps-env-gpu-throwaway/bin/c++ - skipped\n", "-- Detecting CXX compile features\n", "-- Detecting CXX compile features - done\n", "-- Found Git: (...)/miniconda/envs/lammps-env-gpu-throwaway/bin/git (found version \"2.47.0\")\n", "-- Running check for auto-generated files from make-based build system\n", "-- Checking for module 'mpi-cxx'\n", "-- Package 'mpi-cxx', required by 'virtual:world', not found\n", "-- Looking for C++ include omp.h\n", "-- Looking for C++ include omp.h - found\n", "-- Found OpenMP_CXX: -fopenmp (found version \"4.5\")\n", "-- Found OpenMP: TRUE (found version \"4.5\") found components: CXX\n", "-- Found CURL: (...)/miniconda/envs/lammps-env-gpu-throwaway/lib/libcurl.so (found version \"8.10.1\") found components: HTTP HTTPS\n", "-- Found GZIP: /usr/bin/gzip\n", "-- Could NOT find FFMPEG (missing: FFMPEG_EXECUTABLE) \n", "-- Looking for C++ include cmath\n", "-- Looking for C++ include cmath - found\n", "-- Generating style headers...\n", "-- Generating package headers...\n", "-- Generating lmpinstalledpkgs.h...\n", "-- Found Python3: (...)/miniconda/envs/lammps-env-gpu-throwaway/bin/python3.10 (found version \"3.10.15\") found components: Interpreter\n", "-- Could NOT find ClangFormat (missing: ClangFormat_EXECUTABLE) (Required is at least version \"11.0\")\n", "-- The following tools and libraries have been found and configured:\n", " * Git\n", " * OpenMP\n", " * CURL\n", " * Python3\n", "\n", "-- <<< Build configuration >>>\n", " LAMMPS Version: 20240829 patch_29Aug2024-modified\n", " Operating System: Linux Rocky 8.9\n", " CMake Version: 3.30.5\n", " Build type: RelWithDebInfo\n", " Install path: (...)/.local\n", " Generator: Unix Makefiles using /usr/bin/gmake\n", "-- Enabled packages: \n", "-- <<< Compilers and Flags: >>>\n", "-- C++ Compiler: (...)/miniconda/envs/lammps-env-gpu-throwaway/bin/c++\n", " Type: GNU\n", " Version: 12.4.0\n", " C++ Standard: 17\n", " C++ Flags: -O2 -g -DNDEBUG\n", " Defines: LAMMPS_SMALLBIG;LAMMPS_MEMALIGN=64;LAMMPS_OMP_COMPAT=4;LAMMPS_CURL;LAMMPS_GZIP\n", "-- <<< Linker flags: >>>\n", "-- Executable name: lmp\n", "-- Static library flags: \n", "-- Found CUDA: /usr/local/cuda-12.2 (found version \"12.2\") \n", "-- The CUDA compiler identification is NVIDIA 12.2.140\n", "-- Detecting CUDA compiler ABI info\n", "-- Detecting CUDA compiler ABI info - done\n", "-- Check for working CUDA compiler: /usr/local/cuda-12.2/bin/nvcc - skipped\n", "-- Detecting CUDA compile features\n", "-- Detecting CUDA compile features - done\n", "-- Found CUDAToolkit: /usr/local/cuda-12.2/include (found version \"12.2.140\")\n", "-- Performing Test CMAKE_HAVE_LIBC_PTHREAD\n", "-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed\n", "-- Looking for pthread_create in pthreads\n", "-- Looking for pthread_create in pthreads - not found\n", "-- Looking for pthread_create in pthread\n", "-- Looking for pthread_create in pthread - found\n", "-- Found Threads: TRUE\n", "-- Caffe2: CUDA detected: 12.2\n", "-- Caffe2: CUDA nvcc is: /usr/local/cuda-12.2/bin/nvcc\n", "-- Caffe2: CUDA toolkit directory: /usr/local/cuda-12.2\n", "-- Caffe2: Header version is: 12.2\n", "-- /usr/local/cuda-12.2/lib64/libnvrtc.so shorthash is 000ca627\n", "-- USE_CUDNN is set to 0. Compiling without cuDNN support\n", "-- USE_CUSPARSELT is set to 0. Compiling without cuSPARSELt support\n", "-- Autodetected CUDA architecture(s): 8.6 8.6\n", "-- Added CUDA NVCC flags for: -gencode;arch=compute_86,code=sm_86\n", "-- Found Torch: (...)/miniconda/envs/lammps-env-gpu-throwaway/lib/python3.10/site-packages/torch/lib/libtorch.so\n", "-- Configuring done (4.9s)\n", "-- Generating done (0.0s)\n", "-- Build files have been written to: (...)/graph-pes/ignore/lammps/build\n", "Building LAMMPS executable\n", "(...)\n", "[ 98%] Linking CXX static library liblammps.a\n", "[ 98%] Built target lammps\n", "[ 98%] Building CXX object CMakeFiles/lmp.dir(...)/graph-pes/ignore/lammps/src/main.cpp.o\n", "[100%] Linking CXX executable lmp\n", "[100%] Built target lmp\n", "LAMMPS executable successfully built with graph-pes, and is available at ./graph_pes_lmp\n", "\n" ] } ], "source": [ "!bash build-lammps.sh" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can check that our ``graph_pes_lmp`` executable works by inspecting the list of available ``pair_style`` modules:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "* Pair styles:\n", "\n", "born buck buck/coul/cut coul/cut coul/debye \n", "coul/dsf coul/wolf meam/c reax reax/c \n", "mesont/tpm graph_pes hybrid hybrid/omp hybrid/molecular \n", "hybrid/molecular/omp hybrid/overlay hybrid/overlay/omp \n", "hybrid/scaled hybrid/scaled/omp lj/cut lj/cut/coul/cut \n", "lj/expand morse soft table yukawa \n", "zbl zero \n" ] } ], "source": [ "!./graph_pes_lmp -h | grep -A8 \"* Pair styles:\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Success!\n", "\n", "## 2. Model deployment\n", "\n", "To use a [GraphPESModel](https://jla-gardner.github.io/graph-pes/models/root.html#graph_pes.GraphPESModel) with LAMMPS, we need to first convert it to a format that LAMMPS can understand. This involves compiling the ``PyTorch`` object into a ``C++`` object using ``TorchScript``. All function and classes provided by ``graph-pes`` are compatible with this process.\n", "\n", "By default, the [graph-pes-train](https://jla-gardner.github.io/graph-pes/cli/graph-pes-train.html) command saves a deployed model to ``/lammps_model.pt``. We can also deploy models from within ``Python`` using the [deploy_model](https://jla-gardner.github.io/graph-pes/tools/lammps.html#graph_pes.utils.lammps.deploy_model) function." ] }, { "cell_type": "raw", "metadata": { "raw_mimetype": "text/restructuredtext", "vscode": { "languageId": "raw" } }, "source": [ ".. autofunction:: graph_pes.utils.lammps.deploy_model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's create a simple [LennardJones](https://jla-gardner.github.io/graph-pes/models/pairwise.html#graph_pes.models.LennardJones) model and deploy it for use in LAMMPS:" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "from graph_pes.models import LennardJones\n", "from graph_pes.utils.lammps import deploy_model\n", "\n", "model = LennardJones(epsilon=0.4, sigma=1.2, cutoff=5.0)\n", "deploy_model(model, \"lj_model_lammps.pt\")" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "lj_model_lammps.pt\n" ] } ], "source": [ "!ls | grep \".*pt\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 3. Running MD\n", "\n", "Now let's run a simple \"NVT\" simulation of a copper crystal using our Lennard-Jones potential.\n", "\n", "First of all, we create a starting structure for our simulation:" ] }, { "cell_type": "code", "execution_count": 73, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", " \n", " \n", "
\n", " \n", "\n", "\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "\n", "\n", "\n", "\n", "\n", "
\n", " \n", " \n", "\n" ], "text/plain": [ "" ] }, "execution_count": 73, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import numpy as np\n", "from ase.build import bulk, make_supercell\n", "from load_atoms import view\n", "\n", "atoms = bulk(\"Cu\", \"hcp\", a=3.6, c=5.8)\n", "atoms = make_supercell(atoms, np.eye(3) * 5)\n", "atoms.write(\"starting-structure.data\", format=\"lammps-data\")\n", "\n", "view(atoms)" ] }, { "cell_type": "code", "execution_count": 74, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "starting-structure.data (written by ASE) \n", "\n", "250 \t atoms \n", "1 atom types\n", "0.0 18 xlo xhi\n", "0.0 15.588457268119896 ylo yhi\n", "0.0 29 zlo zhi\n", " -9 0 0 xy xz yz\n", "\n", "\n", "Atoms \n", "\n", " 1 1 0 0 0\n", " 2 1 0 2.0784609690826525 2.8999999999999999\n", " 3 1 0 0 5.8000000000000007\n" ] } ], "source": [ "!cat starting-structure.data | head -n 15" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### `pair_style graph_pes` usage\n", "\n", "Usage:\n", "\n", "```bash\n", "pair_style graph_pes \n", "pair_coeff * * \n", "```\n", "\n", "Use ``pair_style graph_pes cpu`` to run the model on the CPU only. By default, the model will be run on the GPU if available.\n", "\n", "In the ``pair_coeff`` command, replace ```` with a path to your **deployed** model and ```` with the chemical symbols of the atoms in your system (e.g. ``Cu`` for copper) **in the order of their respective LAMMPS types**.\n", "\n", "For instance, if you have a structure with C, H and O atoms, and the LAMMPS types for C, H and O are 1, 2 and 3 respectively, then you should use ``pair_coeff * * lj_model_lammps.pt C H O``." ] }, { "cell_type": "code", "execution_count": 75, "metadata": {}, "outputs": [], "source": [ "from pathlib import Path\n", "\n", "input_script = \"\"\"\n", "# a basic NVT simulation using pair_style graph_pes\n", "# \n", "# expected variables:\n", "# output_dir: path to output directory\n", "# temp_K: target temperature in K\n", "# timestep_fs: simulation timestep in fs\n", "# total_time: total simulation time in ps\n", "\n", "variable timestep_ps equal ${timestep_fs}/1000\n", "variable n_steps equal ${total_time}/${timestep_ps}\n", "\n", "log ${output_dir}/log.lammps\n", "\n", "# Set units to 'metal' for atomic units (ps, eV, etc.)\n", "units metal\n", "atom_style atomic\n", "newton off\n", "\n", "# Read initial structure from data file\n", "read_data starting-structure.data\n", "mass 1 63.546\n", "\n", "# Define graph-pes pair style\n", "pair_style graph_pes\n", "pair_coeff * * lj_model_lammps.pt Cu\n", "\n", "# Define neighbor list\n", "neighbor 0.5 bin\n", "neigh_modify delay 0 every 1 check yes\n", "\n", "timestep ${timestep_ps}\n", "\n", "# Dump output every 50 timesteps\n", "dump 1 all atom 50 ${output_dir}/dumps/*.data\n", "\n", "# Thermodynamic output every 1000 steps\n", "thermo 1000\n", "thermo_style custom step time cpu temp etotal press\n", "thermo_modify flush yes\n", "\n", "# Setup NVT ensemble with Nose-Hoover thermostat\n", "# relaxation time of 10 fs\n", "fix 1 all nvt temp ${temp_K} ${temp_K} 0.01\n", "velocity all create ${temp_K} 42\n", "run ${n_steps}\n", "\"\"\"\n", "\n", "Path(\"nvt.in\").write_text(input_script);" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now lets (finally) run our simulation:" ] }, { "cell_type": "code", "execution_count": 76, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "LAMMPS (29 Aug 2024)\n", "OMP_NUM_THREADS environment is not set. Defaulting to 1 thread. (src/comm.cpp:98)\n", " using 1 OpenMP thread(s) per MPI task\n", "Reading data file ...\n", " triclinic box = (0 0 0) to (18 15.588457 29) with tilt (-9 0 0)\n", "WARNING: Triclinic box skew is large. LAMMPS will run inefficiently. (src/domain.cpp:221)\n", " 1 by 1 by 1 MPI processor grid\n", " reading atoms ...\n", " 250 atoms\n", " read_data CPU = 0.001 seconds\n", "GraphPES is using device cuda\n", "Loading model from lj_model_lammps.pt\n", "Freezing TorchScript model...\n", "\n", "CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE\n", "\n", "Your simulation uses code contributions which should be cited:\n", "- Type Label Framework: https://doi.org/10.1021/acs.jpcb.3c08419\n", "The log file lists these citations in BibTeX format.\n", "\n", "CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE-CITE\n", "\n", "Neighbor list info ...\n", " update: every = 1 steps, delay = 0 steps, check = yes\n", " max neighbors/atom: 2000, page size: 100000\n", " master list distance cutoff = 5.5\n", " ghost atom cutoff = 5.5\n", " binsize = 2.75, bins = 10 6 11\n", " 1 neighbor lists, perpetual/occasional/extra = 1 0 0\n", " (1) pair graph_pes, perpetual\n", " attributes: full, newton off\n", " pair build: full/bin/atomonly\n", " stencil: full/bin/3d\n", " bin: standard\n", "Setting up Verlet run ...\n", " Unit style : metal\n", " Current step : 0\n", " Time step : 0.001\n", "Per MPI rank memory allocation (min/avg/max) = 3.076 | 3.076 | 3.076 Mbytes\n", " Step Time CPU Temp TotEng Press \n", " 0 0 0 400 9.4960562 361.48303 \n", " 1000 1 3.411231 432.8207 -87.101875 -880.49216 \n", " 2000 2 6.4558746 413.18378 -190.99657 -2779.0133 \n", " 3000 3 9.5907054 437.73432 -245.78015 658.24861 \n", " 4000 4 12.88936 404.33008 -279.66436 1123.0481 \n", " 5000 5 16.439895 404.60413 -290.74592 132.02435 \n", " 6000 6 20.475099 405.53413 -325.35688 3090.0458 \n", " 7000 7 24.82393 454.21859 -338.08393 -370.19839 \n", " 8000 8 30.120096 414.12291 -371.99827 1340.3408 \n", " 9000 9 36.077551 439.844 -378.79259 6711.7289 \n", " 10000 10 42.336936 404.27497 -385.80563 -737.94683 \n", "Loop time of 42.3371 on 1 procs for 10000 steps with 250 atoms\n", "\n", "Performance: 20.408 ns/day, 1.176 hours/ns, 236.199 timesteps/s, 59.050 katom-step/s\n", "99.4% CPU use with 1 MPI tasks x 1 OpenMP threads\n", "\n", "MPI task timing breakdown:\n", "Section | min time | avg time | max time |%varavg| %total\n", "---------------------------------------------------------------\n", "Pair | 42.133 | 42.133 | 42.133 | 0.0 | 99.52\n", "Neigh | 0.064318 | 0.064318 | 0.064318 | 0.0 | 0.15\n", "Comm | 0.021263 | 0.021263 | 0.021263 | 0.0 | 0.05\n", "Output | 0.055924 | 0.055924 | 0.055924 | 0.0 | 0.13\n", "Modify | 0.048878 | 0.048878 | 0.048878 | 0.0 | 0.12\n", "Other | | 0.01398 | | | 0.03\n", "\n", "Nlocal: 250 ave 250 max 250 min\n", "Histogram: 1 0 0 0 0 0 0 0 0 0\n", "Nghost: 732 ave 732 max 732 min\n", "Histogram: 1 0 0 0 0 0 0 0 0 0\n", "Neighs: 0 ave 0 max 0 min\n", "Histogram: 1 0 0 0 0 0 0 0 0 0\n", "FullNghs: 9430 ave 9430 max 9430 min\n", "Histogram: 1 0 0 0 0 0 0 0 0 0\n", "\n", "Total # of neighbors = 9430\n", "Ave neighs/atom = 37.72\n", "Neighbor list builds = 394\n", "Dangerous builds = 0\n", "Total wall time: 0:00:42\n" ] } ], "source": [ "! mkdir -p nvt-simulation/dumps\n", "! ./graph_pes_lmp -in nvt.in -var output_dir \"nvt-simulation\" -var temp_K 400 -var timestep_fs 1 -var total_time 10\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "That took 42s to simulate 10 ps of MD.\n", "\n", "We've used a very simple LJ model here, and so we expect (a) the simulation to run very quickly, and (b) the MD to have produced some LJ type clusters:" ] }, { "cell_type": "code", "execution_count": 77, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", " \n", " \n", "
\n", " \n", "\n", "\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "\n", "\n", "\n", "\n", "\n", "
\n", " \n", " \n", "\n" ], "text/plain": [ "" ] }, "execution_count": 77, "metadata": {}, "output_type": "execute_result" } ], "source": [ "from pathlib import Path\n", "\n", "from ase.io import read\n", "\n", "final_file = sorted(\n", " Path(\"nvt-simulation/dumps/\").glob(\"*.data\"),\n", " key=lambda x: int(x.stem),\n", ")[-1]\n", "final_structure = read(str(final_file), format=\"lammps-dump-text\")\n", "final_structure.symbols = \"Cu\" * len(final_structure)\n", "view(final_structure)" ] } ], "metadata": { "kernelspec": { "display_name": "graph-pes", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.18" } }, "nbformat": 4, "nbformat_minor": 2 }