# Local inference

The Ersilia Model Hub is conveniently offered as a Python package through PyPI and conda-forge, and each model is individually packaged as a Docker container.

## Installation in Linux/MacOS

Ersilia is only maintained for Linux and macOS. If you work on Windows, please use the [Windows Subsystem for Linux](https://learn.microsoft.com/en-us/windows/wsl/install).

#### Prerequisites

* Python: we maintain Ersilia for Python 3.8 and above. Please make sure you have a compatible Python version installed on your computer. Visit the [official Python site](https://www.python.org) to learn more.
* Conda: ensure either [Anaconda](https://docs.anaconda.com/anaconda/install/index.html) or [Miniconda](https://docs.anaconda.com/miniconda/) is available on your system. These are the commands to install Miniconda on **Ubuntu** (they may differ if you do not use Ubuntu):

```bash
mkdir -p ~/miniconda3
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
rm -rf ~/miniconda3/miniconda.sh
~/miniconda3/bin/conda init bash
~/miniconda3/bin/conda init zsh
```

* Docker: Docker containers are an excellent way to share applications and simplify the management of system requirements and configurations. Please [install Docker](https://www.docker.com) to ensure that all our AI/ML assets will work smoothly on your local machine.

#### Install from PyPI

```bash
# install ersilia from PyPI in a dedicated conda environment
conda create -n ersilia python=3.12
conda activate ersilia
pip install ersilia
```

#### Install from conda-forge

```bash
# install ersilia from conda-forge in a dedicated conda environment
conda create -n ersilia python=3.12
conda activate ersilia
conda install -c conda-forge ersilia
```

Once the Ersilia Model Hub is installed, check that it works by running the `--help` command:

```bash
ersilia --help
```

## Input and output

The Ersilia Model Hub takes **chemical structures** as input, which should be specified as SMILES strings. To obtain the SMILES string of your compounds, you can use resources like [PubChem](https://pubchem.ncbi.nlm.nih.gov/).

Ersilia only accepts input files in CSV format, with a single column and a header. Predictions are returned in tabular format as a .csv file.

{% code title="input.csv" %}

```csv
smiles
C1=C(SC(=N1)SC2=NN=C(S2)N)[N+](=O)[O-]
CC(C)CC1=CC=C(C=C1)C(C)C(=O)O
```

{% endcode %}
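
If you prefer to generate the input file programmatically, a minimal sketch using Python's standard `csv` module (the file name `input.csv` matches the example above; the SMILES strings are the same two molecules):

```python
import csv

# Example SMILES strings (same molecules as in the sample input.csv)
smiles_list = [
    "C1=C(SC(=N1)SC2=NN=C(S2)N)[N+](=O)[O-]",
    "CC(C)CC1=CC=C(C=C1)C(C)C(=O)O",
]

# Write a one-column CSV with the required "smiles" header
with open("input.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["smiles"])
    for s in smiles_list:
        writer.writerow([s])
```
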

### Understanding Model Outputs

Because the Ersilia Model Hub hosts a wide variety of models, the outputs you receive will look different depending on the specific model you run.

When you run a prediction (either locally via the CLI or through the online inference app), the resulting CSV file will always contain at least two standard columns:

* `key`: A 32-character unique identifier created by Ersilia for the molecule.
* `input`: The input itself (SMILES string).

The remaining columns contain the actual predictions generated by the model. To find out exactly what each prediction column means, what type of data it contains (float, integer, string), and how to interpret the direction of the value, you should refer to the model's specific documentation.

You can find the exact definitions for a model's output columns in two places:

1. The Model's GitHub Repository: You can read the description directly in the README file of the specific model's repository (e.g., `https://github.com/ersilia-os/eos...`).
2. The `run_columns.csv` File: Inside every Ersilia model repository, there is a dedicated file that strictly defines the outputs. You can find this file located at:

   `model/framework/columns/run_columns.csv`
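
Since every output CSV shares the `key` and `input` columns, prediction columns can be separated from identifiers generically. A minimal sketch using Python's standard `csv` module, assuming an output file like the one produced by `ersilia run`; the prediction column name `outcome` and the key values below are placeholders, since real column names and key formats are model-specific:

```python
import csv
import io

# Hypothetical output mimicking the CSV returned by `ersilia run`.
# "outcome" and the key values are placeholders; check the model's
# run_columns.csv for the real column definitions.
raw = """key,input,outcome
PLACEHOLDERKEY1,C1=C(SC(=N1)SC2=NN=C(S2)N)[N+](=O)[O-],0.12
PLACEHOLDERKEY2,CC(C)CC1=CC=C(C=C1)C(C)C(=O)O,0.87
"""

reader = csv.DictReader(io.StringIO(raw))
rows = list(reader)

# The first two columns are always Ersilia's standard identifiers;
# everything else is a model prediction.
standard = {"key", "input"}
prediction_columns = [c for c in reader.fieldnames if c not in standard]
```

To read a file on disk, replace `io.StringIO(raw)` with `open("output.csv")`.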

## Model Usage

### Using the Ersilia CLI

{% stepper %}
{% step %}

### Fetch

Download a model, along with its specific environment and dependencies, from the Ersilia Model Hub to your local machine.

```bash
ersilia fetch eos2r5a
```

{% hint style="warning" %}
⚠️ The fetch command will download the model from Docker Hub. Please make sure Docker is running on your system before fetching a model.
{% endhint %}

{% hint style="info" %}
You can access the list of available models in our [ersilia catalog](https://catalog.ersilia.io/).
{% endhint %}
{% endstep %}

{% step %}

### Serve

Initialize the downloaded model by spinning up a local container or API, preparing it to accept inputs and generate predictions.

```bash
ersilia serve eos2r5a
```

{% hint style="warning" %}
If you serve a model that is not available locally, Ersilia will try to fetch it automatically, first from Docker Hub and, if Docker is not active, defaulting to S3.
{% endhint %}
{% endstep %}

{% step %}

### Run

Pass your input data (such as SMILES strings) through the active model to generate and retrieve predictions.

```bash
ersilia run -i input.csv -o output.csv
```

{% endstep %}

{% step %}

### Close

Shut down the active model and terminate its local container or API to free up system resources.

```bash
ersilia close
```

{% endstep %}
{% endstepper %}
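
The four CLI steps above can also be chained in a script. A minimal sketch using Python's standard `subprocess` module, reusing the model identifier `eos2r5a` and file names from the examples above (the `run_pipeline` helper is illustrative, not part of Ersilia):

```python
import subprocess

# Commands mirroring the fetch / serve / run / close steps above
commands = [
    ["ersilia", "fetch", "eos2r5a"],
    ["ersilia", "serve", "eos2r5a"],
    ["ersilia", "run", "-i", "input.csv", "-o", "output.csv"],
    ["ersilia", "close"],
]

def run_pipeline(runner=subprocess.run):
    # `runner` is injectable so the sequence can be exercised
    # without Ersilia installed; by default each command runs in
    # order and raises if one fails (check=True).
    for cmd in commands:
        runner(cmd, check=True)
```

Because `ersilia close` frees the model's resources, it should run even when `run` fails; in production code you would wrap the middle steps in `try`/`finally`.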

### As a Python package

Models can be fetched from the Ersilia Model Hub, served, and run as a Python package. The commands mirror the Command Line Interface for simplicity:

```python
from ersilia.api import Model

input_list = [
    "C1=C(SC(=N1)SC2=NN=C(S2)N)[N+](=O)[O-]",
    "CC(C)CC1=CC=C(C=C1)C(C)C(=O)O"
]

mdl = Model(model="eos3b5e")
mdl.fetch()
mdl.serve()
df = mdl.run(input_list)
mdl.close()
```
