diff --git a/.github/workflows/docker.yaml b/.github/workflows/docker.yaml
new file mode 100644
index 0000000..6caf658
--- /dev/null
+++ b/.github/workflows/docker.yaml
@@ -0,0 +1,17 @@
+name: Publish Docker Image
+
+on: [workflow_dispatch]
+
+jobs:
+ publish:
+ runs-on: ubuntu-latest
+ steps:
+ - uses: actions/checkout@v4
+ - name: Publish to Registry
+ uses: elgohr/Publish-Docker-Github-Action@v5
+ with:
+ name: ami-iit/dnn-mpc-walking-docker
+ username: ${{ github.actor }}
+ password: ${{ secrets.GITHUB_TOKEN }}
+ workdir: dockerfiles
+ registry: ghcr.io
diff --git a/CMakeLists.txt b/CMakeLists.txt
index 69c2ae3..53e8732 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -6,7 +6,7 @@ cmake_minimum_required(VERSION 3.14)
## MAIN project
project(DNNMPCWalking
- VERSION 0.0.1)
+ VERSION 1.0.0)
list(APPEND CMAKE_MODULE_PATH ${CMAKE_CURRENT_SOURCE_DIR}/cmake)
diff --git a/README.md b/README.md
index 99b04d6..6724bff 100644
--- a/README.md
+++ b/README.md
@@ -3,126 +3,131 @@ Online DNN-Driven Nonlinear MPC for Stylistic Humanoid Robot Walking with Step A
-Giulio Romualdi, Paolo Maria Viceconte, Lorenzo Moretti, Ines Sorrentino, Stefano Dafarra, Silvio Traversaro, and Daniele Pucci
-
-Paolo Maria Viceconte and Giulio Romualdi are co-first authors
+Giulio Romualdi, Paolo Maria Viceconte, Lorenzo Moretti, Ines Sorrentino, Stefano Dafarra, Silvio Traversaro, and Daniele Pucci
+
+Co-first authors: Paolo Maria Viceconte and Giulio Romualdi
-📅 This paper has been accepted for publication at the 2024 IEEE-RAS International Conference on Humanoid Robots (Humanoids), Nancy, France 🤖
+📅 Accepted for publication at the 2024 IEEE-RAS International Conference on Humanoid Robots (Humanoids), Nancy, France 🤖
-https://github.com/user-attachments/assets/404c9af8-528e-43c2-abd2-138a98adfc04
-
-
+---
## Reproducing the Experiments
-You can reprocue the experiments in two ways: either with conda or pixi.
-
-### Conda
-
+You can reproduce the experiments using **Docker**, **Conda**, or **Pixi**.
-To reproduce the experiments, we provide a conda environment for easy setup. Follow the steps below:
+### Docker
-#### 1. Install the Environment
-Run the following command to create the conda environment:
+Run the experiments via Docker for an isolated and reproducible environment.
-```bash
-conda env create -f environment.yml
-```
+1. Pull the Docker image:
+ ```bash
+ docker pull ghcr.io/ami-iit/dnn-mpc-walking-docker:latest
+ ```
-#### 2. Activate the Environment
-Activate the newly created environment:
+2. Launch the container:
+ ```bash
+ xhost +
+ docker run -it --rm \
+ --device=/dev/dri:/dev/dri \
+ --env="DISPLAY=$DISPLAY" \
+ --net=host \
+ ghcr.io/ami-iit/dnn-mpc-walking-docker:latest
+ ```
-```bash
-conda activate dnn-mpc-env
-```
+3. Wait for `Gazebo` to start, then launch the experiment (when prompted, type `y` and press `Enter`).
-#### 3. Run the Simulation
-Start the experiment with:
+> ⚠️ **Known Issue:** The Gazebo real-time factor is scaled by a factor of 10 because the IPOPT build shipped in the Docker image uses the MUMPS linear solver. Faster Coin-HSL solvers (e.g., MA97) are available but cannot be redistributed with the image.
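+
+> As background (generic IPOPT behavior, not a setting taken from this repository): IPOPT selects its sparse linear solver through the standard `linear_solver` option, which can be set programmatically or via an `ipopt.opt` file in the working directory:
+>
+> ```text
+> # ipopt.opt -- choose the linear solver IPOPT uses internally
+> linear_solver mumps
+> # with a licensed Coin-HSL build, ma97 is a faster alternative:
+> # linear_solver ma97
+> ```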
-```bash
-./run_simulation.sh
-```
-
-This script will:
-- Launch the Gazebo simulator
-- Start the YARP server (for simulator communication)
-- Initialize the DNN-driven MPC controller
+---
-When prompted, type `y` and press `Enter` to start the simulation. The humanoid robot will begin walking, and you can observe its behavior in the Gazebo simulator.
+### Conda
-⚠️ **Known Issue:** The Gazebo real-time factor is scaled by a factor of 10. This is due to the use of the MUMPS linear solver in the IPOPT docker image. Alternative solvers (e.g., MA27) are available but cannot be redistributed.
+Follow these steps to set up the experiments using Conda:
+1. Install the environment:
+ ```bash
+ conda env create -f environment.yml
+ ```
-### Pixi
+2. Activate the environment:
+ ```bash
+ conda activate dnn-mpc-env
+ ```
-To run the experiments with pixi on Linux, just download the repo and run:
+3. Compile and install the code:
+   ```bash
+   mkdir build && cd build
+   cmake ..
+   make -j
+   make install
+   ```
-~~~
-git clone https://github.com/ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking
-cd paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking
-pixi run -e default run_simulation
-~~~
+4. Return to the repository root and run the simulation:
+   ```bash
+   cd ..
+   ./run_simulation.sh
+   ```
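+
+> 💡 The project's Dockerfile installs into the conda environment's prefix; assuming you want the same layout locally, the `cmake` step can target the active environment (`$CONDA_PREFIX` is set by `conda activate`):
+> ```bash
+> cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=$CONDA_PREFIX ..
+> ```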
-This command will install all the dependencies, compiled the code and run the simulation. At that point, when prompted, type `y` and press `Enter` to start the simulation. The humanoid robot will begin walking, and you can observe its behavior in the Gazebo simulator.
+> ⚠️ The Gazebo real-time factor is scaled by a factor of 10 due to the MUMPS linear solver.
-⚠️ **Known Issue:** The Gazebo real-time factor is scaled by a factor of 10. This is due to the use of the MUMPS linear solver in the IPOPT docker image. If you have have access to a Coin-HSL license, you can use it following the instructions in the following, to reduce the Gazebo real-time factor scaling from 10 to 2.
-
-To run the simulation using the Coin-HSL's `ma97` solver, follow the following steps:
+---
-1. Go to https://licences.stfc.ac.uk/product/coin-hsl, and:
- - If you already have a license for Coin-HSL:
- - Go to https://licences.stfc.ac.uk/account/orders and find the coin-hsl order.
- - Download the coinhsl-2023.11.17.zip file and place it in the './coinhsl_src' folder of this repository.
- - If you do not have a license for Coin-HSL:
- - If you are an academic, request a license at https://licences.stfc.ac.uk/product/coin-hsl.
- - If you are not an academic, purchase a license at https://licences.stfc.ac.uk/product/coin-hsl.
- - Once your order is approved, download the coinhsl-2023.11.17.zip file and place it in the './coinhsl_src' folder.
+### Pixi
-Once the `coinhsl-2023.11.17.zip` archive is in the './coinhsl_src' folder, just run:
+To run the experiments with Pixi:
-~~~
-pixi run -e coinhsl run_simulation
-~~~
+1. Clone the repository:
+ ```bash
+ git clone https://github.com/ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking
+ cd paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking
+ ```
-This will execute the same steps of running `pixi run -e default run_simulation`, but additionally it will:
-* compile `coinhsl` to be able to use the `ma97` linear solver,
-* it will modify the configuration files to use `ma97`
-* use a different Gazebo world model, to ensure a faster simulation.
+2. Run the simulation:
+ ```bash
+ pixi run -e default run_simulation
+ ```
+> **Using the MA97 Solver (Optional):**
+> If you have access to a Coin-HSL license, you can use the MA97 linear solver to reduce the Gazebo real-time factor scaling from 10 to 2:
+> 1. Obtain the Coin-HSL archive (`coinhsl-2023.11.17.zip`) from https://licences.stfc.ac.uk/product/coin-hsl and place it in the `./coinhsl_src` folder.
+> 2. Run:
+>    ```bash
+>    pixi run -e coinhsl run_simulation
+>    ```
+>
+> This runs the same steps as the default environment, but additionally compiles Coin-HSL, switches the configuration files to the `ma97` linear solver, and uses a different Gazebo world model for a faster simulation.
---
## Maintainers
-
-
-
-
-
- 👨💻 @paolo-viceconte
-
- |
-
-
-
- 👨💻 @GiulioRomualdi
-
- |
-
+
diff --git a/dockerfiles/Dockerfile b/dockerfiles/Dockerfile
index 17a9681..de8f1e7 100644
--- a/dockerfiles/Dockerfile
+++ b/dockerfiles/Dockerfile
@@ -35,6 +35,7 @@ WORKDIR /workspace
# Clone the repository with the submodules compile it and install it in the conda environment
RUN git clone https://github.com/ami-iit/paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking.git && \
cd paper_romualdi_viceconte_2024_humanoids_dnn-mpc-walking && \
+ git checkout v1.0.0 && \
mkdir build && \
cd build && \
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/opt/conda/envs/dnn-mpc-env .. && \