[docker] Add Dockerfiles for Rocky 9 #1793

Merged 6 commits on Dec 13, 2024
8 changes: 6 additions & 2 deletions .github/workflows/continuous-integration.yml
@@ -21,15 +21,19 @@ on:
 jobs:
   build-linux:
     runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        container: ["alicevision/alicevision-deps:2024.12.03-ubuntu22.04-cuda12.1.0", "alicevision/alicevision-deps:2024.12.09-rocky9-cuda12.1.0"]
     container:
-      image: alicevision/alicevision-deps:2024.11.25-ubuntu22.04-cuda12.1.0
+      image: ${{ matrix.container }}
     env:
       DEPS_INSTALL_DIR: /opt/AliceVision_install
       BUILD_TYPE: Release
       CTEST_OUTPUT_ON_FAILURE: 1
       ALICEVISION_ROOT: ${{ github.workspace }}/../AV_install
       ALICEVISION_SENSOR_DB: ${{ github.workspace }}/../AV_install/share/aliceVision/cameraSensors.db
       ALICEVISION_LENS_PROFILE_INFO: ""
+      BUILD_CCTAG: "${{ matrix.container == 'alicevision/alicevision-deps:2024.12.03-ubuntu22.04-cuda12.1.0' && 'ON' || 'OFF' }}"
     steps:
       - uses: actions/checkout@v1

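The new `BUILD_CCTAG` variable relies on the GitHub Actions `cond && 'a' || 'b'` ternary idiom: CCTAG is built only for the Ubuntu matrix entry and disabled on Rocky 9. For readers less familiar with that expression syntax, here is a rough shell equivalent (illustration only, container names taken from the matrix above):

```
# Shell sketch of how BUILD_CCTAG resolves for one matrix entry.
container="alicevision/alicevision-deps:2024.12.09-rocky9-cuda12.1.0"
if [ "${container}" = "alicevision/alicevision-deps:2024.12.03-ubuntu22.04-cuda12.1.0" ]; then
    BUILD_CCTAG=ON
else
    BUILD_CCTAG=OFF
fi
echo "BUILD_CCTAG=${BUILD_CCTAG}"   # prints "BUILD_CCTAG=OFF" for the Rocky 9 entry
```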
@@ -53,7 +57,7 @@ jobs:
           -DALICEVISION_BUILD_SWIG_BINDING=ON \
           -DALICEVISION_USE_OPENCV=ON \
           -DALICEVISION_USE_CUDA=ON \
-          -DALICEVISION_USE_CCTAG=ON \
+          -DALICEVISION_USE_CCTAG="${BUILD_CCTAG}" \
           -DALICEVISION_USE_POPSIFT=ON \
           -DALICEVISION_USE_ALEMBIC=ON \
           -DOpenCV_DIR="${DEPS_INSTALL_DIR}/share/OpenCV" \
16 changes: 8 additions & 8 deletions INSTALL.md
@@ -384,33 +384,33 @@ Check the sample in [samples](src/samples/aliceVisionAs3rdParty) for an example

 ### Docker image
 
-A docker image can be built using the CentOS or Ubuntu Dockerfiles.
+A docker image can be built using the Ubuntu or Rocky Linux Dockerfiles.
 The Dockerfiles are based on `nvidia/cuda` images (https://hub.docker.com/r/nvidia/cuda/)
 
 To generate the docker image, just run:
 ```
-./docker/build-centos.sh
+./docker/build-rocky.sh
 ```
 
-To do it manually, parameters `OS_TAG` and `CUDA_TAG` should be passed to choose the OS and CUDA version.
-For example, the first line of below's commands shows the example to create docker for a CentOS 7 with Cuda 11.3.1 and second line for Ubuntu 16.04 with Cuda 11.0:
+To do it manually, the parameters `ROCKY_VERSION`/`UBUNTU_VERSION` and `CUDA_TAG` should be passed to choose the OS and CUDA versions.
+For example, the first command below builds an image for Rocky Linux 9 with CUDA 12.1.0, and the second one for Ubuntu 22.04 with CUDA 12.1.0:
 
 ```
-docker build --build-arg OS_TAG=7 --build-arg CUDA_TAG=11.3.1 --tag alicevision:centos7-cuda11.3.1 .
-docker build --build-arg OS_TAG=16.04 --build-arg CUDA_TAG=11.0 --build-arg NPROC=8 --tag alicevision:ubuntu16.04-cuda11.0 -f Dockerfile_ubuntu .
+docker build --build-arg ROCKY_VERSION=9 --build-arg CUDA_TAG=12.1.0 --tag alicevision:rocky9-cuda12.1.0 -f Dockerfile_rocky .
+docker build --build-arg UBUNTU_VERSION=22.04 --build-arg CUDA_TAG=12.1.0 --build-arg NPROC=8 --tag alicevision:ubuntu22.04-cuda12.1.0 -f Dockerfile_ubuntu .
 ```
 
 In order to run the image [nvidia docker](https://github.com/nvidia/nvidia-docker/wiki/Installation-(version-2.0)) is needed.
 
 ```
-docker run -it --runtime=nvidia alicevision:centos7-cuda9.2
+docker run -it --runtime=nvidia alicevision:rocky9-cuda12.1.0
 ```
 
 To retrieve the generated files:
 
 ```
 # Create an instance of the image, copy the files and remove the temporary docker instance.
-CID=$(docker create alicevision:centos7-cuda11.3.1) && docker cp ${CID}:/opt/AliceVision_install . && docker cp ${CID}:/opt/AliceVision_bundle . && docker rm ${CID}
+CID=$(docker create alicevision:rocky9-cuda12.1.0) && docker cp ${CID}:/opt/AliceVision_install . && docker cp ${CID}:/opt/AliceVision_bundle . && docker rm ${CID}
 ```
 
 Environment variable
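Side note on the `docker run` line above: on Docker 19.03 and newer with the NVIDIA Container Toolkit installed, the `--gpus` flag is the usual replacement for the legacy `--runtime=nvidia` switch; a sketch:

```
# Equivalent run using the newer --gpus flag (requires nvidia-container-toolkit).
docker run -it --gpus all alicevision:rocky9-cuda12.1.0
```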
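A slightly more defensive variant of the retrieval one-liner above removes the temporary container even if one of the copy steps fails. This is only a sketch, reusing the image tag from the commands above:

```
#!/bin/sh
# Copy the install and bundle trees out of the image; always remove the
# temporary container on exit, even if a docker cp step fails.
set -e
IMAGE=alicevision:rocky9-cuda12.1.0
CID=$(docker create "${IMAGE}")
trap 'docker rm "${CID}" > /dev/null' EXIT
docker cp "${CID}:/opt/AliceVision_install" .
docker cp "${CID}:/opt/AliceVision_bundle" .
```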
49 changes: 0 additions & 49 deletions docker/Dockerfile_centos

This file was deleted.

149 changes: 0 additions & 149 deletions docker/Dockerfile_centos_deps

This file was deleted.

71 changes: 71 additions & 0 deletions docker/Dockerfile_rocky
@@ -0,0 +1,71 @@
ARG AV_DEPS_VERSION
ARG AV_VERSION
ARG CUDA_VERSION
ARG ROCKY_VERSION
FROM alicevision/alicevision-deps:${AV_DEPS_VERSION}-rocky${ROCKY_VERSION}-cuda${CUDA_VERSION}
LABEL maintainer="AliceVision Team alicevision-team@googlegroups.com"
ARG TARGET_ARCHITECTURE=core

# use CUDA_VERSION to select the image version to use
# see https://hub.docker.com/r/nvidia/cuda/
#
# AV_DEPS_VERSION=2024.12.09
# AV_VERSION=2.2.8
# CUDA_VERSION=12.1.0
# ROCKY_VERSION=9
# docker build \
#    --build-arg AV_DEPS_VERSION=${AV_DEPS_VERSION} \
#    --build-arg CUDA_VERSION=${CUDA_VERSION} \
#    --build-arg ROCKY_VERSION=${ROCKY_VERSION} \
#    --build-arg AV_VERSION=${AV_VERSION} \
#    --tag alicevision/alicevision:${AV_VERSION}-rocky${ROCKY_VERSION}-cuda${CUDA_VERSION} \
#    -f Dockerfile_rocky .
#
# then execute with nvidia docker (https://github.com/nvidia/nvidia-docker/wiki/Installation-(version-2.0))
# docker run -it --runtime=nvidia alicevision/alicevision:${AV_VERSION}-rocky${ROCKY_VERSION}-cuda${CUDA_VERSION}


# OS/Version (FILE): cat /etc/issue.net
# Cuda version (ENV): $CUDA_VERSION

ENV AV_DEV=/opt/AliceVision_git \
    AV_BUILD=/tmp/AliceVision_build \
    AV_INSTALL=/opt/AliceVision_install \
    AV_BUNDLE=/opt/AliceVision_bundle \
    PATH="${PATH}:${AV_BUNDLE}" \
    VERBOSE=1

COPY CMakeLists.txt *.md ${AV_DEV}/
COPY src ${AV_DEV}/src

WORKDIR "${AV_BUILD}"

COPY docker ${AV_DEV}/docker

RUN cmake -DCMAKE_BUILD_TYPE=Release \
    -DBUILD_SHARED_LIBS:BOOL=ON \
    -DTARGET_ARCHITECTURE=${TARGET_ARCHITECTURE} \
    -DALICEVISION_BUILD_DEPENDENCIES:BOOL=OFF \
    -DCMAKE_PREFIX_PATH:PATH="${AV_INSTALL}" \
    -DCMAKE_INSTALL_PREFIX:PATH="${AV_INSTALL}" \
    -DALICEVISION_BUNDLE_PREFIX="${AV_BUNDLE}" \
    -DALICEVISION_USE_ALEMBIC:BOOL=ON \
    -DMINIGLOG:BOOL=ON \
    -DALICEVISION_USE_CCTAG:BOOL=OFF \
    -DALICEVISION_USE_OPENCV:BOOL=ON \
    -DALICEVISION_USE_OPENGV:BOOL=ON \
    -DALICEVISION_USE_POPSIFT:BOOL=ON \
    -DALICEVISION_USE_CUDA:BOOL=ON \
    -DALICEVISION_USE_ONNX_GPU:BOOL=OFF \
    -DALICEVISION_BUILD_DOC:BOOL=OFF \
    -DALICEVISION_BUILD_SWIG_BINDING:BOOL=ON \
    -DSWIG_DIR:PATH="${AV_INSTALL}/share/swig/4.3.0" \
    -DSWIG_EXECUTABLE:PATH="${AV_INSTALL}/bin-deps/swig" \
    "${AV_DEV}"

# Compute the core count in the same step that uses it: variables exported in
# an earlier RUN do not persist across Dockerfile steps.
RUN export CPU_CORES=`${AV_DEV}/docker/check-cpu.sh` && \
    make install -j${CPU_CORES}

RUN make bundle

RUN rm -rf "${AV_BUILD}" "${AV_DEV}" && \
    echo "export ALICEVISION_SENSOR_DB=${AV_BUNDLE}/share/aliceVision/cameraSensors.db" >> /etc/profile.d/alicevision.sh && \
    echo "export ALICEVISION_ROOT=${AV_BUNDLE}" >> /etc/profile.d/alicevision.sh
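For completeness, here is the header comment's build command with concrete values filled in; a sketch only, assuming the repository root as build context and the 2024.12.09 deps tag used by the CI matrix above:

```
# Hypothetical build of the Rocky 9 image from the repository root.
AV_DEPS_VERSION=2024.12.09   # alicevision-deps tag (assumed from the CI matrix)
AV_VERSION=2.2.8
CUDA_VERSION=12.1.0
ROCKY_VERSION=9
docker build \
    --build-arg AV_DEPS_VERSION=${AV_DEPS_VERSION} \
    --build-arg CUDA_VERSION=${CUDA_VERSION} \
    --build-arg ROCKY_VERSION=${ROCKY_VERSION} \
    --build-arg AV_VERSION=${AV_VERSION} \
    --tag alicevision/alicevision:${AV_VERSION}-rocky${ROCKY_VERSION}-cuda${CUDA_VERSION} \
    -f docker/Dockerfile_rocky .
```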