Deploying a MedNIST Classifier App with MONAI Deploy App SDK (Prebuilt Model)

This tutorial demonstrates packaging a trained model with the MONAI Deploy App SDK into a deployable inference application, which can be run as a local program or as a MONAI Application Package (MAP) for containerized workflow execution.

Clone the GitHub project (the latest version of the main branch only)

!rm -rf source \
 && git clone --branch main --depth 1 https://github.com/Project-MONAI/monai-deploy-app-sdk.git source \
 && rm -rf source/.git
Cloning into 'source'...
remote: Enumerating objects: 277, done.
remote: Counting objects: 100% (277/277), done.
remote: Compressing objects: 100% (222/222), done.
remote: Total 277 (delta 55), reused 159 (delta 33), pack-reused 0
Receiving objects: 100% (277/277), 1.44 MiB | 10.45 MiB/s, done.
Resolving deltas: 100% (55/55), done.
!ls source/examples/apps/mednist_classifier_monaideploy/
app.yaml  mednist_classifier_monaideploy.py  requirements.txt

Install the monai-deploy-app-sdk package

!pip install monai-deploy-app-sdk
Requirement already satisfied: monai-deploy-app-sdk in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (0.5.1+18.gea0c032.dirty)
Requirement already satisfied: numpy>=1.21.6 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai-deploy-app-sdk) (1.26.4)
Requirement already satisfied: holoscan~=2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai-deploy-app-sdk) (2.0.0)
Requirement already satisfied: colorama>=0.4.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai-deploy-app-sdk) (0.4.6)
Requirement already satisfied: typeguard>=3.0.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai-deploy-app-sdk) (4.2.1)
Requirement already satisfied: pip>=20.3 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (24.0)
Requirement already satisfied: cupy-cuda12x==12.2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (12.2.0)
Requirement already satisfied: cloudpickle==2.2.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (2.2.1)
Requirement already satisfied: python-on-whales==0.60.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (0.60.1)
Requirement already satisfied: Jinja2==3.1.3 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (3.1.3)
Requirement already satisfied: packaging==23.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (23.1)
Requirement already satisfied: pyyaml==6.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (6.0)
Requirement already satisfied: requests==2.31.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (2.31.0)
Requirement already satisfied: psutil==5.9.6 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (5.9.6)
Requirement already satisfied: wheel-axle-runtime<1.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk) (0.0.5)
Requirement already satisfied: fastrlock>=0.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from cupy-cuda12x==12.2->holoscan~=2.0->monai-deploy-app-sdk) (0.8.2)
Requirement already satisfied: MarkupSafe>=2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from Jinja2==3.1.3->holoscan~=2.0->monai-deploy-app-sdk) (2.1.5)
Requirement already satisfied: pydantic<2,>=1.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (1.10.15)
Requirement already satisfied: tqdm in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (4.66.2)
Requirement already satisfied: typer>=0.4.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (0.12.3)
Requirement already satisfied: typing-extensions in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (4.11.0)
Requirement already satisfied: charset-normalizer<4,>=2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests==2.31.0->holoscan~=2.0->monai-deploy-app-sdk) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests==2.31.0->holoscan~=2.0->monai-deploy-app-sdk) (3.7)
Requirement already satisfied: urllib3<3,>=1.21.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests==2.31.0->holoscan~=2.0->monai-deploy-app-sdk) (2.2.1)
Requirement already satisfied: certifi>=2017.4.17 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests==2.31.0->holoscan~=2.0->monai-deploy-app-sdk) (2024.2.2)
Requirement already satisfied: filelock in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from wheel-axle-runtime<1.0->holoscan~=2.0->monai-deploy-app-sdk) (3.13.4)
Requirement already satisfied: click>=8.0.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (8.1.7)
Requirement already satisfied: shellingham>=1.3.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (1.5.4)
Requirement already satisfied: rich>=10.11.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (13.7.1)
Requirement already satisfied: markdown-it-py>=2.2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from rich>=10.11.0->typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (3.0.0)
Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from rich>=10.11.0->typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (2.17.2)
Requirement already satisfied: mdurl~=0.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from markdown-it-py>=2.2.0->rich>=10.11.0->typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk) (0.1.2)

Install necessary packages for the app

!pip install monai Pillow # for MONAI transforms and Pillow
!python -c "import pydicom" || pip install -q "pydicom>=1.4.2"
!python -c "import highdicom" || pip install -q "highdicom>=0.18.2" # for the use of DICOM Writer operators
Requirement already satisfied: monai in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (1.3.0)
Requirement already satisfied: Pillow in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (10.3.0)
Requirement already satisfied: numpy>=1.20 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai) (1.26.4)
Requirement already satisfied: torch>=1.9 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai) (2.0.1)
Requirement already satisfied: filelock in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (3.13.4)
Requirement already satisfied: typing-extensions in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (4.11.0)
Requirement already satisfied: sympy in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (1.12)
Requirement already satisfied: networkx in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (3.1)
Requirement already satisfied: jinja2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (3.1.3)
Requirement already satisfied: nvidia-cuda-nvrtc-cu11==11.7.99 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (11.7.99)
Requirement already satisfied: nvidia-cuda-runtime-cu11==11.7.99 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (11.7.99)
Requirement already satisfied: nvidia-cuda-cupti-cu11==11.7.101 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (11.7.101)
Requirement already satisfied: nvidia-cudnn-cu11==8.5.0.96 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (8.5.0.96)
Requirement already satisfied: nvidia-cublas-cu11==11.10.3.66 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (11.10.3.66)
Requirement already satisfied: nvidia-cufft-cu11==10.9.0.58 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (10.9.0.58)
Requirement already satisfied: nvidia-curand-cu11==10.2.10.91 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (10.2.10.91)
Requirement already satisfied: nvidia-cusolver-cu11==11.4.0.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (11.4.0.1)
Requirement already satisfied: nvidia-cusparse-cu11==11.7.4.91 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (11.7.4.91)
Requirement already satisfied: nvidia-nccl-cu11==2.14.3 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (2.14.3)
Requirement already satisfied: nvidia-nvtx-cu11==11.7.91 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (11.7.91)
Requirement already satisfied: triton==2.0.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (2.0.0)
Requirement already satisfied: setuptools in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.9->monai) (69.5.1)
Requirement already satisfied: wheel in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.9->monai) (0.43.0)
Requirement already satisfied: cmake in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from triton==2.0.0->torch>=1.9->monai) (3.29.2)
Requirement already satisfied: lit in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from triton==2.0.0->torch>=1.9->monai) (18.1.3)
Requirement already satisfied: MarkupSafe>=2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from jinja2->torch>=1.9->monai) (2.1.5)
Requirement already satisfied: mpmath>=0.19 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from sympy->torch>=1.9->monai) (1.3.0)

Download/Extract mednist_classifier_data.zip from Google Drive

# Download mednist_classifier_data.zip
!pip install gdown 
!gdown "https://drive.google.com/uc?id=1yJ4P-xMNEfN6lIOq_u6x1eMAq1_MJu-E"
Requirement already satisfied: gdown in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (5.1.0)
Requirement already satisfied: beautifulsoup4 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from gdown) (4.12.3)
Requirement already satisfied: filelock in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from gdown) (3.13.4)
Requirement already satisfied: requests[socks] in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from gdown) (2.31.0)
Requirement already satisfied: tqdm in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from gdown) (4.66.2)
Requirement already satisfied: soupsieve>1.2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from beautifulsoup4->gdown) (2.5)
Requirement already satisfied: charset-normalizer<4,>=2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests[socks]->gdown) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests[socks]->gdown) (3.7)
Requirement already satisfied: urllib3<3,>=1.21.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests[socks]->gdown) (2.2.1)
Requirement already satisfied: certifi>=2017.4.17 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests[socks]->gdown) (2024.2.2)
Requirement already satisfied: PySocks!=1.5.7,>=1.5.6 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests[socks]->gdown) (1.7.1)
Downloading...
From (original): https://drive.google.com/uc?id=1yJ4P-xMNEfN6lIOq_u6x1eMAq1_MJu-E
From (redirected): https://drive.google.com/uc?id=1yJ4P-xMNEfN6lIOq_u6x1eMAq1_MJu-E&confirm=t&uuid=72f2b083-c6ce-44ba-aafd-19c9bd097d63
To: /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/mednist_classifier_data.zip
100%|██████████████████████████████████████| 28.6M/28.6M [00:00<00:00, 34.3MB/s]
# Unzip mednist_classifier_data.zip (downloaded above with gdown, or manually via a web browser) and set up folders
input_folder = "input"
output_folder = "output"
models_folder = "models"
!rm -rf {input_folder}
!unzip -o "mednist_classifier_data.zip"

# The model file needs to be copied into its own clean subfolder for packaging, to work around an issue in the Packager
!rm -rf {models_folder} && mkdir -p {models_folder}/model && cp classifier.zip {models_folder}/model && ls {models_folder}/model
Archive:  mednist_classifier_data.zip
 extracting: classifier.zip          
 extracting: input/AbdomenCT_007000.jpeg  
classifier.zip
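The shell commands above can also be done in Python, which may be clearer when the tutorial is adapted into a script. The sketch below is a minimal, illustrative equivalent (the helper name `stage_model` is our own, not part of the SDK):

```python
# Minimal Python equivalent of the folder-prep shell commands above:
# place the model archive in its own clean subfolder for packaging.
import shutil
from pathlib import Path

def stage_model(model_archive: str, models_folder: str = "models") -> Path:
    """Copy the model archive into <models_folder>/model, recreating the folder."""
    root = Path(models_folder)
    if root.exists():
        shutil.rmtree(root)          # start from a clean folder
    dest = root / "model"
    dest.mkdir(parents=True)
    shutil.copy(model_archive, dest)  # e.g. classifier.zip
    return dest / Path(model_archive).name
```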

Set up environment variables

The application uses well-known environment variables for the input/output data paths, the working directory, and the AI model file path, if applicable. Defaults are used if these environment variables are absent.

Set the environment variables corresponding to the extracted data path.

%env HOLOSCAN_INPUT_PATH {input_folder}
%env HOLOSCAN_OUTPUT_PATH {output_folder}
%env HOLOSCAN_MODEL_PATH {models_folder}
env: HOLOSCAN_INPUT_PATH=input
env: HOLOSCAN_OUTPUT_PATH=output
env: HOLOSCAN_MODEL_PATH=models
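To illustrate the "defaults are used if absent" behavior, here is a hedged sketch of how an application might resolve these paths; the helper `resolve_data_paths` and its default values are hypothetical, not SDK API:

```python
# Sketch: resolve data paths from the well-known Holoscan environment
# variables, falling back to defaults when a variable is unset.
import os
from pathlib import Path

def resolve_data_paths() -> dict:
    return {
        "input": Path(os.environ.get("HOLOSCAN_INPUT_PATH", "input")),
        "output": Path(os.environ.get("HOLOSCAN_OUTPUT_PATH", "output")),
        "model": Path(os.environ.get("HOLOSCAN_MODEL_PATH", "models")),
    }
```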

Package app (creating MAP container image)

Now we can use the CLI package command to build the MONAI Application Package (MAP) container image based on a supported base image.

Use the -l DEBUG option to see progress.

Note

This assumes that the NVIDIA Container Toolkit (or nvidia-docker) is installed on the local machine.

tag_prefix = "mednist_app"

!monai-deploy package "source/examples/apps/mednist_classifier_monaideploy/mednist_classifier_monaideploy.py" -m {models_folder} -c "source/examples/apps/mednist_classifier_monaideploy/app.yaml" -t {tag_prefix}:1.0 --platform x64-workstation -l DEBUG
[2024-04-23 15:33:53,163] [INFO] (common) - Downloading CLI manifest file...
[2024-04-23 15:33:53,444] [DEBUG] (common) - Validating CLI manifest file...
[2024-04-23 15:33:53,446] [INFO] (packager.parameters) - Application: /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/source/examples/apps/mednist_classifier_monaideploy/mednist_classifier_monaideploy.py
[2024-04-23 15:33:53,446] [INFO] (packager.parameters) - Detected application type: Python File
[2024-04-23 15:33:53,447] [INFO] (packager) - Scanning for models in /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models...
[2024-04-23 15:33:53,447] [DEBUG] (packager) - Model model=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models/model added.
[2024-04-23 15:33:53,447] [INFO] (packager) - Reading application configuration from /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/source/examples/apps/mednist_classifier_monaideploy/app.yaml...
[2024-04-23 15:33:53,453] [INFO] (packager) - Generating app.json...
[2024-04-23 15:33:53,453] [INFO] (packager) - Generating pkg.json...
[2024-04-23 15:33:53,464] [DEBUG] (common) - 
=============== Begin app.json ===============
{
    "apiVersion": "1.0.0",
    "command": "[\"python3\", \"/opt/holoscan/app/mednist_classifier_monaideploy.py\"]",
    "environment": {
        "HOLOSCAN_APPLICATION": "/opt/holoscan/app",
        "HOLOSCAN_INPUT_PATH": "input/",
        "HOLOSCAN_OUTPUT_PATH": "output/",
        "HOLOSCAN_WORKDIR": "/var/holoscan",
        "HOLOSCAN_MODEL_PATH": "/opt/holoscan/models",
        "HOLOSCAN_CONFIG_PATH": "/var/holoscan/app.yaml",
        "HOLOSCAN_APP_MANIFEST_PATH": "/etc/holoscan/app.json",
        "HOLOSCAN_PKG_MANIFEST_PATH": "/etc/holoscan/pkg.json",
        "HOLOSCAN_DOCS_PATH": "/opt/holoscan/docs",
        "HOLOSCAN_LOGS_PATH": "/var/holoscan/logs"
    },
    "input": {
        "path": "input/",
        "formats": null
    },
    "liveness": null,
    "output": {
        "path": "output/",
        "formats": null
    },
    "readiness": null,
    "sdk": "monai-deploy",
    "sdkVersion": "0.5.1",
    "timeout": 0,
    "version": 1.0,
    "workingDirectory": "/var/holoscan"
}
================ End app.json ================
                 
[2024-04-23 15:33:53,465] [DEBUG] (common) - 
=============== Begin pkg.json ===============
{
    "apiVersion": "1.0.0",
    "applicationRoot": "/opt/holoscan/app",
    "modelRoot": "/opt/holoscan/models",
    "models": {
        "model": "/opt/holoscan/models/model"
    },
    "resources": {
        "cpu": 1,
        "gpu": 1,
        "memory": "1Gi",
        "gpuMemory": "1Gi"
    },
    "version": 1.0,
    "platformConfig": "dgpu"
}
================ End pkg.json ================
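The generated pkg.json is plain JSON, so its fields (model root, resource requests, platform configuration) can be inspected programmatically. The snippet below is an illustrative helper, not part of the SDK, shown against a trimmed copy of the manifest above:

```python
# Illustrative: summarize the key fields of a pkg.json-style manifest.
import json

def summarize_pkg(manifest_text: str) -> str:
    pkg = json.loads(manifest_text)
    res = pkg.get("resources", {})
    return (f"platform={pkg.get('platformConfig')} "
            f"cpu={res.get('cpu')} gpu={res.get('gpu')} "
            f"memory={res.get('memory')}")

example = """{
  "apiVersion": "1.0.0",
  "modelRoot": "/opt/holoscan/models",
  "resources": {"cpu": 1, "gpu": 1, "memory": "1Gi", "gpuMemory": "1Gi"},
  "platformConfig": "dgpu"
}"""
print(summarize_pkg(example))  # platform=dgpu cpu=1 gpu=1 memory=1Gi
```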
                 
[2024-04-23 15:33:53,510] [DEBUG] (packager.builder) - 
========== Begin Dockerfile ==========


FROM nvcr.io/nvidia/clara-holoscan/holoscan:v2.0.0-dgpu

ENV DEBIAN_FRONTEND=noninteractive
ENV TERM=xterm-256color

ARG UNAME
ARG UID
ARG GID

RUN mkdir -p /etc/holoscan/ \
        && mkdir -p /opt/holoscan/ \
        && mkdir -p /var/holoscan \
        && mkdir -p /opt/holoscan/app \
        && mkdir -p /var/holoscan/input \
        && mkdir -p /var/holoscan/output

LABEL base="nvcr.io/nvidia/clara-holoscan/holoscan:v2.0.0-dgpu"
LABEL tag="mednist_app:1.0"
LABEL org.opencontainers.image.title="MONAI Deploy App Package - MedNIST Classifier App"
LABEL org.opencontainers.image.version="1.0"
LABEL org.nvidia.holoscan="2.0.0"
LABEL org.monai.deploy.app-sdk="0.5.1"


ENV HOLOSCAN_ENABLE_HEALTH_CHECK=true
ENV HOLOSCAN_INPUT_PATH=/var/holoscan/input
ENV HOLOSCAN_OUTPUT_PATH=/var/holoscan/output
ENV HOLOSCAN_WORKDIR=/var/holoscan
ENV HOLOSCAN_APPLICATION=/opt/holoscan/app
ENV HOLOSCAN_TIMEOUT=0
ENV HOLOSCAN_MODEL_PATH=/opt/holoscan/models
ENV HOLOSCAN_DOCS_PATH=/opt/holoscan/docs
ENV HOLOSCAN_CONFIG_PATH=/var/holoscan/app.yaml
ENV HOLOSCAN_APP_MANIFEST_PATH=/etc/holoscan/app.json
ENV HOLOSCAN_PKG_MANIFEST_PATH=/etc/holoscan/pkg.json
ENV HOLOSCAN_LOGS_PATH=/var/holoscan/logs
ENV PATH=/root/.local/bin:/opt/nvidia/holoscan:$PATH
ENV LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/libtorch/1.13.1/lib/:/opt/nvidia/holoscan/lib

RUN apt-get update \
    && apt-get install -y curl jq \
    && rm -rf /var/lib/apt/lists/*

ENV PYTHONPATH="/opt/holoscan/app:$PYTHONPATH"


RUN groupadd -f -g $GID $UNAME
RUN useradd -rm -d /home/$UNAME -s /bin/bash -g $GID -G sudo -u $UID $UNAME
RUN chown -R holoscan /var/holoscan 
RUN chown -R holoscan /var/holoscan/input 
RUN chown -R holoscan /var/holoscan/output 

# Set the working directory
WORKDIR /var/holoscan

# Copy HAP/MAP tool script
COPY ./tools /var/holoscan/tools
RUN chmod +x /var/holoscan/tools


# Copy gRPC health probe

USER $UNAME

ENV PATH=/root/.local/bin:/home/holoscan/.local/bin:/opt/nvidia/holoscan:$PATH

COPY ./pip/requirements.txt /tmp/requirements.txt

RUN pip install --upgrade pip
RUN pip install --no-cache-dir --user -r /tmp/requirements.txt

 
# MONAI Deploy

# Copy user-specified MONAI Deploy SDK file
COPY ./monai_deploy_app_sdk-0.5.1+20.gb869749.dirty-py3-none-any.whl /tmp/monai_deploy_app_sdk-0.5.1+20.gb869749.dirty-py3-none-any.whl
RUN pip install /tmp/monai_deploy_app_sdk-0.5.1+20.gb869749.dirty-py3-none-any.whl


COPY ./models  /opt/holoscan/models

COPY ./map/app.json /etc/holoscan/app.json
COPY ./app.config /var/holoscan/app.yaml
COPY ./map/pkg.json /etc/holoscan/pkg.json

COPY ./app /opt/holoscan/app

ENTRYPOINT ["/var/holoscan/tools"]
=========== End Dockerfile ===========

[2024-04-23 15:33:53,510] [INFO] (packager.builder) - 
===============================================================================
Building image for:                 x64-workstation
    Architecture:                   linux/amd64
    Base Image:                     nvcr.io/nvidia/clara-holoscan/holoscan:v2.0.0-dgpu
    Build Image:                    N/A
    Cache:                          Enabled
    Configuration:                  dgpu
    Holoscan SDK Package:           pypi.org
    MONAI Deploy App SDK Package:   /home/mqin/src/monai-deploy-app-sdk/dist/monai_deploy_app_sdk-0.5.1+20.gb869749.dirty-py3-none-any.whl
    gRPC Health Probe:              N/A
    SDK Version:                    2.0.0
    SDK:                            monai-deploy
    Tag:                            mednist_app-x64-workstation-dgpu-linux-amd64:1.0
    
[2024-04-23 15:33:53,781] [INFO] (common) - Using existing Docker BuildKit builder `holoscan_app_builder`
[2024-04-23 15:33:53,782] [DEBUG] (packager.builder) - Building Holoscan Application Package: tag=mednist_app-x64-workstation-dgpu-linux-amd64:1.0
#0 building with "holoscan_app_builder" instance using docker-container driver

#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 2.67kB done
#1 DONE 0.0s

#2 [internal] load metadata for nvcr.io/nvidia/clara-holoscan/holoscan:v2.0.0-dgpu
#2 DONE 0.1s

#3 [internal] load .dockerignore
#3 transferring context: 1.79kB done
#3 DONE 0.0s

#4 [internal] load build context
#4 DONE 0.0s

#5 importing cache manifest from local:3840576277762201667
#5 inferred cache manifest type: application/vnd.oci.image.index.v1+json done
#5 DONE 0.0s

#6 [ 1/21] FROM nvcr.io/nvidia/clara-holoscan/holoscan:v2.0.0-dgpu@sha256:20adbccd2c7b12dfb1798f6953f071631c3b85cd337858a7506f8e420add6d4a
#6 resolve nvcr.io/nvidia/clara-holoscan/holoscan:v2.0.0-dgpu@sha256:20adbccd2c7b12dfb1798f6953f071631c3b85cd337858a7506f8e420add6d4a 0.0s done
#6 DONE 0.0s

#7 importing cache manifest from nvcr.io/nvidia/clara-holoscan/holoscan:v2.0.0-dgpu
#7 inferred cache manifest type: application/vnd.docker.distribution.manifest.list.v2+json done
#7 DONE 0.4s

#4 [internal] load build context
#4 transferring context: 28.73MB 0.2s done
#4 DONE 0.2s

#8 [10/21] COPY ./tools /var/holoscan/tools
#8 CACHED

#9 [11/21] RUN chmod +x /var/holoscan/tools
#9 CACHED

#10 [ 5/21] RUN useradd -rm -d /home/holoscan -s /bin/bash -g 1000 -G sudo -u 1000 holoscan
#10 CACHED

#11 [ 4/21] RUN groupadd -f -g 1000 holoscan
#11 CACHED

#12 [ 6/21] RUN chown -R holoscan /var/holoscan
#12 CACHED

#13 [ 2/21] RUN mkdir -p /etc/holoscan/         && mkdir -p /opt/holoscan/         && mkdir -p /var/holoscan         && mkdir -p /opt/holoscan/app         && mkdir -p /var/holoscan/input         && mkdir -p /var/holoscan/output
#13 CACHED

#14 [ 9/21] WORKDIR /var/holoscan
#14 CACHED

#15 [ 3/21] RUN apt-get update     && apt-get install -y curl jq     && rm -rf /var/lib/apt/lists/*
#15 CACHED

#16 [ 8/21] RUN chown -R holoscan /var/holoscan/output
#16 CACHED

#17 [ 7/21] RUN chown -R holoscan /var/holoscan/input
#17 CACHED

#18 [12/21] COPY ./pip/requirements.txt /tmp/requirements.txt
#18 CACHED

#19 [13/21] RUN pip install --upgrade pip
#19 CACHED

#20 [14/21] RUN pip install --no-cache-dir --user -r /tmp/requirements.txt
#20 0.770 Collecting monai>=1.2.0 (from -r /tmp/requirements.txt (line 1))
#20 0.845   Downloading monai-1.3.0-202310121228-py3-none-any.whl.metadata (10 kB)
#20 1.064 Collecting Pillow>=8.4.0 (from -r /tmp/requirements.txt (line 2))
#20 1.068   Downloading pillow-10.3.0-cp310-cp310-manylinux_2_28_x86_64.whl.metadata (9.2 kB)
#20 1.168 Collecting pydicom>=2.3.0 (from -r /tmp/requirements.txt (line 3))
#20 1.179   Downloading pydicom-2.4.4-py3-none-any.whl.metadata (7.8 kB)
#20 1.292 Collecting highdicom>=0.18.2 (from -r /tmp/requirements.txt (line 4))
#20 1.299   Downloading highdicom-0.22.0-py3-none-any.whl.metadata (3.8 kB)
#20 1.417 Collecting SimpleITK>=2.0.0 (from -r /tmp/requirements.txt (line 5))
#20 1.422   Downloading SimpleITK-2.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (7.9 kB)
#20 1.424 Requirement already satisfied: setuptools>=59.5.0 in /usr/lib/python3/dist-packages (from -r /tmp/requirements.txt (line 6)) (59.6.0)
#20 1.492 Requirement already satisfied: numpy>=1.20 in /usr/local/lib/python3.10/dist-packages (from monai>=1.2.0->-r /tmp/requirements.txt (line 1)) (1.23.5)
#20 1.536 Collecting torch>=1.9 (from monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 1.541   Downloading torch-2.2.2-cp310-cp310-manylinux1_x86_64.whl.metadata (26 kB)
#20 1.728 Collecting pillow-jpls>=1.0 (from highdicom>=0.18.2->-r /tmp/requirements.txt (line 4))
#20 1.808   Downloading pillow_jpls-1.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.1 kB)
#20 1.882 Collecting filelock (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 1.886   Downloading filelock-3.13.4-py3-none-any.whl.metadata (2.8 kB)
#20 1.912 Collecting typing-extensions>=4.8.0 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 1.915   Downloading typing_extensions-4.11.0-py3-none-any.whl.metadata (3.0 kB)
#20 1.942 Collecting sympy (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 1.946   Downloading sympy-1.12-py3-none-any.whl.metadata (12 kB)
#20 1.976 Collecting networkx (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 1.980   Downloading networkx-3.3-py3-none-any.whl.metadata (5.1 kB)
#20 1.982 Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/dist-packages (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1)) (3.1.3)
#20 2.021 Collecting fsspec (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.025   Downloading fsspec-2024.3.1-py3-none-any.whl.metadata (6.8 kB)
#20 2.045 Collecting nvidia-cuda-nvrtc-cu12==12.1.105 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.050   Downloading nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB)
#20 2.073 Collecting nvidia-cuda-runtime-cu12==12.1.105 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.077   Downloading nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB)
#20 2.095 Collecting nvidia-cuda-cupti-cu12==12.1.105 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.099   Downloading nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata (1.6 kB)
#20 2.117 Collecting nvidia-cudnn-cu12==8.9.2.26 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.120   Downloading nvidia_cudnn_cu12-8.9.2.26-py3-none-manylinux1_x86_64.whl.metadata (1.6 kB)
#20 2.136 Collecting nvidia-cublas-cu12==12.1.3.1 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.139   Downloading nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB)
#20 2.155 Collecting nvidia-cufft-cu12==11.0.2.54 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.158   Downloading nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB)
#20 2.172 Collecting nvidia-curand-cu12==10.3.2.106 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.175   Downloading nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl.metadata (1.5 kB)
#20 2.193 Collecting nvidia-cusolver-cu12==11.4.5.107 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.197   Downloading nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl.metadata (1.6 kB)
#20 2.214 Collecting nvidia-cusparse-cu12==12.1.0.106 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.218   Downloading nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl.metadata (1.6 kB)
#20 2.233 Collecting nvidia-nccl-cu12==2.19.3 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.238   Downloading nvidia_nccl_cu12-2.19.3-py3-none-manylinux1_x86_64.whl.metadata (1.8 kB)
#20 2.258 Collecting nvidia-nvtx-cu12==12.1.105 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.262   Downloading nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl.metadata (1.7 kB)
#20 2.286 Collecting triton==2.2.0 (from torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.291   Downloading triton-2.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (1.4 kB)
#20 2.327 Collecting nvidia-nvjitlink-cu12 (from nvidia-cusolver-cu12==11.4.5.107->torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.336   Downloading nvidia_nvjitlink_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl.metadata (1.5 kB)
#20 2.404 Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2->torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1)) (2.1.3)
#20 2.442 Collecting mpmath>=0.19 (from sympy->torch>=1.9->monai>=1.2.0->-r /tmp/requirements.txt (line 1))
#20 2.446   Downloading mpmath-1.3.0-py3-none-any.whl.metadata (8.6 kB)
#20 2.469 Downloading monai-1.3.0-202310121228-py3-none-any.whl (1.3 MB)
#20 2.504    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 48.0 MB/s eta 0:00:00
#20 2.510 Downloading pillow-10.3.0-cp310-cp310-manylinux_2_28_x86_64.whl (4.5 MB)
#20 2.559    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.5/4.5 MB 104.0 MB/s eta 0:00:00
#20 2.683 Downloading pydicom-2.4.4-py3-none-any.whl (1.8 MB)
#20 2.704    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.8/1.8 MB 104.8 MB/s eta 0:00:00
#20 2.710 Downloading highdicom-0.22.0-py3-none-any.whl (825 kB)
#20 2.719    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 825.0/825.0 kB 133.9 MB/s eta 0:00:00
#20 2.728 Downloading SimpleITK-2.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (52.7 MB)
#20 3.239    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 52.7/52.7 MB 115.0 MB/s eta 0:00:00
#20 3.245 Downloading pillow_jpls-1.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (305 kB)
#20 3.249    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 305.6/305.6 kB 225.9 MB/s eta 0:00:00
#20 3.254 Downloading torch-2.2.2-cp310-cp310-manylinux1_x86_64.whl (755.5 MB)
#20 10.47    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 755.5/755.5 MB 116.7 MB/s eta 0:00:00
#20 10.48 Downloading nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl (410.6 MB)
#20 14.42    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 410.6/410.6 MB 113.5 MB/s eta 0:00:00
#20 14.43 Downloading nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (14.1 MB)
#20 14.59    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 14.1/14.1 MB 82.1 MB/s eta 0:00:00
#20 14.60 Downloading nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (23.7 MB)
#20 14.90    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 23.7/23.7 MB 56.1 MB/s eta 0:00:00
#20 14.90 Downloading nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (823 kB)
#20 14.91    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 823.6/823.6 kB 161.9 MB/s eta 0:00:00
#20 14.92 Downloading nvidia_cudnn_cu12-8.9.2.26-py3-none-manylinux1_x86_64.whl (731.7 MB)
#20 22.57    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 731.7/731.7 MB 27.6 MB/s eta 0:00:00
#20 22.58 Downloading nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl (121.6 MB)
#20 23.73    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 121.6/121.6 MB 116.1 MB/s eta 0:00:00
#20 23.74 Downloading nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl (56.5 MB)
#20 24.30    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 56.5/56.5 MB 72.6 MB/s eta 0:00:00
#20 24.30 Downloading nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl (124.2 MB)
#20 25.47    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 124.2/124.2 MB 121.6 MB/s eta 0:00:00
#20 25.48 Downloading nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl (196.0 MB)
#20 27.64    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 196.0/196.0 MB 112.9 MB/s eta 0:00:00
#20 27.65 Downloading nvidia_nccl_cu12-2.19.3-py3-none-manylinux1_x86_64.whl (166.0 MB)
#20 30.17    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 166.0/166.0 MB 24.2 MB/s eta 0:00:00
#20 30.17 Downloading nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (99 kB)
#20 30.18    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 99.1/99.1 kB 124.9 MB/s eta 0:00:00
#20 30.18 Downloading triton-2.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (167.9 MB)
#20 31.71    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 167.9/167.9 MB 117.2 MB/s eta 0:00:00
#20 31.72 Downloading typing_extensions-4.11.0-py3-none-any.whl (34 kB)
#20 31.72 Downloading filelock-3.13.4-py3-none-any.whl (11 kB)
#20 31.73 Downloading fsspec-2024.3.1-py3-none-any.whl (171 kB)
#20 31.73    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 172.0/172.0 kB 289.9 MB/s eta 0:00:00
#20 31.73 Downloading networkx-3.3-py3-none-any.whl (1.7 MB)
#20 31.75    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.7/1.7 MB 135.8 MB/s eta 0:00:00
#20 31.75 Downloading sympy-1.12-py3-none-any.whl (5.7 MB)
#20 31.81    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.7/5.7 MB 104.1 MB/s eta 0:00:00
#20 31.81 Downloading mpmath-1.3.0-py3-none-any.whl (536 kB)
#20 31.82    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 536.2/536.2 kB 207.8 MB/s eta 0:00:00
#20 32.03 Downloading nvidia_nvjitlink_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl (21.1 MB)
#20 32.25    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 21.1/21.1 MB 112.2 MB/s eta 0:00:00
#20 39.25 Installing collected packages: SimpleITK, mpmath, typing-extensions, sympy, pydicom, Pillow, nvidia-nvtx-cu12, nvidia-nvjitlink-cu12, nvidia-nccl-cu12, nvidia-curand-cu12, nvidia-cufft-cu12, nvidia-cuda-runtime-cu12, nvidia-cuda-nvrtc-cu12, nvidia-cuda-cupti-cu12, nvidia-cublas-cu12, networkx, fsspec, filelock, triton, pillow-jpls, nvidia-cusparse-cu12, nvidia-cudnn-cu12, nvidia-cusolver-cu12, highdicom, torch, monai
#20 82.08 Successfully installed Pillow-10.3.0 SimpleITK-2.3.1 filelock-3.13.4 fsspec-2024.3.1 highdicom-0.22.0 monai-1.3.0 mpmath-1.3.0 networkx-3.3 nvidia-cublas-cu12-12.1.3.1 nvidia-cuda-cupti-cu12-12.1.105 nvidia-cuda-nvrtc-cu12-12.1.105 nvidia-cuda-runtime-cu12-12.1.105 nvidia-cudnn-cu12-8.9.2.26 nvidia-cufft-cu12-11.0.2.54 nvidia-curand-cu12-10.3.2.106 nvidia-cusolver-cu12-11.4.5.107 nvidia-cusparse-cu12-12.1.0.106 nvidia-nccl-cu12-2.19.3 nvidia-nvjitlink-cu12-12.4.127 nvidia-nvtx-cu12-12.1.105 pillow-jpls-1.3.2 pydicom-2.4.4 sympy-1.12 torch-2.2.2 triton-2.2.0 typing-extensions-4.11.0
#20 DONE 84.1s

#21 [15/21] COPY ./monai_deploy_app_sdk-0.5.1+20.gb869749.dirty-py3-none-any.whl /tmp/monai_deploy_app_sdk-0.5.1+20.gb869749.dirty-py3-none-any.whl
#21 DONE 0.3s

#22 [16/21] RUN pip install /tmp/monai_deploy_app_sdk-0.5.1+20.gb869749.dirty-py3-none-any.whl
#22 0.474 Defaulting to user installation because normal site-packages is not writeable
#22 0.542 Processing /tmp/monai_deploy_app_sdk-0.5.1+20.gb869749.dirty-py3-none-any.whl
#22 0.554 Requirement already satisfied: numpy>=1.21.6 in /usr/local/lib/python3.10/dist-packages (from monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (1.23.5)
#22 0.676 Collecting holoscan~=2.0 (from monai-deploy-app-sdk==0.5.1+20.gb869749.dirty)
#22 0.744   Downloading holoscan-2.0.0-cp310-cp310-manylinux_2_35_x86_64.whl.metadata (6.7 kB)
#22 0.816 Collecting colorama>=0.4.1 (from monai-deploy-app-sdk==0.5.1+20.gb869749.dirty)
#22 0.820   Downloading colorama-0.4.6-py2.py3-none-any.whl.metadata (17 kB)
#22 0.892 Collecting typeguard>=3.0.0 (from monai-deploy-app-sdk==0.5.1+20.gb869749.dirty)
#22 0.896   Downloading typeguard-4.2.1-py3-none-any.whl.metadata (3.7 kB)
#22 0.935 Requirement already satisfied: pip>=20.3 in /home/holoscan/.local/lib/python3.10/site-packages (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (24.0)
#22 0.936 Requirement already satisfied: cupy-cuda12x==12.2 in /usr/local/lib/python3.10/dist-packages (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (12.2.0)
#22 0.937 Requirement already satisfied: cloudpickle==2.2.1 in /usr/local/lib/python3.10/dist-packages (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (2.2.1)
#22 0.937 Requirement already satisfied: python-on-whales==0.60.1 in /usr/local/lib/python3.10/dist-packages (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (0.60.1)
#22 0.938 Requirement already satisfied: Jinja2==3.1.3 in /usr/local/lib/python3.10/dist-packages (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (3.1.3)
#22 0.939 Requirement already satisfied: packaging==23.1 in /usr/local/lib/python3.10/dist-packages (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (23.1)
#22 0.940 Requirement already satisfied: pyyaml==6.0 in /usr/local/lib/python3.10/dist-packages (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (6.0)
#22 0.940 Requirement already satisfied: requests==2.31.0 in /usr/local/lib/python3.10/dist-packages (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (2.31.0)
#22 0.941 Requirement already satisfied: psutil==5.9.6 in /usr/local/lib/python3.10/dist-packages (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (5.9.6)
#22 0.975 Collecting wheel-axle-runtime<1.0 (from holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty)
#22 0.979   Downloading wheel_axle_runtime-0.0.5-py3-none-any.whl.metadata (7.7 kB)
#22 1.016 Requirement already satisfied: fastrlock>=0.5 in /usr/local/lib/python3.10/dist-packages (from cupy-cuda12x==12.2->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (0.8.2)
#22 1.019 Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from Jinja2==3.1.3->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (2.1.3)
#22 1.031 Requirement already satisfied: pydantic<2,>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (1.10.15)
#22 1.032 Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (from python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (4.66.2)
#22 1.032 Requirement already satisfied: typer>=0.4.1 in /usr/local/lib/python3.10/dist-packages (from python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (0.12.3)
#22 1.033 Requirement already satisfied: typing-extensions in /home/holoscan/.local/lib/python3.10/site-packages (from python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (4.11.0)
#22 1.042 Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests==2.31.0->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (3.3.2)
#22 1.043 Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests==2.31.0->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (3.7)
#22 1.043 Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests==2.31.0->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (2.2.1)
#22 1.044 Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests==2.31.0->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (2024.2.2)
#22 1.061 Requirement already satisfied: filelock in /home/holoscan/.local/lib/python3.10/site-packages (from wheel-axle-runtime<1.0->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (3.13.4)
#22 1.081 Requirement already satisfied: click>=8.0.0 in /usr/local/lib/python3.10/dist-packages (from typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (8.1.7)
#22 1.082 Requirement already satisfied: shellingham>=1.3.0 in /usr/local/lib/python3.10/dist-packages (from typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (1.5.4)
#22 1.083 Requirement already satisfied: rich>=10.11.0 in /usr/local/lib/python3.10/dist-packages (from typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (13.7.1)
#22 1.120 Requirement already satisfied: markdown-it-py>=2.2.0 in /usr/local/lib/python3.10/dist-packages (from rich>=10.11.0->typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (3.0.0)
#22 1.121 Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /usr/local/lib/python3.10/dist-packages (from rich>=10.11.0->typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (2.17.2)
#22 1.143 Requirement already satisfied: mdurl~=0.1 in /usr/local/lib/python3.10/dist-packages (from markdown-it-py>=2.2.0->rich>=10.11.0->typer>=0.4.1->python-on-whales==0.60.1->holoscan~=2.0->monai-deploy-app-sdk==0.5.1+20.gb869749.dirty) (0.1.2)
#22 1.157 Downloading colorama-0.4.6-py2.py3-none-any.whl (25 kB)
#22 1.181 Downloading holoscan-2.0.0-cp310-cp310-manylinux_2_35_x86_64.whl (33.2 MB)
#22 1.668    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 33.2/33.2 MB 36.8 MB/s eta 0:00:00
#22 1.673 Downloading typeguard-4.2.1-py3-none-any.whl (34 kB)
#22 1.696 Downloading wheel_axle_runtime-0.0.5-py3-none-any.whl (12 kB)
#22 2.029 Installing collected packages: wheel-axle-runtime, typeguard, colorama, holoscan, monai-deploy-app-sdk
#22 2.773 Successfully installed colorama-0.4.6 holoscan-2.0.0 monai-deploy-app-sdk-0.5.1+20.gb869749.dirty typeguard-4.2.1 wheel-axle-runtime-0.0.5
#22 DONE 3.3s

#23 [17/21] COPY ./models  /opt/holoscan/models
#23 DONE 0.3s

#24 [18/21] COPY ./map/app.json /etc/holoscan/app.json
#24 DONE 0.1s

#25 [19/21] COPY ./app.config /var/holoscan/app.yaml
#25 DONE 0.1s

#26 [20/21] COPY ./map/pkg.json /etc/holoscan/pkg.json
#26 DONE 0.1s

#27 [21/21] COPY ./app /opt/holoscan/app
#27 DONE 0.1s

#28 exporting to docker image format
#28 exporting layers
#28 exporting layers 152.5s done
#28 exporting manifest sha256:c70218a06b33b272ca2399e48c3804a3632c64ad2026e95b55d9ddae0cae7e74 0.0s done
#28 exporting config sha256:553a0ca99a2e26fc278babfe7ad247254fbaeadbc8cbad3e0053f4aede0c3aae 0.0s done
#28 sending tarball
#28 ...

#29 importing to docker
#29 loading layer 072c1594ab9f 557.06kB / 2.90GB
#29 loading layer 072c1594ab9f 108.63MB / 2.90GB 6.2s
#29 loading layer 072c1594ab9f 317.52MB / 2.90GB 10.3s
#29 loading layer 072c1594ab9f 514.72MB / 2.90GB 14.5s
#29 loading layer 072c1594ab9f 706.90MB / 2.90GB 18.6s
#29 loading layer 072c1594ab9f 889.06MB / 2.90GB 22.7s
#29 loading layer 072c1594ab9f 1.09GB / 2.90GB 26.8s
#29 loading layer 072c1594ab9f 1.32GB / 2.90GB 31.0s
#29 loading layer 072c1594ab9f 1.52GB / 2.90GB 35.1s
#29 loading layer 072c1594ab9f 1.76GB / 2.90GB 39.2s
#29 loading layer 072c1594ab9f 1.95GB / 2.90GB 43.3s
#29 loading layer 072c1594ab9f 1.98GB / 2.90GB 50.1s
#29 loading layer 072c1594ab9f 2.13GB / 2.90GB 56.3s
#29 loading layer 072c1594ab9f 2.32GB / 2.90GB 60.5s
#29 loading layer 072c1594ab9f 2.54GB / 2.90GB 64.6s
#29 loading layer 072c1594ab9f 2.73GB / 2.90GB 68.8s
#29 loading layer 072c1594ab9f 2.90GB / 2.90GB 75.0s
#29 loading layer f11aaa8e87ac 32.77kB / 125.83kB
#29 loading layer fa504599989c 557.06kB / 67.36MB
#29 loading layer 671fce7ea0e7 262.14kB / 25.59MB
#29 loading layer 1c0f42dfa575 514B / 514B
#29 loading layer 6fb8bbe7fb20 698B / 698B
#29 loading layer 2903a6d1ea2e 300B / 300B
#29 loading layer 1602070f430e 4.17kB / 4.17kB
#29 loading layer fa504599989c 557.06kB / 67.36MB 3.3s done
#29 loading layer 072c1594ab9f 2.90GB / 2.90GB 78.6s done
#29 loading layer f11aaa8e87ac 32.77kB / 125.83kB 3.4s done
#29 loading layer 671fce7ea0e7 262.14kB / 25.59MB 1.4s done
#29 loading layer 1c0f42dfa575 514B / 514B 1.0s done
#29 loading layer 6fb8bbe7fb20 698B / 698B 0.9s done
#29 loading layer 2903a6d1ea2e 300B / 300B 0.9s done
#29 loading layer 1602070f430e 4.17kB / 4.17kB 0.8s done
#29 DONE 78.6s

#28 exporting to docker image format
#28 sending tarball 119.9s done
#28 DONE 272.5s

#30 exporting cache to client directory
#30 preparing build cache for export
#30 writing layer sha256:014cff740c9ec6e9a30d0b859219a700ae880eb385d62095d348f5ea136d6015
#30 writing layer sha256:014cff740c9ec6e9a30d0b859219a700ae880eb385d62095d348f5ea136d6015 done
#30 writing layer sha256:0487800842442c7a031a39e1e1857bc6dae4b4f7e5daf3d625f7a8a4833fb364 done
#30 writing layer sha256:06c6aee94862daf0603783db4e1de6f8524b30ac9fbe0374ab3f1d85b2f76f7f done
#30 writing layer sha256:0a1756432df4a4350712d8ae5c003f1526bd2180800b3ae6301cfc9ccf370254 done
#30 writing layer sha256:0a77dcbd0e648ddc4f8e5230ade8fdb781d99e24fa4f13ca96a360c7f7e6751f done
#30 writing layer sha256:0ec682bf99715a9f88631226f3749e2271b8b9f254528ef61f65ed829984821c done
#30 writing layer sha256:1c5c3aa9c2c8bfd1b9eb36248f5b6d67b3db73ef43440f9dd897615771974b39 done
#30 writing layer sha256:1f73278b7f17492ce1a8b28b139d54596961596d6790dc20046fa6d5909f3e9c done
#30 writing layer sha256:2070dbb5e4fec1f79111a1b9934b95a4bda91fff6888840d2e53b48b655f352d
#30 writing layer sha256:2070dbb5e4fec1f79111a1b9934b95a4bda91fff6888840d2e53b48b655f352d 1.3s done
#30 writing layer sha256:20d331454f5fb557f2692dfbdbe092c718fd2cb55d5db9d661b62228dacca5c2
#30 writing layer sha256:20d331454f5fb557f2692dfbdbe092c718fd2cb55d5db9d661b62228dacca5c2 done
#30 writing layer sha256:238f69a43816e481f0295995fcf5fe74d59facf0f9f99734c8d0a2fb140630e0 done
#30 writing layer sha256:2ad84487f9d4d31cd1e0a92697a5447dd241935253d036b272ef16d31620c1e7 done
#30 writing layer sha256:2f65750928993b5b31fe572d9e085b53853c5a344feeb0e8615898e285a8c256 done
#30 writing layer sha256:2f868c17ea0c13f86d79c6ea231aa9677089aa72e290ec3b95f9983f46048136 0.0s done
#30 writing layer sha256:3777c6498f08c0400339c243e827d465075b7296eb2526e38d9b01c84f8764d8 done
#30 writing layer sha256:3e3e04011ebdba380ab129f0ee390626cb2a600623815ca756340c18bedb9517 done
#30 writing layer sha256:42619ce4a0c9e54cfd0ee41a8e5f27d58b3f51becabd1ac6de725fbe6c42b14a done
#30 writing layer sha256:49bdc9abf8a437ccff67cc11490ba52c976577992909856a86be872a34d3b950 done
#30 writing layer sha256:4b691ba9f48b41eaa0c754feba8366f1c030464fcbc55eeffa6c86675990933a done
#30 writing layer sha256:4d04a8db404f16c2704fa10739cb6745a0187713a21a6ef0deb34b48629b54c1 done
#30 writing layer sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1 done
#30 writing layer sha256:51d232f1f4212f460f268207affeab6246b2b60caf6d9f61b4fab22202848747 0.0s done
#30 writing layer sha256:53e291bdb605f68216b97e71175b5a348001546ff073083ae622feb044916445 done
#30 writing layer sha256:542bc8c8d18fbc95e6794122c3593a4a693f8ab6dda4460406f4d7b1ae64a2bc done
#30 writing layer sha256:57f244836ad318f9bbb3b29856ae1a5b31038bfbb9b43d2466d51c199eb55041 done
#30 writing layer sha256:5b5b131e0f20db4cb8e568b623a95f8fc16ed1c6b322a9366df70b59a881f24f done
#30 writing layer sha256:5b90d17b5048adcadefd0b1e4dba9a99247a8827a887e1ca042df375c85b518d done
#30 writing layer sha256:5e622c7efc8f430c7e6928544ba87f94eac8a4127d711933a0bf49e6a76eda0c
#30 writing layer sha256:5e622c7efc8f430c7e6928544ba87f94eac8a4127d711933a0bf49e6a76eda0c 0.4s done
#30 writing layer sha256:62452179df7c18e292f141d4aec29e6aba9ff8270c893731169fc6f41dc07631
#30 writing layer sha256:62452179df7c18e292f141d4aec29e6aba9ff8270c893731169fc6f41dc07631 done
#30 writing layer sha256:6630c387f5f2115bca2e646fd0c2f64e1f3d5431c2e050abe607633883eda230 done
#30 writing layer sha256:6661e0146e77a8bcb03edbfda95bf7780c8bb4c4f98bc03a398c88f4b2403d12 done
#30 writing layer sha256:717ebf8c9c66ae393ad01e50dbac4413d7b026b9c97d4d348b22ad17052a1a35 done
#30 writing layer sha256:773c6815e5e7d6855a62f8c5e2fabce3d939ded36c5420f15b54dd7908cdbcfa done
#30 writing layer sha256:7852b73ea931e3a8d3287ee7ef3cf4bad068e44f046583bfc2b81336fb299284 done
#30 writing layer sha256:7f8ec130348bcdac81c295e37fe82b4a6e5e9a3ca980a6343809c561020d82d7 done
#30 writing layer sha256:80885adcad6b5d021bb9f68b6c952018085bb4ce72011bdc0cf7fe8178b5960b done
#30 writing layer sha256:826d794ef7ef68566c3eede8e80bca99b8e40994c5dea2079da82d7d429e0203 0.0s done
#30 writing layer sha256:82a3436133b2b17bb407c7fe488932aa0ca55411f23ab55c34a6134b287c6a27 done
#30 writing layer sha256:8371d15eb4d69b1d98174dd098b8ddd5c4f19ec6f8d8b67e72dfa9891dc454b4 done
#30 writing layer sha256:85713f9b166b5add777c524ee807f6265d88b967cbeb9f961d6b09bf220c9a65 done
#30 writing layer sha256:8fe00505006a09966e763918147ef6ed55bb6695b26e4940c780ee430dc5da8e done
#30 writing layer sha256:90eae6faa5cc5ba62f12c25915cdfb1a7a51abfba0d05cb5818c3f908f4e345f done
#30 writing layer sha256:9205d97d9d3e906698bcc6c42d45727c2fa6ec2622abf953d46778c3b8c78edc done
#30 writing layer sha256:993369dbcc13162a6654d2a3e990b8d8b5f37963564d25710e12764337261ae3 done
#30 writing layer sha256:99e42a4adebadb39bf55bf94bbd9fb8034230ee19b6b0a42e6ff96f2e7794f30 done
#30 writing layer sha256:9ac855545fa90ed2bf3b388fdff9ef06ac9427b0c0fca07c9e59161983d8827e done
#30 writing layer sha256:9bd8f1c975ca5c9efcdedc1e1a31269e8492195a76e73b95af8c0fc7c7d8a2c6
#30 writing layer sha256:9bd8f1c975ca5c9efcdedc1e1a31269e8492195a76e73b95af8c0fc7c7d8a2c6 46.2s done
#30 writing layer sha256:9d19ee268e0d7bcf6716e6658ee1b0384a71d6f2f9aa1ae2085610cf7c7b316f
#30 writing layer sha256:9d19ee268e0d7bcf6716e6658ee1b0384a71d6f2f9aa1ae2085610cf7c7b316f done
#30 writing layer sha256:9fafbd4203c4fefe007a462e0d2cd4c1c7c41db2cfdc58d212279e1b9b4b230c done
#30 writing layer sha256:a1748eee9d376f97bd19225ba61dfada9986f063f4fc429e435f157abb629fc6 done
#30 writing layer sha256:a251fe5ae6c6d2d5034e4ca88b5dfe5d4827ff90b18e9b143a073232a32bb18d done
#30 writing layer sha256:a68f4e0ec09ec3b78cb4cf8e4511d658e34e7b6f676d7806ad9703194ff17604 done
#30 writing layer sha256:a8e4decc8f7289623b8fd7b9ba1ca555b5a755ebdbf81328d68209f148d9e602 done
#30 writing layer sha256:afa8073b7854514d2a2a4c91eb31250d02a8cbb6a4364c38d20219ed08dcdb13 done
#30 writing layer sha256:afde1c269453ce68a0f2b54c1ba8c5ecddeb18a19e5618a4acdef1f0fe3921af done
#30 writing layer sha256:b406feb20a37b8c87ef4f5ef814039e3adc90473d50c366b7d9bb6ded4e94a2e done
#30 writing layer sha256:b48a5fafcaba74eb5d7e7665601509e2889285b50a04b5b639a23f8adc818157 done
#30 writing layer sha256:ba9f7c75e4dd7942b944679995365aab766d3677da2e69e1d74472f471a484dd done
#30 writing layer sha256:bdc13166216ae226fa6976f9ce91f4f259d43972f1e0a9b723e436919534b2f4 done
#30 writing layer sha256:c815f0be64eded102822d81e029bd23b0d8d9a0fbfeb492ec0b4b0bc4ee777bf done
#30 writing layer sha256:c98533d2908f36a5e9b52faae83809b3b6865b50e90e2817308acfc64cd3655f done
#30 writing layer sha256:d0a18329aa85666501e304a488b966559ff54fab09dd36886f4ba1c97d9a3f4c 0.0s done
#30 writing layer sha256:d7da5c5e9a40c476c4b3188a845e3276dedfd752e015ea5113df5af64d4d43f7 done
#30 writing layer sha256:db20521a869adda8244cb64b783c65e1a911efaae0e73ae00e4a34ea6213d6ce done
#30 writing layer sha256:de6e4313f5826cce8249354c2525b5a9acde7edea4ca02018e437c9b4de3d9fc 0.0s done
#30 writing layer sha256:df4fd0ac710d7af949afbc6d25b5b4daf3f0596dabf3dec36fa7ca8fa6e1d049 done
#30 writing layer sha256:e291ddecfbe16b95ee9e90b5e90b1a3d0cfd53dc5e720d6b0f3d28e4a47cf5ac done
#30 writing layer sha256:e8acb678f16bc0c369d5cf9c184f2d3a1c773986816526e5e3e9c0354f7e757f done
#30 writing layer sha256:e9225f7ab6606813ec9acba98a064826ebfd6713a9645a58cd068538af1ecddb done
#30 writing layer sha256:f249faf9663a96b0911a903f8803b11a553c59b698013fb8343492fefdaaea90 done
#30 writing layer sha256:f608e2fbff86e98627b7e462057e7d2416522096d73fe4664b82fe6ce8a4047d done
#30 writing layer sha256:f65d191416580d6c38e3d95eee12377b75a4df548be1492618ce2a8c3c41b99e done
#30 writing config sha256:c5d7ca2ff9b60cb7174369608d683a7735902b5a315ccbd3661cdb01d19b7db3 0.0s done
#30 preparing build cache for export 48.8s done
#30 writing cache manifest sha256:43a5a705376a7f516f02e6223e10f78f54002aca9882c7de45970b0d70baa3e8 0.0s done
#30 DONE 48.8s
[2024-04-23 15:40:45,771] [INFO] (packager) - Build Summary:

Platform: x64-workstation/dgpu
    Status:     Succeeded
    Docker Tag: mednist_app-x64-workstation-dgpu-linux-amd64:1.0
    Tarball:    None

We can see that the MAP Docker image has been created.

!docker image ls | grep {tag_prefix}
mednist_app-x64-workstation-dgpu-linux-amd64                                              1.0                 553a0ca99a2e   5 minutes ago    17.7GB

We can display and inspect the MAP manifests by running the container with the show command. We can also extract the manifests and other contents of the MAP with the extract command, mapping a specific folder to the host's (we know our MAP is compliant and supports these commands).

Note

The host folder for storing the extracted content must first be created by the user. If it was instead created by Docker when running the container, it needs to be deleted and re-created.

!echo "Display manifests and extract MAP contents to the host folder, ./export"
!docker run --rm {tag_prefix}-x64-workstation-dgpu-linux-amd64:1.0 show
!rm -rf `pwd`/export && mkdir -p `pwd`/export
!docker run --rm -v `pwd`/export/:/var/run/holoscan/export/ {tag_prefix}-x64-workstation-dgpu-linux-amd64:1.0 extract
!ls `pwd`/export
Display manifests and extract MAP contents to the host folder, ./export

============================== app.json ==============================
{
  "apiVersion": "1.0.0",
  "command": "[\"python3\", \"/opt/holoscan/app/mednist_classifier_monaideploy.py\"]",
  "environment": {
    "HOLOSCAN_APPLICATION": "/opt/holoscan/app",
    "HOLOSCAN_INPUT_PATH": "input/",
    "HOLOSCAN_OUTPUT_PATH": "output/",
    "HOLOSCAN_WORKDIR": "/var/holoscan",
    "HOLOSCAN_MODEL_PATH": "/opt/holoscan/models",
    "HOLOSCAN_CONFIG_PATH": "/var/holoscan/app.yaml",
    "HOLOSCAN_APP_MANIFEST_PATH": "/etc/holoscan/app.json",
    "HOLOSCAN_PKG_MANIFEST_PATH": "/etc/holoscan/pkg.json",
    "HOLOSCAN_DOCS_PATH": "/opt/holoscan/docs",
    "HOLOSCAN_LOGS_PATH": "/var/holoscan/logs"
  },
  "input": {
    "path": "input/",
    "formats": null
  },
  "liveness": null,
  "output": {
    "path": "output/",
    "formats": null
  },
  "readiness": null,
  "sdk": "monai-deploy",
  "sdkVersion": "0.5.1",
  "timeout": 0,
  "version": 1,
  "workingDirectory": "/var/holoscan"
}

============================== pkg.json ==============================
{
  "apiVersion": "1.0.0",
  "applicationRoot": "/opt/holoscan/app",
  "modelRoot": "/opt/holoscan/models",
  "models": {
    "model": "/opt/holoscan/models/model"
  },
  "resources": {
    "cpu": 1,
    "gpu": 1,
    "memory": "1Gi",
    "gpuMemory": "1Gi"
  },
  "version": 1,
  "platformConfig": "dgpu"
}

2024-04-23 22:40:48 [INFO] Copying application from /opt/holoscan/app to /var/run/holoscan/export/app

2024-04-23 22:40:48 [INFO] Copying application manifest file from /etc/holoscan/app.json to /var/run/holoscan/export/config/app.json
2024-04-23 22:40:48 [INFO] Copying pkg manifest file from /etc/holoscan/pkg.json to /var/run/holoscan/export/config/pkg.json
2024-04-23 22:40:48 [INFO] Copying application configuration from /var/holoscan/app.yaml to /var/run/holoscan/export/config/app.yaml

2024-04-23 22:40:48 [INFO] Copying models from /opt/holoscan/models to /var/run/holoscan/export/models

2024-04-23 22:40:48 [INFO] Copying documentation from /opt/holoscan/docs/ to /var/run/holoscan/export/docs
2024-04-23 22:40:48 [INFO] '/opt/holoscan/docs/' cannot be found.

app  config  models

Executing packaged app locally

The packaged app can be run locally through the MONAI Application Runner.

# Clear the output folder and run the MAP. The input is expected to be a folder.
!rm -rf {output_folder}
!monai-deploy run -i $HOLOSCAN_INPUT_PATH -o $HOLOSCAN_OUTPUT_PATH mednist_app-x64-workstation-dgpu-linux-amd64:1.0
[2024-04-23 15:40:49,986] [INFO] (runner) - Checking dependencies...
[2024-04-23 15:40:49,986] [INFO] (runner) - --> Verifying if "docker" is installed...

[2024-04-23 15:40:49,986] [INFO] (runner) - --> Verifying if "docker-buildx" is installed...

[2024-04-23 15:40:49,986] [INFO] (runner) - --> Verifying if "mednist_app-x64-workstation-dgpu-linux-amd64:1.0" is available...

[2024-04-23 15:40:50,062] [INFO] (runner) - Reading HAP/MAP manifest...
Preparing to copy... Successfully copied 2.56kB to /tmp/tmp9_4t5v97/app.json
Preparing to copy... Successfully copied 2.05kB to /tmp/tmp9_4t5v97/pkg.json
[2024-04-23 15:40:50,322] [INFO] (runner) - --> Verifying if "nvidia-ctk" is installed...

[2024-04-23 15:40:50,322] [INFO] (runner) - --> Verifying "nvidia-ctk" version...

[2024-04-23 15:40:50,636] [INFO] (common) - Launching container (f9af17c16239) using image 'mednist_app-x64-workstation-dgpu-linux-amd64:1.0'...
    container name:      objective_merkle
    host name:           mingq-dt
    network:             host
    user:                1000:1000
    ulimits:             memlock=-1:-1, stack=67108864:67108864
    cap_add:             CAP_SYS_PTRACE
    ipc mode:            host
    shared memory size:  67108864
    devices:             
    group_add:           44
2024-04-23 22:40:51 [INFO] Launching application python3 /opt/holoscan/app/mednist_classifier_monaideploy.py ...

[2024-04-23 22:40:54,170] [INFO] (root) - Parsed args: Namespace(log_level=None, input=None, output=None, model=None, workdir=None, argv=['/opt/holoscan/app/mednist_classifier_monaideploy.py'])

[2024-04-23 22:40:54,175] [INFO] (root) - AppContext object: AppContext(input_path=/var/holoscan/input, output_path=/var/holoscan/output, model_path=/opt/holoscan/models, workdir=/var/holoscan)

[info] [app_driver.cpp:1161] Launching the driver/health checking service

[info] [gxf_executor.cpp:247] Creating context

[info] [server.cpp:87] Health checking server listening on 0.0.0.0:8777

[info] [gxf_executor.cpp:1672] Loading extensions from configs...

[info] [gxf_executor.cpp:1842] Activating Graph...

[info] [gxf_executor.cpp:1874] Running Graph...

[info] [gxf_executor.cpp:1876] Waiting for completion...

2024-04-23 22:40:54.201 INFO  gxf/std/greedy_scheduler.cpp@191: Scheduling 3 entities

/home/holoscan/.local/lib/python3.10/site-packages/monai/data/meta_tensor.py:116: UserWarning: The given NumPy array is not writable, and PyTorch does not support non-writable tensors. This means writing to this tensor will result in undefined behavior. You may want to copy the array to protect its data or make it writable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:206.)

  return torch.as_tensor(x, *args, **_kwargs).as_subclass(cls)

[2024-04-23 22:40:55,396] [INFO] (root) - Finished writing DICOM instance to file /var/holoscan/output/1.2.826.0.1.3680043.8.498.27996829466530719648374470054709482881.dcm

[2024-04-23 22:40:55,396] [INFO] (monai.deploy.operators.dicom_text_sr_writer_operator.DICOMTextSRWriterOperator) - DICOM SOP instance saved in /var/holoscan/output/1.2.826.0.1.3680043.8.498.27996829466530719648374470054709482881.dcm

2024-04-23 22:40:55.396 INFO  gxf/std/greedy_scheduler.cpp@372: Scheduler stopped: Some entities are waiting for execution, but there are no periodic or async entities to get out of the deadlock.

[info] [gxf_executor.cpp:1879] Deactivating Graph...

2024-04-23 22:40:55.397 INFO  gxf/std/greedy_scheduler.cpp@401: Scheduler finished.

[info] [gxf_executor.cpp:1887] Graph execution finished.

[info] [gxf_executor.cpp:275] Destroying context

AbdomenCT

[2024-04-23 15:40:56,349] [INFO] (common) - Container 'objective_merkle'(f9af17c16239) exited.
!cat {output_folder}/output.json
"AbdomenCT"

Implementing and Packaging Application with MONAI Deploy App SDK

In the following sections we will discuss the details of building the application that was packaged and run above.

Based on the TorchScript model (classifier.zip), we will implement an application that processes an input JPEG image and writes the prediction (classification) result to a JSON file (output.json).

In our inference application, we will define two operators:

  1. LoadPILOperator - Load a JPEG image from the input path and pass the loaded image object to the next operator.

    • Input: a file path (Path)

    • Output: an image object in memory (Image)

  2. MedNISTClassifierOperator - Pre-transform the given image using MONAI’s Compose class, feed it to the TorchScript model (classifier.zip), and write the prediction to a JSON file (output.json)

    • Pre-transforms consist of three transforms – EnsureChannelFirst, ScaleIntensity, and EnsureType.

    • Input: an image object in memory (Image)

    • Output: the folder path (Path) to which the prediction result (output.json) is written
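The pre-transforms named above can be approximated in plain NumPy to see what they do to the pixel data. This is a rough sketch only; the operator itself uses MONAI's EnsureChannelFirst, ScaleIntensity, and EnsureType composed via Compose, and the stand-in array below is made up:

```python
import numpy as np

# Stand-in for a grayscale image loaded from a JPEG file.
gray = np.arange(64 * 64, dtype=np.float32).reshape(64, 64)

# EnsureChannelFirst (approx.): add a leading channel axis, (H, W) -> (1, H, W).
chw = gray[np.newaxis, ...]

# ScaleIntensity (approx.): rescale intensities to [0, 1] (MONAI's default range).
scaled = (chw - chw.min()) / (chw.max() - chw.min())

print(scaled.shape, float(scaled.min()), float(scaled.max()))
```

EnsureType then converts the array to the expected tensor type before it is fed to the model.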

The workflow of the application would look like this.

Workflow

Setup imports

Let’s import necessary classes/decorators and define MEDNIST_CLASSES.

import logging
import os
from pathlib import Path
from typing import Optional

import torch

from monai.deploy.conditions import CountCondition
from monai.deploy.core import AppContext, Application, ConditionType, Fragment, Image, Operator, OperatorSpec
from monai.deploy.operators.dicom_text_sr_writer_operator import DICOMTextSRWriterOperator, EquipmentInfo, ModelInfo
from monai.transforms import EnsureChannelFirst, Compose, EnsureType, ScaleIntensity

MEDNIST_CLASSES = ["AbdomenCT", "BreastMRI", "CXR", "ChestCT", "Hand", "HeadCT"]
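Once the model produces a vector of class scores, the predicted label is simply the MEDNIST_CLASSES entry at the argmax index. A minimal sketch, with made-up scores:

```python
MEDNIST_CLASSES = ["AbdomenCT", "BreastMRI", "CXR", "ChestCT", "Hand", "HeadCT"]

scores = [2.3, -0.1, 0.4, 1.1, -1.2, 0.0]  # hypothetical model output, one score per class
pred = MEDNIST_CLASSES[max(range(len(scores)), key=scores.__getitem__)]
print(pred)  # AbdomenCT
```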

Creating Operator classes

LoadPILOperator

class LoadPILOperator(Operator):
    """Load image from the given input (DataPath) and set numpy array to the output (Image)."""

    DEFAULT_INPUT_FOLDER = Path.cwd() / "input"
    DEFAULT_OUTPUT_NAME = "image"

    # For now, need to have the input folder as an instance attribute, set on init.
    # If dynamically changing the input folder per compute, use an (optional) input port to convey the
    # value of the input folder, which is then emitted by an upstream operator.
    def __init__(
        self,
        fragment: Fragment,
        *args,
        input_folder: Path = DEFAULT_INPUT_FOLDER,
        output_name: str = DEFAULT_OUTPUT_NAME,
        **kwargs,
    ):
        """Creates a loader object, with the input folder and the output port name overridden as needed.

        Args:
            fragment (Fragment): An instance of the Application class which is derived from Fragment.
            input_folder (Path): Folder from which to load input file(s).
                                 Defaults to `input` in the current working directory.
            output_name (str): Name of the output port, which is an image object. Defaults to `image`.
        """

        self._logger = logging.getLogger("{}.{}".format(__name__, type(self).__name__))
        self.input_path = input_folder
        self.index = 0
        self.output_name_image = (
            output_name.strip() if output_name and len(output_name.strip()) > 0 else LoadPILOperator.DEFAULT_OUTPUT_NAME
        )

        super().__init__(fragment, *args, **kwargs)

    def setup(self, spec: OperatorSpec):
        """Set up the named output port(s)."""
        spec.output(self.output_name_image)

    def compute(self, op_input, op_output, context):
        import numpy as np
        from PIL import Image as PILImage

        # Input path is stored in the object attribute, but could change to use a named port if need be.
        input_path = self.input_path
        if input_path.is_dir():
            input_path = next(self.input_path.glob("*.*"))  # take the first file

        image = PILImage.open(input_path)
        image = image.convert("L")  # convert to greyscale image
        image_arr = np.asarray(image)

        output_image = Image(image_arr)  # create Image domain object with a numpy array
        op_output.emit(output_image, self.output_name_image)  # cannot omit the name even if single output.
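Note how compute() falls back to the first file when the input path is a directory. A stdlib-only sketch of that selection (the folder and file names are invented for illustration; sorted() is added only to make the pick deterministic, whereas glob order is not guaranteed):

```python
import tempfile
from pathlib import Path

# Stand-in input folder with two files; the names are illustrative only.
tmp_dir = Path(tempfile.mkdtemp())
(tmp_dir / "a.jpeg").write_bytes(b"")
(tmp_dir / "b.jpeg").write_bytes(b"")

input_path = tmp_dir
if input_path.is_dir():
    # LoadPILOperator.compute() uses next(self.input_path.glob("*.*")),
    # which takes an arbitrary first match; sorted() here is for determinism.
    input_path = next(iter(sorted(input_path.glob("*.*"))))

print(input_path.name)  # prints a.jpeg
```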

MedNISTClassifierOperator

class MedNISTClassifierOperator(Operator):
    """Classifies the given image and returns the class name.

    Named inputs:
        image: Image object for which to generate the classification.
        output_folder: Optional, the path to save the results JSON file, overriding the one set on __init__

    Named output:
        result_text: The classification results in text.
    """

    DEFAULT_OUTPUT_FOLDER = Path.cwd() / "classification_results"
    # For testing the app directly, the model should be at the following path.
    MODEL_LOCAL_PATH = Path(os.environ.get("HOLOSCAN_MODEL_PATH", Path.cwd() / "model/model.ts"))

    def __init__(
        self,
        fragment: Fragment,
        *args,
        app_context: AppContext,
        model_name: Optional[str] = "",
        model_path: Path = MODEL_LOCAL_PATH,
        output_folder: Path = DEFAULT_OUTPUT_FOLDER,
        **kwargs,
    ):
        """Creates an instance with the reference back to the containing application/fragment.

        fragment (Fragment): An instance of the Application class which is derived from Fragment.
        model_name (str, optional): Name of the model. Defaults to "" for a single-model app.
        model_path (Path): Path to the model file. Defaults to model/model.ts in the current working directory.
        output_folder (Path, optional): Output folder for saving the classification results JSON file.
        """

        # The names used for the model inference input and output
        self._input_dataset_key = "image"
        self._pred_dataset_key = "pred"

        # The names used for the operator input and output
        self.input_name_image = "image"
        self.output_name_result = "result_text"

        # The name of the optional input port for passing data to override the output folder path.
        self.input_name_output_folder = "output_folder"

        # The output folder set on the object can be overridden at each compute by data in the optional named input
        self.output_folder = output_folder

        # Need the name when there are multiple models loaded
        self._model_name = model_name.strip() if isinstance(model_name, str) else ""
        # Need the path to load the models when they are not loaded in the execution context
        self.model_path = model_path
        self.app_context = app_context
        self.model = self._get_model(self.app_context, self.model_path, self._model_name)

        # This needs to be at the end of the constructor.
        super().__init__(fragment, *args, **kwargs)

    def _get_model(self, app_context: AppContext, model_path: Path, model_name: str):
        """Load the model with the given name from context or model path

        Args:
            app_context (AppContext): The application context object holding the model(s)
            model_path (Path): The path to the model file, as a backup to load model directly
            model_name (str): The name of the model, when multiples are loaded in the context
        """

        if app_context.models:
            # `app_context.models.get(model_name)` returns a model instance if exists.
            # If model_name is not specified and only one model exists, it returns that model.
            model = app_context.models.get(model_name)
        else:
            model = torch.jit.load(
                model_path,
                map_location=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
            )

        return model

    def setup(self, spec: OperatorSpec):
        """Set up the operator named input and named output, both are in-memory objects."""

        spec.input(self.input_name_image)
        spec.input(self.input_name_output_folder).condition(ConditionType.NONE)  # Optional for overriding.
        spec.output(self.output_name_result).condition(ConditionType.NONE)  # Not forcing a downstream receiver.

    @property
    def transform(self):
        return Compose([EnsureChannelFirst(channel_dim="no_channel"), ScaleIntensity(), EnsureType()])

    def compute(self, op_input, op_output, context):
        import json

        import torch

        img = op_input.receive(self.input_name_image).asnumpy()  # (64, 64), uint8. Input validation can be added.
        image_tensor = self.transform(img)  # (1, 64, 64), torch.float64
        image_tensor = image_tensor[None].float()  # (1, 1, 64, 64), torch.float32

        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        image_tensor = image_tensor.to(device)

        with torch.no_grad():
            outputs = self.model(image_tensor)

        _, output_classes = outputs.max(dim=1)

        result = MEDNIST_CLASSES[output_classes[0]]  # get the class name
        print(result)
        op_output.emit(result, self.output_name_result)

        # Get output folder, with value in optional input port overriding the obj attribute
        output_folder_on_compute = op_input.receive(self.input_name_output_folder) or self.output_folder
        output_folder_on_compute.mkdir(parents=True, exist_ok=True)  # Let exception bubble up if raised.
        output_path = output_folder_on_compute / "output.json"
        with open(output_path, "w") as fp:
            json.dump(result, fp)
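The preprocessing turns the (64, 64) uint8 array into a (1, 1, 64, 64) float32 tensor. Since MONAI may not be installed everywhere, here is a rough numpy-only approximation of what EnsureChannelFirst plus ScaleIntensity and the batch/cast steps do (shapes only; the real MONAI transforms should be used in the app):

```python
import numpy as np

# Dummy 64x64 uint8 image standing in for the loaded MedNIST image.
img = np.arange(64 * 64, dtype=np.uint8).reshape(64, 64)

x = img[np.newaxis, ...].astype(np.float32)  # add channel dim -> (1, 64, 64)
x = (x - x.min()) / (x.max() - x.min())      # min-max scale to [0, 1]
x = x[np.newaxis, ...]                       # add batch dim -> (1, 1, 64, 64)

print(x.shape, x.dtype)  # prints (1, 1, 64, 64) float32
```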

Creating Application class

Our application class looks like the code below.

It defines App class inheriting Application class.

LoadPILOperator is connected to MedNISTClassifierOperator by using self.add_flow() in the compose() method of App.

class App(Application):
    """Application class for the MedNIST classifier."""

    def compose(self):
        app_context = Application.init_app_context({})  # Do not pass argv in Jupyter Notebook
        app_input_path = Path(app_context.input_path)
        app_output_path = Path(app_context.output_path)
        model_path = Path(app_context.model_path)
        load_pil_op = LoadPILOperator(self, CountCondition(self, 1), input_folder=app_input_path, name="pil_loader_op")
        classifier_op = MedNISTClassifierOperator(
            self, app_context=app_context, output_folder=app_output_path, model_path=model_path, name="classifier_op"
        )

        my_model_info = ModelInfo("MONAI WG Trainer", "MEDNIST Classifier", "0.1", "xyz")
        my_equipment = EquipmentInfo(manufacturer="MONAI Deploy App SDK", manufacturer_model="DICOM SR Writer")
        my_special_tags = {"SeriesDescription": "Not for clinical use. The result is for research use only."}
        dicom_sr_operator = DICOMTextSRWriterOperator(
            self,
            copy_tags=False,
            model_info=my_model_info,
            equipment_info=my_equipment,
            custom_tags=my_special_tags,
            output_folder=app_output_path,
        )

        self.add_flow(load_pil_op, classifier_op, {("image", "image")})
        self.add_flow(classifier_op, dicom_sr_operator, {("result_text", "text")})

Executing app locally

We can execute the app in the Jupyter notebook. Before doing so, we also need to clean up the output folder that was created by running the packaged containerized app in the previous cell.

!rm -rf $HOLOSCAN_OUTPUT_PATH
app = App().run()
[2024-04-23 15:41:01,799] [INFO] (root) - Parsed args: Namespace(log_level=None, input=None, output=None, model=None, workdir=None, argv=[])
[2024-04-23 15:41:01,818] [INFO] (root) - AppContext object: AppContext(input_path=input, output_path=output, model_path=models, workdir=)
2024-04-23 15:41:01.850 INFO  gxf/std/greedy_scheduler.cpp@191: Scheduling 3 entities
[info] [gxf_executor.cpp:247] Creating context
[info] [gxf_executor.cpp:1672] Loading extensions from configs...
[info] [gxf_executor.cpp:1842] Activating Graph...
[info] [gxf_executor.cpp:1874] Running Graph...
[info] [gxf_executor.cpp:1876] Waiting for completion...
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages/monai/data/meta_tensor.py:116: UserWarning: The given NumPy array is not writable, and PyTorch does not support non-writable tensors. This means writing to this tensor will result in undefined behavior. You may want to copy the array to protect its data or make it writable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:206.)
  return torch.as_tensor(x, *args, **_kwargs).as_subclass(cls)
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages/pydicom/valuerep.py:443: UserWarning: Invalid value for VR UI: 'xyz'. Please see <https://dicom.nema.org/medical/dicom/current/output/html/part05.html#table_6.2-1> for allowed values for each VR.
  warnings.warn(msg)
[2024-04-23 15:41:04,196] [INFO] (root) - Finished writing DICOM instance to file output/1.2.826.0.1.3680043.8.498.35898050102915969373889764509894247367.dcm
[2024-04-23 15:41:04,198] [INFO] (monai.deploy.operators.dicom_text_sr_writer_operator.DICOMTextSRWriterOperator) - DICOM SOP instance saved in output/1.2.826.0.1.3680043.8.498.35898050102915969373889764509894247367.dcm
AbdomenCT
2024-04-23 15:41:04.199 INFO  gxf/std/greedy_scheduler.cpp@372: Scheduler stopped: Some entities are waiting for execution, but there are no periodic or async entities to get out of the deadlock.
2024-04-23 15:41:04.200 INFO  gxf/std/greedy_scheduler.cpp@401: Scheduler finished.
[info] [gxf_executor.cpp:1879] Deactivating Graph...
[info] [gxf_executor.cpp:1887] Graph execution finished.
[info] [gxf_executor.cpp:275] Destroying context
!cat $HOLOSCAN_OUTPUT_PATH/output.json
"AbdomenCT"
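The quotes around AbdomenCT are expected: compute() passes the class-name string straight to json.dump, so output.json holds a bare JSON string. A stdlib round trip (the temp folder is just a stand-in for the app output folder):

```python
import json
import tempfile
from pathlib import Path

out_dir = Path(tempfile.mkdtemp())  # stand-in for the app output folder
result = "AbdomenCT"

# Same call as in MedNISTClassifierOperator.compute().
with open(out_dir / "output.json", "w") as fp:
    json.dump(result, fp)

text = (out_dir / "output.json").read_text()
print(text)              # prints "AbdomenCT" (quotes included)
print(json.loads(text))  # prints AbdomenCT
```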

Once the application is verified inside the Jupyter notebook, we can write the whole application to a file (mednist_classifier_monaideploy.py) by concatenating the code above, then add the following lines:

if __name__ == "__main__":
    App().run()

The above lines are needed so the application code can be executed with the Python interpreter.

# Create an application folder
!mkdir -p mednist_app && rm -rf mednist_app/*
%%writefile mednist_app/mednist_classifier_monaideploy.py

# Copyright 2021-2023 MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import logging
import os
from pathlib import Path
from typing import Optional

import torch

from monai.deploy.conditions import CountCondition
from monai.deploy.core import AppContext, Application, ConditionType, Fragment, Image, Operator, OperatorSpec
from monai.deploy.operators.dicom_text_sr_writer_operator import DICOMTextSRWriterOperator, EquipmentInfo, ModelInfo
from monai.transforms import EnsureChannelFirst, Compose, EnsureType, ScaleIntensity

MEDNIST_CLASSES = ["AbdomenCT", "BreastMRI", "CXR", "ChestCT", "Hand", "HeadCT"]


# @md.env(pip_packages=["pillow"])
class LoadPILOperator(Operator):
    """Load an image from the given input (DataPath) and set the numpy array to the output (Image)."""

    DEFAULT_INPUT_FOLDER = Path.cwd() / "input"
    DEFAULT_OUTPUT_NAME = "image"

    # For now, need to have the input folder as an instance attribute, set on init.
    # If dynamically changing the input folder per compute, then use an (optional) input port to convey the
    # value of the input folder, which is then emitted by an upstream operator.
    def __init__(
        self,
        fragment: Fragment,
        *args,
        input_folder: Path = DEFAULT_INPUT_FOLDER,
        output_name: str = DEFAULT_OUTPUT_NAME,
        **kwargs,
    ):
        """Creates a loader object, with the input folder and the output port name overridden as needed.

        Args:
            fragment (Fragment): An instance of the Application class which is derived from Fragment.
            input_folder (Path): Folder from which to load input file(s).
                                 Defaults to `input` in the current working directory.
            output_name (str): Name of the output port, which is an image object. Defaults to `image`.
        """

        self._logger = logging.getLogger("{}.{}".format(__name__, type(self).__name__))
        self.input_path = input_folder
        self.index = 0
        self.output_name_image = (
            output_name.strip() if output_name and len(output_name.strip()) > 0 else LoadPILOperator.DEFAULT_OUTPUT_NAME
        )

        super().__init__(fragment, *args, **kwargs)

    def setup(self, spec: OperatorSpec):
        """Set up the named output port(s)."""
        spec.output(self.output_name_image)

    def compute(self, op_input, op_output, context):
        import numpy as np
        from PIL import Image as PILImage

        # Input path is stored in the object attribute, but could change to use a named port if need be.
        input_path = self.input_path
        if input_path.is_dir():
            input_path = next(self.input_path.glob("*.*"))  # take the first file

        image = PILImage.open(input_path)
        image = image.convert("L")  # convert to greyscale image
        image_arr = np.asarray(image)

        output_image = Image(image_arr)  # create Image domain object with a numpy array
        op_output.emit(output_image, self.output_name_image)  # cannot omit the name even if single output.


# @md.env(pip_packages=["monai"])
class MedNISTClassifierOperator(Operator):
    """Classifies the given image and returns the class name.

    Named inputs:
        image: Image object for which to generate the classification.
        output_folder: Optional, the path to save the results JSON file, overriding the one set on __init__

    Named output:
        result_text: The classification results in text.
    """

    DEFAULT_OUTPUT_FOLDER = Path.cwd() / "classification_results"
    # For testing the app directly, the model should be at the following path.
    MODEL_LOCAL_PATH = Path(os.environ.get("HOLOSCAN_MODEL_PATH", Path.cwd() / "model/model.ts"))

    def __init__(
        self,
        fragment: Fragment,
        *args,
        app_context: AppContext,
        model_name: Optional[str] = "",
        model_path: Path = MODEL_LOCAL_PATH,
        output_folder: Path = DEFAULT_OUTPUT_FOLDER,
        **kwargs,
    ):
        """Creates an instance with the reference back to the containing application/fragment.

        fragment (Fragment): An instance of the Application class which is derived from Fragment.
        model_name (str, optional): Name of the model. Defaults to "" for a single-model app.
        model_path (Path): Path to the model file. Defaults to model/model.ts in the current working directory.
        output_folder (Path, optional): Output folder for saving the classification results JSON file.
        """

        # The names used for the model inference input and output
        self._input_dataset_key = "image"
        self._pred_dataset_key = "pred"

        # The names used for the operator input and output
        self.input_name_image = "image"
        self.output_name_result = "result_text"

        # The name of the optional input port for passing data to override the output folder path.
        self.input_name_output_folder = "output_folder"

        # The output folder set on the object can be overridden at each compute by data in the optional named input
        self.output_folder = output_folder

        # Need the name when there are multiple models loaded
        self._model_name = model_name.strip() if isinstance(model_name, str) else ""
        # Need the path to load the models when they are not loaded in the execution context
        self.model_path = model_path
        self.app_context = app_context
        self.model = self._get_model(self.app_context, self.model_path, self._model_name)

        # This needs to be at the end of the constructor.
        super().__init__(fragment, *args, **kwargs)

    def _get_model(self, app_context: AppContext, model_path: Path, model_name: str):
        """Load the model with the given name from context or model path

        Args:
            app_context (AppContext): The application context object holding the model(s)
            model_path (Path): The path to the model file, as a backup to load model directly
            model_name (str): The name of the model, when multiples are loaded in the context
        """

        if app_context.models:
            # `app_context.models.get(model_name)` returns a model instance if exists.
            # If model_name is not specified and only one model exists, it returns that model.
            model = app_context.models.get(model_name)
        else:
            model = torch.jit.load(
                model_path,
                map_location=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
            )

        return model

    def setup(self, spec: OperatorSpec):
        """Set up the operator named input and named output, both are in-memory objects."""

        spec.input(self.input_name_image)
        spec.input(self.input_name_output_folder).condition(ConditionType.NONE)  # Optional for overriding.
        spec.output(self.output_name_result).condition(ConditionType.NONE)  # Not forcing a downstream receiver.

    @property
    def transform(self):
        return Compose([EnsureChannelFirst(channel_dim="no_channel"), ScaleIntensity(), EnsureType()])

    def compute(self, op_input, op_output, context):
        import json

        import torch

        img = op_input.receive(self.input_name_image).asnumpy()  # (64, 64), uint8. Input validation can be added.
        image_tensor = self.transform(img)  # (1, 64, 64), torch.float64
        image_tensor = image_tensor[None].float()  # (1, 1, 64, 64), torch.float32

        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        image_tensor = image_tensor.to(device)

        with torch.no_grad():
            outputs = self.model(image_tensor)

        _, output_classes = outputs.max(dim=1)

        result = MEDNIST_CLASSES[output_classes[0]]  # get the class name
        print(result)
        op_output.emit(result, self.output_name_result)

        # Get output folder, with value in optional input port overriding the obj attribute
        output_folder_on_compute = op_input.receive(self.input_name_output_folder) or self.output_folder
        output_folder_on_compute.mkdir(parents=True, exist_ok=True)  # Let exception bubble up if raised.
        output_path = output_folder_on_compute / "output.json"
        with open(output_path, "w") as fp:
            json.dump(result, fp)


# @md.resource(cpu=1, gpu=1, memory="1Gi")
class App(Application):
    """Application class for the MedNIST classifier."""

    def compose(self):
        # Use command line options over environment variables to init the app context.
        app_context = Application.init_app_context(self.argv)
        app_input_path = Path(app_context.input_path)
        app_output_path = Path(app_context.output_path)
        model_path = Path(app_context.model_path)
        load_pil_op = LoadPILOperator(self, CountCondition(self, 1), input_folder=app_input_path, name="pil_loader_op")
        classifier_op = MedNISTClassifierOperator(
            self, app_context=app_context, output_folder=app_output_path, model_path=model_path, name="classifier_op"
        )

        my_model_info = ModelInfo("MONAI WG Trainer", "MEDNIST Classifier", "0.1", "xyz")
        my_equipment = EquipmentInfo(manufacturer="MONAI Deploy App SDK", manufacturer_model="DICOM SR Writer")
        my_special_tags = {"SeriesDescription": "Not for clinical use. The result is for research use only."}
        dicom_sr_operator = DICOMTextSRWriterOperator(
            self,
            copy_tags=False,
            model_info=my_model_info,
            equipment_info=my_equipment,
            custom_tags=my_special_tags,
            output_folder=app_output_path,
        )

        self.add_flow(load_pil_op, classifier_op, {("image", "image")})
        self.add_flow(classifier_op, dicom_sr_operator, {("result_text", "text")})


if __name__ == "__main__":
    App().run()
Writing mednist_app/mednist_classifier_monaideploy.py

This time, let’s execute the app on the command line.

Note

Since the environment variables have been set and contain the correct paths, it is not necessary to provide the command line options on running the application, though the following demonstrates the use of the options.

!python "mednist_app/mednist_classifier_monaideploy.py" -i {input_folder} -o {output_folder} -m {models_folder} -l DEBUG
[2024-04-23 15:41:08,736] [INFO] (root) - Parsed args: Namespace(log_level='DEBUG', input=PosixPath('/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/input'), output=PosixPath('/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output'), model=PosixPath('/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models'), workdir=None, argv=['mednist_app/mednist_classifier_monaideploy.py', '-i', 'input', '-o', 'output', '-m', 'models', '-l', 'DEBUG'])
[2024-04-23 15:41:08,740] [INFO] (root) - AppContext object: AppContext(input_path=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/input, output_path=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output, model_path=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models, workdir=)
[info] [gxf_executor.cpp:247] Creating context
[info] [gxf_executor.cpp:1672] Loading extensions from configs...
[info] [gxf_executor.cpp:1842] Activating Graph...
[info] [gxf_executor.cpp:1874] Running Graph...
[info] [gxf_executor.cpp:1876] Waiting for completion...
2024-04-23 15:41:08.763 INFO  gxf/std/greedy_scheduler.cpp@191: Scheduling 3 entities
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages/monai/data/meta_tensor.py:116: UserWarning: The given NumPy array is not writable, and PyTorch does not support non-writable tensors. This means writing to this tensor will result in undefined behavior. You may want to copy the array to protect its data or make it writable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:206.)
  return torch.as_tensor(x, *args, **_kwargs).as_subclass(cls)
AbdomenCT
[2024-04-23 15:41:10,980] [DEBUG] (monai.deploy.operators.dicom_text_sr_writer_operator.DICOMTextSRWriterOperator) - Writing DICOM object...

[2024-04-23 15:41:10,980] [DEBUG] (root) - Writing DICOM common modules...
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages/pydicom/valuerep.py:443: UserWarning: Invalid value for VR UI: 'xyz'. Please see <https://dicom.nema.org/medical/dicom/current/output/html/part05.html#table_6.2-1> for allowed values for each VR.
  warnings.warn(msg)
[2024-04-23 15:41:10,983] [DEBUG] (root) - DICOM common modules written:
Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 198
(0002, 0001) File Meta Information Version       OB: b'01'
(0002, 0002) Media Storage SOP Class UID         UI: Basic Text SR Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.826.0.1.3680043.8.498.10612655328316632493342405878828496728
(0002, 0010) Transfer Syntax UID                 UI: Implicit VR Little Endian
(0002, 0012) Implementation Class UID            UI: 1.2.40.0.13.1.1.1
(0002, 0013) Implementation Version Name         SH: '0.5.1+20.gb8697'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0012) Instance Creation Date              DA: '20240423'
(0008, 0013) Instance Creation Time              TM: '154110'
(0008, 0016) SOP Class UID                       UI: Basic Text SR Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.826.0.1.3680043.8.498.10612655328316632493342405878828496728
(0008, 0020) Study Date                          DA: '20240423'
(0008, 0021) Series Date                         DA: '20240423'
(0008, 0023) Content Date                        DA: '20240423'
(0008, 002a) Acquisition DateTime                DT: '20240423154110'
(0008, 0030) Study Time                          TM: '154110'
(0008, 0031) Series Time                         TM: '154110'
(0008, 0033) Content Time                        TM: '154110'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'SR'
(0008, 0070) Manufacturer                        LO: 'MOANI Deploy App SDK'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 0201) Timezone Offset From UTC            SH: '-0700'
(0008, 1030) Study Description                   LO: 'AI results.'
(0008, 103e) Series Description                  LO: 'CAUTION: Not for Diagnostic Use, for research use only.'
(0008, 1090) Manufacturer's Model Name           LO: 'DICOM SR Writer'
(0010, 0010) Patient's Name                      PN: ''
(0010, 0020) Patient ID                          LO: ''
(0010, 0021) Issuer of Patient ID                LO: ''
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: ''
(0018, 0015) Body Part Examined                  CS: ''
(0018, 1020) Software Versions                   LO: '0.5.1+20.gb8697'
(0018, a001)  Contributing Equipment Sequence  1 item(s) ---- 
   (0008, 0070) Manufacturer                        LO: 'MONAI WG Trainer'
   (0008, 1090) Manufacturer's Model Name           LO: 'MEDNIST Classifier'
   (0018, 1002) Device UID                          UI: xyz
   (0018, 1020) Software Versions                   LO: '0.1'
   (0040, a170)  Purpose of Reference Code Sequence  1 item(s) ---- 
      (0008, 0100) Code Value                          SH: 'Newcode1'
      (0008, 0102) Coding Scheme Designator            SH: '99IHE'
      (0008, 0104) Code Meaning                        LO: '"Processing Algorithm'
      ---------
   ---------
(0020, 000d) Study Instance UID                  UI: 1.2.826.0.1.3680043.8.498.12077853224410842102200362099184100728
(0020, 000e) Series Instance UID                 UI: 1.2.826.0.1.3680043.8.498.11250566385163932995456431918851640579
(0020, 0010) Study ID                            SH: '1'
(0020, 0011) Series Number                       IS: '2474'
(0020, 0013) Instance Number                     IS: '1'
(0040, 1001) Requested Procedure ID              SH: ''
[2024-04-23 15:41:10,984] [DEBUG] (root) - DICOM dataset to be written:Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 198
(0002, 0001) File Meta Information Version       OB: b'01'
(0002, 0002) Media Storage SOP Class UID         UI: Basic Text SR Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.826.0.1.3680043.8.498.10612655328316632493342405878828496728
(0002, 0010) Transfer Syntax UID                 UI: Implicit VR Little Endian
(0002, 0012) Implementation Class UID            UI: 1.2.40.0.13.1.1.1
(0002, 0013) Implementation Version Name         SH: '0.5.1+20.gb8697'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0012) Instance Creation Date              DA: '20240423'
(0008, 0013) Instance Creation Time              TM: '154110'
(0008, 0016) SOP Class UID                       UI: Basic Text SR Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.826.0.1.3680043.8.498.10612655328316632493342405878828496728
(0008, 0020) Study Date                          DA: '20240423'
(0008, 0021) Series Date                         DA: '20240423'
(0008, 0023) Content Date                        DA: '20240423'
(0008, 002a) Acquisition DateTime                DT: '20240423154110'
(0008, 0030) Study Time                          TM: '154110'
(0008, 0031) Series Time                         TM: '154110'
(0008, 0033) Content Time                        TM: '154110'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'SR'
(0008, 0070) Manufacturer                        LO: 'MOANI Deploy App SDK'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 0201) Timezone Offset From UTC            SH: '-0700'
(0008, 1030) Study Description                   LO: 'AI results.'
(0008, 103e) Series Description                  LO: 'Not for clinical use. The result is for research use only.'
(0008, 1090) Manufacturer's Model Name           LO: 'DICOM SR Writer'
(0010, 0010) Patient's Name                      PN: ''
(0010, 0020) Patient ID                          LO: ''
(0010, 0021) Issuer of Patient ID                LO: ''
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: ''
(0018, 0015) Body Part Examined                  CS: ''
(0018, 1020) Software Versions                   LO: '0.5.1+20.gb8697'
(0018, a001)  Contributing Equipment Sequence  1 item(s) ---- 
   (0008, 0070) Manufacturer                        LO: 'MONAI WG Trainer'
   (0008, 1090) Manufacturer's Model Name           LO: 'MEDNIST Classifier'
   (0018, 1002) Device UID                          UI: xyz
   (0018, 1020) Software Versions                   LO: '0.1'
   (0040, a170)  Purpose of Reference Code Sequence  1 item(s) ---- 
      (0008, 0100) Code Value                          SH: 'Newcode1'
      (0008, 0102) Coding Scheme Designator            SH: '99IHE'
      (0008, 0104) Code Meaning                        LO: '"Processing Algorithm'
      ---------
   ---------
(0020, 000d) Study Instance UID                  UI: 1.2.826.0.1.3680043.8.498.12077853224410842102200362099184100728
(0020, 000e) Series Instance UID                 UI: 1.2.826.0.1.3680043.8.498.11250566385163932995456431918851640579
(0020, 0010) Study ID                            SH: '1'
(0020, 0011) Series Number                       IS: '2474'
(0020, 0013) Instance Number                     IS: '1'
(0040, 1001) Requested Procedure ID              SH: ''
(0040, a040) Value Type                          CS: 'CONTAINER'
(0040, a043)  Concept Name Code Sequence  1 item(s) ---- 
   (0008, 0100) Code Value                          SH: '18748-4'
   (0008, 0102) Coding Scheme Designator            SH: 'LN'
   (0008, 0104) Code Meaning                        LO: 'Diagnostic Imaging Report'
   ---------
(0040, a050) Continuity Of Content               CS: 'SEPARATE'
(0040, a493) Verification Flag                   CS: 'UNVERIFIED'
(0040, a730)  Content Sequence  1 item(s) ---- 
   (0040, a010) Relationship Type                   CS: 'CONTAINS'
   (0040, a040) Value Type                          CS: 'TEXT'
   (0040, a043)  Concept Name Code Sequence  1 item(s) ---- 
      (0008, 0100) Code Value                          SH: '111412'
      (0008, 0102) Coding Scheme Designator            SH: 'DCM'
      (0008, 0104) Code Meaning                        LO: 'Narrative Summary'
      ---------
   (0040, a160) Text Value                          UT: 'AbdomenCT'
   ---------
[2024-04-23 15:41:10,987] [INFO] (root) - Finished writing DICOM instance to file /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output/1.2.826.0.1.3680043.8.498.10612655328316632493342405878828496728.dcm
[2024-04-23 15:41:10,988] [INFO] (monai.deploy.operators.dicom_text_sr_writer_operator.DICOMTextSRWriterOperator) - DICOM SOP instance saved in /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output/1.2.826.0.1.3680043.8.498.10612655328316632493342405878828496728.dcm
2024-04-23 15:41:10.988 INFO  gxf/std/greedy_scheduler.cpp@372: Scheduler stopped: Some entities are waiting for execution, but there are no periodic or async entities to get out of the deadlock.
2024-04-23 15:41:10.988 INFO  gxf/std/greedy_scheduler.cpp@401: Scheduler finished.
[info] [gxf_executor.cpp:1879] Deactivating Graph...
[info] [gxf_executor.cpp:1887] Graph execution finished.
[info] [gxf_executor.cpp:275] Destroying context
!cat {output_folder}/output.json
"AbdomenCT"
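The classification result can also be read back programmatically. A minimal sketch, assuming the output file contains a single JSON-encoded string as shown above (the `output/output.json` path is illustrative and should match your app's output folder):

```python
import json
from pathlib import Path

result_path = Path("output") / "output.json"  # hypothetical output location
if result_path.exists():
    label = json.loads(result_path.read_text())
else:
    # Fall back to the value captured in the run above
    label = json.loads('"AbdomenCT"')
print(label)
```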

Additional files required for packaging the app (creating the MAP Docker image)

In this version of the App SDK, the application configuration YAML file and the package requirements file both need to be written out into the application folder.

%%writefile mednist_app/app.yaml
%YAML 1.2
---
application:
  title: MONAI Deploy App Package - MedNIST Classifier App
  version: 1.0
  inputFormats: ["file"]
  outputFormats: ["file"]

resources:
  cpu: 1
  gpu: 1
  memory: 1Gi
  gpuMemory: 1Gi
Writing mednist_app/app.yaml
%%writefile mednist_app/requirements.txt
monai>=1.2.0
Pillow>=8.4.0
pydicom>=2.3.0
highdicom>=0.18.2
SimpleITK>=2.0.0
setuptools>=59.5.0 # for pkg_resources
Writing mednist_app/requirements.txt
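Before packaging, it can be useful to sanity-check the pinned minimum versions in the requirements file. A small sketch (the `parse_requirements` helper is hypothetical, not part of the SDK; it only handles the simple `name>=version` lines used here):

```python
import re

def parse_requirements(text):
    """Parse 'name>=version' lines, ignoring blank lines and '#' comments."""
    reqs = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if not line:
            continue
        m = re.match(r"([A-Za-z0-9_.\-]+)\s*>=\s*([\w.]+)", line)
        if m:
            reqs[m.group(1)] = m.group(2)
    return reqs

sample = """monai>=1.2.0
Pillow>=8.4.0
setuptools>=59.5.0 # for pkg_resources
"""
print(parse_requirements(sample))
```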

By now, we have built the application and prepared all the files necessary for creating the MONAI Application Package (MAP).