Deploying a MedNIST Classifier App with MONAI Deploy App SDK (Prebuilt Model)

This tutorial demonstrates how to package a trained model with the MONAI Deploy App SDK into a deployable inference application, which can be run as a local program as well as built into a MONAI Application Package (MAP) for containerized workflow execution.

Clone the GitHub project (the latest version of the main branch only)

!rm -rf source \
 && git clone --branch main --depth 1 https://github.com/Project-MONAI/monai-deploy-app-sdk.git source \
 && rm -rf source/.git
Cloning into 'source'...
remote: Enumerating objects: 314, done.
remote: Counting objects: 100% (314/314), done.
remote: Compressing objects: 100% (254/254), done.
remote: Total 314 (delta 71), reused 184 (delta 36), pack-reused 0 (from 0)
Receiving objects: 100% (314/314), 1.47 MiB | 3.95 MiB/s, done.
Resolving deltas: 100% (71/71), done.
!ls source/examples/apps/mednist_classifier_monaideploy/
app.yaml  mednist_classifier_monaideploy.py  requirements.txt

Install monai-deploy-app-sdk package

!pip install monai-deploy-app-sdk
Requirement already satisfied: monai-deploy-app-sdk in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (0.5.1+37.g96f7e31.dirty)
Requirement already satisfied: numpy>=1.21.6 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai-deploy-app-sdk) (1.26.4)
Requirement already satisfied: holoscan~=3.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai-deploy-app-sdk) (3.1.0)
Requirement already satisfied: holoscan-cli~=3.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai-deploy-app-sdk) (3.1.0)
Requirement already satisfied: colorama>=0.4.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai-deploy-app-sdk) (0.4.6)
Requirement already satisfied: tritonclient>=2.53.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (2.56.0)
Requirement already satisfied: typeguard>=3.0.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai-deploy-app-sdk) (4.4.2)
Requirement already satisfied: pip>22.0.2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=3.0->monai-deploy-app-sdk) (25.0.1)
Requirement already satisfied: cupy-cuda12x<14.0,>=12.2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=3.0->monai-deploy-app-sdk) (13.4.1)
Requirement already satisfied: cloudpickle<4.0,>=3.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=3.0->monai-deploy-app-sdk) (3.1.1)
Requirement already satisfied: wheel-axle-runtime<1.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan~=3.0->monai-deploy-app-sdk) (0.0.6)
Requirement already satisfied: Jinja2<4.0.0,>=3.1.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan-cli~=3.0->monai-deploy-app-sdk) (3.1.6)
Requirement already satisfied: packaging<24.0,>=23.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan-cli~=3.0->monai-deploy-app-sdk) (23.2)
Requirement already satisfied: psutil<7.0.0,>=6.0.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan-cli~=3.0->monai-deploy-app-sdk) (6.1.1)
Requirement already satisfied: python-on-whales<0.61.0,>=0.60.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan-cli~=3.0->monai-deploy-app-sdk) (0.60.1)
Requirement already satisfied: pyyaml<7.0,>=6.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan-cli~=3.0->monai-deploy-app-sdk) (6.0.2)
Requirement already satisfied: requests<3.0.0,>=2.31.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from holoscan-cli~=3.0->monai-deploy-app-sdk) (2.32.3)
Requirement already satisfied: python-rapidjson>=0.9.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from tritonclient>=2.53.0->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (1.20)
Requirement already satisfied: urllib3>=2.0.7 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from tritonclient>=2.53.0->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (2.4.0)
Requirement already satisfied: aiohttp<4.0.0,>=3.8.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (3.11.18)
Requirement already satisfied: cuda-python in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (12.8.0)
Requirement already satisfied: geventhttpclient>=2.3.3 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (2.3.3)
Requirement already satisfied: grpcio<1.68,>=1.63.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (1.67.1)
Requirement already satisfied: protobuf<6.0dev,>=5.26.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (5.29.4)
Requirement already satisfied: typing_extensions>=4.10.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from typeguard>=3.0.0->monai-deploy-app-sdk) (4.13.2)
Requirement already satisfied: aiohappyeyeballs>=2.3.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.1->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (2.6.1)
Requirement already satisfied: aiosignal>=1.1.2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.1->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (1.3.2)
Requirement already satisfied: async-timeout<6.0,>=4.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.1->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (5.0.1)
Requirement already satisfied: attrs>=17.3.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.1->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (25.3.0)
Requirement already satisfied: frozenlist>=1.1.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.1->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (1.6.0)
Requirement already satisfied: multidict<7.0,>=4.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.1->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (6.4.3)
Requirement already satisfied: propcache>=0.2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.1->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (0.3.1)
Requirement already satisfied: yarl<2.0,>=1.17.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.1->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (1.20.0)
Requirement already satisfied: fastrlock>=0.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from cupy-cuda12x<14.0,>=12.2->holoscan~=3.0->monai-deploy-app-sdk) (0.8.3)
Requirement already satisfied: gevent in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from geventhttpclient>=2.3.3->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (25.4.1)
Requirement already satisfied: certifi in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from geventhttpclient>=2.3.3->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (2025.1.31)
Requirement already satisfied: brotli in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from geventhttpclient>=2.3.3->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (1.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from Jinja2<4.0.0,>=3.1.5->holoscan-cli~=3.0->monai-deploy-app-sdk) (3.0.2)
Requirement already satisfied: pydantic<2,>=1.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from python-on-whales<0.61.0,>=0.60.1->holoscan-cli~=3.0->monai-deploy-app-sdk) (1.10.21)
Requirement already satisfied: tqdm in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from python-on-whales<0.61.0,>=0.60.1->holoscan-cli~=3.0->monai-deploy-app-sdk) (4.67.1)
Requirement already satisfied: typer>=0.4.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from python-on-whales<0.61.0,>=0.60.1->holoscan-cli~=3.0->monai-deploy-app-sdk) (0.15.2)
Requirement already satisfied: charset-normalizer<4,>=2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests<3.0.0,>=2.31.0->holoscan-cli~=3.0->monai-deploy-app-sdk) (3.4.1)
Requirement already satisfied: idna<4,>=2.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from requests<3.0.0,>=2.31.0->holoscan-cli~=3.0->monai-deploy-app-sdk) (3.10)
Requirement already satisfied: filelock in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from wheel-axle-runtime<1.0->holoscan~=3.0->monai-deploy-app-sdk) (3.18.0)
Requirement already satisfied: cuda-bindings~=12.8.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from cuda-python->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (12.8.0)
Requirement already satisfied: click>=8.0.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from typer>=0.4.1->python-on-whales<0.61.0,>=0.60.1->holoscan-cli~=3.0->monai-deploy-app-sdk) (8.1.8)
Requirement already satisfied: shellingham>=1.3.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from typer>=0.4.1->python-on-whales<0.61.0,>=0.60.1->holoscan-cli~=3.0->monai-deploy-app-sdk) (1.5.4)
Requirement already satisfied: rich>=10.11.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from typer>=0.4.1->python-on-whales<0.61.0,>=0.60.1->holoscan-cli~=3.0->monai-deploy-app-sdk) (14.0.0)
Requirement already satisfied: greenlet>=3.2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from gevent->geventhttpclient>=2.3.3->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (3.2.0)
Requirement already satisfied: zope.event in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from gevent->geventhttpclient>=2.3.3->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (5.0)
Requirement already satisfied: zope.interface in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from gevent->geventhttpclient>=2.3.3->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (7.2)
Requirement already satisfied: markdown-it-py>=2.2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from rich>=10.11.0->typer>=0.4.1->python-on-whales<0.61.0,>=0.60.1->holoscan-cli~=3.0->monai-deploy-app-sdk) (3.0.0)
Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from rich>=10.11.0->typer>=0.4.1->python-on-whales<0.61.0,>=0.60.1->holoscan-cli~=3.0->monai-deploy-app-sdk) (2.19.1)
Requirement already satisfied: setuptools in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from zope.event->gevent->geventhttpclient>=2.3.3->tritonclient[all]>=2.53.0->monai-deploy-app-sdk) (79.0.0)
Requirement already satisfied: mdurl~=0.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from markdown-it-py>=2.2.0->rich>=10.11.0->typer>=0.4.1->python-on-whales<0.61.0,>=0.60.1->holoscan-cli~=3.0->monai-deploy-app-sdk) (0.1.2)

Install necessary packages for the app

!pip install monai Pillow # for MONAI transforms and Pillow
!python -c "import pydicom" || pip install -q "pydicom>=1.4.2"
!python -c "import highdicom" || pip install -q "highdicom>=0.18.2" # for the use of DICOM Writer operators
Requirement already satisfied: monai in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (1.4.0)
Requirement already satisfied: Pillow in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (11.2.1)
Requirement already satisfied: numpy<2.0,>=1.24 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai) (1.26.4)
Requirement already satisfied: torch>=1.9 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from monai) (2.6.0)
Requirement already satisfied: filelock in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (3.18.0)
Requirement already satisfied: typing-extensions>=4.10.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (4.13.2)
Requirement already satisfied: networkx in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (3.4.2)
Requirement already satisfied: jinja2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (3.1.6)
Requirement already satisfied: fsspec in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (2025.3.2)
Requirement already satisfied: nvidia-cuda-nvrtc-cu12==12.4.127 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (12.4.127)
Requirement already satisfied: nvidia-cuda-runtime-cu12==12.4.127 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (12.4.127)
Requirement already satisfied: nvidia-cuda-cupti-cu12==12.4.127 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (12.4.127)
Requirement already satisfied: nvidia-cudnn-cu12==9.1.0.70 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (9.1.0.70)
Requirement already satisfied: nvidia-cublas-cu12==12.4.5.8 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (12.4.5.8)
Requirement already satisfied: nvidia-cufft-cu12==11.2.1.3 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (11.2.1.3)
Requirement already satisfied: nvidia-curand-cu12==10.3.5.147 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (10.3.5.147)
Requirement already satisfied: nvidia-cusolver-cu12==11.6.1.9 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (11.6.1.9)
Requirement already satisfied: nvidia-cusparse-cu12==12.3.1.170 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (12.3.1.170)
Requirement already satisfied: nvidia-cusparselt-cu12==0.6.2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (0.6.2)
Requirement already satisfied: nvidia-nccl-cu12==2.21.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (2.21.5)
Requirement already satisfied: nvidia-nvtx-cu12==12.4.127 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (12.4.127)
Requirement already satisfied: nvidia-nvjitlink-cu12==12.4.127 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (12.4.127)
Requirement already satisfied: triton==3.2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (3.2.0)
Requirement already satisfied: sympy==1.13.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from torch>=1.9->monai) (1.13.1)
Requirement already satisfied: mpmath<1.4,>=1.1.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from sympy==1.13.1->torch>=1.9->monai) (1.3.0)
Requirement already satisfied: MarkupSafe>=2.0 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages (from jinja2->torch>=1.9->monai) (3.0.2)

Download/Extract mednist_classifier_data.zip from Google Drive

Note: Data files are now access controlled. Please first request permission to access the shared folder on Google Drive, then download the zip file, mednist_classifier_data.zip, found in the mednist_classifier_app folder, to the same folder as this notebook.

import os
data_dir = os.path.join(os.path.curdir, "mednist_classifier_data.zip")
if not os.path.exists(data_dir):
    # Download mednist_classifier_data.zip
    !pip install gdown
    !gdown "https://drive.google.com/uc?id=1IoEJZFFixcNtPPKeKZfD_xSJSFQCbawl" # Redundant if already manually downloaded the file to avoid permission issue.
# Unzip the downloaded mednist_classifier_data.zip from the web browser or using gdown, to the notebook/turotials folder, and set up folders
input_folder = "input"
output_folder = "output"
models_folder = "models"
!rm -rf {input_folder}
!unzip -o "mednist_classifier_data.zip"

# Need to copy the model file to its own clean subfolder for packaging, to work around an issue in the Packager
models_folder = "models"
!rm -rf {models_folder} && mkdir -p {models_folder}/model && cp classifier.zip {models_folder}/model && ls {models_folder}/model
Archive:  mednist_classifier_data.zip
 extracting: classifier.zip          
 extracting: input/AbdomenCT_007000.jpeg  
classifier.zip

Set up environment variables

The application uses well-known environment variables for the input/output data paths, the working directory, and the AI model file path, if applicable. Defaults are used if these environment variables are absent.

Set the environment variables corresponding to the extracted data path.

%env HOLOSCAN_INPUT_PATH {input_folder}
%env HOLOSCAN_OUTPUT_PATH {output_folder}
%env HOLOSCAN_MODEL_PATH {models_folder}
env: HOLOSCAN_INPUT_PATH=input
env: HOLOSCAN_OUTPUT_PATH=output
env: HOLOSCAN_MODEL_PATH=models
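
As an aside, the way these well-known variables fall back to defaults can be sketched with plain os.environ lookups. This is a minimal illustration only (the fallback values below are hypothetical); the SDK's own AppContext performs the actual path resolution.

import os
from pathlib import Path

# Minimal sketch (not SDK code): resolve the well-known variables with hypothetical defaults.
input_path = Path(os.environ.get("HOLOSCAN_INPUT_PATH", "input"))
output_path = Path(os.environ.get("HOLOSCAN_OUTPUT_PATH", "output"))
model_path = Path(os.environ.get("HOLOSCAN_MODEL_PATH", "models"))
print(input_path, output_path, model_path)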

Package app (creating MAP container image)

Now we can use the CLI package command to build the MONAI Application Package (MAP) container image based on a supported base image.

Use the -l DEBUG option to see progress.

Note

This assumes that the NVIDIA Container Toolkit (or nvidia-docker) is installed on the local machine.
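
A quick pre-flight check can help confirm that the docker CLI is on the PATH and that an "nvidia" runtime is registered. This is only an illustrative sketch, not part of the SDK, and some NVIDIA Container Toolkit setups expose GPU access without registering a named runtime, so treat a negative result as a hint rather than proof.

import shutil
import subprocess

# Illustrative pre-flight check (not part of MONAI Deploy App SDK).
assert shutil.which("docker") is not None, "docker CLI not found on PATH"

# `docker info` reports the registered runtimes; the NVIDIA Container Toolkit
# typically registers one named "nvidia".
runtimes = subprocess.run(
    ["docker", "info", "--format", "{{.Runtimes}}"],
    capture_output=True,
    text=True,
    check=True,
).stdout
print("nvidia runtime registered:", "nvidia" in runtimes)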

tag_prefix = "mednist_app"

!monai-deploy package "source/examples/apps/mednist_classifier_monaideploy/mednist_classifier_monaideploy.py" -m {models_folder} -c "source/examples/apps/mednist_classifier_monaideploy/app.yaml" -t {tag_prefix}:1.0 --platform x64-workstation -l DEBUG
usage: monai-deploy package [-h] [-l {DEBUG,INFO,WARN,ERROR,CRITICAL}]
                            --config CONFIG [--docs DOCS] [--models MODELS]
                            --platform PLATFORM [--add ADDITIONAL_LIBS]
                            [--timeout TIMEOUT] [--version VERSION]
                            [--base-image BASE_IMAGE]
                            [--build-image BUILD_IMAGE]
                            [--includes [{debug,holoviz,torch,onnx} ...]]
                            [--build-cache BUILD_CACHE]
                            [--cmake-args CMAKE_ARGS]
                            [--holoscan-sdk-file HOLOSCAN_SDK_FILE]
                            [--monai-deploy-sdk-file MONAI_DEPLOY_SDK_FILE]
                            [--no-cache] [--sdk SDK] [--source SOURCE]
                            [--sdk-version SDK_VERSION] [--output OUTPUT]
                            --tag TAG [--username USERNAME] [--uid UID]
                            [--gid GID]
                            application
monai-deploy package: error: argument --platform: x64-workstation is not a valid option for --platforms.

We can see that the MAP Docker image is created.

!docker image ls | grep {tag_prefix}

We can display and inspect the MAP manifests by running the container with the show command. We can also extract the manifests and other contents of the MAP by using the extract command while mapping a specific folder to the host's (we know that our MAP is compliant and supports these commands).

Note

The host folder for storing the extracted content must be created by the user beforehand; if it was instead created by Docker when running the container, it needs to be deleted and re-created.

!echo "Display manifests and extract MAP contents to the host folder, ./export"
!docker run --rm {tag_prefix}-x64-workstation-dgpu-linux-amd64:1.0 show
!rm -rf `pwd`/export && mkdir -p `pwd`/export
!docker run --rm -v `pwd`/export/:/var/run/holoscan/export/ {tag_prefix}-x64-workstation-dgpu-linux-amd64:1.0 extract
!ls `pwd`/export
Display manifests and extract MAP contents to the host folder, ./export
Unable to find image 'mednist_app-x64-workstation-dgpu-linux-amd64:1.0' locally
docker: Error response from daemon: pull access denied for mednist_app-x64-workstation-dgpu-linux-amd64, repository does not exist or may require 'docker login': denied: requested access to the resource is denied

Run 'docker run --help' for more information
Unable to find image 'mednist_app-x64-workstation-dgpu-linux-amd64:1.0' locally
docker: Error response from daemon: pull access denied for mednist_app-x64-workstation-dgpu-linux-amd64, repository does not exist or may require 'docker login': denied: requested access to the resource is denied

Run 'docker run --help' for more information

Executing packaged app locally

The packaged app can be run locally through the MONAI Application Runner.

# Clear the output folder and run the MAP. The input is expected to be a folder.
!rm -rf {output_folder}
!monai-deploy run -i $HOLOSCAN_INPUT_PATH -o $HOLOSCAN_OUTPUT_PATH mednist_app-x64-workstation-dgpu-linux-amd64:1.0
[2025-04-22 10:01:00,178] [INFO] (runner) - Checking dependencies...
[2025-04-22 10:01:00,178] [INFO] (runner) - --> Verifying if "docker" is installed...

[2025-04-22 10:01:00,179] [INFO] (runner) - --> Verifying if "docker-buildx" is installed...

[2025-04-22 10:01:00,179] [INFO] (runner) - --> Verifying if "mednist_app-x64-workstation-dgpu-linux-amd64:1.0" is available...

[2025-04-22 10:01:00,206] [INFO] (common) - Attempting to pull image mednist_app-x64-workstation-dgpu-linux-amd64:1.0..
Error response from daemon: pull access denied for mednist_app-x64-workstation-dgpu-linux-amd64, repository does not exist or may require 'docker login': denied: requested access to the resource is denied
[2025-04-22 10:01:01,166] [ERROR] (common) - The docker command executed was `/usr/bin/docker image pull mednist_app-x64-workstation-dgpu-linux-amd64:1.0`.
It returned with code 1
The content of stdout can be found above the stacktrace (it wasn't captured).
The content of stderr can be found above the stacktrace (it wasn't captured).
[2025-04-22 10:01:01,166] [ERROR] (runner) - Unable to fetch required image.
[2025-04-22 10:01:01,167] [ERROR] (runner) - Execution Aborted
!cat {output_folder}/output.json
cat: output/output.json: No such file or directory

Implementing and Packaging Application with MONAI Deploy App SDK

In the following sections we will discuss the details of building the application that was packaged and run above.

Based on the TorchScript model (classifier.zip), we will implement an application that processes an input JPEG image and writes the prediction (classification) result to a JSON file (output.json).

In our inference application, we will define two operators:

  1. LoadPILOperator - Loads a JPEG image from the input path and passes the loaded image object to the next operator.

    • Input: a file path (Path)

    • Output: an image object in memory (Image)

  2. MedNISTClassifierOperator - Pre-transforms the given image by using MONAI’s Compose class, feeds it to the TorchScript model (classifier.zip), and writes the prediction into a JSON file (output.json).

    • Pre-transforms consist of three transforms: EnsureChannelFirst, ScaleIntensity, and EnsureType (see the short sketch after this list).

    • Input: an image object in memory (Image)

    • Output: a folder path to which the prediction result (output.json) is written (Path)
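
The pre-transform chain can be tried on its own with a dummy array to see the shape and dtype changes the classifier expects. This is an illustrative sketch only (the random array simply stands in for a 64x64 MedNIST image).

import numpy as np
from monai.transforms import Compose, EnsureChannelFirst, EnsureType, ScaleIntensity

# Same pre-transform chain as the classifier operator below, applied to a dummy image.
transform = Compose([EnsureChannelFirst(channel_dim="no_channel"), ScaleIntensity(), EnsureType()])

dummy = np.random.randint(0, 255, size=(64, 64), dtype=np.uint8)  # stands in for a MedNIST image
tensor = transform(dummy)       # channel-first tensor, shape (1, 64, 64)
batch = tensor[None].float()    # add batch dimension, shape (1, 1, 64, 64), float32
print(batch.shape, batch.dtype)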

The workflow of the application would look like this.

Workflow

Setup imports

Let’s import necessary classes/decorators and define MEDNIST_CLASSES.

import logging
import os
from pathlib import Path
from typing import Optional

import torch

from monai.deploy.conditions import CountCondition
from monai.deploy.core import AppContext, Application, ConditionType, Fragment, Image, Operator, OperatorSpec
from monai.deploy.operators.dicom_text_sr_writer_operator import DICOMTextSRWriterOperator, EquipmentInfo, ModelInfo
from monai.transforms import EnsureChannelFirst, Compose, EnsureType, ScaleIntensity

MEDNIST_CLASSES = ["AbdomenCT", "BreastMRI", "CXR", "ChestCT", "Hand", "HeadCT"]

Creating Operator classes

LoadPILOperator

class LoadPILOperator(Operator):
    """Load image from the given input (DataPath) and set numpy array to the output (Image)."""

    DEFAULT_INPUT_FOLDER = Path.cwd() / "input"
    DEFAULT_OUTPUT_NAME = "image"

    # For now, the input folder needs to be an instance attribute, set on init.
    # If dynamically changing the input folder per compute, then use an (optional) input port to convey the
    # value of the input folder, which is then emitted by an upstream operator.
    def __init__(
        self,
        fragment: Fragment,
        *args,
        input_folder: Path = DEFAULT_INPUT_FOLDER,
        output_name: str = DEFAULT_OUTPUT_NAME,
        **kwargs,
    ):
        """Creates an loader object with the input folder and the output port name overrides as needed.

        Args:
            fragment (Fragment): An instance of the Application class which is derived from Fragment.
            input_folder (Path): Folder from which to load input file(s).
                                 Defaults to `input` in the current working directory.
            output_name (str): Name of the output port, which is an image object. Defaults to `image`.
        """

        self._logger = logging.getLogger("{}.{}".format(__name__, type(self).__name__))
        self.input_path = input_folder
        self.index = 0
        self.output_name_image = (
            output_name.strip() if output_name and len(output_name.strip()) > 0 else LoadPILOperator.DEFAULT_OUTPUT_NAME
        )

        super().__init__(fragment, *args, **kwargs)

    def setup(self, spec: OperatorSpec):
        """Set up the named input and output port(s)"""
        spec.output(self.output_name_image)

    def compute(self, op_input, op_output, context):
        import numpy as np
        from PIL import Image as PILImage

        # Input path is stored in the object attribute, but could change to use a named port if need be.
        input_path = self.input_path
        if input_path.is_dir():
            input_path = next(self.input_path.glob("*.*"))  # take the first file

        image = PILImage.open(input_path)
        image = image.convert("L")  # convert to greyscale image
        image_arr = np.asarray(image)

        output_image = Image(image_arr)  # create Image domain object with a numpy array
        op_output.emit(output_image, self.output_name_image)  # cannot omit the name even if single output.

MedNISTClassifierOperator

class MedNISTClassifierOperator(Operator):
    """Classifies the given image and returns the class name.

    Named inputs:
        image: Image object for which to generate the classification.
        output_folder: Optional, the path to save the results JSON file, overriding the one set on __init__.

    Named output:
        result_text: The classification results in text.
    """

    DEFAULT_OUTPUT_FOLDER = Path.cwd() / "classification_results"
    # For testing the app directly, the model should be at the following path.
    MODEL_LOCAL_PATH = Path(os.environ.get("HOLOSCAN_MODEL_PATH", Path.cwd() / "model/model.ts"))

    def __init__(
        self,
        fragment: Fragment,
        *args,
        app_context: AppContext,
        model_name: Optional[str] = "",
        model_path: Path = MODEL_LOCAL_PATH,
        output_folder: Path = DEFAULT_OUTPUT_FOLDER,
        **kwargs,
    ):
        """Creates an instance with the reference back to the containing application/fragment.

        fragment (Fragment): An instance of the Application class which is derived from Fragment.
        model_name (str, optional): Name of the model. Defaults to "" for a single-model app.
        model_path (Path): Path to the model file. Defaults to model/model.ts in the current working directory.
        output_folder (Path, optional): output folder for saving the classification results JSON file.
        """

        # the names used for the model inference input and output
        self._input_dataset_key = "image"
        self._pred_dataset_key = "pred"

        # The names used for the operator input and output
        self.input_name_image = "image"
        self.output_name_result = "result_text"

        # The name of the optional input port for passing data to override the output folder path.
        self.input_name_output_folder = "output_folder"

        # The output folder set on the object can be overridden at each compute by data in the optional named input
        self.output_folder = output_folder

        # Need the name when there are multiple models loaded
        self._model_name = model_name.strip() if isinstance(model_name, str) else ""
        # Need the path to load the models when they are not loaded in the execution context
        self.model_path = model_path
        self.app_context = app_context
        self.model = self._get_model(self.app_context, self.model_path, self._model_name)

        # This needs to be at the end of the constructor.
        super().__init__(fragment, *args, **kwargs)

    def _get_model(self, app_context: AppContext, model_path: Path, model_name: str):
        """Load the model with the given name from context or model path

        Args:
            app_context (AppContext): The application context object holding the model(s)
            model_path (Path): The path to the model file, as a backup to load model directly
            model_name (str): The name of the model, when multiples are loaded in the context
        """

        if app_context.models:
            # `app_context.models.get(model_name)` returns a model instance if exists.
            # If model_name is not specified and only one model exists, it returns that model.
            model = app_context.models.get(model_name)
        else:
            model = torch.jit.load(
                MedNISTClassifierOperator.MODEL_LOCAL_PATH,
                map_location=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
            )

        return model

    def setup(self, spec: OperatorSpec):
        """Set up the operator named input and named output, both are in-memory objects."""

        spec.input(self.input_name_image)
        spec.input(self.input_name_output_folder).condition(ConditionType.NONE)  # Optional for overriding.
        spec.output(self.output_name_result).condition(ConditionType.NONE)  # Not forcing a downstream receiver.

    @property
    def transform(self):
        return Compose([EnsureChannelFirst(channel_dim="no_channel"), ScaleIntensity(), EnsureType()])

    def compute(self, op_input, op_output, context):
        import json

        import torch

        img = op_input.receive(self.input_name_image).asnumpy()  # (64, 64), uint8. Input validation can be added.
        image_tensor = self.transform(img)  # (1, 64, 64), torch.float64
        image_tensor = image_tensor[None].float()  # (1, 1, 64, 64), torch.float32

        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        image_tensor = image_tensor.to(device)

        with torch.no_grad():
            outputs = self.model(image_tensor)

        _, output_classes = outputs.max(dim=1)

        result = MEDNIST_CLASSES[output_classes[0]]  # get the class name
        print(result)
        op_output.emit(result, self.output_name_result)

        # Get output folder, with value in optional input port overriding the obj attribute
        output_folder_on_compute = op_input.receive(self.input_name_output_folder) or self.output_folder
        Path.mkdir(output_folder_on_compute, parents=True, exist_ok=True)  # Let exception bubble up if raised.
        output_path = output_folder_on_compute / "output.json"
        with open(output_path, "w") as fp:
            json.dump(result, fp)

Creating Application class

Our application class is shown below.

It defines the App class, which inherits from the Application class.

LoadPILOperator is connected to MedNISTClassifierOperator by using self.add_flow() in the compose() method of App.

class App(Application):
    """Application class for the MedNIST classifier."""

    def compose(self):
        app_context = Application.init_app_context({})  # Do not pass argv in Jupyter Notebook
        app_input_path = Path(app_context.input_path)
        app_output_path = Path(app_context.output_path)
        model_path = Path(app_context.model_path)
        load_pil_op = LoadPILOperator(self, CountCondition(self, 1), input_folder=app_input_path, name="pil_loader_op")
        classifier_op = MedNISTClassifierOperator(
            self, app_context=app_context, output_folder=app_output_path, model_path=model_path, name="classifier_op"
        )

        my_model_info = ModelInfo("MONAI WG Trainer", "MEDNIST Classifier", "0.1", "xyz")
        my_equipment = EquipmentInfo(manufacturer="MONAI Deploy App SDK", manufacturer_model="DICOM SR Writer")
        my_special_tags = {"SeriesDescription": "Not for clinical use. The result is for research use only."}
        dicom_sr_operator = DICOMTextSRWriterOperator(
            self,
            copy_tags=False,
            model_info=my_model_info,
            equipment_info=my_equipment,
            custom_tags=my_special_tags,
            output_folder=app_output_path,
        )

        self.add_flow(load_pil_op, classifier_op, {("image", "image")})
        self.add_flow(classifier_op, dicom_sr_operator, {("result_text", "text")})

Executing app locally

We can execute the app in the Jupyter notebook. Before doing so, we also need to clean the output folder, which was created by running the packaged containerized app in the previous cell.

!rm -rf $HOLOSCAN_OUTPUT_PATH
app = App().run()
[info] [fragment.cpp:705] Loading extensions from configs...
[2025-04-22 10:01:06,211] [INFO] (root) - Parsed args: Namespace(log_level=None, input=None, output=None, model=None, workdir=None, triton_server_netloc=None, argv=[])
[2025-04-22 10:01:06,224] [INFO] (root) - AppContext object: AppContext(input_path=input, output_path=output, model_path=models, workdir=), triton_server_netloc=
[info] [gxf_executor.cpp:265] Creating context
[info] [gxf_executor.cpp:2396] Activating Graph...
[info] [gxf_executor.cpp:2426] Running Graph...
[info] [gxf_executor.cpp:2428] Waiting for completion...
[info] [greedy_scheduler.cpp:191] Scheduling 3 entities
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages/monai/data/meta_tensor.py:116: UserWarning: The given NumPy array is not writable, and PyTorch does not support non-writable tensors. This means writing to this tensor will result in undefined behavior. You may want to copy the array to protect its data or make it writable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at /pytorch/torch/csrc/utils/tensor_numpy.cpp:203.)
  return torch.as_tensor(x, *args, **_kwargs).as_subclass(cls)
[2025-04-22 10:01:07,561] [WARNING] (pydicom) - 'Dataset.is_implicit_VR' will be removed in v4.0, set the Transfer Syntax UID or use the 'implicit_vr' argument with Dataset.save_as() or dcmwrite() instead
[2025-04-22 10:01:07,562] [WARNING] (pydicom) - 'Dataset.is_little_endian' will be removed in v4.0, set the Transfer Syntax UID or use the 'little_endian' argument with Dataset.save_as() or dcmwrite() instead
[2025-04-22 10:01:07,565] [WARNING] (pydicom) - Invalid value for VR UI: 'xyz'. Please see <https://dicom.nema.org/medical/dicom/current/output/html/part05.html#table_6.2-1> for allowed values for each VR.
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages/pydicom/valuerep.py:440: UserWarning: Invalid value for VR UI: 'xyz'. Please see <https://dicom.nema.org/medical/dicom/current/output/html/part05.html#table_6.2-1> for allowed values for each VR.
  warn_and_log(msg)
[2025-04-22 10:01:07,575] [WARNING] (pydicom) - 'write_like_original' is deprecated and will be removed in v4.0, please use 'enforce_file_format' instead
[2025-04-22 10:01:07,581] [INFO] (root) - Finished writing DICOM instance to file output/1.2.826.0.1.3680043.8.498.59762034317112105131069375575619402726.dcm
[2025-04-22 10:01:07,585] [INFO] (monai.deploy.operators.dicom_text_sr_writer_operator.DICOMTextSRWriterOperator) - DICOM SOP instance saved in output/1.2.826.0.1.3680043.8.498.59762034317112105131069375575619402726.dcm
AbdomenCT
[info] [greedy_scheduler.cpp:372] Scheduler stopped: Some entities are waiting for execution, but there are no periodic or async entities to get out of the deadlock.
[info] [greedy_scheduler.cpp:401] Scheduler finished.
[info] [gxf_executor.cpp:2431] Deactivating Graph...
[info] [gxf_executor.cpp:2439] Graph execution finished.
[info] [gxf_executor.cpp:295] Destroying context
!cat $HOLOSCAN_OUTPUT_PATH/output.json
"AbdomenCT"

Once the application is verified inside the Jupyter notebook, we can write the whole application to a file (mednist_classifier_monaideploy.py) by concatenating the code above, then adding the following lines:

if __name__ == "__main__":
    App().run()

The above lines are needed to execute the application code with the Python interpreter.

# Create an application folder
!mkdir -p mednist_app && rm -rf mednist_app/*
%%writefile mednist_app/mednist_classifier_monaideploy.py

# Copyright 2021-2023 MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import logging
import os
from pathlib import Path
from typing import Optional

import torch

from monai.deploy.conditions import CountCondition
from monai.deploy.core import AppContext, Application, ConditionType, Fragment, Image, Operator, OperatorSpec
from monai.deploy.operators.dicom_text_sr_writer_operator import DICOMTextSRWriterOperator, EquipmentInfo, ModelInfo
from monai.transforms import EnsureChannelFirst, Compose, EnsureType, ScaleIntensity

MEDNIST_CLASSES = ["AbdomenCT", "BreastMRI", "CXR", "ChestCT", "Hand", "HeadCT"]


# @md.env(pip_packages=["pillow"])
class LoadPILOperator(Operator):
    """Load image from the given input (DataPath) and set numpy array to the output (Image)."""

    DEFAULT_INPUT_FOLDER = Path.cwd() / "input"
    DEFAULT_OUTPUT_NAME = "image"

    # For now, the input folder needs to be an instance attribute, set on init.
    # If dynamically changing the input folder per compute, then use an (optional) input port to convey the
    # value of the input folder, which is then emitted by an upstream operator.
    def __init__(
        self,
        fragment: Fragment,
        *args,
        input_folder: Path = DEFAULT_INPUT_FOLDER,
        output_name: str = DEFAULT_OUTPUT_NAME,
        **kwargs,
    ):
        """Creates an loader object with the input folder and the output port name overrides as needed.

        Args:
            fragment (Fragment): An instance of the Application class which is derived from Fragment.
            input_folder (Path): Folder from which to load input file(s).
                                 Defaults to `input` in the current working directory.
            output_name (str): Name of the output port, which is an image object. Defaults to `image`.
        """

        self._logger = logging.getLogger("{}.{}".format(__name__, type(self).__name__))
        self.input_path = input_folder
        self.index = 0
        self.output_name_image = (
            output_name.strip() if output_name and len(output_name.strip()) > 0 else LoadPILOperator.DEFAULT_OUTPUT_NAME
        )

        super().__init__(fragment, *args, **kwargs)

    def setup(self, spec: OperatorSpec):
        """Set up the named input and output port(s)"""
        spec.output(self.output_name_image)

    def compute(self, op_input, op_output, context):
        import numpy as np
        from PIL import Image as PILImage

        # Input path is stored in the object attribute, but could change to use a named port if need be.
        input_path = self.input_path
        if input_path.is_dir():
            input_path = next(self.input_path.glob("*.*"))  # take the first file

        image = PILImage.open(input_path)
        image = image.convert("L")  # convert to greyscale image
        image_arr = np.asarray(image)

        output_image = Image(image_arr)  # create Image domain object with a numpy array
        op_output.emit(output_image, self.output_name_image)  # cannot omit the name even if single output.


# @md.env(pip_packages=["monai"])
class MedNISTClassifierOperator(Operator):
    """Classifies the given image and returns the class name.

    Named inputs:
        image: Image object for which to generate the classification.
        output_folder: Optional, the path to save the results JSON file, overriding the one set on __init__.

    Named output:
        result_text: The classification results in text.
    """

    DEFAULT_OUTPUT_FOLDER = Path.cwd() / "classification_results"
    # For testing the app directly, the model should be at the following path.
    MODEL_LOCAL_PATH = Path(os.environ.get("HOLOSCAN_MODEL_PATH", Path.cwd() / "model/model.ts"))

    def __init__(
        self,
        fragment: Fragment,
        *args,
        app_context: AppContext,
        model_name: Optional[str] = "",
        model_path: Path = MODEL_LOCAL_PATH,
        output_folder: Path = DEFAULT_OUTPUT_FOLDER,
        **kwargs,
    ):
        """Creates an instance with the reference back to the containing application/fragment.

        fragment (Fragment): An instance of the Application class which is derived from Fragment.
        model_name (str, optional): Name of the model. Defaults to "" for a single-model app.
        model_path (Path): Path to the model file. Defaults to model/model.ts in the current working directory.
        output_folder (Path, optional): output folder for saving the classification results JSON file.
        """

        # the names used for the model inference input and output
        self._input_dataset_key = "image"
        self._pred_dataset_key = "pred"

        # The names used for the operator input and output
        self.input_name_image = "image"
        self.output_name_result = "result_text"

        # The name of the optional input port for passing data to override the output folder path.
        self.input_name_output_folder = "output_folder"

        # The output folder set on the object can be overridden at each compute by data in the optional named input
        self.output_folder = output_folder

        # Need the name when there are multiple models loaded
        self._model_name = model_name.strip() if isinstance(model_name, str) else ""
        # Need the path to load the models when they are not loaded in the execution context
        self.model_path = model_path
        self.app_context = app_context
        self.model = self._get_model(self.app_context, self.model_path, self._model_name)

        # This needs to be at the end of the constructor.
        super().__init__(fragment, *args, **kwargs)

    def _get_model(self, app_context: AppContext, model_path: Path, model_name: str):
        """Load the model with the given name from context or model path

        Args:
            app_context (AppContext): The application context object holding the model(s)
            model_path (Path): The path to the model file, as a backup to load model directly
            model_name (str): The name of the model, when multiples are loaded in the context
        """

        if app_context.models:
            # `app_context.models.get(model_name)` returns a model instance if exists.
            # If model_name is not specified and only one model exists, it returns that model.
            model = app_context.models.get(model_name)
        else:
            model = torch.jit.load(
                MedNISTClassifierOperator.MODEL_LOCAL_PATH,
                map_location=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
            )

        return model

    def setup(self, spec: OperatorSpec):
        """Set up the operator named input and named output, both are in-memory objects."""

        spec.input(self.input_name_image)
        spec.input(self.input_name_output_folder).condition(ConditionType.NONE)  # Optional for overriding.
        spec.output(self.output_name_result).condition(ConditionType.NONE)  # Not forcing a downstream receiver.

    @property
    def transform(self):
        return Compose([EnsureChannelFirst(channel_dim="no_channel"), ScaleIntensity(), EnsureType()])

    def compute(self, op_input, op_output, context):
        import json

        import torch

        img = op_input.receive(self.input_name_image).asnumpy()  # (64, 64), uint8. Input validation can be added.
        image_tensor = self.transform(img)  # (1, 64, 64), torch.float64
        image_tensor = image_tensor[None].float()  # (1, 1, 64, 64), torch.float32

        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        image_tensor = image_tensor.to(device)

        with torch.no_grad():
            outputs = self.model(image_tensor)

        _, output_classes = outputs.max(dim=1)

        result = MEDNIST_CLASSES[output_classes[0]]  # get the class name
        print(result)
        op_output.emit(result, self.output_name_result)

        # Get output folder, with value in optional input port overriding the obj attribute
        output_folder_on_compute = op_input.receive(self.input_name_output_folder) or self.output_folder
        Path.mkdir(output_folder_on_compute, parents=True, exist_ok=True)  # Let exception bubble up if raised.
        output_path = output_folder_on_compute / "output.json"
        with open(output_path, "w") as fp:
            json.dump(result, fp)


# @md.resource(cpu=1, gpu=1, memory="1Gi")
class App(Application):
    """Application class for the MedNIST classifier."""

    def compose(self):
        # Use Commandline options over environment variables to init context.
        app_context = Application.init_app_context(self.argv)
        app_input_path = Path(app_context.input_path)
        app_output_path = Path(app_context.output_path)
        model_path = Path(app_context.model_path)
        load_pil_op = LoadPILOperator(self, CountCondition(self, 1), input_folder=app_input_path, name="pil_loader_op")
        classifier_op = MedNISTClassifierOperator(
            self, app_context=app_context, output_folder=app_output_path, model_path=model_path, name="classifier_op"
        )

        my_model_info = ModelInfo("MONAI WG Trainer", "MEDNIST Classifier", "0.1", "xyz")
        my_equipment = EquipmentInfo(manufacturer="MONAI Deploy App SDK", manufacturer_model="DICOM SR Writer")
        my_special_tags = {"SeriesDescription": "Not for clinical use. The result is for research use only."}
        dicom_sr_operator = DICOMTextSRWriterOperator(
            self,
            copy_tags=False,
            model_info=my_model_info,
            equipment_info=my_equipment,
            custom_tags=my_special_tags,
            output_folder=app_output_path,
        )

        self.add_flow(load_pil_op, classifier_op, {("image", "image")})
        self.add_flow(classifier_op, dicom_sr_operator, {("result_text", "text")})


if __name__ == "__main__":
    App().run()
Writing mednist_app/mednist_classifier_monaideploy.py

This time, let’s execute the app on the command line.

Note

Since the environment variables have been set and contain the correct paths, it is not necessary to provide the command line options when running the application; the following nevertheless demonstrates the use of the options.

!python "mednist_app/mednist_classifier_monaideploy.py" -i {input_folder} -o {output_folder} -m {models_folder} -l DEBUG
[info] [fragment.cpp:705] Loading extensions from configs...
[2025-04-22 10:01:12,273] [INFO] (root) - Parsed args: Namespace(log_level='DEBUG', input=PosixPath('/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/input'), output=PosixPath('/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output'), model=PosixPath('/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models'), workdir=None, triton_server_netloc=None, argv=['mednist_app/mednist_classifier_monaideploy.py', '-i', 'input', '-o', 'output', '-m', 'models', '-l', 'DEBUG'])
[2025-04-22 10:01:12,278] [INFO] (root) - AppContext object: AppContext(input_path=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/input, output_path=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output, model_path=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models, workdir=), triton_server_netloc=
[info] [gxf_executor.cpp:265] Creating context
[info] [gxf_executor.cpp:2396] Activating Graph...
[info] [gxf_executor.cpp:2426] Running Graph...
[info] [gxf_executor.cpp:2428] Waiting for completion...
[info] [greedy_scheduler.cpp:191] Scheduling 3 entities
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages/monai/data/meta_tensor.py:116: UserWarning: The given NumPy array is not writable, and PyTorch does not support non-writable tensors. This means writing to this tensor will result in undefined behavior. You may want to copy the array to protect its data or make it writable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at /pytorch/torch/csrc/utils/tensor_numpy.cpp:203.)
  return torch.as_tensor(x, *args, **_kwargs).as_subclass(cls)
AbdomenCT
[2025-04-22 10:01:13,572] [DEBUG] (monai.deploy.operators.dicom_text_sr_writer_operator.DICOMTextSRWriterOperator) - Writing DICOM object...

[2025-04-22 10:01:13,572] [DEBUG] (root) - Writing DICOM common modules...
[2025-04-22 10:01:13,573] [WARNING] (pydicom) - 'Dataset.is_implicit_VR' will be removed in v4.0, set the Transfer Syntax UID or use the 'implicit_vr' argument with Dataset.save_as() or dcmwrite() instead
[2025-04-22 10:01:13,573] [WARNING] (pydicom) - 'Dataset.is_little_endian' will be removed in v4.0, set the Transfer Syntax UID or use the 'little_endian' argument with Dataset.save_as() or dcmwrite() instead
[2025-04-22 10:01:13,574] [WARNING] (pydicom) - Invalid value for VR UI: 'xyz'. Please see <https://dicom.nema.org/medical/dicom/current/output/html/part05.html#table_6.2-1> for allowed values for each VR.
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.10/site-packages/pydicom/valuerep.py:440: UserWarning: Invalid value for VR UI: 'xyz'. Please see <https://dicom.nema.org/medical/dicom/current/output/html/part05.html#table_6.2-1> for allowed values for each VR.
  warn_and_log(msg)
[2025-04-22 10:01:13,576] [DEBUG] (root) - DICOM common modules written:
Dataset.file_meta -------------------------------
(0002,0000) File Meta Information Group Length  UL: 198
(0002,0001) File Meta Information Version       OB: b'01'
(0002,0002) Media Storage SOP Class UID         UI: Basic Text SR Storage
(0002,0003) Media Storage SOP Instance UID      UI: 1.2.826.0.1.3680043.8.498.41171981535561245877202758927925418229
(0002,0010) Transfer Syntax UID                 UI: Implicit VR Little Endian
(0002,0012) Implementation Class UID            UI: 1.2.40.0.13.1.1.1
(0002,0013) Implementation Version Name         SH: '0.5.1+37.g96f7e'
-------------------------------------------------
(0008,0005) Specific Character Set              CS: 'ISO_IR 100'
(0008,0012) Instance Creation Date              DA: '20250422'
(0008,0013) Instance Creation Time              TM: '100113'
(0008,0016) SOP Class UID                       UI: Basic Text SR Storage
(0008,0018) SOP Instance UID                    UI: 1.2.826.0.1.3680043.8.498.41171981535561245877202758927925418229
(0008,0020) Study Date                          DA: '20250422'
(0008,0021) Series Date                         DA: '20250422'
(0008,0023) Content Date                        DA: '20250422'
(0008,002A) Acquisition DateTime                DT: '20250422100113'
(0008,0030) Study Time                          TM: '100113'
(0008,0031) Series Time                         TM: '100113'
(0008,0033) Content Time                        TM: '100113'
(0008,0050) Accession Number                    SH: ''
(0008,0060) Modality                            CS: 'SR'
(0008,0070) Manufacturer                        LO: 'MOANI Deploy App SDK'
(0008,0090) Referring Physician's Name          PN: ''
(0008,0201) Timezone Offset From UTC            SH: '-0700'
(0008,1030) Study Description                   LO: 'AI results.'
(0008,103E) Series Description                  LO: 'CAUTION: Not for Diagnostic Use, for research use only.'
(0008,1090) Manufacturer's Model Name           LO: 'DICOM SR Writer'
(0010,0010) Patient's Name                      PN: ''
(0010,0020) Patient ID                          LO: ''
(0010,0021) Issuer of Patient ID                LO: ''
(0010,0030) Patient's Birth Date                DA: ''
(0010,0040) Patient's Sex                       CS: ''
(0018,0015) Body Part Examined                  CS: ''
(0018,1020) Software Versions                   LO: '0.5.1+37.g96f7e'
(0018,A001)  Contributing Equipment Sequence  1 item(s) ---- 
   (0008,0070) Manufacturer                        LO: 'MONAI WG Trainer'
   (0008,1090) Manufacturer's Model Name           LO: 'MEDNIST Classifier'
   (0018,1002) Device UID                          UI: xyz
   (0018,1020) Software Versions                   LO: '0.1'
   (0040,A170)  Purpose of Reference Code Sequence  1 item(s) ---- 
      (0008,0100) Code Value                          SH: 'Newcode1'
      (0008,0102) Coding Scheme Designator            SH: '99IHE'
      (0008,0104) Code Meaning                        LO: '"Processing Algorithm'
      ---------
   ---------
(0020,000D) Study Instance UID                  UI: 1.2.826.0.1.3680043.8.498.21427650624285250793329047854027764031
(0020,000E) Series Instance UID                 UI: 1.2.826.0.1.3680043.8.498.53141607669515853472048821908030378483
(0020,0010) Study ID                            SH: '1'
(0020,0011) Series Number                       IS: '1679'
(0020,0013) Instance Number                     IS: '1'
(0040,1001) Requested Procedure ID              SH: ''
[2025-04-22 10:01:13,577] [DEBUG] (root) - DICOM dataset to be written:Dataset.file_meta -------------------------------
(0002,0000) File Meta Information Group Length  UL: 198
(0002,0001) File Meta Information Version       OB: b'01'
(0002,0002) Media Storage SOP Class UID         UI: Basic Text SR Storage
(0002,0003) Media Storage SOP Instance UID      UI: 1.2.826.0.1.3680043.8.498.41171981535561245877202758927925418229
(0002,0010) Transfer Syntax UID                 UI: Implicit VR Little Endian
(0002,0012) Implementation Class UID            UI: 1.2.40.0.13.1.1.1
(0002,0013) Implementation Version Name         SH: '0.5.1+37.g96f7e'
-------------------------------------------------
(0008,0005) Specific Character Set              CS: 'ISO_IR 100'
(0008,0012) Instance Creation Date              DA: '20250422'
(0008,0013) Instance Creation Time              TM: '100113'
(0008,0016) SOP Class UID                       UI: Basic Text SR Storage
(0008,0018) SOP Instance UID                    UI: 1.2.826.0.1.3680043.8.498.41171981535561245877202758927925418229
(0008,0020) Study Date                          DA: '20250422'
(0008,0021) Series Date                         DA: '20250422'
(0008,0023) Content Date                        DA: '20250422'
(0008,002A) Acquisition DateTime                DT: '20250422100113'
(0008,0030) Study Time                          TM: '100113'
(0008,0031) Series Time                         TM: '100113'
(0008,0033) Content Time                        TM: '100113'
(0008,0050) Accession Number                    SH: ''
(0008,0060) Modality                            CS: 'SR'
(0008,0070) Manufacturer                        LO: 'MOANI Deploy App SDK'
(0008,0090) Referring Physician's Name          PN: ''
(0008,0201) Timezone Offset From UTC            SH: '-0700'
(0008,1030) Study Description                   LO: 'AI results.'
(0008,103E) Series Description                  LO: 'Not for clinical use. The result is for research use only.'
(0008,1090) Manufacturer's Model Name           LO: 'DICOM SR Writer'
(0010,0010) Patient's Name                      PN: ''
(0010,0020) Patient ID                          LO: ''
(0010,0021) Issuer of Patient ID                LO: ''
(0010,0030) Patient's Birth Date                DA: ''
(0010,0040) Patient's Sex                       CS: ''
(0018,0015) Body Part Examined                  CS: ''
(0018,1020) Software Versions                   LO: '0.5.1+37.g96f7e'
(0018,A001)  Contributing Equipment Sequence  1 item(s) ---- 
   (0008,0070) Manufacturer                        LO: 'MONAI WG Trainer'
   (0008,1090) Manufacturer's Model Name           LO: 'MEDNIST Classifier'
   (0018,1002) Device UID                          UI: xyz
   (0018,1020) Software Versions                   LO: '0.1'
   (0040,A170)  Purpose of Reference Code Sequence  1 item(s) ---- 
      (0008,0100) Code Value                          SH: 'Newcode1'
      (0008,0102) Coding Scheme Designator            SH: '99IHE'
      (0008,0104) Code Meaning                        LO: '"Processing Algorithm'
      ---------
   ---------
(0020,000D) Study Instance UID                  UI: 1.2.826.0.1.3680043.8.498.21427650624285250793329047854027764031
(0020,000E) Series Instance UID                 UI: 1.2.826.0.1.3680043.8.498.53141607669515853472048821908030378483
(0020,0010) Study ID                            SH: '1'
(0020,0011) Series Number                       IS: '1679'
(0020,0013) Instance Number                     IS: '1'
(0040,1001) Requested Procedure ID              SH: ''
(0040,A040) Value Type                          CS: 'CONTAINER'
(0040,A043)  Concept Name Code Sequence  1 item(s) ---- 
   (0008,0100) Code Value                          SH: '18748-4'
   (0008,0102) Coding Scheme Designator            SH: 'LN'
   (0008,0104) Code Meaning                        LO: 'Diagnostic Imaging Report'
   ---------
(0040,A050) Continuity Of Content               CS: 'SEPARATE'
(0040,A493) Verification Flag                   CS: 'UNVERIFIED'
(0040,A730)  Content Sequence  1 item(s) ---- 
   (0040,A010) Relationship Type                   CS: 'CONTAINS'
   (0040,A040) Value Type                          CS: 'TEXT'
   (0040,A043)  Concept Name Code Sequence  1 item(s) ---- 
      (0008,0100) Code Value                          SH: '111412'
      (0008,0102) Coding Scheme Designator            SH: 'DCM'
      (0008,0104) Code Meaning                        LO: 'Narrative Summary'
      ---------
   (0040,A160) Text Value                          UT: 'AbdomenCT'
   ---------
[2025-04-22 10:01:13,577] [WARNING] (pydicom) - 'write_like_original' is deprecated and will be removed in v4.0, please use 'enforce_file_format' instead
[2025-04-22 10:01:13,580] [INFO] (root) - Finished writing DICOM instance to file /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output/1.2.826.0.1.3680043.8.498.41171981535561245877202758927925418229.dcm
[2025-04-22 10:01:13,581] [INFO] (monai.deploy.operators.dicom_text_sr_writer_operator.DICOMTextSRWriterOperator) - DICOM SOP instance saved in /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output/1.2.826.0.1.3680043.8.498.41171981535561245877202758927925418229.dcm
[info] [greedy_scheduler.cpp:372] Scheduler stopped: Some entities are waiting for execution, but there are no periodic or async entities to get out of the deadlock.
[info] [greedy_scheduler.cpp:401] Scheduler finished.
[info] [gxf_executor.cpp:2431] Deactivating Graph...
[info] [gxf_executor.cpp:2439] Graph execution finished.
[info] [gxf_executor.cpp:295] Destroying context
!cat {output_folder}/output.json
"AbdomenCT"

Additional files required for packaging the app (creating MAP Docker image)

In this version of the App SDK, we need to write out the configuration YAML file as well as the package requirements file in the application folder.

%%writefile mednist_app/app.yaml
%YAML 1.2
---
application:
  title: MONAI Deploy App Package - MedNIST Classifier App
  version: 1.0
  inputFormats: ["file"]
  outputFormats: ["file"]

resources:
  cpu: 1
  gpu: 1
  memory: 1Gi
  gpuMemory: 1Gi
Writing mednist_app/app.yaml
%%writefile mednist_app/requirements.txt
monai>=1.2.0
Pillow>=8.4.0
pydicom>=2.3.0
highdicom>=0.18.2
SimpleITK>=2.0.0
setuptools>=59.5.0 # for pkg_resources
Writing mednist_app/requirements.txt

By now, we have built the application and prepared all the necessary files for creating the MONAI Application Package (MAP).