Creating a Deploy App with MONAI Deploy App SDK and MONAI Bundle

This tutorial shows how to create an application for organ segmentation using a PyTorch model that has been trained with MONAI and packaged in the MONAI Bundle format.

Deploying AI models requires integration with the clinical imaging network, even if just in a for-research-use setting. This means that the AI deploy application will need to support standards-based imaging protocols, specifically, for radiological imaging, the DICOM protocol.

Typically, DICOM network communication, whether over the DICOM TCP/IP network protocol or DICOMweb, is handled by dedicated DICOM devices or services, e.g. the MONAI Deploy Informatics Gateway, so the deploy application itself only needs to use DICOM Part 10 files as input and save the AI result in DICOM Part 10 file(s). For segmentation use cases, the DICOM instance file for AI results could be a DICOM Segmentation object or a DICOM RT Structure Set, and for classification, a DICOM Structured Report and/or DICOM Encapsulated PDF.

DICOM instances received from modalities and the Picture Archiving and Communication System (PACS) often comprise a whole DICOM study, so an AI deploy application has to deal with a whole DICOM study with multiple series, whose image spacing may not be the same as expected by the trained model. To address these cases consistently and efficiently, the MONAI Deploy App SDK provides classes, called operators, to parse DICOM studies, select specific series with application-defined rules, and convert the selected DICOM series into a domain-specific image format along with metadata representing the pertinent DICOM attributes. The image is then further processed in the pre-processing stage to normalize spacing, orientation, intensity, etc., before the pixel data, as tensors, is used for inference.
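
For illustration only, typical MONAI pre-processing of this kind might look like the sketch below. In this tutorial the MONAI Bundle inference operator applies the bundle's own pre-processing automatically, so the transforms and parameter values here are generic examples rather than the bundle's configuration, and the input file path is hypothetical.

from monai.transforms import (
    Compose,
    EnsureChannelFirstd,
    LoadImaged,
    Orientationd,
    ScaleIntensityRanged,
    Spacingd,
)

# Example pre-processing chain: normalize orientation, spacing, and intensity.
pre_transforms = Compose(
    [
        LoadImaged(keys="image"),
        EnsureChannelFirstd(keys="image"),
        Orientationd(keys="image", axcodes="RAS"),
        Spacingd(keys="image", pixdim=(1.5, 1.5, 2.0), mode="bilinear"),
        ScaleIntensityRanged(keys="image", a_min=-57, a_max=164, b_min=0.0, b_max=1.0, clip=True),
    ]
)

# Hypothetical NIfTI file; the resulting "image" entry is the tensor fed to the model.
data = pre_transforms({"image": "ct_volume.nii.gz"})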

In the following sections, we will demonstrate how to create a MONAI Deploy application package using the MONAI Deploy App SDK and, importantly, how to use the built-in MONAI Bundle Inference Operator to perform inference with the Spleen CT Segmentation PyTorch model in a MONAI Bundle.

Note

For local testing, if DICOM Part 10 files are not available, one can use open-source programs, e.g. 3D Slicer, to convert a NIfTI file to a DICOM series.
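
Alternatively, the conversion can be scripted. The sketch below is adapted from the SimpleITK series-writing example; the input file name ct_volume.nii.gz is hypothetical, and only the minimal tags needed for a loadable test series are written, so it is suitable for test data only.

import os
import time

import SimpleITK as sitk

img = sitk.Cast(sitk.ReadImage("ct_volume.nii.gz"), sitk.sitkInt16)  # hypothetical input file
writer = sitk.ImageFileWriter()
writer.KeepOriginalImageUIDOn()

mod_date, mod_time = time.strftime("%Y%m%d"), time.strftime("%H%M%S")
direction = img.GetDirection()
series_tags = [
    ("0008|0060", "CT"),  # Modality
    ("0008|0021", mod_date),  # Series Date
    ("0008|0031", mod_time),  # Series Time
    ("0020|000e", "1.2.826.0.1.3680043.2.1125." + mod_date + ".1" + mod_time),  # Series Instance UID
    ("0020|0037", "\\".join(  # Image Orientation (Patient)
        map(str, (direction[0], direction[3], direction[6], direction[1], direction[4], direction[7])))),
]

# Write one DICOM file per slice; note that intensity rescale tags are not set here.
os.makedirs("dcm_out", exist_ok=True)
for i in range(img.GetDepth()):
    slice_i = img[:, :, i]
    for tag, value in series_tags:
        slice_i.SetMetaData(tag, value)
    slice_i.SetMetaData("0020|0032", "\\".join(map(str, img.TransformIndexToPhysicalPoint((0, 0, i)))))
    slice_i.SetMetaData("0020|0013", str(i + 1))  # Instance Number
    writer.SetFileName(os.path.join("dcm_out", f"{i:04d}.dcm"))
    writer.Execute(slice_i)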

To make running this example simpler, the DICOM files and the Spleen CT Segmentation MONAI Bundle, published in MONAI Model Zoo, have been packaged and shared on Google Drive.

Creating Operators and connecting them in the Application class

We will implement an application that consists of five Operators:

  • DICOMDataLoaderOperator:

    • Input(dicom_files): a folder path (Path)

    • Output(dicom_study_list): a list of DICOM studies in memory (List[DICOMStudy])

  • DICOMSeriesSelectorOperator:

    • Input(dicom_study_list): a list of DICOM studies in memory (List[DICOMStudy])

    • Input(selection_rules): a selection rule (Dict)

    • Output(study_selected_series_list): a DICOM series object in memory (StudySelectedSeries)

  • DICOMSeriesToVolumeOperator:

    • Input(study_selected_series_list): a DICOM series object in memory (StudySelectedSeries)

    • Output(image): an image object in memory (Image)

  • MonaiBundleInferenceOperator:

    • Input(image): an image object in memory (Image)

    • Output(pred): an image object in memory (Image)

  • DICOMSegmentationWriterOperator:

    • Input(seg_image): a segmentation image object in memory (Image)

    • Input(study_selected_series_list): a DICOM series object in memory (StudySelectedSeries)

    • Output(dicom_seg_instance): a file path (Path)

Note

The DICOMSegmentationWriterOperator needs both the segmentation image and the original DICOM series metadata in order to use the patient demographics and the DICOM Study-level attributes.

The workflow of the application is illustrated below.

%%{init: {"theme": "base", "themeVariables": { "fontSize": "16px"}} }%%
classDiagram
    direction TB

    DICOMDataLoaderOperator --|> DICOMSeriesSelectorOperator : dicom_study_list...dicom_study_list
    DICOMSeriesSelectorOperator --|> DICOMSeriesToVolumeOperator : study_selected_series_list...study_selected_series_list
    DICOMSeriesToVolumeOperator --|> MonaiBundleInferenceOperator : image...image
    DICOMSeriesSelectorOperator --|> DICOMSegmentationWriterOperator : study_selected_series_list...study_selected_series_list
    MonaiBundleInferenceOperator --|> DICOMSegmentationWriterOperator : pred...seg_image

    class DICOMDataLoaderOperator {
        <in>dicom_files : DISK
        dicom_study_list(out) IN_MEMORY
    }
    class DICOMSeriesSelectorOperator {
        <in>dicom_study_list : IN_MEMORY
        <in>selection_rules : IN_MEMORY
        study_selected_series_list(out) IN_MEMORY
    }
    class DICOMSeriesToVolumeOperator {
        <in>study_selected_series_list : IN_MEMORY
        image(out) IN_MEMORY
    }
    class MonaiBundleInferenceOperator {
        <in>image : IN_MEMORY
        pred(out) IN_MEMORY
    }
    class DICOMSegmentationWriterOperator {
        <in>seg_image : IN_MEMORY
        <in>study_selected_series_list : IN_MEMORY
        dicom_seg_instance(out) DISK
    }

Setup environment

# Install MONAI and other necessary image processing packages for the application
!python -c "import monai" || pip install --upgrade -q "monai"
!python -c "import torch" || pip install -q "torch>=1.12.0"
!python -c "import numpy" || pip install -q "numpy>=1.21.6"
!python -c "import nibabel" || pip install -q "nibabel>=3.2.1"
!python -c "import pydicom" || pip install -q "pydicom>=2.3.0"
!python -c "import highdicom" || pip install -q "highdicom>=0.18.2"
!python -c "import SimpleITK" || pip install -q "SimpleITK>=2.0.0"

# Install MONAI Deploy App SDK package
!python -c "import holoscan" || pip install --upgrade -q "holoscan>=0.6.0"
!python -c "import monai.deploy" || pip install -q "monai-deploy-app-sdk"

Note: you may need to restart the Jupyter kernel to use the updated packages.
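
Optionally, the installed package versions can be verified with a quick check; the snippet below only reads the installed package metadata and can be skipped.

from importlib.metadata import version

for pkg in ("monai", "torch", "monai-deploy-app-sdk", "holoscan"):
    print(pkg, version(pkg))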

Download/Extract input and model/bundle files from Google Drive

# Download the test data and MONAI bundle zip file
!pip install gdown
!gdown "https://drive.google.com/uc?id=1Uds8mEvdGNYUuvFpTtCQ8gNU97bAPCaQ"

# After downloading the ai_spleen_seg_bundle_data zip file from the web browser or using gdown,
!unzip -o "ai_spleen_seg_bundle_data.zip"

# Need to copy the model.ts file to its own clean subfolder for packaging, to work around an issue in the Packager
models_folder = "models"
!rm -rf {models_folder} && mkdir -p {models_folder}/model && cp model.ts {models_folder}/model && ls {models_folder}/model
Requirement already satisfied: gdown in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (4.7.1)
Requirement already satisfied: filelock in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from gdown) (3.12.2)
Requirement already satisfied: requests[socks] in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from gdown) (2.31.0)
Requirement already satisfied: six in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from gdown) (1.16.0)
Requirement already satisfied: tqdm in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from gdown) (4.66.1)
Requirement already satisfied: beautifulsoup4 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from gdown) (4.12.2)
Requirement already satisfied: soupsieve>1.2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from beautifulsoup4->gdown) (2.4.1)
Requirement already satisfied: charset-normalizer<4,>=2 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from requests[socks]->gdown) (3.2.0)
Requirement already satisfied: idna<4,>=2.5 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from requests[socks]->gdown) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from requests[socks]->gdown) (2.0.4)
Requirement already satisfied: certifi>=2017.4.17 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from requests[socks]->gdown) (2023.7.22)
Requirement already satisfied: PySocks!=1.5.7,>=1.5.6 in /home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages (from requests[socks]->gdown) (1.7.1)
Downloading...
From (uriginal): https://drive.google.com/uc?id=1Uds8mEvdGNYUuvFpTtCQ8gNU97bAPCaQ
From (redirected): https://drive.google.com/uc?id=1Uds8mEvdGNYUuvFpTtCQ8gNU97bAPCaQ&confirm=t&uuid=9c85302e-fe27-4e68-8eb5-cc19a530a25c
To: /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/ai_spleen_seg_bundle_data.zip
100%|██████████████████████████████████████| 79.4M/79.4M [00:01<00:00, 67.3MB/s]
Archive:  ai_spleen_seg_bundle_data.zip
  inflating: dcm/1-001.dcm           
  inflating: dcm/1-002.dcm           
  inflating: dcm/1-003.dcm           
  inflating: dcm/1-004.dcm           
  inflating: dcm/1-005.dcm           
  inflating: dcm/1-006.dcm           
  inflating: dcm/1-007.dcm           
  inflating: dcm/1-008.dcm           
  inflating: dcm/1-009.dcm           
  inflating: dcm/1-010.dcm           
  inflating: dcm/1-011.dcm           
  inflating: dcm/1-012.dcm           
  inflating: dcm/1-013.dcm           
  inflating: dcm/1-014.dcm           
  inflating: dcm/1-015.dcm           
  inflating: dcm/1-016.dcm           
  inflating: dcm/1-017.dcm           
  inflating: dcm/1-018.dcm           
  inflating: dcm/1-019.dcm           
  inflating: dcm/1-020.dcm           
  inflating: dcm/1-021.dcm           
  inflating: dcm/1-022.dcm           
  inflating: dcm/1-023.dcm           
  inflating: dcm/1-024.dcm           
  inflating: dcm/1-025.dcm           
  inflating: dcm/1-026.dcm           
  inflating: dcm/1-027.dcm           
  inflating: dcm/1-028.dcm           
  inflating: dcm/1-029.dcm           
  inflating: dcm/1-030.dcm           
  inflating: dcm/1-031.dcm           
  inflating: dcm/1-032.dcm           
  inflating: dcm/1-033.dcm           
  inflating: dcm/1-034.dcm           
  inflating: dcm/1-035.dcm           
  inflating: dcm/1-036.dcm           
  inflating: dcm/1-037.dcm           
  inflating: dcm/1-038.dcm           
  inflating: dcm/1-039.dcm           
  inflating: dcm/1-040.dcm           
  inflating: dcm/1-041.dcm           
  inflating: dcm/1-042.dcm           
  inflating: dcm/1-043.dcm           
  inflating: dcm/1-044.dcm           
  inflating: dcm/1-045.dcm           
  inflating: dcm/1-046.dcm           
  inflating: dcm/1-047.dcm           
  inflating: dcm/1-048.dcm           
  inflating: dcm/1-049.dcm           
  inflating: dcm/1-050.dcm           
  inflating: dcm/1-051.dcm           
  inflating: dcm/1-052.dcm           
  inflating: dcm/1-053.dcm           
  inflating: dcm/1-054.dcm           
  inflating: dcm/1-055.dcm           
  inflating: dcm/1-056.dcm           
  inflating: dcm/1-057.dcm           
  inflating: dcm/1-058.dcm           
  inflating: dcm/1-059.dcm           
  inflating: dcm/1-060.dcm           
  inflating: dcm/1-061.dcm           
  inflating: dcm/1-062.dcm           
  inflating: dcm/1-063.dcm           
  inflating: dcm/1-064.dcm           
  inflating: dcm/1-065.dcm           
  inflating: dcm/1-066.dcm           
  inflating: dcm/1-067.dcm           
  inflating: dcm/1-068.dcm           
  inflating: dcm/1-069.dcm           
  inflating: dcm/1-070.dcm           
  inflating: dcm/1-071.dcm           
  inflating: dcm/1-072.dcm           
  inflating: dcm/1-073.dcm           
  inflating: dcm/1-074.dcm           
  inflating: dcm/1-075.dcm           
  inflating: dcm/1-076.dcm           
  inflating: dcm/1-077.dcm           
  inflating: dcm/1-078.dcm           
  inflating: dcm/1-079.dcm           
  inflating: dcm/1-080.dcm           
  inflating: dcm/1-081.dcm           
  inflating: dcm/1-082.dcm           
  inflating: dcm/1-083.dcm           
  inflating: dcm/1-084.dcm           
  inflating: dcm/1-085.dcm           
  inflating: dcm/1-086.dcm           
  inflating: dcm/1-087.dcm           
  inflating: dcm/1-088.dcm           
  inflating: dcm/1-089.dcm           
  inflating: dcm/1-090.dcm           
  inflating: dcm/1-091.dcm           
  inflating: dcm/1-092.dcm           
  inflating: dcm/1-093.dcm           
  inflating: dcm/1-094.dcm           
  inflating: dcm/1-095.dcm           
  inflating: dcm/1-096.dcm           
  inflating: dcm/1-097.dcm           
  inflating: dcm/1-098.dcm           
  inflating: dcm/1-099.dcm           
  inflating: dcm/1-100.dcm           
  inflating: dcm/1-101.dcm           
  inflating: dcm/1-102.dcm           
  inflating: dcm/1-103.dcm           
  inflating: dcm/1-104.dcm           
  inflating: dcm/1-105.dcm           
  inflating: dcm/1-106.dcm           
  inflating: dcm/1-107.dcm           
  inflating: dcm/1-108.dcm           
  inflating: dcm/1-109.dcm           
  inflating: dcm/1-110.dcm           
  inflating: dcm/1-111.dcm           
  inflating: dcm/1-112.dcm           
  inflating: dcm/1-113.dcm           
  inflating: dcm/1-114.dcm           
  inflating: dcm/1-115.dcm           
  inflating: dcm/1-116.dcm           
  inflating: dcm/1-117.dcm           
  inflating: dcm/1-118.dcm           
  inflating: dcm/1-119.dcm           
  inflating: dcm/1-120.dcm           
  inflating: dcm/1-121.dcm           
  inflating: dcm/1-122.dcm           
  inflating: dcm/1-123.dcm           
  inflating: dcm/1-124.dcm           
  inflating: dcm/1-125.dcm           
  inflating: dcm/1-126.dcm           
  inflating: dcm/1-127.dcm           
  inflating: dcm/1-128.dcm           
  inflating: dcm/1-129.dcm           
  inflating: dcm/1-130.dcm           
  inflating: dcm/1-131.dcm           
  inflating: dcm/1-132.dcm           
  inflating: dcm/1-133.dcm           
  inflating: dcm/1-134.dcm           
  inflating: dcm/1-135.dcm           
  inflating: dcm/1-136.dcm           
  inflating: dcm/1-137.dcm           
  inflating: dcm/1-138.dcm           
  inflating: dcm/1-139.dcm           
  inflating: dcm/1-140.dcm           
  inflating: dcm/1-141.dcm           
  inflating: dcm/1-142.dcm           
  inflating: dcm/1-143.dcm           
  inflating: dcm/1-144.dcm           
  inflating: dcm/1-145.dcm           
  inflating: dcm/1-146.dcm           
  inflating: dcm/1-147.dcm           
  inflating: dcm/1-148.dcm           
  inflating: dcm/1-149.dcm           
  inflating: dcm/1-150.dcm           
  inflating: dcm/1-151.dcm           
  inflating: dcm/1-152.dcm           
  inflating: dcm/1-153.dcm           
  inflating: dcm/1-154.dcm           
  inflating: dcm/1-155.dcm           
  inflating: dcm/1-156.dcm           
  inflating: dcm/1-157.dcm           
  inflating: dcm/1-158.dcm           
  inflating: dcm/1-159.dcm           
  inflating: dcm/1-160.dcm           
  inflating: dcm/1-161.dcm           
  inflating: dcm/1-162.dcm           
  inflating: dcm/1-163.dcm           
  inflating: dcm/1-164.dcm           
  inflating: dcm/1-165.dcm           
  inflating: dcm/1-166.dcm           
  inflating: dcm/1-167.dcm           
  inflating: dcm/1-168.dcm           
  inflating: dcm/1-169.dcm           
  inflating: dcm/1-170.dcm           
  inflating: dcm/1-171.dcm           
  inflating: dcm/1-172.dcm           
  inflating: dcm/1-173.dcm           
  inflating: dcm/1-174.dcm           
  inflating: dcm/1-175.dcm           
  inflating: dcm/1-176.dcm           
  inflating: dcm/1-177.dcm           
  inflating: dcm/1-178.dcm           
  inflating: dcm/1-179.dcm           
  inflating: dcm/1-180.dcm           
  inflating: dcm/1-181.dcm           
  inflating: dcm/1-182.dcm           
  inflating: dcm/1-183.dcm           
  inflating: dcm/1-184.dcm           
  inflating: dcm/1-185.dcm           
  inflating: dcm/1-186.dcm           
  inflating: dcm/1-187.dcm           
  inflating: dcm/1-188.dcm           
  inflating: dcm/1-189.dcm           
  inflating: dcm/1-190.dcm           
  inflating: dcm/1-191.dcm           
  inflating: dcm/1-192.dcm           
  inflating: dcm/1-193.dcm           
  inflating: dcm/1-194.dcm           
  inflating: dcm/1-195.dcm           
  inflating: dcm/1-196.dcm           
  inflating: dcm/1-197.dcm           
  inflating: dcm/1-198.dcm           
  inflating: dcm/1-199.dcm           
  inflating: dcm/1-200.dcm           
  inflating: dcm/1-201.dcm           
  inflating: dcm/1-202.dcm           
  inflating: dcm/1-203.dcm           
  inflating: dcm/1-204.dcm           
  inflating: model.ts                
model.ts

Set up environment variables

%env HOLOSCAN_INPUT_PATH dcm
%env HOLOSCAN_MODEL_PATH {models_folder}
%env HOLOSCAN_OUTPUT_PATH output
env: HOLOSCAN_INPUT_PATH=dcm
env: HOLOSCAN_MODEL_PATH=models
env: HOLOSCAN_OUTPUT_PATH=output
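
Outside of Jupyter, the same variables can be set in plain Python before the application runs; a minimal equivalent of the %env magics above:

import os

# Equivalent to the %env magics above; set these before composing/running the application.
os.environ["HOLOSCAN_INPUT_PATH"] = "dcm"
os.environ["HOLOSCAN_MODEL_PATH"] = "models"
os.environ["HOLOSCAN_OUTPUT_PATH"] = "output"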

Set up imports

Let’s import necessary classes/decorators to define Application and Operator.

import logging
from pathlib import Path

# Required for setting SegmentDescription attributes. Direct import as this is not part of App SDK package.
from pydicom.sr.codedict import codes

from monai.deploy.conditions import CountCondition
from monai.deploy.core import AppContext, Application
from monai.deploy.core.domain import Image
from monai.deploy.core.io_type import IOType
from monai.deploy.operators.dicom_data_loader_operator import DICOMDataLoaderOperator
from monai.deploy.operators.dicom_seg_writer_operator import DICOMSegmentationWriterOperator, SegmentDescription
from monai.deploy.operators.dicom_series_selector_operator import DICOMSeriesSelectorOperator
from monai.deploy.operators.dicom_series_to_volume_operator import DICOMSeriesToVolumeOperator
from monai.deploy.operators.monai_bundle_inference_operator import (
    BundleConfigNames,
    IOMapping,
    MonaiBundleInferenceOperator,
)
from monai.deploy.operators.stl_conversion_operator import STLConversionOperator

Determining the Input and Output for the Model Bundle Inference Operator

The App SDK provides a MonaiBundleInferenceOperator class to perform inference with a MONAI Bundle, which is essentially a PyTorch model in TorchScript with additional metadata describing the model network and its processing specification. This operator uses MONAI utilities to parse the bundle and automatically instantiate the objects required for input and output processing as well as inference; as such, it depends on MONAI transforms, inferers, and, in turn, their dependencies.

Each Operator class inherits from the Operator base class, and its input/output properties are specified in the setup function (as opposed to using the decorators @input and @output in Version 0.5 and below).
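
To illustrate this pattern, below is a minimal sketch of a custom operator that is not used in this application; the class and port names are hypothetical, and the import path assumes the v0.6+ API.

from monai.deploy.core import Operator, OperatorSpec


class PassThroughOperator(Operator):
    """Minimal illustrative operator: declares named ports in setup() and relays its input."""

    def setup(self, spec: OperatorSpec):
        # Named input/output ports are declared here rather than with decorators.
        spec.input("image_in")
        spec.output("image_out")

    def compute(self, op_input, op_output, context):
        image = op_input.receive("image_in")
        # ... processing would go here ...
        op_output.emit(image, "image_out")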

For the MonaiBundleInferenceOperator class, the inputs and outputs need to be defined to match those of the model network, both in name and data type. For the current release, an IOMapping object is used to connect the operator input/output to those of the model network by using the same names. This is likely to change, and become automated, in future releases once certain limitations in the App SDK are removed.

The Spleen CT Segmentation model network has a named input, "image", and a named output, "pred", both of image type, which map to the App SDK Image class. This information is typically acquired by examining the network_data_format attribute in the bundle's model metadata, as seen in this [example](https://github.com/Project-MONAI/model-zoo/blob/dev/models/spleen_ct_segmentation/configs/metadata.json).
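
The bundle metadata can be inspected with a few lines of Python; the sketch below assumes the bundle archive has been extracted so that configs/metadata.json is locally accessible (the folder path is hypothetical).

import json
from pathlib import Path

# Hypothetical path to an extracted Spleen CT Segmentation bundle folder.
metadata = json.loads(Path("spleen_ct_segmentation/configs/metadata.json").read_text())

# network_data_format lists the named model inputs/outputs; the IOMapping entries
# used later must match these names ("image" and "pred" for this bundle).
for direction in ("inputs", "outputs"):
    for name, spec in metadata["network_data_format"][direction].items():
        print(direction, name, spec.get("type"), spec.get("format"))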

Creating Application class

Our application class looks like below.

It defines the App class, inheriting from the base Application class.

Objects required for DICOM parsing, series selection, pixel data conversion to a volume image, model-specific inference, and writing the AI result into a DICOM Segmentation object are created. The execution pipeline, as a Directed Acyclic Graph (DAG), is then constructed by connecting these objects through self.add_flow().

class AISpleenSegApp(Application):
    """Demonstrates inference with built-in MONAI Bundle inference operator with DICOM files as input/output

    This application loads a set of DICOM instances, selects the appropriate series, converts the series to
    a 3D volume image, performs inference with the built-in MONAI Bundle inference operator, including
    pre-processing and post-processing, saves the segmentation image as a DICOM Segmentation instance file,
    and optionally saves the surface mesh in STL format.

    Pertinent MONAI Bundle:
      https://github.com/Project-MONAI/model-zoo/tree/dev/models/spleen_ct_segmentation

    Execution Time Estimate:
      With an Nvidia GV100 32GB GPU, for an input DICOM series of 515 instances, the execution time is around
      25 seconds when saving both the DICOM Seg and the surface mesh STL file, and 15 seconds with DICOM Seg only.
    """

    def __init__(self, *args, **kwargs):
        """Creates an application instance."""
        self._logger = logging.getLogger("{}.{}".format(__name__, type(self).__name__))
        super().__init__(*args, **kwargs)

    def run(self, *args, **kwargs):
        # This method calls the base class to run. Can be omitted if simply calling through.
        self._logger.info(f"Begin {self.run.__name__}")
        super().run(*args, **kwargs)
        self._logger.info(f"End {self.run.__name__}")

    def compose(self):
        """Creates the app specific operators and chain them up in the processing DAG."""

        logging.info(f"Begin {self.compose.__name__}")

        app_context = Application.init_app_context({})  # Do not pass argv in Jupyter Notebook
        app_input_path = Path(app_context.input_path)
        app_output_path = Path(app_context.output_path)
        model_path = Path(app_context.model_path)

        # Create the custom operator(s) as well as SDK built-in operator(s).
        study_loader_op = DICOMDataLoaderOperator(
            self, CountCondition(self, 1), input_folder=app_input_path, name="study_loader_op"
        )
        series_selector_op = DICOMSeriesSelectorOperator(self, rules=Sample_Rules_Text, name="series_selector_op")
        series_to_vol_op = DICOMSeriesToVolumeOperator(self, name="series_to_vol_op")

        # Create the inference operator that supports MONAI Bundle and automates the inference.
        # The IOMapping labels match the input and prediction keys in the pre and post processing.
        # The model_name is optional when the app has only one model.
        # The bundle_path argument optionally can be set to an accessible bundle file path in the dev
        # environment, so when the app is packaged into a MAP, the operator can complete the bundle parsing
        # during init.

        config_names = BundleConfigNames(config_names=["inference"])  # Same as the default

        bundle_spleen_seg_op = MonaiBundleInferenceOperator(
            self,
            input_mapping=[IOMapping("image", Image, IOType.IN_MEMORY)],
            output_mapping=[IOMapping("pred", Image, IOType.IN_MEMORY)],
            app_context=app_context,
            bundle_config_names=config_names,
            bundle_path=model_path,
            name="bundle_spleen_seg_op",
        )

        # Create DICOM Seg writer providing the required segment description for each segment with
        # the actual algorithm and the pertinent organ/tissue. The segment_label, algorithm_name,
        # and algorithm_version are of DICOM VR LO type, limited to 64 chars.
        # https://dicom.nema.org/medical/dicom/current/output/chtml/part05/sect_6.2.html
        segment_descriptions = [
            SegmentDescription(
                segment_label="Spleen",
                segmented_property_category=codes.SCT.Organ,
                segmented_property_type=codes.SCT.Spleen,
                algorithm_name="volumetric (3D) segmentation of the spleen from CT image",
                algorithm_family=codes.DCM.ArtificialIntelligence,
                algorithm_version="0.3.2",
            )
        ]

        custom_tags = {"SeriesDescription": "AI generated Seg, not for clinical use."}

        dicom_seg_writer = DICOMSegmentationWriterOperator(
            self,
            segment_descriptions=segment_descriptions,
            custom_tags=custom_tags,
            output_folder=app_output_path,
            name="dicom_seg_writer",
        )

        # Create the processing pipeline, by specifying the source and destination operators, and
        # ensuring the output from the former matches the input of the latter, in both name and type.
        self.add_flow(study_loader_op, series_selector_op, {("dicom_study_list", "dicom_study_list")})
        self.add_flow(
            series_selector_op, series_to_vol_op, {("study_selected_series_list", "study_selected_series_list")}
        )
        self.add_flow(series_to_vol_op, bundle_spleen_seg_op, {("image", "image")})
        # Note below the dicom_seg_writer requires two inputs, each coming from a source operator.
        self.add_flow(
            series_selector_op, dicom_seg_writer, {("study_selected_series_list", "study_selected_series_list")}
        )
        self.add_flow(bundle_spleen_seg_op, dicom_seg_writer, {("pred", "seg_image")})
        # Create the surface mesh STL conversion operator and add it to the app execution flow below.
        # Comment out the following two statements if the STL output is not needed.
        stl_conversion_op = STLConversionOperator(
            self, output_file=app_output_path.joinpath("stl/spleen.stl"), name="stl_conversion_op"
        )
        self.add_flow(bundle_spleen_seg_op, stl_conversion_op, {("pred", "image")})

        logging.info(f"End {self.compose.__name__}")


# This is a sample series selection rule in JSON, simply selecting CT series.
# If the study has more than 1 CT series, then all of them will be selected.
# Please see more detail in DICOMSeriesSelectorOperator.
Sample_Rules_Text = """
{
    "selections": [
        {
            "name": "CT Series",
            "conditions": {
                "StudyDescription": "(.*?)",
                "Modality": "(?i)CT",
                "SeriesDescription": "(.*?)"
            }
        }
    ]
}
"""

Executing app locally

We can execute the app in the Jupyter notebook. Note that the DICOM files of the CT Abdomen series must be present in the dcm folder, and the TorchScript model, model.ts, must be under the models folder, as pointed to by the environment variables set earlier.

!rm -rf $HOLOSCAN_OUTPUT_PATH
logging.info(f"Begin {__name__}")
AISpleenSegApp().run()
logging.info(f"End {__name__}")
[2023-08-30 01:19:59,977] [INFO] (root) - Parsed args: Namespace(argv=[], input=None, log_level=None, model=None, output=None, workdir=None)
[2023-08-30 01:19:59,985] [INFO] (root) - AppContext object: AppContext(input_path=dcm, output_path=output, model_path=models, workdir=)
[2023-08-30 01:19:59,991] [INFO] (root) - End compose
[info] [gxf_executor.cpp:210] Creating context
[info] [gxf_executor.cpp:1595] Loading extensions from configs...
[info] [gxf_executor.cpp:1741] Activating Graph...
[info] [gxf_executor.cpp:1771] Running Graph...
[info] [gxf_executor.cpp:1773] Waiting for completion...
[info] [gxf_executor.cpp:1774] Graph execution waiting. Fragment: 
[info] [greedy_scheduler.cpp:190] Scheduling 8 entities
[2023-08-30 01:20:00,118] [INFO] (monai.deploy.operators.dicom_data_loader_operator.DICOMDataLoaderOperator) - No or invalid input path from the optional input port: None
[2023-08-30 01:20:00,446] [INFO] (root) - Finding series for Selection named: CT Series
[2023-08-30 01:20:00,448] [INFO] (root) - Searching study, : 1.3.6.1.4.1.14519.5.2.1.7085.2626.822645453932810382886582736291
  # of series: 1
[2023-08-30 01:20:00,448] [INFO] (root) - Working on series, instance UID: 1.3.6.1.4.1.14519.5.2.1.7085.2626.119403521930927333027265674239
[2023-08-30 01:20:00,449] [INFO] (root) - On attribute: 'StudyDescription' to match value: '(.*?)'
[2023-08-30 01:20:00,449] [INFO] (root) -     Series attribute StudyDescription value: CT ABDOMEN W IV CONTRAST
[2023-08-30 01:20:00,450] [INFO] (root) - Series attribute string value did not match. Try regEx.
[2023-08-30 01:20:00,451] [INFO] (root) - On attribute: 'Modality' to match value: '(?i)CT'
[2023-08-30 01:20:00,451] [INFO] (root) -     Series attribute Modality value: CT
[2023-08-30 01:20:00,452] [INFO] (root) - Series attribute string value did not match. Try regEx.
[2023-08-30 01:20:00,453] [INFO] (root) - On attribute: 'SeriesDescription' to match value: '(.*?)'
[2023-08-30 01:20:00,453] [INFO] (root) -     Series attribute SeriesDescription value: ABD/PANC 3.0 B31f
[2023-08-30 01:20:00,454] [INFO] (root) - Series attribute string value did not match. Try regEx.
[2023-08-30 01:20:00,454] [INFO] (root) - Selected Series, UID: 1.3.6.1.4.1.14519.5.2.1.7085.2626.119403521930927333027265674239
[2023-08-30 01:20:00,662] [INFO] (root) - Parsing from bundle_path: /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models/model/model.ts
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/monai/utils/deprecate_utils.py:321: FutureWarning: monai.transforms.io.dictionary LoadImaged.__init__:image_only: Current default value of argument `image_only=False` has been deprecated since version 1.1. It will be changed to `image_only=True` in version 1.3.
  warn_deprecated(argname, msg, warning_category)
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/monai/utils/deprecate_utils.py:321: FutureWarning: monai.transforms.io.dictionary SaveImaged.__init__:resample: Current default value of argument `resample=True` has been deprecated since version 1.1. It will be changed to `resample=False` in version 1.3.
  warn_deprecated(argname, msg, warning_category)
[2023-08-30 01:20:10,739] [INFO] (monai.deploy.operators.stl_conversion_operator.STLConversionOperator) - Output will be saved in file output/stl/spleen.stl.
[2023-08-30 01:20:12,173] [INFO] (monai.deploy.operators.stl_conversion_operator.SpatialImage) - 3D image
[2023-08-30 01:20:12,174] [INFO] (monai.deploy.operators.stl_conversion_operator.STLConverter) - Image ndarray shape:(204, 512, 512)
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/highdicom/valuerep.py:54: UserWarning: The string "C3N-00198" is unlikely to represent the intended person name since it contains only a single component. Construct a person name according to the format in described in http://dicom.nema.org/dicom/2013/output/chtml/part05/sect_6.2.html#sect_6.2.1.2, or, in pydicom 2.2.0 or later, use the pydicom.valuerep.PersonName.from_named_components() method to construct the person name correctly. If a single-component name is really intended, add a trailing caret character to disambiguate the name.
  warnings.warn(
[2023-08-30 01:20:22,822] [INFO] (highdicom.seg.sop) - add plane #0 for segment #1
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/pydicom/valuerep.py:443: UserWarning: A value of type 'int64' cannot be assigned to a tag with VR UL.
  warnings.warn(msg)
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/pydicom/valuerep.py:443: UserWarning: A value of type 'int64' cannot be assigned to a tag with VR US.
  warnings.warn(msg)
[2023-08-30 01:20:22,825] [INFO] (highdicom.seg.sop) - add plane #1 for segment #1
[2023-08-30 01:20:22,826] [INFO] (highdicom.seg.sop) - add plane #2 for segment #1
[2023-08-30 01:20:22,827] [INFO] (highdicom.seg.sop) - add plane #3 for segment #1
[2023-08-30 01:20:22,829] [INFO] (highdicom.seg.sop) - add plane #4 for segment #1
[2023-08-30 01:20:22,830] [INFO] (highdicom.seg.sop) - add plane #5 for segment #1
[2023-08-30 01:20:22,831] [INFO] (highdicom.seg.sop) - add plane #6 for segment #1
[2023-08-30 01:20:22,833] [INFO] (highdicom.seg.sop) - add plane #7 for segment #1
[2023-08-30 01:20:22,834] [INFO] (highdicom.seg.sop) - add plane #8 for segment #1
[2023-08-30 01:20:22,836] [INFO] (highdicom.seg.sop) - add plane #9 for segment #1
[2023-08-30 01:20:22,837] [INFO] (highdicom.seg.sop) - add plane #10 for segment #1
[2023-08-30 01:20:22,839] [INFO] (highdicom.seg.sop) - add plane #11 for segment #1
[2023-08-30 01:20:22,841] [INFO] (highdicom.seg.sop) - add plane #12 for segment #1
[2023-08-30 01:20:22,843] [INFO] (highdicom.seg.sop) - add plane #13 for segment #1
[2023-08-30 01:20:22,844] [INFO] (highdicom.seg.sop) - add plane #14 for segment #1
[2023-08-30 01:20:22,846] [INFO] (highdicom.seg.sop) - add plane #15 for segment #1
[2023-08-30 01:20:22,848] [INFO] (highdicom.seg.sop) - add plane #16 for segment #1
[2023-08-30 01:20:22,850] [INFO] (highdicom.seg.sop) - add plane #17 for segment #1
[2023-08-30 01:20:22,852] [INFO] (highdicom.seg.sop) - add plane #18 for segment #1
[2023-08-30 01:20:22,853] [INFO] (highdicom.seg.sop) - add plane #19 for segment #1
[2023-08-30 01:20:22,857] [INFO] (highdicom.seg.sop) - add plane #20 for segment #1
[2023-08-30 01:20:22,858] [INFO] (highdicom.seg.sop) - add plane #21 for segment #1
[2023-08-30 01:20:22,860] [INFO] (highdicom.seg.sop) - add plane #22 for segment #1
[2023-08-30 01:20:22,862] [INFO] (highdicom.seg.sop) - add plane #23 for segment #1
[2023-08-30 01:20:22,863] [INFO] (highdicom.seg.sop) - add plane #24 for segment #1
[2023-08-30 01:20:22,864] [INFO] (highdicom.seg.sop) - add plane #25 for segment #1
[2023-08-30 01:20:22,866] [INFO] (highdicom.seg.sop) - add plane #26 for segment #1
[2023-08-30 01:20:22,867] [INFO] (highdicom.seg.sop) - add plane #27 for segment #1
[2023-08-30 01:20:22,868] [INFO] (highdicom.seg.sop) - add plane #28 for segment #1
[2023-08-30 01:20:22,870] [INFO] (highdicom.seg.sop) - add plane #29 for segment #1
[2023-08-30 01:20:22,871] [INFO] (highdicom.seg.sop) - add plane #30 for segment #1
[2023-08-30 01:20:22,873] [INFO] (highdicom.seg.sop) - add plane #31 for segment #1
[2023-08-30 01:20:22,874] [INFO] (highdicom.seg.sop) - add plane #32 for segment #1
[2023-08-30 01:20:22,875] [INFO] (highdicom.seg.sop) - add plane #33 for segment #1
[2023-08-30 01:20:22,877] [INFO] (highdicom.seg.sop) - add plane #34 for segment #1
[2023-08-30 01:20:22,878] [INFO] (highdicom.seg.sop) - add plane #35 for segment #1
[2023-08-30 01:20:22,880] [INFO] (highdicom.seg.sop) - add plane #36 for segment #1
[2023-08-30 01:20:22,881] [INFO] (highdicom.seg.sop) - add plane #37 for segment #1
[2023-08-30 01:20:22,883] [INFO] (highdicom.seg.sop) - add plane #38 for segment #1
[2023-08-30 01:20:22,884] [INFO] (highdicom.seg.sop) - add plane #39 for segment #1
[2023-08-30 01:20:22,886] [INFO] (highdicom.seg.sop) - add plane #40 for segment #1
[2023-08-30 01:20:22,887] [INFO] (highdicom.seg.sop) - add plane #41 for segment #1
[2023-08-30 01:20:22,889] [INFO] (highdicom.seg.sop) - add plane #42 for segment #1
[2023-08-30 01:20:22,891] [INFO] (highdicom.seg.sop) - add plane #43 for segment #1
[2023-08-30 01:20:22,892] [INFO] (highdicom.seg.sop) - add plane #44 for segment #1
[2023-08-30 01:20:22,894] [INFO] (highdicom.seg.sop) - add plane #45 for segment #1
[2023-08-30 01:20:22,895] [INFO] (highdicom.seg.sop) - add plane #46 for segment #1
[2023-08-30 01:20:22,897] [INFO] (highdicom.seg.sop) - add plane #47 for segment #1
[2023-08-30 01:20:22,899] [INFO] (highdicom.seg.sop) - add plane #48 for segment #1
[2023-08-30 01:20:22,901] [INFO] (highdicom.seg.sop) - add plane #49 for segment #1
[2023-08-30 01:20:22,902] [INFO] (highdicom.seg.sop) - add plane #50 for segment #1
[2023-08-30 01:20:22,904] [INFO] (highdicom.seg.sop) - add plane #51 for segment #1
[2023-08-30 01:20:22,906] [INFO] (highdicom.seg.sop) - add plane #52 for segment #1
[2023-08-30 01:20:22,908] [INFO] (highdicom.seg.sop) - add plane #53 for segment #1
[2023-08-30 01:20:22,909] [INFO] (highdicom.seg.sop) - add plane #54 for segment #1
[2023-08-30 01:20:22,911] [INFO] (highdicom.seg.sop) - add plane #55 for segment #1
[2023-08-30 01:20:22,913] [INFO] (highdicom.seg.sop) - add plane #56 for segment #1
[2023-08-30 01:20:22,915] [INFO] (highdicom.seg.sop) - add plane #57 for segment #1
[2023-08-30 01:20:22,917] [INFO] (highdicom.seg.sop) - add plane #58 for segment #1
[2023-08-30 01:20:22,919] [INFO] (highdicom.seg.sop) - add plane #59 for segment #1
[2023-08-30 01:20:22,921] [INFO] (highdicom.seg.sop) - add plane #60 for segment #1
[2023-08-30 01:20:22,923] [INFO] (highdicom.seg.sop) - add plane #61 for segment #1
[2023-08-30 01:20:22,925] [INFO] (highdicom.seg.sop) - add plane #62 for segment #1
[2023-08-30 01:20:22,927] [INFO] (highdicom.seg.sop) - add plane #63 for segment #1
[2023-08-30 01:20:22,929] [INFO] (highdicom.seg.sop) - add plane #64 for segment #1
[2023-08-30 01:20:22,931] [INFO] (highdicom.seg.sop) - add plane #65 for segment #1
[2023-08-30 01:20:22,933] [INFO] (highdicom.seg.sop) - add plane #66 for segment #1
[2023-08-30 01:20:22,934] [INFO] (highdicom.seg.sop) - add plane #67 for segment #1
[2023-08-30 01:20:22,936] [INFO] (highdicom.seg.sop) - add plane #68 for segment #1
[2023-08-30 01:20:22,939] [INFO] (highdicom.seg.sop) - add plane #69 for segment #1
[2023-08-30 01:20:22,941] [INFO] (highdicom.seg.sop) - add plane #70 for segment #1
[2023-08-30 01:20:22,943] [INFO] (highdicom.seg.sop) - add plane #71 for segment #1
[2023-08-30 01:20:22,946] [INFO] (highdicom.seg.sop) - add plane #72 for segment #1
[2023-08-30 01:20:22,951] [INFO] (highdicom.seg.sop) - add plane #73 for segment #1
[2023-08-30 01:20:22,957] [INFO] (highdicom.seg.sop) - add plane #74 for segment #1
[2023-08-30 01:20:22,959] [INFO] (highdicom.seg.sop) - add plane #75 for segment #1
[2023-08-30 01:20:22,961] [INFO] (highdicom.seg.sop) - add plane #76 for segment #1
[2023-08-30 01:20:22,964] [INFO] (highdicom.seg.sop) - add plane #77 for segment #1
[2023-08-30 01:20:22,966] [INFO] (highdicom.seg.sop) - add plane #78 for segment #1
[2023-08-30 01:20:22,968] [INFO] (highdicom.seg.sop) - add plane #79 for segment #1
[2023-08-30 01:20:22,972] [INFO] (highdicom.seg.sop) - add plane #80 for segment #1
[2023-08-30 01:20:22,974] [INFO] (highdicom.seg.sop) - add plane #81 for segment #1
[2023-08-30 01:20:22,977] [INFO] (highdicom.seg.sop) - add plane #82 for segment #1
[2023-08-30 01:20:22,980] [INFO] (highdicom.seg.sop) - add plane #83 for segment #1
[2023-08-30 01:20:22,982] [INFO] (highdicom.seg.sop) - add plane #84 for segment #1
[2023-08-30 01:20:22,985] [INFO] (highdicom.seg.sop) - add plane #85 for segment #1
[2023-08-30 01:20:22,990] [INFO] (highdicom.seg.sop) - add plane #86 for segment #1
[2023-08-30 01:20:23,056] [INFO] (highdicom.base) - copy Image-related attributes from dataset "1.3.6.1.4.1.14519.5.2.1.7085.2626.936983343951485811186213470191"
[2023-08-30 01:20:23,057] [INFO] (highdicom.base) - copy attributes of module "Specimen"
[2023-08-30 01:20:23,058] [INFO] (highdicom.base) - copy Patient-related attributes from dataset "1.3.6.1.4.1.14519.5.2.1.7085.2626.936983343951485811186213470191"
[2023-08-30 01:20:23,058] [INFO] (highdicom.base) - copy attributes of module "Patient"
[2023-08-30 01:20:23,060] [INFO] (highdicom.base) - copy attributes of module "Clinical Trial Subject"
[2023-08-30 01:20:23,060] [INFO] (highdicom.base) - copy Study-related attributes from dataset "1.3.6.1.4.1.14519.5.2.1.7085.2626.936983343951485811186213470191"
[2023-08-30 01:20:23,061] [INFO] (highdicom.base) - copy attributes of module "General Study"
[2023-08-30 01:20:23,062] [INFO] (highdicom.base) - copy attributes of module "Patient Study"
[2023-08-30 01:20:23,063] [INFO] (highdicom.base) - copy attributes of module "Clinical Trial Study"
[info] [greedy_scheduler.cpp:369] Scheduler stopped: Some entities are waiting for execution, but there are no periodic or async entities to get out of the deadlock.
[info] [greedy_scheduler.cpp:398] Scheduler finished.
[info] [gxf_executor.cpp:1783] Graph execution deactivating. Fragment: 
[info] [gxf_executor.cpp:1784] Deactivating Graph...
[2023-08-30 01:20:23,159] [INFO] (__main__.AISpleenSegApp) - End run
[info] [gxf_executor.cpp:1787] Graph execution finished. Fragment: 
[2023-08-30 01:20:23,161] [INFO] (root) - End __main__

Once the application is verified in the Jupyter notebook, we can write the above Python code into Python files in an application folder.

The application folder structure would look like below:

my_app
├── __main__.py
└── app.py
# Create an application folder
!mkdir -p my_app && rm -rf my_app/*

app.py

%%writefile my_app/app.py

# Copyright 2021-2023 MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import logging
from pathlib import Path

# Required for setting SegmentDescription attributes. Direct import as this is not part of App SDK package.
from pydicom.sr.codedict import codes

from monai.deploy.conditions import CountCondition
from monai.deploy.core import AppContext, Application
from monai.deploy.core.domain import Image
from monai.deploy.core.io_type import IOType
from monai.deploy.operators.dicom_data_loader_operator import DICOMDataLoaderOperator
from monai.deploy.operators.dicom_seg_writer_operator import DICOMSegmentationWriterOperator, SegmentDescription
from monai.deploy.operators.dicom_series_selector_operator import DICOMSeriesSelectorOperator
from monai.deploy.operators.dicom_series_to_volume_operator import DICOMSeriesToVolumeOperator
from monai.deploy.operators.monai_bundle_inference_operator import (
    BundleConfigNames,
    IOMapping,
    MonaiBundleInferenceOperator,
)
from monai.deploy.operators.stl_conversion_operator import STLConversionOperator


class AISpleenSegApp(Application):
    """Demonstrates inference with built-in MONAI Bundle inference operator with DICOM files as input/output

    This application loads a set of DICOM instances, selects the appropriate series, converts the series to
    a 3D volume image, performs inference with the built-in MONAI Bundle inference operator, including
    pre-processing and post-processing, saves the segmentation image as a DICOM Segmentation instance file,
    and optionally saves the surface mesh in STL format.

    Pertinent MONAI Bundle:
      https://github.com/Project-MONAI/model-zoo/tree/dev/models/spleen_ct_segmentation

    Execution Time Estimate:
      With an Nvidia GV100 32GB GPU, for an input DICOM series of 515 instances, the execution time is around
      25 seconds when saving both the DICOM Seg and the surface mesh STL file, and 15 seconds with DICOM Seg only.
    """

    def __init__(self, *args, **kwargs):
        """Creates an application instance."""
        self._logger = logging.getLogger("{}.{}".format(__name__, type(self).__name__))
        super().__init__(*args, **kwargs)

    def run(self, *args, **kwargs):
        # This method calls the base class to run. Can be omitted if simply calling through.
        self._logger.info(f"Begin {self.run.__name__}")
        super().run(*args, **kwargs)
        self._logger.info(f"End {self.run.__name__}")

    def compose(self):
        """Creates the app specific operators and chain them up in the processing DAG."""

        logging.info(f"Begin {self.compose.__name__}")

        # Use command line options over environment variables to init the context.
        app_context = Application.init_app_context(self.argv)
        app_input_path = Path(app_context.input_path)
        app_output_path = Path(app_context.output_path)
        model_path = Path(app_context.model_path)

        # Create the custom operator(s) as well as SDK built-in operator(s).
        study_loader_op = DICOMDataLoaderOperator(
            self, CountCondition(self, 1), input_folder=app_input_path, name="study_loader_op"
        )
        series_selector_op = DICOMSeriesSelectorOperator(self, rules=Sample_Rules_Text, name="series_selector_op")
        series_to_vol_op = DICOMSeriesToVolumeOperator(self, name="series_to_vol_op")

        # Create the inference operator that supports MONAI Bundle and automates the inference.
        # The IOMapping labels match the input and prediction keys in the pre and post processing.
        # The model_name is optional when the app has only one model.
        # The bundle_path argument optionally can be set to an accessible bundle file path in the dev
        # environment, so when the app is packaged into a MAP, the operator can complete the bundle parsing
        # during init.

        config_names = BundleConfigNames(config_names=["inference"])  # Same as the default

        bundle_spleen_seg_op = MonaiBundleInferenceOperator(
            self,
            input_mapping=[IOMapping("image", Image, IOType.IN_MEMORY)],
            output_mapping=[IOMapping("pred", Image, IOType.IN_MEMORY)],
            app_context=app_context,
            bundle_config_names=config_names,
            bundle_path=model_path,
            name="bundle_spleen_seg_op",
        )

        # Create DICOM Seg writer providing the required segment description for each segment with
        # the actual algorithm and the pertinent organ/tissue. The segment_label, algorithm_name,
        # and algorithm_version are of DICOM VR LO type, limited to 64 chars.
        # https://dicom.nema.org/medical/dicom/current/output/chtml/part05/sect_6.2.html
        segment_descriptions = [
            SegmentDescription(
                segment_label="Spleen",
                segmented_property_category=codes.SCT.Organ,
                segmented_property_type=codes.SCT.Spleen,
                algorithm_name="volumetric (3D) segmentation of the spleen from CT image",
                algorithm_family=codes.DCM.ArtificialIntelligence,
                algorithm_version="0.3.2",
            )
        ]

        custom_tags = {"SeriesDescription": "AI generated Seg, not for clinical use."}

        dicom_seg_writer = DICOMSegmentationWriterOperator(
            self,
            segment_descriptions=segment_descriptions,
            custom_tags=custom_tags,
            output_folder=app_output_path,
            name="dicom_seg_writer",
        )

        # Create the processing pipeline, by specifying the source and destination operators, and
        # ensuring the output from the former matches the input of the latter, in both name and type.
        self.add_flow(study_loader_op, series_selector_op, {("dicom_study_list", "dicom_study_list")})
        self.add_flow(
            series_selector_op, series_to_vol_op, {("study_selected_series_list", "study_selected_series_list")}
        )
        self.add_flow(series_to_vol_op, bundle_spleen_seg_op, {("image", "image")})
        # Note below the dicom_seg_writer requires two inputs, each coming from a source operator.
        self.add_flow(
            series_selector_op, dicom_seg_writer, {("study_selected_series_list", "study_selected_series_list")}
        )
        self.add_flow(bundle_spleen_seg_op, dicom_seg_writer, {("pred", "seg_image")})
        # Create the surface mesh STL conversion operator and add it to the app execution flow below.
        # Comment out the following two statements if the STL output is not needed.
        stl_conversion_op = STLConversionOperator(
            self, output_file=app_output_path.joinpath("stl/spleen.stl"), name="stl_conversion_op"
        )
        self.add_flow(bundle_spleen_seg_op, stl_conversion_op, {("pred", "image")})

        logging.info(f"End {self.compose.__name__}")


# This is a sample series selection rule in JSON, simply selecting CT series.
# If the study has more than 1 CT series, then all of them will be selected.
# Please see more detail in DICOMSeriesSelectorOperator.
Sample_Rules_Text = """
{
    "selections": [
        {
            "name": "CT Series",
            "conditions": {
                "StudyDescription": "(.*?)",
                "Modality": "(?i)CT",
                "SeriesDescription": "(.*?)"
            }
        }
    ]
}
"""

if __name__ == "__main__":
    AISpleenSegApp().run()
Writing my_app/app.py
if __name__ == "__main__":
    AISpleenSegApp().run()

The lines above are needed to execute the application code with the Python interpreter.

__main__.py

__main__.py is needed for the MONAI Application Packager to detect the main application code (app.py) when the application is executed with the application folder path (e.g., python my_app).

%%writefile my_app/__main__.py
from app import AISpleenSegApp

if __name__ == "__main__":
    AISpleenSegApp().run()
Writing my_app/__main__.py
!ls my_app
app.py	__main__.py

This time, let’s execute the app on the command line.

Note

Since the environment variables have been set and contain the correct paths, it is not necessary to provide the command line options when running the application. Nevertheless, the following command demonstrates the use of these options.

!rm -rf $HOLOSCAN_OUTPUT_PATH
!python my_app -i dcm -o output -m models
[2023-08-30 01:20:29,553] [INFO] (root) - Parsed args: Namespace(argv=['my_app', '-i', 'dcm', '-o', 'output', '-m', 'models'], input=PosixPath('/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/dcm'), log_level=None, model=PosixPath('/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models'), output=PosixPath('/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output'), workdir=None)
[2023-08-30 01:20:29,554] [INFO] (root) - AppContext object: AppContext(input_path=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/dcm, output_path=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output, model_path=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models, workdir=)
[2023-08-30 01:20:29,556] [INFO] (root) - End compose
[info] [gxf_executor.cpp:210] Creating context
[info] [gxf_executor.cpp:1595] Loading extensions from configs...
[info] [gxf_executor.cpp:1741] Activating Graph...
[info] [gxf_executor.cpp:1771] Running Graph...
[info] [gxf_executor.cpp:1773] Waiting for completion...
[info] [gxf_executor.cpp:1774] Graph execution waiting. Fragment: 
[info] [greedy_scheduler.cpp:190] Scheduling 8 entities
[2023-08-30 01:20:29,624] [INFO] (monai.deploy.operators.dicom_data_loader_operator.DICOMDataLoaderOperator) - No or invalid input path from the optional input port: None
[2023-08-30 01:20:30,122] [INFO] (root) - Finding series for Selection named: CT Series
[2023-08-30 01:20:30,122] [INFO] (root) - Searching study, : 1.3.6.1.4.1.14519.5.2.1.7085.2626.822645453932810382886582736291
  # of series: 1
[2023-08-30 01:20:30,122] [INFO] (root) - Working on series, instance UID: 1.3.6.1.4.1.14519.5.2.1.7085.2626.119403521930927333027265674239
[2023-08-30 01:20:30,122] [INFO] (root) - On attribute: 'StudyDescription' to match value: '(.*?)'
[2023-08-30 01:20:30,122] [INFO] (root) -     Series attribute StudyDescription value: CT ABDOMEN W IV CONTRAST
[2023-08-30 01:20:30,122] [INFO] (root) - Series attribute string value did not match. Try regEx.
[2023-08-30 01:20:30,122] [INFO] (root) - On attribute: 'Modality' to match value: '(?i)CT'
[2023-08-30 01:20:30,122] [INFO] (root) -     Series attribute Modality value: CT
[2023-08-30 01:20:30,122] [INFO] (root) - Series attribute string value did not match. Try regEx.
[2023-08-30 01:20:30,122] [INFO] (root) - On attribute: 'SeriesDescription' to match value: '(.*?)'
[2023-08-30 01:20:30,122] [INFO] (root) -     Series attribute SeriesDescription value: ABD/PANC 3.0 B31f
[2023-08-30 01:20:30,122] [INFO] (root) - Series attribute string value did not match. Try regEx.
[2023-08-30 01:20:30,122] [INFO] (root) - Selected Series, UID: 1.3.6.1.4.1.14519.5.2.1.7085.2626.119403521930927333027265674239
[2023-08-30 01:20:30,324] [INFO] (root) - Parsing from bundle_path: /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models/model/model.ts
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/monai/utils/deprecate_utils.py:321: FutureWarning: monai.transforms.io.dictionary LoadImaged.__init__:image_only: Current default value of argument `image_only=False` has been deprecated since version 1.1. It will be changed to `image_only=True` in version 1.3.
  warn_deprecated(argname, msg, warning_category)
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/monai/utils/deprecate_utils.py:321: FutureWarning: monai.transforms.io.dictionary SaveImaged.__init__:resample: Current default value of argument `resample=True` has been deprecated since version 1.1. It will be changed to `resample=False` in version 1.3.
  warn_deprecated(argname, msg, warning_category)
[2023-08-30 01:20:35,800] [INFO] (monai.deploy.operators.stl_conversion_operator.STLConversionOperator) - Output will be saved in file /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/output/stl/spleen.stl.
[2023-08-30 01:20:37,096] [INFO] (monai.deploy.operators.stl_conversion_operator.SpatialImage) - 3D image
[2023-08-30 01:20:37,096] [INFO] (monai.deploy.operators.stl_conversion_operator.STLConverter) - Image ndarray shape:(204, 512, 512)
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/highdicom/valuerep.py:54: UserWarning: The string "C3N-00198" is unlikely to represent the intended person name since it contains only a single component. Construct a person name according to the format in described in http://dicom.nema.org/dicom/2013/output/chtml/part05/sect_6.2.html#sect_6.2.1.2, or, in pydicom 2.2.0 or later, use the pydicom.valuerep.PersonName.from_named_components() method to construct the person name correctly. If a single-component name is really intended, add a trailing caret character to disambiguate the name.
  warnings.warn(
[2023-08-30 01:20:47,289] [INFO] (highdicom.seg.sop) - add plane #0 for segment #1
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/pydicom/valuerep.py:443: UserWarning: A value of type 'int64' cannot be assigned to a tag with VR UL.
  warnings.warn(msg)
/home/mqin/src/monai-deploy-app-sdk/.venv/lib/python3.8/site-packages/pydicom/valuerep.py:443: UserWarning: A value of type 'int64' cannot be assigned to a tag with VR US.
  warnings.warn(msg)
[2023-08-30 01:20:47,291] [INFO] (highdicom.seg.sop) - add plane #1 for segment #1
[2023-08-30 01:20:47,291] [INFO] (highdicom.seg.sop) - add plane #2 for segment #1
[2023-08-30 01:20:47,292] [INFO] (highdicom.seg.sop) - add plane #3 for segment #1
[2023-08-30 01:20:47,293] [INFO] (highdicom.seg.sop) - add plane #4 for segment #1
[2023-08-30 01:20:47,293] [INFO] (highdicom.seg.sop) - add plane #5 for segment #1
[2023-08-30 01:20:47,294] [INFO] (highdicom.seg.sop) - add plane #6 for segment #1
[2023-08-30 01:20:47,294] [INFO] (highdicom.seg.sop) - add plane #7 for segment #1
[2023-08-30 01:20:47,295] [INFO] (highdicom.seg.sop) - add plane #8 for segment #1
[2023-08-30 01:20:47,296] [INFO] (highdicom.seg.sop) - add plane #9 for segment #1
[2023-08-30 01:20:47,296] [INFO] (highdicom.seg.sop) - add plane #10 for segment #1
[2023-08-30 01:20:47,297] [INFO] (highdicom.seg.sop) - add plane #11 for segment #1
[2023-08-30 01:20:47,297] [INFO] (highdicom.seg.sop) - add plane #12 for segment #1
[2023-08-30 01:20:47,298] [INFO] (highdicom.seg.sop) - add plane #13 for segment #1
[2023-08-30 01:20:47,298] [INFO] (highdicom.seg.sop) - add plane #14 for segment #1
[2023-08-30 01:20:47,299] [INFO] (highdicom.seg.sop) - add plane #15 for segment #1
[2023-08-30 01:20:47,300] [INFO] (highdicom.seg.sop) - add plane #16 for segment #1
[2023-08-30 01:20:47,300] [INFO] (highdicom.seg.sop) - add plane #17 for segment #1
[2023-08-30 01:20:47,301] [INFO] (highdicom.seg.sop) - add plane #18 for segment #1
[2023-08-30 01:20:47,301] [INFO] (highdicom.seg.sop) - add plane #19 for segment #1
[2023-08-30 01:20:47,302] [INFO] (highdicom.seg.sop) - add plane #20 for segment #1
[2023-08-30 01:20:47,302] [INFO] (highdicom.seg.sop) - add plane #21 for segment #1
[2023-08-30 01:20:47,304] [INFO] (highdicom.seg.sop) - add plane #22 for segment #1
[2023-08-30 01:20:47,304] [INFO] (highdicom.seg.sop) - add plane #23 for segment #1
[2023-08-30 01:20:47,305] [INFO] (highdicom.seg.sop) - add plane #24 for segment #1
[2023-08-30 01:20:47,305] [INFO] (highdicom.seg.sop) - add plane #25 for segment #1
[2023-08-30 01:20:47,306] [INFO] (highdicom.seg.sop) - add plane #26 for segment #1
[2023-08-30 01:20:47,306] [INFO] (highdicom.seg.sop) - add plane #27 for segment #1
[2023-08-30 01:20:47,307] [INFO] (highdicom.seg.sop) - add plane #28 for segment #1
[2023-08-30 01:20:47,308] [INFO] (highdicom.seg.sop) - add plane #29 for segment #1
[2023-08-30 01:20:47,308] [INFO] (highdicom.seg.sop) - add plane #30 for segment #1
[2023-08-30 01:20:47,309] [INFO] (highdicom.seg.sop) - add plane #31 for segment #1
[2023-08-30 01:20:47,309] [INFO] (highdicom.seg.sop) - add plane #32 for segment #1
[2023-08-30 01:20:47,310] [INFO] (highdicom.seg.sop) - add plane #33 for segment #1
[2023-08-30 01:20:47,310] [INFO] (highdicom.seg.sop) - add plane #34 for segment #1
[2023-08-30 01:20:47,311] [INFO] (highdicom.seg.sop) - add plane #35 for segment #1
[2023-08-30 01:20:47,312] [INFO] (highdicom.seg.sop) - add plane #36 for segment #1
[2023-08-30 01:20:47,312] [INFO] (highdicom.seg.sop) - add plane #37 for segment #1
[2023-08-30 01:20:47,313] [INFO] (highdicom.seg.sop) - add plane #38 for segment #1
[2023-08-30 01:20:47,313] [INFO] (highdicom.seg.sop) - add plane #39 for segment #1
[2023-08-30 01:20:47,314] [INFO] (highdicom.seg.sop) - add plane #40 for segment #1
[2023-08-30 01:20:47,314] [INFO] (highdicom.seg.sop) - add plane #41 for segment #1
[2023-08-30 01:20:47,315] [INFO] (highdicom.seg.sop) - add plane #42 for segment #1
[2023-08-30 01:20:47,316] [INFO] (highdicom.seg.sop) - add plane #43 for segment #1
[2023-08-30 01:20:47,316] [INFO] (highdicom.seg.sop) - add plane #44 for segment #1
[2023-08-30 01:20:47,317] [INFO] (highdicom.seg.sop) - add plane #45 for segment #1
[2023-08-30 01:20:47,317] [INFO] (highdicom.seg.sop) - add plane #46 for segment #1
[2023-08-30 01:20:47,318] [INFO] (highdicom.seg.sop) - add plane #47 for segment #1
[2023-08-30 01:20:47,318] [INFO] (highdicom.seg.sop) - add plane #48 for segment #1
[2023-08-30 01:20:47,319] [INFO] (highdicom.seg.sop) - add plane #49 for segment #1
[2023-08-30 01:20:47,320] [INFO] (highdicom.seg.sop) - add plane #50 for segment #1
[2023-08-30 01:20:47,320] [INFO] (highdicom.seg.sop) - add plane #51 for segment #1
[2023-08-30 01:20:47,321] [INFO] (highdicom.seg.sop) - add plane #52 for segment #1
[2023-08-30 01:20:47,321] [INFO] (highdicom.seg.sop) - add plane #53 for segment #1
[2023-08-30 01:20:47,322] [INFO] (highdicom.seg.sop) - add plane #54 for segment #1
[2023-08-30 01:20:47,322] [INFO] (highdicom.seg.sop) - add plane #55 for segment #1
[2023-08-30 01:20:47,323] [INFO] (highdicom.seg.sop) - add plane #56 for segment #1
[2023-08-30 01:20:47,324] [INFO] (highdicom.seg.sop) - add plane #57 for segment #1
[2023-08-30 01:20:47,324] [INFO] (highdicom.seg.sop) - add plane #58 for segment #1
[2023-08-30 01:20:47,325] [INFO] (highdicom.seg.sop) - add plane #59 for segment #1
[2023-08-30 01:20:47,325] [INFO] (highdicom.seg.sop) - add plane #60 for segment #1
[2023-08-30 01:20:47,326] [INFO] (highdicom.seg.sop) - add plane #61 for segment #1
[2023-08-30 01:20:47,327] [INFO] (highdicom.seg.sop) - add plane #62 for segment #1
[2023-08-30 01:20:47,327] [INFO] (highdicom.seg.sop) - add plane #63 for segment #1
[2023-08-30 01:20:47,328] [INFO] (highdicom.seg.sop) - add plane #64 for segment #1
[2023-08-30 01:20:47,328] [INFO] (highdicom.seg.sop) - add plane #65 for segment #1
[2023-08-30 01:20:47,329] [INFO] (highdicom.seg.sop) - add plane #66 for segment #1
[2023-08-30 01:20:47,329] [INFO] (highdicom.seg.sop) - add plane #67 for segment #1
[2023-08-30 01:20:47,330] [INFO] (highdicom.seg.sop) - add plane #68 for segment #1
[2023-08-30 01:20:47,331] [INFO] (highdicom.seg.sop) - add plane #69 for segment #1
[2023-08-30 01:20:47,331] [INFO] (highdicom.seg.sop) - add plane #70 for segment #1
[2023-08-30 01:20:47,332] [INFO] (highdicom.seg.sop) - add plane #71 for segment #1
[2023-08-30 01:20:47,332] [INFO] (highdicom.seg.sop) - add plane #72 for segment #1
[2023-08-30 01:20:47,333] [INFO] (highdicom.seg.sop) - add plane #73 for segment #1
[2023-08-30 01:20:47,333] [INFO] (highdicom.seg.sop) - add plane #74 for segment #1
[2023-08-30 01:20:47,334] [INFO] (highdicom.seg.sop) - add plane #75 for segment #1
[2023-08-30 01:20:47,335] [INFO] (highdicom.seg.sop) - add plane #76 for segment #1
[2023-08-30 01:20:47,335] [INFO] (highdicom.seg.sop) - add plane #77 for segment #1
[2023-08-30 01:20:47,336] [INFO] (highdicom.seg.sop) - add plane #78 for segment #1
[2023-08-30 01:20:47,336] [INFO] (highdicom.seg.sop) - add plane #79 for segment #1
[2023-08-30 01:20:47,337] [INFO] (highdicom.seg.sop) - add plane #80 for segment #1
[2023-08-30 01:20:47,337] [INFO] (highdicom.seg.sop) - add plane #81 for segment #1
[2023-08-30 01:20:47,338] [INFO] (highdicom.seg.sop) - add plane #82 for segment #1
[2023-08-30 01:20:47,339] [INFO] (highdicom.seg.sop) - add plane #83 for segment #1
[2023-08-30 01:20:47,339] [INFO] (highdicom.seg.sop) - add plane #84 for segment #1
[2023-08-30 01:20:47,340] [INFO] (highdicom.seg.sop) - add plane #85 for segment #1
[2023-08-30 01:20:47,340] [INFO] (highdicom.seg.sop) - add plane #86 for segment #1
[2023-08-30 01:20:47,387] [INFO] (highdicom.base) - copy Image-related attributes from dataset "1.3.6.1.4.1.14519.5.2.1.7085.2626.936983343951485811186213470191"
[2023-08-30 01:20:47,387] [INFO] (highdicom.base) - copy attributes of module "Specimen"
[2023-08-30 01:20:47,388] [INFO] (highdicom.base) - copy Patient-related attributes from dataset "1.3.6.1.4.1.14519.5.2.1.7085.2626.936983343951485811186213470191"
[2023-08-30 01:20:47,388] [INFO] (highdicom.base) - copy attributes of module "Patient"
[2023-08-30 01:20:47,388] [INFO] (highdicom.base) - copy attributes of module "Clinical Trial Subject"
[2023-08-30 01:20:47,388] [INFO] (highdicom.base) - copy Study-related attributes from dataset "1.3.6.1.4.1.14519.5.2.1.7085.2626.936983343951485811186213470191"
[2023-08-30 01:20:47,388] [INFO] (highdicom.base) - copy attributes of module "General Study"
[2023-08-30 01:20:47,388] [INFO] (highdicom.base) - copy attributes of module "Patient Study"
[2023-08-30 01:20:47,388] [INFO] (highdicom.base) - copy attributes of module "Clinical Trial Study"
[info] [greedy_scheduler.cpp:369] Scheduler stopped: Some entities are waiting for execution, but there are no periodic or async entities to get out of the deadlock.
[info] [greedy_scheduler.cpp:398] Scheduler finished.
[info] [gxf_executor.cpp:1783] Graph execution deactivating. Fragment: 
[info] [gxf_executor.cpp:1784] Deactivating Graph...
[info] [gxf_executor.cpp:1787] Graph execution finished. Fragment: 
[2023-08-30 01:20:47,473] [INFO] (app.AISpleenSegApp) - End run
!ls output
1.2.826.0.1.3680043.10.511.3.10023070564574692570777379407935822.dcm  stl
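
Before packaging, it can be useful to sanity-check the DICOM Segmentation object the app just wrote. The following is a minimal sketch using pydicom, assuming the output folder shown above; the SOP Instance UID in the file name will differ from run to run.

from pathlib import Path
import pydicom

# Read back whichever DICOM file(s) the app wrote to the output folder
seg_files = sorted(Path("output").glob("*.dcm"))
seg_ds = pydicom.dcmread(seg_files[0])
print(seg_ds.SOPClassUID)  # Segmentation Storage is expected: 1.2.840.10008.5.1.4.1.1.66.4
print("Modality:", seg_ds.Modality)
print("Number of frames:", int(seg_ds.NumberOfFrames))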

Packaging app

Let’s package the app with the MONAI Application Packager.

In this version of the App SDK, we need to write out the configuration YAML file as well as the package requirements file in the application folder.

%%writefile my_app/app.yaml
%YAML 1.2
---
application:
  title: MONAI Deploy App Package - MONAI Bundle AI App
  version: 1.0
  inputFormats: ["file"]
  outputFormats: ["file"]

resources:
  cpu: 1
  gpu: 1
  memory: 1Gi
  gpuMemory: 6Gi
Writing my_app/app.yaml
%%writefile my_app/requirements.txt
highdicom>=0.18.2
monai>=1.0
nibabel>=3.2.1
numpy>=1.21.6
pydicom>=2.3.0
setuptools>=59.5.0 # for pkg_resources
SimpleITK>=2.0.0
torch>=1.12.0
Writing my_app/requirements.txt

Now we can use the CLI package command to build the MONAI Application Package (MAP) container image based on a supported base image.

Note

Building a MONAI Application Package (Docker image) can take time. Use the -l DEBUG option to see the build progress.

tag_prefix = "my_app"

!monai-deploy package my_app -m {models_folder} -c my_app/app.yaml -t {tag_prefix}:1.0 --platform x64-workstation -l DEBUG
[2023-08-30 01:20:50,861] [INFO] (packager.parameters) - Application: /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/my_app
[2023-08-30 01:20:50,861] [INFO] (packager.parameters) - Detected application type: Python Module
[2023-08-30 01:20:50,861] [INFO] (packager) - Scanning for models in {models_path}...
[2023-08-30 01:20:50,861] [DEBUG] (packager) - Model model=/home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/models/model added.
[2023-08-30 01:20:50,861] [INFO] (packager) - Reading application configuration from /home/mqin/src/monai-deploy-app-sdk/notebooks/tutorials/my_app/app.yaml...
[2023-08-30 01:20:50,863] [INFO] (packager) - Generating app.json...
[2023-08-30 01:20:50,863] [INFO] (packager) - Generating pkg.json...
[2023-08-30 01:20:50,864] [DEBUG] (common) - 
=============== Begin app.json ===============
{
    "apiVersion": "1.0.0",
    "command": "[\"python3\", \"/opt/holoscan/app\"]",
    "environment": {
        "HOLOSCAN_APPLICATION": "/opt/holoscan/app",
        "HOLOSCAN_INPUT_PATH": "input/",
        "HOLOSCAN_OUTPUT_PATH": "output/",
        "HOLOSCAN_WORKDIR": "/var/holoscan",
        "HOLOSCAN_MODEL_PATH": "/opt/holoscan/models",
        "HOLOSCAN_CONFIG_PATH": "/var/holoscan/app.yaml",
        "HOLOSCAN_APP_MANIFEST_PATH": "/etc/holoscan/app.json",
        "HOLOSCAN_PKG_MANIFEST_PATH": "/etc/holoscan/pkg.json",
        "HOLOSCAN_DOCS_PATH": "/opt/holoscan/docs",
        "HOLOSCAN_LOGS_PATH": "/var/holoscan/logs"
    },
    "input": {
        "path": "input/",
        "formats": null
    },
    "liveness": null,
    "output": {
        "path": "output/",
        "formats": null
    },
    "readiness": null,
    "sdk": "monai-deploy",
    "sdkVersion": "0.6.0",
    "timeout": 0,
    "version": 1.0,
    "workingDirectory": "/var/holoscan"
}
================ End app.json ================
                 
[2023-08-30 01:20:50,864] [DEBUG] (common) - 
=============== Begin pkg.json ===============
{
    "apiVersion": "1.0.0",
    "applicationRoot": "/opt/holoscan/app",
    "modelRoot": "/opt/holoscan/models",
    "models": {
        "model": "/opt/holoscan/models"
    },
    "resources": {
        "cpu": 1,
        "gpu": 1,
        "memory": "1Gi",
        "gpuMemory": "6Gi"
    },
    "version": 1.0
}
================ End pkg.json ================
                 
[2023-08-30 01:20:50,888] [DEBUG] (packager.builder) - 
========== Begin Dockerfile ==========


FROM nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0-dgpu

ENV DEBIAN_FRONTEND=noninteractive
ENV TERM=xterm-256color

ARG UNAME
ARG UID
ARG GID

RUN mkdir -p /etc/holoscan/ \
        && mkdir -p /opt/holoscan/ \
        && mkdir -p /var/holoscan \
        && mkdir -p /opt/holoscan/app \
        && mkdir -p /var/holoscan/input \
        && mkdir -p /var/holoscan/output

LABEL base="nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0-dgpu"
LABEL tag="my_app:1.0"
LABEL org.opencontainers.image.title="MONAI Deploy App Package - MONAI Bundle AI App"
LABEL org.opencontainers.image.version="1.0"
LABEL org.nvidia.holoscan="0.6.0"

ENV HOLOSCAN_ENABLE_HEALTH_CHECK=true
ENV HOLOSCAN_INPUT_PATH=/var/holoscan/input
ENV HOLOSCAN_OUTPUT_PATH=/var/holoscan/output
ENV HOLOSCAN_WORKDIR=/var/holoscan
ENV HOLOSCAN_APPLICATION=/opt/holoscan/app
ENV HOLOSCAN_TIMEOUT=0
ENV HOLOSCAN_MODEL_PATH=/opt/holoscan/models
ENV HOLOSCAN_DOCS_PATH=/opt/holoscan/docs
ENV HOLOSCAN_CONFIG_PATH=/var/holoscan/app.yaml
ENV HOLOSCAN_APP_MANIFEST_PATH=/etc/holoscan/app.json
ENV HOLOSCAN_PKG_MANIFEST_PATH=/etc/holoscan/pkg.json
ENV HOLOSCAN_LOGS_PATH=/var/holoscan/logs
ENV PATH=/root/.local/bin:/opt/nvidia/holoscan:$PATH
ENV LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/libtorch/1.13.1/lib/:/opt/nvidia/holoscan/lib

RUN apt-get update \
    && apt-get install -y curl jq \
    && rm -rf /var/lib/apt/lists/*

ENV PYTHONPATH="/opt/holoscan/app:$PYTHONPATH"



RUN groupadd -g $GID $UNAME
RUN useradd -rm -d /home/$UNAME -s /bin/bash -g $GID -G sudo -u $UID $UNAME
RUN chown -R holoscan /var/holoscan 
RUN chown -R holoscan /var/holoscan/input 
RUN chown -R holoscan /var/holoscan/output 

# Set the working directory
WORKDIR /var/holoscan

# Copy HAP/MAP tool script
COPY ./tools /var/holoscan/tools
RUN chmod +x /var/holoscan/tools


# Copy gRPC health probe

USER $UNAME

ENV PATH=/root/.local/bin:/home/holoscan/.local/bin:/opt/nvidia/holoscan:$PATH

COPY ./pip/requirements.txt /tmp/requirements.txt

RUN pip install --upgrade pip
RUN pip install --no-cache-dir --user -r /tmp/requirements.txt

# Install Holoscan from PyPI org
RUN pip install holoscan==0.6.0


# Copy user-specified MONAI Deploy SDK file
COPY ./monai_deploy_app_sdk-0.5.1+22.g029f8bc.dirty-py3-none-any.whl /tmp/monai_deploy_app_sdk-0.5.1+22.g029f8bc.dirty-py3-none-any.whl
RUN pip install /tmp/monai_deploy_app_sdk-0.5.1+22.g029f8bc.dirty-py3-none-any.whl




COPY ./models  /opt/holoscan/models

COPY ./map/app.json /etc/holoscan/app.json
COPY ./app.config /var/holoscan/app.yaml
COPY ./map/pkg.json /etc/holoscan/pkg.json

COPY ./app /opt/holoscan/app

ENTRYPOINT ["/var/holoscan/tools"]
=========== End Dockerfile ===========

[2023-08-30 01:20:50,889] [INFO] (packager.builder) - 
===============================================================================
Building image for:                 x64-workstation
    Architecture:                   linux/amd64
    Base Image:                     nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0-dgpu
    Build Image:                    N/A  
    Cache:                          Enabled
    Configuration:                  dgpu
    Holoiscan SDK Package:          pypi.org
    MONAI Deploy App SDK Package:   /home/mqin/src/monai-deploy-app-sdk/dist/monai_deploy_app_sdk-0.5.1+22.g029f8bc.dirty-py3-none-any.whl
    gRPC Health Probe:              N/A
    SDK Version:                    0.6.0
    SDK:                            monai-deploy
    Tag:                            my_app-x64-workstation-dgpu-linux-amd64:1.0
    
[2023-08-30 01:20:51,572] [INFO] (common) - Using existing Docker BuildKit builder `holoscan_app_builder`
[2023-08-30 01:20:51,573] [DEBUG] (packager.builder) - Building Holoscan Application Package: tag=my_app-x64-workstation-dgpu-linux-amd64:1.0
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 2.67kB done
#1 DONE 0.1s

#2 [internal] load .dockerignore
#2 transferring context: 1.79kB done
#2 DONE 0.1s

#3 [internal] load metadata for nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0-dgpu
#3 DONE 0.4s

#4 [internal] load build context
#4 DONE 0.0s

#5 importing cache manifest from local:2428133242780292460
#5 DONE 0.0s

#6 importing cache manifest from nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0-dgpu
#6 DONE 0.9s

#7 [ 1/22] FROM nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0-dgpu@sha256:9653f80f241fd542f25afbcbcf7a0d02ed7e5941c79763e69def5b1e6d9fb7bc
#7 resolve nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0-dgpu@sha256:9653f80f241fd542f25afbcbcf7a0d02ed7e5941c79763e69def5b1e6d9fb7bc
#7 resolve nvcr.io/nvidia/clara-holoscan/holoscan:v0.6.0-dgpu@sha256:9653f80f241fd542f25afbcbcf7a0d02ed7e5941c79763e69def5b1e6d9fb7bc 0.1s done
#7 DONE 0.1s

#4 [internal] load build context
#4 transferring context: 19.57MB 0.2s done
#4 DONE 0.3s

#8 [ 6/22] RUN chown -R holoscan /var/holoscan
#8 CACHED

#9 [16/22] COPY ./monai_deploy_app_sdk-0.5.1+22.g029f8bc.dirty-py3-none-any.whl /tmp/monai_deploy_app_sdk-0.5.1+22.g029f8bc.dirty-py3-none-any.whl
#9 CACHED

#10 [17/22] RUN pip install /tmp/monai_deploy_app_sdk-0.5.1+22.g029f8bc.dirty-py3-none-any.whl
#10 CACHED

#11 [19/22] COPY ./map/app.json /etc/holoscan/app.json
#11 CACHED

#12 [ 5/22] RUN useradd -rm -d /home/holoscan -s /bin/bash -g 1000 -G sudo -u 1000 holoscan
#12 CACHED

#13 [14/22] RUN pip install --no-cache-dir --user -r /tmp/requirements.txt
#13 CACHED

#14 [ 9/22] WORKDIR /var/holoscan
#14 CACHED

#15 [ 8/22] RUN chown -R holoscan /var/holoscan/output
#15 CACHED

#16 [13/22] RUN pip install --upgrade pip
#16 CACHED

#17 [15/22] RUN pip install holoscan==0.6.0
#17 CACHED

#18 [10/22] COPY ./tools /var/holoscan/tools
#18 CACHED

#19 [18/22] COPY ./models  /opt/holoscan/models
#19 CACHED

#20 [20/22] COPY ./app.config /var/holoscan/app.yaml
#20 CACHED

#21 [12/22] COPY ./pip/requirements.txt /tmp/requirements.txt
#21 CACHED

#22 [ 7/22] RUN chown -R holoscan /var/holoscan/input
#22 CACHED

#23 [ 4/22] RUN groupadd -g 1000 holoscan
#23 CACHED

#24 [ 3/22] RUN apt-get update     && apt-get install -y curl jq     && rm -rf /var/lib/apt/lists/*
#24 CACHED

#25 [ 2/22] RUN mkdir -p /etc/holoscan/         && mkdir -p /opt/holoscan/         && mkdir -p /var/holoscan         && mkdir -p /opt/holoscan/app         && mkdir -p /var/holoscan/input         && mkdir -p /var/holoscan/output
#25 CACHED

#26 [11/22] RUN chmod +x /var/holoscan/tools
#26 CACHED

#27 [21/22] COPY ./map/pkg.json /etc/holoscan/pkg.json
#27 CACHED

#28 [22/22] COPY ./app /opt/holoscan/app
#28 DONE 0.3s

#29 exporting to docker image format
#29 exporting layers
#29 exporting layers 0.2s done
#29 exporting manifest sha256:84725c6be7300f1d3487cf953efea1b7123df1b79dc893f79dd41e9b714cc971 0.0s done
#29 exporting config sha256:716356b4f3c03984961a626a47638b2538cf18516f267b54c7d0f502aa0ab077
#29 exporting config sha256:716356b4f3c03984961a626a47638b2538cf18516f267b54c7d0f502aa0ab077 0.0s done
#29 sending tarball
#29 ...

#30 importing to docker
#30 DONE 0.8s

#29 exporting to docker image format
#29 sending tarball 54.7s done
#29 DONE 54.9s

#31 exporting content cache
#31 preparing build cache for export
#31 writing layer sha256:0709800848b4584780b40e7e81200689870e890c38b54e96b65cd0a3b1942f2d done
#31 writing layer sha256:0ce020987cfa5cd1654085af3bb40779634eb3d792c4a4d6059036463ae0040d done
#31 writing layer sha256:0f4bc5775dfef844ad94316d6cba08f7430019a5986278e18978fdf8fd6370d0 done
#31 writing layer sha256:0f65089b284381bf795d15b1a186e2a8739ea957106fa526edef0d738e7cda70 done
#31 writing layer sha256:12a47450a9f9cc5d4edab65d0f600dbbe8b23a1663b0b3bb2c481d40e074b580 done
#31 writing layer sha256:1de965777e2e37c7fabe00bdbf3d0203ca83ed30a71a5479c3113fe4fc48c4bb done
#31 writing layer sha256:1e6d878a29f0eee28390766120813fdf36893f516bcc029e698cd941eeb79616 done
#31 writing layer sha256:24b5aa2448e920814dd67d7d3c0169b2cdacb13c4048d74ded3b4317843b13ff done
#31 writing layer sha256:2789e1f0e19719b047679b4b490cab1edb9e151cd286aed22df08022c249f040 done
#31 writing layer sha256:2d42104dbf0a7cc962b791f6ab4f45a803f8a36d296f996aca180cfb2f3e30d0 done
#31 writing layer sha256:2fa1ce4fa3fec6f9723380dc0536b7c361d874add0baaddc4bbf2accac82d2ff done
#31 writing layer sha256:38794be1b5dc99645feabf89b22cd34fb5bdffb5164ad920e7df94f353efe9c0 done
#31 writing layer sha256:38f963dc57c1e7b68a738fe39ed9f9345df7188111a047e2163a46648d7f1d88 done
#31 writing layer sha256:3e7e4c9bc2b136814c20c04feb4eea2b2ecf972e20182d88759931130cfb4181 done
#31 writing layer sha256:3fd77037ad585442cd82d64e337f49a38ddba50432b2a1e563a48401d25c79e6 done
#31 writing layer sha256:41814ed91034b30ac9c44dfc604a4bade6138005ccf682372c02e0bead66dbc0
#31 writing layer sha256:41814ed91034b30ac9c44dfc604a4bade6138005ccf682372c02e0bead66dbc0 done
#31 writing layer sha256:45893188359aca643d5918c9932da995364dc62013dfa40c075298b1baabece3 done
#31 writing layer sha256:49bc651b19d9e46715c15c41b7c0daa007e8e25f7d9518f04f0f06592799875a done
#31 writing layer sha256:4c12db5118d8a7d909e4926d69a2192d2b3cd8b110d49c7504a4f701258c1ccc done
#31 writing layer sha256:4cc43a803109d6e9d1fd35495cef9b1257035f5341a2db54f7a1940815b6cc65 done
#31 writing layer sha256:4d32b49e2995210e8937f0898327f196d3fcc52486f0be920e8b2d65f150a7ab done
#31 writing layer sha256:4d6fe980bad9cd7b2c85a478c8033cae3d098a81f7934322fb64658b0c8f9854 done
#31 writing layer sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1 done
#31 writing layer sha256:50b2500ad4a5ad2f73d71f4dedecabff852c74ea78a97dab0fc86b2ed44ddc77 done
#31 writing layer sha256:5150182f1ff123399b300ca469e00f6c4d82e1b9b72652fb8ee7eab370245236 done
#31 writing layer sha256:5450ec6233e924dcdedf28ae862b64cced3ba8d460257e793e0a31c605e8bbc8 0.0s done
#31 writing layer sha256:595c38fa102c61c3dda19bdab70dcd26a0e50465b986d022a84fa69023a05d0f done
#31 writing layer sha256:59d451175f6950740e26d38c322da0ef67cb59da63181eb32996f752ba8a2f17 done
#31 writing layer sha256:5ad1f2004580e415b998124ea394e9d4072a35d70968118c779f307204d6bd17 done
#31 writing layer sha256:5e2c1cbc09286c26c04d5b4257b11940ecdb161330319d54feadc7ef9a8dc8f6 done
#31 writing layer sha256:62598eafddf023e7f22643485f4321cbd51ff7eee743b970db12454fd3c8c675 done
#31 writing layer sha256:63d7e616a46987136f4cc9eba95db6f6327b4854cfe3c7e20fed6db0c966e380 done
#31 writing layer sha256:6939d591a6b09b14a437e5cd2d6082a52b6d76bec4f72d960440f097721da34f done
#31 writing layer sha256:698318e5a60e5e0d48c45bf992f205a9532da567fdfe94bd59be2e192975dd6f done
#31 writing layer sha256:6ddc1d0f91833b36aac1c6f0c8cea005c87d94bab132d46cc06d9b060a81cca3 done
#31 writing layer sha256:74ac1f5a47c0926bff1e997bb99985a09926f43bd0895cb27ceb5fa9e95f8720 done
#31 writing layer sha256:7577973918dd30e764733a352a93f418000bc3181163ca451b2307492c1a6ba9 done
#31 writing layer sha256:886c886d8a09d8befb92df75dd461d4f97b77d7cff4144c4223b0d2f6f2c17f2 done
#31 writing layer sha256:8a7451db9b4b817b3b33904abddb7041810a4ffe8ed4a034307d45d9ae9b3f2a done
#31 writing layer sha256:916f4054c6e7f10de4fd7c08ffc75fa23ebecca4eceb8183cb1023b33b1696c9 done
#31 writing layer sha256:9463aa3f56275af97693df69478a2dc1d171f4e763ca6f7b6f370a35e605c154 done
#31 writing layer sha256:955fd173ed884230c2eded4542d10a97384b408537be6bbb7c4ae09ccd6fb2d0 done
#31 writing layer sha256:9c42a4ee99755f441251e6043b2cbba16e49818a88775e7501ec17e379ce3cfd done
#31 writing layer sha256:9c63be0a86e3dc4168db3814bf464e40996afda0031649d9faa8ff7568c3154f done
#31 writing layer sha256:9e04bda98b05554953459b5edef7b2b14d32f1a00b979a23d04b6eb5c191e66b done
#31 writing layer sha256:a4a0c690bc7da07e592514dccaa26098a387e8457f69095e922b6d73f7852502 done
#31 writing layer sha256:a4aafbc094d78a85bef41036173eb816a53bcd3e2564594a32f542facdf2aba6 done
#31 writing layer sha256:ae36a4d38b76948e39a5957025c984a674d2de18ce162a8caaa536e6f06fccea done
#31 writing layer sha256:b2fa40114a4a0725c81b327df89c0c3ed5c05ca9aa7f1157394d5096cf5460ce done
#31 writing layer sha256:b48a5fafcaba74eb5d7e7665601509e2889285b50a04b5b639a23f8adc818157 done
#31 writing layer sha256:c657dd855c8726b050f2b5bd6f4999883fff6803fe9f22add96f6d3ff89cd477 done
#31 writing layer sha256:c86976a083599e36a6441f36f553627194d05ea82bb82a78682e718fe62fccf6 done
#31 writing layer sha256:cb506fbdedc817e3d074f609e2edbf9655aacd7784610a1bbac52f2d7be25438 done
#31 writing layer sha256:d2a6fe65a1f84edb65b63460a75d1cac1aa48b72789006881b0bcfd54cd01ffd done
#31 writing layer sha256:d2cafa18c788d3e44592cf8dcabf80e138db8389aa89e765550691199861d4fe done
#31 writing layer sha256:d6a198fd2a224cb803248e86953a164439f1a64889df0861dc5cc7eef4c66664 done
#31 writing layer sha256:d8d16d6af76dc7c6b539422a25fdad5efb8ada5a8188069fcd9d113e3b783304 done
#31 writing layer sha256:ddc2ade4f6fe866696cb638c8a102cb644fa842c2ca578392802b3e0e5e3bcb7 done
#31 writing layer sha256:e2cfd7f6244d6f35befa6bda1caa65f1786cecf3f00ef99d7c9a90715ce6a03c done
#31 writing layer sha256:e42e7ccc889dd8eabf5148a4e91eb843e32688cf109fa7c074d87862f8da5da0 done
#31 writing layer sha256:e94a4481e9334ff402bf90628594f64a426672debbdfb55f1290802e52013907
#31 preparing build cache for export 0.7s done
#31 writing layer sha256:e94a4481e9334ff402bf90628594f64a426672debbdfb55f1290802e52013907 done
#31 writing layer sha256:eaf45e9f32d1f5a9983945a1a9f8dedbb475bc0f578337610e00b4dedec87c20 done
#31 writing layer sha256:eb411bef39c013c9853651e68f00965dbd826d829c4e478884a2886976e9c989 done
#31 writing layer sha256:edfe4a95eb6bd3142aeda941ab871ffcc8c19cf50c33561c210ba8ead2424759 done
#31 writing layer sha256:ef4466d6f927d29d404df9c5af3ef5733c86fa14e008762c90110b963978b1e7 done
#31 writing layer sha256:f346e3ecdf0bee048fa1e3baf1d3128ff0283b903f03e97524944949bd8882e5 done
#31 writing layer sha256:f3f9a00a1ce9aadda250aacb3e66a932676badc5d8519c41517fdf7ea14c13ed done
#31 writing layer sha256:f7a50dafd51c2bcaad0ede31fbf29c38fe66776ade008a7fbdb07dba39de7f97 done
#31 writing layer sha256:fd849d9bd8889edd43ae38e9f21a912430c8526b2c18f3057a3b2cd74eb27b31 done
#31 writing config sha256:b0a64afdeb53276373de9d6facb2d12c84bc72fa642ca0ff57e9fd720b1e7168 0.0s done
#31 writing manifest sha256:ec78676329581462682bcf9e88a75e1c58d7ddb5232df774218a8c110a6ed892 0.0s done
#31 DONE 0.7s
[2023-08-30 01:21:50,557] [INFO] (packager) - Build Summary:

Platform: x64-workstation/dgpu
    Status:     Succeeded
    Docker Tag: my_app-x64-workstation-dgpu-linux-amd64:1.0
    Tarball:    None

We can see that the MAP Docker image has been created.

!docker image ls | grep {tag_prefix}
my_app-x64-workstation-dgpu-linux-amd64                   1.0                        716356b4f3c0   58 seconds ago      15.4GB
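
As a quick check of the MAP metadata, the OCI image labels that the Packager wrote (seen in the generated Dockerfile above) can be read back from the image. This is a hedged sketch using the Docker CLI via subprocess and the tag produced in this run:

import json
import subprocess

# docker image inspect returns a JSON array with one entry per image
info = json.loads(subprocess.check_output(
    ["docker", "image", "inspect", "my_app-x64-workstation-dgpu-linux-amd64:1.0"]
))
print(json.dumps(info[0]["Config"]["Labels"], indent=2))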

We can display and inspect the MAP manifests by running the container with the show command. Furthermore, we can extract the manifests and other contents of the MAP by using the extract command while mapping a specific host folder into the container (we know that our MAP is compliant and supports these commands).

Note

The host folder for storing the extracted content must first be created by the user. If it has instead been created by Docker when running the container, the folder needs to be deleted and re-created.

!echo "Display manifests and extract MAP contents to the host folder, ./export"
!docker run --rm {tag_prefix}-x64-workstation-dgpu-linux-amd64:1.0 show
!rm -rf `pwd`/export && mkdir -p `pwd`/export
!docker run --rm -v `pwd`/export/:/var/run/holoscan/export/ {tag_prefix}-x64-workstation-dgpu-linux-amd64:1.0 extract
!ls `pwd`/export
Display manifests and extract MAP contents to the host folder, ./export

============================== app.json ==============================
{
  "apiVersion": "1.0.0",
  "command": "[\"python3\", \"/opt/holoscan/app\"]",
  "environment": {
    "HOLOSCAN_APPLICATION": "/opt/holoscan/app",
    "HOLOSCAN_INPUT_PATH": "input/",
    "HOLOSCAN_OUTPUT_PATH": "output/",
    "HOLOSCAN_WORKDIR": "/var/holoscan",
    "HOLOSCAN_MODEL_PATH": "/opt/holoscan/models",
    "HOLOSCAN_CONFIG_PATH": "/var/holoscan/app.yaml",
    "HOLOSCAN_APP_MANIFEST_PATH": "/etc/holoscan/app.json",
    "HOLOSCAN_PKG_MANIFEST_PATH": "/etc/holoscan/pkg.json",
    "HOLOSCAN_DOCS_PATH": "/opt/holoscan/docs",
    "HOLOSCAN_LOGS_PATH": "/var/holoscan/logs"
  },
  "input": {
    "path": "input/",
    "formats": null
  },
  "liveness": null,
  "output": {
    "path": "output/",
    "formats": null
  },
  "readiness": null,
  "sdk": "monai-deploy",
  "sdkVersion": "0.6.0",
  "timeout": 0,
  "version": 1,
  "workingDirectory": "/var/holoscan"
}

============================== pkg.json ==============================
{
  "apiVersion": "1.0.0",
  "applicationRoot": "/opt/holoscan/app",
  "modelRoot": "/opt/holoscan/models",
  "models": {
    "model": "/opt/holoscan/models"
  },
  "resources": {
    "cpu": 1,
    "gpu": 1,
    "memory": "1Gi",
    "gpuMemory": "6Gi"
  },
  "version": 1
}

2023-08-30 08:21:57 [INFO] Copying application from /opt/holoscan/app to /var/run/holoscan/export/app

2023-08-30 08:21:57 [INFO] Copying application manifest file from /etc/holoscan/app.json to /var/run/holoscan/export/config/app.json
2023-08-30 08:21:57 [INFO] Copying pkg manifest file from /etc/holoscan/pkg.json to /var/run/holoscan/export/config/pkg.json
2023-08-30 08:21:57 [INFO] Copying application configuration from /var/holoscan/app.yaml to /var/run/holoscan/export/config/app.yaml

2023-08-30 08:21:57 [INFO] Copying models from /opt/holoscan/models to /var/run/holoscan/export/models

2023-08-30 08:21:57 [INFO] Copying documentation from /opt/holoscan/docs/ to /var/run/holoscan/export/docs
2023-08-30 08:21:57 [INFO] '/opt/holoscan/docs/' cannot be found.

app  config  models
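
With the MAP contents extracted, the manifests can also be read back programmatically on the host. A minimal sketch, assuming the ./export folder populated above:

import json
from pathlib import Path

export_dir = Path("export")
# The extract command copies the manifests into export/config/
app_manifest = json.loads((export_dir / "config" / "app.json").read_text())
pkg_manifest = json.loads((export_dir / "config" / "pkg.json").read_text())
print("Command:  ", app_manifest["command"])
print("Models:   ", pkg_manifest["models"])
print("Resources:", pkg_manifest["resources"])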

Executing packaged app locally

The packaged app can be run locally with the MONAI Application Runner.
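
The HOLOSCAN_INPUT_PATH and HOLOSCAN_OUTPUT_PATH environment variables are assumed to have been set earlier in this notebook. If they are not defined in the current session, they can be set first; the folder names below are examples and should point to the local DICOM input folder and the desired output folder.

%env HOLOSCAN_INPUT_PATH=dcm
%env HOLOSCAN_OUTPUT_PATH=output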

# Clear the output folder and run the MAP. The input is expected to be a folder.
!rm -rf $HOLOSCAN_OUTPUT_PATH
!monai-deploy run -i $HOLOSCAN_INPUT_PATH -o $HOLOSCAN_OUTPUT_PATH my_app-x64-workstation-dgpu-linux-amd64:1.0
[2023-08-30 01:22:01,097] [INFO] (runner) - Checking dependencies...
[2023-08-30 01:22:01,097] [INFO] (runner) - --> Verifying if "docker" is installed...

[2023-08-30 01:22:01,098] [INFO] (runner) - --> Verifying if "docker-buildx" is installed...

[2023-08-30 01:22:01,098] [INFO] (runner) - --> Verifying if "my_app-x64-workstation-dgpu-linux-amd64:1.0" is available...

[2023-08-30 01:22:01,170] [INFO] (runner) - Reading HAP/MAP manifest...
Preparing to copy... Copying from container - 0B... Successfully copied 2.56kB to /tmp/tmpp_510e7z/app.json
Preparing to copy... Copying from container - 0B... Successfully copied 2.05kB to /tmp/tmpp_510e7z/pkg.json
[2023-08-30 01:22:01,356] [INFO] (runner) - --> Verifying if "nvidia-ctk" is installed...

[2023-08-30 01:22:01,563] [INFO] (common) - Launching container (03a25b708327) using image 'my_app-x64-workstation-dgpu-linux-amd64:1.0'...
    container name:      elated_heyrovsky
    host name:           mingq-dt
    network:             host
    user:                1000:1000
    ulimits:             memlock=-1:-1, stack=67108864:67108864
    cap_add:             CAP_SYS_PTRACE
    ipc mode:            host
    shared memory size:  67108864
    devices:             
2023-08-30 08:22:02 [INFO] Launching application python3 /opt/holoscan/app ...

[2023-08-30 08:22:07,078] [INFO] (root) - Parsed args: Namespace(argv=['/opt/holoscan/app'], input=None, log_level=None, model=None, output=None, workdir=None)

[2023-08-30 08:22:07,081] [INFO] (root) - AppContext object: AppContext(input_path=/var/holoscan/input, output_path=/var/holoscan/output, model_path=/opt/holoscan/models, workdir=/var/holoscan)

[2023-08-30 08:22:07,082] [INFO] (root) - End compose

[info] [app_driver.cpp:1025] Launching the driver/health checking service

[info] [gxf_executor.cpp:210] Creating context

[info] [server.cpp:73] Health checking server listening on 0.0.0.0:8777

[info] [gxf_executor.cpp:1595] Loading extensions from configs...

[info] [gxf_executor.cpp:1741] Activating Graph...

[info] [gxf_executor.cpp:1771] Running Graph...

[info] [gxf_executor.cpp:1773] Waiting for completion...

[info] [gxf_executor.cpp:1774] Graph execution waiting. Fragment: 

[info] [greedy_scheduler.cpp:190] Scheduling 8 entities

[2023-08-30 08:22:07,211] [INFO] (monai.deploy.operators.dicom_data_loader_operator.DICOMDataLoaderOperator) - No or invalid input path from the optional input port: None

[2023-08-30 08:22:07,580] [INFO] (root) - Finding series for Selection named: CT Series

[2023-08-30 08:22:07,580] [INFO] (root) - Searching study, : 1.3.6.1.4.1.14519.5.2.1.7085.2626.822645453932810382886582736291

  # of series: 1

[2023-08-30 08:22:07,580] [INFO] (root) - Working on series, instance UID: 1.3.6.1.4.1.14519.5.2.1.7085.2626.119403521930927333027265674239

[2023-08-30 08:22:07,580] [INFO] (root) - On attribute: 'StudyDescription' to match value: '(.*?)'

[2023-08-30 08:22:07,580] [INFO] (root) -     Series attribute StudyDescription value: CT ABDOMEN W IV CONTRAST

[2023-08-30 08:22:07,581] [INFO] (root) - Series attribute string value did not match. Try regEx.

[2023-08-30 08:22:07,581] [INFO] (root) - On attribute: 'Modality' to match value: '(?i)CT'

[2023-08-30 08:22:07,581] [INFO] (root) -     Series attribute Modality value: CT

[2023-08-30 08:22:07,581] [INFO] (root) - Series attribute string value did not match. Try regEx.

[2023-08-30 08:22:07,581] [INFO] (root) - On attribute: 'SeriesDescription' to match value: '(.*?)'

[2023-08-30 08:22:07,581] [INFO] (root) -     Series attribute SeriesDescription value: ABD/PANC 3.0 B31f

[2023-08-30 08:22:07,581] [INFO] (root) - Series attribute string value did not match. Try regEx.

[2023-08-30 08:22:07,581] [INFO] (root) - Selected Series, UID: 1.3.6.1.4.1.14519.5.2.1.7085.2626.119403521930927333027265674239

[2023-08-30 08:22:08,032] [INFO] (root) - Parsing from bundle_path: /opt/holoscan/models/model/model.ts

/home/holoscan/.local/lib/python3.8/site-packages/monai/utils/deprecate_utils.py:321: FutureWarning: monai.transforms.io.dictionary LoadImaged.__init__:image_only: Current default value of argument `image_only=False` has been deprecated since version 1.1. It will be changed to `image_only=True` in version 1.3.

  warn_deprecated(argname, msg, warning_category)

/home/holoscan/.local/lib/python3.8/site-packages/monai/utils/deprecate_utils.py:321: FutureWarning: monai.transforms.io.dictionary SaveImaged.__init__:resample: Current default value of argument `resample=True` has been deprecated since version 1.1. It will be changed to `resample=False` in version 1.3.

  warn_deprecated(argname, msg, warning_category)

[2023-08-30 08:22:18,434] [INFO] (monai.deploy.operators.stl_conversion_operator.STLConversionOperator) - Output will be saved in file /var/holoscan/output/stl/spleen.stl.

[2023-08-30 08:22:19,925] [INFO] (monai.deploy.operators.stl_conversion_operator.SpatialImage) - 3D image

[2023-08-30 08:22:19,925] [INFO] (monai.deploy.operators.stl_conversion_operator.STLConverter) - Image ndarray shape:(204, 512, 512)

Exception occurred for operator: 'stl_conversion_op'

Traceback (most recent call last):

  File "/home/holoscan/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 118, in compute

    stl_bytes = self._convert(input_image, _output_file)

  File "/home/holoscan/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 135, in _convert

    return self._converter.convert(

  File "/home/holoscan/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 182, in convert

    nda = STLConverter.get_largest_cc(nda)

  File "/home/holoscan/.local/lib/python3.8/site-packages/monai/deploy/operators/stl_conversion_operator.py", line 255, in get_largest_cc

    labels = label(nda)

  File "/home/holoscan/.local/lib/python3.8/site-packages/monai/deploy/utils/importutil.py", line 274, in __call__

    raise self._exception

  File "/home/holoscan/.local/lib/python3.8/site-packages/monai/deploy/utils/importutil.py", line 226, in optional_import

    pkg = __import__(module)  # top level module

monai.deploy.utils.importutil.OptionalImportError: from skimage.measure import label (No module named 'skimage').



For details about installing the optional dependencies, please visit:

    https://docs.monai.io/en/latest/installation.html#installing-the-recommended-dependencies

/home/holoscan/.local/lib/python3.8/site-packages/highdicom/valuerep.py:54: UserWarning: The string "C3N-00198" is unlikely to represent the intended person name since it contains only a single component. Construct a person name according to the format in described in http://dicom.nema.org/dicom/2013/output/chtml/part05/sect_6.2.html#sect_6.2.1.2, or, in pydicom 2.2.0 or later, use the pydicom.valuerep.PersonName.from_named_components() method to construct the person name correctly. If a single-component name is really intended, add a trailing caret character to disambiguate the name.

  warnings.warn(

[2023-08-30 08:22:22,121] [INFO] (highdicom.seg.sop) - add plane #0 for segment #1

/home/holoscan/.local/lib/python3.8/site-packages/pydicom/valuerep.py:443: UserWarning: A value of type 'int64' cannot be assigned to a tag with VR UL.

  warnings.warn(msg)

/home/holoscan/.local/lib/python3.8/site-packages/pydicom/valuerep.py:443: UserWarning: A value of type 'int64' cannot be assigned to a tag with VR US.

  warnings.warn(msg)

[2023-08-30 08:22:22,124] [INFO] (highdicom.seg.sop) - add plane #1 for segment #1

[2023-08-30 08:22:22,125] [INFO] (highdicom.seg.sop) - add plane #2 for segment #1

[2023-08-30 08:22:22,126] [INFO] (highdicom.seg.sop) - add plane #3 for segment #1

[2023-08-30 08:22:22,127] [INFO] (highdicom.seg.sop) - add plane #4 for segment #1

[2023-08-30 08:22:22,127] [INFO] (highdicom.seg.sop) - add plane #5 for segment #1

[2023-08-30 08:22:22,128] [INFO] (highdicom.seg.sop) - add plane #6 for segment #1

[2023-08-30 08:22:22,129] [INFO] (highdicom.seg.sop) - add plane #7 for segment #1

[2023-08-30 08:22:22,129] [INFO] (highdicom.seg.sop) - add plane #8 for segment #1

[2023-08-30 08:22:22,130] [INFO] (highdicom.seg.sop) - add plane #9 for segment #1

[2023-08-30 08:22:22,131] [INFO] (highdicom.seg.sop) - add plane #10 for segment #1

[2023-08-30 08:22:22,131] [INFO] (highdicom.seg.sop) - add plane #11 for segment #1

[2023-08-30 08:22:22,132] [INFO] (highdicom.seg.sop) - add plane #12 for segment #1

[2023-08-30 08:22:22,133] [INFO] (highdicom.seg.sop) - add plane #13 for segment #1

[2023-08-30 08:22:22,133] [INFO] (highdicom.seg.sop) - add plane #14 for segment #1

[2023-08-30 08:22:22,134] [INFO] (highdicom.seg.sop) - add plane #15 for segment #1

[2023-08-30 08:22:22,134] [INFO] (highdicom.seg.sop) - add plane #16 for segment #1

[2023-08-30 08:22:22,135] [INFO] (highdicom.seg.sop) - add plane #17 for segment #1

[2023-08-30 08:22:22,136] [INFO] (highdicom.seg.sop) - add plane #18 for segment #1

[2023-08-30 08:22:22,137] [INFO] (highdicom.seg.sop) - add plane #19 for segment #1

[2023-08-30 08:22:22,137] [INFO] (highdicom.seg.sop) - add plane #20 for segment #1

[2023-08-30 08:22:22,138] [INFO] (highdicom.seg.sop) - add plane #21 for segment #1

[2023-08-30 08:22:22,139] [INFO] (highdicom.seg.sop) - add plane #22 for segment #1

[2023-08-30 08:22:22,140] [INFO] (highdicom.seg.sop) - add plane #23 for segment #1

[2023-08-30 08:22:22,141] [INFO] (highdicom.seg.sop) - add plane #24 for segment #1

[2023-08-30 08:22:22,141] [INFO] (highdicom.seg.sop) - add plane #25 for segment #1

[2023-08-30 08:22:22,142] [INFO] (highdicom.seg.sop) - add plane #26 for segment #1

[2023-08-30 08:22:22,143] [INFO] (highdicom.seg.sop) - add plane #27 for segment #1

[2023-08-30 08:22:22,144] [INFO] (highdicom.seg.sop) - add plane #28 for segment #1

[2023-08-30 08:22:22,145] [INFO] (highdicom.seg.sop) - add plane #29 for segment #1

[2023-08-30 08:22:22,146] [INFO] (highdicom.seg.sop) - add plane #30 for segment #1

[2023-08-30 08:22:22,147] [INFO] (highdicom.seg.sop) - add plane #31 for segment #1

[2023-08-30 08:22:22,148] [INFO] (highdicom.seg.sop) - add plane #32 for segment #1

[2023-08-30 08:22:22,148] [INFO] (highdicom.seg.sop) - add plane #33 for segment #1

[2023-08-30 08:22:22,149] [INFO] (highdicom.seg.sop) - add plane #34 for segment #1

[2023-08-30 08:22:22,150] [INFO] (highdicom.seg.sop) - add plane #35 for segment #1

[2023-08-30 08:22:22,151] [INFO] (highdicom.seg.sop) - add plane #36 for segment #1

[2023-08-30 08:22:22,152] [INFO] (highdicom.seg.sop) - add plane #37 for segment #1

[2023-08-30 08:22:22,152] [INFO] (highdicom.seg.sop) - add plane #38 for segment #1

[2023-08-30 08:22:22,153] [INFO] (highdicom.seg.sop) - add plane #39 for segment #1

[2023-08-30 08:22:22,153] [INFO] (highdicom.seg.sop) - add plane #40 for segment #1

[2023-08-30 08:22:22,154] [INFO] (highdicom.seg.sop) - add plane #41 for segment #1

[2023-08-30 08:22:22,155] [INFO] (highdicom.seg.sop) - add plane #42 for segment #1

[2023-08-30 08:22:22,155] [INFO] (highdicom.seg.sop) - add plane #43 for segment #1

[2023-08-30 08:22:22,156] [INFO] (highdicom.seg.sop) - add plane #44 for segment #1

[2023-08-30 08:22:22,157] [INFO] (highdicom.seg.sop) - add plane #45 for segment #1

[2023-08-30 08:22:22,158] [INFO] (highdicom.seg.sop) - add plane #46 for segment #1

[2023-08-30 08:22:22,158] [INFO] (highdicom.seg.sop) - add plane #47 for segment #1

[2023-08-30 08:22:22,159] [INFO] (highdicom.seg.sop) - add plane #48 for segment #1

[2023-08-30 08:22:22,160] [INFO] (highdicom.seg.sop) - add plane #49 for segment #1

[2023-08-30 08:22:22,160] [INFO] (highdicom.seg.sop) - add plane #50 for segment #1

[2023-08-30 08:22:22,161] [INFO] (highdicom.seg.sop) - add plane #51 for segment #1

[2023-08-30 08:22:22,162] [INFO] (highdicom.seg.sop) - add plane #52 for segment #1

[2023-08-30 08:22:22,162] [INFO] (highdicom.seg.sop) - add plane #53 for segment #1

[2023-08-30 08:22:22,163] [INFO] (highdicom.seg.sop) - add plane #54 for segment #1

[2023-08-30 08:22:22,163] [INFO] (highdicom.seg.sop) - add plane #55 for segment #1

[2023-08-30 08:22:22,164] [INFO] (highdicom.seg.sop) - add plane #56 for segment #1

[2023-08-30 08:22:22,165] [INFO] (highdicom.seg.sop) - add plane #57 for segment #1

[2023-08-30 08:22:22,165] [INFO] (highdicom.seg.sop) - add plane #58 for segment #1

[2023-08-30 08:22:22,166] [INFO] (highdicom.seg.sop) - add plane #59 for segment #1

[2023-08-30 08:22:22,167] [INFO] (highdicom.seg.sop) - add plane #60 for segment #1

[2023-08-30 08:22:22,167] [INFO] (highdicom.seg.sop) - add plane #61 for segment #1

[2023-08-30 08:22:22,168] [INFO] (highdicom.seg.sop) - add plane #62 for segment #1

[2023-08-30 08:22:22,168] [INFO] (highdicom.seg.sop) - add plane #63 for segment #1

[2023-08-30 08:22:22,169] [INFO] (highdicom.seg.sop) - add plane #64 for segment #1

[2023-08-30 08:22:22,170] [INFO] (highdicom.seg.sop) - add plane #65 for segment #1

[2023-08-30 08:22:22,170] [INFO] (highdicom.seg.sop) - add plane #66 for segment #1

[2023-08-30 08:22:22,171] [INFO] (highdicom.seg.sop) - add plane #67 for segment #1

[2023-08-30 08:22:22,172] [INFO] (highdicom.seg.sop) - add plane #68 for segment #1

[2023-08-30 08:22:22,172] [INFO] (highdicom.seg.sop) - add plane #69 for segment #1

[2023-08-30 08:22:22,173] [INFO] (highdicom.seg.sop) - add plane #70 for segment #1

[2023-08-30 08:22:22,173] [INFO] (highdicom.seg.sop) - add plane #71 for segment #1

[2023-08-30 08:22:22,174] [INFO] (highdicom.seg.sop) - add plane #72 for segment #1

[2023-08-30 08:22:22,175] [INFO] (highdicom.seg.sop) - add plane #73 for segment #1

[2023-08-30 08:22:22,175] [INFO] (highdicom.seg.sop) - add plane #74 for segment #1

[2023-08-30 08:22:22,176] [INFO] (highdicom.seg.sop) - add plane #75 for segment #1

[2023-08-30 08:22:22,177] [INFO] (highdicom.seg.sop) - add plane #76 for segment #1

[2023-08-30 08:22:22,178] [INFO] (highdicom.seg.sop) - add plane #77 for segment #1

[2023-08-30 08:22:22,178] [INFO] (highdicom.seg.sop) - add plane #78 for segment #1

[2023-08-30 08:22:22,179] [INFO] (highdicom.seg.sop) - add plane #79 for segment #1

[2023-08-30 08:22:22,180] [INFO] (highdicom.seg.sop) - add plane #80 for segment #1

[2023-08-30 08:22:22,180] [INFO] (highdicom.seg.sop) - add plane #81 for segment #1

[2023-08-30 08:22:22,181] [INFO] (highdicom.seg.sop) - add plane #82 for segment #1

[2023-08-30 08:22:22,181] [INFO] (highdicom.seg.sop) - add plane #83 for segment #1

[2023-08-30 08:22:22,182] [INFO] (highdicom.seg.sop) - add plane #84 for segment #1

[2023-08-30 08:22:22,183] [INFO] (highdicom.seg.sop) - add plane #85 for segment #1

[2023-08-30 08:22:22,183] [INFO] (highdicom.seg.sop) - add plane #86 for segment #1

[2023-08-30 08:22:22,223] [INFO] (highdicom.base) - copy Image-related attributes from dataset "1.3.6.1.4.1.14519.5.2.1.7085.2626.936983343951485811186213470191"

[2023-08-30 08:22:22,224] [INFO] (highdicom.base) - copy attributes of module "Specimen"

[2023-08-30 08:22:22,224] [INFO] (highdicom.base) - copy Patient-related attributes from dataset "1.3.6.1.4.1.14519.5.2.1.7085.2626.936983343951485811186213470191"

[2023-08-30 08:22:22,224] [INFO] (highdicom.base) - copy attributes of module "Patient"

[2023-08-30 08:22:22,224] [INFO] (highdicom.base) - copy attributes of module "Clinical Trial Subject"

[2023-08-30 08:22:22,224] [INFO] (highdicom.base) - copy Study-related attributes from dataset "1.3.6.1.4.1.14519.5.2.1.7085.2626.936983343951485811186213470191"

[2023-08-30 08:22:22,224] [INFO] (highdicom.base) - copy attributes of module "General Study"

[2023-08-30 08:22:22,224] [INFO] (highdicom.base) - copy attributes of module "Patient Study"

[2023-08-30 08:22:22,225] [INFO] (highdicom.base) - copy attributes of module "Clinical Trial Study"

[info] [greedy_scheduler.cpp:369] Scheduler stopped: Some entities are waiting for execution, but there are no periodic or async entities to get out of the deadlock.

[info] [greedy_scheduler.cpp:398] Scheduler finished.

[info] [gxf_executor.cpp:1783] Graph execution deactivating. Fragment: 

[info] [gxf_executor.cpp:1784] Deactivating Graph...

[info] [gxf_executor.cpp:1787] Graph execution finished. Fragment: 

[2023-08-30 08:22:22,346] [INFO] (app.AISpleenSegApp) - End run

[2023-08-30 01:22:23,709] [INFO] (common) - Container 'elated_heyrovsky'(03a25b708327) exited.
!ls $HOLOSCAN_OUTPUT_PATH
1.2.826.0.1.3680043.10.511.3.39359760221330773075218270807121109.dcm  stl
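
Note that, unlike the earlier run of the application in the notebook's own environment, the STL conversion inside the MAP raised an OptionalImportError because the skimage module is not installed in the container image. If the STL output is wanted from the packaged app as well, one possible fix (a suggestion, not part of the original requirements above) is to append scikit-image to the requirements file and rebuild the MAP with the same package command:

%%writefile -a my_app/requirements.txt
scikit-image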