monailabel.interfaces.tasks.infer module¶
- class monailabel.interfaces.tasks.infer.InferTask(path, network, type, labels, dimension, description, model_state_dict='model', input_key='image', output_label_key='pred', output_json_key='result', config=None)[source]¶
Bases:
object
Basic Inference Task Helper
- Parameters
path – Model File Path. Supports multiple paths to support versions (the last item is picked as the latest)
network – Model Network (e.g. monai.networks.xyz). Use None if you are using TorchScript (torch.jit).
type (InferType) – Type of Infer (segmentation, deepgrow, etc.)
labels – Labels supported by the model
dimension – Input dimension
description – Description
model_state_dict – Key for loading the model state from checkpoint
input_key – Input key for running inference
output_label_key – Output key for storing result/label of inference
output_json_key – Output key for storing the result JSON of inference
config – K,V pairs to be part of user config
- __init__(path, network, type, labels, dimension, description, model_state_dict='model', input_key='image', output_label_key='pred', output_json_key='result', config=None)[source]¶
- Parameters
path – Model File Path. Supports multiple paths to support versions (the last item is picked as the latest)
network – Model Network (e.g. monai.networks.xyz). Use None if you are using TorchScript (torch.jit).
type (InferType) – Type of Infer (segmentation, deepgrow, etc.)
labels – Labels supported by the model
dimension – Input dimension
description – Description
model_state_dict – Key for loading the model state from checkpoint
input_key – Input key for running inference
output_label_key – Output key for storing result/label of inference
output_json_key – Output key for storing the result JSON of inference
config – K,V pairs to be part of user config
- abstract inferer()[source]¶
Provide Inferer Class
For Example:
return monai.inferers.SlidingWindowInferer(roi_size=[160, 160, 160])
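To illustrate what a sliding-window inferer does conceptually, here is a toy 1-D sketch in plain NumPy (a hypothetical helper, not MONAI's implementation, which also handles overlap, blending, and batching):

```python
import numpy as np

# Toy 1-D sliding-window inference: run the model over fixed-size windows
# and stitch the per-window predictions back together. No window overlap
# here; MONAI's SlidingWindowInferer also supports overlap and blending.
def sliding_window_1d(image, model, roi=4):
    out = np.zeros_like(image, dtype=float)
    for start in range(0, len(image), roi):
        window = image[start:start + roi]
        out[start:start + len(window)] = model(window)
    return out
```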
- inverse_transforms()[source]¶
Provide a list of inverse transforms. They are normally a subset of the pre-transforms. This task is performed on the output_label (using the references from input_key)
- Return one of the following.
None: return None to disable running any inverse transforms (default behavior).
Empty: return [] to run all applicable pre-transforms that have an inverse method.
list: return a list of specific pre-transform names/classes whose inverse method should be run.
For Example:
return [monai.transforms.Spacingd]
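The three return conventions above can be sketched in plain Python (a hypothetical helper for illustration; the actual selection logic lives inside the MONAI Label framework):

```python
# Hypothetical interpretation of inverse_transforms() return values:
# None -> run nothing; [] -> everything invertible; list -> only those named.
def select_inverse_transforms(declared, pre_transforms):
    if declared is None:
        return []  # None disables inverse transforms entirely (default)
    if not declared:
        # [] selects every pre-transform that exposes an inverse method
        return [t for t in pre_transforms if hasattr(t, 'inverse')]
    # An explicit list selects only the listed transform classes
    return [t for t in pre_transforms if type(t) in declared]
```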
- abstract post_transforms()[source]¶
Provide a list of post-transforms
For Example:
return [
    monai.transforms.AddChanneld(keys='pred'),
    monai.transforms.Activationsd(keys='pred', softmax=True),
    monai.transforms.AsDiscreted(keys='pred', argmax=True),
    monai.transforms.SqueezeDimd(keys='pred', dim=0),
    monai.transforms.ToNumpyd(keys='pred'),
    monailabel.interface.utils.Restored(keys='pred', ref_image='image'),
    monailabel.interface.utils.ExtremePointsd(keys='pred', result='result', points='points'),
    monailabel.interface.utils.BoundingBoxd(keys='pred', result='result', bbox='bbox'),
]
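The AsDiscreted(argmax=True) step in that chain can be illustrated with a minimal stand-alone sketch (assuming channel-first logits, as MONAI's dictionary transforms expect):

```python
import numpy as np

# Sketch of argmax discretization: turn per-class logits of shape
# (num_classes, *spatial) into a single integer label map (*spatial).
def discretize(logits):
    return np.argmax(logits, axis=0)
```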
- abstract pre_transforms()[source]¶
Provide a list of pre-transforms
For Example:
return [
    monai.transforms.LoadImaged(keys='image'),
    monai.transforms.AddChanneld(keys='image'),
    monai.transforms.Spacingd(keys='image', pixdim=[1.0, 1.0, 1.0]),
    monai.transforms.ScaleIntensityRanged(keys='image', a_min=-57, a_max=164, b_min=0.0, b_max=1.0, clip=True),
]
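The intensity-scaling step can be sketched in plain NumPy (a hypothetical stand-in for ScaleIntensityRanged, assuming the same parameter meanings: map [a_min, a_max] to [b_min, b_max], optionally clipping):

```python
import numpy as np

# Stand-in for ScaleIntensityRanged: linearly map [a_min, a_max] to
# [b_min, b_max]; values outside the range are clipped when clip=True.
def scale_intensity_range(img, a_min, a_max, b_min, b_max, clip=True):
    img = (img - a_min) / (a_max - a_min) * (b_max - b_min) + b_min
    return np.clip(img, b_min, b_max) if clip else img
```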
- run_inferer(data, convert_to_batch=True, device='cuda')[source]¶
Run the Inferer over pre-processed data. Override this method to customize the default behavior, for example to run chained inferers over pre-processed data.
- Parameters
data – pre-processed data
convert_to_batch – convert input to batched input
device – device on which to load the model and run the inferer
- Returns
updated data with the inference result stored under the output key, which will be used for post-processing
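The convert_to_batch contract can be sketched as follows (a hypothetical NumPy stand-in; the real method runs a torch network on the given device):

```python
import numpy as np

# Sketch of run_inferer's batching contract: add a leading batch dimension
# before calling the model, strip it afterwards, and store the result
# under the output key so post-transforms can pick it up.
def run_inferer_sketch(data, model, input_key='image', output_key='pred',
                       convert_to_batch=True):
    inputs = data[input_key]
    if convert_to_batch:
        inputs = inputs[np.newaxis]   # (C, H, W) -> (1, C, H, W)
    outputs = model(inputs)
    if convert_to_batch:
        outputs = outputs[0]          # drop the batch dimension again
    data[output_key] = outputs
    return data
```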
- writer(data, extension=None, dtype=None)[source]¶
You can provide your own writer. By default, this writer saves the prediction/label mask to a file and fetches the result JSON.
- Parameters
data – typically the post-processed data
extension – output label extension
dtype – output label dtype
- Returns
tuple of output_file and result_json
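A minimal sketch of that contract (a hypothetical helper persisting to NumPy's .npy format; the actual writer emits medical-image formats and honors the extension and dtype arguments):

```python
import os
import tempfile

import numpy as np

# Sketch of writer(): persist the label mask to a file and return the
# file path together with the result JSON collected during post-processing.
def writer_sketch(data, label_key='pred', json_key='result'):
    fd, path = tempfile.mkstemp(suffix='.npy')
    os.close(fd)
    np.save(path, data[label_key])
    return path, data.get(json_key, {})
```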
- class monailabel.interfaces.tasks.infer.InferType[source]¶
Bases:
object
Type of Inference Model
- SEGMENTATION - Segmentation Model
- CLASSIFICATION - Classification Model
- DEEPGROW - Deepgrow Interactive Model
- DEEPEDIT - DeepEdit Interactive Model
- SCRIBBLES - Scribbles Model
- OTHERS - Other Model Type
- CLASSIFICATION = 'classification'¶
- DEEPEDIT = 'deepedit'¶
- DEEPGROW = 'deepgrow'¶
- KNOWN_TYPES = ['segmentation', 'classification', 'deepgrow', 'deepedit', 'scribbles', 'others']¶
- OTHERS = 'others'¶
- SCRIBBLES = 'scribbles'¶
- SEGMENTATION = 'segmentation'¶
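The constants above amount to a simple string-enumeration pattern; a minimal re-creation for illustration (mirrors the documented values, not the actual monailabel source):

```python
# Illustrative re-creation of the InferType constant class.
class InferType:
    SEGMENTATION = 'segmentation'
    CLASSIFICATION = 'classification'
    DEEPGROW = 'deepgrow'
    DEEPEDIT = 'deepedit'
    SCRIBBLES = 'scribbles'
    OTHERS = 'others'
    KNOWN_TYPES = [SEGMENTATION, CLASSIFICATION, DEEPGROW, DEEPEDIT, SCRIBBLES, OTHERS]

# Validate a task's declared type against the known types.
def is_known_type(infer_type):
    return infer_type in InferType.KNOWN_TYPES
```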