Image Classification#
fastdeploy.vision.classification.PaddleClasPreprocessor#
- class fastdeploy.vision.classification.PaddleClasPreprocessor(config_file)[source]#
Create a preprocessor for PaddleClasModel from a configuration file; a usage sketch follows this class entry.
- Parameters
config_file – (str) Path of the configuration file, e.g. resnet50/inference_cls.yaml
- initial_resize_on_cpu(v)[source]#
When the first preprocessing operator is Resize and the input image is large, it can be faster to run the resize on the CPU, because the HostToDevice memcpy of the full-size image is time consuming. Set this to True to run the initial resize on CPU.
- Parameters
v – (bool) True or False
- run(input_ims)#
Process input image
- Param
input_ims: (list of numpy.ndarray) The input images
- Returns
list of FDTensor
- use_cuda(enable_cv_cuda=False, gpu_id=-1)#
Use CUDA processors
- Param
enable_cv_cuda: True to use CV-CUDA, False to use CUDA only
- Param
gpu_id: GPU device id
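Below is a minimal sketch of using the preprocessor on its own, based only on the constructor and methods documented above; the config path resnet50/inference_cls.yaml and the image file test.jpg are placeholder names.

    import cv2
    import fastdeploy as fd

    # Build the preprocessor from a PaddleClas deploy config (placeholder path)
    preprocessor = fd.vision.classification.PaddleClasPreprocessor(
        "resnet50/inference_cls.yaml")
    # Keep the initial Resize on CPU to avoid a large HostToDevice copy of the raw image
    preprocessor.initial_resize_on_cpu(True)
    # Optionally move the remaining processors to GPU (requires a GPU build)
    # preprocessor.use_cuda(enable_cv_cuda=False, gpu_id=0)

    im = cv2.imread("test.jpg")       # HWC, BGR, as expected by run()
    tensors = preprocessor.run([im])  # returns a list of FDTensor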
fastdeploy.vision.classification.PaddleClasPostprocessor#
fastdeploy.vision.classification.PaddleClasModel#
- class fastdeploy.vision.classification.PaddleClasModel(model_file, params_file, config_file, runtime_option=None, model_format=<ModelFormat.PADDLE: 1>)[source]#
Load an image classification model exported by PaddleClas; a usage sketch follows this class entry.
- Parameters
model_file – (str) Path of the model file, e.g. resnet50/inference.pdmodel
params_file – (str) Path of the parameters file, e.g. resnet50/inference.pdiparams; if the model_format is ModelFormat.ONNX, this parameter is ignored and can be set to an empty string
config_file – (str) Path of the deployment configuration file, e.g. resnet50/inference_cls.yaml
runtime_option – (fastdeploy.RuntimeOption) RuntimeOption for inferring this model; if None, the default backend on CPU is used
model_format – (fastdeploy.ModelFormat) Model format of the loaded model
- batch_predict(images)[source]#
Classify a batch of input images
- Parameters
images – (list of numpy.ndarray) The input image list, each element a 3-D array with HWC layout in BGR format
- Returns
list of ClassifyResult
- get_profile_time()#
Get profile time of Runtime after the profile process is done.
- property postprocessor#
Get PaddleClasPostprocessor object of the loaded model
- Returns
PaddleClasPostprocessor
- predict(im, topk=1)[source]#
Classify an input image
- Parameters
im – (numpy.ndarray) The input image data, a 3-D array with layout HWC, BGR format
topk – (int) Return only the top-k classification results, default 1
- Returns
ClassifyResult
- property preprocessor#
Get PaddleClasPreprocessor object of the loaded model
- Returns
PaddleClasPreprocessor
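A minimal end-to-end sketch for PaddleClasModel following the constructor and predict signatures above; the resnet50/* paths and test.jpg are placeholder file names.

    import cv2
    import fastdeploy as fd

    option = fd.RuntimeOption()  # default backend on CPU when left unconfigured
    model = fd.vision.classification.PaddleClasModel(
        "resnet50/inference.pdmodel",
        "resnet50/inference.pdiparams",
        "resnet50/inference_cls.yaml",
        runtime_option=option)

    im = cv2.imread("test.jpg")
    result = model.predict(im, topk=5)       # single image -> ClassifyResult
    print(result)
    results = model.batch_predict([im, im])  # list of images -> list of ClassifyResult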
fastdeploy.vision.classification.YOLOv5Cls#
- class fastdeploy.vision.classification.YOLOv5Cls(model_file, params_file='', runtime_option=None, model_format=<ModelFormat.ONNX: 2>)[source]#
Load a YOLOv5Cls image classification model exported by YOLOv5; a usage sketch follows this class entry.
- Parameters
model_file – (str) Path of the model file, e.g. ./YOLOv5Cls.onnx
params_file – (str) Path of the parameters file; if the model_format is ModelFormat.ONNX, this parameter is ignored and can be set to an empty string
runtime_option – (fastdeploy.RuntimeOption) RuntimeOption for inferring this model; if None, the default backend on CPU is used
model_format – (fastdeploy.ModelFormat) Model format of the loaded model
- batch_predict(images)[source]#
Classify a batch of input images
- Parameters
images – (list of numpy.ndarray) The input image list, each element a 3-D array with HWC layout in BGR format
- Returns
list of ClassifyResult
- get_profile_time()#
Get profile time of Runtime after the profile process is done.
- property postprocessor#
Get YOLOv5ClsPostprocessor object of the loaded model
- Returns
YOLOv5ClsPostprocessor
- predict(input_image)[source]#
Classify an input image
- Parameters
input_image – (numpy.ndarray) The input image data, a 3-D array with HWC layout in BGR format
- Returns
ClassifyResult
- property preprocessor#
Get YOLOv5ClsPreprocessor object of the loaded model
- Returns
YOLOv5ClsPreprocessor
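A short sketch for YOLOv5Cls under the same assumptions; yolov5n-cls.onnx and test.jpg are placeholder file names.

    import cv2
    import fastdeploy as fd

    # params_file can stay empty for an ONNX model
    model = fd.vision.classification.YOLOv5Cls("yolov5n-cls.onnx")
    im = cv2.imread("test.jpg")
    result = model.predict(im)  # returns ClassifyResult
    print(result)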
fastdeploy.vision.classification.ResNet#
- class fastdeploy.vision.classification.ResNet(model_file, params_file='', runtime_option=None, model_format=<ModelFormat.ONNX: 2>)[source]#
Load an image classification model exported by torchvision.ResNet; a usage sketch follows this class entry.
- Parameters
model_file – (str) Path of the model file, e.g. resnet/resnet50.onnx
params_file – (str) Path of the parameters file; if the model_format is ModelFormat.ONNX, this parameter is ignored and can be set to an empty string
runtime_option – (fastdeploy.RuntimeOption) RuntimeOption for inferring this model; if None, the default backend on CPU is used
model_format – (fastdeploy.ModelFormat) Model format of the loaded model, default is ONNX
- get_profile_time()#
Get profile time of Runtime after the profile process is done.
- property mean_vals#
Returns the mean values used for normalization, default mean_vals = [0.485, 0.456, 0.406]
- predict(input_image, topk=1)[source]#
Classify an input image
- Parameters
input_image – (numpy.ndarray) The input image data, a 3-D array with HWC layout in BGR format
topk – (int) Return the top-k results ranked by classification confidence score, default 1
- Returns
ClassifyResult
- property size#
Returns the target image size used in preprocessing, default size = [224, 224]
- property std_vals#
Returns the std values used for normalization, default std_vals = [0.229, 0.224, 0.225]
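A short sketch for the torchvision ResNet wrapper, reading back the default preprocessing settings exposed as properties; resnet/resnet50.onnx and test.jpg are placeholder file names.

    import cv2
    import fastdeploy as fd

    model = fd.vision.classification.ResNet("resnet/resnet50.onnx")
    # Default preprocessing: 224x224 resize with ImageNet mean/std normalization
    print(model.size, model.mean_vals, model.std_vals)

    im = cv2.imread("test.jpg")
    result = model.predict(im, topk=5)  # returns ClassifyResult
    print(result)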