Streamlined preprocessing in the OpenVINO Inference Engine (IE) API 1.0 backend. #23668 #23786

Added the blobFromImageParam API to build network inputs with preprocessing options. #22750

Refactored the Vulkan backend for better performance and robustness. #23349

Added a full FP16 computation branch on the ARMv8 platform, 1.5x faster than FP32 (FP16 Winograd is still pending). #22275

Further increased DNN speed on ARM and x86 by improving convolution, covering the 1D and 3D cases and supporting convolution + element-wise op fusion.

Fixed broadcasting in the nary element-wise layer. #23557

Fixes for the Segment Anything Model by Meta. #23491

Refactored the Reduce layer for robustness and potential follow-up improvements. #23613

Added support for the ONNX Sub, PRelu, and ConvTranspose operators. #23401
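The preprocessing that blobFromImage-style input construction bundles together (mean subtraction, scaling, channel swap, HWC-to-NCHW layout) can be sketched in plain NumPy. This is an illustrative sketch only, not OpenCV's implementation; the function name `blob_from_image` and its defaults are hypothetical, and the resize step of the real API is omitted here:

```python
import numpy as np

def blob_from_image(img, scalefactor=1.0, mean=(0.0, 0.0, 0.0), swap_rb=False):
    """Sketch of blobFromImage-style preprocessing (resize omitted):
    mean subtraction, scaling, optional BGR->RGB swap, HWC -> NCHW layout."""
    blob = img.astype(np.float32)
    if swap_rb:
        blob = blob[..., ::-1]  # swap the B and R channels
    blob = (blob - np.asarray(mean, dtype=np.float32)) * scalefactor
    # HWC -> CHW, then add the leading batch dimension: 1 x C x H x W
    return blob.transpose(2, 0, 1)[np.newaxis, ...]

img = np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8)
blob = blob_from_image(img, scalefactor=1 / 255.0,
                       mean=(104.0, 117.0, 123.0), swap_rb=True)
print(blob.shape)  # (1, 3, 224, 224)
```

In the real API these choices (scale factor, mean, swapRB, target size) are the parameters gathered into a single params object, so a network's input pipeline can be declared once and reused.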