
Introduction — MACE documentation - Read the Docs
MACE (Mobile AI Compute Engine) is a deep learning inference framework optimized for mobile heterogeneous computing platforms. MACE provides tools and documentation to help users deploy deep learning models to mobile phones, tablets, personal computers, and IoT devices.
GitHub - XiaoMi/mace: MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms
Mobile AI Compute Engine (MACE for short) is a deep learning inference framework optimized for mobile heterogeneous computing on Android, iOS, Linux, and Windows devices. The design focuses on performance: the runtime is optimized with NEON, OpenCL, and Hexagon, and the Winograd algorithm is introduced to speed up convolution operations.
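The Winograd speed-up mentioned above can be illustrated with the smallest case, F(2, 3): two outputs of a 3-tap filter computed with 4 multiplications instead of 6. A minimal sketch in NumPy, using the standard F(2, 3) transform matrices (not MACE's actual implementation, which applies the 2-D variant to convolution tiles):

```python
import numpy as np

# Winograd F(2, 3): 2 outputs of a 3-tap filter in 4 multiplies.
# Standard transform matrices for input (BT), filter (G), output (AT).
BT = np.array([[1,  0, -1,  0],
               [0,  1,  1,  0],
               [0, -1,  1,  0],
               [0,  1,  0, -1]], dtype=float)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])
AT = np.array([[1, 1,  1,  0],
               [0, 1, -1, -1]], dtype=float)

def winograd_f23(d, g):
    """Valid correlation of a 4-sample tile d with a 3-tap filter g."""
    m = (G @ g) * (BT @ d)   # 4 elementwise multiplications
    return AT @ m            # 2 outputs

# Check against the direct 6-multiply computation.
d = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([0.5, 1.0, -1.0])
direct = np.array([d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
                   d[1]*g[0] + d[2]*g[1] + d[3]*g[2]])
assert np.allclose(winograd_f23(d, g), direct)
```

The saving compounds in 2-D: F(2x2, 3x3) replaces 36 multiplications per tile with 16, which is why frameworks like MACE use it for 3x3 convolutions.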
XiaoMi/kaldi-onnx: Kaldi model converter to ONNX - GitHub
With the converted ONNX model, you can use MACE to speed up inference on Android, iOS, Linux, or Windows devices with highly optimized NEON kernels (more heterogeneous devices will be supported in the future). This tool supports converting both Nnet2 and Nnet3 models.
mace/tools/python/transform/onnx_converter.py at master - GitHub
MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms. - XiaoMi/mace
Basic usage for Bazel users — MACE documentation - Read the Docs
Prepare your ONNX model (a .onnx file). Use the ONNX Optimizer Tool to optimize your model for inference.
MACE: Deep learning optimized for mobile and edge devices
Dec 26, 2023 · The MACE model is defined as a customized model format, similar to Caffe2. Models can be converted from those exported by TensorFlow, Caffe, or ONNX. The MACE Model Zoo is an open source project that hosts models used in everyday AI tasks, such as ResNet, MobileNet, FastStyleTransfer, and Inception.
ONNX | Home
ONNX is an open format built to represent machine learning models. ONNX defines a common set of operators - the building blocks of machine learning and deep learning models - and a common file format to enable AI developers to use models with a variety of frameworks, tools, runtimes, and compilers.
Supported Tools - ONNX
Deploy your ONNX model using runtimes designed to accelerate inferencing. Fine tune your model for size, accuracy, resource utilization, and performance. Better understand your model by visualizing its computational graph.
Building and Deploying Xiaomi's AI Platform MACE - CSDN Blog
Feb 18, 2019 · Mobile AI Compute Engine (MACE) is an efficient neural network compute framework developed by Xiaomi for mobile devices, designed to optimize the execution of machine learning models on phones and other heterogeneous computing platforms. MACE aims to deliver low-latency, high-performance computation, enabling it to...