How to load or infer ONNX models on edge devices like the Raspberry Pi?


Problem Description

I just want to load ONNX models on a Raspberry Pi. How can I load ONNX models on edge devices?

Recommended Answer

You can use ONNX Runtime for ONNX model inference on a Raspberry Pi; it supports the Arm32v7l architecture. As of 2020/1/14 no pre-built binary is provided, so you need to build it from source. The instructions are described at https://github.com/microsoft/onnxruntime/blob/master/dockerfiles/README.md#arm-32v7
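Once the runtime is installed (the build and install steps follow below), loading and running a model is only a few lines of Python. Here is a minimal sketch; model.onnx, the input shape, and the dummy data are placeholders for your own model and preprocessing:

import numpy as np
import onnxruntime as ort

# Load the model (model.onnx is a placeholder path)
session = ort.InferenceSession("model.onnx")

# Inspect the input the model expects
inp = session.get_inputs()[0]
print(inp.name, inp.shape)

# Run inference on dummy data; the shape assumes a typical
# 1x3x224x224 image classifier. Replace with real input.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {inp.name: x})
print(outputs[0].shape)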

  1. Following the instructions here, create an empty local directory

mkdir onnx-build
cd onnx-build

  2. Save the Dockerfile to the new directory

Dockerfile.arm32v7

FROM balenalib/raspberrypi3-python:latest-stretch-build

ARG ONNXRUNTIME_REPO=https://github.com/Microsoft/onnxruntime
ARG ONNXRUNTIME_SERVER_BRANCH=master

#Enforces cross-compilation through QEMU
RUN [ "cross-build-start" ]

RUN install_packages \
    sudo \
    build-essential \
    curl \
    libcurl4-openssl-dev \
    libssl-dev \
    wget \
    python3 \
    python3-pip \
    python3-dev \
    git \
    tar \
    libatlas-base-dev

RUN pip3 install --upgrade pip
RUN pip3 install --upgrade setuptools
RUN pip3 install --upgrade wheel
RUN pip3 install numpy

# Build the latest cmake
WORKDIR /code
RUN wget https://github.com/Kitware/CMake/releases/download/v3.14.3/cmake-3.14.3.tar.gz
RUN tar zxf cmake-3.14.3.tar.gz

WORKDIR /code/cmake-3.14.3
RUN ./configure --system-curl
RUN make
RUN sudo make install

# Set up build args
ARG BUILDTYPE=MinSizeRel
ARG BUILDARGS="--config ${BUILDTYPE} --arm"

# Prepare onnxruntime Repo
WORKDIR /code
RUN git clone --single-branch --branch ${ONNXRUNTIME_SERVER_BRANCH} --recursive ${ONNXRUNTIME_REPO} onnxruntime

# Start the basic build
WORKDIR /code/onnxruntime
RUN ./build.sh ${BUILDARGS} --update --build

# Build Shared Library
RUN ./build.sh ${BUILDARGS} --build_shared_lib

# Build Python Bindings and Wheel
RUN ./build.sh ${BUILDARGS} --enable_pybind --build_wheel

# Build Output
RUN ls -l /code/onnxruntime/build/Linux/${BUILDTYPE}/*.so
RUN ls -l /code/onnxruntime/build/Linux/${BUILDTYPE}/dist/*.whl

RUN [ "cross-build-end" ]

  3. Run docker build

This will build all the dependencies first, then build ONNX Runtime and its Python bindings. This will take several hours.

docker build -t onnxruntime-arm32v7 -f Dockerfile.arm32v7 .

  4. Note the full path of the .whl file

  • It is reported at the end of the build, after the # Build Output lines.
  • It should follow the format onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl, but the version number may have changed. You will use this path to extract the wheel file later.

  5. Check that the build succeeded

  • Upon completion, you should see an image tagged onnxruntime-arm32v7 in your list of docker images:

docker images

  6. Extract the Python wheel file from the docker image

(Update the path/version of the .whl file to match the one noted in step 4)

docker create -ti --name onnxruntime_temp onnxruntime-arm32v7 bash
docker cp onnxruntime_temp:/code/onnxruntime/build/Linux/MinSizeRel/dist/onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl .
docker rm -fv onnxruntime_temp

This will save a copy of the wheel file, onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl, to your working directory on your host machine.

  7. Copy the wheel file (onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl) to your Raspberry Pi or other ARM device

  8. On the device, install the ONNX Runtime wheel file

sudo apt-get update
sudo apt-get install -y python3 python3-pip
pip3 install numpy

# Install ONNX Runtime
# Important: Update path/version to match the name and location of your .whl file
pip3 install onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl

  9. Test the installation by following the instructions here
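
As a quick sanity check before running the full test instructions, you can confirm on the device that the wheel imports correctly (a minimal sketch):

# Quick sanity check on the device: the import succeeds and the
# version matches the wheel you installed.
import onnxruntime
print(onnxruntime.__version__)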
