How to load or infer ONNX models on edge devices like the Raspberry Pi?


Question

I just want to load ONNX models on a Raspberry Pi. How do I load ONNX models on edge devices?

Answer

You can use ONNX Runtime for ONNX model inference on a Raspberry Pi. It supports the Arm32v7l architecture. Prebuilt binaries are not provided as of 2020-01-14, so you need to build it from source. The instructions below follow https://github.com/microsoft/onnxruntime/blob/master/dockerfiles/README.md#arm-32v7

  1. Follow the instructions here

  2. Create an empty local directory

mkdir onnx-build
cd onnx-build

  3. Save the Dockerfile to your new directory

Dockerfile.arm32v7

FROM balenalib/raspberrypi3-python:latest-stretch-build

ARG ONNXRUNTIME_REPO=https://github.com/Microsoft/onnxruntime
ARG ONNXRUNTIME_SERVER_BRANCH=master

# Enforces cross-compilation through QEMU
RUN [ "cross-build-start" ]

RUN install_packages \
    sudo \
    build-essential \
    curl \
    libcurl4-openssl-dev \
    libssl-dev \
    wget \
    python3 \
    python3-pip \
    python3-dev \
    git \
    tar \
    libatlas-base-dev

RUN pip3 install --upgrade pip
RUN pip3 install --upgrade setuptools
RUN pip3 install --upgrade wheel
RUN pip3 install numpy

# Build the latest cmake
WORKDIR /code
RUN wget https://github.com/Kitware/CMake/releases/download/v3.14.3/cmake-3.14.3.tar.gz
RUN tar zxf cmake-3.14.3.tar.gz

WORKDIR /code/cmake-3.14.3
RUN ./configure --system-curl
RUN make
RUN sudo make install

# Set up build args
ARG BUILDTYPE=MinSizeRel
ARG BUILDARGS="--config ${BUILDTYPE} --arm"

# Prepare onnxruntime Repo
WORKDIR /code
RUN git clone --single-branch --branch ${ONNXRUNTIME_SERVER_BRANCH} --recursive ${ONNXRUNTIME_REPO} onnxruntime

# Start the basic build
WORKDIR /code/onnxruntime
RUN ./build.sh ${BUILDARGS} --update --build

# Build Shared Library
RUN ./build.sh ${BUILDARGS} --build_shared_lib

# Build Python Bindings and Wheel
RUN ./build.sh ${BUILDARGS} --enable_pybind --build_wheel

# Build Output
RUN ls -l /code/onnxruntime/build/Linux/${BUILDTYPE}/*.so
RUN ls -l /code/onnxruntime/build/Linux/${BUILDTYPE}/dist/*.whl

RUN [ "cross-build-end" ]

  4. Run docker build

This will build all the dependencies first, then ONNX Runtime and its Python bindings. It will take several hours.

docker build -t onnxruntime-arm32v7 -f Dockerfile.arm32v7 .

  5. Note the full path of the .whl file

  • It is reported at the end of the build, after the # Build Output line.
  • It should follow the format onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl, but the version number may have changed. You will use this path to extract the wheel file later.
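The cpXY portion of the wheel name encodes the CPython version the wheel was built for, and it must match the Python interpreter on the target device (cp35 means Python 3.5). As a quick sanity check before copying the wheel over, you could compare the tag against the device's interpreter; this is an illustrative sketch (the `wheel_tag_matches` helper is not part of ONNX Runtime):

```python
import sys

def wheel_tag_matches(wheel_name):
    # A CPython wheel tag like "cp35" must match the interpreter
    # that will run `pip3 install` on the device.
    expected = "cp{}{}".format(sys.version_info.major, sys.version_info.minor)
    return expected in wheel_name

# Run this on the Raspberry Pi with the wheel name you noted above;
# True means the wheel was built for this Python version.
print(wheel_tag_matches("onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl"))
```

If it prints False, pip will refuse to install the wheel, and you would need to rebuild against the matching Python version.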

Check that the build succeeded

  • Upon completion, you should see an image tagged onnxruntime-arm32v7 in your list of docker images:

docker images

  6. Extract the Python wheel file from the docker image

(Update the path/version of the .whl file with the one noted in step 5)

docker create -ti --name onnxruntime_temp onnxruntime-arm32v7 bash
docker cp onnxruntime_temp:/code/onnxruntime/build/Linux/MinSizeRel/dist/onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl .
docker rm -fv onnxruntime_temp

This saves a copy of the wheel file, onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl, to your working directory on the host machine.

  7. Copy the wheel file (onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl) to your Raspberry Pi or other ARM device

  8. On the device, install the ONNX Runtime wheel file

sudo apt-get update
sudo apt-get install -y python3 python3-pip
pip3 install numpy

# Install ONNX Runtime
# Important: Update path/version to match the name and location of your .whl file
pip3 install onnxruntime-0.3.0-cp35-cp35m-linux_armv7l.whl

  9. Test the installation by following the instructions here
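Once the wheel is installed, loading and running a model on the device comes down to creating an InferenceSession and calling run. A minimal sketch, assuming a model file at "model.onnx" with a (1, 3, 224, 224) float32 input (both are placeholders; substitute your own model path and its actual input shape, and note `make_dummy_input` is an illustrative helper, not part of the ONNX Runtime API):

```python
import numpy as np

def make_dummy_input(shape, dtype=np.float32):
    # Zero-filled tensor matching the model's declared input shape;
    # replace with real preprocessed data in an actual application.
    return np.zeros(shape, dtype=dtype)

def run_model(model_path, input_tensor):
    # Imported lazily so this script can be copied to the device
    # before the wheel from step 8 is installed.
    import onnxruntime as ort
    session = ort.InferenceSession(model_path)
    input_name = session.get_inputs()[0].name
    # Passing None as the output list returns all model outputs.
    return session.run(None, {input_name: input_tensor})

if __name__ == "__main__":
    import os
    model_file = "model.onnx"  # placeholder: path to your model
    if os.path.exists(model_file):
        dummy = make_dummy_input((1, 3, 224, 224))  # placeholder shape
        outputs = run_model(model_file, dummy)
        print([o.shape for o in outputs])
```

Printing the output shapes of a dummy run like this is a quick way to confirm the build and installation work end to end before wiring in real input data.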
