# get-onnxruntime-prebuilt

Automatically generated README for this automation recipe: **get-onnxruntime-prebuilt**

- Category: **AI/ML frameworks**
- License: **Apache 2.0**
- CM meta description for this script: `_cm.json`
- Output cached? *True*
## Reuse this script in your project

Install the MLCommons CM automation meta-framework, then pull the CM repository with this automation recipe (CM script):

```bash
cm pull repo mlcommons@cm4mlops
```

Print the CM help for this script from the command line:

```bash
cmr "install onnxruntime get prebuilt lib lang-c lang-cpp" --help
```
## Run this script

Run this script via the CLI:

```bash
cm run script --tags=install,onnxruntime,get,prebuilt,lib,lang-c,lang-cpp[,variations]
```

or via the alternative (short) CLI form:

```bash
cmr "install onnxruntime get prebuilt lib lang-c lang-cpp [variations]"
```
Run this script from Python:

```python
import cmind

r = cmind.access({'action': 'run',
                  'automation': 'script',
                  'tags': 'install,onnxruntime,get,prebuilt,lib,lang-c,lang-cpp',
                  'out': 'con',
                  ...
                  (other input keys for this script)
                  ...
                 })

if r['return'] > 0:
    print(r['error'])
```
Run this script via Docker (beta):

```bash
cm docker script "install onnxruntime get prebuilt lib lang-c lang-cpp[variations]"
```
## Variations

Group "device":

- `_cpu` (default)
  - ENV variables:
    - `CM_ONNXRUNTIME_DEVICE`: ``
- `_cuda`
  - ENV variables:
    - `CM_ONNXRUNTIME_DEVICE`: `gpu`

Default variations: `_cpu`
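Variations are selected by appending their `_`-prefixed names to the script's tag list. A minimal sketch of that tag convention (the base tag string is taken from this README; the helper function itself is illustrative, not part of CM):

```python
# Base tags for this script, as listed in this README.
BASE_TAGS = "install,onnxruntime,get,prebuilt,lib,lang-c,lang-cpp"

def tags_with_variation(variation: str) -> str:
    """Append a CM variation (e.g. '_cuda') to the base tag string.

    Variations keep their leading underscore when passed as tags.
    """
    return BASE_TAGS + "," + variation

# Selecting the CUDA build instead of the default CPU build:
print("cm run script --tags=" + tags_with_variation("_cuda"))
```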
## Versions

Default version: `1.16.3`
## Native script being run
## Script output

```bash
cmr "install onnxruntime get prebuilt lib lang-c lang-cpp [variations]" -j
```
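The `-j` flag prints the script's result dictionary as JSON. A small sketch of inspecting that output in Python, assuming a captured JSON string (the sample payload below is hypothetical; the real output contains more keys, and only `return`/`error` are guaranteed by the CM convention):

```python
import json

# Hypothetical captured stdout from `cmr ... -j`; real output has more keys.
raw = '{"return": 0, "new_env": {"CM_ONNXRUNTIME_DEVICE": ""}}'

result = json.loads(raw)

# CM scripts report success with return == 0 and set "error" otherwise.
if result["return"] > 0:
    print(result.get("error", "unknown error"))
else:
    device = result.get("new_env", {}).get("CM_ONNXRUNTIME_DEVICE", "")
    print("onnxruntime device:", device or "cpu (default)")
```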