CM Script Automation Documentation

    MLPerf-benchmark-support

    • add-custom-nvidia-system
    • benchmark-any-mlperf-inference-implementation
    • build-mlperf-inference-server-nvidia
    • generate-mlperf-inference-submission
    • generate-mlperf-inference-user-conf
    • generate-mlperf-tiny-report
    • generate-mlperf-tiny-submission
    • generate-nvidia-engine
    • get-mlperf-inference-intel-scratch-space
    • get-mlperf-inference-loadgen
    • get-mlperf-inference-nvidia-common-code
    • get-mlperf-inference-nvidia-scratch-space
    • get-mlperf-inference-results
    • get-mlperf-inference-results-dir
    • get-mlperf-inference-src
    • get-mlperf-inference-submission-dir
    • get-mlperf-inference-sut-configs
    • get-mlperf-inference-sut-description
    • get-mlperf-logging
    • get-mlperf-power-dev
    • get-mlperf-tiny-eembc-energy-runner-src
    • get-mlperf-tiny-src
    • get-mlperf-training-nvidia-code
    • get-mlperf-training-src
    • get-nvidia-mitten
    • get-spec-ptd
    • import-mlperf-inference-to-experiment
    • import-mlperf-tiny-to-experiment
    • import-mlperf-training-to-experiment
    • install-mlperf-logging-from-src
    • prepare-training-data-bert
    • prepare-training-data-resnet
    • preprocess-mlperf-inference-submission
    • process-mlperf-accuracy
    • push-mlperf-inference-results-to-github
    • run-mlperf-inference-mobilenet-models
    • run-mlperf-inference-submission-checker
    • run-mlperf-power-client
    • run-mlperf-power-server
    • run-mlperf-training-submission-checker
    • truncate-mlperf-inference-accuracy-log