[Question] How to serve TF2 SOK model in Triton Inference and convert it to ONNX? #105

Workflow file for this run
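This GitHub Actions workflow runs whenever an issue is opened or reopened and delegates triage to a shared reusable workflow in the nvidia-merlin/.github repository, passing along the GitHub App credentials it needs.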

name: triage_issues
on:
  issues:
    types: [opened, reopened]
jobs:
  triage_issue:
    uses: nvidia-merlin/.github/.github/workflows/triage.yaml@main
    secrets:
      TRIAGE_APP_ID: ${{ secrets.TRIAGE_APP_ID }}
      TRIAGE_APP_PEM: ${{ secrets.TRIAGE_APP_PEM }}
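
The reusable workflow itself is not included in this run. For orientation, below is a minimal sketch of what a triage.yaml like this could look like: the token-minting step, the label name, and every job detail are assumptions for illustration, not the actual contents of nvidia-merlin/.github.

# Hypothetical sketch only; the real triage.yaml in nvidia-merlin/.github may differ.
name: triage
on:
  workflow_call:
    secrets:
      TRIAGE_APP_ID:
        required: true
      TRIAGE_APP_PEM:
        required: true

jobs:
  triage:
    runs-on: ubuntu-latest
    steps:
      # Exchange the GitHub App credentials for a short-lived installation token.
      - id: app-token
        uses: actions/create-github-app-token@v1
        with:
          app-id: ${{ secrets.TRIAGE_APP_ID }}
          private-key: ${{ secrets.TRIAGE_APP_PEM }}
      # Label the triggering issue; "status/needs-triage" is an assumed label name.
      # A called workflow inherits the caller's event context, so
      # github.event.issue.number refers to the issue that triggered the run.
      - env:
          GH_TOKEN: ${{ steps.app-token.outputs.token }}
        run: |
          gh issue edit ${{ github.event.issue.number }} \
            --repo ${{ github.repository }} \
            --add-label "status/needs-triage"

Because the caller passes its secrets explicitly, the shared workflow can be reused across repositories without hard-coding any credentials.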