Has the Paddle inference library (Paddle Inference) been adapted to domestic Hygon servers based on the x86 architecture? The material I could find in the official documentation is limited and somewhat vague, so I would like to confirm the current status. If it is supported, is there a concrete deployment guide or Docker image?
Hi, you can try following these documents: the Hygon DCU example guide at https://www.paddlepaddle.org.cn/documentation/docs/zh/develop/guides/hardware_support/dcu/example_cn.html, and the Paddle Inference source-compilation guide for Hygon DCU at https://www.paddlepaddle.org.cn/inference/master/guides/hardware_support/dcu_hygon_cn.html.
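For reference, here is a minimal sketch of running a model with the Paddle Inference Python API once a DCU build has been installed per the linked guides. The model file names are hypothetical, and it is an assumption (based on the DCU documentation above) that a Hygon DCU build reuses the regular GPU configuration path; please verify against your own build.

```python
# Minimal sketch: load an exported inference model and run one forward pass.
# Assumes a Paddle build with DCU/ROCm support and hypothetical model paths.
import numpy as np
import paddle.inference as paddle_infer

# Hypothetical files exported via paddle.jit.save / paddle.static.save_inference_model.
config = paddle_infer.Config("model.pdmodel", "model.pdiparams")
# Assumption: on Hygon DCU builds the GPU code path is reused.
# Arguments: initial memory pool size in MB, device id.
config.enable_use_gpu(100, 0)

predictor = paddle_infer.create_predictor(config)

# Feed a dummy input (shape is model-dependent; 1x3x224x224 is just an example).
input_name = predictor.get_input_names()[0]
input_handle = predictor.get_input_handle(input_name)
input_handle.copy_from_cpu(np.random.rand(1, 3, 224, 224).astype("float32"))

predictor.run()

output_name = predictor.get_output_names()[0]
output = predictor.get_output_handle(output_name).copy_to_cpu()
print(output.shape)
```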