Creating an Ollama container with GPU on a local server fails with an error #468
Hi teacher, I have a local server with a GPU (running Ubuntu 22.04); its specs are as follows:

[screenshot: server specs]

Creating the Ollama container with GPU support fails with the following error:

[screenshot: container creation error]

CUDA and cuDNN are both installed (each downloaded from the official site and installed following its instructions); the output of the verification commands is shown below:

[screenshot: CUDA/cuDNN verification output]

What could be causing this? Is the way I installed CUDA and cuDNN wrong?
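(For reference, the standard command for starting an Ollama container with GPU support, per the Ollama Docker docs, looks like the sketch below; the image tag, volume name, and port are the documented defaults, not necessarily the exact values used here. The host-side checks at the end confirm the driver and CUDA toolkit are visible.)

```bash
# Standard Ollama + GPU container (defaults from the Ollama Docker docs)
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama

# Host-side sanity checks
nvidia-smi       # driver loaded and GPU visible
nvcc --version   # CUDA toolkit version (if nvcc is on the PATH)
```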
Addendum: I created a conda environment and ran the following test, in case it helps with debugging:

[screenshot: test output]
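(The test output screenshot is not shown; a typical sanity check in a fresh conda environment with PyTorch installed would be something along these lines.)

```bash
# Assumes PyTorch is installed in the active conda env;
# prints whether CUDA is usable and which GPU is detected
python -c "import torch; print(torch.cuda.is_available(), torch.version.cuda)"
python -c "import torch; print(torch.cuda.get_device_name(0))"
```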
Reference: https://blog.csdn.net/qq_38628046/article/details/136312844; you probably need to install the NVIDIA Container Toolkit.
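(The gist of the install on Ubuntu 22.04 is sketched below; check NVIDIA's official Container Toolkit docs for the current apt repository setup, which this sketch assumes has already been added.)

```bash
# After adding NVIDIA's apt repository per the official Container Toolkit docs:
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit

# Register the NVIDIA runtime with Docker and restart the daemon
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Verify that containers can now see the GPU
docker run --rm --gpus all ubuntu nvidia-smi
```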
That was indeed the cause, thank you! Now I need to solve the problem that the Ollama container cannot pull models. Since this is a server running Ubuntu 22.04, how do I set up a proxy on it to get around the network restrictions?
That part you will have to figure out on your own; everyone has their own way of getting a proxy working.
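(For anyone hitting the same issue: the Ollama FAQ suggests passing proxy environment variables when starting the container, along the lines of the sketch below; the proxy address is a placeholder for whatever proxy you have access to.)

```bash
# Recreate the Ollama container with proxy variables set
# (http://proxy.example.com:7890 is a placeholder; substitute your own proxy)
docker rm -f ollama
docker run -d --gpus=all \
  -e HTTPS_PROXY=http://proxy.example.com:7890 \
  -e HTTP_PROXY=http://proxy.example.com:7890 \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama

# Then pull a model inside the container
docker exec -it ollama ollama pull llama3
```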