Hi, I currently have a spare GeForce GTX 1060 lying around collecting dust.
I'm planning to use it with Ollama to self-host my own AI model, or maybe even for AI training.
Problem is, none of my home lab devices have a PCIe slot the GPU can plug into. My current setup includes:
- Beelink MINI S12 Intel Alder Lake N100
- Raspberry Pi 5
- Le Potato AML-S905X-CC
- Pi Picos
Would like to hear recommendations or experiences with external GPU docks (eGPU enclosures) that I could use to connect the GPU to my home lab setup, thanks.
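
For context, once the card is connected, the plan is just to hit Ollama's HTTP API from whichever box hosts it. A minimal sketch in Python, assuming Ollama is running on its default port (11434) and a small model has already been pulled; `llama3.2:1b` here is only a placeholder model name:

```python
# Query a local Ollama instance over its HTTP API.
# Assumes Ollama is running on the default port and the model
# has been pulled beforehand, e.g. `ollama pull llama3.2:1b`.
import json
import urllib.request

def ask(prompt: str, model: str = "llama3.2:1b") -> str:
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(
            {"model": model, "prompt": prompt, "stream": False}
        ).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns one JSON object whose
        # "response" field holds the full generated text.
        return json.load(resp)["response"]

print(ask("Why is the sky blue?"))
```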
You won’t get much use out of a 1060 for AI. Maybe some toy models to learn on, but even then you might run into limitations from its age: the 1060 is Pascal-era (compute capability 6.1), which newer CUDA toolkit and framework releases are phasing out.
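
If you do get it hooked up, a quick sanity check will confirm what you're working with; a sketch assuming a CUDA-enabled PyTorch build is installed (a 1060 should report 6.1):

```python
# Check whether the GPU is visible and report its compute capability.
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{name}: compute capability {major}.{minor}")
else:
    print("No CUDA-capable GPU visible to PyTorch")
```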