Tensorflow, Ubuntu, and NVidia Drivers. Getting it All Working.

What. A. Mess.

I recently wanted to repurpose an old gaming machine by strapping an NVidia GeForce RTX 3050 to its backside, giving it some racing stripes (obviously to make it go faster), and making it my machine learning model-training workhorse. A noble and achievable goal… oh, such was my naiveté.

Now, yes, an RTX 3050 isn’t exactly the AI/ML card of choice, but it’s not *nothing* either. After the process below was done, training my basic MNIST image classification model went from 20–30 minutes to about 10–30 seconds: a pretty decent speed bump for fun models. For anything bigger, if you’re lucky enough to have access to a university, there are beast machines to be used, but you have to request permission and get time slots allotted, and then you can build your next ChatGPT competitor.

The problem I immediately ran into was that my OS of choice, Ubuntu, was a little behind when it came to drivers. I spent a week or two trying to match NVidia drivers with Ubuntu OS and kernel versions, then match those up with CUDA and cuDNN versions. Not much fun, so I decided to save any future enthusiasts the trouble.
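Before trying to line those versions up, it helps to see what is actually installed. The sketch below is a minimal, hedged starting point: it checks for the driver via `nvidia-smi`, the CUDA toolkit via `nvcc`, and the cuDNN version header at a common default install path (your path may differ).

```shell
# Sketch: report which NVidia driver, CUDA toolkit, and cuDNN versions
# are currently installed, if any. Paths and tools are common defaults,
# not guarantees for every Ubuntu setup.

# Driver version, as reported by the running driver (if one is loaded)
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=driver_version --format=csv,noheader
else
    echo "no NVidia driver found (nvidia-smi missing)"
fi

# CUDA toolkit version (nvcc ships with the toolkit, not the driver)
if command -v nvcc >/dev/null 2>&1; then
    nvcc --version | grep release
else
    echo "no CUDA toolkit found (nvcc missing)"
fi

# cuDNN version, from the header it installs (cuDNN 8+; path is a common
# default and may vary by install method)
CUDNN_HEADER=/usr/include/cudnn_version.h
if [ -f "$CUDNN_HEADER" ]; then
    grep -E '#define CUDNN_(MAJOR|MINOR|PATCHLEVEL)' "$CUDNN_HEADER"
else
    echo "cuDNN header not found at $CUDNN_HEADER"
fi
```

Each check degrades gracefully, so the script is safe to run on a box where nothing is installed yet; the three answers it prints are exactly the versions you then need to match against each other.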
