If you’ve ever wondered what happens when you drop a GPU into a NAS and put it to work running AI, like a private ChatGPT, StorageReview has the surprising answer.
For the base of his project, the site’s Jordan Ranous used QNAP’s TS-h1290FX, a 12-bay NVMe NAS powered by an AMD EPYC 7302P CPU and boasting 256GB of DRAM, 25GbE connectivity, and plenty of PCIe slots. He chose that model because it supports an internal GPU and can host up to 737TB of raw storage.
By adding an Nvidia RTX A4000 GPU to the TS-h1290FX and configuring it for AI using Virtualization Station (QNAP’s hypervisor app for the NAS), Ranous was able to run AI workloads directly on the appliance.
Nvidia’s Chat with RTX
Nvidia’s ChatRTX software package handled the AI interaction side, connecting a GPT-style LLM to the user’s own local files for a customized experience. This allowed for rapid, context-aware responses while the data stayed on the machine, maintaining privacy and security.
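ChatRTX itself is a packaged desktop app, but the pattern behind “context-aware responses from a local dataset” (retrieval-augmented generation) is easy to illustrate. The sketch below is a minimal, hypothetical example, not ChatRTX’s actual code: it “embeds” a few local documents with a toy bag-of-words vectorizer, retrieves the closest match to a question, and prepends it to the prompt so a local model would answer from private data. The `embed` and `answer` helpers are stand-ins for whatever local embedding model and LLM you run.

```python
# Minimal retrieval-augmented generation (RAG) sketch, illustrating the
# pattern ChatRTX uses rather than its actual implementation.
from collections import Counter
import math

# Stand-in for the user's private, local dataset.
docs = [
    "Q3 revenue grew 12% year over year, driven by NAS sales.",
    "The TS-h1290FX supports up to 737TB of raw NVMe storage.",
    "Virtualization Station can pass a PCIe GPU through to a VM.",
]

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; a real setup uses a local model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def answer(question: str) -> str:
    # Retrieve the most relevant local document...
    context = max(docs, key=lambda d: cosine(embed(question), embed(d)))
    # ...and prepend it so the (hypothetical) local LLM answers from it.
    prompt = f"Context: {context}\nQuestion: {question}\nAnswer:"
    return prompt  # a real setup would send this to the GPU-hosted model

print(answer("How much storage does the TS-h1290FX hold?"))
```

Because retrieval and generation both run locally, nothing about the question or the documents ever leaves the NAS, which is the privacy point the article is making.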
StorageReview detailed the process, which involved verifying hardware compatibility, installing the GPU, updating the QNAP firmware and software, and installing the OS on the VM, before configuring GPU passthrough, installing the GPU drivers in the VM, and verifying that passthrough worked.
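That last verification step can be done from inside the guest by checking that the driver actually sees the card. Here is a minimal sketch, assuming a VM with the NVIDIA driver installed and `nvidia-smi` on the PATH; the query flags used are standard `nvidia-smi` options.

```python
# Quick passthrough sanity check from inside the VM: if nvidia-smi can
# enumerate the A4000, the GPU made it through the hypervisor intact.
import subprocess

def check_gpu() -> bool:
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,driver_version,memory.total",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("No NVIDIA GPU visible; passthrough or driver install failed.")
        return False
    # e.g. "NVIDIA RTX A4000, 5xx.xx, 16376 MiB" (illustrative output)
    print(f"GPU visible to guest: {out}")
    return True

if __name__ == "__main__":
    check_gpu()
```

If the card shows up here with its full memory, the VM is ready for GPU-accelerated software such as ChatRTX.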
The ease of setting up the GPU for AI on the QNAP NAS suggests it could work as a cost-effective and efficient solution for businesses looking to leverage the power of AI. As Ranous says, “We’ve shown that adding a decent GPU to a QNAP NAS is relatively easy and inexpensive. We put an A4000 to work, and with a street price of about $1050, that’s not bad when you consider Virtualization Station is free and NVIDIA ChatRTX is available at no charge.”