Motivation
I am a PhD student in Computational Linguistics. As such, I need to experiment with deep learning frameworks but have little money to build a powerful deep learning machine.
Unlike my last build, which was centered around an open source deep learning stack provided by AMD, this build is designed to be as cheap as possible.
Some of the parts in this build I already had lying around, and some I had to buy to assemble it. I will break down the cost for both types of materials.
Chassis
Building a deep learning machine in an old gaming computer case would probably be ideal. However, those kinds of systems have held their value better than I initially expected. In addition, RAM was a bit of a concern for systems old enough to be within my price range: gaming systems from around 2011 almost always seemed to top out at 8 GB of RAM. Of course, that can be expanded, but the highest memory capacity I could get on an $80 motherboard seemed to be about 32 GB, and I wanted the flexibility to work with large datasets in the future.
In addition, I am very familiar with old server hardware (thanks to the number of non-deep-learning servers I maintain), so I started seeking out used workstations like the HP z620, z820, Lenovo ThinkStation S30, or Dell Precision T3600. These workstations have several advantages:
- Registered ECC memory for these old systems is very easy to come by and incredibly cheap (I just upgraded the system described in this post with 32 GB of RAM for $35).
- The LGA 2011 processors (the E5-2600 and E5-2600 v2 lineups) support the AVX extensions, which enable faster floating-point calculations. TensorFlow's prebuilt binaries started assuming AVX instructions sometime this past year, so any older system will not perform nearly as well (see the quick check after this list).
- The power supplies are generally overspecced to handle GPUs for CAD work.
- They typically don't have any of the vendor lock-in that existed in many servers from these brands in the same era.
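Since AVX support is the easy thing to miss when shopping for older hardware, here is a quick way to confirm that a prospective CPU actually advertises the AVX flag. This is a minimal sketch that assumes a Linux host and simply reads /proc/cpuinfo:

```python
# Minimal check, assuming a Linux host: does the CPU advertise the AVX
# flag that recent prebuilt TensorFlow binaries are compiled to expect?
with open("/proc/cpuinfo") as f:
    flag_lines = [line for line in f if line.startswith("flags")]

has_avx = bool(flag_lines) and "avx" in flag_lines[0].split()
print("AVX supported" if has_avx else "No AVX: recent prebuilt TensorFlow wheels will likely not run here")
```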
I ended up settling on the HP z620 in particular because:
- The motherboard provides a large number of SATA connections for attaching more hard drives.
- The PSU is quite overspecced, with each of its 6-pin connectors capable of being split into an 8-pin connector.
- A second processor can be added via a riser
- More memory could be added compared to the Dell.
- The z820 is too expensive if the second processor is not a necessity.
- It was quiet
- It was cheap
GPUs
For the GPUs in this machine, I went with the GTX 1070, for two primary reasons. The first is that it is the cheapest card available from NVIDIA with 8 GB of VRAM. VRAM is where the models and data live on the GPU, so the amount of VRAM directly determines the size of the models you can run as well as the speed with which you can train (training speed is highly dependent on batch size, and larger batches require more memory). The second reason is that the GTX 1070 typically requires only a single 8-pin power connector. Since the z620 can supply only two 8-pin connectors, something like the 1080 Ti (which typically needs two power connectors) is only practical if I have no desire to expand beyond one GPU.
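Once the machine is up, it is easy to confirm how much VRAM each card actually exposes. This is a minimal sketch assuming PyTorch with CUDA support is installed; plain nvidia-smi reports the same numbers:

```python
# Minimal sketch, assuming PyTorch with CUDA is installed: list each
# visible GPU and its total VRAM, which bounds the model size and the
# practical batch size you can train with.
import torch

for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB VRAM")
```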
The first GPU I selected was a cheap Gigabyte model that I bought off of an online forum.
The second GPU, which I bought after using the machine for a while, was a blower-design HP OEM card that I bought on eBay. This one was slightly more expensive due to having to pay tax.
Hard drives
Data takes up a lot of space, and I wanted to have no shortage of spinning metal to put my data on. The z620 came with one 500 GB drive. I also had a number of 1 TB 2.5" hard drives around from a failed attempt at using them in my ProLiant DL380 (the temperature sensors on those drives did not agree with that machine).
The z620 has two 5.25" bays. Only the top one has a CD drive. I bought a caddy from IcyDock that enabled me to put four 2.5" drives in one 5.25" bay.
Cost breakdown
| Item | Cost (USD) |
|----|-----|
| Chassis | 200 |
| GTX 1070 | 200 |
| GTX 1070 | 215 |
| IcyDock Chassis | 20 |