3 February 2016

7 Gamers in 1 Tower - Behind the Scenes

Ever wonder what kind of gaming rig you could build with $30,000? Nope, neither did we, but Linus Sebastian of LinusTechTips sure did. Near the end of 2015, we worked with Linus on a number of videos that incorporated Unraid, and those projects were a blast to work on.

Little did we know, those were just a precursor to a bigger, crazier plan that Linus had in mind. There's a limit to how much raw cash can improve a single gaming PC's performance, so instead of chasing the highest performance in one machine, Linus wanted to do what IT professionals call "scaling horizontally." That's a fancy way of saying, "add more players to the system." In short, this was an extension of what was done in the 2 Gamers, 1 CPU video, but for this project, we were truly traveling into uncharted territory. We neither expect nor suggest that anyone try to recreate this setup for themselves just yet (there are some glitches that need working out), but it certainly demonstrates Unraid's versatility and scalability as an operating system. Whether you want to build a lightweight system for mass media storage or a high-end rig with enough GPUs to support a LAN party in a box, we've got you covered. So check out the video for yourselves and then keep on reading for even more info about this project.

We had no idea if this was going to work


7 GPUs, 256GB of RAM, 2 CPUs, 1 motherboard, 1 power supply, and 7 sets of keyboards and mice, all to create and run 7 local gaming VMs on Unraid at the same time. Needless to say, this was an intense, yet fun and exhilarating project. There was so much to it that couldn't be covered in the time allotted for the video, so we wanted to provide some behind-the-scenes info on this setup. This truly was a test of might for both hardware and software. We knew we were getting the best of the best in terms of equipment, and with a master PC builder like Linus at the helm, we weren't the slightest bit worried about system assembly. However, would the components hold up? Would the OS? We honestly didn't know, because it had never been done before. All we did know was that at the end of it all, we would definitely have a story to tell...one way or another.

That's a lot of components

The first challenge was hardware selection. While we have to give credit to Linus for the custom build, cooling, and case selection, we actually didn't have much choice for the rest of the parts. For the motherboard, we had a ton of devices to attach (16 USB devices, 8 SSDs, 2 CPUs, and 7 GPUs), so the Asus board we used was honestly one of the only viable options anyway. For CPUs, Linus sized things so each user would have as much processing power as a decent single gaming rig (4 CPU cores per player). And when it came to GPUs, it didn't take long for us to realize we didn't have any choice at all. The cards had to be:

  • high-performance
  • a single PCIe slot wide
  • water-cooled
  • able to run, all seven together, off a single PSU

While technically there were other options that could have met the first two requirements, power consumption was ultimately what led us to the R9 Nano. Because these GPUs only require 1 PCIe power connector each, we knew we'd be well within what a single power supply could reliably deliver. And even though these cards are known to thermally throttle under their stock coolers, water cooling let them all maintain full clock speed throughout testing.


And there was one other requirement we didn't even think about in advance (Linus even mentions this during the video): the cards needed to be short enough that the PCIe locks on the motherboard would remain accessible, so the cards could be safely removed if necessary. This was further complicated by the fact that we had to use a single cooling block for all 7 GPUs, which meant that removing one card would mean removing them all. And even though you could probably reach the PCIe locks for the top and maybe the bottom GPUs, the 5 in the middle would be nigh impossible. So while there are other GPUs we could have considered, like the EVGA GTX 980 Kingpin ACX 2.0, and while we could have gone with a different/larger case and dual PSUs to support the power draw, the length of those cards would ultimately mean that once they were installed, you wouldn't be taking them out again. And while snapping the locks off did cross our minds, it was a short-lived idea. In the end, we really wanted to leverage the lower power consumption and smaller profile of the R9 Nano to keep the rig running off a single PSU.

Preliminary testing


For the 2 Gamers, 1 CPU project, we knew in advance that it was going to work just fine because it had already been done (both by myself and by many others in our community). And while we had read some reports of folks getting 3 and 4 GPUs working, without having tested it ourselves, we couldn't really be sure of the stability or performance. Thankfully, all the hardware for the project arrived weeks ahead of filming (except for the 7-way water cooling block that was being custom made for us), so we asked Linus to build the rig with four of the GPUs (with their stock air coolers attached) so we could perform some remote testing and stage the VM setups.


That's when we ran into challenge #1: a POST error stating there were not enough PCIe resources. This message came up even with only 4 GPUs attached to the system. While the system would still boot, GPU passthrough wasn't working correctly (in fact, only one GPU was even working at that point). Thankfully, this was a rather trivial thing to fix: enabling Above 4G Decoding in the BIOS made the message disappear, and the system booted normally.
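For anyone tinkering with a similar multi-GPU host, a quick sanity check after flipping that BIOS switch is simply counting how many VGA-class devices the kernel can see. The sketch below is our own illustration rather than anything from the actual build; it assumes lspci is available on the host and that the cards report themselves as AMD/ATI VGA controllers.

```python
# Illustrative sketch (not part of the original build): count the VGA-class
# AMD devices the host can enumerate after enabling Above 4G Decoding.
import subprocess

EXPECTED_GPUS = 7  # assumption: a fully populated 7-GPU board


def count_amd_gpus() -> int:
    out = subprocess.run(["lspci"], capture_output=True, text=True, check=True).stdout
    return sum(
        1
        for line in out.splitlines()
        if "VGA compatible controller" in line and ("AMD" in line or "ATI" in line)
    )


if __name__ == "__main__":
    found = count_amd_gpus()
    print(f"Found {found} of {EXPECTED_GPUS} expected GPUs")
```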

With that out of the way, challenge #2: how do you remotely configure and test VMs with GPU passthrough? TeamViewer alone wasn't enough for me to truly set up and test this. Even RDP wouldn't be sufficient for troubleshooting, because the key here is making sure that the video for each VM is output to a physically attached monitor. As such, our solution was rather simple: connect a webcam to the Windows PC I was remotely controlling and point it at a bank of monitors attached to the rig. This let us see whether the monitors were actually lighting up, but watching the camera app in Windows through a remote session wasn't a good way to gauge whether the tests we were running were going smoothly. For that, we had to actually record video from that camera and then upload it somewhere we could get at it. This worked surprisingly well, especially thanks to LTT's crazy-fast Internet (uploading at over 300 Mbps).

Weird issues

While we succeeded in getting GPU passthrough to work, we witnessed a pretty major issue with GPU resets. When we shut down a VM and then tried to start it again without rebooting the host, the entire host would crash. A workaround was to "soft-eject" the GPU from within Windows itself before shutting down the VM. This resulted in a clean reset so the VM could be started again safely. That said, it's not really an ideal solution, and a proper fix for this behavior has not yet been identified.
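For the curious, that soft-eject step could in principle be scripted inside the guest rather than clicked through by hand. The snippet below is purely illustrative and not something we used during filming; it assumes Microsoft's devcon utility is installed in the Windows VM, and the hardware ID pattern is a placeholder you'd need to replace with the real one from Device Manager.

```python
# Illustrative only -- not used in the actual 7 Gamers build.
# Disables the passed-through GPU inside the Windows guest before shutting
# down, so the card gets a clean reset. Assumes devcon.exe is on PATH.
import subprocess

GPU_HWID = r"PCI\VEN_1002*"  # placeholder pattern for an AMD card; verify in Device Manager


def soft_eject_gpu() -> None:
    # "devcon disable" releases the device from the guest's driver
    subprocess.run(["devcon", "disable", GPU_HWID], check=True)


def shutdown_guest() -> None:
    # Standard Windows shutdown with a short delay
    subprocess.run(["shutdown", "/s", "/t", "5"], check=True)


if __name__ == "__main__":
    soft_eject_gpu()
    shutdown_guest()
```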

Another odd behavior: booting up sometimes felt like pulling the lever on a slot machine when it came to whether or not all the GPUs would actually show up and work. Occasionally we would witness a GPU go into a D3 power state and never recover. Sometimes only 6 of the 7 GPUs were even visible in the PCI devices list. It's possible that one of the GPUs was faulty, or perhaps the motherboard itself was. Or perhaps no one had ever tested putting 7 GPUs in this system before and it's just quirky hardware behavior. It was too difficult to diagnose at that point, especially when you're trying to diagnose remotely.
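As a small diagnostic aid for anyone chasing the same ghost, the sketch below lists every AMD display device the Linux host enumerated along with its reported power state, so a card stuck in D3 stands out. It's our own illustration, assuming AMD's PCI vendor ID (0x1002) and a kernel that exposes the power_state attribute in sysfs.

```python
# Illustrative diagnostic (not from the original build): print the power state
# of every AMD display-class PCI device so a card stuck in D3 is easy to spot.
from pathlib import Path

AMD_VENDOR_ID = "0x1002"


def report_gpu_power_states() -> None:
    for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
        vendor = (dev / "vendor").read_text().strip()
        pci_class = (dev / "class").read_text().strip()
        # class 0x03xxxx covers VGA/3D/display controllers
        if vendor != AMD_VENDOR_ID or not pci_class.startswith("0x03"):
            continue
        state_file = dev / "power_state"  # may not exist on older kernels
        state = state_file.read_text().strip() if state_file.exists() else "unknown"
        print(f"{dev.name}: power_state={state}")


if __name__ == "__main__":
    report_gpu_power_states()
```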

It is for these reasons that we wouldn't recommend anyone try to recreate this exact build on their own.

But is it cheaper than building 7 PCs?

Lots of things go into this question, but know that making it "cheaper" wasn't the goal of the project. That said, let's bust out the old calculator and do the math for ourselves. While Linus did post a screenshot of an Excel spreadsheet with a cost summary, it looked like MSRP figures rather than street prices. As such, we wanted to take a stab at configuring this behemoth on PC Part Picker so that we could then create comparative builds with an even playing field on pricing. Because some of the parts used aren't on PC Part Picker (custom cooling, etc.), we had to use their "add custom part" feature and look up those components by hand. And because PC Part Picker wouldn't let us add more than 4 GPUs, we had to add another custom part to represent the cost of the 3 additional R9 Nano GPUs. You can view the entire build list if you want, but the grand total came out to $27,119.59. Now, a few things about this build:

  • we could have easily used less than half the total RAM and still had room to spare
  • the monitors represented over a third of the total cost
  • we did not need 8 SSDs for this, thanks to btrfs copy-on-write (see the sketch after this list)
  • we did not need water cooling for the CPUs (only the GPUs)
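On that copy-on-write point, here's a rough illustration of how a single "gold" Windows vdisk can be cloned for each player without paying for seven full images' worth of SSD space. The paths and share layout are made up for the example; adjust them to your own setup.

```python
# Hypothetical example of btrfs copy-on-write cloning -- paths are placeholders,
# not the actual share layout used in the build. Each reflinked copy shares
# data blocks with the template until the guest starts writing to it.
import os
import subprocess

GOLD_IMAGE = "/mnt/cache/domains/gold/vdisk1.img"  # assumed template vdisk

for n in range(1, 8):
    target_dir = f"/mnt/cache/domains/gamer{n}"
    clone = f"{target_dir}/vdisk1.img"
    os.makedirs(target_dir, exist_ok=True)
    # --reflink=always fails loudly if the filesystem can't do CoW clones
    subprocess.run(["cp", "--reflink=always", GOLD_IMAGE, clone], check=True)
    print(f"cloned {GOLD_IMAGE} -> {clone}")
```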

Now, all that said, let's pretend that each gamer needed a standalone PC matching the capabilities of one of our VMs, spec for spec. That means not only equivalent computing power, but equivalent cooling as well, and cooling costs are really where 7 Gamers shines against building 7 independent rigs. Because each of the seven individual rigs needs to water cool its CPU and GPU independently, the costs add up quickly: each system would need two blocks, one water jacket, and one pump, which would easily set you back $500 per machine for anything near the quality of components Linus was using. The 7 Gamers machine, by contrast, only needed one 7-way block plus jackets for the GPUs, two CPU blocks, and two water pumps in total. So we came up with another build list to represent one of the 7 individual gaming rigs you'd have to create to match that equivalence. This came out to $4,153.50 per machine, which puts the grand total for 7 of them at $29,074.50, exceeding the cost of the 7 Gamers system by just under $2,000.
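The arithmetic behind that comparison is simple enough to spell out (the dollar figures are the ones quoted above; the builds themselves live on PC Part Picker):

```python
# Quick check of the totals quoted above.
seven_gamers_total = 27_119.59   # the single 7 Gamers tower, full build list
per_standalone_rig = 4_153.50    # one equivalent individual gaming rig

separate_total = per_standalone_rig * 7
print(f"7 separate rigs: ${separate_total:,.2f}")                       # $29,074.50
print(f"7 Gamers tower:  ${seven_gamers_total:,.2f}")                   # $27,119.59
print(f"savings:         ${separate_total - seven_gamers_total:,.2f}")  # ~$1,954.91
```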

While we could debate for ages over the build list and how else money could be saved on the components themselves, there is very little point in doing so. For any trade-offs we could make in building 7 separate rigs, equivalent sacrifices could be made on the 7 Gamers system to keep the capital costs in line. But what about power consumption? Well, the biggest downside is that the 7 Gamers system will always have a higher minimum power draw than what it would take to power just one individual player. That said, if the goal is to support 7 players concurrently, then that is what we measure power draw against, and 7 individual gaming rigs under load would easily consume as much as 3x the power of the 7 Gamers rig.

So what’s the point?

The 7 Gamers project is a proof of concept demonstrating what you get when you combine capable hardware with powerful software. There are a number of exciting things on the horizon with Unraid and virtualization technology that will continue to redefine how we think about personal computing and network-attached storage. And we're looking forward to even more videos featuring Unraid on LinusTechTips in the future!

Discuss this in the Unraid Community Forum

Registration required to post.