Craig Ulmer

Tiled-Display Visualization Walls

2012-09-17 viz bestof

A few years ago I had the opportunity to help revive an old interactive visualization room that featured a 29-foot-wide, tiled-display wall built out of 27 DLP projectors. While the projectors in this room were still functional, the 40-node cluster that drove the wall had failed and our users wanted a simpler solution. Thanks to some low-cost, active video splitters, we were able to rig up a system that drove the wall with just two PCs. This hack extended the lifetime of the room and is a good example of how things that are difficult to do at one point in time can become trivial just a few years later.


Tiled-Display Walls

Tiled-display walls were popular in the viz community in the late 1990s and early 2000s because they provided a straightforward way to scale up visualization resolution. The idea was that you could build a tightly-packed, 2D array of displays and drive them with a cluster to give the appearance of one massive display. A number of scientific simulation centers built viz walls (or even caves) with the idea that researchers would come to these rooms to look through their simulation results in high resolution. Rather than zooming in and out at your desktop, you'd just walk over to the region you were interested in and step closer to see the details.

There were a few ways people usually built the display hardware. While some developers used arrays of LCD monitors, most people built systems with projectors, as they were easier to scale up in size and could be made to have zero gaps between the tiles (after a great deal of manual adjustment). Some installations blended multiple front projectors together onto a curved screen, while others used rear-projection approaches that leveraged specially-designed glass to spread the image all the way to the tile edges.

Building a computing system to drive the display array took a substantial amount of hardware and software. Some walls used a high-end video switcher to upsample a single desktop's video and dice it up across the array. The preferred solution, though, was to build a cluster computer whose distributed graphics cards drove the displays. Special-purpose software packages such as Chromium could intercept an application's OpenGL calls and dispatch the rendering operations to the proper nodes in the cluster. ParaView also provides a parallel backend that makes it easy for users to distribute their viz and analysis work across the nodes of a cluster.
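
To make the distribution idea a little more concrete, here's a rough Python sketch of sort-first dispatch, which is essentially what a Chromium-style tile sorter does: look at each object's screen-space bounding box and forward it only to the cluster nodes whose tiles it overlaps. The tile grid, scene objects, and send() stub below are made up for illustration; the real packages work at the level of intercepted OpenGL calls rather than whole objects.

    # Rough sketch of sort-first dispatch: route each object to the nodes
    # whose tiles its screen-space bounding box overlaps.
    # The tile grid, scene, and send() stub are invented for illustration.
    TILE_COLS, TILE_ROWS = 9, 3          # e.g., a 9-wide by 3-high projector array
    TILE_W, TILE_H = 1024, 768           # assumed pixels per projector

    def tiles_overlapping(bbox):
        """Yield the (col, row) of every tile a screen-space bbox touches."""
        x0, y0, x1, y1 = bbox
        for col in range(TILE_COLS):
            for row in range(TILE_ROWS):
                tx, ty = col * TILE_W, row * TILE_H
                if x0 < tx + TILE_W and x1 > tx and y0 < ty + TILE_H and y1 > ty:
                    yield (col, row)

    def send(tile, obj):
        # Stand-in for shipping rendering work to the node that owns a tile.
        print("send", obj["name"], "to the node driving tile", tile)

    # Hypothetical scene: each object carries a precomputed screen-space bbox.
    scene = [{"name": "mesh_a", "bbox": (100, 200, 3000, 1500)},
             {"name": "mesh_b", "bbox": (8000, 0, 9200, 700)}]

    for obj in scene:
        for tile in tiles_overlapping(obj["bbox"]):
            send(tile, obj)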

Local Wall

The group I hired into at the lab did visualization work for scientific computing. The people who came before me had designed and built a few moderate-sized viz walls over time before receiving approval to build a high-end viz room for a new state-of-the-art building that was being constructed. The plans for the viz room were extremely impressive: they decided to build a 29-foot-wide, faceted (but overall curved) screen out of a 3x9 array of 50-inch projector tiles. The rear-projected system could be driven by one of two 40-node clusters, and included an optional commercial video scaling/multiplexing system that allowed a user to drive the wall with a laptop as well. The wall was impressive, though it was easy to get a little vertigo when a mesh dataset was spinning on it.

Unfortunately, the Achilles heel of the viz room was maintenance. Running the distributed software was more complicated than most people could handle, so a viz worker often had to be on call whenever there was a meeting or the wall needed to be brought up. The projectors and clusters were also susceptible to hardware failures; with 27 projectors and 40 compute nodes, it was almost guaranteed that something wouldn't come up right any time the system was restarted. The catastrophic failure, though, was the commercial video multiplexing system. It failed shortly after the support and maintenance contract ran out and was too expensive to fix or replace.

Rethinking the Wall

After the room sat unused for a while, a few of us made an ad hoc effort one summer to get the display wall back in working condition. A survey of the hardware showed the projectors were functional and could be recabled to bypass the video switch without much effort. The cluster that drove the wall, though, had been repurposed, and the common feeling was that a cluster was too complicated to manage for what people wanted to do. We started looking for alternate ways to drive the projectors, preferably with one or two PCs. Initially we tried an AMD Eyefinity card, as it was designed to drive up to six monitors from a single card. During testing, though, we learned that if you wanted to drive more than two monitors, the additional displays had to be true DisplayPort monitors (the card can only generate independent display timing for two legacy outputs). While we could connect our DVI projectors to the card through DisplayPort-to-DVI cables, the projectors still looked like DVI devices to the card and lacked the synchronization the Eyefinity setup needed.


Fortunately, we discovered a clever piece of technology that provided the breakthrough we needed. A few companies, such as Accell, make multi-monitor adapter boxes that split a single DisplayPort connection into three DVI links. The box aggregates the monitors together and presents them to the host as a single, extremely wide display. The aggregators were picky and didn't work with the Eyefinity card, but they seemed to run fine with Nvidia Fermi cards. Each Fermi card had two DisplayPorts and enough memory to drive a total of six displays.
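
To put rough numbers on what the host sees (the 1024x768 projector resolution below is my assumption, not the room's actual spec), each aggregator shows up to the operating system as one very wide display:

    # What the host sees with the DisplayPort aggregators: each box bundles
    # three side-by-side outputs into one "monitor".
    PROJ_W, PROJ_H = 1024, 768          # assumed projector resolution
    LINKS_PER_AGGREGATOR = 3            # one DisplayPort fans out to three DVI links
    DP_PORTS_PER_CARD = 2               # each Fermi card exposed two DisplayPorts

    agg_w, agg_h = LINKS_PER_AGGREGATOR * PROJ_W, PROJ_H
    print(f"each aggregator appears as one {agg_w}x{agg_h} display")
    print(f"one card drives {DP_PORTS_PER_CARD * LINKS_PER_AGGREGATOR} projectors "
          f"as {DP_PORTS_PER_CARD} wide displays")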

Updating our System

We found a PC that could house three Fermi cards, which allowed us to drive a total of 18 projectors from one PC using standard Nvidia device drivers. We decided to split the wall into two sections (3x6 and 3x3) and drive each section with a separate computer. The beauty of this approach was that users could simply log in to one of the two computers and start running their viz applications without any special training.
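
The bookkeeping below is just a sketch to show how the counts worked out; only the section sizes (3x6 and 3x3) and the six-projectors-per-card figure come from the setup described above, and the card counts are simple arithmetic.

    # Sketch of how the 27 projectors split across the two PCs. Only the
    # section sizes and six projectors per Fermi card come from the setup
    # above; the per-PC card counts are just ceiling division.
    ROWS = 3
    SECTIONS = {"pc1": 6, "pc2": 3}   # columns of the wall handled by each PC
    PROJECTORS_PER_CARD = 6           # two aggregators x three DVI links

    for pc, cols in SECTIONS.items():
        projectors = ROWS * cols
        cards = -(-projectors // PROJECTORS_PER_CARD)   # ceiling division
        print(f"{pc}: {ROWS}x{cols} section -> {projectors} projectors, "
              f"{cards} Fermi card(s)")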


Retirement

The updated display wall extended the lifetime of the room by a few years, but in the end the space got reallocated to another group for other purposes. Tearing down the projector array took a considerable amount of effort. In addition to moving hundreds of pounds of projectors, scaffolding, and clusters, someone had to dispose of the projector bulbs safely. I believe ours were xenon arc lamps, for which this safety site recommends: "From approximately three (3) feet in height, drop the cardboard box, with the lamp and protective case inside, onto a hard floor to break the lamp." It was sad to destroy bulbs that still had life in them, but that's better than someone getting hurt by a leftover bulb down the road.

Reviving the viz wall was a great engineering experience for me, and I particularly liked how our team came up with creative ways to bring the wall back online. In retrospect, though, I have mixed feelings about video walls in general. On one hand, they were an interesting technology that produced a number of impressive demos for projects that had visual data and creative developers. On the other hand, viz walls always seemed to come up short in terms of usability, which meant they never achieved the "interactive room" concept their designers wanted. In any case, viz hardware and software are commonplace now, and nothing beats being able to do viz from the comfort of your own laptop. I'm looking forward to seeing whether the upcoming low-cost VR goggles empower users even more.