
The History of Linux in VFX and Animation

Author   Aaron Estrada
Date Tue 28 March 2017
Tags      Linux, tech, explainer

People often seem surprised when I tell them that effectively all high-end VFX and animation shops use Linux. Surely something like Windows or OSX would be more convenient, right? Maybe so… or maybe not. As Ethan mentioned in an earlier post, Linux has some distinct advantages when deployed at scale and in a pipeline versus traditional desktop operating systems like Windows and OSX. And ignoring the history behind how the high-end shops ended up where they are today would dismiss a critical part of the overall story. Animation and VFX shops don't use Linux only for the advantages it offers today; there is a certain amount of legacy behind the choice. Let's look at the history.

Almost all of the larger C.G. shops started back in the Unix days, some of them before Windows or Mac OS even existed. The earliest C.G. studios were even using Lisp-based machines and supercomputers with custom OSes. Most of those studios no longer exist. Regardless, the legacy of the older studios and the development of IT in general impact how we do things today.

TRON was released in 1982, over 35 years ago! At the time, 3D C.G.I. required a supercomputer to render.

Image of Cray-1 Supercomputer

Pictured above is a Cray-1 supercomputer. One of these was used to render the C.G.I. in TRON. In 1977 this machine cost $5M to $8M depending on the options ordered. It had up to about 8 MB of RAM, ran at 80 MHz, and provided 160 MFLOPS of compute power. To put that in perspective, assuming you have a relatively new smartphone, the ARM CPU in your phone almost certainly provides over 1 GFLOPS (that's 1,000 MFLOPS!) of computing power, and it most likely has one or more GIGABYTES of RAM. Think about that. Your phone is more powerful than the SUPERCOMPUTER (at the time) used to render TRON. More amazingly, it's likely more powerful than the computers used to render even Jurassic Park! Imagine how powerful modern PC hardware is by comparison.
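If you like, you can put numbers on that comparison yourself. Here's a quick back-of-the-envelope sketch; the Cray-1 figure is its published peak, while the phone figure is a deliberately conservative assumption for a modern ARM SoC, not a measurement of any particular device.

```python
# Back-of-the-envelope: Cray-1 (1977) vs. a modern smartphone CPU.
# CRAY1_FLOPS is the Cray-1's published peak; PHONE_FLOPS is a
# conservative assumption, not a benchmark of any real phone.
CRAY1_FLOPS = 160e6   # 160 MFLOPS peak
PHONE_FLOPS = 1e9     # >= 1 GFLOPS (assumed, conservative)

flops_ratio = PHONE_FLOPS / CRAY1_FLOPS
print(f"A modern phone has roughly {flops_ratio:.1f}x the Cray-1's peak compute")
```

And real phones land well above that 1 GFLOPS floor, so the true ratio is even more lopsided.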

Of course, back when the hardware required to render 3D C.G.I. filled a room and cost over $5 million, the desktop computers of the same era were not even close to being up to the task. There was no 3D hardware acceleration, and home computers were only 8- or 16-bit. Most had no provision for calculating floating-point math in hardware, which is critical for 3D rendering at any reasonable speed. RAM was still incredibly expensive, and even if it had been affordable, those machines couldn't address very much of it. They were essentially toys compared to the computers used to do computer graphics at the time. Windows didn't even exist, and neither did Mac OS.

Anyone who has heard of Moore's Law knows the state of the art in computing improves at an exponential pace. Computer graphics requires large amounts of compute, and for years the cost of that compute was high. After the era of mainframes and supercomputers for C.G. production came the era of the minicomputer and the "workstation". These machines were still expensive, but at least they fit in the same room as the artist, and relative to the mainframes and supercomputers they replaced for C.G. work they provided very good performance per dollar. Most of them ran Unix, and this was the era of early C.G. blockbuster films like Terminator 2, Jurassic Park, and Toy Story. There was a lot of growth in the C.G. industry in that period: not only was animation starting to shift over to computer graphics, visual effects were also driving the demand for photo-real C.G.

In parallel with the development of modern C.G.I. production, the dot-com and WWW era of the Internet was maturing. Unix was building a strong foothold in the datacenter thanks to Sun, HP, DEC, and SGI servers, which all ran proprietary Unix variants. The demand for servers to feed the growing appetite for data delivered via the Internet drove sales of these machines. It was a less glamorous application of the technology than C.G., but the higher volumes being sold helped drive an economy of scale that made the machines more affordable for C.G. work.

TUX, the Linux Mascot

By around 1995, Linux, BSD Unix, and PC hardware were becoming mature enough relative to the established minicomputer and "workstation" class machines that some adventurous Internet server admins began moving parts of their workloads onto PC hardware. Graphics accelerator cards capable of running OpenGL became available for the PC. Folks in the VFX and animation business noticed. Home computers were no longer toy-like: they had 32-bit processors with integrated floating-point units, could accept large amounts of RAM, and could be networked via Ethernet to high-performance network-attached storage. They were getting good enough to compete with the workstation-class machines, and clusters of them made inexpensive render farms.

The demand for home computers commoditized the PC and helped drive the PC revolution. The economy of scale that comes with selling commodity hardware allowed for R&D budgets the workstation and minicomputer vendors couldn't match. PC hardware caught up with workstation-class hardware and started to surpass it in many ways. By around 1997, when Linux had begun to really mature, commodity PC hardware was sufficiently faster per dollar than the "workstation" and "minicomputer" class machines from legacy vendors like Sun, SGI, HP, and IBM that the economics could no longer be ignored.

Running Linux on the new PC hardware was the obvious choice, because all the custom software (after re-compilation) and automation scripts the shops had created over the years for Unix would just keep working as usual. Porting that custom code to a non-Unix-like OS would have been a huge chore, but tweaking it for Linux (which is mostly POSIX compliant) was far less difficult. Plus, all the artists at the shops knew and trusted Unix-like systems. At the time, Windows was for playing video games and doing spreadsheets, not making C.G. for feature films. A little bit of trivia: I worked at one shop that still used csh (rather than the more modern Bash shell or a more robust scripting language like Python) because so much of its automation still depended on csh. Legacy support was definitely part of the equation when making the switch from proprietary Unix variants to Linux.

So, that's a quick history of how we ended up where we are. But does that mean we only use Linux today because we're stuck supporting old legacy code from the stone ages? Far from it! I have to agree with Ethan that even today, Unix-like OSes are more suitable for large-scale deployments and automation. There is a reason nearly the entire Internet and your smartphone run Linux or some flavor of Unix, and what works at scale for the Internet also works at scale for C.G. production. The standard system shell and filesystem semantics alone completely smoke anything on Windows. Scripting languages like Python are first-class citizens, and there is a wealth of Free Open Source Software available to address just about any conceivable need. Basically, with Linux you have great building blocks for creating large systems. Plus, the larger the render farm, the more you save running Linux vs. Windows. (While OSX is based on Unix, it isn't an option, since Apple doesn't license the OS alone; it's not possible to run OSX in the cloud.) Netbooting Linux is also more straightforward than netbooting Windows, which makes managing huge fleets of machines easier.
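To make the "great building blocks" point concrete, here's a tiny example of the kind of pipeline glue scripts are used for every day: scanning a delivery of rendered frames and reporting any missing frame numbers before a shot goes to comp. The `shot.####.exr` naming scheme is a hypothetical convention for illustration, not any particular studio's.

```python
# Find gaps in a rendered frame sequence. The "name.####.exr" naming
# convention here is assumed for the example, not a standard.
import re

def missing_frames(filenames, first, last):
    """Return sorted frame numbers absent from `filenames` in [first, last]."""
    pattern = re.compile(r"\.(\d{4})\.exr$")
    present = set()
    for name in filenames:
        m = pattern.search(name)
        if m:
            present.add(int(m.group(1)))
    return sorted(set(range(first, last + 1)) - present)

frames = ["shot.0001.exr", "shot.0002.exr", "shot.0004.exr"]
print(missing_frames(frames, 1, 4))  # -> [3]
```

On a Unix-like system this slots straight into shell pipelines and cron jobs; multiply it by hundreds of such scripts and you have a pipeline.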

Linux supports most of the high-end DCC applications a C.G. artist might want to use, but a few notable examples, like Photoshop, are missing. At present, this is really the only negative of using Linux vs. something else. Of course, the same could be said of any OS; there is always some killer app missing from any given platform. (Final Cut is missing from Windows, 3DS Max is missing on OSX, etc.)

If you are an aspiring VFX artist, I recommend learning enough Linux to at least get around in it. Consider learning Python and possibly some more advanced shell scripting. Assuming your other skills are solid, a working knowledge of Linux will help you stand out as an artist & TD.
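If you want a feel for the kind of task that makes Python worth learning, here's a starter exercise: renumbering a frame sequence to start at 1001, a common convention in production. This is pure string work so you can try it anywhere; the filenames are made up, and a real tool would go on to rename the files on disk.

```python
# Starter TD exercise: map old frame names to new names renumbered
# from 1001. Filenames are hypothetical; this only builds the mapping,
# it does not touch the filesystem.
def renumber(filenames, start=1001):
    """Return {old_name: new_name} with frames renumbered from `start`."""
    out = {}
    for offset, name in enumerate(sorted(filenames)):
        stem, frame, ext = name.rsplit(".", 2)
        out[name] = f"{stem}.{start + offset:04d}.{ext}"
    return out

print(renumber(["beauty.0001.exr", "beauty.0002.exr"]))
```

Small utilities like this are exactly the "getting around in Linux" skill set that sets a TD apart.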