Controlling Creatures with Linux
Motion-mixing algorithms are served up via software tools provided by a process called the Tool Server. The Tool Server establishes the relationships between live performance data and the algorithms the Motion Engine will perform on it, using direct access to objects and datasets in shared memory. Designed by Michael Babcock, it is, as the name implies, an asynchronous server that speaks with graphical user interface clients that connect to it.
The GUI is a “not-so-thin” client that connects to the Tool Server via a socket. Michael Babcock wrote the actual application in GTKmm, based on an original high-level interface design by consultant Robert McNally. The client/server architecture allows multiple instances of the user interface to be run either by technicians on other machines or by additional puppeteers.
There are times during production when a technician assists puppeteers with administrative tasks, so a networked GUI means the technician can do those tasks without kicking anyone off the system. A custom protocol is spoken between the GUI and Tool Server. The ability for a technician to administer our Control System application remotely is of great value to us, as production locations may be anywhere around the world, and support staff back at The Creature Shop can have complete access for the purposes of debugging or assisting.
Since Linux itself provides mechanisms for remote access to the rest of the operating system, a technician back at The Creature Shop can give us unparalleled support for systems on the road.
As you can see in the schematic diagram, the back end of the control system, beyond the Motion Engine, can vary. We can perform animatronics with embedded processors in a remote onboard computer (ROC) scenario, and we can perform 3-D Computer Graphic puppets with a Viewer back end.
In one 60Hz frame, the Motion Engine obtains an analog-to-digital converter (ADC) snapshot of our physical input devices, executes motion-mixing algorithms based on the character setup, and transmits the data to ROC clients.
Our animatronic ROCs are embedded PCs running DR DOS. Steve Rosenbluth wrote the ROC code in “somewhat object-oriented C” for speed, as it is a fairly lean-and-mean piece of code. The ROC protocol allows for multiple devices on the communications link, which is currently RS-232. This aging interface is not as high-bandwidth as modern networking hardware, but it is the easiest to use over fiber, copper and radio. RS-232 is a hard real-time interface with predictable triggering and latency, which we need. We use plastic fiber optics to transmit RS-232 when a creature is tethered and spread spectrum for wireless creatures.
The computer graphics Viewer is the software module that renders and displays a computer graphic (CG) puppet on a computer screen. The Viewer represented in the schematic diagram can be one of an array of CG modeling packages or game engines that can render quickly enough to display an OpenGL scene live. Computer graphics puppeteering at the Henson Company was pioneered by Digital Supervisor Hal Bertram in the London Creature Shop in the early 1990s. More recently, some of the PC-based CG applications for which Michael Babcock has written control system plugins include Discrete's 3-D Studio Max, Side Effects' Houdini, Kaydara's Filmbox and Alias|Wavefront's Maya.
A control system Actuator, in the CG realm, is scalar channel data that can move a 3-D mesh deformation or a blend shape. A UDP network connection from the Control Computer's Motion Engine streams live motion data into the Viewer. Viewers behave as ROC clients from the perspective of the Motion Engine, in that they speak the same ROC protocol.
With dual AMD Athlon machines, we get frame rates above 100FPS, enough to put multiple puppets in one scene. Character Technical Director Jeff Christie helped complete the picture by perfecting a 3-D character setup that gets fast, lifelike performance from our CG models.
The Viewer supports connections from multiple motion engines, which means that, in a scene with multiple characters, each can be performed by its own control system and puppeteer, somewhat like a networked game.