But they will also be able to rehearse and review potential maintenance problems that the engine might have once it is installed in the aircraft. According to Tom Farmer, vice president of the company's engine program, virtual reality combined with motion capture saves time and money on the front end. "Because we can use our own CAD models with this technology, we can correct any issues more quickly and for less cost than [we could] with physical models and mock-ups," he says. This is not the first time that Pratt & Whitney has used virtual means to develop equipment, but it is the first time the company has integrated motion-capture technology into the development process. Once motion capture entered the equation, Pratt & Whitney could track a real engineer's movements within a virtual environment, giving the designers a more accurate picture of how an engineer will interact with the engine once it is installed. Previously, the company relied on digital human stand-ins driven by a keyboard.
After Pratt & Whitney had done some trial work with Vicon Motion Systems (www.vicon.com), a Lake Forest, California-based developer of motion-capture systems, the company turned to VR Simulation (www.vrsim.net), a virtual-reality integrator and developer in New Britain, Connecticut, to bring the technology’s pieces together into one application. VR Simulation has just finished the initial proof of concept for the application and is now developing the part of the program that will let engineers assess how well the engine’s external components are working.
Standing in a 20-square-foot area rigged with eight Vicon sensor-equipped cameras, an engineer wearing data gloves and a head-mounted display performs each exercise while a review board of fellow engineers watches the simulation on a projection screen. Any problems with a task’s complexity or completion time are then documented and integrated back into the engine’s design.
Though Vicon's full-body tracking system is wireless, the immersed engineers are still tethered by the cables attached to the VR gear, including the head-mounted display and two gloves. "We're working toward a completely wireless environment," says Gifford. "But first we need to develop a way to multiplex all those signals, including the data from the gloves and the video signal that is sent back to the head mount for the 1,024 x 768 display, so we can send them over one set of transmitters and receivers. Eventually, we'd also like to get all the gear to run off a single power source. I think that would really clean things up and give the engineers even more freedom of movement."
Because the Vicon system outputs only marker positions (points in space), the application needed to take that information, translate it, and render it as a recognizable representation of the engineer's body in the projected image for the review panel. "We ended up writing a proprietary driver so we could interface the entire system," says Gifford. VR Simulation also used Sense8 World Toolkit, Discreet 3ds max, Adobe Photoshop and Okino's PolyTrans scene-conversion software during development. In addition to the data gloves from 5DT and Immersion, VR Simulation used Kaiser Optics head-mounted displays and Dell workstations running Windows 2000.
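The article does not describe how VR Simulation's proprietary driver works, but the general step it mentions, reducing raw marker points to a human form, is straightforward to illustrate. The sketch below uses hypothetical marker names and coordinates to show one common approach: estimating a joint center from nearby surface markers and deriving a limb segment from two joints.

```python
import math

# Hypothetical frame of motion-capture data: marker name -> (x, y, z) in meters.
# Names and values are illustrative only, not from any real Vicon marker set.
frame = {
    "LELB": (0.45, 1.20, 0.05),   # left elbow marker
    "LWRA": (0.60, 1.00, 0.10),   # left wrist marker, radial side
    "LWRB": (0.62, 0.98, 0.12),   # left wrist marker, ulnar side
}

def midpoint(a, b):
    """Estimate a joint center as the midpoint of two surface markers."""
    return tuple((p + q) / 2.0 for p, q in zip(a, b))

def distance(a, b):
    """Euclidean distance between two points, e.g. the length of a limb segment."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

# Wrist joint center from the two wrist markers; the forearm segment then
# runs from the elbow marker to that center.
wrist = midpoint(frame["LWRA"], frame["LWRB"])
forearm_length = distance(frame["LELB"], wrist)

print(f"wrist center: {wrist}")
print(f"forearm length: {forearm_length:.3f} m")
```

A full driver would repeat this for every joint in a skeleton model and stream the resulting pose to the rendering engine each frame; this fragment only shows the point-to-joint translation the article alludes to.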