Virtual Tune-Up
Simulated design and upkeep for a fighter-jet engine


[Image: A Pratt & Whitney engineer rigged with mo-cap sensors]
The worlds of motion capture and military jet technology have collided. Under a long-term $4.8 billion system-development contract with the U.S. government, military aircraft contractor Pratt & Whitney is using wireless motion capture, head-mounted VR displays and data gloves to fine-tune the heart of Lockheed Martin’s F-35 Joint Strike Fighter. The VR environment, to be used throughout the fighter jet’s development, has both a design component and a training component. By interacting directly with digital CAD models in a simulated environment, Pratt & Whitney engineers can design an engine without the need for physical prototypes.

But they will also be able to rehearse and review potential maintenance problems the engine might have once it is installed in the aircraft. According to Tom Farmer, vice president of the company’s engine program, virtual reality combined with motion capture saves time and money on the front end. “Because we can use our own CAD models with this technology, we can correct any issues more quickly and for less cost than [we could] with physical models and mock-ups,” he says.

This is not the first time Pratt & Whitney has used virtual means to develop equipment, but it is the first time the company has integrated motion-capture technology into the development process. Once motion capture entered the equation, Pratt & Whitney could track a real engineer’s movements within a virtual environment, giving the designers a more accurate picture of how an engineer will interact with the engine once it is installed. Previously, the company relied on digital human stand-ins driven by a keyboard.

After Pratt & Whitney had done some trial work with Vicon Motion Systems (www.vicon.com), a Lake Forest, California-based developer of motion-capture systems, the company turned to VR Simulation (www.vrsim.net), a virtual-reality integrator and developer in New Britain, Connecticut, to bring the pieces of the technology together into one application. VR Simulation has just finished the initial proof of concept for the application and is now developing the part of the program that will let engineers assess how well the engine’s external components are working.

Virtually Wireless
Standing in a 20-foot-square area rigged with eight Vicon motion-capture cameras, an engineer wearing data gloves and a head-mounted display performs each exercise while a review board of fellow engineers watches the simulation on a projection screen. Any problems with a task’s complexity or completion time are then documented and fed back into the engine’s design.

[Image: Pratt & Whitney’s original CAD models of the engine, complete with text references, were translated into the 3D immersive environment.]

“Basically, the app lets the engineers figure out if they can reach a particular part while the engine is still inside the aircraft,” says Tim Gifford, VR Simulation’s chief technology officer. “We use all different kinds of trackers in our VR applications, but as you can imagine, there’s complicated physical maneuvering involved here. It’s much more than simply opening the hood and taking a look.”
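As a rough sketch of what such a reach check involves, consider the following Python fragment. It is entirely hypothetical (the article does not describe the application’s internals): part geometry is reduced to axis-aligned bounding boxes, and the question is simply whether the tracked glove position touches the target part without first passing inside a neighboring component.

    # Hypothetical reach check: given the tracked path of a glove, did
    # the hand touch the target part without passing through any
    # neighboring component? Parts are reduced to axis-aligned bounding
    # boxes here; a real application would test against full CAD surfaces.
    from typing import List, Tuple

    Point = Tuple[float, float, float]
    Box = Tuple[Point, Point]  # (min corner, max corner)

    def inside(p: Point, box: Box) -> bool:
        lo, hi = box
        return all(lo[i] <= p[i] <= hi[i] for i in range(3))

    def reach_check(hand_path: List[Point], target: Box,
                    obstacles: List[Box]) -> str:
        for p in hand_path:
            if any(inside(p, obs) for obs in obstacles):
                return "blocked: hand intersects another component"
            if inside(p, target):
                return "reachable"
        return "not reached in this attempt"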

Though Vicon’s full-body tracking system is wireless, the immersed engineers are still tethered by the cables attached to the VR gear, including the head-mounted display and two gloves. “We’re working toward a completely wireless environment,” says Gifford. “But first we need to develop a way to multiplex all those signals—including the data from the gloves and the video signal that is sent back to the head mount for the 1,024 x 768 display—so we can send them over one set of transmitters and receivers. Eventually, we’d also like to get all the gear to run off of a single power source. I think that would really clean things up and give the engineers even more freedom of movement.”

Because the Vicon system outputs raw marker positions (just points in space), the application needed to translate that information into a realistic representation of the engineer’s human form on the projected image for the review panel. “We ended up writing a proprietary driver so we could interface the entire system,” says Gifford. VR Simulation also used Sense8 World Toolkit, Discreet 3ds max, Adobe Photoshop and Okino’s PolyTrans scene-conversion software during development. In addition to the data gloves from 5DT and Immersion, VR Simulation used Kaiser Optics head-mounted displays and Dell workstations running Windows 2000.
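A minimal sketch of that translation step, assuming a handful of named markers and a stick-figure skeleton (the marker names and layout below are invented for illustration and are not Vicon’s actual output format), might look like this in Python:

    # Hypothetical sketch: turning one frame of labeled marker positions
    # (points in space) into the line segments of a stick figure that a
    # renderer could draw for the review panel. Markers occluded for a
    # frame are skipped; a real driver would interpolate or infer them.
    from typing import Dict, List, Tuple

    Point = Tuple[float, float, float]

    # Marker pairs that form the "bones" of a minimal stick figure.
    SKELETON: List[Tuple[str, str]] = [
        ("head", "chest"), ("chest", "pelvis"),
        ("chest", "l_elbow"), ("l_elbow", "l_wrist"),
        ("chest", "r_elbow"), ("r_elbow", "r_wrist"),
        ("pelvis", "l_knee"), ("l_knee", "l_ankle"),
        ("pelvis", "r_knee"), ("r_knee", "r_ankle"),
    ]

    def pose_segments(markers: Dict[str, Point]) -> List[Tuple[Point, Point]]:
        return [(markers[a], markers[b]) for a, b in SKELETON
                if a in markers and b in markers]

In the real system, of course, the driver feeds a full human figure rather than line segments, and it runs in real time against the live marker stream.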



