Semantic integration of the Airbus VR tool and an advanced CAD environment for aircraft architecture design in an immersive cooperative mode
Once the optimization of aircraft production infrastructure has reached its limits, aircraft assembly simplification becomes the next big step in the race to improved production ramp-up. This simplification is carried out by design teams belonging to multiple organizations: structures engineering, acoustics engineering, systems engineering, manufacturing, customer support and so on. All these organizations work according to rules that need to be synchronized, which often leads to cycling through repeated design iterations. To avoid such cycling, Virtual Reality (VR) rooms enable users from multiple organizations to work together inside the aircraft in immersive mode and at full scale (1:1), as if they were in a real aircraft on the final assembly line.
A first set of such capabilities already exists in the Airbus VR toolset, which is used for aircraft architecture design studies. Its increasingly frequent use brings requirements for additional capabilities, and raises a basic question: is it really worthwhile to extend aircraft architecture design capabilities within the VR tool itself, or is it more suitable to integrate the VR tool with a CAD environment? What follows aims to answer this question.
The main objective of this project is to demonstrate that a major improvement of current design-in-VR capabilities can be designed, bridging the gaps outlined by current system users. Equally, the project aims to demonstrate that, despite the high level of performance required of a VR system, a sufficiently fast integration with a professional CAD system can be envisaged, one that delivers the design capabilities while maintaining a VR quality of service that preserves the immersive experience of the current system.
In this frame, a VR system, composed of a stereoscopic display combined with a motion-tracking infrastructure, can capture and digitize human movements. The idea this project proposes is thus to let users express their design intents to the system through the most tangible interface available: their own body, instead of a keyboard or a mouse.
For instance, a basic movement can consist of joining the hands, then separating them until the planned distance between them is reached. This kind of movement can lead the system either to an exact conclusion (the distance between the two hands) or to multiple candidate interpretations (is the user looking for a cube? for a sphere?). In the latter case, the user is expected to select the object closest to what they need, install it at the right place in the aircraft, and then refine it through the tool's parametric features. This brings a specific performance requirement: the integration with the CAD tool during resizing and positioning, and the management of parametric relationships, must run at interactive rates. Finally, the global system is expected to be showcased on two design scenarios: incremental design (an architecture modification study for an existing aircraft) and bottom-up design (architecture design for a pathfinder study, i.e. for an aircraft that does not yet exist).
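As a minimal illustration of how such a gesture could be mapped to a design intent, the sketch below detects a "join hands, then separate" movement from a stream of tracked hand positions and returns both the exact conclusion (the measured distance) and a list of candidate primitives for the user to choose among. All names, thresholds, and the candidate-shape list here are assumptions made for illustration; they do not describe the actual Airbus toolset or its API.

```python
from dataclasses import dataclass
from math import dist

# Assumed threshold (metres): hands closer than this count as "joined".
JOIN_THRESHOLD = 0.05

@dataclass
class HandFrame:
    """One motion-tracking sample: left/right hand positions in metres."""
    left: tuple
    right: tuple

def interpret_sizing_gesture(frames):
    """Detect 'join hands, then separate' and turn it into a design intent:
    a target dimension plus candidate shapes sized to that dimension."""
    if not frames:
        return None
    # The gesture requires the hands to have been joined at some point.
    joined = any(dist(f.left, f.right) < JOIN_THRESHOLD for f in frames)
    final_d = dist(frames[-1].left, frames[-1].right)
    if not joined or final_d < JOIN_THRESHOLD:
        return None  # no join detected, or the hands never separated
    # One exact conclusion (the distance) and several candidate shapes,
    # as in the cube-or-sphere ambiguity described above.
    return {
        "distance_m": round(final_d, 3),
        "candidates": [
            {"shape": "cube", "edge_m": final_d},
            {"shape": "sphere", "diameter_m": final_d},
        ],
    }

# Example: hands joined, then separated to 0.40 m apart.
frames = [
    HandFrame((0.0, 1.0, 0.0), (0.02, 1.0, 0.0)),  # joined
    HandFrame((-0.1, 1.0, 0.0), (0.1, 1.0, 0.0)),  # separating
    HandFrame((-0.2, 1.0, 0.0), (0.2, 1.0, 0.0)),  # final distance: 0.40 m
]
print(interpret_sizing_gesture(frames)["distance_m"])  # 0.4
```

In a real system the candidate list would come from the CAD tool's object library, and the selected shape would then be positioned and re-dimensioned through the CAD tool's parametric interface rather than the fixed dictionary used in this sketch.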