Exploiting Audio-Visual Cross-Modal Interaction to Reduce Computational Requirements in Interactive Environments
The quality of real-time computer graphics has progressed enormously in the last decade, driven by rapid developments in graphics hardware and its exploitation of new algorithms and techniques. The computer games industry, with its substantial software and hardware requirements, has been at the forefront of these developments. Despite all the advances, there is still demand for ever more computational resources. Sound effects, for example, are an integral part of most computer games. This paper presents a method for reducing the effort required to compute the computer graphics of a game by exploiting movement-related sound effects. We conducted a detailed psychophysical experiment investigating how camera movement speed and sound affect the perceived smoothness of an animation. The results show that walking (slow) animations were perceived as smoother than running (fast) animations. We also found that adding sound effects, such as footsteps, to a walking or running animation affects the perception of animation smoothness. This implies that, under certain conditions, the number of frames rendered each second can be reduced, saving valuable computation time. Our approach thus enables the computed frame rate, and hence the computational requirements, to be lowered without any perceivable loss of visual quality.