Summary

In this module, you learned how to add eye tracking to objects so that actions are triggered when the user looks at them. You also learned how to create speech commands and how to control them globally. Including voice commands and eye tracking can benefit some applications and some users by reducing the physical movement required for interactions.

Next steps

You can continue to add eye tracking and voice commands to your applications. Consider trying the following interactions:

  • Use eye gaze to select an object and move it with your hands.
  • Change the color of an object by saying "Change color" (see the sketch after this list).
  • Add local speech commands that require the user to look at the object that handles the command by selecting Voice Requires Focus.
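
As a starting point for the second suggestion, the following is a minimal sketch of a component that changes an object's color when the "Change color" keyword is recognized. It assumes MRTK's IMixedRealitySpeechHandler interface and that "Change color" has been added as a keyword in the Speech Commands profile; the class name and color field are placeholders for this example.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Example component: changes this object's material color when the
// "Change color" speech command is recognized. Assumes the keyword is
// registered in the MRTK Speech Commands profile.
public class ChangeColorOnSpeech : MonoBehaviour, IMixedRealitySpeechHandler
{
    [SerializeField] private Color targetColor = Color.cyan;

    private Renderer objectRenderer;

    private void Awake()
    {
        objectRenderer = GetComponent<Renderer>();
    }

    private void OnEnable()
    {
        // Register globally so the command works even when the object isn't focused.
        CoreServices.InputSystem?.RegisterHandler<IMixedRealitySpeechHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySpeechHandler>(this);
    }

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        if (eventData.Command.Keyword.Equals("Change color", System.StringComparison.OrdinalIgnoreCase))
        {
            objectRenderer.material.color = targetColor;
        }
    }
}
```

If you want the command to respond only while the user is looking at the object, skip the global registration and rely on focus-based routing, which mirrors the Voice Requires Focus behavior described above.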

Further reading