EyeGuide SDK

You can use the EyeGuide API to build apps that work with the EyeGuide hardware. For example, fly a drone or play a game with just head and eye movement! Augmented Reality (AR) applications are also easy thanks to full access to both cameras' video streams.

The EyeGuide hardware broadcasts all of its data in real time over a simple network protocol. The protocol is trivial to implement, and we provide a sample implementation under a friendly license to help you get started.
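
As a rough sketch of what a client might look like, the snippet below connects to the unit and reads messages in a loop. The host address, port, and newline-delimited JSON framing are assumptions made for illustration; the sample implementation defines the actual wire format.

```python
# Minimal sketch of a realtime protocol client. The host, port, and
# newline-delimited JSON framing are ASSUMED for illustration; consult
# the sample implementation for the real wire format.
import json
import socket

HOST = "192.168.1.1"  # hypothetical address of the EyeGuide unit
PORT = 5000           # hypothetical protocol port

with socket.create_connection((HOST, PORT)) as sock:
    buf = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break  # the unit closed the connection
        buf += chunk
        # Handle each complete newline-terminated message.
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            print(json.loads(line))
```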


With just a few lines of code in the language of your choice, you can access the following (a sketch that handles these messages appears after the list):

Point-of-gaze coordinates (60 times per second)
If calibration has been performed, this is where the eye tracker has determined the wearer is looking in the scene camera image.

Eye camera video data (60 frames per second)
This is the video from the eye camera, compressed using the H.264 codec.

Pupil center coordinates (60 times per second)
This is where the eye tracker detected the wearer's pupil center in the eye camera image.

Scene camera video data (30 frames per second)
This is the video from the scene camera, compressed using the H.264 codec.

Gyro angles (3 angles, 100 sets per second)
These are the usual three accelerometer/gyroscope angles: pitch, roll, and yaw of the wearer's head.
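
Building on the connection sketch above, a client might dispatch on a message-type field to handle these streams. Every field name below ("type", "x", "pitch", and so on) is an illustrative assumption, not the documented schema.

```python
# Hypothetical dispatch over the streams listed above; all field names
# are ASSUMED -- see the sample implementation for the actual schema.
def handle_message(message: dict) -> None:
    kind = message.get("type")
    if kind == "gaze":
        # Point-of-gaze in scene camera coordinates, 60 Hz.
        print(f"gaze: ({message['x']}, {message['y']})")
    elif kind == "pupil":
        # Pupil center in eye camera coordinates, 60 Hz.
        print(f"pupil: ({message['x']}, {message['y']})")
    elif kind == "gyro":
        # Head orientation angles, 100 Hz.
        print(f"pitch={message['pitch']} roll={message['roll']} yaw={message['yaw']}")
    elif kind in ("eye_video", "scene_video"):
        # H.264 payloads would be handed off to a video decoder here.
        pass
```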


In addition to receiving data in real time, you can also make video recordings and download them from the unit later.
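
As a sketch only: if the unit serves recordings over HTTP (an assumption; the actual retrieval mechanism may differ), downloading one could be as simple as:

```python
# Fetch a finished recording, ASSUMING an HTTP interface and this URL
# layout; both are invented for illustration.
import shutil
import urllib.request

RECORDING_URL = "http://192.168.1.1/recordings/rec-0001.mp4"  # hypothetical

with urllib.request.urlopen(RECORDING_URL) as response:
    with open("rec-0001.mp4", "wb") as out:
        shutil.copyfileobj(response, out)  # stream to disk chunk by chunk
```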

Sending audio for playback through the unit's headphones is as simple as piping raw audio data back over the same network connection using audio_data_state messages.
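
Sending a chunk of audio might look like the sketch below. The audio_data_state message name comes from the protocol, but the JSON framing and base64 payload encoding shown here are assumptions.

```python
# Send raw audio for headphone playback. Only the message name
# (audio_data_state) is from the protocol; the framing and the base64
# payload encoding are ASSUMED for illustration.
import base64
import json
import socket

def send_audio(sock: socket.socket, pcm_chunk: bytes) -> None:
    message = {
        "type": "audio_data_state",
        "data": base64.b64encode(pcm_chunk).decode("ascii"),
    }
    sock.sendall(json.dumps(message).encode("utf-8") + b"\n")
```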

You can also perform many maintenance and configuration functions on the EyeGuide hardware, such as setting the wireless network name (SSID) and channel, enabling/disabling the cameras and video streams, and even running commands in a root shell (careful with this one!).
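
A configuration request might look like the hypothetical message below; the message type and fields are invented for illustration and are not the documented protocol.

```python
# Hypothetical configuration message: the "set_wifi" type and its
# fields are invented for illustration, not the documented protocol.
import json
import socket

def set_wifi(sock: socket.socket, ssid: str, channel: int) -> None:
    message = {"type": "set_wifi", "ssid": ssid, "channel": channel}
    sock.sendall(json.dumps(message).encode("utf-8") + b"\n")
```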