eyeGestures

Package providing an eye-tracking algorithm for developing gaze-controlled computer interfaces


Keywords
eye, eyetracking, gaze, gazetracking, tracking
License
Other
Install
pip install eyeGestures==2.3.1

Documentation

EYEGESTURES

EyeGestures is an open-source eye-tracking library that uses native webcams and phone cameras. The aim of the library is to make eye tracking and eye-driven interfaces accessible without requiring expensive dedicated hardware.

Our Mission!


💜 Sponsors:

For enterprises avoiding GPL-3.0 licensing, a commercial license is available!

We offer custom integrations and managed services. Businesses requiring invoices can message us at contact@eyegestures.com.

Sponsor us and we can add your link, banner, or other promotional materials!

Subscribe on Polar

🔨 Projects built with EyeGestures:

💻 Install

python3 -m pip install eyeGestures

⚙️ Try

python3 examples/simple_example.py
python3 examples/simple_example_v2.py

🔧 Build your own:

Using EyeGesture Engine V2 - Machine Learning Approach:

from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v2

# Initialize gesture engine and video capture
gestures = EyeGestures_v2()
cap = VideoCapture(0)
calibrate = True
screen_width = 500
screen_height = 500

# Process each frame
while True:
    ret, frame = cap.read()
    event, cevent = gestures.step(frame,
        calibrate,
        screen_width,
        screen_height,
        context="my_context")

    cursor_x, cursor_y = event.point[0], event.point[1]
    # calibration_radius: radius for data collection during calibration
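
Raw gaze estimates tend to jitter from frame to frame. A simple exponential moving average can steady the cursor between updates; the helper below is a hypothetical sketch, not part of the EyeGestures API:

```python
# Hypothetical smoothing helper -- NOT part of the EyeGestures API.
# alpha near 1.0 follows the gaze quickly; alpha near 0.0 smooths more.
class CursorSmoother:
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.x = None
        self.y = None

    def update(self, x, y):
        if self.x is None:
            # First sample: no history yet, take the point as-is.
            self.x, self.y = x, y
        else:
            # Blend the new point with the running estimate.
            self.x = self.alpha * x + (1 - self.alpha) * self.x
            self.y = self.alpha * y + (1 - self.alpha) * self.y
        return self.x, self.y

# Usage inside the frame loop above:
# smoother = CursorSmoother()
# cursor_x, cursor_y = smoother.update(event.point[0], event.point[1])
```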

Customize:

You can customize your calibration points/map to fit your solution. Simply copy the snippet below and place your calibration points on the x,y plane in the range 0.0 to 1.0. The map is then automatically scaled to your display.

gestures = EyeGestures_v2()
gestures.uploadCalibrationMap([[0,0],[0,1],[1,0],[1,1]])
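
For a finer calibration map, a regular grid can be generated instead of typing the corner points by hand. This is a minimal sketch; `make_calibration_grid` is a hypothetical helper, while `uploadCalibrationMap` is the library call shown above:

```python
# Build an n x n grid of calibration points in the normalized 0.0-1.0 range.
# make_calibration_grid is a hypothetical helper, not part of EyeGestures.
def make_calibration_grid(n=3):
    step = 1 / (n - 1)
    return [[round(col * step, 3), round(row * step, 3)]
            for row in range(n) for col in range(n)]

grid = make_calibration_grid(3)  # 9 points: corners, edge midpoints, center

# gestures = EyeGestures_v2()
# gestures.uploadCalibrationMap(grid)
```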

Using EyeGesture Engine V1 - Model-Based Approach:

from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v1

# Initialize gesture engine with RoI parameters
gestures = EyeGestures_v1()

cap = VideoCapture(0)
ret, frame = cap.read()
calibrate = True
screen_width = 500
screen_height = 500

# Obtain estimations from camera frames
event, cevent = gestures.estimate(
    frame,
    "main",
    calibrate,  # set calibration - switch to False to stop calibration
    screen_width,
    screen_height,
    0, 0, 0.8, 10
)
cursor_x, cursor_y = event.point[0], event.point[1]
# cevent is the calibration event
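
In practice, `estimate` is called once per captured frame, with calibration left on for an initial stretch and then switched off (as the comment above notes). A minimal sketch of such a schedule; the frame count is an arbitrary assumption to tune for your setup:

```python
CALIBRATION_FRAMES = 60  # arbitrary assumption; tune for your setup

def is_calibrating(frame_idx, calibration_frames=CALIBRATION_FRAMES):
    # True while we are still collecting calibration samples.
    return frame_idx < calibration_frames

# In the capture loop:
# calibrate = is_calibrating(frame_idx)
# event, cevent = gestures.estimate(frame, "main", calibrate,
#                                   screen_width, screen_height, 0, 0, 0.8, 10)
```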

Feel free to copy and paste the relevant code snippets for your project.

🔥 Web Demos:

Rules of use

If you are building a publicly available product without a commercial license, please mention us somewhere in your interface.

📇 Find us:

Follow us on Polar (if you want to help, you can support us there!):

Subscribe on Polar

Troubleshooting:

  1. Some users report that mediapipe, scikit-learn, or opencv does not install together with eyeGestures. To fix this, install the missing package manually, e.g. `python3 -m pip install mediapipe scikit-learn opencv-python`.

📢 Announcements:

Posts on Polar

💻 Contributors

💵 Support the project

Subscription Tiers on Polar
