eyeGestures

Eye-tracking package enabling the development of gaze-controlled computer interfaces


Keywords
eye, eyetracking, gaze, gazetracking, tracking
License
Other
Install
pip install eyeGestures==2.3.0

Documentation

EYEGESTURES

EyeGestures is an open-source eye-tracking software library that uses native webcams and phone cameras. The aim of the library is to make eye tracking and eye-driven interfaces accessible without requiring expensive, specialized hardware.

Our Mission!


💜 Sponsors:

For enterprises that want to avoid GPL-3.0 licensing, a commercial license is available!

We offer custom integrations and managed services. For businesses requiring invoices, message us at contact@eyegestures.com.

Sponsor us and we can add your link, banner, or other promotional materials!

Subscribe on Polar

💻 Install

python3 -m pip install eyeGestures

⚙️ Run

python3 examples/simple_example.py
python3 examples/simple_example_v2.py

🪟 Run Windows App [not updated to the latest engine; may crash]

python3 apps/win_app.py

Or download it from releases

🔧 How to use [WiP - adding Engine V2]:

Using EyeGesture Engine V2 - Machine Learning Approach:

from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v2

# Initialize the gesture engine and video capture
gestures = EyeGestures_v2()
cap = VideoCapture(0)
calibrate = True
screen_width = 500
screen_height = 500

# Process each frame from the camera
ret, frame = cap.read()
event, cevent = gestures.step(frame, calibrate, screen_width, screen_height)

# event.point holds the estimated gaze position in screen coordinates
cursor_x, cursor_y = event.point[0], event.point[1]
# cevent is the calibration event; its radius is used for data collection during calibration
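Gaze estimates can land slightly outside the display, so before moving a cursor it helps to clamp the returned point to screen bounds. A minimal sketch (`clamp_to_screen` is a hypothetical helper for illustration, not part of the eyeGestures API):

```python
def clamp_to_screen(x, y, screen_width, screen_height):
    """Clamp a raw gaze estimate to valid pixel coordinates."""
    cx = max(0, min(screen_width - 1, int(round(x))))
    cy = max(0, min(screen_height - 1, int(round(y))))
    return cx, cy

# e.g. clamp the event point before positioning a cursor:
# cursor_x, cursor_y = clamp_to_screen(event.point[0], event.point[1], screen_width, screen_height)
```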

Using EyeGesture Engine V1 - Model-Based Approach:

from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v1

# Initialize the gesture engine
gestures = EyeGestures_v1()

cap = VideoCapture(0)
calibrate = True
screen_width = 500
screen_height = 500

# Obtain estimations from camera frames
ret, frame = cap.read()
event, cevent = gestures.estimate(
    frame,
    "main",     # tracking context identifier
    calibrate,  # set calibration - switch to False to stop calibration
    screen_width,
    screen_height,
    0, 0, 0.8, 10  # remaining positional parameters (see project documentation)
)
cursor_x, cursor_y = event.point[0], event.point[1]
# cevent is the calibration event
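The `calibrate` flag above is typically flipped to False once enough samples have been collected. One simple way to do that is a frame-budget gate; this is an illustrative sketch only (`CalibrationGate` is not part of the library):

```python
class CalibrationGate:
    """Return True while calibration frames remain in the budget."""

    def __init__(self, budget=25):
        self.budget = budget  # number of frames to spend on calibration
        self.seen = 0

    def step(self):
        """Call once per frame; True means keep calibrating."""
        self.seen += 1
        return self.seen <= self.budget
```

In a frame loop you would then pass `gate.step()` as the calibration argument instead of a hard-coded boolean.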

Feel free to copy and paste the relevant code snippets for your project.
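Raw gaze points jitter from frame to frame, so gaze-driven interfaces usually smooth them before driving a cursor. A sketch using exponential smoothing (`GazeSmoother` is a hypothetical helper, not an eyeGestures API):

```python
class GazeSmoother:
    """Exponentially weighted moving average over gaze points."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # higher alpha -> faster response, less smoothing
        self.point = None

    def update(self, x, y):
        """Blend the new gaze sample into the running estimate."""
        if self.point is None:
            self.point = (float(x), float(y))
        else:
            px, py = self.point
            self.point = (px + self.alpha * (x - px),
                          py + self.alpha * (y - py))
        return self.point
```

Feeding `event.point` through `update` each frame yields a steadier cursor at the cost of a small lag controlled by `alpha`.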

🔥 Web Demos:

Rules of use:

If you are building a publicly available product and do not have a commercial license, please mention us somewhere in your interface.

Promo Materials:

https://github.com/NativeSensors/EyeGestures/assets/40773550/4ca842b9-ba32-4ffd-b2e4-179ff67ee47f

https://github.com/NativeSensors/EyeGestures/assets/40773550/6a7c74b5-b069-4eec-bc96-3a6bb4159b37

📇 Find us:

Follow us on Polar (it costs nothing, but it helps the project!):

Subscribe on Polar

📢 Announcements:

Posts on Polar

💻 Contributors

💵 Support the project

Subscription Tiers on Polar

Star History Chart