FlightMatrix Bridge is a Python API for controlling Flight Matrix and fetching frames, sensor readings, and other data from it. The library enables efficient, real-time communication between processes and is aimed primarily at interfacing flight simulators, UAV systems, and other robotics platforms. It uses the `multiprocessing.shared_memory` module to share data such as frames, sensor data, and movement commands across multiple processes.
Download the software from Flight Matrix.
The project is structured around two primary files:
- `bridge.py`: Implements the core functionalities of shared memory management, handling frames, sensor data, movement commands, and logging.
- `utilities.py`: Provides utility functions to handle timestamps and convert them into human-readable formats. It also includes a `DroneController` class for controlling drones.
- Introduction
- Features
- Controls
- Installation
- Usage
- Detailed Functionality
- Examples
- Documentation
- Network Streaming
- Logging Configuration
- Credits
The FlightMatrixBridge system is designed to bridge multiple processes that need to access shared memory for real-time communication. The typical use cases include flight simulators, robotics platforms, autonomous vehicles, and any application where sharing large datasets like frames or sensor readings between processes is essential.
This package provides:
- An interface to retrieve frames and sensor data from shared memory.
- The ability to send movement commands to be processed by another service.
- Real-time noise application to sensor data.
- Utilities to handle timestamps.
The FlightMatrix software offers a range of features to facilitate real-time communication and data sharing between processes. Key features include:
- Dual Camera Support: Flight Matrix is equipped with two cameras—left and right—that operate simultaneously. Each camera is capable of outputting high-quality RGB images, depth passes (z-depth), and segmentation maps, providing a comprehensive view of your simulated environment.
- Independent Camera Control: Each camera can be controlled independently, allowing you to position them relative to the drone with precision. Adjust the x, y, z coordinates, as well as yaw, pitch, and roll to achieve the desired perspective.
- Variable Speed Control: Control the speed of each axis and the rotation speed of the cameras, ensuring you can fine-tune the responsiveness to suit your simulation needs.
- Customizable Output: Turn on and off various output maps as required. Control the resolution of the output frames and adjust the Field of View (FOV) to enhance your visual experience.
- Graphics Presets: Choose from various graphics presets tailored for different simulation scenarios. Optimize the software’s performance based on your hardware capabilities and desired visual fidelity.
- Diverse Simulation Environments: Flight Matrix features a range of realistic maps, including architectural, natural, and ultra-realistic environments for authentic simulations. Navigate through intricate landscapes and urban settings as if you were flying in the real world.
- Human-like AI Characters: Enhance your simulations with beautifully animated AI characters that simulate real crowds and human interactions. Observe how they behave and interact within the environment, adding depth to your scenarios.
The FlightMatrixBridge API provides a simple and efficient way to interact with the Flight Matrix simulation software, enabling you to access frames, sensor data, and movement commands in real-time. The API is designed to be easy to use and flexible, allowing you to integrate it into your projects seamlessly.
- Frame Management: Retrieve left/right frames, z-depth maps, and segmentation frames in real-time.
- Sensor Data Access: Retrieve real-time sensor data such as location, orientation, velocity, acceleration, magnetometer readings, and more.
- Movement Command Handling: Send movement commands (position and orientation) for external systems to process.
- Noise Simulation: Add configurable levels of noise to sensor data for testing robustness.
- Flexible Resolution Handling: Easily set and adjust resolution for frames.
- Timestamp Management: Convert timestamps into human-readable formats and handle system-wide timing data.
| Action | Keyboard/Mouse |
|---|---|
| Move Forward | W |
| Move Backward | S |
| Move Left | A |
| Move Right | D |
| Ascend | Space Bar |
| Descend | Left Shift |
| Rotate (Yaw, Pitch, Roll) | Arrow Keys |
| Move Left/Right | Q / E |
| Pause | Escape / Pause / P |
| Exit | Escape / Pause / P |
| Spawn Human AI Character | H |
| Return to Starting Location | R |
Download the software from Flight Matrix.
To install the FlightMatrixBridge (API), simply use pip:
pip install flightmatrixbridge
Make sure your system has Python 3.8+ and supports the `multiprocessing.shared_memory` module.
To initialize and start using the bridge, create an instance of the `FlightMatrixBridge` class and specify the resolution of the frames you want to handle:
from flightmatrix.bridge import FlightMatrixBridge
bridge = FlightMatrixBridge(resolution=(1226, 370), noise_level=0.01, apply_noise=False) # Set frame resolution (width, height), noise level, and noise application
The system uses the following units for sensor data:
- Length: centimeters (cm)
- Angular values: degrees (°)
- Angular velocity / gyroscope readings: degrees per second (°/s)
- Acceleration / accelerometer readings: centimeters per second squared (cm/s²)
- Magnetometer readings: unit vector
- LiDAR data: centimeters (cm)
- Collision detection: centimeters (cm)
- Timestamp: milliseconds (ms)
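If your downstream tooling expects SI units, a small conversion helper can bridge the gap. The sketch below is illustrative and not part of the library; it assumes the sensor data dictionary keys (`location`, `orientation`, `gyroscope`, `accelerometer`) shown in the examples later in this README.

```python
import math

CM_TO_M = 0.01

def to_si(sensor_data: dict) -> dict:
    """Convert FlightMatrix readings (cm, degrees) to SI units (m, radians)."""
    return {
        "location_m": [v * CM_TO_M for v in sensor_data["location"]],
        "orientation_rad": [math.radians(v) for v in sensor_data["orientation"]],
        "gyroscope_rad_s": [math.radians(v) for v in sensor_data["gyroscope"]],
        "accelerometer_m_s2": [v * CM_TO_M for v in sensor_data["accelerometer"]],
    }
```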
The axis system differs slightly between the software interface and the API. Below is a detailed explanation for both.
When adjusting camera positions or configuring movement multipliers within the software, the following axis system is used:
| Direction | Axis |
|---|---|
| Forward | Y |
| Backward | -Y |
| Left | -X |
| Right | X |
| Up | Z |
| Down | -Z |
Rotation values are in degrees and are labeled roll, pitch, and yaw:
| Rotation | Axis |
|---|---|
| Roll | X |
| Pitch | Y |
| Yaw | Z |
Axis Orientation:

```
      Z (Up)
      |
      |
      |
      O------ X (Right)
     /
    /
  -Y (Backward)
```
The API uses a different axis system for movement commands and sensor data:
| Direction | Axis |
|---|---|
| Forward | X |
| Backward | -X |
| Left | -Y |
| Right | Y |
| Up | Z |
| Down | -Z |
Rotation values are in degrees and are labeled roll, pitch, and yaw:
| Rotation | Axis |
|---|---|
| Roll | X |
| Pitch | Y |
| Yaw | Z |
Axis Orientation:

```
      Z (Up)
      |
      |
      |
      O------ Y (Right)
     /
    /
  -X (Backward)
```
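For example, since the API treats forward as +X and up as +Z, a command that moves forward while climbing can be written as follows (a minimal illustration using the `send_movement_command` call documented later):

```python
# Forward is +X and up is +Z in the API axis system
bridge.send_movement_command(1.0, 0.0, 0.5, 0.0, 0.0, 0.0)  # (x, y, z, roll, pitch, yaw)
```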
You can configure logging based on your needs. The logging system provides flexibility to output logs either to the console or to a file, and supports different log levels (`DEBUG`, `INFO`, `WARNING`, `ERROR`, `SUCCESS`).
# Set log level to 'DEBUG'
bridge.set_log_level('DEBUG')
# Enable logging to file
bridge.set_write_to_file(True)
The core functionalities include retrieving frames, fetching sensor data, and sending movement commands.
You can retrieve frames from both the left and right cameras. You also have access to depth and segmentation data.
# Retrieve right camera frame
right_frame = bridge.get_right_frame()
# Retrieve left camera frame
left_frame = bridge.get_left_frame()
# Retrieve z-depth for the right camera
right_zdepth = bridge.get_right_zdepth()
# Retrieve segmentation frame for the left camera
left_seg = bridge.get_left_seg()
The bridge allows real-time access to sensor data from the shared memory block. This data includes location, orientation, velocity, acceleration, and more.
sensor_data = bridge.get_sensor_data()
print(sensor_data)
To send movement commands (position and orientation) to a system, use the `send_movement_command` method.
# Send movement command (x, y, z, roll, pitch, yaw)
bridge.send_movement_command(1.0, 2.0, 3.0, 0.1, 0.2, 0.3)
You can adjust the frame resolution dynamically and control noise levels applied to sensor data.
# Set a new resolution for frames
bridge.set_resolution(1280, 720)
# Set noise level for sensor data
bridge.set_noise_level(0.05)
# Enable or disable noise application
bridge.set_apply_noise(True)
The `utilities.py` file provides functions to convert timestamps from milliseconds into human-readable formats and into `datetime` objects.
from flightmatrix.utilities import timestamp2string, timestamp2datetime
# Convert timestamp to string
timestamp_string = timestamp2string(1633029600000)
print(timestamp_string) # Output: '2021-10-01 00:00:00:000'
# Convert timestamp to datetime object
timestamp_dt = timestamp2datetime(1633029600000)
print(timestamp_dt) # Output: datetime object in UTC
Upon initialization, the `FlightMatrixBridge` class sets up shared memory blocks for frames, sensor data, and movement commands. It also configures the resolution and frame shapes.
The shared memory blocks are initialized using `multiprocessing.shared_memory.SharedMemory`, providing fast, low-latency access to the data. Each memory block corresponds to a specific data type, such as frames, sensor readings, or movement commands.
The memory block names and their associated data are defined in the `memory_names` dictionary within the `FlightMatrixBridge` class:

- `right_frame`: Stores the right camera frame.
- `left_frame`: Stores the left camera frame.
- `right_zdepth`: Z-depth map for the right camera.
- `left_zdepth`: Z-depth map for the left camera.
- `right_seg`: Segmentation data for the right camera.
- `left_seg`: Segmentation data for the left camera.
- `sensor_data`: Sensor data shared memory.
- `movement_command`: Memory block for sending movement commands.
Frames can be retrieved from shared memory using the `_get_frame` method. The frames are stored as NumPy arrays and can be either 1-channel (grayscale) or 3-channel (RGB).
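For intuition, attaching to one of these blocks and viewing it as a NumPy array looks roughly like the sketch below. The block name, dtype, and shape are illustrative assumptions; the bridge manages the real layout internally.

```python
import numpy as np
from multiprocessing import shared_memory

HEIGHT, WIDTH, CHANNELS = 370, 1226, 3  # assumed frame shape

# Attach to an already-created block (the producer owns and unlinks it)
shm = shared_memory.SharedMemory(name="right_frame")  # assumed block name

# Interpret the raw buffer as an image array without copying
frame = np.ndarray((HEIGHT, WIDTH, CHANNELS), dtype=np.uint8, buffer=shm.buf)

frame_copy = frame.copy()  # copy out if the data must outlive the handle
shm.close()                # detach without destroying the block
```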
The `get_sensor_data` method retrieves sensor readings from the shared memory. The sensor data includes:

- Location `(x, y, z)` in centimeters
- Orientation `(roll, pitch, yaw)` in degrees
- Gyroscope readings `(x, y, z)` in degrees per second
- Accelerometer readings `(x, y, z)` in cm/s²
- Magnetometer readings `(x, y, z)` as a unit vector
- LiDAR data `(LiDARForward, LiDARBackward, LiDARLeft, LiDARRight, LiDARBottom)`, i.e. `(Y, -Y, -X, X, -Z)`, in centimeters
- Collision detection status `(True/False, LocationX, LocationY, LocationZ)` in centimeters
- Timestamp in milliseconds
Movement commands are written to shared memory using `send_movement_command`. These commands include the position and orientation of the system and are stored as six floating-point values.
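A minimal sketch of what writing such a command into a shared block could look like, assuming six float32 values followed by an availability flag (the block name and layout here are illustrative, not the library's exact format):

```python
import numpy as np
from multiprocessing import shared_memory

shm = shared_memory.SharedMemory(name="movement_command")  # assumed block name
buf = np.ndarray((7,), dtype=np.float32, buffer=shm.buf)   # 6 commands + 1 flag

def write_command(x, y, z, roll, pitch, yaw):
    buf[:6] = (x, y, z, roll, pitch, yaw)  # position and orientation
    buf[6] = 1.0                           # mark a new command as available

write_command(1.0, 0.0, 0.0, 0.0, 0.0, 0.0)
shm.close()
```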
The logging system is highly configurable and provides essential feedback about the system's operations. You can adjust the verbosity of the logs and decide whether to write them to a file.
To simulate real-world noise in sensor data, noise can be added using a Gaussian distribution. This feature is optional and can be enabled or disabled dynamically.
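Conceptually, the noise step boils down to the following sketch, assuming the configured noise level is used as the standard deviation of zero-mean Gaussian noise (a simplification, not the library's exact code):

```python
import numpy as np

def add_gaussian_noise(data: np.ndarray, noise_level: float = 0.01) -> np.ndarray:
    """Return a copy of `data` with zero-mean Gaussian noise added."""
    noise = np.random.normal(loc=0.0, scale=noise_level, size=data.shape)
    return data + noise

noisy_gyro = add_gaussian_noise(np.array([0.5, -0.2, 1.3]), noise_level=0.05)
```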
```python
import cv2
from flightmatrix.bridge import FlightMatrixBridge
from flightmatrix.utilities import timestamp2string
import ultraprint.common as p

# Initialize the FlightMatrixBridge
bridge = FlightMatrixBridge()

# Start a loop to continuously fetch and display frames
while True:
    # Fetch the left and right frames
    left_frame_data = bridge.get_left_frame()
    right_frame_data = bridge.get_right_frame()

    # Fetch the z-depth frames for both left and right
    left_zdepth_data = bridge.get_left_zdepth()
    right_zdepth_data = bridge.get_right_zdepth()

    # Retrieve the actual frame arrays and timestamps
    left_frame = left_frame_data['frame']
    right_frame = right_frame_data['frame']
    left_zdepth = left_zdepth_data['frame']
    right_zdepth = right_zdepth_data['frame']
    left_timestamp = left_frame_data['timestamp']
    right_timestamp = right_frame_data['timestamp']

    # Convert timestamps to human-readable format
    left_timestamp = timestamp2string(left_timestamp)
    right_timestamp = timestamp2string(right_timestamp)

    # Display the frames in OpenCV windows
    cv2.imshow("Left Frame", left_frame)
    cv2.imshow("Right Frame", right_frame)
    cv2.imshow("Left Z-Depth", left_zdepth)
    cv2.imshow("Right Z-Depth", right_zdepth)

    # Print timestamps for each frame (optional)
    p.purple(f"Left Frame Timestamp: {left_timestamp}")
    p.purple(f"Right Frame Timestamp: {right_timestamp}")

    # Print timestamps for z-depth frames (optional)
    p.lgray(f"Left Z-Depth Timestamp: {left_timestamp}")
    p.lgray(f"Right Z-Depth Timestamp: {right_timestamp}")

    # Break the loop when 'q' is pressed
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# Release OpenCV windows
cv2.destroyAllWindows()
```
from flightmatrix.bridge import FlightMatrixBridge
# Initialize the bridge
bridge = FlightMatrixBridge()
# Send a movement command (x, y, z, roll, pitch, yaw)
bridge.send_movement_command(0.5, 1.0, 0.8, 0.0, 0.1, 0.2)
In order to reset/stop the movement, you can send a command with all zeros:
bridge.send_movement_command(0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
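For example, to move for a fixed duration and then stop, pair two commands with a delay (this uses only the documented `send_movement_command` call):

```python
import time
from flightmatrix.bridge import FlightMatrixBridge

bridge = FlightMatrixBridge()

# Move forward (+X in the API axis system) for two seconds, then stop
bridge.send_movement_command(1.0, 0.0, 0.0, 0.0, 0.0, 0.0)
time.sleep(2)
bridge.send_movement_command(0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
```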
```python
from flightmatrix.bridge import FlightMatrixBridge

# Initialize the bridge with frame resolution (width, height), noise level, and noise application
bridge = FlightMatrixBridge(resolution=(1226, 370), noise_level=0.01, apply_noise=False)

# Fetch sensor data
sensor_data = bridge.get_sensor_data()

# Check for errors
if sensor_data.get('error'):
    print("Error fetching sensor data:", sensor_data['error'])
else:
    # Extract sensor readings
    location = sensor_data['location']
    orientation = sensor_data['orientation']
    gyroscope = sensor_data['gyroscope']
    accelerometer = sensor_data['accelerometer']
    magnetometer = sensor_data['magnetometer']
    lidar = sensor_data['lidar']
    collision = sensor_data['collision']
    timestamp = sensor_data['timestamp']

    # Display sensor data in a readable format
    print("Sensor Data:")
    print("-----------------------")
    print(f"Timestamp: {timestamp} ms")
    print(f"Location (cm): X={location[0]:.2f}, Y={location[1]:.2f}, Z={location[2]:.2f}")
    print(f"Orientation (degrees): Roll={orientation[0]:.2f}, Pitch={orientation[1]:.2f}, Yaw={orientation[2]:.2f}")
    print(f"Gyroscope (deg/s): X={gyroscope[0]:.2f}, Y={gyroscope[1]:.2f}, Z={gyroscope[2]:.2f}")
    print(f"Accelerometer (cm/s²): X={accelerometer[0]:.2f}, Y={accelerometer[1]:.2f}, Z={accelerometer[2]:.2f}")
    print(f"Magnetometer (unit vector): X={magnetometer[0]:.2f}, Y={magnetometer[1]:.2f}, Z={magnetometer[2]:.2f}")
    print(f"LiDAR Data (cm): Forward={lidar[0]:.2f}, Backward={lidar[1]:.2f}, Left={lidar[2]:.2f}, Right={lidar[3]:.2f}, Bottom={lidar[4]:.2f}")
    print(f"Collision Detection: Status={collision[0]}, Location (cm): X={collision[1]:.2f}, Y={collision[2]:.2f}, Z={collision[3]:.2f}")
```
from flightmatrix.bridge import FlightMatrixBridge
from flightmatrix.utilities import DroneController
# Example Usage
bridge = FlightMatrixBridge()
drone = DroneController(bridge)
# Move forward by 1.0 (positive y-axis)
drone.move_forward(1.0)
# Ascend by 0.5 (positive z-axis)
drone.ascend(0.5)
# Rotate in yaw by 0.3
drone.rotate_yaw(0.3)
# Stop only rotation (keep movement intact)
drone.stop_rotation()
# Stop all movement and rotation
drone.stop()
# Hover in place and rotate at 0.5 speed for 5 seconds
drone.hover_and_rotate(0.5, 5)
```python
from flightmatrix.bridge import FlightMatrixBridge
from flightmatrix.utilities import DataRecorder
import time

# Example usage (Record data each second for 120 seconds)
if __name__ == "__main__":
    bridge = FlightMatrixBridge()
    recorder = DataRecorder(bridge, base_dir="Sample_Recordings",
                            record_left_frame=True,
                            record_right_frame=True,
                            record_left_zdepth=True,
                            record_right_zdepth=True,
                            record_left_seg=True,
                            record_right_seg=True,
                            record_sensor_data=True,
                            record_sensor_data_interval=1)

    recorder.start_recording()
    time.sleep(120)  # Record for 120 seconds
    recorder.stop_recording()
```
This class interfaces with the Flight Matrix system using shared memory for inter-process communication. It manages frames, timestamps, and movement commands, enabling seamless data sharing between processes.
- `width (int)`: The width of the frame, initialized by the resolution provided.
- `height (int)`: The height of the frame, initialized by the resolution provided.
- `frame_shape (tuple)`: Tuple representing the shape of the frame as `(height, width)`.
- `frame_shape_3ch (tuple)`: Tuple representing the shape of the frame with 3 channels as `(height, width, 3)`.
- `noise_level (float)`: Specifies the level of noise to be applied. Defaults to `0.01`.
- `apply_noise (bool)`: Boolean flag that determines whether noise should be applied. Defaults to `False`.
- `memory_names (dict)`: Dictionary mapping keys to shared memory block names. Used for storing frame, depth, segmentation, and movement command data.
- `log (Logger)`: A logger instance used for logging events and debugging messages.
- `shm (dict)`: Dictionary storing the shared memory objects for frame data.
- `shm_timestamps (dict)`: Dictionary storing the shared memory objects for timestamps.
- `num_floats (int)`: Number of float values stored in shared memory for movement commands. Defaults to `6`. Do not edit this value.

Description:
Initializes the `FlightMatrixBridge` class by setting up shared memory, logging, and configuring noise settings.
Args:

- `resolution (tuple, optional)`: A tuple specifying the frame's width and height. Defaults to `(1226, 370)`.
- `noise_level (float, optional)`: Specifies the level of noise to be applied to sensor data. Defaults to `0.01`.
- `apply_noise (bool, optional)`: Boolean flag that determines whether noise should be applied to sensor data. Defaults to `False`.
Example:
bridge = FlightMatrixBridge(resolution=(800, 600), noise_level=0.05, apply_noise=True)
Description:
Sets the logging level for the logger instance to control the verbosity of log output.
Args:
- `log_level (str)`: Desired log level (`'DEBUG'`, `'INFO'`, `'WARNING'`, `'ERROR'`). Default is `'INFO'`.
Returns:
None.
Example:
bridge.set_log_level('DEBUG')
Description:
Sets whether the logging should be written to a file or not.
Args:
- `write_to_file (bool)`: If `True`, log messages will be written to a file; otherwise, they won't.
Returns:
None.
Example:
bridge.set_write_to_file(True)
Description:
Initializes shared memory blocks for frames and timestamps based on the keys stored in `memory_names`. If the shared memory block for a specific key is not available, a warning will be logged.
Raises:

- `FileNotFoundError`: If the shared memory block for a key does not exist.
Returns:
None.
Example:
bridge._initialize_shared_memory()
Description:
Sets up shared memory for movement commands (`x, y, z, roll, pitch, yaw`) and an availability flag. If the shared memory block exists, it will attach to it; otherwise, it will create a new block.
Raises:

- `FileExistsError`: If the shared memory block already exists when trying to create it.
Returns:
None.
Example:
bridge._initialize_movement_command_memory()
Description:
Retrieves a frame from shared memory. Handles both 3-channel and single-channel frame retrieval.
Args:
- `key (str)`: Key identifying the shared memory segment.
- `channels (int, optional)`: Number of channels in the frame. Default is `3`.

Returns:

- `dict`: A dictionary with:
  - `'frame' (np.ndarray or None)`: The retrieved frame, or `None` if an error occurred.
  - `'timestamp' (any or None)`: The timestamp associated with the frame, or `None` if an error occurred.
  - `'error' (str or None)`: Error message, if any.

Raises:

- `Warning`: If shared memory is not available or if there is a resolution mismatch.
Example:
frame_data = bridge._get_frame('right_frame', channels=3)
Description:
Retrieves the timestamp associated with the frame stored in shared memory.
Args:
- `key (str)`: Key identifying the shared memory segment for the timestamp.

Returns:

- `int or None`: The timestamp as an integer, or `None` if not available.
Example:
timestamp = bridge._get_timestamp('right_frame')
Description:
Adds Gaussian noise to the given data based on the configured noise level.
Args:
- `data (np.ndarray)`: The data (typically a frame) to which noise will be added.

Returns:

- `np.ndarray`: The noisy data.
Example:
noisy_frame = bridge.add_noise(frame_data)
Description:
Retrieves sensor data from shared memory and returns it as a dictionary.
If the sensor data is not available in shared memory, a warning is logged,
and a dictionary with all sensor fields set to None and an error message is returned.
The sensor data includes:
- location: 3 floats representing the location coordinates.
- orientation: 3 floats representing the orientation.
- gyroscope: 3 floats representing the gyroscope readings.
- accelerometer: 3 floats representing the accelerometer readings.
- magnetometer: 3 floats representing the magnetometer readings.
- lidar: 5 floats representing the lidar readings.
- collision: 4 floats representing the collision data.
- timestamp: The timestamp of the sensor data.
If noise application is enabled, noise is added to the gyroscope, accelerometer,
magnetometer, and lidar data.
Returns:
- `dict`: A dictionary containing the sensor data, or an error message if the data is not available.
Example:
sensor_data = bridge.get_sensor_data()
Description:
Sends movement command values (`x, y, z, roll, pitch, yaw`) to the shared memory block.
Args:

- `x (float)`: Movement in the X-axis.
- `y (float)`: Movement in the Y-axis.
- `z (float)`: Movement in the Z-axis.
- `roll (float)`: Roll rotation.
- `pitch (float)`: Pitch rotation.
- `yaw (float)`: Yaw rotation.
Returns:
None.
Example:
bridge.send_movement_command(1.0, 0.5, -1.0, 0.2, 0.1, -0.3)
Description:
Writes the movement commands to shared memory.
Args:
- `commands (list of float)`: List of movement command values (`[x, y, z, roll, pitch, yaw]`).
Returns:
None.
Example:
bridge._write_movement_command([1.0, 0.5, -1.0, 0.2, 0.1, -0.3])
Description:
Sets the resolution of the frames by updating the `width` and `height` attributes and recalculating the frame shapes.
Args:

- `width (int)`: Width of the frames.
- `height (int)`: Height of the frames.
Returns:
None.
Example:
bridge.set_resolution(800, 600)
Description:
Sets the noise level for the frames.
Args:
- `noise_level (float)`: The level of noise to apply.
Returns:
None.
Example:
bridge.set_noise_level(0.05)
Description:
Sets whether noise should be applied to frames.
Args:
- `apply_noise (bool)`: Whether to apply noise (`True` or `False`).
Returns:
None.
Example:
bridge.set_apply_noise(True)
Description:
Retrieves the right frame from shared memory.
Returns:
- `dict`: A dictionary with:
  - `'frame' (np.ndarray or None)`: The retrieved right frame, or `None` if an error occurred.
  - `'timestamp' (int or None)`: The timestamp associated with the right frame, or `None` if an error occurred.
  - `'error' (str or None)`: Error message, if any.
Example:
right_frame_data = bridge.get_right_frame()
Description:
Retrieves the left frame from shared memory.
Returns:
- `dict`: A dictionary with:
  - `'frame' (np.ndarray or None)`: The retrieved left frame, or `None` if an error occurred.
  - `'timestamp' (int or None)`: The timestamp associated with the left frame, or `None` if an error occurred.
  - `'error' (str or None)`: Error message, if any.
Example:
left_frame_data = bridge.get_left_frame()
Description:
Retrieves the right depth frame from shared memory.
Returns:
- `dict`: A dictionary with:
  - `'frame' (np.ndarray or None)`: The retrieved right depth frame, or `None` if an error occurred.
  - `'timestamp' (int or None)`: The timestamp associated with the right depth frame, or `None` if an error occurred.
  - `'error' (str or None)`: Error message, if any.
Example:
right_zdepth_data = bridge.get_right_zdepth()
Description:
Retrieves the left depth frame from shared memory.
Returns:
- `dict`: A dictionary with:
  - `'frame' (np.ndarray or None)`: The retrieved left depth frame, or `None` if an error occurred.
  - `'timestamp' (int or None)`: The timestamp associated with the left depth frame, or `None` if an error occurred.
  - `'error' (str or None)`: Error message, if any.
Example:
left_zdepth_data = bridge.get_left_zdepth()
Description:
Retrieves the right segmentation frame from shared memory.
Returns:
- `dict`: A dictionary with:
  - `'frame' (np.ndarray or None)`: The retrieved right segmentation frame, or `None` if an error occurred.
  - `'timestamp' (int or None)`: The timestamp associated with the right segmentation frame, or `None` if an error occurred.
  - `'error' (str or None)`: Error message, if any.
Example:
right_segmentation_data = bridge.get_right_seg()
Description:
Retrieves the left segmentation frame from shared memory.
Returns:
- `dict`: A dictionary with:
  - `'frame' (np.ndarray or None)`: The retrieved left segmentation frame, or `None` if an error occurred.
  - `'timestamp' (int or None)`: The timestamp associated with the left segmentation frame, or `None` if an error occurred.
  - `'error' (str or None)`: Error message, if any.
Example:
left_segmentation_data = bridge.get_left_seg()
Description:
Converts a timestamp in milliseconds to a human-readable string format.
Args:
- `timestamp (int)`: The timestamp in milliseconds.

Returns:

- `str`: Formatted timestamp string in the format `'YYYY-MM-DD HH:MM:SS:fff'`.
Example:
formatted_time = timestamp2string(1609459200000)
# Output: '2021-01-01 00:00:00:000'
Description:
Converts a timestamp in milliseconds to a `datetime` object in UTC.
Args:

- `timestamp (int)`: The timestamp in milliseconds.

Returns:

- `datetime`: The corresponding `datetime` object in UTC.
Example:
datetime_obj = timestamp2datetime(1609459200000)
# Output: datetime(2021, 1, 1, 0, 0, 0, tzinfo=timezone.utc)
Description:
Converts Cartesian coordinates to GPS coordinates (latitude, longitude, altitude).
Args:
- `x (float)`: X coordinate in centimeters.
- `y (float)`: Y coordinate in centimeters.
- `z (float)`: Z coordinate in centimeters.
- `origin_lat (float, optional)`: Latitude of the origin point in degrees. Defaults to `22.583047`.
- `origin_lon (float, optional)`: Longitude of the origin point in degrees. Defaults to `88.45859783333334`.
- `origin_alt (float, optional)`: Altitude of the origin point in meters. Defaults to `0`.
- `add_noise (bool, optional)`: Whether to add noise to the GPS coordinates. Defaults to `False`.
- `lat_long_noise_amt (float, optional)`: Amount of noise to add to latitude and longitude. Defaults to `0.0001`.
- `alt_noise_amt (float, optional)`: Amount of noise to add to altitude. Defaults to `0.1`.
- `earth_radius (float, optional)`: Radius of the Earth in meters. Defaults to `6378137`.

Returns:

- `tuple`: A tuple containing the latitude and longitude in degrees and the altitude in meters.
Example:
latitude, longitude, altitude = cartesian_to_gps(1000, 2000, 300)
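The conversion is essentially a local flat-Earth approximation around the origin point. The sketch below shows the general idea; the axis-to-north/east mapping and the exact formula are assumptions for illustration, not necessarily the library's implementation:

```python
import math

def cartesian_to_gps_sketch(x_cm, y_cm, z_cm,
                            origin_lat=22.583047,
                            origin_lon=88.45859783333334,
                            origin_alt=0.0,
                            earth_radius=6378137.0):
    # Centimeters to meters
    x_m, y_m, z_m = x_cm / 100.0, y_cm / 100.0, z_cm / 100.0

    # Small-offset approximation: meters of offset converted to degrees
    d_lat = math.degrees(y_m / earth_radius)                                          # assumes +y is north
    d_lon = math.degrees(x_m / (earth_radius * math.cos(math.radians(origin_lat))))   # assumes +x is east

    return origin_lat + d_lat, origin_lon + d_lon, origin_alt + z_m
```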
This class provides an interface to control the drone's movements by sending commands to the flight matrix system. It allows the drone to move along the x, y, and z axes and rotate around the roll, pitch, and yaw axes.
- `bridge (FlightMatrixBridge)`: The bridge object used to communicate with the drone.
- `current_x (float)`: Current x-coordinate position, initialized to `0.0`.
- `current_y (float)`: Current y-coordinate position, initialized to `0.0`.
- `current_z (float)`: Current z-coordinate position, initialized to `0.0`.
- `current_roll (float)`: Current roll angle, initialized to `0.0`.
- `current_pitch (float)`: Current pitch angle, initialized to `0.0`.
- `current_yaw (float)`: Current yaw angle, initialized to `0.0`.
Description:
Initializes the `DroneController` class by linking it to a `FlightMatrixBridge` object and setting the initial drone movement parameters to zero.
Args:

- `bridge_object (FlightMatrixBridge)`: An instance of `FlightMatrixBridge` for communication with the flight matrix system.
Example:
bridge = FlightMatrixBridge()
drone_controller = DroneController(bridge)
Description:
Sends the current positional and rotational state (x, y, z, roll, pitch, yaw) as movement commands to the drone.
Returns:
None
Example:
drone_controller._send_command()
Description:
Moves the drone to a specified x-coordinate.
Args:
- `value (float)`: The x-coordinate to move to.
Returns:
None
Example:
drone_controller.move_x(10.5) # Move drone to x = 10.5
Description:
Moves the drone to a specified y-coordinate (left or right).
Args:
- `value (float)`: The y-coordinate to move to.
Returns:
None
Example:
drone_controller.move_y(-5.2) # Move drone to y = -5.2
Description:
Moves the drone to a specified z-coordinate (up or down).
Args:
- `value (float)`: The z-coordinate to move to.
Returns:
None
Example:
drone_controller.move_z(15.0) # Move drone up to z = 15.0
Description:
Rotates the drone to a specified roll angle.
Args:
- `value (float)`: The roll angle to rotate to.
Returns:
None
Example:
drone_controller.rotate_roll(30.0) # Rotate drone to a roll angle of 30 degrees
Description:
Rotates the drone to a specified pitch angle.
Args:
- `value (float)`: The pitch angle to rotate to.
Returns:
None
Example:
drone_controller.rotate_pitch(-15.0) # Rotate drone to a pitch angle of -15 degrees
Description:
Rotates the drone to a specified yaw angle.
Args:
- `value (float)`: The yaw angle to rotate to, in degrees.
Returns:
None
Example:
drone_controller.rotate_yaw(90.0) # Rotate drone to a yaw angle of 90 degrees
Description:
Ascends the drone by a specified value, increasing the current altitude.
Args:
- `value (float)`: The amount to increase the altitude.
Returns:
None
Example:
drone_controller.ascend(5.0) # Ascend drone by 5 units
Description:
Descends the drone by a specified value, decreasing the current altitude.
Args:
- `value (float)`: The amount to decrease the altitude.
Returns:
None
Example:
drone_controller.descend(3.0) # Descend drone by 3 units
Description:
Moves the drone forward by a specified value (positive y-axis).
Args:
- `value (float)`: The amount to move forward.
Returns:
None
Example:
drone_controller.move_forward(10.0) # Move drone forward by 10 units
Description:
Moves the drone backward by a specified value (negative y-axis).
Args:
- `value (float)`: The amount to move backward.
Returns:
None
Example:
drone_controller.move_backward(8.0) # Move drone backward by 8 units
Description:
Stops all drone movements on the x, y, and z axes.
Returns:
None
Example:
drone_controller.stop_movement() # Stop all drone movements
The `DataRecorder` class is designed to record various types of data from a drone or robotic system using the FlightMatrix framework. It can capture visual frames, z-depth images, segmentation frames, and sensor data, all of which are stored in a structured manner for later analysis.

- `bridge (FlightMatrixBridge)`: The bridge object used to interface with the drone or robot's systems.
- `base_dir (str)`: The base directory where all recorded data will be stored.
- `record_left_frame (bool)`: Flag indicating whether to record the left visual frame (default: `False`).
- `record_right_frame (bool)`: Flag indicating whether to record the right visual frame (default: `False`).
- `record_left_zdepth (bool)`: Flag indicating whether to record the left z-depth frame (default: `False`).
- `record_right_zdepth (bool)`: Flag indicating whether to record the right z-depth frame (default: `False`).
- `record_left_seg (bool)`: Flag indicating whether to record the left segmentation frame (default: `False`).
- `record_right_seg (bool)`: Flag indicating whether to record the right segmentation frame (default: `False`).
- `record_sensor_data (bool)`: Flag indicating whether to record sensor data (default: `False`).
- `sensor_data_interval (float)`: The interval at which sensor data is recorded (default: `0.1` seconds).
- `threads (list)`: List holding the thread objects used for recording data.
- `stop_event (Event)`: Event used to signal the threads to stop.
`__init__(self, bridge: FlightMatrixBridge, base_dir: str, record_left_frame: bool = False, record_right_frame: bool = False, record_left_zdepth: bool = False, record_right_zdepth: bool = False, record_left_seg: bool = False, record_right_seg: bool = False, record_sensor_data: bool = False, record_sensor_data_interval: float = 0.1)`
Description:
Initializes the `DataRecorder` class with the specified recording options. It sets up directories for storing the recorded data based on user selections.
Args:

- `bridge (FlightMatrixBridge)`: An instance of `FlightMatrixBridge` used to interact with the drone/robot.
- `base_dir (str)`: The directory to store recorded files.
- `record_left_frame (bool)`: If `True`, records the left visual frame.
- `record_right_frame (bool)`: If `True`, records the right visual frame.
- `record_left_zdepth (bool)`: If `True`, records the left z-depth frame.
- `record_right_zdepth (bool)`: If `True`, records the right z-depth frame.
- `record_left_seg (bool)`: If `True`, records the left segmentation frame.
- `record_right_seg (bool)`: If `True`, records the right segmentation frame.
- `record_sensor_data (bool)`: If `True`, records sensor data.
- `record_sensor_data_interval (float)`: Time interval for recording sensor data, in seconds.
Description:
Continuously captures and saves visual frames, z-depth frames, and segmentation frames until the recording is stopped. Each frame is saved with a timestamped filename.
Description:
Records sensor data at specified intervals, saving the readings to a CSV file. It checks for errors in the sensor data and handles them appropriately.
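To make the interval-based recording concrete, here is a simplified sketch of an interval logger thread; the real `DataRecorder` manages its own directory layout, CSV schema, and threads, so treat the names here as illustrative:

```python
import csv
import threading
import time

def sensor_logger(bridge, csv_path, interval, stop_event):
    """Poll bridge.get_sensor_data() on an interval and append rows to a CSV file."""
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "location", "orientation"])
        while not stop_event.is_set():
            data = bridge.get_sensor_data()
            if not data.get("error"):
                writer.writerow([data["timestamp"], data["location"], data["orientation"]])
            time.sleep(interval)

# stop_event = threading.Event()
# threading.Thread(target=sensor_logger,
#                  args=(bridge, "sensor_data.csv", 0.1, stop_event),
#                  daemon=True).start()
```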
Description:
Starts the recording process by launching separate threads for recording frames and sensor data, based on the user’s selections.
Description:
Stops the recording process by signaling the recording threads to stop and waiting for them to finish.
```python
from flightmatrix.bridge import FlightMatrixBridge
from flightmatrix.utilities import DataRecorder
import time

if __name__ == "__main__":
    bridge = FlightMatrixBridge()
    recorder = DataRecorder(bridge, base_dir="Sample_Recordings",
                            record_left_frame=True,
                            record_right_frame=True,
                            record_left_zdepth=True,
                            record_right_zdepth=True,
                            record_left_seg=True,
                            record_right_seg=True,
                            record_sensor_data=True,
                            record_sensor_data_interval=1)

    recorder.start_recording()
    time.sleep(10)  # Record for 10 seconds
    recorder.stop_recording()
```
Description: Checks if the recording process is currently active.
Returns:
- `bool`: `True` if recording is active, `False` otherwise.
The `DataStreamer` class provides a convenient way to stream data from the Flight Matrix system using callbacks. It allows you to subscribe to specific data streams (e.g., left frame, sensor data) and specify callback functions that are called whenever new data is available. Each data stream runs in its own thread, enabling parallel data fetching without one stream waiting for another. You can also specify the interval at which data is fetched, or set it to zero for the fastest possible streaming.
- Subscribe to Data Streams: Subscribe to specific data types you are interested in.
- Parallel Data Fetching: Each data stream runs in its own thread for efficient parallel processing.
- Custom Callbacks: Provide your own callback functions to process the data as it arrives.
- Adjustable Fetch Interval: Control the rate at which data is fetched by specifying the interval.
from flightmatrix.bridge import FlightMatrixBridge
from flightmatrix.utilities import DataStreamer
# Initialize the FlightMatrixBridge
bridge = FlightMatrixBridge()
# Initialize the DataStreamer
streamer = DataStreamer(bridge)
```python
import cv2

# Define a callback function for left frame data
def left_frame_callback(frame_data):
    frame = frame_data['frame']
    timestamp = frame_data['timestamp']
    cv2.imshow("Left Frame", frame)
    print("Left Frame Timestamp:", timestamp)

# Subscribe to the left frame data stream
streamer.subscribe("left_frame", left_frame_callback)

# Define a callback function for sensor data
def sensor_data_callback(sensor_data):
    print("Sensor Data:", sensor_data)

# Subscribe to the sensor data stream
streamer.subscribe("sensor_data", sensor_data_callback)
```
- left_frame: Left visual frame data.
- right_frame: Right visual frame data.
- left_zdepth: Left z-depth frame data.
- right_zdepth: Right z-depth frame data.
- left_seg: Left segmentation frame data.
- right_seg: Right segmentation frame data.
- sensor_data: Sensor data from the drone or robot.
Use the `subscribe` method to subscribe to the desired data streams and provide a callback function to process the data as it arrives. The callback function will be called with the data as an argument whenever new data is available. For example, the `left_frame_callback` function will be called with the left frame data whenever a new frame is available.
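Under the hood, each subscription conceptually amounts to a small polling loop running in its own thread. The sketch below illustrates the idea only; it is not the `DataStreamer` implementation:

```python
import threading
import time

def poll_stream(fetch_fn, callback, interval, stop_event):
    """Repeatedly fetch data and hand it to the callback until stopped."""
    while not stop_event.is_set():
        callback(fetch_fn())
        if interval > 0:
            time.sleep(interval)

# Example: stream the left frame through a callback at roughly 10 Hz
# stop_event = threading.Event()
# threading.Thread(target=poll_stream,
#                  args=(bridge.get_left_frame, left_frame_callback, 0.1, stop_event),
#                  daemon=True).start()
```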
```python
import cv2
from flightmatrix.bridge import FlightMatrixBridge
from flightmatrix.utilities import DataStreamer

def left_frame_callback(left_frame_data):
    left_frame = left_frame_data['frame']
    left_timestamp = left_frame_data['timestamp']
    cv2.imshow("Left Frame", left_frame)
    # Close the windows when 'q' is pressed
    if cv2.waitKey(1) & 0xFF == ord('q'):
        cv2.destroyAllWindows()
    print("Left Frame Timestamp:", left_timestamp)

def right_frame_callback(right_frame_data):
    right_frame = right_frame_data['frame']
    right_timestamp = right_frame_data['timestamp']
    cv2.imshow("Right Frame", right_frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        cv2.destroyAllWindows()
    print("Right Frame Timestamp:", right_timestamp)

def left_zdepth_callback(left_zdepth_data):
    left_zdepth = left_zdepth_data['frame']
    left_timestamp = left_zdepth_data['timestamp']
    cv2.imshow("Left Z-Depth", left_zdepth)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        cv2.destroyAllWindows()
    print("Left Z-Depth Timestamp:", left_timestamp)

def right_zdepth_callback(right_zdepth_data):
    right_zdepth = right_zdepth_data['frame']
    right_timestamp = right_zdepth_data['timestamp']
    cv2.imshow("Right Z-Depth", right_zdepth)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        cv2.destroyAllWindows()
    print("Right Z-Depth Timestamp:", right_timestamp)

def left_seg_callback(left_seg_data):
    left_seg = left_seg_data['frame']
    left_timestamp = left_seg_data['timestamp']
    cv2.imshow("Left Segmentation", left_seg)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        cv2.destroyAllWindows()
    print("Left Segmentation Timestamp:", left_timestamp)

def right_seg_callback(right_seg_data):
    right_seg = right_seg_data['frame']
    right_timestamp = right_seg_data['timestamp']
    cv2.imshow("Right Segmentation", right_seg)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        cv2.destroyAllWindows()
    print("Right Segmentation Timestamp:", right_timestamp)

def sensor_data_callback(sensor_data):
    print("Sensor Data:", sensor_data)

# Initialize the FlightMatrixBridge
bridge = FlightMatrixBridge()

# Initialize the DataStreamer
streamer = DataStreamer(bridge)

# Subscribe to the left frame data stream
streamer.subscribe("left_frame", left_frame_callback, interval=0)

# Subscribe to the right frame data stream
streamer.subscribe("right_frame", right_frame_callback, interval=0.1)

# Subscribe to the left z-depth data stream
streamer.subscribe("left_zdepth", left_zdepth_callback, interval=0.1)

# Subscribe to the right z-depth data stream
streamer.subscribe("right_zdepth", right_zdepth_callback, interval=0.1)

# Subscribe to the left segmentation data stream
streamer.subscribe("left_seg", left_seg_callback, interval=0.1)

# Subscribe to the right segmentation data stream
streamer.subscribe("right_seg", right_seg_callback, interval=0.1)

# Subscribe to the sensor data stream
streamer.subscribe("sensor_data", sensor_data_callback, interval=0.1)
```
FlightMatrix Bridge provides a robust network streaming feature that allows seamless communication between the server and multiple clients. This enables real-time data transmission, including sensor data, visual frames, and movement commands, facilitating efficient control and monitoring of flight simulations.
- Server (`FlightMatrixServer`)
- Client (`FlightMatrixClient`)

The server is responsible for broadcasting subscribed data streams to connected clients. It leverages the `FlightMatrixServer` class to manage client connections, handle subscriptions, and continuously send data at specified intervals.
- `start()`: Initializes and starts the server to listen for incoming client connections.
- `accept_clients()`: Handles the acceptance of new client connections.
- `handle_client(client_socket)`: Manages communication with a connected client.
- `send_data_to_client(client_socket)`: Sends subscribed data streams to the client.
- `collect_data(subscription_list)`: Gathers the required data based on client subscriptions.
- `send_data_in_chunks(client_socket, data_bytes, chunk_size=4096)`: Ensures efficient data transmission by sending data in manageable chunks (see the sketch after this list).
- `receive_data(sock)`: Receives incoming data from clients.
- `disconnect_client(client_socket)`: Safely disconnects a client from the server.
- `stop()`: Stops the server and disconnects all clients.
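As an illustration of the chunking idea referenced above (a generic socket sketch, not the server's actual code):

```python
import socket

def send_in_chunks(sock: socket.socket, data_bytes: bytes, chunk_size: int = 4096) -> None:
    """Send a byte payload over a connected socket in fixed-size chunks."""
    for start in range(0, len(data_bytes), chunk_size):
        sock.sendall(data_bytes[start:start + chunk_size])
```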
Clients connect to the server to receive real-time data streams. The `FlightMatrixClient` class facilitates this connection, allowing clients to specify their data subscriptions and handle incoming data through callback functions.
- `__init__(host, port, subscription_list, data_callback)`: Initializes the client with server details, desired data streams, and a callback for handling received data.
- `connect()`: Establishes a connection to the server.
- `receive_data_loop()`: Continuously listens for incoming data from the server.
- `process_received_data(data)`: Processes and handles the received data.
- `send_movement_command(x, y, z, roll, pitch, yaw)`: Sends movement commands to the server.
- `send_data(data)`: Sends arbitrary data to the server.
- `receive_data()`: Receives data from the server.
- `disconnect()`: Disconnects the client from the server.
- Customizable Intervals: Set the frequency of data retrieval or opt for the fastest possible streaming.
- Data Subscriptions: Subscribe to specific data streams based on your requirements.
- Data Handling: Process received data using custom callback functions.
- Movement Commands: Send movement commands to control the drone or robot remotely.
- Error Handling: Manage exceptions and ensure reliable communication between the server and clients.
- Efficient Data Transmission: Optimize data transfer by sending data in manageable chunks.
- Client Disconnection: Safely disconnect clients from the server to maintain network integrity.
- Real-Time Communication: Enable real-time data streaming for monitoring and control applications.
- Multiple Client Support: Connect multiple clients to the server for simultaneous data transmission.
```python
# server_example.py
from flightmatrix.bridge import FlightMatrixBridge
from flightmatrix.server import FlightMatrixServer
import time

# Initialize the bridge with desired resolution
bridge = FlightMatrixBridge()

# Start the server
server = FlightMatrixServer(
    host='0.0.0.0',
    port=9999,
    bridge=bridge,
    broadcast_interval=0.00  # Adjusted interval for higher FPS
)
server.start()

try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    server.stop()
```
```python
from flightmatrix.server import FlightMatrixClient
import cv2
import time

def data_callback(data):
    # Handle received data
    if 'sensor_data' in data:
        sensor_data = data['sensor_data']
        print(f"Sensor Data: {sensor_data}")
    if 'left_frame' in data:
        left_frame_info = data['left_frame']
        frame = left_frame_info['frame']  # Frame is already decoded
        cv2.imshow('Left Frame', frame)
        cv2.waitKey(1)
    if 'right_frame' in data:
        right_frame_info = data['right_frame']
        frame = right_frame_info['frame']  # Frame is already decoded
        cv2.imshow('Right Frame', frame)
        cv2.waitKey(1)
    # Handle other data types similarly

# Specify the data types you want to subscribe to
subscription_list = ['sensor_data', 'left_frame', 'right_frame']

# Initialize the client
client = FlightMatrixClient(
    host='localhost',
    port=9999,
    subscription_list=subscription_list,
    data_callback=data_callback
)
client.connect()

# Example: Send a movement command once, if needed
time.sleep(5)
client.send_movement_command(x=0, y=0, z=0, roll=0, pitch=0, yaw=5)
time.sleep(5)
client.send_movement_command(x=0, y=0, z=0, roll=0, pitch=0, yaw=0)

# Keep the client running to receive data
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    client.disconnect()
    cv2.destroyAllWindows()
```
Both the `FlightMatrixServer` and `FlightMatrixClient` classes provide methods to configure logging settings.
You can adjust the verbosity of the logs by setting the log level. Available levels are `'DEBUG'`, `'INFO'`, `'WARNING'`, `'ERROR'`, and `'SUCCESS'`.
from flightmatrix.server import FlightMatrixServer
# Initialize the server
server = FlightMatrixServer(host='0.0.0.0', port=9999, bridge=bridge)
# Set log level to 'DEBUG'
server.set_log_level('DEBUG')
from flightmatrix.server import FlightMatrixClient
# Initialize the client
client = FlightMatrixClient(host='localhost', port=9999, subscription_list=['sensor_data'], data_callback=data_callback)
# Set log level to 'DEBUG'
client.set_log_level('DEBUG')
You can enable logging to a file for both `FlightMatrixServer` and `FlightMatrixClient`.
from flightmatrix.server import FlightMatrixServer
# Initialize the server
server = FlightMatrixServer(host='0.0.0.0', port=9999, bridge=bridge)
# Enable logging to file
server.set_write_to_file(True)
from flightmatrix.server import FlightMatrixClient
# Initialize the client
client = FlightMatrixClient(host='localhost', port=9999, subscription_list=['sensor_data'], data_callback=data_callback)
# Enable logging to file
client.set_write_to_file(True)
This project was developed and maintained by Ranit Bhowmick, a Robotics and Automation engineer with a passion for building innovative solutions in AI, game development, and full-stack projects. Specializing in advanced Python programming, machine learning, and robotics, I’m always open to collaboration and eager to explore new challenges.
I'd like to express my gratitude to the Unreal Engine community for their support and feedback. I'm always open to suggestions and contributions to improve this project further.