Programming
Video Processing
  • Limelight was designed to make robotic perception as easy and reliable as possible without sacrificing raw performance.
  • Limelight is easy enough for complete beginners, and powerful enough for professionals.
  • Configure zero-code computer vision pipelines for color blobs, AprilTags, neural networks, and more using the built-in web interface.
  • Write custom Python SnapScript pipelines with TensorFlow, OpenCV, and more using the built-in web interface or Visual Studio Code.
  • The Limelight hardware integrates a high-bandwidth, ultra-low-latency MIPI-CSI imaging sensor, arm64 computer, power conditioning, and LimelightOS.
  • LimelightOS supports REST/HTTP, WebSocket, Modbus, and NetworkTables protocols, and JSON, Protobuf, and raw output formats.

Accessing API

  • Limelight OS provides a NetworkTables API for accessing Limelight data and controlling Limelight settings.
  • NetworkTables is a protocol for distributed data storage and retrieval.
  • NetworkTables is used by the FIRST Robotics Competition (FRC) and other robotics competitions.
  • To read a targeting value (for example tx), use:
NetworkTableInstance.getDefault().getTable("limelight").getEntry("tx").getDouble(0.0);
  • To set a control value (for example the active pipeline), use:
NetworkTableInstance.getDefault().getTable("limelight").getEntry("pipeline").setDouble(0);
  • To read 3D data (for example the target's pose in camera space), use:
NetworkTableInstance.getDefault().getTable("limelight").getEntry("targetpose_cameraspace").getDoubleArray(new double[6]);
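As a sketch of how the read/write pattern above might be wrapped in robot code, assuming WPILib's NetworkTables classes are on the classpath (the class and method names here are illustrative, not part of the Limelight API):

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightIO {
    // All Limelight data lives under the "limelight" table.
    private final NetworkTable table =
        NetworkTableInstance.getDefault().getTable("limelight");

    /** Returns true if the current pipeline reports a valid target (tv == 1). */
    public boolean hasTarget() {
        return table.getEntry("tv").getDouble(0.0) == 1.0;
    }

    /** Horizontal offset from crosshair to target, in degrees. */
    public double getTx() {
        return table.getEntry("tx").getDouble(0.0);
    }

    /** Selects the active pipeline (0..9). */
    public void setPipeline(int index) {
        table.getEntry("pipeline").setDouble(index);
    }
}
```

Without a connected Limelight, the getters simply return the supplied defaults, which makes the wrapper safe to construct in simulation.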

Web Interface

  • Configure Limelight settings and view data.
  • Manage pipelines, SnapScript pipelines, neural networks, color blobs, and AprilTags.
  • View Limelight data in real-time.

SnapScript

  • SnapScript is a Python-like language designed to be easy to learn and use.
  • SnapScript pipelines can be written in the built-in web interface or in Visual Studio Code.

Compatible with Google Coral

  • Limelight is compatible with the Google Coral USB Accelerator.
  • The Google Coral USB Accelerator is a USB device that provides hardware acceleration for neural networks.

Robot Localization

  • Limelight provides robot localization using AprilTags.
  • Robot localization is the process of determining the position and orientation of a robot in a known environment.

LimelightOS

  • LimelightOS is a custom operating system that is designed to run on the Limelight hardware.
  • LimelightOS is based on the Linux kernel.
  • LimelightOS is designed to be fast, reliable, and easy to use.

Basic Targeting Data API

Key | Type | Description
tv | int | 1 if a valid target exists, 0 if no valid targets exist
tx | double | Horizontal offset from crosshair to target (degrees)
ty | double | Vertical offset from crosshair to target (degrees)
txnc | double | Horizontal offset from principal pixel to target
tync | double | Vertical offset from principal pixel to target
ta | double | Target area (0% of image to 100% of image)
tl | double | The pipeline's latency contribution (ms). Add to "cl" to get total latency.
cl | double | Capture pipeline latency (ms). Time between the end of the exposure of the middle row of the sensor and the beginning of the tracking pipeline.
tshort | double | Sidelength of the shortest side of the fitted bounding box (pixels)
tlong | double | Sidelength of the longest side of the fitted bounding box (pixels)
thor | double | Horizontal sidelength of the rough bounding box (0 - 320 pixels)
tvert | double | Vertical sidelength of the rough bounding box (0 - 320 pixels)
getpipe | int | True active pipeline index of the camera (0..9)
json | string | Full JSON dump of targeting results
tclass | string | Class name of the primary neural detector result or neural classifier result
tc | doubleArray | Average HSV color underneath the crosshair region (3x3 pixel region) as a NumberArray
hb | double | Heartbeat value; increases once per frame, resets at 2 billion
hw | doubleArray | Hardware metrics [fps, cpu temp, ram usage, temp]
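One common use of ty from the table above is estimating distance to a fixed-height target with basic trigonometry. The mounting height, mounting angle, and target height below are hypothetical example values, not part of the Limelight API:

```java
// Sketch: estimating floor distance to a target from the ty angle.
// distance = (targetHeight - cameraHeight) / tan(cameraPitch + ty)
public class DistanceEstimator {
    static final double CAMERA_HEIGHT_M = 0.50;  // lens height above floor (example)
    static final double TARGET_HEIGHT_M = 1.45;  // target center above floor (example)
    static final double CAMERA_PITCH_DEG = 25.0; // upward mounting angle (example)

    /** Distance along the floor to the target, given ty in degrees. */
    static double distanceMeters(double tyDegrees) {
        double angle = Math.toRadians(CAMERA_PITCH_DEG + tyDegrees);
        return (TARGET_HEIGHT_M - CAMERA_HEIGHT_M) / Math.tan(angle);
    }
}
```

Targets that appear higher in the image (larger ty) are closer, so the estimate shrinks as ty grows.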

Advanced Targeting Data API

Key | Type | Description
botpose | doubleArray | Robot transform in field-space. Translation (X, Y, Z) in meters, rotation (roll, pitch, yaw) in degrees, total latency (cl+tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image)
botpose_wpiblue | doubleArray | Same as botpose, relative to the blue driver station WPILib origin
botpose_wpired | doubleArray | Same as botpose, relative to the red driver station WPILib origin
botpose_orb | doubleArray | Same as botpose, computed with MegaTag2
botpose_orb_wpiblue | doubleArray | Same as botpose_orb, relative to the blue driver station WPILib origin
botpose_orb_wpired | doubleArray | Same as botpose_orb, relative to the red driver station WPILib origin
camerapose_targetspace | doubleArray | 3D transform of the camera in the coordinate system of the primary in-view AprilTag (array of 6: [tx, ty, tz, pitch, yaw, roll], meters and degrees)
targetpose_cameraspace | doubleArray | 3D transform of the primary in-view AprilTag in the coordinate system of the camera (array of 6: [tx, ty, tz, pitch, yaw, roll], meters and degrees)
targetpose_robotspace | doubleArray | 3D transform of the primary in-view AprilTag in the coordinate system of the robot (array of 6: [tx, ty, tz, pitch, yaw, roll], meters and degrees)
botpose_targetspace | doubleArray | 3D transform of the robot in the coordinate system of the primary in-view AprilTag (array of 6: [tx, ty, tz, pitch, yaw, roll], meters and degrees)
camerapose_robotspace | doubleArray | 3D transform of the camera in the coordinate system of the robot (array of 6)
tid | int | ID of the primary in-view AprilTag
priorityid | int (setter) | Sets the required tag ID for tx/ty targeting; other targets are ignored. Does not affect localization
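Per the botpose description above, the array carries eleven entries: translation (3), rotation (3), total latency, tag count, tag span, average tag distance, and average tag area. A minimal sketch of unpacking it (field names here are illustrative, not Limelight API names):

```java
// Sketch: unpacking the 11-entry botpose array into named fields.
public class BotPose {
    public final double x, y, z;          // translation, meters
    public final double roll, pitch, yaw; // rotation, degrees
    public final double latencyMs;        // total latency (cl + tl)
    public final double tagCount, tagSpan, avgTagDist, avgTagArea;

    public BotPose(double[] v) {
        if (v.length < 11) {
            throw new IllegalArgumentException("expected at least 11 entries");
        }
        x = v[0]; y = v[1]; z = v[2];
        roll = v[3]; pitch = v[4]; yaw = v[5];
        latencyMs = v[6];
        tagCount = v[7]; tagSpan = v[8];
        avgTagDist = v[9]; avgTagArea = v[10];
    }
}
```

The length check guards against reading a stale or default-valued array before the first valid localization result arrives.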

Other Utils API

Key | Type | Description
ledMode | int | Sets Limelight's LED state: [0] use the LED mode set in the current pipeline, [1] force off, [2] force blink, [3] force on
pipeline | int | Sets Limelight's current pipeline (0..9)
stream | int | Sets Limelight's streaming mode: [0] Standard - side-by-side streams if a webcam is attached to Limelight, [1] PiP Main - the secondary camera stream is placed in the lower-right corner of the primary camera stream, [2] PiP Secondary - the primary camera stream is placed in the lower-right corner of the secondary camera stream
crop | doubleArray | Sets the crop rectangle. The pipeline must utilize the default crop rectangle in the web interface. The array must have exactly 4 entries: [X0, X1, Y0, Y1]
camerapose_robotspace_set | doubleArray | Sets the camera's pose in the coordinate system of the robot
priorityid | int (setter) | Sets the required tag ID for tx/ty targeting; other targets are ignored. Does not affect localization
robot_orientation_set | doubleArray | Sets robot orientation and angular velocities in degrees and degrees per second: [yaw, yawrate, pitch, pitchrate, roll, rollrate]
fiducial_id_filters_set | intArray | Overrides the set of fiducial IDs considered valid for localization (array)
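A small sketch of building the setter arrays described above. The array layouts follow the table ([yaw, yawrate, pitch, pitchrate, roll, rollrate] and [X0, X1, Y0, Y1]); the helper names are illustrative, and leaving pitch and roll at zero is an assumption for the common yaw-only case:

```java
// Sketch: constructing the doubleArray payloads for the setter keys.
public class LimelightSetters {
    /** Orientation array for robot_orientation_set (degrees, deg/s). */
    static double[] orientation(double yawDeg, double yawRateDegPerSec) {
        // Assumption: pitch and roll (and their rates) left at zero.
        return new double[] {yawDeg, yawRateDegPerSec, 0.0, 0.0, 0.0, 0.0};
    }

    /** Crop array for the crop key; must have exactly 4 entries. */
    static double[] crop(double x0, double x1, double y0, double y1) {
        if (x0 >= x1 || y0 >= y1) {
            throw new IllegalArgumentException("expected x0 < x1 and y0 < y1");
        }
        return new double[] {x0, x1, y0, y1};
    }
}
```

Validating the crop bounds before writing them avoids silently sending a degenerate rectangle to the camera.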

Libraries


Limelight Library