All Classes and Interfaces
Represents a robot pose sample used for pose estimation.
Represents the angle to a simple target; not used for pose estimation.
IO implementation for real PhotonVision hardware.
Adapts the AprilTag vision system to the standardized PoseSource interface.
Utility class for registering named commands with PathPlanner's autonomous functionality.
Constants for LED control and configuration.
Pre-configured animation patterns.
Pre-defined color configurations.
Interface for controlling LED lighting functionality on the robot.
Data structure for logging LED state information.
Record class representing an RGB color value for LED control.
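An RGB color record like the one described above might be sketched as follows. This is a hypothetical illustration; the record name, component names, and the channel clamping are assumptions, not the project's actual API.

```java
// Hypothetical sketch of an RGB color record for LED control.
// Name, components, and clamping behavior are assumptions.
record RgbColor(int red, int green, int blue) {
    // Compact constructor clamps each channel into the 0-255 range
    // so out-of-range inputs cannot reach the LED hardware.
    RgbColor {
        red = Math.min(255, Math.max(0, red));
        green = Math.min(255, Math.max(0, green));
        blue = Math.min(255, Math.max(0, blue));
    }
}
```

A record fits this use case because color values are immutable data carriers: equality, `hashCode`, and accessors come for free.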
Hardware implementation of the BlingIO interface for controlling physical LED strips.
Simulation implementation of the BlingIO interface.
Subsystem for controlling robot LED lighting effects.
Automatically generated file containing build version information.
Robot-wide constants class that defines runtime modes and device configurations.
CAN bus device ID assignments.
Defines the possible runtime modes for the robot code.
A command that automatically rotates the robot to face detected game objects while allowing manual translation control.
Represents a notification object to be sent to the Elastic dashboard.
Represents the possible levels of notifications for the Elastic dashboard.
Command that measures the velocity feedforward constants for the drive motors.
Represents a game element; all measurements are in meters.
IO implementation for Pigeon 2.
The LocalizationFusion subsystem manages robot pose estimation by fusing data from multiple sources.
Constants used by the LocalizationFusion subsystem for robot pose estimation and tracking.
Constants related to initialization requirements and validation counts.
Constants related to timing and update intervals.
Constants related to pose validation and thresholds.
Manages the state machine for the robot localization system.
Represents the possible states of the localization system.
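A state enum of this kind might look like the sketch below. The state names and comments are assumptions for illustration only, not the project's actual values.

```java
// Hypothetical sketch of a localization state enum.
// State names are assumptions, not the project's actual API.
enum LocalizationState {
    UNINITIALIZED, // waiting for a first valid pose from any source
    RESETTING,     // pushing a reference pose out to the sources
    RUNNING,       // fusing pose updates normally
    DEGRADED       // a source failed validation; running on fallback data
}
```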
The Main class serves as the entry point for the robot program.
Physics sim implementation of module IO.
Command that displays an animation while waiting for alliance selection.
Interface for a single camera's object detection IO operations.
Subsystem that handles object detection using PhotonVision cameras.
Constants used by the Oculus Quest navigation subsystem.
Interface for handling input/output operations with the Oculus Quest hardware.
Data structure for Oculus inputs that can be automatically logged.
Implementation of OculusIO for real hardware communication via NetworkTables.
Simulation implementation of OculusIO that provides static test values.
Adapts the Meta Quest SLAM system to the standardized PoseSource interface.
Manages communication and pose estimation with a Meta Quest VR headset.
The Operator Interface (OI) class handles all driver control inputs and button mappings.
A command that automatically navigates the robot to the best detected game object.
Manages PathPlanner integration for autonomous path following and path finding.
Provides an interface for asynchronously reading high-frequency measurements to a set of queues.
Command that sends a ping to the Oculus system and waits for a response.
Standardized interface for pose estimation sources in the robot localization system.
Functional interface for consuming pose updates from the fusion system.
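A consumer interface of this shape could be sketched as below. The interface name, parameter list, and units are assumptions, not the project's actual signature.

```java
// Hypothetical sketch of a pose-update consumer.
// Name, parameters, and units are assumptions, not the project's API.
@FunctionalInterface
interface PoseConsumer {
    // Receives a fused pose estimate (x, y in meters, heading in radians)
    // along with the timestamp in seconds at which it was measured.
    void accept(double xMeters, double yMeters, double headingRad, double timestampSec);
}
```

Because it is a functional interface, callers can register a lambda, e.g. `(x, y, h, t) -> odometry.addVisionMeasurement(x, y, h, t)`, without defining a class.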
Command that resets the Oculus system's pose estimation to a specified target pose.
Main robot class that handles robot lifecycle and mode transitions.
Command that sets the LED color based on the current alliance color.
Interface for logging state transitions in the localization system.
The SwerveDriveSubsystem class manages the robot's swerve drive system, handling odometry, module control, and autonomous path following.
Command that measures the robot's wheel radius by spinning in a circle.
Command that zeros the heading (rotation) of the Oculus system.