RPM sensor on the VESC motor controller. Reports speed in meters-per-second over /vesc_odom
(also accessible through a serial port if not using ROS).
Rate: 50 Hz
Unit: m/s
Latency: low (effectively negligible for control)
Accuracy: good, if calibrated
Data source: https://github.com/f1tenth/vesc/
The motor is a brushless motor, and its RPM is derived from the electrical RPM (ERPM). A brushless motor is driven by three (or more!) electromagnet poles energized in different phases. By monitoring electrical feedback (back-EMF) as the rotor magnets pass each electromagnet, the controller can estimate the motion of the shaft. See the feedback effect in this animation: https://www.youtube.com/watch?v=43JMIuwVrY4
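As a rough sketch of the ERPM-to-speed conversion: ERPM is divided by the motor's pole-pair count to get mechanical RPM, then scaled by the gear ratio and wheel circumference. The constants below are placeholder assumptions, not the real values for any particular car - they must be measured and calibrated.

```python
import math

# Placeholder calibration values -- measure these for your car.
MOTOR_POLE_PAIRS = 2     # assumption: ERPM = mechanical RPM * pole pairs
GEAR_RATIO = 8.0         # assumption: motor revolutions per wheel revolution
WHEEL_DIAMETER_M = 0.1   # assumption: wheel diameter in meters

def erpm_to_speed(erpm: float) -> float:
    """Estimate ground speed (m/s) from the VESC's reported ERPM."""
    motor_rpm = erpm / MOTOR_POLE_PAIRS
    wheel_rpm = motor_rpm / GEAR_RATIO
    wheel_circumference_m = math.pi * WHEEL_DIAMETER_M
    return wheel_rpm / 60.0 * wheel_circumference_m
```

In practice, drivers often collapse these constants into a single linear gain (and offset) found empirically, which is what "good, if calibrated" refers to above.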
Velocity can also be derived from onboard localization, or from an external localization system (e.g., a VR tracker).
We could also attach an encoder to an axle.
Hokuyo UST-10LX (or equivalent)
2D (planar) lidar
FOV: 270 degrees - blindspot is behind the car
1080 points evenly distributed across 270 degrees, for an angular resolution of 0.25 degrees (0.004363 rad) per beam
Max range: 30 meters claimed; in practice, less
Dark, rough, or reflective objects may not be seen until they’re very close
Data source: ‘/scan’ provided by urg_node
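Given those specs, the angle of each beam in the scan array can be recovered from its index. This is a minimal sketch assuming the scan is centered on the car's forward axis (so it starts at -135 degrees); the real values should be read from the scan message's own angle fields rather than hard-coded.

```python
import math

# Assumptions: 1080 beams, 270-degree FOV centered on the forward axis.
ANGLE_MIN = math.radians(-135.0)
ANGLE_INCREMENT = math.radians(270.0) / 1080  # 0.25 deg (~0.004363 rad) per beam

def beam_angle(i: int) -> float:
    """Angle (radians) of beam i, with 0 pointing straight ahead."""
    return ANGLE_MIN + i * ANGLE_INCREMENT
```

For example, the middle of the array (index 540) points straight ahead, while index 0 points 135 degrees to one side.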
We do not have a direct way to sense steering position - the servo does not send feedback. Right now, we have an estimate of the maximum steering velocity, and use that (together with past commands) to estimate the position.
Depending on the car, the servo may be connected to an external I2C port, or to the VESC’s servo port
Rate: reported at 50 Hz (through VESC) - actual measurement rate is unknown
Unit: normalized position from -1 to 1; can be converted to degrees/radians
Latency: ???
Accuracy: poor - at best an estimate
Data source: Virtual (estimated from previous commands)
Like speed, this could be derived from other sensor data and localization. Servos with a feedback wire exist (we have one mounted on a car, but we don’t use it right now), or we could attach an encoder.
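Converting the normalized [-1, 1] value to a physical angle is just a scale by the full-lock steering angle. The full-lock angle below is an assumed placeholder; it varies per car and should be measured.

```python
import math

# Assumption: the car reaches full lock at roughly 30 degrees -- measure yours.
MAX_STEER_RAD = math.radians(30.0)

def steering_to_radians(cmd: float) -> float:
    """Map the normalized [-1, 1] steering value to a wheel angle in radians."""
    cmd = max(-1.0, min(1.0, cmd))  # clamp out-of-range commands
    return cmd * MAX_STEER_RAD
```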
i.e., X, Y, angle
We have multiple options available:
We have access to the speed and (estimated) steering position, so we can integrate/accumulate these (using the bicycle model) to get a rough estimate of position. This is mostly useful for measuring short-term relative position changes, and serves as a starting point or seed for more advanced methods (see Onboard Localization below). Right now, this estimate is also published over /vesc_odom.
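The accumulation step can be sketched as one Euler update of the kinematic bicycle model. The wheelbase below is an assumed placeholder, and the inputs (speed, steering angle, time step) correspond to the quantities described above.

```python
import math

WHEELBASE_M = 0.33  # assumption: front-to-rear axle distance; measure yours

def integrate_pose(x, y, theta, v, steer, dt):
    """One Euler step of the kinematic bicycle model.

    v     -- speed from /vesc_odom (m/s)
    steer -- estimated steering angle (radians)
    dt    -- time step (s), e.g. 0.02 at 50 Hz
    """
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v / WHEELBASE_M) * math.tan(steer) * dt
    return x, y, theta
```

Because each step adds a little error (from the estimated steering angle especially), this drifts over time, which is why it is only trusted for short-term relative motion.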
i.e., sensor fusion
We can use LIDAR measurements to enhance the odometry estimate. With SLAM, we can use a priori knowledge of the environment to determine an absolute position.
TODO: accuracy, rate, etc
tl;dr Accuracy & rate varies
We can use cameras and other sensors in the environment to track the car. As long as line-of-sight is maintained, this is usually very high quality, but is limited in the area/volume it can cover.