
Rule-Based Lane Following

Overview

Develop a simple hand-engineered policy that keeps the vehicle centered between lane markers. This tutorial demonstrates a deterministic control loop before you transition to learning-based methods.

Prerequisites

  • Completed Obstacle Avoidance and AMCL Localization
  • Camera calibrated and publishing /camera/image_rect
  • Binary lane mask or segmentation model providing /perception/lane_mask
  • Vehicle accepts steering commands on /drive (ackermann_msgs/AckermannDriveStamped)
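
A quick way to confirm these interfaces are live before starting (topic names exactly as listed above):

ros2 topic hz /camera/image_rect
ros2 topic hz /perception/lane_mask
ros2 topic info /drive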

Learning Objectives

  • Write a program that maps sensor inputs directly to actions (a policy)
  • Extract positional error relative to the lane centerline
  • Translate error into steering corrections using proportional control
  • Validate on a closed track with safety overrides

1. Generate a Lane Mask

  1. Convert the rectified camera stream to HSV and threshold it with OpenCV:
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_white, upper_white)
    
  2. Optionally apply a bird's-eye (inverse perspective) warp so the lane markings appear parallel in a top-down view.
  3. Publish the mask as a sensor_msgs/Image (a minimal node sketch follows this list).
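
A minimal sketch of a node tying these three steps together is shown below. It assumes a colour (bgr8) stream on /camera/image_rect, uses illustrative HSV bounds for white paint, and omits the optional bird's-eye warp; treat the node name and bounds as placeholders to tune for your track.

import cv2
import numpy as np
from cv_bridge import CvBridge
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class LaneMask(Node):
    """Thresholds the rectified camera image and republishes a binary lane mask."""

    def __init__(self):
        super().__init__('lane_mask')
        self.bridge = CvBridge()
        self.pub = self.create_publisher(Image, '/perception/lane_mask', 10)
        self.sub = self.create_subscription(Image, '/camera/image_rect', self.cb, 10)
        # Assumed HSV bounds for white paint; tune for your track lighting.
        self.lower_white = np.array([0, 0, 180], dtype=np.uint8)
        self.upper_white = np.array([180, 40, 255], dtype=np.uint8)

    def cb(self, msg: Image):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, self.lower_white, self.upper_white)
        out = self.bridge.cv2_to_imgmsg(mask, encoding='mono8')
        out.header = msg.header  # keep the camera timestamp and frame_id
        self.pub.publish(out)


def main(args=None):
    rclpy.init(args=args)
    rclpy.spin(LaneMask())
    rclpy.shutdown()


if __name__ == '__main__':
    main()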

2. Estimate Lane Offset

Implement lane_error_node.py:

import cv2
import numpy as np
from cv_bridge import CvBridge
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import Float32

class LaneError(Node):
    def __init__(self):
        super().__init__('lane_error')
        self.bridge = CvBridge()
        # Image row (in pixels) sampled for the lane cross-section.
        self.declare_parameter('roi_row', 400)
        # Scale used to convert a pixel offset into metres.
        self.declare_parameter('pixels_per_meter', 200.0)
        self.pub = self.create_publisher(Float32, '/lane_error', 10)
        self.sub = self.create_subscription(Image, '/perception/lane_mask', self.cb, 10)

    def cb(self, msg: Image):
        roi_row = self.get_parameter('roi_row').value
        ppm = self.get_parameter('pixels_per_meter').value
        mask = self.bridge.imgmsg_to_cv2(msg, desired_encoding='mono8')
        # Clamp the ROI row in case the mask is smaller than expected.
        roi_row = min(roi_row, mask.shape[0] - 1)
        row = mask[roi_row, :]
        indices = np.where(row > 0)[0]
        if len(indices) < 2:
            # Lane not visible in this row; skip the frame.
            return
        # Lane centre = midpoint of the leftmost and rightmost mask pixels.
        center = (indices[0] + indices[-1]) / 2.0
        # Positive error: lane centre lies to the right of the image centre.
        error_px = center - mask.shape[1] / 2.0
        error_m = error_px / ppm
        self.pub.publish(Float32(data=float(error_m)))


def main(args=None):
    rclpy.init(args=args)
    node = LaneError()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
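
As a quick sanity check on the conversion: with the default pixels_per_meter of 200, a lane centre detected 50 px to the right of the image centre yields error_m = 50 / 200 = 0.25 m, which the follower in the next section turns into a steering command of -kp × 0.25 = -0.3 rad at the default gain.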

3. Apply Rule-Based Steering

  1. Create a control node lane_follow_rule.py:
    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import Float32
    from ackermann_msgs.msg import AckermannDriveStamped
    
    class LaneFollower(Node):
        def __init__(self):
            super().__init__('lane_follower')
            # Proportional gain and constant forward speed, tunable at launch.
            self.declare_parameter('kp', 1.2)
            self.declare_parameter('speed', 1.0)
            self.kp = self.get_parameter('kp').value
            self.speed = self.get_parameter('speed').value
            self.sub = self.create_subscription(Float32, '/lane_error', self.cb, 10)
            self.pub = self.create_publisher(AckermannDriveStamped, '/drive', 10)
    
        def cb(self, msg: Float32):
            # Steer against the lateral error: a positive error (lane centre to
            # the right) maps to a negative, i.e. rightward, steering angle.
            steering = -self.kp * msg.data
            cmd = AckermannDriveStamped()
            cmd.drive.speed = self.speed
            cmd.drive.steering_angle = steering
            self.pub.publish(cmd)
    
  2. Add saturation (np.clip) if steering angles exceed hardware limits; see the sketch after this list.
  3. Launch both nodes with ros2 launch fri_control lane_following.launch.py.
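
One possible saturation sketch for step 2 is shown below; max_steer is a hypothetical parameter (in radians) that should be set to your platform's actual steering limit.

import numpy as np

# In LaneFollower.__init__ (sketch):
self.declare_parameter('max_steer', 0.4)   # assumed limit in radians; check your hardware
self.max_steer = self.get_parameter('max_steer').value

# In LaneFollower.cb, replacing the raw proportional command:
raw = -self.kp * msg.data
steering = float(np.clip(raw, -self.max_steer, self.max_steer))
cmd.drive.steering_angle = steering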

4. Track Testing Procedure

  • Start at low speed (0.5 m/s) and gradually increase.
  • A designated observer holds the emergency stop (e-stop) at all times.
  • Record /lane_error, /drive, and the camera feed for review; an example ros2 bag command follows this list.
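
A minimal recording command, assuming the bag is named lane_run (any name works) and the topics listed in the prerequisites:

ros2 bag record -o lane_run /lane_error /drive /camera/image_rect /perception/lane_mask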

5. Evaluation Metrics

  • Mean absolute lane error (target < 0.1 m)
  • Maximum steering angle vs. hardware constraints
  • Percentage of camera frames in which the lane was successfully detected (see the bag-analysis sketch after this list)
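
These metrics can be computed offline from the recorded bag. The sketch below assumes the lane_run bag from the previous section and the default sqlite3 storage backend; it reports the mean absolute lane error, the largest commanded steering angle, and the fraction of camera frames that produced a /lane_error message.

import numpy as np
import rosbag2_py
from rclpy.serialization import deserialize_message
from std_msgs.msg import Float32
from ackermann_msgs.msg import AckermannDriveStamped

reader = rosbag2_py.SequentialReader()
reader.open(
    rosbag2_py.StorageOptions(uri='lane_run', storage_id='sqlite3'),
    rosbag2_py.ConverterOptions(input_serialization_format='cdr',
                                output_serialization_format='cdr'),
)

errors, steers, frames = [], [], 0
while reader.has_next():
    topic, data, _stamp = reader.read_next()
    if topic == '/lane_error':
        errors.append(abs(deserialize_message(data, Float32).data))
    elif topic == '/drive':
        drive = deserialize_message(data, AckermannDriveStamped)
        steers.append(abs(drive.drive.steering_angle))
    elif topic == '/camera/image_rect':
        frames += 1

print(f'mean abs lane error: {np.mean(errors):.3f} m')
print(f'max steering angle : {np.max(steers):.3f} rad')
print(f'lane detected in   : {100.0 * len(errors) / max(frames, 1):.1f} % of frames')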

6. Troubleshooting

  • No lane detection: adjust threshold or ROI height.
  • Oscillations: reduce kp or add derivative damping.
  • Oversteer in curves: scale kp with speed (kp = kp0 / speed), as sketched below.
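
A sketch of the last two fixes combined, a proportional gain scaled by speed plus derivative damping. Here kp0, kd, prev_error, and prev_time are hypothetical attributes that would be initialised in LaneFollower.__init__:

# Inside LaneFollower.cb (sketch):
now = self.get_clock().now().nanoseconds * 1e-9
dt = max(now - self.prev_time, 1e-3)                  # guard against tiny/zero dt
kp = self.kp0 / max(self.speed, 0.1)                  # soften the gain as speed rises
d_term = self.kd * (msg.data - self.prev_error) / dt  # derivative damping
steering = -(kp * msg.data + d_term)
self.prev_error, self.prev_time = msg.data, now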

Wrap-Up

Document tuned gains, update the launch file, and store datasets for future learning-based comparisons.

Next Tutorial: Continue with Data Collection for Imitation Learning.