Autonomous Sand-Leveling Rover
Inspiration
Construction sites and earthmoving projects rely heavily on skilled operators to manually level terrain. We were inspired by the idea of creating a small-scale autonomous system that mimics real grading machinery but with intelligence powered by computer vision and real-time control. The challenge of transforming an uneven sand surface into a flat plane using sensing, estimation, and control felt like the perfect intersection of robotics, perception, and embedded systems.
We wanted to build a robot that doesn’t just move, but understands terrain and actively reshapes it.
What It Does
Our rover autonomously levels sand dunes inside a sandbox by:
- Using computer vision to detect color variations in sand.
- Interpreting those variations as height differences.
- Estimating surface slope and unevenness.
- Adjusting a front-mounted leveling blade in real time.
- Driving in controlled passes to flatten the terrain.
The robot continuously measures heading with an IMU, maintains straight driving using a PD controller, and makes controlled turns when reaching boundaries.
How We Built It
1. Hardware System
- Drive Base: Differential-drive rover chassis
- Motor Control: L298N H-bridge with PWM speed control
- IMU: MPU6050 for yaw stabilization
- Ultrasonic Sensor: Boundary detection and safe turning
- Raspberry Pi: Onboard compute for CV and control
- Custom 3D-Printed Leveling Tool: Adjustable front blade for grading
2. Computer Vision Pipeline
We used a monocular camera mounted above the sand surface. The pipeline:
- Capture frame.
- Convert to HSV color space.
- Segment sand color gradients.
- Extract intensity distribution across the surface.
- Estimate slope direction and uneven regions.
If brightness correlates with sand thickness, we approximate surface variation as:
\[ \nabla h(x, y) \propto \nabla I(x, y) \]
Where:
- \( h(x, y) \) = estimated surface height
- \( I(x, y) \) = pixel intensity
We compute directional gradients to determine where sand is elevated.
Control System
Heading Stabilization
Yaw is estimated by integrating gyro-Z:
\[ \theta(t) = \int \omega_z(t)\, dt \]
A PD controller maintains straight driving:
\[ u = -\left(K_p e + K_d \dot{e}\right) \]
Where:
- \( e \) = yaw error
- \( \dot{e} \) = yaw rate
- \( u \) = wheel speed correction
Wheel speeds are adjusted as:
\[ v_L = v - u, \quad v_R = v + u \]
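Putting the three equations together, the heading loop can be sketched as follows. The gains, the timing source, and the PWM clamp range are illustrative assumptions, not the exact values tuned on the rover:

```python
import time

class HeadingPD:
    """PD controller holding a straight heading from integrated gyro yaw."""

    def __init__(self, kp=2.0, kd=0.5, target_yaw=0.0):
        self.kp, self.kd = kp, kd
        self.target = target_yaw
        self.yaw = 0.0
        self.last_t = None

    def update(self, omega_z, t=None):
        """Integrate gyro-Z into yaw and return the wheel-speed correction u."""
        t = time.monotonic() if t is None else t
        if self.last_t is not None:
            self.yaw += omega_z * (t - self.last_t)   # theta = integral of omega_z dt
        self.last_t = t
        e = self.yaw - self.target                    # yaw error
        u = -(self.kp * e + self.kd * omega_z)        # u = -(Kp e + Kd de/dt)
        return u

def wheel_speeds(v, u, v_max=100.0):
    """Map base speed and correction to left/right duty cycles: v_L = v - u, v_R = v + u."""
    clamp = lambda x: max(0.0, min(v_max, x))
    return clamp(v - u), clamp(v + u)
```

A positive yaw error (drift to the left, with counterclockwise positive) produces a negative \( u \), which speeds up the left wheel and slows the right one, steering the rover back on course.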
Leveling Strategy
We divided the sandbox into virtual lanes and performed systematic passes. Based on detected gradient direction:
- If slope increases to the right, shift blade bias.
- If slope increases forward, reduce speed and increase blade depth.
- If region is flat, maintain steady pass.
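The three rules above reduce to a small decision function. The thresholds, bias magnitudes, and command fields here are illustrative placeholders for whatever the blade servo actually accepts:

```python
def blade_command(gx, gy, flat_thresh=1.0):
    """Translate mean surface gradients into a pass behavior.

    Convention (matching the gradient estimate): gx > 0 means sand
    rises to the right; gy > 0 means sand rises ahead of the rover.
    """
    cmd = {"speed": 1.0, "blade_depth": 0.0, "blade_bias": 0.0}
    if abs(gx) < flat_thresh and abs(gy) < flat_thresh:
        return cmd  # flat region: maintain a steady pass
    if abs(gx) >= flat_thresh:
        # Slope increases to one side: shift blade bias toward the high side.
        cmd["blade_bias"] = 0.5 if gx > 0 else -0.5
    if gy >= flat_thresh:
        # Slope increases forward: slow down and cut deeper.
        cmd["speed"] = 0.5
        cmd["blade_depth"] = 0.5
    return cmd
```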
Challenges We Faced
1. Monocular Depth Ambiguity
Pixel intensity is only a proxy for height, not a true depth measurement, and lighting conditions strongly affected intensity readings. We mitigated this with:
- HSV normalization
- Region averaging
- Filtering out shadows
2. IMU Drift
Yaw integration drifts over time. We:
- Calibrated gyro bias at startup
- Added adaptive bias correction when stationary
- Re-zeroed heading after major turns
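Those three fixes can be sketched as follows. The sample count, stationarity threshold, and smoothing rate are illustrative assumptions, and `read_omega_z` stands in for the actual MPU6050 read function:

```python
import statistics

def calibrate_gyro_bias(read_omega_z, n_samples=200):
    """Average gyro-Z while the rover sits still to estimate the startup bias."""
    return statistics.fmean(read_omega_z() for _ in range(n_samples))

class DriftCompensatedYaw:
    """Integrates bias-corrected gyro-Z and adapts the bias when stationary."""

    def __init__(self, bias, stationary_thresh=0.02, alpha=0.01):
        self.bias = bias
        self.thresh = stationary_thresh
        self.alpha = alpha  # EMA rate for adaptive bias correction
        self.yaw = 0.0

    def update(self, omega_z, dt):
        corrected = omega_z - self.bias
        if abs(corrected) < self.thresh:
            # Near-zero rate: treat as stationary and refine the bias estimate.
            self.bias += self.alpha * corrected
        else:
            self.yaw += corrected * dt
        return self.yaw

    def rezero(self):
        """Reset the heading reference after a major turn."""
        self.yaw = 0.0
```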
3. Sand Is Nonlinear
Sand does not behave like rigid terrain. Blade force causes unpredictable redistribution. We had to:
- Reduce speed
- Make multiple shallow passes instead of one deep cut
4. Mechanical Stability
The leveling tool initially dug too aggressively. We redesigned it with:
- A flatter blade geometry
- Slight trailing angle
- Reduced friction surface
What We Learned
- Closed-loop control is critical. Open-loop driving fails quickly.
- Perception and control must be tightly integrated.
- Small sensor noise can significantly affect terrain estimation.
- Mechanical design matters as much as software.
- Iterative testing beats theoretical perfection.
Most importantly, we learned how to combine computer vision, embedded systems, and control theory into a cohesive autonomous system.
Future Improvements
- Add stereo or structured-light depth for better height estimation.
- Implement surface plane fitting using least squares.
- Use PID blade height control with feedback.
- Add mapping memory to avoid re-leveling flat regions.
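The plane-fitting improvement could look like the sketch below: fit \( h(x, y) \approx ax + by + c \) to the estimated height grid, then use the residual to flag regions that still need passes. The grid shape and function name are hypothetical:

```python
import numpy as np

def fit_plane(heights):
    """Least-squares fit of a plane a*x + b*y + c to a 2D height grid.

    Returns the coefficients (a, b, c) and the per-cell residual
    h - (a*x + b*y + c); large residuals mark unleveled regions.
    """
    ny, nx = heights.shape
    xs, ys = np.meshgrid(np.arange(nx), np.arange(ny))
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    coeffs, *_ = np.linalg.lstsq(A, heights.ravel(), rcond=None)
    residual = heights - (A @ coeffs).reshape(heights.shape)
    return coeffs, residual
```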
Final Thoughts
This project demonstrates how relatively simple sensors — a camera, IMU, and ultrasonic sensor — can work together to create intelligent terrain-shaping behavior. By blending perception with real-time control, we built a rover that not only navigates sand dunes but reshapes them autonomously.
Built With
- 3dprinting
- computer-vision (hsv-filtering, gradient-detection)
- hc-sr04-ultrasonic-sensor
- l298n-motor-driver
- mpu6050-imu
- opencv
- pd-control
- pwm-motor-control
- python
- raspberry-pi
- rpi.gpio
- sensor-fusion