ArduPilot SITL Update

2026-01-04 00:24:46 +00:00
parent 6c72bbf24c
commit 6804180e21
20 changed files with 2138 additions and 2970 deletions


@@ -1,28 +1,23 @@
# DroneController Guide (GPS-Denied)
# DroneController Guide
Implement your landing algorithm in `drone_controller.py`.
Implement your GPS-denied landing algorithm in `drone_controller.py`.
## Quick Start
1. Edit `drone_controller.py`
2. Find `calculate_landing_maneuver()`
3. Implement your algorithm
4. Test with any mode:
- `python standalone_simulation.py --pattern stationary` (standalone)
- `python run_bridge.py --pattern stationary` (PyBullet + ROS 2)
- `python run_gazebo.py --pattern stationary` (Gazebo + ROS 2)
4. Test: `python standalone_simulation.py`
## GPS-Denied Challenge
## Sensors Available
No GPS available. You must use:
| Sensor | Data |
|--------|------|
| **IMU** | Orientation, angular velocity |
| **Altimeter** | Altitude, vertical velocity |
| **Velocity** | Estimated from optical flow |
| **Camera** | 320x240 downward image (base64 JPEG) |
| **Landing Pad** | Relative position (may be null!) |
| Sensor | Description |
|--------|-------------|
| IMU | Orientation, angular velocity |
| Altimeter | Altitude, vertical velocity |
| Velocity | Estimated velocity (x, y, z) |
| Camera | 320x240 downward image |
| Landing Pad | Relative position (may be null!) |
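
Taken together, these sensors arrive in the controller as one nested telemetry dict. A rough sketch of its shape, with illustrative values (field names are the ones documented below):

```python
# Illustrative telemetry layout (values are made up; field names match the sections below)
telemetry = {
    'imu': {
        'orientation': {'roll': 0.01, 'pitch': -0.02, 'yaw': 1.57},
        'angular_velocity': {'x': 0.0, 'y': 0.0, 'z': 0.0},
    },
    'altimeter': {'altitude': 4.8, 'vertical_velocity': -0.3},
    'velocity': {'x': 0.1, 'y': -0.05, 'z': -0.3},
    'camera': {'image': '<base64-encoded JPEG>'},
    'landing_pad': None,  # dict with relative_x, relative_y, distance, confidence when visible
}
```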
## Function to Implement
@@ -34,55 +29,20 @@ def calculate_landing_maneuver(self, telemetry, rover_telemetry):
## Sensor Data
### IMU
```python
imu = telemetry['imu']
roll = imu['orientation']['roll']
pitch = imu['orientation']['pitch']
yaw = imu['orientation']['yaw']
angular_vel = imu['angular_velocity'] # {x, y, z}
```
# Altitude
altitude = telemetry['altimeter']['altitude']
vertical_vel = telemetry['altimeter']['vertical_velocity']
### Altimeter
```python
altimeter = telemetry['altimeter']
altitude = altimeter['altitude']
vertical_vel = altimeter['vertical_velocity']
```
# Velocity
vel_x = telemetry['velocity']['x']
vel_y = telemetry['velocity']['y']
### Velocity
```python
velocity = telemetry['velocity'] # {x, y, z} in m/s
```
### Camera
The drone has a downward-facing camera providing 320x240 JPEG images.
```python
import base64
from PIL import Image
import io
camera = telemetry['camera']
image_b64 = camera.get('image')
if image_b64:
    image_bytes = base64.b64decode(image_b64)
    image = Image.open(io.BytesIO(image_bytes))
    # Process image for custom vision algorithms
```
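
If you would rather avoid OpenCV, a lightweight alternative (a sketch, not part of the provided code; `bright_centroid` is a hypothetical helper) is to convert the decoded PIL image to a grayscale NumPy array and take the centroid of the brightest pixels:

```python
import numpy as np

def bright_centroid(image, threshold=200):
    # Grayscale view of the 320x240 frame; pick pixels brighter than the threshold
    gray = np.asarray(image.convert('L'))
    ys, xs = np.nonzero(gray > threshold)
    if xs.size == 0:
        return None  # nothing bright enough in view
    # Centroid of the bright region in image coordinates
    return float(xs.mean()), float(ys.mean())
```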
### Landing Pad (Vision)
**Important: May be None if pad not visible!**
```python
landing_pad = telemetry['landing_pad']
if landing_pad is not None:
    relative_x = landing_pad['relative_x']  # body frame
    relative_y = landing_pad['relative_y']  # body frame
    distance = landing_pad['distance']      # vertical
    confidence = landing_pad['confidence']  # 0-1
# Landing Pad (may be None!)
landing_pad = telemetry.get('landing_pad')
if landing_pad:
    relative_x = landing_pad['relative_x']
    relative_y = landing_pad['relative_y']
```
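
Because `relative_x`/`relative_y` are body-frame offsets, you may want to rotate them by the yaw from the IMU before fusing them with anything world-referenced. A minimal sketch, assuming `relative_x` points forward and `relative_y` points right (the axis convention is an assumption, so verify it against the simulator):

```python
import math

def pad_offset_yaw_aligned(landing_pad, yaw):
    # Rotate body-frame pad offsets into a yaw-aligned frame (assumed forward/right axes)
    bx = landing_pad['relative_x']
    by = landing_pad['relative_y']
    wx = bx * math.cos(yaw) - by * math.sin(yaw)
    wy = bx * math.sin(yaw) + by * math.cos(yaw)
    return wx, wy
```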
## Control Output
@@ -112,7 +72,7 @@ def calculate_landing_maneuver(self, telemetry, rover_telemetry):
thrust = 0.5 * (0 - altitude) - 0.3 * vertical_vel
# Horizontal control
if landing_pad is not None:
if landing_pad:
    pitch = 0.3 * landing_pad['relative_x'] - 0.2 * vel_x
    roll = 0.3 * landing_pad['relative_y'] - 0.2 * vel_y
else:
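
Putting the snippets above together, a first-pass `calculate_landing_maneuver()` might look like the sketch below. The hover-and-search fallback, the output clamping, and the exact keys of the returned command dict are assumptions (this hunk does not show the expected return format), so adapt them to your control interface:

```python
def calculate_landing_maneuver(self, telemetry, rover_telemetry):
    altitude = telemetry['altimeter']['altitude']
    vertical_vel = telemetry['altimeter']['vertical_velocity']
    vel_x = telemetry['velocity']['x']
    vel_y = telemetry['velocity']['y']
    landing_pad = telemetry.get('landing_pad')

    if landing_pad:
        # Descend while steering toward the pad (PD terms mirror the example above)
        thrust = 0.5 * (0 - altitude) - 0.3 * vertical_vel
        pitch = 0.3 * landing_pad['relative_x'] - 0.2 * vel_x
        roll = 0.3 * landing_pad['relative_y'] - 0.2 * vel_y
    else:
        # Pad not visible: hold an assumed search altitude and damp horizontal drift
        thrust = 0.5 * (2.0 - altitude) - 0.3 * vertical_vel
        pitch = -0.2 * vel_x
        roll = -0.2 * vel_y

    clamp = lambda v: max(-1.0, min(1.0, v))  # assumed command range
    return {'thrust': clamp(thrust), 'pitch': clamp(pitch), 'roll': clamp(roll)}
```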
@@ -124,84 +84,43 @@ def calculate_landing_maneuver(self, telemetry, rover_telemetry):
## Using the Camera
You can implement custom vision processing on the camera image:
```python
import cv2
import numpy as np
import base64
def process_camera(telemetry):
    camera = telemetry.get('camera', {})
    image_b64 = camera.get('image')
    if not image_b64:
        return None
    # Decode JPEG
camera = telemetry.get('camera', {})
image_b64 = camera.get('image')
if image_b64:
    image_bytes = base64.b64decode(image_b64)
    nparr = np.frombuffer(image_bytes, np.uint8)
    image = cv2.imdecode(nparr, cv2.IMREAD_COLOR)
    # Example: detect green landing pad
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    green_mask = cv2.inRange(hsv, (35, 50, 50), (85, 255, 255))
    # Find contours
    contours, _ = cv2.findContours(green_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        M = cv2.moments(largest)
        if M['m00'] > 0:
            cx = int(M['m10'] / M['m00'])
            cy = int(M['m01'] / M['m00'])
            # cx, cy is center of detected pad in image coordinates
            return (cx, cy)
    return None
    # Process image...
```
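
The `(cx, cy)` pixel centre returned by `process_camera()` can be converted into normalised offsets from the image centre before feeding it into the horizontal controller; the scaling here is a placeholder, since the camera's field of view is not documented in this guide:

```python
IMAGE_W, IMAGE_H = 320, 240  # matches the downward camera resolution

def pixel_to_offset(cx, cy):
    # 0.0 means the pad is centred under the drone; +/-1.0 is the image edge
    offset_x = (cx - IMAGE_W / 2) / (IMAGE_W / 2)
    offset_y = (cy - IMAGE_H / 2) / (IMAGE_H / 2)
    return offset_x, offset_y
```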
## Strategies
### When Pad Not Visible
- Maintain altitude and stabilize
- Search by ascending or spiraling
- Dead-reckon from the last known pad position
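
One way to realise the "search by spiraling" idea is to command a slowly expanding offset around the last known pad position; a minimal sketch with made-up rates:

```python
import math

def spiral_offset(t, growth=0.05, omega=0.8):
    # Expanding spiral around a reference point; tune growth/omega to how
    # quickly the pad tends to leave the camera's field of view
    radius = growth * t
    return radius * math.cos(omega * t), radius * math.sin(omega * t)
```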
### State Machine
1. Search → find pad
2. Approach → move above pad
3. Align → center over pad
4. Descend → controlled descent
5. Land → touch down
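
These five phases map naturally onto a small state machine held on the controller; a sketch with illustrative state names and thresholds (none of these values come from the provided code):

```python
from enum import Enum, auto

class LandingState(Enum):
    SEARCH = auto()
    APPROACH = auto()
    ALIGN = auto()
    DESCEND = auto()
    LAND = auto()

def next_state(state, landing_pad, altitude):
    # Illustrative transitions; the centring and altitude thresholds are guesses
    if landing_pad is None:
        return LandingState.SEARCH
    centred = abs(landing_pad['relative_x']) < 0.2 and abs(landing_pad['relative_y']) < 0.2
    if state == LandingState.SEARCH:
        return LandingState.APPROACH
    if state == LandingState.APPROACH and centred:
        return LandingState.ALIGN
    if state == LandingState.ALIGN and centred:
        return LandingState.DESCEND
    if state == LandingState.DESCEND and altitude < 0.3:
        return LandingState.LAND
    return state
```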
## Testing
```bash
# Easy - stationary rover
# Easy - stationary
python standalone_simulation.py --pattern stationary
# Medium - slow circular movement
python standalone_simulation.py --pattern circular --speed 0.2
# Medium - circular
python standalone_simulation.py --pattern circular --speed 0.3
# Hard - faster random movement
python standalone_simulation.py --pattern random --speed 0.3
# With ROS 2 (Gazebo)
ros2 launch gazebo/launch/drone_landing.launch.py # Terminal 1
python run_gazebo.py --pattern circular # Terminal 2
# Hard - random
python standalone_simulation.py --pattern random --speed 0.5
```
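
To run all three difficulty levels back to back, a small driver script can shell out to the same commands (a convenience wrapper, not part of the repository):

```python
import subprocess

RUNS = [
    ['--pattern', 'stationary'],
    ['--pattern', 'circular', '--speed', '0.3'],
    ['--pattern', 'random', '--speed', '0.5'],
]

for extra_args in RUNS:
    # Same invocations as the examples above, easy/medium/hard in order
    subprocess.run(['python', 'standalone_simulation.py', *extra_args], check=True)
```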
## Configuration
Edit `config.py` to tune controller gains:
Edit `config.py`:
```python
CONTROLLER = {
"Kp_z": 0.5, # Altitude proportional gain
"Kd_z": 0.3, # Altitude derivative gain
"Kp_xy": 0.3, # Horizontal proportional gain
"Kd_xy": 0.2, # Horizontal derivative gain
"Kp_z": 0.5, # Altitude proportional
"Kd_z": 0.3, # Altitude derivative
"Kp_xy": 0.3, # Horizontal proportional
"Kd_xy": 0.2, # Horizontal derivative
}
```
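
A sketch of how these gains might be consumed inside the controller, assuming `config.py` is importable from `drone_controller.py` (the helper names here are illustrative):

```python
from config import CONTROLLER

def altitude_thrust(altitude, vertical_vel, target_altitude=0.0):
    # PD altitude term built from the tunable gains above
    return CONTROLLER['Kp_z'] * (target_altitude - altitude) - CONTROLLER['Kd_z'] * vertical_vel

def horizontal_command(offset, velocity):
    # PD horizontal term: offset toward the pad, damped by current velocity
    return CONTROLLER['Kp_xy'] * offset - CONTROLLER['Kd_xy'] * velocity
```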