ArduPilot SITL Update

This commit is contained in:
2026-01-04 00:24:46 +00:00
parent 6c72bbf24c
commit 6804180e21
20 changed files with 2138 additions and 2970 deletions

View File

@@ -1,176 +1,94 @@
# Architecture Overview
GPS-denied drone landing simulation with multiple operation modes.
## Operation Modes
### 1. Standalone Mode (Any Platform)
Single-process simulation; no ROS 2 or networking required:
```bash
python standalone_simulation.py --pattern circular
```
```
┌──────────────────────────────────────────┐
│         standalone_simulation.py         │
│  ┌────────────────────────────────────┐  │
│  │  PyBullet Physics + Camera         │  │
│  │  Built-in Landing Controller       │  │
│  │  Rover Movement Patterns           │  │
│  │  Configuration from config.py      │  │
│  └────────────────────────────────────┘  │
└──────────────────────────────────────────┘
```
### 2. PyBullet + ROS 2 Mode (Two Terminals)
**Terminal 1:**
```bash
python simulation_host.py
```
**Terminal 2:**
```bash
python run_bridge.py --pattern circular
```
```
Terminal 1                      Terminal 2
┌──────────────────┐        ┌──────────────────────────┐
│ simulation_host  │◄─UDP──►│ run_bridge.py            │
│ (PyBullet)       │        │  ┌────────────────────┐  │
│ Port 5555        │        │  │ ROS2SimulatorBridge│  │
│                  │        │  │ DroneController    │  │
│                  │        │  │ RoverController    │  │
└──────────────────┘        │  └────────────────────┘  │
                            └──────────────────────────┘
```
Data flow:
- RoverController publishes position → Bridge sends it to the Simulator
- Simulator moves the rover visually AND sends back telemetry
- DroneController receives telemetry, publishes commands
- Bridge forwards commands to the Simulator
### 3. Gazebo + ROS 2 Mode (Two Terminals, Linux/WSL2)
**Terminal 1:**
```bash
ros2 launch gazebo/launch/drone_landing.launch.py
```
**Terminal 2:**
```bash
python run_gazebo.py --pattern circular
```
```
Terminal 1                           Terminal 2
┌───────────────────────────┐        ┌──────────────────────────┐
│ ros2 launch ... .launch.py│        │ run_gazebo.py            │
│  ┌─────────────────────┐  │        │  ┌────────────────────┐  │
│  │ Gazebo (ign gazebo) │  │        │  │ GazeboBridge       │  │
│  │  - Drone (vel ctrl) │  │◄─ROS──►│  │ DroneController    │  │
│  │  - Rover (vel ctrl) │  │        │  │ RoverController    │  │
│  ├─────────────────────┤  │        │  └────────────────────┘  │
│  │ ros_gz_bridge       │  │        └──────────────────────────┘
│  └─────────────────────┘  │
└───────────────────────────┘
```
Data flow:
- RoverController publishes to `/rover/cmd_vel` → Gazebo moves the rover
- Gazebo publishes odometry → GazeboBridge converts it to telemetry
- DroneController receives telemetry, publishes to `/cmd_vel`
- GazeboBridge forwards to `/drone/cmd_vel` → Gazebo moves the drone
### 4. ArduPilot SITL + Gazebo Mode (Linux/WSL2)
Integrated launch via `ardupilot_gz` (two terminals):
**Terminal 1:**
```bash
ros2 launch ardupilot_gz_bringup iris_runway.launch.py
```
**Terminal 2:**
```bash
mavproxy.py --console --map --master=:14550
```
```
┌─────────────────────────────────────────────┐
│            Single Launch Command            │
│        (Starts SITL + Gazebo + RViz)        │
├─────────────────────────────────────────────┤
│   ArduPilot SITL ◄──► Gazebo ◄──► ROS 2     │
│         ▲                                   │
│         │ /ap/* topics                      │
│         ▼                                   │
│   MAVProxy (GCS)                            │
└─────────────────────────────────────────────┘
```
Manual setup with `run_ardupilot.py` (three terminals):
```
Terminal 1          Terminal 2             Terminal 3
┌──────────────┐    ┌─────────────────┐    ┌────────────────────────┐
│ Gazebo +     │    │ ArduPilot SITL  │    │ run_ardupilot.py       │
│ ArduPilot    │◄──►│ sim_vehicle.py  │    │  ┌──────────────────┐  │
│ Plugin       │JSON│ + MAVProxy      │◄──►│  │ MAVLinkBridge    │  │
│              │    │                 │UDP │  │ DroneController  │  │
│ ardupilot_   │    │ Flight Control  │    │  │ RoverController  │  │
│ drone.sdf    │    │ + GCS           │    │  └──────────────────┘  │
└──────────────┘    └─────────────────┘    └────────────────────────┘
```
Data flow:
- ArduPilot SITL sends motor commands → Gazebo plugin controls the drone
- Gazebo plugin sends sensor data → ArduPilot SITL for state estimation
- MAVProxy outputs telemetry → MAVLinkBridge converts it to ROS telemetry
- DroneController receives telemetry, publishes velocity commands
- MAVLinkBridge sends MAVLink commands → ArduPilot SITL executes them
Key differences from the simple Gazebo mode:
- Full ArduPilot flight controller (EKF, stabilization, failsafes)
- Real MAVLink protocol for commands and telemetry
- Support for all ArduPilot flight modes (GUIDED, LAND, etc.)
- Arming checks and safety features
- Compatible with ground control stations (QGroundControl, Mission Planner)
## Key Components
| File | Description |
|------|-------------|
| `config.py` | Central configuration (positions, physics, gains) |
| `standalone_simulation.py` | All-in-one simulation |
| `simulation_host.py` | PyBullet physics server (UDP) |
| `run_bridge.py` | PyBullet bridge + controllers |
| `run_gazebo.py` | Gazebo bridge + controllers |
| `run_ardupilot.py` | **ArduPilot SITL** + MAVLink bridge |
| `mavlink_bridge.py` | MAVLink ↔ ROS 2 bridge |
| `drone_controller.py` | **Your landing algorithm** |
| `rover_controller.py` | Moving landing pad |
| `camera_viewer.py` | Camera display |
| `ros_bridge.py` | ROS-UDP bridge (used by `run_bridge.py`) |
| `gazebo_bridge.py` | Gazebo-ROS bridge (used by `run_gazebo.py`) |
| `gazebo/launch/drone_landing.launch.py` | ROS 2 launch file for Gazebo |
| `gazebo/launch/ardupilot_drone.launch.py` | ROS 2 launch file for ArduPilot |
| `gazebo/worlds/drone_landing.sdf` | Gazebo world with simple velocity control |
| `gazebo/worlds/ardupilot_drone.sdf` | Gazebo world with ArduPilot plugin |
## ROS 2 Topics
| Topic | Type | Description |
|-------|------|-------------|
| `/cmd_vel` | `Twist` | Drone commands from DroneController |
| `/drone/cmd_vel` | `Twist` | Drone commands to Gazebo |
| `/drone/telemetry` | `String` | GPS-denied sensor data (JSON) |
| `/drone/camera` | `Image` | Downward camera frames |
| `/rover/cmd_vel` | `Twist` | Rover velocity to simulator |
| `/rover/telemetry` | `String` | Rover position (JSON) |
## Network Configuration
All components default to `0.0.0.0` for network accessibility.
### Remote Setup (PyBullet mode)
**Machine 1 (with display):**
```bash
python simulation_host.py # Listens on 0.0.0.0:5555
```
**Machine 2 (headless controller):**
```bash
python run_bridge.py --host 192.168.1.100
```
### UDP Ports
| Port | Direction | Content |
|------|-----------|---------|
| 5555 | Bridge → Simulator | Commands (JSON) |
| 5556 | Simulator → Bridge | Telemetry (JSON) |
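A minimal sketch of traffic on these ports, assuming the command payload matches the JSON format in the protocol guide (the exact wrapper fields used by `simulation_host.py` and `run_bridge.py` may differ); run it only when the bridge itself is not already bound to port 5556:
```python
import json
import socket

# Hypothetical illustration of the UDP protocol; field names follow the
# Drone Commands table in the protocol guide, not the bridge's actual wrapper.
SIM_HOST = "127.0.0.1"

command = {"thrust": 0.1, "pitch": 0.0, "roll": 0.0, "yaw": 0.0}

# Send one command to the simulator (commands port 5555).
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(json.dumps(command).encode("utf-8"), (SIM_HOST, 5555))

# Receive one telemetry packet (telemetry port 5556).
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("0.0.0.0", 5556))
packet, _ = rx.recvfrom(65535)
print(json.loads(packet.decode("utf-8")).get("altimeter"))
```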
## GPS-Denied Sensors
All modes provide the same sensor data:
| Sensor | Data |
|--------|------|
| **IMU** | Orientation (roll, pitch, yaw), angular velocity |
| **Altimeter** | Altitude above ground, vertical velocity |
| **Velocity** | Estimated velocity (x, y, z) |
| **Camera** | 320x240 downward JPEG (base64) |
| **Landing Pad** | Relative position (x, y, distance) when visible |
## Configuration (config.py)
| Section | Parameters |
|---------|------------|
| `DRONE` | mass, size, color, start_position, thrust/torque scales |
| `ROVER` | size, color, start_position, default_pattern, default_speed |
| `CAMERA` | width, height, fov, jpeg_quality |
| `PHYSICS` | gravity, timestep, telemetry_rate |
| `CONTROLLER` | Kp_z, Kd_z, Kp_xy, Kd_xy, rate |
| `LANDING` | success_distance, success_velocity, height_threshold |
| `NETWORK` | host, command_port, telemetry_port |
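To illustrate how the `LANDING` parameters are typically consumed, here is a hedged sketch of a touchdown check; the threshold values and the helper itself are placeholders, and the real check lives in the simulation code:
```python
import math

# Hypothetical values; the real numbers live in config.py's LANDING section.
LANDING = {"success_distance": 0.3, "success_velocity": 0.5, "height_threshold": 0.25}


def landing_succeeded(drone_pos, pad_pos, vertical_velocity):
    """True when the drone is over the pad, low enough, and descending gently."""
    horizontal = math.hypot(drone_pos[0] - pad_pos[0], drone_pos[1] - pad_pos[1])
    return (horizontal <= LANDING["success_distance"]
            and abs(vertical_velocity) <= LANDING["success_velocity"]
            and drone_pos[2] - pad_pos[2] <= LANDING["height_threshold"])
```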
## Platform Support
| Mode | Ubuntu | Arch | macOS | Windows | WSL2 |
|------|--------|------|-------|---------|------|
| Standalone | ✅ | ✅ | ✅ | ✅ | ✅ |
| PyBullet+ROS | ✅ | ⚠️ | ❌ | ❌ | ✅ |
| Gazebo+ROS | ✅ | ⚠️ | ❌ | ❌ | ✅ |
| ArduPilot SITL | ✅ | ⚠️ | ❌ | ❌ | ✅ |
## ArduPilot Topics
In ArduPilot SITL mode, these are published natively over DDS:
| Topic | Type |
|-------|------|
| `/ap/pose/filtered` | Position |
| `/ap/twist/filtered` | Velocity |
| `/ap/imu/filtered` | IMU |
| `/ap/battery` | Battery |
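A minimal sketch of reading one of these topics from Python; it assumes `/ap/pose/filtered` carries `geometry_msgs/PoseStamped` (see the ArduPilot guide) and that ArduPilot's DDS topics use best-effort QoS:
```python
import rclpy
from geometry_msgs.msg import PoseStamped
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy


class ApPoseListener(Node):
    """Print the filtered pose that ArduPilot publishes over DDS."""

    def __init__(self):
        super().__init__("ap_pose_listener")
        # Assumption: best-effort QoS matches the publisher's settings.
        qos = QoSProfile(depth=10, reliability=ReliabilityPolicy.BEST_EFFORT)
        self.create_subscription(PoseStamped, "/ap/pose/filtered", self.on_pose, qos)

    def on_pose(self, msg: PoseStamped):
        p = msg.pose.position
        self.get_logger().info(f"pose: x={p.x:.2f} y={p.y:.2f} z={p.z:.2f}")


def main():
    rclpy.init()
    rclpy.spin(ApPoseListener())


if __name__ == "__main__":
    main()
```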

View File

@@ -1,285 +1,101 @@
# ArduPilot SITL + Gazebo Integration
This guide explains how to run the drone simulation with ArduPilot Software-In-The-Loop (SITL) and MAVProxy, providing a realistic flight controller stack.
## Overview
The ArduPilot integration replaces the simple velocity control with a full ArduPilot flight stack:
```
┌──────────────────┐     ┌─────────────────┐     ┌──────────────────┐
│  ArduPilot SITL  │◄───►│ Gazebo + Plugin │◄───►│  MAVLink Bridge  │
│ (Flight Control) │ JSON│  (Physics Sim)  │ ROS │  + Controllers   │
└──────────────────┘     └─────────────────┘     └──────────────────┘
         ▲                                                 │
         │ UDP                                             │
         │                                                 ▼
┌──────────────────┐                             ┌──────────────────┐
│     MAVProxy     │◄───────────────────────────►│ DroneController  │
│      (GCS)       │      MAVLink Commands       │ (Your Algorithm) │
└──────────────────┘                             └──────────────────┘
```
## Components
| Component | Description |
|-----------|-------------|
| **ArduPilot SITL** | Full autopilot firmware running in simulation |
| **ardupilot_gazebo** | Plugin connecting Gazebo physics to ArduPilot |
| **MAVProxy** | Ground Control Station for monitoring/commands |
| **MAVLink Bridge** | ROS 2 node bridging MAVLink ↔ ROS topics |
| **Drone Controller** | Your landing algorithm |
## Prerequisites
### 1. ArduPilot SITL
Install ArduPilot development environment:
```bash
# Ubuntu/Debian
git clone https://github.com/ArduPilot/ardupilot.git ~/ardupilot
cd ~/ardupilot
git submodule update --init --recursive
Tools/environment_install/install-prereqs-ubuntu.sh -y
. ~/.profile
# Set environment
echo 'export PATH=$PATH:$HOME/ardupilot/Tools/autotest' >> ~/.bashrc
echo 'export ARDUPILOT_HOME=$HOME/ardupilot' >> ~/.bashrc
source ~/.bashrc
```
### 2. ArduPilot Gazebo Plugin
Install the ardupilot_gazebo plugin:
```bash
# For Gazebo Garden/Harmonic
git clone https://github.com/ArduPilot/ardupilot_gazebo.git ~/ardupilot_gazebo
cd ~/ardupilot_gazebo
mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make -j4
# Add to Gazebo plugin path
echo 'export GZ_SIM_SYSTEM_PLUGIN_PATH=$HOME/ardupilot_gazebo/build:$GZ_SIM_SYSTEM_PLUGIN_PATH' >> ~/.bashrc
echo 'export GZ_SIM_RESOURCE_PATH=$HOME/ardupilot_gazebo/models:$HOME/ardupilot_gazebo/worlds:$GZ_SIM_RESOURCE_PATH' >> ~/.bashrc
source ~/.bashrc
```
### 3. pymavlink
```bash
pip install pymavlink
```
## Quick Start
### Option 1: Integrated Launch (Recommended)
**Terminal 1 - Simulation:**
```bash
source ~/ardu_ws/install/setup.bash
ros2 launch ardupilot_gz_bringup iris_runway.launch.py
```
**Terminal 2 - Control:**
```bash
mavproxy.py --console --map --master=:14550
```
### Option 2: Manual Setup
```bash
# Terminal 1: Start Gazebo (via the launch file, or directly: gz sim -r gazebo/worlds/ardupilot_drone.sdf)
ros2 launch gazebo/launch/ardupilot_drone.launch.py
# Terminal 2: Start ArduPilot SITL
cd ~/ardupilot
sim_vehicle.py -v ArduCopter -f gazebo-iris --model JSON --console --map
# Terminal 3: Run MAVLink bridge + controllers
python run_ardupilot.py --no-sitl --pattern circular
```
### Option 3: Full Automatic
```bash
# Starts everything (requires SITL installed)
python run_ardupilot.py --pattern circular --console --map
```
## Installation
```bash
./setup/install_ardupilot.sh
source ~/.bashrc
```
This installs:
- ArduPilot SITL with DDS
- Gazebo with ardupilot_gz
- MAVProxy
## Flight Operations
### Using MAVProxy Commands
Once connected, use MAVProxy to control the drone:
```bash
# Set GUIDED mode for algorithm control
mode guided
# Arm motors
arm throttle
# Take off to 5 meters
takeoff 5
# Land
mode land
# Disarm
disarm
```
### Using the MAVLink Bridge API
From Python, you can control the drone directly:
```python
from mavlink_bridge import MAVLinkBridge

# Create bridge
bridge = MAVLinkBridge(sitl_port=14550)

# Arm and take off
bridge.set_mode('GUIDED')
bridge.arm()
bridge.takeoff(altitude=5.0)

# Land
bridge.land()
```
## ROS 2 Topics
ArduPilot publishes native ROS 2 topics:
```bash
# List topics
ros2 topic list
# View position
ros2 topic echo /ap/geopose/filtered
# View battery
ros2 topic echo /ap/battery
```
| Topic | Type |
|-------|------|
| `/ap/pose/filtered` | PoseStamped |
| `/ap/twist/filtered` | TwistStamped |
| `/ap/imu/filtered` | Imu |
| `/ap/battery` | BatteryState |
## Files
| File | Description |
|------|-------------|
| `mavlink_bridge.py` | ROS 2 ↔ MAVLink bridge |
| `run_ardupilot.py` | Integrated launcher |
| `gazebo/worlds/ardupilot_drone.sdf` | Gazebo world with ArduPilot plugin |
| `gazebo/launch/ardupilot_drone.launch.py` | ROS 2 launch file |
## Available Worlds
```bash
# Iris on runway
ros2 launch ardupilot_gz_bringup iris_runway.launch.py
# Iris in maze
ros2 launch ardupilot_gz_bringup iris_maze.launch.py
# Rover
ros2 launch ardupilot_gz_bringup wildthumper_playpen.launch.py
```
## Configuration
Edit `config.py` to adjust ArduPilot settings:
```python
ARDUPILOT = {
    "vehicle": "ArduCopter",   # ArduCopter, ArduPlane, APMrover2
    "frame": "gazebo-iris",    # Gazebo frame
    "sitl_host": "127.0.0.1",
    "sitl_port": 5760,
    "mavproxy_port": 14550,
}

MAVLINK = {
    "system_id": 1,
    "component_id": 191,
    "heartbeat_timeout": 5.0,
}
```
## Telemetry Format
The MAVLink bridge publishes telemetry in the same format as the other modes:
```json
{
  "imu": {
    "orientation": {"roll": 0.0, "pitch": 0.0, "yaw": 0.0},
    "angular_velocity": {"x": 0.0, "y": 0.0, "z": 0.0}
  },
  "altimeter": {
    "altitude": 5.0,
    "vertical_velocity": 0.0
  },
  "velocity": {"x": 0.0, "y": 0.0, "z": 0.0},
  "position": {"x": 0.0, "y": 0.0, "z": 5.0},
  "landing_pad": {
    "relative_x": 0.5,
    "relative_y": 0.2,
    "distance": 4.8,
    "confidence": 0.95
  },
  "battery": {"voltage": 12.6, "remaining": 100},
  "armed": true,
  "flight_mode": "GUIDED",
  "connected": true
}
```
## Using the Launcher
```bash
python run_ardupilot.py --world runway
python run_ardupilot.py --world maze
python run_ardupilot.py --vehicle rover
```
## Troubleshooting
### SITL Not Starting
```bash
# Check if SITL is installed
which sim_vehicle.py
# Set ArduPilot path
export ARDUPILOT_HOME=~/ardupilot
export PATH=$PATH:$ARDUPILOT_HOME/Tools/autotest
```
### No ROS 2 Topics
```bash
# Check that DDS is enabled (in MAVProxy)
param set DDS_ENABLE 1
```
### Gazebo Plugin Not Found
```bash
# Check plugin path
echo $GZ_SIM_SYSTEM_PLUGIN_PATH
# Verify plugin exists
ls ~/ardupilot_gazebo/build/libArduPilotPlugin.so
```
### No MAVLink Connection
```bash
# Check if SITL is listening
netstat -tuln | grep 14550
# Test with mavlink console
python -c "from pymavlink import mavutil; c = mavutil.mavlink_connection('udpin:127.0.0.1:14550'); print(c.wait_heartbeat())"
```
### Drone Won't Arm
Common issues:
1. **Pre-arm checks failing** - Check MAVProxy console for errors
2. **GPS required** - In simulation, you may need to wait for GPS lock
3. **EKF not ready** - Wait for EKF to initialize
Disable pre-arm checks for testing (not recommended for real flights):
```
# In MAVProxy: disable pre-arm checks (simulation only)
param set ARMING_CHECK 0
```
## Flight Modes
| Mode | Description |
|------|-------------|
| **GUIDED** | Accept velocity/position commands from controller |
| **LOITER** | Hold position (GPS required) |
| **ALT_HOLD** | Maintain altitude, manual horizontal |
| **LAND** | Automatic landing |
| **STABILIZE** | Attitude stabilization only |
For autonomous landing, use **GUIDED** mode.
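In GUIDED mode the flight controller accepts velocity setpoints over MAVLink. The sketch below shows that flow with plain pymavlink, independent of `mavlink_bridge.py`; it assumes SITL exposes a spare MAVLink output on UDP 14551 (adjust to your setup), and the speed and duration are placeholders:
```python
import time

from pymavlink import mavutil

# Connect to a SITL MAVLink output (14551 is commonly the second output;
# 14550 is usually taken by MAVProxy).
master = mavutil.mavlink_connection("udpin:0.0.0.0:14551")
master.wait_heartbeat()

# Switch to GUIDED and arm.
master.set_mode(master.mode_mapping()["GUIDED"])
master.arducopter_arm()
master.motors_armed_wait()

# Stream a gentle forward velocity setpoint for ~3 seconds.
type_mask = 0b0000110111000111  # use velocity only; ignore position, accel, yaw
for _ in range(30):
    master.mav.set_position_target_local_ned_send(
        0,                                   # time_boot_ms
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,
        type_mask,
        0, 0, 0,                             # x, y, z position (ignored)
        0.5, 0.0, 0.0,                       # vx, vy, vz in m/s
        0, 0, 0,                             # accelerations (ignored)
        0, 0,                                # yaw, yaw rate (ignored)
    )
    time.sleep(0.1)
```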
## Architecture Comparison
| Feature | Simple Gazebo | ArduPilot + Gazebo |
|---------|--------------|-------------------|
| Flight Controller | Velocity control | Full ArduPilot |
| Stabilization | Manual PD | Inbuilt EKF + PID |
| Flight Modes | None | All ArduPilot modes |
| Arming | Not required | Safety checks |
| Failsafes | None | Battery, GPS, etc. |
| MAVLink | No | Full protocol |
| GCS Support | No | QGC, Mission Planner |
| Realism | Low | High |

View File

@@ -1,28 +1,23 @@
# DroneController Guide (GPS-Denied)
Implement your GPS-denied landing algorithm in `drone_controller.py`.
## Quick Start
1. Edit `drone_controller.py`
2. Find `calculate_landing_maneuver()`
3. Implement your algorithm
4. Test with any mode:
- `python standalone_simulation.py --pattern stationary` (standalone)
- `python run_bridge.py --pattern stationary` (PyBullet + ROS 2)
- `python run_gazebo.py --pattern stationary` (Gazebo + ROS 2)
## GPS-Denied Challenge
No GPS is available. You must use:
| Sensor | Data |
|--------|------|
| **IMU** | Orientation, angular velocity |
| **Altimeter** | Altitude, vertical velocity |
| **Velocity** | Estimated from optical flow |
| **Camera** | 320x240 downward image (base64 JPEG) |
| **Landing Pad** | Relative position (may be null!) |
## Function to Implement
@@ -34,55 +29,20 @@ def calculate_landing_maneuver(self, telemetry, rover_telemetry):
## Sensor Data
### IMU
```python
imu = telemetry['imu']
roll = imu['orientation']['roll']
pitch = imu['orientation']['pitch']
yaw = imu['orientation']['yaw']
angular_vel = imu['angular_velocity']  # {x, y, z}
```
### Altimeter
```python
altimeter = telemetry['altimeter']
altitude = altimeter['altitude']
vertical_vel = altimeter['vertical_velocity']
```
### Velocity
```python
velocity = telemetry['velocity']  # {x, y, z} in m/s
```
### Camera
The drone has a downward-facing camera providing 320x240 JPEG images.
```python
import base64
import io

from PIL import Image

camera = telemetry['camera']
image_b64 = camera.get('image')
if image_b64:
    image_bytes = base64.b64decode(image_b64)
    image = Image.open(io.BytesIO(image_bytes))
    # Process image for custom vision algorithms
```
### Landing Pad (Vision)
**Important: May be None if pad not visible!**
```python
landing_pad = telemetry['landing_pad']
if landing_pad is not None:
    relative_x = landing_pad['relative_x']  # body frame
    relative_y = landing_pad['relative_y']  # body frame
    distance = landing_pad['distance']      # vertical
    confidence = landing_pad['confidence']  # 0-1
```
## Control Output
@@ -112,7 +72,7 @@ def calculate_landing_maneuver(self, telemetry, rover_telemetry):
thrust = 0.5 * (0 - altitude) - 0.3 * vertical_vel
# Horizontal control
if landing_pad is not None:
    pitch = 0.3 * landing_pad['relative_x'] - 0.2 * vel_x
    roll = 0.3 * landing_pad['relative_y'] - 0.2 * vel_y
else:
@@ -124,84 +84,43 @@ def calculate_landing_maneuver(self, telemetry, rover_telemetry):
## Using the Camera
You can implement custom vision processing on the camera image:
```python
import base64

import cv2
import numpy as np


def process_camera(telemetry):
    camera = telemetry.get('camera', {})
    image_b64 = camera.get('image')
    if not image_b64:
        return None
    # Decode JPEG
    image_bytes = base64.b64decode(image_b64)
    nparr = np.frombuffer(image_bytes, np.uint8)
    image = cv2.imdecode(nparr, cv2.IMREAD_COLOR)
    # Example: detect green landing pad
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    green_mask = cv2.inRange(hsv, (35, 50, 50), (85, 255, 255))
    # Find contours
    contours, _ = cv2.findContours(green_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        M = cv2.moments(largest)
        if M['m00'] > 0:
            cx = int(M['m10'] / M['m00'])
            cy = int(M['m01'] / M['m00'])
            # (cx, cy) is the center of the detected pad in image coordinates
            return (cx, cy)
    return None
```
## Strategies
### When Pad Not Visible
- Maintain altitude and stabilize
- Search by ascending or spiraling
- Dead reckoning from last known position
### State Machine
1. Search → find pad
2. Approach → move above pad
3. Align → center over pad
4. Descend → controlled descent
5. Land → touch down
## Testing
```bash
# Easy - stationary rover
python standalone_simulation.py --pattern stationary
# Medium - slow circular movement
python standalone_simulation.py --pattern circular --speed 0.3
# Hard - faster random movement
python standalone_simulation.py --pattern random --speed 0.5
# With ROS 2 (Gazebo)
ros2 launch gazebo/launch/drone_landing.launch.py   # Terminal 1
python run_gazebo.py --pattern circular             # Terminal 2
```
## Configuration
Edit `config.py` to tune controller gains:
```python
CONTROLLER = {
"Kp_z": 0.5, # Altitude proportional gain
"Kd_z": 0.3, # Altitude derivative gain
"Kp_xy": 0.3, # Horizontal proportional gain
"Kd_xy": 0.2, # Horizontal derivative gain
"Kp_z": 0.5, # Altitude proportional
"Kd_z": 0.3, # Altitude derivative
"Kp_xy": 0.3, # Horizontal proportional
"Kd_xy": 0.2, # Horizontal derivative
}
```

View File

@@ -1,159 +1,72 @@
# Gazebo Simulation Guide
Running the GPS-denied drone simulation with Gazebo Ignition Fortress on Linux/WSL2.
## Quick Start (Two Terminals)
**Terminal 1 - Launch Gazebo + Bridge:**
```bash
source activate.sh
ros2 launch gazebo/launch/drone_landing.launch.py
```
**Terminal 2 - Run Controllers:**
```bash
source activate.sh
python run_gazebo.py --pattern circular --speed 0.3
```
Both the drone AND rover will move!
## Command Options
```bash
python run_gazebo.py --help
python run_gazebo.py --pattern circular --speed 0.5

Options:
  --pattern, -p    stationary, linear, circular, square, random
  --speed, -s      Rover speed in m/s (default: 0.5)
  --amplitude, -a  Movement amplitude/radius in meters (default: 2.0)
  --no-rover       Disable rover controller
```
## How It Works
1. **Gazebo** runs the physics simulation with:
- Drone with `VelocityControl` plugin (responds to `/drone/cmd_vel`)
- Rover with `VelocityControl` plugin (responds to `/rover/cmd_vel`)
2. **ros_gz_bridge** connects Gazebo topics to ROS 2
3. **run_gazebo.py** starts:
- `GazeboBridge` - converts ROS topics to telemetry format
- `DroneController` - your landing algorithm
- `RoverController` - moves the landing pad
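To make the wiring concrete, here is a minimal controller-side sketch that consumes the JSON telemetry and publishes velocity commands. It is illustrative only and does not mirror the internals of `run_gazebo.py`; the 3 m setpoint and gain are placeholders:
```python
import json

import rclpy
from geometry_msgs.msg import Twist
from rclpy.node import Node
from std_msgs.msg import String


class MiniController(Node):
    """Toy hover controller: holds roughly 3 m altitude."""

    def __init__(self):
        super().__init__("mini_controller")
        self.cmd_pub = self.create_publisher(Twist, "/cmd_vel", 10)
        self.create_subscription(String, "/drone/telemetry", self.on_telemetry, 10)

    def on_telemetry(self, msg: String):
        telemetry = json.loads(msg.data)
        altitude = telemetry["altimeter"]["altitude"]
        cmd = Twist()
        cmd.linear.z = 0.5 * (3.0 - altitude)  # simple proportional climb/descend
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(MiniController())


if __name__ == "__main__":
    main()
```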
## GPS-Denied Sensors
The `GazeboBridge` provides the same sensor interface as PyBullet:
| Sensor | Source |
|--------|--------|
| IMU | Gazebo odometry orientation |
| Altimeter | Gazebo Z position |
| Velocity | Gazebo twist |
| Camera | Gazebo camera sensor (if enabled) |
| Landing Pad | Computed from relative position |
## Topics
### ROS 2 Topics (your code uses these)
| Topic | Type | Direction |
|-------|------|-----------|
| `/cmd_vel` | `Twist` | Input (from DroneController) |
| `/drone/telemetry` | `String` | Output (to DroneController) |
| `/drone/camera` | `Image` | Output (camera frames) |
| `/rover/telemetry` | `String` | Output (rover position) |
### Gazebo Topics (bridged automatically)
| Topic | Type | Description |
|-------|------|-------------|
| `/drone/cmd_vel` | `Twist` | Drone velocity commands |
| `/rover/cmd_vel` | `Twist` | Rover velocity commands |
| `/model/drone/odometry` | `Odometry` | Drone state |
| `/drone/imu` | `IMU` | IMU sensor data |
| `/clock` | `Clock` | Simulation time |
## View Camera
```bash
python camera_viewer.py --topic /drone/camera
```
## Headless Mode (WSL2 / No GPU)
Run Gazebo without a GUI:
```bash
# Server mode only
ign gazebo -s gazebo/worlds/drone_landing.sdf
```
Then run the bridge manually:
```bash
ros2 run ros_gz_bridge parameter_bridge \
  /drone/cmd_vel@geometry_msgs/msg/Twist]ignition.msgs.Twist \
  /rover/cmd_vel@geometry_msgs/msg/Twist]ignition.msgs.Twist \
  /model/drone/odometry@nav_msgs/msg/Odometry[ignition.msgs.Odometry
```
## World File Details
The world file `gazebo/worlds/drone_landing.sdf` includes:
- **Drone** at (0, 0, 2) with:
- `VelocityControl` plugin for movement
- `OdometryPublisher` plugin for telemetry
- IMU sensor
- **Landing Pad (Rover)** at (0, 0, 0.15) with:
- `VelocityControl` plugin for movement
- Visual H marker
## Troubleshooting
### Drone falls immediately
The drone should hover with the controller running. If it falls:
1. Check that `run_gazebo.py` is running
2. Verify the bridge shows "Passing message from ROS"
3. Check the command topic: `ros2 topic echo /drone/cmd_vel`
### Rover doesn't move
1. Check that `/rover/cmd_vel` is bridged
2. Verify RoverController is publishing: `ros2 topic echo /rover/cmd_vel`
### Model not found
Set the model path:
```bash
export GZ_SIM_RESOURCE_PATH=$PWD/gazebo/models:$GZ_SIM_RESOURCE_PATH
export IGN_GAZEBO_RESOURCE_PATH=$PWD/gazebo/models:$IGN_GAZEBO_RESOURCE_PATH
```
### "Cannot connect to display" (WSL2)
Use headless mode:
```bash
ign gazebo -s gazebo/worlds/drone_landing.sdf
```
Or ensure WSLg is working:
```bash
export DISPLAY=:0
```
### Plugin not found
For Ignition Fortress, plugins use `libignition-gazebo-*-system.so` naming.
Check available plugins:
```bash
ls /usr/lib/x86_64-linux-gnu/ign-gazebo-6/plugins/
```
## Launch File Options
```bash
ros2 launch gazebo/launch/drone_landing.launch.py use_sim_time:=true
```
| Argument | Default | Description |
|----------|---------|-------------|
| `use_sim_time` | `true` | Use Gazebo clock |

View File

@@ -1,503 +1,125 @@
# Installation Guide
Setup instructions for all supported platforms.
## Quick Install
```bash
# Ubuntu/Debian
./setup/install_ubuntu.sh
source activate.sh
# Test
python standalone_simulation.py
```
## Install Scripts
| Platform | Command |
|----------|---------|
| Ubuntu/Debian | `./setup/install_ubuntu.sh` |
| Ubuntu + ArduPilot | `./setup/install_ubuntu.sh --with-ardupilot` |
| ArduPilot SITL | `./setup/install_ardupilot.sh` |
| Arch Linux | `./setup/install_arch.sh` |
| macOS | `./setup/install_macos.sh` |
| Windows | `.\setup\install_windows.ps1` |
After installation:
```bash
source activate.sh # Linux/macOS
. .\activate.ps1 # Windows PowerShell
python standalone_simulation.py
```
---
## Platform Compatibility
| Feature | Ubuntu | Arch | macOS | Windows | WSL2 |
|---------|--------|------|-------|---------|------|
| **Standalone Simulation** | ✅ | ✅ | ✅ | ✅ | ✅ |
| **ROS 2** | ✅ | ⚠️ AUR | ❌ | ❌ | ✅ |
| **Gazebo** | ✅ | ⚠️ AUR | ❌ | ❌ | ✅ |
| **ArduPilot SITL** | ✅ | ⚠️ Manual | ❌ | ❌ | ✅ |
| **Full Mode** | ✅ | ⚠️ | ❌ | ❌ | ✅ |
| **GUI Support** | ✅ | ✅ | ✅ | ✅ | ✅ WSLg |
**Legend:**
- ✅ Fully supported
- ⚠️ Available but requires extra setup
- ❌ Not supported
**Recommendation for Windows users:** Use WSL2 for the full experience (ROS 2 + Gazebo).
---
## Ubuntu / Debian
**Tested on:** Ubuntu 22.04 (Jammy), Ubuntu 24.04 (Noble)
```bash
# Run installer
./setup/install_ubuntu.sh
# Activate environment
source activate.sh
# Run simulation
python standalone_simulation.py
```
**Installs:**
- ROS 2 (Humble or Jazzy based on Ubuntu version)
- Gazebo (ros-gz)
- Python packages: pybullet, numpy, pillow, opencv, pyinstaller, pymavlink
---
## ArduPilot SITL
For realistic flight controller simulation (full flight controller):
```bash
# Run installer with ArduPilot support
./setup/install_ubuntu.sh --with-ardupilot
# Or install ArduPilot SITL on its own
./setup/install_ardupilot.sh
source ~/.bashrc
```
**Installs:**
- ArduPilot SITL (~15-20 minute build) with DDS
- ArduPilot Gazebo plugin (ardupilot_gz)
- MAVProxy
**Run:**
```bash
# Terminal 1
source ~/ardu_ws/install/setup.bash
ros2 launch ardupilot_gz_bringup iris_runway.launch.py
# Terminal 2
mavproxy.py --console --map --master=:14550
```
---
## Arch Linux
**Tested on:** Arch Linux (rolling release)
```bash
# Run installer
./setup/install_arch.sh
# Activate environment
source activate.sh
# Run simulation
python standalone_simulation.py
```
**Installs:**
- Python packages: pybullet, numpy, pillow, pyinstaller
- yay (AUR helper)
**Optional ROS 2 (from AUR):**
```bash
yay -S ros-humble-desktop
yay -S ros-humble-ros-gz
```
---
## macOS
**Tested on:** macOS 12+ (Monterey, Ventura, Sonoma)
```bash
# Run installer
./setup/install_macos.sh
# Activate environment
source activate.sh
# Run simulation
python standalone_simulation.py
```
**Installs:**
- Homebrew (if not present)
- Python 3.11
- Python packages: pybullet, numpy, pillow, pyinstaller
**Note:** ROS 2 and Gazebo are not supported on macOS. Use standalone mode.
---
## Windows
**Tested on:** Windows 10, Windows 11
```powershell
# Open PowerShell as Administrator
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
# Run installer
.\setup\install_windows.ps1
# Activate environment
. .\activate.ps1
# Run simulation
python standalone_simulation.py
```
**Installs:**
- Chocolatey (package manager)
- Python 3.11
- Python packages: pybullet, numpy, pillow, pyinstaller
**Note:** ROS 2 and Gazebo are not supported natively on Windows. Use standalone mode or WSL2 (below).
---
## Windows with WSL2 (Full Linux Experience)
WSL2 lets you run full Linux on Windows with GUI support. This enables ROS 2 and Gazebo!
**Requirements:** Windows 10 (build 19041+) or Windows 11
### Step 1: Install WSL2
Open PowerShell as Administrator:
```powershell
# Install WSL2 with Ubuntu
wsl --install -d Ubuntu-22.04
# Set a username and password when prompted; restart the computer if prompted
```
Then, inside Ubuntu, update the system:
```bash
sudo apt update
sudo apt upgrade
```
### Step 2: Enable GUI Support (WSLg)
Windows 11 and recent Windows 10 updates include WSLg (GUI support) automatically.
Verify by opening Ubuntu and running:
```bash
# Test GUI (should open a window)
sudo apt update
sudo apt install x11-apps -y
xclock
```
If xclock appears, GUI is working!
### Step 3: Install Simulation in WSL
Open Ubuntu from Start menu:
```bash
# Clone or copy your project
cd ~
git clone <your-repo-url> simulation
# OR copy from Windows:
# cp -r /mnt/c/Users/YourName/simulation ~/simulation
cd simulation
# Run Ubuntu installer
./setup/install_ubuntu.sh
# Activate
source activate.sh
# Run with GUI
python standalone_simulation.py
```
### WSL2 Tips
**Access Windows files:**
```bash
# Windows C: drive is at /mnt/c/
cd /mnt/c/Users/YourName/Documents
```
**Run from Windows Terminal:**
```powershell
wsl -d Ubuntu-22.04 -e bash -c "cd ~/simulation && source activate.sh && python standalone_simulation.py"
```
**GPU Acceleration (NVIDIA):**
If you have an NVIDIA GPU:
```bash
# Check if GPU is available
nvidia-smi
# PyBullet will use hardware rendering automatically
```
**Install Gazebo (optional):**
If you want to use Gazebo simulation:
```bash
# Install ros-gz bridge
sudo apt install ros-humble-ros-gz
# Install Gazebo Fortress (provides 'ign' command)
sudo apt install gz-fortress
# Verify - one of these should work:
gz sim --version # Newer Gazebo
ign gazebo --version # Fortress (ROS 2 Humble)
```
**Note:** ROS 2 Humble uses Gazebo Fortress, which uses `ign gazebo` command instead of `gz sim`. The launch file auto-detects which command is available.
**Gazebo GPU Issues in WSL2:**
If Gazebo crashes with GPU/OpenGL errors, try:
```bash
# Option 1: Run in server mode (no GUI)
ign gazebo -s gazebo/worlds/drone_landing.sdf
# Option 2: Fix permissions and restart WSL
sudo usermod -aG render $USER
chmod 700 /run/user/1000
# Then in PowerShell: wsl --shutdown
# Option 3: Force software rendering
export LIBGL_ALWAYS_SOFTWARE=1
ign gazebo gazebo/worlds/drone_landing.sdf
# Option 4: Just use PyBullet (more reliable on WSL2)
python standalone_simulation.py
```
**Troubleshooting WSL GUI:**
If GUI doesn't work:
```bash
# Update WSL
wsl --update
# Set WSL2 as default
wsl --set-default-version 2
# Reinstall Ubuntu
wsl --unregister Ubuntu-22.04
wsl --install -d Ubuntu-22.04
```
**Using VcXsrv (older Windows 10):**
If WSLg isn't available:
```powershell
# Install VcXsrv
choco install vcxsrv -y
```
Then in WSL:
```bash
# Add to ~/.bashrc
export DISPLAY=$(grep -m 1 nameserver /etc/resolv.conf | awk '{print $2}'):0
export LIBGL_ALWAYS_INDIRECT=1
# Start VcXsrv with "Disable access control" checked
# Then run simulation
python standalone_simulation.py
```
---
## Manual Installation
If the install scripts don't work, install manually:
### 1. Python 3.10+
```bash
# Ubuntu/Debian
sudo apt install python3 python3-pip python3-venv
# Arch
sudo pacman -S python python-pip python-virtualenv
# macOS
brew install python@3.11
# Windows
# Download from https://python.org
```
### 2. Create Virtual Environment
```bash
# Create virtual environment
python3 -m venv venv
source venv/bin/activate # Linux/macOS
# OR
.\venv\Scripts\Activate.ps1 # Windows
```
### 3. Install Python Packages
```bash
# Install packages
pip install -r requirements.txt
```
Or manually:
```bash
pip install pybullet numpy pillow pyinstaller
```
### 4. Run Simulation
```bash
# Run
python standalone_simulation.py
```
---
## Troubleshooting
### PyBullet fails to install
Install build tools:
```bash
# Ubuntu/Debian
sudo apt install build-essential
# Arch
sudo pacman -S base-devel
# macOS
xcode-select --install
# Windows
# Install Visual Studio Build Tools
```
### "Cannot connect to X server"
PyBullet GUI requires a display:
```bash
# Use virtual display
sudo apt install xvfb
xvfb-run python standalone_simulation.py
# OR use X11 forwarding
ssh -X user@host
```
### Pillow fails to install
```bash
# Ubuntu/Debian
sudo apt install libjpeg-dev zlib1g-dev
# Arch
sudo pacman -S libjpeg-turbo zlib
# macOS
brew install libjpeg zlib
```
### Permission denied on Windows
Run PowerShell as Administrator:
```powershell
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
```
---
## Verification
After installation, verify packages:
```bash
python -c "import pybullet; print('PyBullet OK')"
python -c "import numpy; print('NumPy OK')"
python -c "from PIL import Image; print('Pillow OK')"
python -c "import cv2; print('OpenCV OK')"
python -c "from pymavlink import mavutil; print('pymavlink OK')"
```
All should print "OK".
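If you prefer, the same checks can be run from one small script (equivalent to the commands above):
```python
import importlib

# Verify the core Python dependencies in one go.
checks = {
    "PyBullet": "pybullet",
    "NumPy": "numpy",
    "Pillow": "PIL.Image",
    "OpenCV": "cv2",
    "pymavlink": "pymavlink.mavutil",
}

for name, module in checks.items():
    try:
        importlib.import_module(module)
        print(f"{name} OK")
    except ImportError as exc:
        print(f"{name} MISSING: {exc}")
```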
---
## ArduPilot SITL Manual Setup
If you want to install ArduPilot SITL manually (without the install script):
### 1. Install ArduPilot
```bash
# Clone ArduPilot
git clone --recurse-submodules https://github.com/ArduPilot/ardupilot.git ~/ardupilot
cd ~/ardupilot
# Install prerequisites (Ubuntu)
Tools/environment_install/install-prereqs-ubuntu.sh -y
# Reload profile
. ~/.profile
# Build ArduCopter SITL
./waf configure --board sitl
./waf copter
```
### 2. Install ArduPilot Gazebo Plugin
```bash
# Clone plugin
git clone https://github.com/ArduPilot/ardupilot_gazebo.git ~/ardupilot_gazebo
cd ~/ardupilot_gazebo
# Build
mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make -j$(nproc)
```
### 3. Set Environment Variables
Add to `~/.bashrc`:
```bash
# ArduPilot
export ARDUPILOT_HOME=$HOME/ardupilot
export PATH=$PATH:$ARDUPILOT_HOME/Tools/autotest
# ArduPilot Gazebo Plugin
export GZ_SIM_SYSTEM_PLUGIN_PATH=$HOME/ardupilot_gazebo/build:$GZ_SIM_SYSTEM_PLUGIN_PATH
export GZ_SIM_RESOURCE_PATH=$HOME/ardupilot_gazebo/models:$HOME/ardupilot_gazebo/worlds:$GZ_SIM_RESOURCE_PATH
```
### 4. Test SITL
```bash
# Test ArduCopter SITL
cd ~/ardupilot
sim_vehicle.py -v ArduCopter --console --map
```
### 5. Run with Gazebo
```bash
# Terminal 1: Launch Gazebo
ros2 launch gazebo/launch/ardupilot_drone.launch.py
# Terminal 2: Start SITL
cd ~/ardupilot
sim_vehicle.py -v ArduCopter -f gazebo-iris --model JSON --console
# Terminal 3: Run controllers
cd ~/simulation
source activate.sh
python run_ardupilot.py --no-sitl --pattern circular
```
For more details, see [ArduPilot Guide](ardupilot.md).

View File

@@ -1,8 +1,8 @@
# Communication Protocol (GPS-Denied)
Message formats for GPS-denied drone operation with camera.
## Drone Commands
```json
{
@@ -13,25 +13,20 @@ Message formats for GPS-denied drone operation with camera.
}
```
| Field | Range | Description |
|-------|-------|-------------|
| `thrust` | ±1.0 | Vertical thrust (positive = up) |
| `pitch` | ±0.5 | Forward/backward tilt |
| `roll` | ±0.5 | Left/right tilt |
| `yaw` | ±0.5 | Rotation |
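A small helper for building commands within these ranges, assuming the controller hands the bridge a plain dict in this format (clamping keeps values inside the documented limits):
```python
def make_command(thrust=0.0, pitch=0.0, roll=0.0, yaw=0.0):
    """Build a command dict, clamping each field to its documented range."""
    def clamp(value, limit):
        return max(-limit, min(limit, value))

    return {
        "thrust": clamp(thrust, 1.0),
        "pitch": clamp(pitch, 0.5),
        "roll": clamp(roll, 0.5),
        "yaw": clamp(yaw, 0.5),
    }


# Example: gentle climb while drifting slightly forward.
command = make_command(thrust=0.3, pitch=0.1)
```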
---
## Drone Telemetry
Published on `/drone/telemetry`. **No GPS position available.**
```json
{
"imu": {
"orientation": {"roll": 0.0, "pitch": 0.0, "yaw": 0.0},
"angular_velocity": {"x": 0.0, "y": 0.0, "z": 0.0},
"linear_acceleration": {"x": 0.0, "y": 0.0, "z": 9.81}
"angular_velocity": {"x": 0.0, "y": 0.0, "z": 0.0}
},
"altimeter": {
"altitude": 5.0,
@@ -47,138 +42,30 @@ Published on `/drone/telemetry`. **No GPS position available.**
"camera": {
"width": 320,
"height": 240,
"fov": 60.0,
"image": "<base64 encoded JPEG>"
},
"landed": false,
"timestamp": 1.234
"image": "<base64 JPEG>"
}
}
```
---
## Sensor Details
### IMU
Always available.
| Field | Unit | Description |
|-------|------|-------------|
| `orientation.roll/pitch/yaw` | radians | Euler angles |
| `angular_velocity.x/y/z` | rad/s | Rotation rates |
| `linear_acceleration.x/y/z` | m/s² | Acceleration |
### Altimeter
Always available.
| Field | Unit | Description |
|-------|------|-------------|
| `altitude` | meters | Height above ground |
| `vertical_velocity` | m/s | Vertical speed |
### Velocity
Estimated from optical flow.
| Field | Unit | Description |
|-------|------|-------------|
| `x` | m/s | Forward velocity |
| `y` | m/s | Lateral velocity |
| `z` | m/s | Vertical velocity |
### Landing Pad Detection
**May be null if pad not visible!**
| Field | Unit | Description |
|-------|------|-------------|
| `relative_x` | meters | Forward/back offset (body frame) |
| `relative_y` | meters | Left/right offset (body frame) |
| `distance` | meters | Vertical distance to pad |
| `confidence` | 0-1 | Detection confidence |
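For example, these fields can be reduced to a horizontal error and bearing in the body frame; this is a sketch, and the 0.5 confidence cut-off is an arbitrary placeholder:
```python
import math


def pad_approach(landing_pad):
    """Return (horizontal_error_m, bearing_rad) toward the pad, or None if unusable."""
    if landing_pad is None or landing_pad['confidence'] < 0.5:
        return None
    dx = landing_pad['relative_x']   # forward/back offset, body frame
    dy = landing_pad['relative_y']   # left/right offset, body frame
    return math.hypot(dx, dy), math.atan2(dy, dx)
```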
### Camera
Always available.
| Field | Description |
|-------|-------------|
| `width` | Image width in pixels |
| `height` | Image height in pixels |
| `fov` | Horizontal field of view in degrees |
| `image` | Base64 encoded JPEG (or null) |
---
## Using the Camera Image
The camera provides a base64-encoded JPEG image of what the drone sees looking down.
### Decoding the Image (Python)
```python
import base64
import io

from PIL import Image


def decode_camera_image(telemetry):
    camera = telemetry.get('camera', {})
    image_b64 = camera.get('image')
    if image_b64 is None:
        return None
    # Decode base64 to bytes
    image_bytes = base64.b64decode(image_b64)
    # Load as PIL Image
    image = Image.open(io.BytesIO(image_bytes))
    return image
```
### Using with OpenCV
```python
import base64

import cv2
import numpy as np


def decode_camera_image_cv2(telemetry):
    camera = telemetry.get('camera', {})
    image_b64 = camera.get('image')
    if image_b64 is None:
        return None
    # Decode base64 to bytes
    image_bytes = base64.b64decode(image_b64)
    # Convert to numpy array
    nparr = np.frombuffer(image_bytes, np.uint8)
    # Decode JPEG
    image = cv2.imdecode(nparr, cv2.IMREAD_COLOR)
    return image
```
### Image Properties
- **Resolution**: 320 x 240 pixels
- **Format**: JPEG (quality 70)
- **FOV**: 60 degrees
- **Direction**: Downward-facing
- **Update Rate**: ~5 Hz (every 5th telemetry frame)
---
## Rover Telemetry
For internal use by RoverController.
```json
{
"position": {"x": 1.5, "y": 0.8, "z": 0.15},
"velocity": {"x": 0.3, "y": 0.4, "z": 0.0},
"pattern": "circular",
"timestamp": 1.234
}
```

View File

@@ -1,163 +1,68 @@
# PyBullet Simulation Guide
Running the GPS-denied drone simulation with PyBullet physics engine.
## Standalone Mode (Single Terminal, Any Platform)
No ROS 2 required. Works on Windows, macOS, and Linux:
```bash
source activate.sh # Linux/macOS
. .\activate.ps1 # Windows
python standalone_simulation.py --pattern circular --speed 0.3
```
### Options
```bash
python standalone_simulation.py --help
Options:
--pattern, -p stationary, linear, circular, square
--speed, -s Rover speed in m/s (default: 0.5)
--amplitude, -a Movement amplitude in meters (default: 2.0)
```
---
## ROS 2 Mode (Two Terminals)
For distributed or remote simulation with ROS 2:
**Terminal 1 - Simulator:**
```bash
source activate.sh
python simulation_host.py
```
**Terminal 2 - Controllers:**
```bash
source activate.sh
python run_bridge.py --pattern circular --speed 0.3
```
### How It Works
1. `simulation_host.py` runs PyBullet physics and listens on UDP port 5555
2. `run_bridge.py` starts:
- `ROS2SimulatorBridge` - connects ROS topics to UDP
- `DroneController` - your landing algorithm
- `RoverController` - moves the landing pad
The rover position is sent to the simulator, so both drone AND rover move!
### Remote Setup
Run simulator on one machine, controllers on another:
**Machine 1 (with display):**
```bash
python simulation_host.py # Listens on 0.0.0.0:5555
```
**Machine 2 (headless):**
```bash
python run_bridge.py --host 192.168.1.100 --pattern circular
```
---
## Configuration
All parameters are configurable in `config.py`:
```python
DRONE = {
    "mass": 1.0,
    "start_position": (0.0, 0.0, 5.0),
    "thrust_scale": 15.0,
    ...
}

ROVER = {
    "start_position": (0.0, 0.0, 0.15),
    "default_pattern": "circular",
    "default_speed": 0.5,
    ...
}

CONTROLLER = {
    "Kp_z": 0.5,
    "Kd_z": 0.3,
    ...
}
```
---
## Simulation Parameters
| Parameter | Value |
|-----------|-------|
| Physics Rate | 240 Hz |
| Telemetry Rate | 24 Hz |
| Drone Mass | 1.0 kg (configurable) |
| Rover Mass | Static (kinematic) |
| UDP Port | 5555 (commands), 5556 (telemetry) |
## GPS-Denied Sensors
| Sensor | Description |
|--------|-------------|
| **IMU** | Orientation (roll, pitch, yaw), angular velocity |
| **Altimeter** | Altitude above ground, vertical velocity |
| **Velocity** | Estimated velocity (x, y, z) |
| **Camera** | 320x240 downward-facing JPEG image |
| **Landing Pad** | Vision-based relative position when in camera FOV |
## Troubleshooting
### "Cannot connect to X server"
PyBullet GUI requires a display:
```bash
# Use virtual display
xvfb-run python standalone_simulation.py
# Or use X11 forwarding
ssh -X user@host
```
### Drone flies erratically
Reduce control gains in `config.py`:
```python
CONTROLLER = {
"Kp_z": 0.3,
"Kd_z": 0.2,
"Kp_xy": 0.2,
"Kd_xy": 0.1,
"Kp_z": 0.5,
"Kd_z": 0.3,
"Kp_xy": 0.3,
"Kd_xy": 0.2,
}
```
### Camera image not appearing
Install Pillow:
```bash
pip install pillow numpy
```
### Rover not moving (ROS 2 mode)
Ensure `run_bridge.py` is used (not `ros_bridge.py` directly).
The rover controller must be running to send position updates.
### WSL2 GUI issues
Set display scaling:
```bash
export GDK_DPI_SCALE=1.0
export QT_SCALE_FACTOR=1.0
python standalone_simulation.py
```

View File

@@ -4,97 +4,58 @@ The RoverController creates a moving landing pad target.
## Usage
The rover controller is automatically included when running the simulation:
```bash
# Stationary rover (default)
python standalone_simulation.py --pattern stationary
# Moving rover
python standalone_simulation.py --pattern circular --speed 0.3
```
### Options
| Option | Short | Default | Description |
|--------|-------|---------|-------------|
| `--pattern` | `-p` | stationary | Movement pattern |
| `--speed` | `-s` | 0.5 | Speed in m/s |
| `--amplitude` | `-a` | 2.0 | Amplitude/radius in meters |
## Movement Patterns
### Stationary
```bash
python standalone_simulation.py --pattern stationary
```
Rover stays at origin. Best for initial testing.
### Linear
```bash
python standalone_simulation.py --pattern linear --speed 0.3 --amplitude 2.0
```
Oscillates along the X-axis.
### Circular
```bash
python standalone_simulation.py --pattern circular --speed 0.5 --amplitude 2.0
```
Follows a circular path of radius `amplitude`.
### Square
```bash
python standalone_simulation.py --pattern square --speed 0.5 --amplitude 2.0
```
Square pattern with corners at `(±amplitude, ±amplitude)`.
### Random
```bash
python standalone_simulation.py --pattern random --speed 0.3 --amplitude 2.0
```
Moves to random positions. Changes target every 3 seconds.
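For reference, a hedged sketch of how such targets could be generated from the pattern descriptions above; the real `RoverController` may parameterize the motion differently:
```python
import math
import random


def rover_target(pattern, t, amplitude=2.0, speed=0.5):
    """Illustrative (x, y) target for each pattern at time t seconds."""
    if pattern == "stationary":
        return 0.0, 0.0
    if pattern == "linear":
        # Oscillate along the X-axis with amplitude `amplitude`.
        return amplitude * math.sin(speed * t / amplitude), 0.0
    if pattern == "circular":
        # Constant-speed circle of radius `amplitude`.
        angle = speed * t / amplitude
        return amplitude * math.cos(angle), amplitude * math.sin(angle)
    if pattern == "square":
        # Visit the corners at (±amplitude, ±amplitude) in turn.
        corners = [(amplitude, amplitude), (-amplitude, amplitude),
                   (-amplitude, -amplitude), (amplitude, -amplitude)]
        side_time = 2 * amplitude / speed   # time to traverse one side
        return corners[int(t / side_time) % 4]
    if pattern == "random":
        # Pick a new random target every 3 seconds.
        rng = random.Random(int(t // 3))
        return rng.uniform(-amplitude, amplitude), rng.uniform(-amplitude, amplitude)
    raise ValueError(f"unknown pattern: {pattern}")
```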
## Difficulty Levels
| Level | Pattern | Speed | Description |
|-------|---------|-------|-------------|
| Beginner | stationary | 0.0 | Static target |
| Easy | linear | 0.2 | Predictable 1D |
| Medium | circular | 0.3 | Smooth 2D |
| Hard | random | 0.3 | Unpredictable |
| Expert | square | 0.5 | Sharp turns |
## Progressive Testing
Start easy and increase difficulty:
```bash
# Step 1: Static target
python standalone_simulation.py --pattern stationary
# Step 2: Slow linear motion
python standalone_simulation.py --pattern linear --speed 0.2
# Step 3: Slow circular motion
python standalone_simulation.py --pattern circular --speed 0.2
# Step 4: Faster circular
python standalone_simulation.py --pattern circular --speed 0.4
# Step 5: Random
python standalone_simulation.py --pattern random --speed 0.3
```
## Published Topics
| Topic | Type | Description |
|-------|------|-------------|
| `/rover/cmd_vel` | `Twist` | Velocity commands |
| `/rover/position` | `Point` | Current position |
| `/rover/telemetry` | `String` | Full state (JSON) |
## GPS-Denied Note
In GPS-denied mode, the drone cannot directly access the rover position. It must detect the landing pad visually via the `landing_pad` sensor data; the `/rover/telemetry` topic is used internally by the RoverController.