Initial Attempt
docs/architecture.md (new file, 119 lines)
# Architecture Overview

GPS-denied drone landing simulation with camera vision.

## System Diagram

```
┌───────────────────────────────────────────────────────────────────┐
│                         Simulation System                         │
├───────────────────────────────────────────────────────────────────┤
│                                                                   │
│  ┌──────────────────┐                  ┌──────────────────────┐   │
│  │ simulation_host  │◄── UDP:5555 ────►│ ros_bridge.py        │   │
│  │ (PyBullet)       │                  │ (UDP ↔ ROS Bridge)   │   │
│  └──────────────────┘                  └──────────┬───────────┘   │
│          OR                                       │               │
│  ┌──────────────────┐                  ┌──────────┴───────────┐   │
│  │ Gazebo           │◄── ROS Topics ──►│ gazebo_bridge.py     │   │
│  │ (Ignition)       │                  │ (Gazebo ↔ ROS Bridge)│   │
│  └──────────────────┘                  └──────────┬───────────┘   │
│                                                   │               │
│                                        ┌──────────▼───────────┐   │
│                                        │ controllers.py       │   │
│                                        │ ┌──────────────────┐ │   │
│                                        │ │ DroneController  │ │   │
│                                        │ │ RoverController  │ │   │
│                                        │ └──────────────────┘ │   │
│                                        └──────────────────────┘   │
└───────────────────────────────────────────────────────────────────┘
```

## Components

### Simulators

| Component | Description |
|-----------|-------------|
| **PyBullet** (`simulation_host.py`) | Standalone physics, UDP networking, camera rendering |
| **Gazebo** | Full robotics simulator, native ROS 2 integration, camera sensor |

### Bridges

| Component | Description |
|-----------|-------------|
| **ros_bridge.py** | Connects PyBullet ↔ ROS 2 via UDP |
| **gazebo_bridge.py** | Connects Gazebo ↔ ROS 2 and provides the same interface as `ros_bridge.py` |

### Controllers

| Component | Description |
|-----------|-------------|
| **controllers.py** | Runs the drone and rover controllers together |
| **drone_controller.py** | GPS-denied landing logic |
| **rover_controller.py** | Moving landing pad patterns |

## ROS Topics

| Topic | Type | Publisher | Subscriber |
|-------|------|-----------|------------|
| `/cmd_vel` | `Twist` | DroneController | Bridge |
| `/drone/telemetry` | `String` | Bridge | DroneController |
| `/rover/telemetry` | `String` | RoverController | DroneController |
| `/rover/cmd_vel` | `Twist` | RoverController | (internal) |
| `/rover/position` | `Point` | RoverController | (debug) |

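As a rough sketch of how these topics fit together, the snippet below subscribes to `/drone/telemetry` (a JSON string) and publishes `Twist` commands on `/cmd_vel`. The node name, the file name, and the mapping of `linear.z` to a climb command are illustrative assumptions, not the project's actual controller; how the bridge converts the `Twist` into thrust/pitch/roll/yaw is left to the bridge.

```python
import json

import rclpy
from geometry_msgs.msg import Twist
from rclpy.node import Node
from std_msgs.msg import String


class MinimalDroneNode(Node):
    """Hypothetical node wired to the topics listed above."""

    def __init__(self):
        super().__init__('minimal_drone_node')
        # Commands go to the bridge on /cmd_vel.
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)
        # Telemetry arrives from the bridge as a JSON string.
        self.create_subscription(String, '/drone/telemetry', self.on_telemetry, 10)

    def on_telemetry(self, msg: String):
        telemetry = json.loads(msg.data)
        altitude = telemetry.get('altimeter', {}).get('altitude', 0.0)
        cmd = Twist()
        cmd.linear.z = 0.1 if altitude < 1.0 else 0.0  # crude climb/hover demo
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(MinimalDroneNode())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```
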
## GPS-Denied Sensor Flow

```
Simulator                  Bridge                    DroneController
    │                        │                              │
    │  Render Camera         │                              │
    │  Compute Physics       │                              │
    │───────────────────────►│                              │
    │                        │                              │
    │                        │  GPS-Denied Sensors:         │
    │                        │   - IMU                      │
    │                        │   - Altimeter                │
    │                        │   - Velocity                 │
    │                        │   - Camera Image (JPEG)      │
    │                        │   - Landing Pad Detection    │
    │                        │─────────────────────────────►│
    │                        │                              │
    │                        │           /cmd_vel           │
    │◄───────────────────────│◄─────────────────────────────│
```

## Camera System

Both simulators provide a downward-facing camera:

| Property | Value |
|----------|-------|
| Resolution | 320 x 240 |
| FOV | 60 degrees |
| Format | Base64 JPEG |
| Update Rate | ~5 Hz |
| Direction | Downward |

## Data Flow

### PyBullet Mode

```
DroneController → /cmd_vel → ros_bridge → UDP:5555 → simulation_host
simulation_host → UDP:5556 → ros_bridge → /drone/telemetry → DroneController
```

### Gazebo Mode

```
DroneController → /cmd_vel → gazebo_bridge → /drone/cmd_vel → Gazebo
Gazebo → /model/drone/odometry → gazebo_bridge → /drone/telemetry → DroneController
Gazebo → /drone/camera → gazebo_bridge → (encoded in telemetry)
```

## UDP Protocol

| Port | Direction | Content |
|------|-----------|---------|
| 5555 | Bridge → Simulator | Command JSON |
| 5556 | Simulator → Bridge | Telemetry JSON (includes camera image) |

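For a feel of the wire format, the sketch below pushes one command datagram to port 5555 and waits for one telemetry datagram on port 5556. The JSON fields follow `docs/protocol.md`; the assumptions that each message is a single JSON object per UDP datagram and that the simulator addresses telemetry to whatever socket is bound on 5556 mirror the table above but have not been checked against `ros_bridge.py`.

```python
import json
import socket

SIM_HOST = '127.0.0.1'   # simulator address (assumption: same machine)
CMD_PORT = 5555          # Bridge -> Simulator (command JSON)
TELEM_PORT = 5556        # Simulator -> Bridge (telemetry JSON)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', TELEM_PORT))   # listen for telemetry
sock.settimeout(1.0)

# Send one hover-ish command as a single JSON datagram.
command = {"thrust": 0.5, "pitch": 0.0, "roll": 0.0, "yaw": 0.0}
sock.sendto(json.dumps(command).encode('utf-8'), (SIM_HOST, CMD_PORT))

# Read one telemetry datagram (may include a base64 camera image).
try:
    data, _addr = sock.recvfrom(65535)
    telemetry = json.loads(data.decode('utf-8'))
    print(telemetry.get('altimeter'))
except socket.timeout:
    print('no telemetry received')
```
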
docs/drone_guide.md (new file, 187 lines)
# DroneController Guide (GPS-Denied)

Implement your landing algorithm in `drone_controller.py`.

## Quick Start

1. Edit `drone_controller.py`
2. Find `calculate_landing_maneuver()`
3. Implement your algorithm
4. Test: `python controllers.py --pattern stationary`

## GPS-Denied Challenge

No GPS is available. You must use:

| Sensor | Data |
|--------|------|
| **IMU** | Orientation, angular velocity |
| **Altimeter** | Altitude, vertical velocity |
| **Velocity** | Estimated from optical flow |
| **Camera** | 320x240 downward image (base64 JPEG) |
| **Landing Pad** | Relative position (may be null!) |

## Function to Implement

```python
def calculate_landing_maneuver(self, telemetry, rover_telemetry):
    # Your code here
    return (thrust, pitch, roll, yaw)
```

## Sensor Data

### IMU

```python
imu = telemetry['imu']
roll = imu['orientation']['roll']
pitch = imu['orientation']['pitch']
yaw = imu['orientation']['yaw']
angular_vel = imu['angular_velocity']  # {x, y, z}
```

### Altimeter

```python
altimeter = telemetry['altimeter']
altitude = altimeter['altitude']
vertical_vel = altimeter['vertical_velocity']
```

### Velocity

```python
velocity = telemetry['velocity']  # {x, y, z} in m/s
```

### Camera

The drone has a downward-facing camera providing 320x240 JPEG images.

```python
import base64
import io

from PIL import Image

camera = telemetry['camera']
image_b64 = camera.get('image')

if image_b64:
    image_bytes = base64.b64decode(image_b64)
    image = Image.open(io.BytesIO(image_bytes))
    # Process image for custom vision algorithms
```

### Landing Pad (Vision)

**Important: May be None if the pad is not visible!**

```python
landing_pad = telemetry['landing_pad']

if landing_pad is not None:
    relative_x = landing_pad['relative_x']  # body frame
    relative_y = landing_pad['relative_y']  # body frame
    distance = landing_pad['distance']      # vertical
    confidence = landing_pad['confidence']  # 0-1
```

## Control Output

| Value | Range | Effect |
|-------|-------|--------|
| thrust | ±1.0 | Up/down |
| pitch | ±0.5 | Forward/back |
| roll | ±0.5 | Left/right |
| yaw | ±0.5 | Rotation |

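Whatever your algorithm computes, it is safest to clip the values to these ranges before returning them. A minimal helper sketch (the helper names are illustrative, not part of the framework):

```python
def _clamp(value, limit):
    return max(-limit, min(limit, value))


def clamp_outputs(thrust, pitch, roll, yaw):
    """Clip controls to the documented ranges before returning them."""
    return (_clamp(thrust, 1.0),  # thrust: ±1.0
            _clamp(pitch, 0.5),   # pitch:  ±0.5
            _clamp(roll, 0.5),    # roll:   ±0.5
            _clamp(yaw, 0.5))     # yaw:    ±0.5


# e.g. at the end of calculate_landing_maneuver():
#     return clamp_outputs(thrust, pitch, roll, yaw)
```
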
## Example Algorithm

```python
def calculate_landing_maneuver(self, telemetry, rover_telemetry):
    altimeter = telemetry.get('altimeter', {})
    altitude = altimeter.get('altitude', 5.0)
    vertical_vel = altimeter.get('vertical_velocity', 0.0)

    velocity = telemetry.get('velocity', {})
    vel_x = velocity.get('x', 0.0)
    vel_y = velocity.get('y', 0.0)

    landing_pad = telemetry.get('landing_pad')

    # Altitude control
    thrust = 0.5 * (0 - altitude) - 0.3 * vertical_vel

    # Horizontal control
    if landing_pad is not None:
        pitch = 0.3 * landing_pad['relative_x'] - 0.2 * vel_x
        roll = 0.3 * landing_pad['relative_y'] - 0.2 * vel_y
    else:
        pitch = -0.2 * vel_x
        roll = -0.2 * vel_y

    return (thrust, pitch, roll, 0.0)
```

## Using the Camera

You can implement custom vision processing on the camera image:

```python
import base64

import cv2
import numpy as np


def process_camera(telemetry):
    camera = telemetry.get('camera', {})
    image_b64 = camera.get('image')

    if not image_b64:
        return None

    # Decode JPEG
    image_bytes = base64.b64decode(image_b64)
    nparr = np.frombuffer(image_bytes, np.uint8)
    image = cv2.imdecode(nparr, cv2.IMREAD_COLOR)

    # Example: detect green landing pad
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    green_mask = cv2.inRange(hsv, (35, 50, 50), (85, 255, 255))

    # Find contours
    contours, _ = cv2.findContours(green_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    if contours:
        largest = max(contours, key=cv2.contourArea)
        M = cv2.moments(largest)
        if M['m00'] > 0:
            cx = int(M['m10'] / M['m00'])
            cy = int(M['m01'] / M['m00'])
            # cx, cy is center of detected pad in image coordinates
            return (cx, cy)

    return None
```

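If you want to turn that pixel centre into an approximate metric offset, the documented 60° FOV plus the altimeter reading are enough for a flat-ground, straight-down pinhole approximation. The helper below is only a sketch under those assumptions; the mapping of image axes onto the drone's body axes is a guess and should be checked against your bridge.

```python
import math

IMAGE_WIDTH = 320
IMAGE_HEIGHT = 240
FOV_DEG = 60.0  # horizontal FOV from the camera spec above


def pixel_to_ground_offset(cx, cy, altitude):
    """Approximate metric offset of a pixel from the image centre.

    Assumes a pinhole camera pointing straight down over flat ground, with
    the 60-degree FOV measured horizontally and the vertical FOV derived
    from the 320x240 aspect ratio. Returns (forward, right) in metres;
    treating image "up" as the drone's forward axis is an assumption.
    """
    half_w = altitude * math.tan(math.radians(FOV_DEG / 2.0))
    half_h = half_w * (IMAGE_HEIGHT / IMAGE_WIDTH)

    # Normalised offsets in [-1, 1] from the image centre.
    nx = (cx - IMAGE_WIDTH / 2.0) / (IMAGE_WIDTH / 2.0)
    ny = (cy - IMAGE_HEIGHT / 2.0) / (IMAGE_HEIGHT / 2.0)

    right = nx * half_w
    forward = -ny * half_h  # image y grows downward
    return forward, right
```
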
## Strategies

### When Pad Not Visible

- Maintain altitude and stabilize
- Search by ascending or spiraling
- Fall back to dead reckoning from the last known position

### State Machine

1. Search → find pad
2. Approach → move above pad
3. Align → center over pad
4. Descend → controlled descent
5. Land → touch down

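The five states above could be wired together roughly as follows. The state names, thresholds, and the rule for dropping back to search are illustrative assumptions, not values taken from the framework:

```python
SEARCH, APPROACH, ALIGN, DESCEND, LAND = range(5)


class LandingStateMachine:
    """Minimal sketch of the search / approach / align / descend / land loop."""

    def __init__(self):
        self.state = SEARCH

    def step(self, telemetry):
        pad = telemetry.get('landing_pad')
        altitude = telemetry.get('altimeter', {}).get('altitude', 0.0)

        if self.state == SEARCH and pad is not None:
            self.state = APPROACH
        elif self.state == APPROACH and pad is not None and \
                abs(pad['relative_x']) < 1.0 and abs(pad['relative_y']) < 1.0:
            self.state = ALIGN
        elif self.state == ALIGN and pad is not None and \
                abs(pad['relative_x']) < 0.2 and abs(pad['relative_y']) < 0.2:
            self.state = DESCEND
        elif self.state == DESCEND and altitude < 0.3:
            self.state = LAND
        elif pad is None and self.state in (APPROACH, ALIGN):
            self.state = SEARCH  # lost sight of the pad, go back to searching

        return self.state
```
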
## Testing

```bash
# Easy
python controllers.py --pattern stationary

# Medium
python controllers.py --pattern circular --speed 0.2

# Hard
python controllers.py --pattern random --speed 0.3
```

docs/gazebo.md (new file, 158 lines)
# Gazebo Simulation

Running the GPS-denied drone simulation with Gazebo.

## Prerequisites

Install Gazebo and the ROS-Gazebo bridge:

```bash
./setup/install_ubuntu.sh
source activate.sh
```

## Quick Start

**Terminal 1 - Start Gazebo:**

```bash
source activate.sh
gz sim gazebo/worlds/drone_landing.sdf
```

**Terminal 2 - Spawn drone and start bridge:**

```bash
source activate.sh

# Spawn drone
gz service -s /world/drone_landing_world/create \
  --reqtype gz.msgs.EntityFactory \
  --reptype gz.msgs.Boolean \
  --req 'sdf_filename: "gazebo/models/drone/model.sdf", name: "drone"'

# Start bridge
python gazebo_bridge.py
```

**Terminal 3 - Run controllers:**

```bash
source activate.sh
python controllers.py --pattern circular --speed 0.3
```

## World Description

The `drone_landing.sdf` world contains:

| Object | Description |
|--------|-------------|
| Ground Plane | Infinite flat surface |
| Sun | Directional light with shadows |
| Landing Pad | Green box with "H" marker at origin |

## Drone Model

Quadrotor drone with:

- **Body**: 0.3m × 0.3m × 0.1m, 1.0 kg
- **Rotors**: 4 spinning rotors
- **IMU**: Orientation and angular velocity
- **Camera**: 320x240 downward-facing sensor
- **Odometry**: Position and velocity

### Gazebo Plugins

| Plugin | Function |
|--------|----------|
| MulticopterMotorModel | Motor dynamics |
| MulticopterVelocityControl | Velocity commands |
| OdometryPublisher | Pose and twist |

## Camera System

The drone has a downward-facing camera:

| Property | Value |
|----------|-------|
| Resolution | 320 x 240 |
| FOV | 60 degrees |
| Format | Base64 encoded JPEG |
| Update Rate | 30 Hz (Gazebo) / ~5 Hz (in telemetry) |
| Topic | `/drone/camera` |

## Gazebo Topics

| Topic | Type | Description |
|-------|------|-------------|
| `/drone/cmd_vel` | `gz.msgs.Twist` | Velocity commands |
| `/model/drone/odometry` | `gz.msgs.Odometry` | Drone state |
| `/drone/camera` | `gz.msgs.Image` | Camera images |
| `/drone/imu` | `gz.msgs.IMU` | IMU data |

## GPS-Denied Sensors

The `gazebo_bridge.py` script converts Gazebo data to the GPS-denied sensor format:

| Sensor | Source |
|--------|--------|
| IMU | Odometry orientation + angular velocity |
| Altimeter | Odometry Z position |
| Velocity | Odometry twist |
| Camera | Camera sensor (base64 JPEG) |
| Landing Pad | Computed from relative position |

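The table maps directly onto the telemetry schema in `docs/protocol.md`. As a rough sketch of the kind of conversion involved (not the actual `gazebo_bridge.py` code), an odometry sample could be reshaped like this:

```python
import math


def odometry_to_sensors(position, orientation_q, linear_vel, angular_vel):
    """Illustrative conversion from odometry-style data to the telemetry fields.

    Sketch only: field names follow docs/protocol.md, and the inputs are plain
    tuples (x, y, z) plus a quaternion (qx, qy, qz, qw).
    """
    qx, qy, qz, qw = orientation_q

    # Quaternion -> roll/pitch/yaw (standard ZYX Euler conversion).
    roll = math.atan2(2.0 * (qw * qx + qy * qz), 1.0 - 2.0 * (qx * qx + qy * qy))
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (qw * qy - qz * qx))))
    yaw = math.atan2(2.0 * (qw * qz + qx * qy), 1.0 - 2.0 * (qy * qy + qz * qz))

    return {
        'imu': {
            'orientation': {'roll': roll, 'pitch': pitch, 'yaw': yaw},
            'angular_velocity': dict(zip('xyz', angular_vel)),
        },
        'altimeter': {
            'altitude': position[2],            # odometry Z position
            'vertical_velocity': linear_vel[2],
        },
        'velocity': dict(zip('xyz', linear_vel)),
    }
```
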
## Headless Mode

Run without GUI:

```bash
gz sim -s gazebo/worlds/drone_landing.sdf
```

## Using the Launch File

For ROS 2 packages:

```bash
ros2 launch <package_name> drone_landing.launch.py
```

## Troubleshooting

### "Cannot connect to display"

```bash
export DISPLAY=:0
# or use headless mode
gz sim -s gazebo/worlds/drone_landing.sdf
```

### Drone falls immediately

The velocity controller may need to be enabled:

```bash
gz topic -t /drone/enable -m gz.msgs.Boolean -p 'data: true'
```

### Topics not visible in ROS

Ensure the bridge is running:

```bash
python gazebo_bridge.py
```

### Model not found

Set the model path:

```bash
export GZ_SIM_RESOURCE_PATH=$PWD/gazebo/models:$GZ_SIM_RESOURCE_PATH
```

### Camera image not in telemetry

Ensure PIL/Pillow is installed:

```bash
pip install pillow
```

docs/installation.md (new file, 267 lines)
# Installation Guide

This guide covers installation on Ubuntu, macOS, and Windows.

## Quick Install

### Ubuntu / Debian

```bash
cd simulation
chmod +x setup/install_ubuntu.sh
./setup/install_ubuntu.sh
source activate.sh
```

### macOS

```bash
cd simulation
chmod +x setup/install_macos.sh
./setup/install_macos.sh
source activate.sh
```

### Windows (PowerShell)

```powershell
cd simulation
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
.\setup\install_windows.ps1
.\activate.bat
```

---

## What Gets Installed

| Component | Description |
|-----------|-------------|
| **ROS 2** | Humble (Ubuntu 22.04) or Jazzy (Ubuntu 24.04) |
| **Gazebo** | Modern Ignition-based simulator |
| **Python venv** | Virtual environment with system site-packages |
| **PyBullet** | Lightweight physics engine |
| **PyInstaller** | Executable bundler |
| **ros_gz_bridge** | ROS 2 ↔ Gazebo topic bridge |

---

## Manual Installation

If the scripts don't work, follow these steps manually.

### Step 1: Install ROS 2

#### Ubuntu 22.04 (Humble)

```bash
sudo apt update && sudo apt install -y locales
sudo locale-gen en_US en_US.UTF-8
sudo update-locale LC_ALL=en_US.UTF-8 LANG=en_US.UTF-8
export LANG=en_US.UTF-8

sudo apt install -y software-properties-common curl
sudo add-apt-repository -y universe
sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key \
  -o /usr/share/keyrings/ros-archive-keyring.gpg

echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(. /etc/os-release && echo $UBUNTU_CODENAME) main" \
  | sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null

sudo apt update
sudo apt install -y ros-humble-desktop
```

#### Ubuntu 24.04 (Jazzy)

```bash
# Same as above, but install ros-jazzy-desktop
sudo apt install -y ros-jazzy-desktop
```

### Step 2: Install Gazebo

```bash
# Ubuntu 22.04
sudo apt install -y ros-humble-ros-gz ros-humble-ros-gz-bridge

# Ubuntu 24.04
sudo apt install -y ros-jazzy-ros-gz ros-jazzy-ros-gz-bridge
```

### Step 3: Create Python Virtual Environment

```bash
sudo apt install -y python3-venv python3-full

cd /path/to/simulation
python3 -m venv venv --system-site-packages
source venv/bin/activate
```

### Step 4: Install Python Dependencies

```bash
pip install --upgrade pip
pip install pybullet pyinstaller
```

### Step 5: Create Activation Script

Create `activate.sh`:

```bash
#!/bin/bash
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

# Source ROS 2 (adjust distro as needed)
source /opt/ros/humble/setup.bash
echo "[OK] ROS 2 sourced"

# Activate venv
source "$SCRIPT_DIR/venv/bin/activate"
echo "[OK] Python venv activated"
```

Make it executable:

```bash
chmod +x activate.sh
```

---

## Verifying Installation

After installation, verify all components:

```bash
source activate.sh

# Check ROS 2
ros2 --version

# Check PyBullet
python3 -c "import pybullet; print('PyBullet OK')"

# Check rclpy
python3 -c "import rclpy; print('rclpy OK')"

# Check geometry_msgs
python3 -c "from geometry_msgs.msg import Twist; print('geometry_msgs OK')"

# Check Gazebo
gz sim --version
```

---

## Troubleshooting

### "externally-managed-environment" Error

This happens on modern Ubuntu/Debian due to PEP 668. Solution: use the virtual environment.

```bash
source activate.sh     # Activates venv
pip install pybullet   # Now works
```

### ROS 2 Packages Not Found

Ensure ROS 2 is sourced before activating the venv:

```bash
source /opt/ros/humble/setup.bash  # or jazzy
source venv/bin/activate
```

The `activate.sh` script handles this automatically.

### Gazebo Not Starting

Check if Gazebo is properly installed:

```bash
which gz
gz sim --version
```

If it is missing, install the ROS-Gazebo packages:

```bash
sudo apt install ros-humble-ros-gz  # or jazzy
```

### PyBullet GUI Not Showing

PyBullet requires a display. Options:

1. Run on a machine with a monitor
2. Use X11 forwarding: `ssh -X user@host`
3. Use a virtual display: `xvfb-run python simulation_host.py`

### Permission Denied on Scripts

Make the scripts executable:

```bash
chmod +x setup/*.sh
chmod +x activate.sh
```

---

## Platform-Specific Notes

### Ubuntu

- Full support for both PyBullet and Gazebo
- ROS 2 installed via apt packages
- Recommended platform

### macOS

- PyBullet works well
- Gazebo support is limited
- ROS 2 installed via Homebrew or binary

### Windows

- PyBullet works in GUI mode
- Gazebo not officially supported
- ROS 2 requires Windows-specific binaries
- Consider WSL2 for a full Linux experience

---

## Updating

To update the simulation framework:

```bash
cd simulation
git pull  # If using git

# Reinstall Python dependencies
source activate.sh
pip install --upgrade pybullet pyinstaller
```

---

## Uninstalling

### Remove Virtual Environment

```bash
rm -rf venv/
rm activate.sh
```

### Remove ROS 2 (Ubuntu)

```bash
sudo apt remove ros-humble-*  # or jazzy
sudo rm /etc/apt/sources.list.d/ros2.list
```

docs/protocol.md (new file, 184 lines)
# Communication Protocol (GPS-Denied)

Message formats for GPS-denied drone operation with camera.

## Drone Commands

```json
{
  "thrust": 0.5,
  "pitch": 0.1,
  "roll": -0.2,
  "yaw": 0.0
}
```

| Field | Range | Description |
|-------|-------|-------------|
| `thrust` | ±1.0 | Vertical thrust (positive = up) |
| `pitch` | ±0.5 | Forward/backward tilt |
| `roll` | ±0.5 | Left/right tilt |
| `yaw` | ±0.5 | Rotation |

---

## Drone Telemetry

Published on `/drone/telemetry`. **No GPS position available.**

```json
{
  "imu": {
    "orientation": {"roll": 0.0, "pitch": 0.0, "yaw": 0.0},
    "angular_velocity": {"x": 0.0, "y": 0.0, "z": 0.0},
    "linear_acceleration": {"x": 0.0, "y": 0.0, "z": 9.81}
  },
  "altimeter": {
    "altitude": 5.0,
    "vertical_velocity": -0.1
  },
  "velocity": {"x": 0.0, "y": 0.0, "z": -0.1},
  "landing_pad": {
    "relative_x": 0.5,
    "relative_y": -0.2,
    "distance": 4.5,
    "confidence": 0.85
  },
  "camera": {
    "width": 320,
    "height": 240,
    "fov": 60.0,
    "image": "<base64 encoded JPEG>"
  },
  "landed": false,
  "timestamp": 1.234
}
```

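On the ROS 2 side this telemetry arrives as a JSON string (a `String` message, per `docs/architecture.md`). A minimal parsing sketch with safe defaults for missing or null fields; the helper name is illustrative, not part of the framework:

```python
import json


def parse_telemetry(raw_json):
    """Parse one /drone/telemetry message (a JSON string) into handy pieces."""
    t = json.loads(raw_json)
    return {
        'attitude': t.get('imu', {}).get('orientation', {}),
        'altitude': t.get('altimeter', {}).get('altitude', 0.0),
        'vertical_velocity': t.get('altimeter', {}).get('vertical_velocity', 0.0),
        'velocity': t.get('velocity', {'x': 0.0, 'y': 0.0, 'z': 0.0}),
        'landing_pad': t.get('landing_pad'),                        # may be None
        'camera_image_b64': (t.get('camera') or {}).get('image'),   # may be None
        'landed': t.get('landed', False),
        'timestamp': t.get('timestamp', 0.0),
    }
```
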
---

## Sensor Details

### IMU

Always available.

| Field | Unit | Description |
|-------|------|-------------|
| `orientation.roll/pitch/yaw` | radians | Euler angles |
| `angular_velocity.x/y/z` | rad/s | Rotation rates |
| `linear_acceleration.x/y/z` | m/s² | Acceleration |

### Altimeter

Always available.

| Field | Unit | Description |
|-------|------|-------------|
| `altitude` | meters | Height above ground |
| `vertical_velocity` | m/s | Vertical speed |

### Velocity

Estimated from optical flow.

| Field | Unit | Description |
|-------|------|-------------|
| `x` | m/s | Forward velocity |
| `y` | m/s | Lateral velocity |
| `z` | m/s | Vertical velocity |

### Landing Pad Detection

**May be null if the pad is not visible!**

| Field | Unit | Description |
|-------|------|-------------|
| `relative_x` | meters | Forward/back offset (body frame) |
| `relative_y` | meters | Left/right offset (body frame) |
| `distance` | meters | Vertical distance to pad |
| `confidence` | 0-1 | Detection confidence |

### Camera

Always available.

| Field | Description |
|-------|-------------|
| `width` | Image width in pixels |
| `height` | Image height in pixels |
| `fov` | Horizontal field of view in degrees |
| `image` | Base64 encoded JPEG (or null) |

---

## Using the Camera Image

The camera provides a base64-encoded JPEG image of what the drone sees looking down.

### Decoding the Image (Python)

```python
import base64
import io

from PIL import Image


def decode_camera_image(telemetry):
    camera = telemetry.get('camera', {})
    image_b64 = camera.get('image')

    if image_b64 is None:
        return None

    # Decode base64 to bytes
    image_bytes = base64.b64decode(image_b64)

    # Load as PIL Image
    image = Image.open(io.BytesIO(image_bytes))

    return image
```

### Using with OpenCV

```python
import base64

import cv2
import numpy as np


def decode_camera_image_cv2(telemetry):
    camera = telemetry.get('camera', {})
    image_b64 = camera.get('image')

    if image_b64 is None:
        return None

    # Decode base64 to bytes
    image_bytes = base64.b64decode(image_b64)

    # Convert to numpy array
    nparr = np.frombuffer(image_bytes, np.uint8)

    # Decode JPEG
    image = cv2.imdecode(nparr, cv2.IMREAD_COLOR)

    return image
```

### Image Properties

- **Resolution**: 320 x 240 pixels
- **Format**: JPEG (quality 70)
- **FOV**: 60 degrees
- **Direction**: Downward-facing
- **Update Rate**: ~5 Hz (every 5th telemetry frame)

---

## Rover Telemetry

For internal use by the RoverController.

```json
{
  "position": {"x": 1.5, "y": 0.8, "z": 0.15},
  "velocity": {"x": 0.3, "y": 0.4, "z": 0.0},
  "pattern": "circular",
  "timestamp": 1.234
}
```

docs/pybullet.md (new file, 140 lines)
# PyBullet Simulation

Running the GPS-denied drone simulation with PyBullet.

## Quick Start

**Terminal 1 - Simulator:**

```bash
source activate.sh
python simulation_host.py
```

**Terminal 2 - ROS Bridge:**

```bash
source activate.sh
python ros_bridge.py
```

**Terminal 3 - Controllers:**

```bash
source activate.sh
python controllers.py --pattern circular --speed 0.3
```

## Remote Setup

Run the simulator on one machine and the controllers on another.

**Machine 1 (with display):**

```bash
python simulation_host.py
```

**Machine 2 (headless):**

```bash
source activate.sh
python ros_bridge.py --host <MACHINE_1_IP>
python controllers.py
```

## Simulation Parameters

| Parameter | Value |
|-----------|-------|
| Physics Rate | 240 Hz |
| Telemetry Rate | 24 Hz |
| Drone Mass | 1.0 kg |
| Gravity | -9.81 m/s² |

## GPS-Denied Sensors

The simulator provides:

| Sensor | Description |
|--------|-------------|
| IMU | Orientation (roll, pitch, yaw), angular velocity |
| Altimeter | Barometric altitude, vertical velocity |
| Velocity | Optical flow estimate (x, y, z) |
| Camera | 320x240 downward JPEG image |
| Landing Pad | Vision-based relative position (60° FOV, 10m range) |

## Camera System

PyBullet renders a camera image from the drone's perspective:

| Property | Value |
|----------|-------|
| Resolution | 320 x 240 |
| FOV | 60 degrees |
| Format | Base64 encoded JPEG |
| Update Rate | ~5 Hz |
| Direction | Downward-facing |

The image is included in telemetry as `camera.image`.

## World Setup

| Object | Position | Description |
|--------|----------|-------------|
| Ground | z = 0 | Infinite plane |
| Rover | (0, 0, 0.15) | 1m × 1m landing pad |
| Drone | (0, 0, 5) | Starting position |

## UDP Communication

| Port | Direction | Data |
|------|-----------|------|
| 5555 | Bridge → Sim | Commands (JSON) |
| 5556 | Sim → Bridge | Telemetry (JSON with camera) |

## ROS Bridge Options

```bash
python ros_bridge.py --help

Options:
  --host, -H   Simulator IP (default: 127.0.0.1)
  --port, -p   Simulator port (default: 5555)
```

## Building Executable

Create a standalone executable:

```bash
source activate.sh
python build_exe.py
```

Output: `dist/simulation_host` (or `.exe` on Windows)

## Troubleshooting

### "Cannot connect to X server"

PyBullet requires a display:

- Run on a machine with a monitor
- Use X11 forwarding: `ssh -X user@host`
- Use a virtual display: `xvfb-run python simulation_host.py`

### Drone flies erratically

Reduce the control gains:

```python
Kp = 0.3
Kd = 0.2
```

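Where exactly these gains live depends on your `drone_controller.py`. As a sketch of the kind of loop they usually sit in (the target value and sign conventions below are assumptions, not framework code):

```python
# Illustrative PD-style altitude loop where such gains would apply.
Kp = 0.3
Kd = 0.2


def altitude_thrust(target_altitude, altitude, vertical_velocity):
    """Proportional-derivative thrust toward a target altitude."""
    error = target_altitude - altitude
    thrust = Kp * error - Kd * vertical_velocity
    return max(-1.0, min(1.0, thrust))  # keep within the ±1.0 command range
```
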
### No telemetry received

1. Check that the simulator is running
2. Verify the firewall allows UDP 5555-5556
3. Check the IP address in `ros_bridge.py`

### Camera image not appearing

Ensure PIL/Pillow is installed:

```bash
pip install pillow
```

docs/rover_controller.md (new file, 100 lines)
# Rover Controller

The RoverController creates a moving landing pad target.

## Usage

The rover controller is automatically included when running `controllers.py`:

```bash
# Stationary rover (default)
python controllers.py

# Moving rover
python controllers.py --pattern circular --speed 0.3
```

### Options

| Option | Short | Default | Description |
|--------|-------|---------|-------------|
| `--pattern` | `-p` | stationary | Movement pattern |
| `--speed` | `-s` | 0.5 | Speed in m/s |
| `--amplitude` | `-a` | 2.0 | Amplitude in meters |

## Movement Patterns

### Stationary

```bash
python controllers.py --pattern stationary
```

The rover stays at the origin. Best for initial testing.

### Linear

```bash
python controllers.py --pattern linear --speed 0.3 --amplitude 2.0
```

Oscillates along the X-axis.

### Circular

```bash
python controllers.py --pattern circular --speed 0.5 --amplitude 2.0
```

Follows a circular path of radius `amplitude`.

### Random

```bash
python controllers.py --pattern random --speed 0.3 --amplitude 2.0
```

Moves to random positions, changing the target every 3 seconds.

### Square

```bash
python controllers.py --pattern square --speed 0.5 --amplitude 2.0
```

Square pattern with corners at `(±amplitude, ±amplitude)`.

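The patterns above can be described analytically. As a sketch of how such targets could be generated from `--speed` and `--amplitude` (not necessarily how `rover_controller.py` implements them):

```python
import math


def pattern_target(pattern, t, speed=0.5, amplitude=2.0):
    """Target (x, y) for the rover at time t, for a few of the documented patterns."""
    if pattern == 'stationary':
        return 0.0, 0.0
    if pattern == 'linear':
        # Oscillate along X; angular rate chosen so peak speed is roughly `speed`.
        omega = speed / amplitude
        return amplitude * math.sin(omega * t), 0.0
    if pattern == 'circular':
        # Circle of radius `amplitude`, traversed at tangential speed `speed`.
        omega = speed / amplitude
        return amplitude * math.cos(omega * t), amplitude * math.sin(omega * t)
    raise ValueError(f'unsupported pattern: {pattern}')
```
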
## Difficulty Levels

| Level | Pattern | Speed | Description |
|-------|---------|-------|-------------|
| Beginner | stationary | 0.0 | Static target |
| Easy | linear | 0.2 | Predictable 1D |
| Medium | circular | 0.3 | Smooth 2D |
| Hard | random | 0.3 | Unpredictable |
| Expert | square | 0.5 | Sharp turns |

## Progressive Testing

Start easy and increase the difficulty:

```bash
# Step 1: Static target
python controllers.py --pattern stationary

# Step 2: Slow linear motion
python controllers.py --pattern linear --speed 0.2

# Step 3: Slow circular motion
python controllers.py --pattern circular --speed 0.2

# Step 4: Faster circular
python controllers.py --pattern circular --speed 0.4

# Step 5: Random
python controllers.py --pattern random --speed 0.3
```

## Published Topics

| Topic | Type | Description |
|-------|------|-------------|
| `/rover/cmd_vel` | `Twist` | Velocity commands |
| `/rover/position` | `Point` | Current position |
| `/rover/telemetry` | `String` | Full state (JSON) |

## GPS-Denied Note

In GPS-denied mode, the drone cannot directly access the rover's position. Instead, it must detect the landing pad visually via the `landing_pad` sensor data.

The `/rover/telemetry` topic is used internally by the RoverController; the DroneController should rely primarily on the vision-based `landing_pad` detection in the drone telemetry.