Code reorganization and Drone Logic Update

This commit is contained in:
2026-01-05 02:38:46 +00:00
parent c5b208c91a
commit 27a70c4983
32 changed files with 1018 additions and 812 deletions

View File

@@ -1,98 +1,67 @@
# Architecture Overview
# Architecture
## Modes
### 1. Standalone (1 Terminal)
```bash
python standalone_simulation.py --pattern circular
```
## System Overview
```
┌────────────────────────────────────────┐
standalone_simulation.py
PyBullet + Controllers + Camera
└────────────────────────────────────────┘
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│     Gazebo      │◄───►│ ArduPilot SITL  │◄───►│   Controller    │
│    (Physics)    │JSON │  (Flight Ctrl)  │MAV  │  (Your Logic)   │
└─────────────────┘     └─────────────────┘     └─────────────────┘
         │                                               │
         ▼                                               ▼
~/ardupilot_gazebo                           src/drone_controller.py
```
### 2. Gazebo + ROS 2 (2 Terminals)
## Terminal Layout
**Terminal 1:**
```
     Terminal 1                   Terminal 2
┌───────────────────┐        ┌───────────────────┐
│  Gazebo + Bridge  │◄──────►│   run_gazebo.py   │
└───────────────────┘  ROS   └───────────────────┘
./scripts/run_ardupilot_sim.sh
Gazebo + ArduPilot Plugin
```
### 3. ArduPilot (2 Terminals)
**Terminal 2:**
```
     Terminal 1                      Terminal 2
┌───────────────────┐      ┌────────────────────────────┐
│  Gazebo +         │◄────►│ run_ardupilot_controller.sh│
│  ArduPilot Plugin │ JSON │   ┌──────────────────┐     │
└───────────────────┘      │   │  ArduPilot SITL  │     │
                           │   └─────────┬────────┘     │
                           │             │ MAVLink      │
                           │   ┌─────────▼────────┐     │
                           │   │ run_ardupilot.py │     │
                           │   └──────────────────┘     │
                           └────────────────────────────┘
./scripts/run_ardupilot_controller.sh
├── ArduPilot SITL (background)
└── run_ardupilot.py
    └── src/drone_controller.py
```
## Data Flow
### Standalone
```
Controller → PyBullet → Telemetry → Controller
```
### Gazebo
```
Controller → /cmd_vel → Gazebo → /odometry → Controller
```
### ArduPilot
```
Gazebo ◄─── JSON ───► SITL ◄─── MAVLink ───► Controller
Gazebo ◄─── JSON/UDP ───► SITL ◄─── MAVLink ───► Controller
   │                        │                       │
   │ Physics                │ Flight control        │ Your logic
   │ Sensors                │ EKF                   │ 3-phase mission
   │ Rendering              │ Stabilization         │ QR detection
   ▼                        ▼                       ▼
Display             Attitude/Position           Commands
```
## Key Files
| File | Purpose |
|------|---------|
| `drone_controller.py` | **Your landing algorithm (used in ALL modes)** |
| `run_ardupilot.py` | MAVLink interface for ArduPilot |
| `run_gazebo.py` | ROS 2 interface for Gazebo |
| `standalone_simulation.py` | PyBullet simulation engine |
| `config.py` | Shared configuration |
| `src/drone_controller.py` | 3-phase mission logic |
| `scripts/run_ardupilot.py` | MAVLink interface |
| `src/mavlink_bridge.py` | MAVLink utilities |
| `src/gazebo_bridge.py` | Gazebo ROS bridge |
| `config.py` | Configuration |
## GPS-Denied Sensors
## 3-Phase Mission
The controller receives this standardized telemetry structure in all modes:
```python
telemetry = {
    "altimeter": {
        "altitude": float,            # Meters
        "vertical_velocity": float    # m/s (positive = up)
    },
    "velocity": {                     # Body or Local frame
        "x": float,
        "y": float,
        "z": float
    },
    "imu": {
        "orientation": {
            "roll": float,
            "pitch": float,
            "yaw": float
        }
    },
    "landing_pad": {                  # If visible (None otherwise)
        "relative_x": float,
        "relative_y": float,
        "distance": float
    }
}
```
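As a purely illustrative sketch (the helper name and return shape are not part of the codebase), a controller can read this structure defensively, since `landing_pad` is `None` whenever the pad is out of view:

```python
# Illustrative only: defensive access to the standardized telemetry dict.
def read_telemetry(telemetry):
    altitude = telemetry["altimeter"]["altitude"]              # meters
    climb_rate = telemetry["altimeter"]["vertical_velocity"]   # m/s, positive = up
    yaw = telemetry["imu"]["orientation"]["yaw"]

    pad = telemetry.get("landing_pad")                          # None when not visible
    pad_offset = (pad["relative_x"], pad["relative_y"]) if pad else None
    return altitude, climb_rate, yaw, pad_offset
```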
```
┌────────┐   QR Found    ┌─────────┐    Timeout    ┌──────┐
│ SEARCH │──────────────►│ COMMAND │──────────────►│ LAND │
└────────┘               └─────────┘               └──────┘
    │                         │                        │
    ▼                         ▼                        ▼
Detect QR               Send to rover          Track & descend
```
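A minimal sketch of how these transitions could be driven (phase names follow the diagram; the class name and timeout value are assumptions, not taken from `src/drone_controller.py`):

```python
import time

SEARCH, COMMAND, LAND = "SEARCH", "COMMAND", "LAND"

class PhaseTracker:
    """Illustrative SEARCH -> COMMAND -> LAND state machine."""
    def __init__(self, command_timeout=10.0):    # assumed timeout, adjust as needed
        self.phase = SEARCH
        self.command_timeout = command_timeout
        self._command_started = None

    def update(self, qr_result):
        """qr_result is a dict when a QR code was detected, otherwise None."""
        if self.phase == SEARCH and qr_result is not None:
            self.phase = COMMAND                 # "QR Found" transition
            self._command_started = time.time()
        elif self.phase == COMMAND and time.time() - self._command_started > self.command_timeout:
            self.phase = LAND                    # "Timeout" transition
        return self.phase
```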

View File

@@ -1,38 +1,7 @@
# ArduPilot GPS-Denied Simulation
Realistic flight controller simulation with your drone logic.
# ArduPilot + Gazebo Simulation
## Quick Start (2 Terminals)
**Terminal 1 - Gazebo:**
```bash
./scripts/run_ardupilot_sim.sh runway
# Options: runway, warehouse, zephyr
```
**Terminal 2 - Controller + SITL:**
```bash
./scripts/run_ardupilot_controller.sh
```
## How It Works
The `run_ardupilot_controller.sh` script starts ArduPilot SITL in the background and connects your controller to it via MAVLink.
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│     Gazebo      │◄───►│ ArduPilot SITL  │◄───►│ run_ardupilot.py│
│    (Physics)    │JSON │     (Hidden)    │MAV  │  (Your Logic)   │
└─────────────────┘     └─────────────────┘     └─────────────────┘
                                                 drone_controller.py
```
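For orientation, the MAVLink half of this picture can be reproduced with pymavlink along the following lines (the UDP endpoint and the GUIDED/arm sequence are assumptions about a default SITL setup, not an excerpt from `run_ardupilot.py`):

```python
from pymavlink import mavutil

# Connect to SITL's MAVLink stream (14550 is the usual SITL default; adjust if yours differs).
master = mavutil.mavlink_connection("udp:127.0.0.1:14550")
master.wait_heartbeat()
print(f"Heartbeat from system {master.target_system}, component {master.target_component}")

# Roughly what the controller has to do before it can fly: switch to GUIDED, then arm.
master.set_mode("GUIDED")
master.arducopter_arm()
master.motors_armed_wait()
```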
## Manual Mode (Debugging)
If you need to debug with the MAVProxy console (3 Terminals):
**Terminal 1:**
```bash
./scripts/run_ardupilot_sim.sh runway
@@ -40,50 +9,64 @@ If you need to debug with MAVProxy console (3 Terminals):
**Terminal 2:**
```bash
sim_vehicle.py -v ArduCopter -f gazebo-iris --model JSON --console
./scripts/run_ardupilot_controller.sh
```
**Terminal 3:**
## Architecture
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│     Gazebo      │◄───►│ ArduPilot SITL  │◄───►│ run_ardupilot.py│
│    (Physics)    │JSON │  (Flight Ctrl)  │MAV  │  (Your Logic)   │
└─────────────────┘     └─────────────────┘     └─────────────────┘
```
## World Options
```bash
# Debug commands in MAVProxy:
./scripts/run_ardupilot_sim.sh runway # Default
./scripts/run_ardupilot_sim.sh warehouse # Indoor
./scripts/run_ardupilot_sim.sh zephyr # Fixed-wing
```
## GPS-Denied Mode
The simulation runs in GPS-denied mode by default.
For manual debugging with MAVProxy:
```bash
sim_vehicle.py -v ArduCopter -f gazebo-iris --model JSON --console
# In MAVProxy:
param set ARMING_CHECK 0
mode guided
mode stabilize
arm throttle force
```
## Installation
## Controller Options
```bash
./setup/install_ardupilot.sh
source ~/.bashrc
./scripts/run_ardupilot_controller.sh # Auto takeoff
./scripts/run_ardupilot_controller.sh --no-takeoff # Manual
./scripts/run_ardupilot_controller.sh -a 10 # 10m altitude
```
## Configuration
## Files
Your `drone_controller.py` receives telemetry and returns control inputs.
The simulation translates your inputs:
- `thrust` → Vertical velocity
- `pitch/roll` → Horizontal velocity
- `yaw` → Yaw rate
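A sketch of what that translation could look like with pymavlink, using a velocity-only `SET_POSITION_TARGET_LOCAL_NED` setpoint (the scale factors, frame, and function name are illustrative assumptions; the actual mapping lives in `scripts/run_ardupilot.py`):

```python
from pymavlink import mavutil

MAX_XY_VEL = 2.0    # m/s, assumed scaling for pitch/roll
MAX_Z_VEL = 1.0     # m/s, assumed scaling for thrust
MAX_YAW_RATE = 0.5  # rad/s, assumed scaling for yaw

def send_velocity_setpoint(master, thrust, pitch, roll, yaw):
    """Map normalized controller outputs to a GUIDED-mode velocity setpoint."""
    vx = pitch * MAX_XY_VEL             # forward/back
    vy = roll * MAX_XY_VEL              # left/right
    vz = -thrust * MAX_Z_VEL            # NED: positive z is down, so invert thrust
    yaw_rate = yaw * MAX_YAW_RATE
    # Ignore position, acceleration and absolute yaw; command velocity + yaw rate only.
    type_mask = 0b0000010111000111
    master.mav.set_position_target_local_ned_send(
        0, master.target_system, master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        type_mask,
        0, 0, 0,         # position (ignored)
        vx, vy, vz,      # velocity
        0, 0, 0,         # acceleration (ignored)
        0, yaw_rate)     # yaw (ignored), yaw rate
```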
| File | Purpose |
|------|---------|
| `scripts/run_ardupilot_sim.sh` | Gazebo + GPU detection |
| `scripts/run_ardupilot_controller.sh` | SITL + Controller |
| `scripts/run_ardupilot.py` | MAVLink interface |
| `src/drone_controller.py` | Your algorithm |
## Troubleshooting
### "SITL failed to start"
Check if `sim_vehicle.py` is in your PATH:
```bash
export PATH=$PATH:~/ardupilot/Tools/autotest
```
### Drone drift
ArduPilot in GUIDED mode requires good position estimation. Without GPS, it relies on optical flow or visual odometry (not yet implemented in the default setup), so the drone may drift when relying only on the IMU.
### "No JSON sensor message"
Ensure Gazebo (Terminal 1) is running before starting the controller.
Start Gazebo (Terminal 1) before the controller.
## Visualizing Camera
### Drone doesn't respond
Check mode is GUIDED in MAVProxy.
```bash
python camera_viewer.py --topic /drone/camera
```
(Requires bridging the topic when using the ROS 2 bridge.)
### Simulation laggy
Check GPU: `glxinfo | grep "OpenGL renderer"`

docs/blender_models.md Normal file
View File

@@ -0,0 +1,168 @@
# Blender Models in Gazebo
Import 3D models from Blender into the ARG simulation.
## Workflow
```
Blender (.blend) → Export COLLADA (.dae) → Gazebo Model → World
```
## Step 1: Create Model in Blender
1. Create your 3D model
2. Apply all transforms: `Ctrl+A` → All Transforms
3. Set origin to geometry center
## Step 2: Export from Blender
1. File → Export → COLLADA (.dae)
2. Settings:
- Selection Only (if needed)
- Include Armatures: OFF
- Include Animations: OFF
- Triangulate: ON
3. Save as `model.dae`
## Step 3: Create Gazebo Model
```
gazebo/models/my_model/
├── model.config
├── model.sdf
├── meshes/
│   └── model.dae
└── materials/
    └── textures/
        └── texture.png
```
### model.config
```xml
<?xml version="1.0"?>
<model>
<name>My Model</name>
<version>1.0</version>
<sdf version="1.9">model.sdf</sdf>
<description>Custom Blender model</description>
</model>
```
### model.sdf
```xml
<?xml version="1.0"?>
<sdf version="1.9">
<model name="my_model">
<static>true</static>
<link name="link">
<collision name="collision">
<geometry>
<mesh>
<uri>meshes/model.dae</uri>
<scale>1 1 1</scale>
</mesh>
</geometry>
</collision>
<visual name="visual">
<geometry>
<mesh>
<uri>meshes/model.dae</uri>
<scale>1 1 1</scale>
</mesh>
</geometry>
</visual>
</link>
</model>
</sdf>
```
## Step 4: Add to World
```xml
<include>
<uri>model://my_model</uri>
<name>my_model_instance</name>
<pose>5 3 0 0 0 0</pose>
</include>
```
## Step 5: Set Model Path
```bash
export GZ_SIM_RESOURCE_PATH=$PWD/gazebo/models:$GZ_SIM_RESOURCE_PATH
```
## Common Issues
### Model Not Found
```bash
export GZ_SIM_RESOURCE_PATH=/full/path/to/gazebo/models:$GZ_SIM_RESOURCE_PATH
```
### Scale Wrong
In Blender, check unit settings: Properties → Scene → Units
Adjust in SDF:
```xml
<scale>0.01 0.01 0.01</scale>
```
### Textures Not Showing
Put textures in `materials/textures/` and reference them in the DAE file.
Or add material in SDF:
```xml
<visual name="visual">
<geometry>
<mesh><uri>meshes/model.dae</uri></mesh>
</geometry>
<material>
<diffuse>0.8 0.2 0.2 1</diffuse>
</material>
</visual>
```
### Model Orientation Wrong
Both Blender and Gazebo use Z-up, so the orientation should normally match.
If the model appears rotated, fix it in Blender or compensate with a pose:
```xml
<pose>0 0 0 1.5708 0 0</pose>
```
## Simplified Collision
For complex meshes, use a simple collision shape:
```xml
<collision name="collision">
<geometry>
<box><size>2 2 3</size></box>
</geometry>
</collision>
<visual name="visual">
<geometry>
<mesh><uri>meshes/complex_model.dae</uri></mesh>
</geometry>
</visual>
```
## Template
Copy the template:
```bash
cp -r gazebo/models/custom_object gazebo/models/my_model
```
Then:
1. Edit `model.config` with your name
2. Edit `model.sdf` with your model name
3. Put your `model.dae` in `meshes/`
## Test Model
```bash
gz sim -v4 gazebo/worlds/custom_landing.sdf
```

View File

@@ -1,73 +1,79 @@
# DroneController Guide
Implement your GPS-denied landing algorithm.
## 3-Phase Mission
## Quick Start
```
SEARCH ──► COMMAND ──► LAND ──► COMPLETE
```
1. Edit `drone_controller.py`
2. Find `calculate_landing_maneuver()`
3. Implement your algorithm
4. Test with any simulation mode
| Phase | Action |
|-------|--------|
| SEARCH | Find QR code on rover |
| COMMAND | Send commands to rover |
| LAND | Land on rover |
## Function to Implement
## Your Code
Edit `src/drone_controller.py`:
### Search Phase
```python
def calculate_search_maneuver(self, telemetry):
    return (thrust, pitch, roll, yaw)

def detect_qr_code(self):
    return {'data': 'qr_content', 'position': {...}} or None
```
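A minimal sketch of the detection side using OpenCV's built-in QR detector (the `get_camera_frame` helper and the returned `position` keys are assumptions that follow the stub above, not the shipped implementation):

```python
import cv2

def detect_qr_code(self):
    frame = self.get_camera_frame()   # assumed helper returning a BGR image, or None
    if frame is None:
        return None
    data, points, _ = cv2.QRCodeDetector().detectAndDecode(frame)
    if not data or points is None:
        return None
    cx, cy = points.reshape(-1, 2).mean(axis=0)   # QR centre in pixel coordinates
    return {'data': data, 'position': {'x': float(cx), 'y': float(cy)}}
```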
### Command Phase
```python
def generate_rover_command(self, qr_data):
    return {'type': 'move', 'x': 0, 'y': 0}
```
### Land Phase
```python
def calculate_landing_maneuver(self, telemetry, rover_telemetry):
    # Your logic here
    return (thrust, pitch, roll, yaw)
```
## Sensors (GPS-Denied)
## Telemetry
```python
# Altitude
altitude = telemetry['altimeter']['altitude']
vertical_vel = telemetry['altimeter']['vertical_velocity']
# Velocity
vel_x = telemetry['velocity']['x']
vel_y = telemetry['velocity']['y']
# Landing Pad (may be None!)
landing_pad = telemetry.get('landing_pad')
if landing_pad:
    relative_x = landing_pad['relative_x']
    relative_y = landing_pad['relative_y']

telemetry = {
    "altimeter": {"altitude": 5.0, "vertical_velocity": -0.1},
    "velocity": {"x": 0, "y": 0, "z": 0},
    "imu": {"orientation": {"roll": 0, "pitch": 0, "yaw": 0}},
    "landing_pad": {"relative_x": 0.5, "relative_y": -0.2, "distance": 5.0}
}
```
## Control Output
| Value | Range | Effect |
|-------|-------|--------|
| thrust | ±1.0 | Up/down (positive = up) |
| thrust | ±1.0 | Vertical velocity |
| pitch | ±0.5 | Forward/back |
| roll | ±0.5 | Left/right |
| yaw | ±0.5 | Rotation |
Note: In ArduPilot mode, these are scaled to velocities:
- Thrust → Z velocity
- Pitch/Roll → X/Y velocity
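One way to keep outputs inside those ranges before handing them to the simulator (purely illustrative; the limits are the ones in the table above):

```python
def clamp(value, limit):
    return max(-limit, min(limit, value))

def clamp_controls(thrust, pitch, roll, yaw):
    """Clip controller outputs to the documented ranges."""
    return (clamp(thrust, 1.0), clamp(pitch, 0.5), clamp(roll, 0.5), clamp(yaw, 0.5))
```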
## Configuration
## Example Algorithm (PD Control)
Edit `config.py`:
```python
def calculate_landing_maneuver(self, telemetry, rover_telemetry):
    alt = telemetry.get('altimeter', {})
    altitude = alt.get('altitude', 5.0)
    vert_vel = alt.get('vertical_velocity', 0.0)
    target_alt = 0.0  # descend toward the pad; raise this for a staged descent
    # Altitude PD control
    thrust = 0.5 * (target_alt - altitude) - 0.3 * vert_vel
    # Horizontal control
    pad = telemetry.get('landing_pad')
    if pad:
        pitch = 0.3 * pad['relative_x']
        roll = 0.3 * pad['relative_y']
    else:
        # Hover
        pitch = 0
        roll = 0
    return (thrust, pitch, roll, 0.0)
CONTROLLER = {
    "Kp_z": 0.5,
    "Kd_z": 0.3,
    "Kp_xy": 0.3,
    "Kd_xy": 0.2,
    "rate": 50,
}
```
## Testing
```bash
./scripts/run_ardupilot_sim.sh runway
./scripts/run_ardupilot_controller.sh
```

View File

@@ -1,66 +0,0 @@
# Gazebo Simulation Guide
## Quick Start (2 Terminals)
**Terminal 1:**
```bash
ros2 launch gazebo/launch/drone_landing.launch.py
```
**Terminal 2:**
```bash
source activate.sh
python run_gazebo.py --pattern circular
```
## Camera Feed
```bash
python camera_viewer.py --topic /drone/camera
```
## Options
```bash
--pattern stationary, linear, circular, square, random
--speed Rover speed in m/s (default: 0.5)
--no-rover Disable rover movement
```
## Sensors
| Sensor | Source |
|--------|--------|
| IMU | Gazebo odometry |
| Altimeter | Z position |
| Camera | Camera sensor |
| Landing Pad | Relative position |
## ROS 2 Topics
| Topic | Direction |
|-------|-----------|
| `/cmd_vel` | Your commands → Drone |
| `/drone/telemetry` | Sensors → You |
| `/drone/camera` | Camera images |
## Headless Mode (WSL2)
```bash
ign gazebo -s gazebo/worlds/drone_landing.sdf
```
## Troubleshooting
**Drone falls:**
Check `run_gazebo.py` is running
**No camera:**
```bash
python camera_viewer.py --list # Find topics
```
**Model not found:**
```bash
export GZ_SIM_RESOURCE_PATH=$PWD/gazebo/models:$GZ_SIM_RESOURCE_PATH
```

docs/gazebo_worlds.md Normal file
View File

@@ -0,0 +1,167 @@
# Custom Gazebo Worlds
Create custom environments for the ARG simulation.
## Quick Start
1. Copy template world:
```bash
cp gazebo/worlds/custom_landing.sdf gazebo/worlds/my_world.sdf
```
2. Edit the world file
3. Run:
```bash
./scripts/run_ardupilot_sim.sh gazebo/worlds/my_world.sdf
```
## World Structure
```xml
<?xml version="1.0" ?>
<sdf version="1.9">
<world name="my_world">
<!-- Required plugins -->
<plugin filename="gz-sim-physics-system" name="gz::sim::systems::Physics"/>
<plugin filename="gz-sim-sensors-system" name="gz::sim::systems::Sensors"/>
<plugin filename="gz-sim-user-commands-system" name="gz::sim::systems::UserCommands"/>
<plugin filename="gz-sim-scene-broadcaster-system" name="gz::sim::systems::SceneBroadcaster"/>
<plugin filename="gz-sim-imu-system" name="gz::sim::systems::Imu"/>
<!-- Physics -->
<physics name="1ms" type="ode">
<max_step_size>0.001</max_step_size>
<real_time_factor>1.0</real_time_factor>
</physics>
<!-- Lighting -->
<light type="directional" name="sun">
<pose>0 0 10 0 0 0</pose>
<diffuse>0.8 0.8 0.8 1</diffuse>
</light>
<!-- Ground -->
<model name="ground">
<static>true</static>
<link name="link">
<collision name="collision">
<geometry><plane><normal>0 0 1</normal></plane></geometry>
</collision>
<visual name="visual">
<geometry><plane><normal>0 0 1</normal><size>100 100</size></plane></geometry>
</visual>
</link>
</model>
<!-- Your models here -->
<!-- ArduPilot drone (required) -->
<include>
<uri>model://iris_with_ardupilot</uri>
<name>iris</name>
<pose>0 0 0.195 0 0 0</pose>
</include>
</world>
</sdf>
```
## Adding Objects
### Basic Shapes
```xml
<model name="box">
<static>true</static>
<pose>5 0 0.5 0 0 0</pose>
<link name="link">
<collision name="collision">
<geometry><box><size>1 1 1</size></box></geometry>
</collision>
<visual name="visual">
<geometry><box><size>1 1 1</size></box></geometry>
<material>
<ambient>0.8 0.2 0.2 1</ambient>
</material>
</visual>
</link>
</model>
```
### Cylinder
```xml
<model name="cylinder">
<static>true</static>
<pose>0 5 1 0 0 0</pose>
<link name="link">
<collision name="collision">
<geometry><cylinder><radius>0.5</radius><length>2</length></cylinder></geometry>
</collision>
<visual name="visual">
<geometry><cylinder><radius>0.5</radius><length>2</length></cylinder></geometry>
</visual>
</link>
</model>
```
### Include Model
```xml
<include>
<uri>model://my_custom_model</uri>
<name>obstacle_1</name>
<pose>3 4 0 0 0 0.5</pose>
</include>
```
## Landing Pad with Rover
```xml
<model name="landing_pad">
<static>false</static>
<pose>0 0 0.05 0 0 0</pose>
<link name="base">
<inertial><mass>10</mass></inertial>
<collision name="collision">
<geometry><cylinder><radius>0.75</radius><length>0.1</length></cylinder></geometry>
</collision>
<visual name="visual">
<geometry><cylinder><radius>0.75</radius><length>0.1</length></cylinder></geometry>
<material><diffuse>0.2 0.8 0.2 1</diffuse></material>
</visual>
</link>
<plugin filename="gz-sim-velocity-control-system" name="gz::sim::systems::VelocityControl">
<topic>/landing_pad/cmd_vel</topic>
</plugin>
</model>
```
## Camera Sensor
Add this to the drone, or create a separate camera model:
```xml
<sensor name="camera" type="camera">
<pose>0 0 -0.1 0 1.5708 0</pose>
<always_on>true</always_on>
<update_rate>30</update_rate>
<camera>
<horizontal_fov>1.047</horizontal_fov>
<image>
<width>640</width>
<height>480</height>
</image>
</camera>
<topic>/drone/camera</topic>
</sensor>
```
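If the Gazebo topic is bridged into ROS 2 (as in the Gazebo mode docs), a minimal viewer sketch might look like the following. It assumes a bridged `sensor_msgs/Image` on `/drone/camera` and the `cv_bridge` package, and is not the project's `camera_viewer.py`:

```python
import cv2
import rclpy
from cv_bridge import CvBridge
from rclpy.node import Node
from sensor_msgs.msg import Image

class CameraViewer(Node):
    def __init__(self):
        super().__init__("camera_viewer")
        self.bridge = CvBridge()
        self.create_subscription(Image, "/drone/camera", self.on_image, 10)

    def on_image(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        cv2.imshow("drone camera", frame)
        cv2.waitKey(1)

if __name__ == "__main__":
    rclpy.init()
    rclpy.spin(CameraViewer())
```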
## Tips
- Use `<static>true</static>` for non-moving objects
- Pose format: `x y z roll pitch yaw`
- Angles are in radians
- Colors are RGBA (0-1 range)

View File

@@ -1,110 +1,62 @@
# Installation Guide
# Installation
## Quick Install
```bash
./setup/install_ubuntu.sh
source activate.sh
python standalone_simulation.py
```
## Scripts
| Platform | Command |
|----------|---------|
| Ubuntu/Debian | `./setup/install_ubuntu.sh` |
| ArduPilot SITL | `./setup/install_ardupilot.sh` |
| macOS | `./setup/install_macos.sh` |
| Windows | `.\setup\install_windows.ps1` |
## Platform Support
| Mode | Ubuntu | macOS | Windows |
|------|--------|-------|---------|
| Standalone | ✅ | ✅ | ✅ |
| Gazebo | ✅ | ❌ | WSL2 |
| ArduPilot | ✅ | ❌ | WSL2 |
---
## Ubuntu/Debian
```bash
./setup/install_ubuntu.sh
source activate.sh
```
Installs: ROS 2, Gazebo, PyBullet, OpenCV, pymavlink
---
## ArduPilot SITL
```bash
./setup/install_ardupilot.sh
source ~/.bashrc
```
Installs: ArduPilot SITL, ardupilot_gazebo, MAVProxy
## What Gets Installed
**Run:**
```bash
./scripts/run_ardupilot_sim.sh camera
```
| Component | Location |
|-----------|----------|
| ArduPilot SITL | `~/ardupilot` |
| ardupilot_gazebo | `~/ardupilot_gazebo` |
| Gazebo Harmonic | System |
| ROS 2 | System |
| MAVProxy | `~/.local/bin` |
---
## GPU Support
The simulation auto-detects GPU:
| Priority | GPU Type | Notes |
|----------|----------|-------|
| 1 | NVIDIA | Best performance |
| 2 | Intel integrated | Good for laptops |
| 3 | AMD | Good performance |
| 4 | Software (llvmpipe) | Slow fallback |
Check your GPU:
```bash
glxinfo | grep "OpenGL renderer"
```
---
## Manual Install
## Dependencies
```bash
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
python standalone_simulation.py
```
---
- pybullet
- numpy
- pillow
- opencv-python
- pymavlink
- pexpect
## Verify Installation
```bash
sim_vehicle.py --help
gz sim --help
```
## Troubleshooting
### Simulation is laggy
```bash
# Check GPU (should NOT show "llvmpipe")
glxinfo | grep "OpenGL renderer"
# Install GPU drivers
sudo apt install mesa-utils # Intel/AMD
sudo apt install nvidia-driver-535 # NVIDIA
```
### MAVProxy not found
```bash
pip3 install --user mavproxy
export PATH=$PATH:~/.local/bin
```
### sim_vehicle.py not found
```bash
export PATH=$PATH:~/ardupilot/Tools/autotest
```
### mavproxy.py not found
```bash
export PATH=$PATH:~/.local/bin
```
### pexpect error
```bash
pip install pexpect
```
### Gazebo slow
```bash
glxinfo | grep "OpenGL renderer"
```
Should show GPU, not "llvmpipe".

View File

@@ -1,71 +0,0 @@
# Communication Protocol
Message formats for drone operation.
## Commands
```json
{
"thrust": 0.5,
"pitch": 0.1,
"roll": -0.2,
"yaw": 0.0
}
```
| Field | Range | Effect |
|-------|-------|--------|
| thrust | ±1.0 | Up/down |
| pitch | ±0.5 | Forward/back |
| roll | ±0.5 | Left/right |
| yaw | ±0.5 | Rotation |
## Telemetry
```json
{
"imu": {
"orientation": {"roll": 0.0, "pitch": 0.0, "yaw": 0.0},
"angular_velocity": {"x": 0.0, "y": 0.0, "z": 0.0}
},
"altimeter": {
"altitude": 5.0,
"vertical_velocity": -0.1
},
"velocity": {"x": 0.0, "y": 0.0, "z": -0.1},
"landing_pad": {
"relative_x": 0.5,
"relative_y": -0.2,
"distance": 4.5,
"confidence": 0.85
},
"camera": {
"width": 320,
"height": 240,
"image": "<base64 JPEG>"
}
}
```
## Sensors
| Sensor | Fields |
|--------|--------|
| IMU | orientation (roll, pitch, yaw), angular_velocity |
| Altimeter | altitude, vertical_velocity |
| Velocity | x, y, z (m/s) |
| Landing Pad | relative_x, relative_y, distance, confidence |
| Camera | Base64 JPEG image |
## Decoding Camera
```python
import base64
import cv2
import numpy as np
image_b64 = telemetry['camera']['image']
image_bytes = base64.b64decode(image_b64)
nparr = np.frombuffer(image_bytes, np.uint8)
image = cv2.imdecode(nparr, cv2.IMREAD_COLOR)
```

View File

@@ -1,68 +0,0 @@
# PyBullet Simulation Guide
## Standalone Mode (1 Terminal)
No ROS 2 required. Works on Windows, macOS, Linux:
```bash
source activate.sh
python standalone_simulation.py --pattern circular
```
## ROS 2 Mode (2 Terminals)
**Terminal 1 - Simulator:**
```bash
python simulation_host.py
```
**Terminal 2 - Controllers:**
```bash
python run_bridge.py --pattern circular
```
## Options
```bash
--pattern, -p stationary, linear, circular, square, random
--speed, -s Rover speed in m/s (default: 0.5)
--amplitude, -a Movement radius (default: 2.0)
```
## Remote Setup
**Machine 1:** `python simulation_host.py`
**Machine 2:** `python run_bridge.py --host <IP>`
## Sensors
| Sensor | Description |
|--------|-------------|
| IMU | Orientation, angular velocity |
| Altimeter | Altitude, vertical velocity |
| Velocity | Estimated velocity (x, y, z) |
| Camera | 320x240 downward JPEG |
| Landing Pad | Relative position |
## Configuration
Edit `config.py`:
```python
CONTROLLER = {
    "Kp_z": 0.5,
    "Kd_z": 0.3,
    "Kp_xy": 0.3,
    "Kd_xy": 0.2,
}
```
## Troubleshooting
**"Cannot connect to X server":**
```bash
xvfb-run python standalone_simulation.py
```
**Drone flies erratically:**
Reduce gains in `config.py`

View File

@@ -1,61 +0,0 @@
# Rover Controller
The RoverController creates a moving landing pad target.
## Usage
```bash
# Stationary (default)
python standalone_simulation.py --pattern stationary
# Moving
python standalone_simulation.py --pattern circular --speed 0.3
```
## Options
| Option | Default | Description |
|--------|---------|-------------|
| `--pattern, -p` | stationary | Movement pattern |
| `--speed, -s` | 0.5 | Speed in m/s |
| `--amplitude, -a` | 2.0 | Radius in meters |
## Patterns
| Pattern | Description |
|---------|-------------|
| stationary | Stays at origin |
| linear | Oscillates along X-axis |
| circular | Circular path |
| square | Square with sharp turns |
| random | Random positions |
## Difficulty Levels
| Level | Pattern | Speed |
|-------|---------|-------|
| Beginner | stationary | 0.0 |
| Easy | linear | 0.2 |
| Medium | circular | 0.3 |
| Hard | random | 0.3 |
| Expert | square | 0.5 |
## Progressive Testing
```bash
# 1. Static target
python standalone_simulation.py --pattern stationary
# 2. Slow circular
python standalone_simulation.py --pattern circular --speed 0.2
# 3. Faster circular
python standalone_simulation.py --pattern circular --speed 0.4
# 4. Random
python standalone_simulation.py --pattern random --speed 0.3
```
## Note
The drone cannot access rover position directly (GPS-denied). It must detect the landing pad visually via the camera.