Scripts Update

This commit is contained in:
2026-02-09 04:52:32 +00:00
parent 2d3b795d82
commit 79f748d35d
10 changed files with 861 additions and 1376 deletions

# GPS-Denied Navigation
## Overview
This project implements **GPS-denied navigation** for UAVs and UGVs. Instead of relying on GPS for position estimation and waypoint navigation, the system uses vision-based sensors and local coordinate frames.
## Why GPS-Denied?
GPS-denied navigation is critical for scenarios where GPS is:

1. **Unavailable**: Indoor environments, underground, tunnels
2. **Unreliable**: Urban canyons with multipath errors
3. **Jammed/Spoofed**: Electronic warfare, contested environments
4. **Degraded**: Under bridges, heavy foliage, near tall structures

## Principle
All navigation uses relative positioning from visual sensors. GPS is only used for geofencing (safety boundaries).

| Function            | GPS Used?              |
|---------------------|------------------------|
| Position estimation | No - visual odometry   |
| Waypoint navigation | No - local coordinates |
| Velocity control    | No - optical flow      |
| Geofencing          | Yes - safety only      |
## Position Estimation
### Visual Odometry
1. Detect features in camera image (ORB, SIFT)
2. Match features between consecutive frames
3. Estimate camera motion from feature displacement
4. Accumulate motion into position estimate
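Steps 3–4 produce a frame-to-frame rotation and translation that must be chained into a running pose. A minimal numpy sketch of that accumulation (monocular translation is only known up to scale, and the function names here are illustrative, not the project's API):

```python
import numpy as np

def yaw_rotation(yaw):
    """Rotation matrix for a yaw angle about the z (down) axis."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def accumulate(pose, rel_R, rel_t):
    """Step 4: chain one frame-to-frame motion onto the running pose.

    pose = (R, t) in the odometry frame; rel_R, rel_t come from the
    essential-matrix decomposition (rel_t needs a scale from another sensor).
    """
    R, t = pose
    return R @ rel_R, t + R @ rel_t

# Two 1 m forward steps, with a 90° left turn reported after the first
pose = (np.eye(3), np.zeros(3))
pose = accumulate(pose, yaw_rotation(np.pi / 2), np.array([1.0, 0.0, 0.0]))
pose = accumulate(pose, np.eye(3), np.array([1.0, 0.0, 0.0]))
```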
### Optical Flow
1. Capture ground images from downward camera
2. Measure pixel displacement between frames
3. Convert to velocity using altitude
4. Integrate for position
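Under a pinhole model over flat ground, step 3 reduces to scaling pixel displacement by altitude over focal length. A sketch under that assumption (the `flow_to_velocity` helper and the numbers are illustrative):

```python
def flow_to_velocity(dx_px, dy_px, altitude_m, focal_px, dt):
    """Step 3: convert mean flow displacement (pixels) to ground velocity.

    Pinhole model over flat ground: ground displacement is approximately
    pixel displacement * altitude / focal length (in pixels).
    """
    scale = altitude_m / focal_px
    return dx_px * scale / dt, dy_px * scale / dt

# 12 px shift between frames (30 Hz) at 3 m altitude, 600 px focal length
vx, vy = flow_to_velocity(12.0, 0.0, 3.0, 600.0, dt=1.0 / 30.0)
# vx = 12 * 3 / 600 * 30 = 1.8 m/s
```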
### Sensor Fusion
Extended Kalman Filter combines:
- Visual odometry (position)
- Optical flow (velocity)
- IMU (acceleration, rotation)
- Barometer (altitude)
Output: Full 6-DOF pose estimate
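Per axis, the fusion step behaves like inverse-variance weighting of independent estimates. A one-dimensional sketch of that core idea (illustrative only; the real EKF does this jointly across the full state):

```python
def fuse(measurements):
    """Inverse-variance fusion of independent estimates of one quantity.

    measurements: list of (value, variance) pairs. This is the scalar
    core of what a Kalman update does; the filter repeats it per axis.
    """
    weights = [1.0 / var for _, var in measurements]
    fused = sum(v * w for (v, _), w in zip(measurements, weights)) / sum(weights)
    return fused, 1.0 / sum(weights)

# Visual odometry says x = 10.2 m (σ 0.05 m); integrated IMU says 9.8 m (σ 0.2 m)
x, var = fuse([(10.2, 0.05**2), (9.8, 0.2**2)])
# The fused estimate lands close to the more precise visual-odometry value
```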
## Navigation Architecture

```
┌─────────────────────────────────────────────────┐
│                 VISION SENSORS                  │
│  • Forward Camera (640x480, 30Hz)               │
│  • Downward Camera (320x240, 30Hz)              │
│  • Optional: Depth Camera, LiDAR                │
└───────────────────┬─────────────────────────────┘
                    ▼
┌─────────────────────────────────────────────────┐
│             VISUAL ODOMETRY MODULE              │
│  • Feature detection (ORB/SIFT/SURF)            │
│  • Feature matching between frames              │
│  • Essential matrix estimation                  │
│  • Relative pose computation                    │
└───────────────────┬─────────────────────────────┘
                    ▼
┌─────────────────────────────────────────────────┐
│               OPTICAL FLOW MODULE               │
│  • Lucas-Kanade optical flow                    │
│  • Velocity estimation from flow vectors        │
│  • Height compensation using rangefinder        │
└───────────────────┬─────────────────────────────┘
                    ▼
┌─────────────────────────────────────────────────┐
│            POSITION ESTIMATOR (EKF)             │
│  Fuses:                                         │
│  • Visual odometry (weight: 0.6)                │
│  • Optical flow (weight: 0.3)                   │
│  • IMU integration (weight: 0.1)                │
│  Output: Local position/velocity estimate       │
└───────────────────┬─────────────────────────────┘
                    ▼
┌─────────────────────────────────────────────────┐
│             NAVIGATION CONTROLLER               │
│  • Waypoints in LOCAL frame (x, y, z meters)    │
│  • Path planning using relative coordinates     │
│  • Obstacle avoidance using depth/camera        │
└─────────────────────────────────────────────────┘
```

## ArduPilot Configuration
Key parameters for GPS-denied operation:

```
# EKF Source Configuration
EK3_SRC1_POSXY = 6   # External Nav for position
EK3_SRC1_VELXY = 6   # External Nav for velocity
EK3_SRC1_POSZ = 1    # Barometer for altitude

# Disable GPS for navigation
GPS_TYPE = 0         # No GPS (or keep for geofence)

# Enable external navigation
VISO_TYPE = 1        # Enable visual odometry input

# Arming checks
ARMING_CHECK = 0     # Disable pre-arm checks (for testing)
```
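A parameter listing in this style can be loaded and sanity-checked programmatically. A minimal sketch of a `.parm`-style parser (the `parse_params` helper is hypothetical, not part of the project):

```python
def parse_params(text):
    """Parse an ArduPilot .parm-style listing into a {NAME: value} dict.

    Accepts 'NAME = VALUE  # comment' or 'NAME,VALUE' lines;
    '#' starts a comment.
    """
    params = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        if not line:
            continue
        name, _, value = line.replace(",", "=").partition("=")
        params[name.strip()] = float(value)
    return params

params = parse_params("""
EK3_SRC1_POSXY = 6   # External Nav for position
EK3_SRC1_VELXY = 6   # External Nav for velocity
EK3_SRC1_POSZ  = 1   # Barometer for altitude
VISO_TYPE,1
""")
```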
See `config/ardupilot_gps_denied.parm` for the complete parameter list.

## Coordinate Frames
### LOCAL_NED Frame
- **Origin**: Vehicle starting position
- **X**: North (forward)
- **Y**: East (right)
- **Z**: Down
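Vectors measured in the vehicle's body frame (defined below) are rotated into LOCAL_NED using the current yaw; the z (down) axis is shared. A sketch of that rotation (the `body_to_ned` helper is illustrative):

```python
import math

def body_to_ned(x_body, y_body, yaw):
    """Rotate a body-frame (forward, right) vector into LOCAL_NED (north, east).

    yaw is the heading in radians (0 = nose pointing north); the z (down)
    component is identical in both frames.
    """
    north = x_body * math.cos(yaw) - y_body * math.sin(yaw)
    east = x_body * math.sin(yaw) + y_body * math.cos(yaw)
    return north, east

# Facing east (yaw = 90°) and moving 2 m/s forward: motion is due east
north, east = body_to_ned(2.0, 0.0, math.pi / 2)
```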
### Body Frame
- **X**: Forward
- **Y**: Right
- **Z**: Down

## Sending Position to ArduPilot
Visual odometry sends position estimates to ArduPilot via MAVLink:
```python
from pymavlink import mavutil

# Connect to the flight controller (connection string is site-specific)
master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
master.wait_heartbeat()

# VISION_POSITION_ESTIMATE message
msg = mavutil.mavlink.MAVLink_vision_position_estimate_message(
    usec=timestamp_us,  # synchronized timestamp, microseconds
    x=position_x,       # meters, NED frame
    y=position_y,
    z=position_z,
    roll=roll,          # radians
    pitch=pitch,
    yaw=yaw,
)
master.mav.send(msg)
```
## Geofencing
While navigation is GPS-denied, GPS is still used for **geofencing** (safety boundaries):

```yaml
geofence:
  enabled: true
  use_gps: true            # GPS used ONLY for the geofence check
  fence_type: polygon
  polygon_points:
    - {lat: 47.397742, lon: 8.545594}
    - {lat: 47.398242, lon: 8.545594}
    - {lat: 47.398242, lon: 8.546094}
    - {lat: 47.397742, lon: 8.546094}
  max_altitude: 50         # meters
  action_on_breach: "RTL"  # Return to LOCAL origin
```
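A breach check against a polygon fence like this can be sketched with a standard ray-casting test (illustrative only; `point_in_polygon` is not the project's API):

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting point-in-polygon test over (lat, lon) vertex pairs.

    Adequate for a small local fence; production geofencing should also
    handle geodesic effects and points exactly on the boundary.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        if (lon1 > lon) != (lon2 > lon):        # edge crosses the ray
            t = (lon - lon1) / (lon2 - lon1)
            if lat < lat1 + t * (lat2 - lat1):  # crossing lies north of point
                inside = not inside
    return inside

# The polygon from the geofence config above
fence = [(47.397742, 8.545594), (47.398242, 8.545594),
         (47.398242, 8.546094), (47.397742, 8.546094)]
```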
If the vehicle crosses the fence boundary, return-to-launch (RTL) is triggered. RTL returns to the LOCAL origin, not a GPS home position.

## Coordinate System
All waypoints use local NED coordinates:
- X: North (meters from origin)
- Y: East (meters from origin)
- Z: Down (negative values are altitudes above the origin)

## Example: Relative Waypoint Mission
```python
# All waypoints are in LOCAL frame (meters from origin)
waypoints = [
    {"x": 0,  "y": 0,  "z": -5},  # Takeoff to 5 m (z is negative up in NED)
    {"x": 10, "y": 0,  "z": -5},  # 10 m north
    {"x": 10, "y": 10, "z": -5},  # then 10 m east
    {"x": 0,  "y": 0,  "z": -5},  # Return above the origin
    {"x": 0,  "y": 0,  "z": 0},   # Land
]
```
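Mission progress over such a list is typically driven by a distance-based acceptance check. A minimal sketch (the `reached`/`next_index` helpers and the 0.5 m tolerance are assumptions, not the project's API):

```python
import math

def reached(pos, wp, tolerance=0.5):
    """True when the vehicle is within `tolerance` meters of the waypoint."""
    return math.dist((pos["x"], pos["y"], pos["z"]),
                     (wp["x"], wp["y"], wp["z"])) <= tolerance

def next_index(pos, waypoints, index):
    """Advance the mission index once the current waypoint is reached."""
    if index < len(waypoints) and reached(pos, waypoints[index]):
        return index + 1
    return index
```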
## Sensor Fusion Details
### Extended Kalman Filter State
```
State vector x = [px, py, pz, vx, vy, vz, ax, ay, az]
Where:
- px, py, pz = position (meters)
- vx, vy, vz = velocity (m/s)
- ax, ay, az = acceleration (m/s²)
```
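A sketch of the predict step for this state vector under a constant-acceleration model (illustrative only; ArduPilot's EKF3 uses a different state and the `Q`/`dt` values here are arbitrary):

```python
import numpy as np

def predict(x, P, Q, dt):
    """EKF predict step for the 9-state constant-acceleration model above.

    x = [px, py, pz, vx, vy, vz, ax, ay, az]; P is the 9x9 state
    covariance and Q the process noise.
    """
    I3, Z3 = np.eye(3), np.zeros((3, 3))
    F = np.block([
        [I3, dt * I3, 0.5 * dt**2 * I3],  # p' = p + v*dt + a*dt²/2
        [Z3, I3,      dt * I3],           # v' = v + a*dt
        [Z3, Z3,      I3],                # a' = a (constant-accel model)
    ])
    return F @ x, F @ P @ F.T + Q

# 1 m/s north, 2 m/s² down, propagated over 0.5 s
x0 = np.array([0, 0, 0, 1.0, 0, 0, 0, 0, 2.0])
x1, P1 = predict(x0, np.eye(9), 0.01 * np.eye(9), dt=0.5)
```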
### Measurement Sources
| Source          | Measures     | Update Rate | Noise    |
|-----------------|--------------|-------------|----------|
| Visual Odometry | Position     | 30 Hz       | 0.05 m   |
| Optical Flow    | Velocity     | 60 Hz       | 0.1 m/s  |
| IMU             | Acceleration | 200 Hz      | 0.2 m/s² |
| Rangefinder     | Altitude     | 30 Hz       | 0.02 m   |
## Drift Mitigation
GPS-denied navigation accumulates drift over time. Mitigation strategies:
1. **Visual Landmarks**: Detect and track known markers (ArUco)
2. **Loop Closure**: Recognize previously visited locations
3. **Multi-Sensor Fusion**: Weight sensors by confidence to reduce individual drift
4. **Periodic Reset**: Return to a known position (or use ground-truth resets in simulation)
## Configuration Parameters
### Visual Odometry
```yaml
visual_odometry:
  method: "ORB"          # or "SIFT", "SURF"
  min_features: 100
  max_features: 500
  feature_quality: 0.01
```
### Optical Flow
```yaml
optical_flow:
  method: "Lucas-Kanade"
  window_size: 15
  max_level: 3
  min_altitude: 0.3      # meters
```
### Position Estimator
```yaml
position_estimator:
  fusion_method: "EKF"
  weights:
    visual_odometry: 0.6
    optical_flow: 0.3
    imu: 0.1
```
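Read literally, the `weights` block suggests a weighted blend of per-sensor estimates. A sketch under that reading (illustrative; for a true EKF these weights are only a rough analogue of measurement-noise tuning):

```python
def fuse_position(estimates, weights):
    """Weighted blend of per-sensor (x, y, z) position estimates.

    Weights are renormalized over whichever sensors actually reported
    during this cycle, so a missing sensor does not bias the result.
    """
    total = sum(weights[name] for name in estimates)
    return tuple(
        sum(weights[name] * est[i] for name, est in estimates.items()) / total
        for i in range(3)
    )

weights = {"visual_odometry": 0.6, "optical_flow": 0.3, "imu": 0.1}
# IMU produced no estimate this cycle; blend the remaining two sensors
fused = fuse_position(
    {"visual_odometry": (10.0, 0.0, -5.0), "optical_flow": (10.4, 0.0, -5.0)},
    weights,
)
```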
## Testing GPS-Denied Navigation
### Indoor Warehouse World
```bash
bash scripts/run_simulation.sh worlds/indoor_warehouse.world
```
### Urban Canyon World
```bash
bash scripts/run_simulation.sh worlds/urban_canyon.world
```
## Troubleshooting
### High Position Drift
- Check camera exposure/focus
- Ensure sufficient visual features in environment
- Increase feature count in visual odometry
- Add visual markers to environment
### Vision Loss
- The failsafe handler triggers after 5 seconds of no visual odometry
- Action: HOLD (hover in place) by default
- Configure with `action_on_vision_loss` parameter
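The timeout behavior described above can be modeled as a small watchdog. A sketch (the class, the injectable clock, and the `HOLD` string are illustrative, not the project's actual failsafe handler):

```python
import time

class VisionLossFailsafe:
    """Trigger a failsafe action after `timeout` seconds without vision updates.

    A clock callable can be injected for deterministic testing.
    """

    def __init__(self, timeout=5.0, action="HOLD", clock=time.monotonic):
        self.timeout = timeout
        self.action = action
        self._clock = clock
        self._last_update = clock()

    def on_vision_update(self):
        """Call whenever a visual-odometry estimate arrives."""
        self._last_update = self._clock()

    def check(self):
        """Return the failsafe action if vision is stale, else None."""
        if self._clock() - self._last_update > self.timeout:
            return self.action
        return None
```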
### Geofence Breach
- Vehicle returns to LOCAL origin (not GPS home)
- RTL uses the same local coordinate system
## Limitations
- Drift accumulates over distance/time
- Requires visual features (fails in featureless environments)
- Requires sufficient lighting
- Performance degrades with fast motion or blur