# GPS-Denied Navigation

## Overview

This project implements **GPS-denied navigation** for UAVs and UGVs. Instead of relying on GPS for position estimation and waypoint navigation, the system uses vision-based sensors and local coordinate frames.

## Why GPS-Denied?

GPS-denied navigation is critical for scenarios where GPS is:

1. **Unavailable**: Indoor environments, underground, tunnels
2. **Unreliable**: Urban canyons with multipath errors
3. **Jammed/Spoofed**: Electronic warfare, contested environments
4. **Degraded**: Under bridges, heavy foliage, near tall structures

## Navigation Architecture

```
┌─────────────────────────────────────────────────┐
│                 VISION SENSORS                  │
│ • Forward Camera (640x480, 30Hz)                │
│ • Downward Camera (320x240, 30Hz)               │
│ • Optional: Depth Camera, LiDAR                 │
└───────────────────┬─────────────────────────────┘
                    │
                    ▼
┌─────────────────────────────────────────────────┐
│             VISUAL ODOMETRY MODULE              │
│ • Feature detection (ORB/SIFT/SURF)             │
│ • Feature matching between frames               │
│ • Essential matrix estimation                   │
│ • Relative pose computation                     │
└───────────────────┬─────────────────────────────┘
                    │
                    ▼
┌─────────────────────────────────────────────────┐
│               OPTICAL FLOW MODULE               │
│ • Lucas-Kanade optical flow                     │
│ • Velocity estimation from flow vectors         │
│ • Height compensation using rangefinder         │
└───────────────────┬─────────────────────────────┘
                    │
                    ▼
┌─────────────────────────────────────────────────┐
│            POSITION ESTIMATOR (EKF)             │
│ Fuses:                                          │
│ • Visual odometry (weight: 0.6)                 │
│ • Optical flow (weight: 0.3)                    │
│ • IMU integration (weight: 0.1)                 │
│ Output: Local position/velocity estimate        │
└───────────────────┬─────────────────────────────┘
                    │
                    ▼
┌─────────────────────────────────────────────────┐
│              NAVIGATION CONTROLLER              │
│ • Waypoints in LOCAL frame (x, y, z meters)     │
│ • Path planning using relative coordinates      │
│ • Obstacle avoidance using depth/camera         │
└─────────────────────────────────────────────────┘
```
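
The velocity estimate in the optical flow module depends on the rangefinder reading because image-plane flow only becomes metric once scaled by the distance to the ground. A minimal sketch of that scaling under a pinhole-camera model (the frame rate and focal-length values below are illustrative, not taken from this project):

```python
def flow_to_velocity(flow_px_per_frame: float, frame_rate_hz: float,
                     height_m: float, focal_length_px: float) -> float:
    """Convert image-plane optical flow to ground-plane velocity.

    For a downward-facing camera over flat ground, a feature that moves
    `flow_px_per_frame` pixels between frames corresponds to a metric
    velocity scaled by height / focal length (pinhole camera model).
    """
    flow_px_per_s = flow_px_per_frame * frame_rate_hz
    return flow_px_per_s * height_m / focal_length_px

# Example: 10 px/frame at 30 Hz, 1.6 m altitude, 320 px focal length
v = flow_to_velocity(10.0, 30.0, 1.6, 320.0)  # 1.5 m/s
```

This is also why the configuration below enforces a minimum altitude for optical flow: near the ground, small height errors dominate the scale factor.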

## Coordinate Frames

### LOCAL_NED Frame

- **Origin**: Vehicle starting position
- **X**: North (forward)
- **Y**: East (right)
- **Z**: Down

### Body Frame

- **X**: Forward
- **Y**: Right
- **Z**: Down
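
Converting between these frames is a rotation by the vehicle's attitude. A minimal sketch for the common level-flight case, where only yaw matters (the function name and the yaw-only simplification are mine, not this project's API):

```python
import math

def body_to_local_ned(vx_body: float, vy_body: float, vz_body: float,
                      yaw_rad: float) -> tuple:
    """Rotate a body-frame vector into the LOCAL_NED frame.

    Both frames share the down axis here; only yaw is applied, which
    is the usual simplification for a level vehicle.
    """
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    vn = c * vx_body - s * vy_body  # North component
    ve = s * vx_body + c * vy_body  # East component
    return vn, ve, vz_body          # Down is unchanged

# Facing east (yaw = 90°), body-frame "forward" becomes NED "east":
vn, ve, vd = body_to_local_ned(1.0, 0.0, 0.0, math.pi / 2)
```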

## GPS Usage: Geofencing Only

While navigation is GPS-denied, GPS is still used for **geofencing** (safety boundaries):

```yaml
geofence:
  enabled: true
  use_gps: true  # GPS ONLY for geofence check

  polygon_points:
    - {lat: 47.397742, lon: 8.545594}
    - {lat: 47.398242, lon: 8.545594}
    - {lat: 47.398242, lon: 8.546094}
    - {lat: 47.397742, lon: 8.546094}

  action_on_breach: "RTL"  # Return to LOCAL origin
```
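
A breach check against such a polygon can be done with a standard ray-casting point-in-polygon test. A self-contained sketch (not this project's actual geofence code), adequate for fences small enough to treat latitude/longitude as planar:

```python
def inside_geofence(lat: float, lon: float, polygon: list) -> bool:
    """Ray-casting point-in-polygon test against geofence vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Does a ray cast along the latitude axis cross edge i?
        if (lon1 > lon) != (lon2 > lon):
            crossing_lat = lat1 + (lon - lon1) * (lat2 - lat1) / (lon2 - lon1)
            if lat < crossing_lat:
                inside = not inside
    return inside

fence = [(47.397742, 8.545594), (47.398242, 8.545594),
         (47.398242, 8.546094), (47.397742, 8.546094)]
inside_geofence(47.398000, 8.545800, fence)  # → True (inside)
inside_geofence(47.399000, 8.545800, fence)  # → False (outside)
```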

## Example: Relative Waypoint Mission

All waypoints are expressed in the LOCAL frame, in meters from the origin. Note that `z` here is given as height above the origin; in the LOCAL_NED convention described above, where Z points down, the corresponding down-axis coordinate is the negative of these values.

```python
# All waypoints are in the LOCAL frame (meters from origin, z = height)
waypoints = [
    {"x": 0,  "y": 0,  "z": 5},   # Take off to 5 m
    {"x": 10, "y": 0,  "z": 5},   # 10 m forward (north)
    {"x": 10, "y": 10, "z": 5},   # 10 m right (east)
    {"x": 0,  "y": 0,  "z": 5},   # Return above the origin
    {"x": 0,  "y": 0,  "z": 0},   # Land
]
```
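
Executing such a mission needs a criterion for when a waypoint counts as reached. A minimal sketch using a Euclidean acceptance radius (the function name and the 0.5 m default are illustrative, not project parameters):

```python
import math

def waypoint_reached(pos: dict, wp: dict,
                     acceptance_radius_m: float = 0.5) -> bool:
    """True once the local-frame position is within the acceptance radius."""
    return math.dist((pos["x"], pos["y"], pos["z"]),
                     (wp["x"], wp["y"], wp["z"])) <= acceptance_radius_m

# Advance through the mission as each waypoint is reached:
mission = [{"x": 0, "y": 0, "z": 5}, {"x": 10, "y": 0, "z": 5}]
current = 0
if waypoint_reached({"x": 0.2, "y": -0.1, "z": 4.9}, mission[current]):
    current += 1  # proceed to the next waypoint
```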

## Sensor Fusion Details

### Extended Kalman Filter State

```
State vector x = [px, py, pz, vx, vy, vz, ax, ay, az]

Where:
- px, py, pz = position (meters)
- vx, vy, vz = velocity (m/s)
- ax, ay, az = acceleration (m/s²)
```
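
For this state vector, the prediction step follows a constant-acceleration motion model; since the model is linear, the EKF prediction reduces to the standard Kalman form. A sketch in numpy (the process-noise model `q * dt * I` is a simplification of my own, not this project's tuning):

```python
import numpy as np

def predict(x: np.ndarray, P: np.ndarray, dt: float, q: float = 0.1):
    """Kalman prediction for x = [px, py, pz, vx, vy, vz, ax, ay, az]."""
    I3 = np.eye(3)
    Z3 = np.zeros((3, 3))
    F = np.block([
        [I3, dt * I3, 0.5 * dt**2 * I3],  # position <- velocity, acceleration
        [Z3, I3,      dt * I3],           # velocity <- acceleration
        [Z3, Z3,      I3],                # acceleration held constant
    ])
    Q = q * dt * np.eye(9)                # simplified process noise
    return F @ x, F @ P @ F.T + Q

# 1 m/s north, 0.5 m/s² acceleration, 5 m above origin (NED: z = -5)
x0 = np.array([0, 0, -5, 1, 0, 0, 0.5, 0, 0], dtype=float)
x1, P1 = predict(x0, np.eye(9), dt=0.1)
# px advances by vx*dt + 0.5*ax*dt² = 0.1 + 0.0025 = 0.1025
```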

### Measurement Sources

| Source          | Measures     | Update Rate | Noise    |
|-----------------|--------------|-------------|----------|
| Visual Odometry | Position     | 30 Hz       | 0.05 m   |
| Optical Flow    | Velocity     | 60 Hz       | 0.1 m/s  |
| IMU             | Acceleration | 200 Hz      | 0.2 m/s² |
| Rangefinder     | Altitude     | 30 Hz       | 0.02 m   |
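
In an EKF, these noise figures would typically enter as measurement covariances. A sketch under the assumption (not stated in the table) that the noise column lists 1-sigma standard deviations:

```python
import numpy as np

# Noise figures from the table above, read as 1-sigma values
SIGMA = {
    "visual_odometry": 0.05,  # m, position (3 axes)
    "optical_flow":    0.10,  # m/s, velocity
    "imu":             0.20,  # m/s², acceleration (3 axes)
    "rangefinder":     0.02,  # m, altitude (1 axis)
}

def measurement_covariance(source: str, dim: int) -> np.ndarray:
    """Diagonal measurement covariance R = sigma² · I for an EKF update."""
    return SIGMA[source] ** 2 * np.eye(dim)

R_vo = measurement_covariance("visual_odometry", 3)  # 0.0025 on the diagonal
```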

## Drift Mitigation

GPS-denied navigation accumulates drift over time. Mitigation strategies:

1. **Visual Landmarks**: Detect and track known fiducial markers (e.g. ArUco)
2. **Loop Closure**: Recognize previously visited locations and correct the accumulated error
3. **Ground Truth Reset**: Periodically reset the estimate in simulation
4. **Multi-Sensor Fusion**: Fuse multiple sensors to reduce the drift of any single one
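
Strategy 1 reduces to a simple correction: when a marker with a surveyed LOCAL-frame position is sighted, the difference between where the drifted estimate places it and where it really is equals the accumulated drift. A minimal sketch (names and numbers are illustrative, not from this project):

```python
def drift_correction(known_pos, observed_pos):
    """Offset to subtract from the position estimate after sighting a marker.

    `known_pos` is the surveyed LOCAL-frame location of the marker
    (e.g. an ArUco tag); `observed_pos` is where the drifted estimate
    currently places it. Their difference is the accumulated drift.
    """
    return tuple(o - k for o, k in zip(observed_pos, known_pos))

est = [12.0, 3.0, -5.0]  # drifted position estimate
dx, dy, dz = drift_correction((10.0, 5.0, 0.0), (10.8, 4.7, 0.0))
corrected = (est[0] - dx, est[1] - dy, est[2] - dz)
```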

## Configuration Parameters

### Visual Odometry

```yaml
visual_odometry:
  method: "ORB"  # or "SIFT", "SURF"
  min_features: 100
  max_features: 500
  feature_quality: 0.01
```

### Optical Flow

```yaml
optical_flow:
  method: "Lucas-Kanade"
  window_size: 15
  max_level: 3
  min_altitude: 0.3  # meters
```

### Position Estimator

```yaml
position_estimator:
  fusion_method: "EKF"
  weights:
    visual_odometry: 0.6
    optical_flow: 0.3
    imu: 0.1
```
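
One way to read the configured weights is as blending coefficients over per-source estimates; the sketch below shows that interpretation for a single axis. This is a simplification for illustration — a full EKF would weight sources by their covariances rather than fixed scalars:

```python
def weighted_fuse(estimates: dict, weights: dict) -> float:
    """Weight-normalized average of single-axis estimates per source."""
    total = sum(weights[s] for s in estimates)
    return sum(weights[s] * estimates[s] for s in estimates) / total

weights = {"visual_odometry": 0.6, "optical_flow": 0.3, "imu": 0.1}
fused = weighted_fuse(
    {"visual_odometry": 10.2, "optical_flow": 9.8, "imu": 10.5}, weights)
# 0.6·10.2 + 0.3·9.8 + 0.1·10.5 = 10.11
```

Normalizing by the weight sum keeps the fusion well-defined when a source drops out (e.g. optical flow below its minimum altitude).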

## Testing GPS-Denied Navigation

### Indoor Warehouse World

```bash
bash scripts/run_simulation.sh worlds/indoor_warehouse.world
```

### Urban Canyon World

```bash
bash scripts/run_simulation.sh worlds/urban_canyon.world
```

## Troubleshooting

### High Position Drift

- Check camera exposure/focus
- Ensure sufficient visual features in the environment
- Increase the feature count in the visual odometry configuration
- Add visual markers to the environment

### Vision Loss

- The failsafe handler triggers after 5 seconds without visual odometry
- Action: HOLD (hover in place) by default
- Configure with the `action_on_vision_loss` parameter
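
The vision-loss timeout can be sketched as a small watchdog (class and method names are hypothetical, not this project's API):

```python
import time

VISION_TIMEOUT_S = 5.0  # failsafe triggers after 5 s without visual odometry

class VisionWatchdog:
    """Track visual-odometry freshness and report the failsafe action."""

    def __init__(self, action_on_vision_loss: str = "HOLD"):
        self.action = action_on_vision_loss
        self.last_update = time.monotonic()

    def odometry_received(self):
        """Call on every visual-odometry update to reset the timer."""
        self.last_update = time.monotonic()

    def check(self) -> str:
        """Return 'OK' while odometry is fresh, else the failsafe action."""
        age = time.monotonic() - self.last_update
        return self.action if age > VISION_TIMEOUT_S else "OK"

wd = VisionWatchdog()
wd.odometry_received()
wd.check()  # "OK" while odometry is fresh
```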

### Geofence Breach

- The vehicle returns to the LOCAL origin (not the GPS home position)
- RTL uses the same local coordinate system