# Rover driver 

This is part of my work for homework 4. The main goal is to control a six-wheel rover with a joystick, applying the basics of mobile robot kinematics.

## Speed and Steering distribution 

We compute each wheel's speed and steering angle using the standard velocity composition rule, i.e. `Vw = Vb + Ωb ∧ bw`.
Since each wheel sensor is accessible through `driver_cfg`, we can deduce the steering angle and speed by applying the classic formulas.
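As a minimal sketch of this rule, the snippet below projects a body twist `(vx, vy, omega)` onto each wheel; the wheel names and positions are illustrative placeholders, not values read from `driver_cfg`:

```python
import math

# Hypothetical wheel positions (x, y) in the robot frame, in meters.
# These names and coordinates are assumptions for illustration only.
WHEELS = {
    "front_left":  ( 0.3,  0.2), "front_right": ( 0.3, -0.2),
    "mid_left":    ( 0.0,  0.2), "mid_right":   ( 0.0, -0.2),
    "rear_left":   (-0.3,  0.2), "rear_right":  (-0.3, -0.2),
}

def wheel_commands(vx, vy, omega):
    """Apply Vw = Vb + Omega ^ bw per wheel: crossing Omega*z with the
    wheel position bw = (wx, wy) adds (-omega*wy, +omega*wx) to Vb."""
    cmds = {}
    for name, (wx, wy) in WHEELS.items():
        vwx = vx - omega * wy
        vwy = vy + omega * wx
        speed = math.hypot(vwx, vwy)      # wheel linear speed
        steering = math.atan2(vwy, vwx)   # wheel bearing, in radians
        cmds[name] = (speed, steering)
    return cmds

# Pure forward motion: every wheel rolls at 1 m/s with zero steering.
print(wheel_commands(1.0, 0.0, 0.0))
```

For a pure rotation command the same function yields a different bearing per wheel, which is exactly what the grouped-wheel approximation below loses.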

Our first approach was to reduce the number of wheels by grouping them in pairs, resulting in an artificial three-wheel rover. However, this only allows the robot to move forward.
Here is a quick representation of the reduction, with `x` the origin of the robot frame and `o` a wheel.
```
   -----      -
   o   o      o
   | x |      x
   o   o  =>  o
   |   |      |
   o---o      o
```
The rover cannot turn on itself effectively with that configuration. In fact, applying a full right (or left) command would put the wheels in a 90-degree position, which is only valid for a three-wheeled vehicle whose wheels are aligned with the frame origin.
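To illustrate why 90 degrees is wrong for an off-axis wheel, here is a small check (the wheel position is an assumed example value): for a pure rotation command, `Vw = Ω ∧ bw`, so each wheel must point tangentially to the circle around the frame origin.

```python
import math

# Illustrative corner-wheel position (x, y) in the robot frame (assumed).
wx, wy = 0.3, 0.2

# Pure rotation: Vw = Omega ^ bw = (-omega*wy, +omega*wx).
omega = 1.0
vwx, vwy = -omega * wy, omega * wx
steering_deg = math.degrees(math.atan2(vwy, vwx))
print(round(steering_deg, 1))  # 123.7 degrees, not 90
```

Only a wheel located on the x-axis (`wy = 0`) gets exactly 90 degrees, which is why the three-wheel reduction breaks down for turning in place.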

### Illustrations

Wheels grouped by pair
![Grouped wheels](https://gitlab.centralesupelec.fr/mugisha_dav/rover_driver_base/raw/bc6974fcfe8def8c173de7cc2841bb1a5523276a/snapshots/rover_turn_conf_3_wheels.png)
Wheels not grouped
![Decoupled wheels](https://gitlab.centralesupelec.fr/mugisha_dav/rover_driver_base/raw/bc6974fcfe8def8c173de7cc2841bb1a5523276a/snapshots/rover_driver_6_wheel_config.png)

The second approach is more thorough and considers each wheel individually, without any reduction step.

## 2D Odometry

We used a least-squares method to compute the position and heading of the robot in the global frame.
We had 12 equations (two per wheel), each resulting from the relation between the wheel measurements and the displacement of the robot in the global frame.

For each wheel we have the following:
`Vw = V + Ω ∧ W`
`Sw = V*dt + Ω ∧ W*dt`
which, after projection onto the robot axes, gives
`Sw*cos(beta_w) = delta_x - delta_theta*Wy`
and
`Sw*sin(beta_w) = delta_y + delta_theta*Wx`
where
* `Sw` is the distance rolled by the wheel and `delta_x`, `delta_y` the displacement of the robot over `dt`,
* `Ω` is the rotation speed of the robot and `delta_theta = Ω*dt` its heading change,
* `beta_w` is the bearing (steering angle) of the wheel,
* `Wx` and `Wy` are the coordinates of the wheel in the robot frame.
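Stacking the two projected equations per wheel gives a 12×3 linear system in `(delta_x, delta_y, delta_theta)` that can be solved in the least-squares sense. Below is a minimal sketch of that step; the wheel coordinates are assumed example values, not the rover's actual geometry:

```python
import math
import numpy as np

# Hypothetical wheel positions (Wx, Wy) in the robot frame (assumed values).
WHEELS = [( 0.3,  0.2), ( 0.3, -0.2),
          ( 0.0,  0.2), ( 0.0, -0.2),
          (-0.3,  0.2), (-0.3, -0.2)]

def odometry_step(measurements):
    """measurements: one (Sw, beta_w) pair per wheel -- rolled distance and
    bearing. Builds two rows per wheel from
        Sw*cos(beta_w) = dx - dtheta*Wy
        Sw*sin(beta_w) = dy + dtheta*Wx
    and solves the 12x3 system for (dx, dy, dtheta)."""
    A, b = [], []
    for (wx, wy), (sw, beta) in zip(WHEELS, measurements):
        A.append([1.0, 0.0, -wy]); b.append(sw * math.cos(beta))
        A.append([0.0, 1.0,  wx]); b.append(sw * math.sin(beta))
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol  # estimated (dx, dy, dtheta)

# Consistency check: synthesize noise-free measurements from a known
# displacement and verify the solver recovers it.
dx, dy, dtheta = 0.05, 0.01, 0.1
meas = [(math.hypot(dx - dtheta*wy, dy + dtheta*wx),
         math.atan2(dy + dtheta*wx, dx - dtheta*wy)) for wx, wy in WHEELS]
print(odometry_step(meas))  # recovers (dx, dy, dtheta)
```

With real encoder data the 12 equations are inconsistent, and the least-squares solution averages the per-wheel disagreement.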


For the odometry, we adopted a least-squares method to estimate x, y and theta.
We visualized the odometry in _rviz_ to see how the vehicle is tracked based on its motion sensors.
There are some glitches in the overall behavior, possibly linked to the periodicity of the estimated angles.
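If the glitches do come from angle periodicity, the usual fix is to normalize every estimated angle to `(-pi, pi]` before integrating it. A standard sketch of that normalization (not code from this driver):

```python
import math

def wrap_to_pi(angle):
    """Normalize an angle to (-pi, pi]. Without this, an estimate crossing
    the +/-pi boundary jumps by ~2*pi and produces spikes in the track."""
    return math.atan2(math.sin(angle), math.cos(angle))

print(wrap_to_pi(math.pi + 0.1))  # about -pi + 0.1
```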

## Task Manager

TODO