Wall detection
[[Autonomous Vehicles and Mobile Land/Aerial Robots/Drones @ ASL]]
*Abstract [#d90504bd]
- Proceed toward open space by referring to the values of three distance sensors facing 45 degrees left, straight ahead, and 45 degrees right.
- The direction of travel is decided by a neural network.
- The neural network's training data is generated from the vehicle's minimum turning radius, overall width, and other body dimensions.
1) Learning model
2) Learning Test Data Generator
- Turnable distance
- Turn condition branching
Code files: &ref(ai.py,,);,&ref(train_model.py,,);,&ref(labelgenerator.py,,);
* Learning model [#jef5412e]
- The neural network is a multilayer perceptron.
-- Input layer: 3 neurons, hidden layer: 11 neurons, output layer: 4 neurons.
-- Input: the three sensor values [left, front, right]
-- Output: the action type [STOP, LEFT, FRONT, RIGHT]
CENTER:&ref(LM.png,,50%);
- Training is supervised, using data produced by the Learning Test Data Generator.
-Implemented in TensorFlow (a minimal sketch of the model follows below).
-Results of varying the number of hidden neurons: &ref(result_hidden.pdf,,);
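The following is a minimal sketch of the 3-11-4 model above in TensorFlow 1.x style (to match the checkpoint/pb workflow in the DEMO section); the ReLU activation, the initializers, and the tensor names "input"/"output" are assumptions and are not taken from train_model.py.
 import tensorflow as tf
 
 # 3 sensor inputs, 11 hidden neurons, 4 action outputs
 N_INPUT, N_HIDDEN, N_OUTPUT = 3, 11, 4
 
 x  = tf.placeholder(tf.float32, [None, N_INPUT], name="input")
 y_ = tf.placeholder(tf.float32, [None, N_OUTPUT], name="label")
 
 # Hidden layer (activation choice is an assumption)
 w1 = tf.Variable(tf.truncated_normal([N_INPUT, N_HIDDEN], stddev=0.1))
 b1 = tf.Variable(tf.zeros([N_HIDDEN]))
 hidden = tf.nn.relu(tf.matmul(x, w1) + b1)
 
 # Output layer: probabilities over [STOP, LEFT, FRONT, RIGHT]
 w2 = tf.Variable(tf.truncated_normal([N_HIDDEN, N_OUTPUT], stddev=0.1))
 b2 = tf.Variable(tf.zeros([N_OUTPUT]))
 logits = tf.matmul(hidden, w2) + b2
 output = tf.nn.softmax(logits, name="output")
 
 # Supervised training on the generated labels
 loss = tf.reduce_mean(
     tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
 train_step = tf.train.AdamOptimizer(1e-3).minimize(loss)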
* Learning Test Data Generator [#t8eadb80]
-Generates pairs of three distance-sensor values and the control (label) for those values.
--Input: the number of data points to generate.
--Output: sensor values [left, front, right] and label data [STOP, LEFT, FRONT, RIGHT]
**Turnable distance [#x77b2cd2]
-Turn and stop determination distances are derived from the vehicle data below (a sketch of the calculation follows this list).
LEFT:&ref(VP.png,,35%);&ref(TD.png,,40%);
-#605 Motor Shield
-#902 Kerberos
-#105 Button
--Vehicle data
---Overall width:16.5cm
---Wheel base:19.0cm
---Nose:14.0cm
---Minimum turning radius:75cm
--Maximum turning radius: R
---X = minimum turning radius + overall width = 75 + 16.5 = 91.5cm; Y = wheel base + nose = 19.0 + 14.0 = 33.0cm
---R^2 = X^2 + Y^2 = 91.5^2 + 33^2, so R = 97.27cm
--Turn determination distance: y_turn
---y_turn = R + margin_t = 100cm (margin_t = 2.73cm)
--Stop determination distance (unavoidable distance): y_stop
---r^2 = R^2 - (75 + 16.5/2)^2 = 97.27^2 - 83.25^2, so r = 50.3cm
---y_stop = r - Y + margin_s = 50.3 - 33 + 10 = 27.3cm (margin_s = 10cm)
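The distances above can be reproduced with the short sketch below; the variable names are illustrative, and margin_t/margin_s are the values implied by y_turn = 100cm and y_stop = 27.3cm.
 import math
 
 # Vehicle data (cm)
 OVERALL_WIDTH   = 16.5
 WHEEL_BASE      = 19.0
 NOSE            = 14.0
 MIN_TURN_RADIUS = 75.0
 
 # Outermost body point while turning:
 # X = minimum turning radius + overall width, Y = wheel base + nose
 X = MIN_TURN_RADIUS + OVERALL_WIDTH   # 91.5
 Y = WHEEL_BASE + NOSE                 # 33.0
 R = math.hypot(X, Y)                  # maximum turning radius, ~97.27
 
 margin_t = 100.0 - R                  # ~2.73
 y_turn = R + margin_t                 # turn determination distance, 100cm
 
 # Inner clearance radius, then the stop determination distance
 r = math.sqrt(R**2 - (MIN_TURN_RADIUS + OVERALL_WIDTH / 2)**2)  # ~50.3
 margin_s = 10.0
 y_stop = r - Y + margin_s             # ~27.3cm
 
 print(R, y_turn, y_stop)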
**Turn condition branching [#f7a0046f]
-Each of the three distance sensor values falls into one of the following three patterns.
--Pattern 0: The obstacle is far away, so no control is required
-> sensor > y_turn
--Pattern 1: The obstacle is close, so control is required
-> y_stop <= sensor <= y_turn
--Pattern 2: The obstacle is too close, so the vehicle cannot move in that direction
-> sensor < y_stop
-Determine which of these patterns each of the left, front, and right sensor values falls into, then branch on the combination to decide the direction of travel (a sketch of this branching follows the list).
>>Format: Control [sensor][pattern] : action
>>Control 1 [Front][2] : Stop, since the vehicle can neither go straight nor turn
Control branch 1 [Left][0, 1]&[Front][1]&[Right][0, 1] : There is an obstacle ahead and both left and right turns are possible; turn toward whichever side is farther
--Control 2: The distance is farther on the right, so turn right
--Control 3: The distance is farther on the left, so turn left
>>Control 4 [Left][0]&[Front][0]&[Right][0] : Go straight, since no control is needed
>>Control 5 [Left][1, 2]&[Right][0] : The obstacle on the left is close or too close, so turn right
>>Control 6 [Left][0]&[Right][1, 2] : The obstacle on the right is close or too close, so turn left
>>Control 7 [Left][2]&[Right][2] : Neither a left nor a right turn is possible; going forward is still possible, because the blocked-front case was already handled by Control 1
Control branch 2 [Left][1]&[Right][1] : Obstacles are close on both sides, so steer to stay centered between them
--Control 8: The distance is farther on the right, so turn right
--Control 9: The distance is farther on the left, so turn left
>>Control 10 [Left][2]&[Right][1] : The obstacle on the left is too close, so turn right
>>Control 11 [Left][1]&[Right][2] : The obstacle on the right is too close, so turn left
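The branching above can be expressed as a short sketch; this illustrates the rules rather than reproducing labelgenerator.py (the sensor range and the tie-break when left and right distances are equal are assumptions).
 import random
 
 Y_TURN, Y_STOP = 100.0, 27.3           # distances from the previous section (cm)
 STOP, LEFT, FRONT, RIGHT = 0, 1, 2, 3  # label indices
 
 def pattern(d):
     """Classify one sensor distance into pattern 0/1/2."""
     if d < Y_STOP:
         return 2  # too close: cannot move that way
     if d <= Y_TURN:
         return 1  # close: control required
     return 0      # far: no control required
 
 def decide(left, front, right):
     """Apply Controls 1-11 in order to one sensor triple."""
     pl, pf, pr = pattern(left), pattern(front), pattern(right)
     if pf == 2:
         return STOP                             # Control 1
     if pl <= 1 and pf == 1 and pr <= 1:
         return RIGHT if right > left else LEFT  # Controls 2, 3
     if pl == 0 and pf == 0 and pr == 0:
         return FRONT                            # Control 4
     if pl >= 1 and pr == 0:
         return RIGHT                            # Control 5
     if pl == 0 and pr >= 1:
         return LEFT                             # Control 6
     if pl == 2 and pr == 2:
         return FRONT                            # Control 7: front already cleared
     if pl == 1 and pr == 1:
         return RIGHT if right > left else LEFT  # Controls 8, 9
     return RIGHT if pl == 2 else LEFT           # Controls 10, 11
 
 def generate(n, max_range=200.0):
     """Produce n random (sensor values, label) pairs."""
     rows = []
     for _ in range(n):
         l, f, r = (random.uniform(0.0, max_range) for _ in range(3))
         rows.append(([l, f, r], decide(l, f, r)))
     return rows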
*DEMO [#be34b673]
**Common procedure [#y2c806c3]
-Log in to the Raspberry Pi (RobotCar)
>>ssh pi@192.168.xxx.xxx (192.168.11.31 10/15)
-Log in to the Docker container
>>sudo su
>>docker exec -it CONTAINER_ID /bin/bash
-Move to the RobotCarAI directory
>>cd /notebooks/github/RobotCarAI/
**Travel direction prediction [#k38618ae]
# Current directory is /notebooks/github/RobotCarAI/
>>cd level1_sensors
>>python run_ai.py
-Directory (/notebooks/github/RobotCarAI/level1_sensors)
--document : documentation
--fabolib : Fabo board support
--generator : training-data label generation
--lib : prediction support
--MLP : training and pb file creation
--model : trained model storage
-Files (/notebooks/github/RobotCarAI/level1_sensors)
--run_ai_eval.py: Evaluates prediction accuracy within the training range
--run_ai_eval_400.py: Evaluates prediction accuracy outside the training range
--run_ai.py: Retrieves sensor values and runs prediction. Requires Fabo #902 Kerberos and LidarLite v3.
--MLP/train_model.py: Training code
---Outputs a TensorBoard log to MLP/log/
---Outputs checkpoint files to MLP/model/
--MLP/freeze_graph.py: pb (frozen graph) creation code
---Creates MLP/model/car_model.pb (a loading sketch follows this list)
--MLP/run_ai_test.py: Runs prediction with random values as input
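For illustration, a sketch of loading the frozen graph and running one prediction (TensorFlow 1.x style); the tensor names "input:0" and "output:0" are assumptions, not confirmed from freeze_graph.py (check the actual names with graph.get_operations()).
 import numpy as np
 import tensorflow as tf
 
 # Read the frozen graph produced by MLP/freeze_graph.py
 with tf.gfile.GFile("MLP/model/car_model.pb", "rb") as f:
     graph_def = tf.GraphDef()
     graph_def.ParseFromString(f.read())
 
 with tf.Graph().as_default() as graph:
     tf.import_graph_def(graph_def, name="")
 
 # Assumed tensor names
 x = graph.get_tensor_by_name("input:0")
 y = graph.get_tensor_by_name("output:0")
 
 with tf.Session(graph=graph) as sess:
     sensors = np.array([[120.0, 60.0, 30.0]])  # [left, front, right] in cm
     probs = sess.run(y, feed_dict={x: sensors})
     print(["STOP", "LEFT", "FRONT", "RIGHT"][int(np.argmax(probs))])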
**Driving Demo [#bc599a20]
# Current directory is /notebooks/github/RobotCarAI/
>>cd level1_demo/
>>python start_button.py
# To start driving, press the blue button on the robot car
# To stop driving, press the red button on the robot car
-Directory
--document: documentation
--fabolib: Fabo board support
--lib: SPI and AI libraries
-Files
--run_ai_ai.py: Autonomous driving code
--start_button.py: Start button code (a minimal polling sketch follows below)
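For illustration, a minimal polling loop in the spirit of start_button.py; the RPi.GPIO pin numbers are hypothetical, and the actual code reads the Fabo #105 buttons through fabolib.
 import time
 import RPi.GPIO as GPIO
 
 # Hypothetical BCM pins for the blue (start) and red (stop) buttons
 BLUE_PIN, RED_PIN = 17, 27
 
 GPIO.setmode(GPIO.BCM)
 GPIO.setup(BLUE_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
 GPIO.setup(RED_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
 
 driving = False
 try:
     while True:
         if GPIO.input(BLUE_PIN) == GPIO.LOW:  # blue pressed: start driving
             driving = True
         if GPIO.input(RED_PIN) == GPIO.LOW:   # red pressed: stop driving
             driving = False
         # when driving, run one sense-predict-steer step here
         time.sleep(0.05)
 finally:
     GPIO.cleanup()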