Elegoo car modification for autonomous driving. More information here: https://youtu.be/1-7RTr_nGgs
After winning the "ELEGOO SMART ROBOT CAR KIT 3.0 PLUS" in a contest in my university's self-driving cars lecture, I decided to convert it into a simple autonomous car using a (conditional) imitation learning approach.
The car was modified so that it can drive in a simple 'newspaper street' scenario given only a first-person camera image and a condition set by the user.
A Raspberry Pi 3 B+, a Raspberry Pi camera with a fish-eye lens, and a power bank were mounted on top of the car using 3D-printed parts.
The ultrasonic sensor was moved down, and the line-tracking module as well as the Arduino were removed.
The network was built using PyTorch's pretrained ResNet-50 model followed by several custom layers. Those layers were trained separately for each condition; during inference, the corresponding sub-branch is chosen based on the user input.
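A minimal sketch of what such a conditional architecture could look like (the number of conditions, the head layout, and the two-value output are illustrative assumptions, not the exact layers used here):

```python
import torch
import torch.nn as nn
from torchvision import models

class ConditionalDrivingNet(nn.Module):
    """ResNet-50 backbone with one small custom head per condition."""

    def __init__(self, num_conditions=3, num_outputs=2):
        super().__init__()
        self.backbone = models.resnet50(pretrained=True)
        self.backbone.fc = nn.Identity()  # expose the 2048-d feature vector
        self.heads = nn.ModuleList([
            nn.Sequential(
                nn.Linear(2048, 256),
                nn.ReLU(),
                nn.Linear(256, num_outputs),  # e.g. steering and throttle
            )
            for _ in range(num_conditions)
        ])

    def forward(self, image, condition):
        # The user-set condition (e.g. 0 = straight, 1 = left, 2 = right)
        # selects which sub-branch produces the driving command.
        features = self.backbone(image)
        return self.heads[condition](features)

# Usage: run a dummy frame through the 'turn left' branch.
model = ConditionalDrivingNet()
frame = torch.randn(1, 3, 224, 224)
command = model(frame, condition=1)  # tensor of shape (1, 2)
```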
Many runs with varying street designs, floors, and lighting conditions were recorded. The floor not only changed how the scene looked but also affected the car's ability to navigate, since the car depends on sliding over the ground to steer.
To improve robustness, the training data was recorded both during the day and at night under different kinds of artificial light. Some of the daytime recordings were made in direct sunlight and some in the shadow.
After the initial training, more samples were generated by manually intervening and correcting the car every time the agent failed. The corrected behaviour was then recorded, and a new agent was trained using the original as well as the new training samples. This process was repeated several times; a rough sketch follows below.
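In pseudocode, the correction loop looks roughly like this (`record_corrections` and `train` are placeholders for the actual recording and training tooling, not functions from this repository); the scheme is similar in spirit to DAgger-style data aggregation:

```python
def record_corrections(model=None):
    """Placeholder: drive (manually, or letting `model` steer), record the
    camera frames and human commands whenever an intervention is needed,
    and return those samples."""
    return []

def train(dataset):
    """Placeholder: fit a fresh imitation-learning agent on `dataset`."""
    ...

dataset = record_corrections()            # initial human demonstrations
model = train(dataset)
for _ in range(5):                        # "repeated several times"
    dataset += record_corrections(model)  # corrections where the agent failed
    model = train(dataset)                # retrain on old + new samples
```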
Data augmentation was performed with imgaug, and oversampling was used to obtain a more balanced dataset.
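A small example of what such a pipeline could look like with imgaug and naive oversampling (the image sizes, augmenter choices, and class labels are illustrative assumptions, not the actual pipeline used here):

```python
import numpy as np
import imgaug.augmenters as iaa

# Dummy stand-ins for the real recordings (camera frames + discrete labels).
images = np.random.randint(0, 255, size=(8, 120, 160, 3), dtype=np.uint8)
labels = np.array([0, 0, 0, 0, 0, 1, 1, 2])  # e.g. straight / left / right

# Label-preserving augmentations only; horizontal flips are left out here
# because mirroring a frame would invert the recorded steering command.
augmenter = iaa.Sequential([
    iaa.Multiply((0.7, 1.3)),                         # brightness jitter
    iaa.GaussianBlur(sigma=(0.0, 1.0)),               # mild defocus
    iaa.AdditiveGaussianNoise(scale=(0, 0.03 * 255)), # sensor noise
    iaa.LinearContrast((0.8, 1.2)),                   # contrast variation
])
images_aug = augmenter(images=images)

# Naive oversampling: resample each class until it is as frequent as the
# most common one, yielding a balanced dataset.
max_count = np.bincount(labels).max()
idx = np.concatenate([
    np.random.choice(np.where(labels == c)[0], size=max_count, replace=True)
    for c in np.unique(labels)
])
balanced_images, balanced_labels = images_aug[idx], labels[idx]
```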