The car
The DonkeyCar S1 is a 1:10 scale toy car equipped with a camera, a Raspberry Pi and a remote control board. It comes assembled and functional, so no hardware skills are needed to use it (as opposed to alternative platforms). It has a strong and responsive community on Discord.
The DonkeyCar:
- Has a mobile app that gets you driving the car in less than 10 minutes. Driving means collecting human-driving data that the AI can learn from!
- You can train one type of model within the app, without even needing to install anything on your computer. You just drive around, and based on the recordings of your driving, the app trains a model to imitate you (see the training sketch after this list).
Attach:demo_AI.mov
This is the AI driving. Trained on 20 clean laps of human driving, your model should behave like this, but also deal well with intersections and obstacles.
- The car can be controlled via the mobile app, a web application or a game controller. You can choose when to record, whether to delete the last 100 frames (in case of behaviour you don't want the model to learn, e.g. after you crashed the car) and so on, either via the application or via buttons on the controller.
- For more advanced models, there is a codebase that allows you to transfer recordings from the car to your computer, train a model on your computer or in Google Colab, transfer the code/models back to the car and launch them there.
- The codebase supports multiple model types.
- The cars can be configured to connect to any Wi-Fi network; once you are on the same network, you can connect to the car via SSH if you know the device name.
- The battery lasts for a few hours of driving, but it is a fire hazard and needs to be kept in a fireproof bag when not in use. Please adhere to the safety rules regarding the batteries.
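Since several of the points above boil down to "record human driving, then train a model to imitate it", here is a minimal behavioural-cloning sketch in Python/Keras. It is not the project codebase: the file names, the 120x160 camera resolution and the (images, steering) array layout are assumptions made for illustration.

```python
# Minimal behavioural-cloning sketch (not the project codebase): train a small CNN
# to map recorded camera frames to the steering angle a human used on those frames.
# Paths, image size and the (images, steering) data layout are assumptions.
import os
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

def build_model(input_shape=(120, 160, 3)):
    """Small CNN that predicts a single steering value in [-1, 1]."""
    model = tf.keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Rescaling(1.0 / 255),             # normalise camera pixels
        layers.Conv2D(24, 5, strides=2, activation="relu"),
        layers.Conv2D(32, 5, strides=2, activation="relu"),
        layers.Conv2D(64, 3, strides=2, activation="relu"),
        layers.Flatten(),
        layers.Dense(100, activation="relu"),
        layers.Dropout(0.1),
        layers.Dense(50, activation="relu"),
        layers.Dense(1, activation="tanh"),      # steering in [-1, 1]
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

if __name__ == "__main__":
    # Hypothetical arrays built from the recordings copied off the car.
    images = np.load("recordings/images.npy")      # (N, 120, 160, 3) uint8
    steering = np.load("recordings/steering.npy")  # (N,) float in [-1, 1]

    model = build_model(images.shape[1:])
    model.fit(images, steering, validation_split=0.1, epochs=10, batch_size=64)

    os.makedirs("models", exist_ok=True)
    model.save("models/imitation.h5")              # copy this back to the car
```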
Based on prior projects, a few more words about the capabilities and limitations of the hardware:
- 180-degree turning diameter, measured at the outside wheel, is 140-160 cm, depending on the car
- cannot maintain speed when the hardware heats up (you need to manually turn up the throttle)
- all driving so far has been done with end-to-end systems. So far only steering has been controlled, but it is possible to control speed too; it just needs more data to learn from.
- can compute a simple CNN prediction in ~40 ms, a high enough rate for driving (see the timing sketch after this list)
- can be taught to drive between walls or to follow a line. Can avoid obstacles. Can adhere to commands such as "turn left" / "turn right" (or similar ones like "drive slow"), but this takes more data collection (a command-conditioned sketch follows this list).
- can be taught to give way to another donkey coming from the right
- can drive indoors and outdoors
- quite sensitive to light conditions, as expected
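As a rough way to check the ~40 ms prediction figure on the Pi, here is a hedged timing sketch; the model path and input resolution are assumptions carried over from the training sketch above.

```python
# Rough latency check for a trained steering model on the Raspberry Pi.
# The model path and camera resolution are assumptions; the point is simply
# to time single-frame predictions and compare against the ~40 ms budget.
import time
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("models/imitation.h5")
frame = np.zeros((1, 120, 160, 3), dtype=np.float32)  # one dummy camera frame

# Warm up once so graph construction is not counted in the measurement.
model.predict(frame, verbose=0)

times = []
for _ in range(100):
    start = time.perf_counter()
    model.predict(frame, verbose=0)
    times.append(time.perf_counter() - start)

print(f"median prediction time: {1000 * np.median(times):.1f} ms")
print(f"implied control rate:   {1.0 / np.median(times):.1f} Hz")
```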
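For the command-following point ("turn left", "turn right", ...), one common approach is to condition the network on the requested command. The sketch below is one possible architecture, not the project's actual model; the command set and its encoding are assumptions.

```python
# Sketch of how "turn left / turn right"-style commands could be fed to an
# end-to-end model: the command is a one-hot vector concatenated with the
# image features before the steering head.
import tensorflow as tf
from tensorflow.keras import layers

NUM_COMMANDS = 3  # e.g. 0 = left, 1 = straight, 2 = right (hypothetical encoding)

image_in = layers.Input(shape=(120, 160, 3), name="camera")
command_in = layers.Input(shape=(NUM_COMMANDS,), name="command")

x = layers.Rescaling(1.0 / 255)(image_in)
x = layers.Conv2D(24, 5, strides=2, activation="relu")(x)
x = layers.Conv2D(32, 5, strides=2, activation="relu")(x)
x = layers.Conv2D(64, 3, strides=2, activation="relu")(x)
x = layers.Flatten()(x)

x = layers.Concatenate()([x, command_in])  # condition on the requested command
x = layers.Dense(100, activation="relu")(x)
steering = layers.Dense(1, activation="tanh", name="steering")(x)

model = tf.keras.Model(inputs=[image_in, command_in], outputs=steering)
model.compile(optimizer="adam", loss="mse")

# Training then needs each recorded frame to be labelled with the command that
# was active while the human drove it, which is the extra data collection cost.
```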