Autonomous robot car (deep learning)

maelh Posts: 5
edited 2020-08-29 - 23:55:57 in Robotics
(Pictures/videos in linked blog posts.)

I'd like to share an autonomous RC car project I worked on over a long period.
It started with an RC car: I modified the remote so I could control it through an Arduino, and attached a Raspberry Pi with a camera that streams video back to a laptop. The laptop then steers the car via the remote control, based on the current camera frame.
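The pipeline above can be sketched as a small control loop. Everything in this sketch is my assumption, not taken from the blog posts: the three steering classes, the one-character serial commands, and the function names are all hypothetical.

```python
# Hypothetical sketch of the laptop-side control loop.
# Assumptions (not from the post): a 3-class CNN (left/straight/right)
# and an Arduino that accepts one-character steering commands over serial.

STEERING_COMMANDS = {0: "L", 1: "S", 2: "R"}  # class index -> serial command

def steering_command(class_index: int) -> str:
    """Map a CNN class prediction to the command sent to the Arduino."""
    return STEERING_COMMANDS[class_index]

def control_loop(grab_frame, predict_class, send_command):
    """One frame in, one steering command out, repeated forever.

    grab_frame:    returns the latest camera frame from the Pi's video stream
    predict_class: CNN inference, returns a class index (0, 1 or 2)
    send_command:  writes the command to the Arduino wired into the remote
    """
    while True:
        frame = grab_frame()
        send_command(steering_command(predict_class(frame)))
```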
https://blog.mh-nexus.de/2019/11/building-an-autonomous-robot-car/

The hardware part was working fine, but the delays introduced by streaming, combined with the somewhat slow inference (based on a convolutional neural net), gave the car a lot of steering "noise".
So I switched to simulation to have more control over the hardware and timing, which allowed for more experiments and was much more precise:
https://blog.mh-nexus.de/2020/03/self-driving-car-based-on-deep-learning/
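One standard way to damp that kind of frame-to-frame steering noise is an exponential moving average over the predicted steering value. A minimal sketch; the alpha value and angle scale here are my assumptions, not something from the posts:

```python
def smooth_steering(prev: float, new: float, alpha: float = 0.3) -> float:
    """Exponential moving average: alpha weights the newest prediction.

    Lower alpha = smoother but laggier steering; higher alpha reacts
    faster but passes more prediction noise through to the servo.
    """
    return alpha * new + (1.0 - alpha) * prev

# Example: a jittery sequence of predicted angles settles instead of
# slamming the steering fully left/right on every frame.
angle = 0.0
for prediction in [10.0, -10.0, 10.0, -10.0]:
    angle = smooth_steering(angle, prediction)
```

The trade-off is added lag, which matters when latency is already the problem, so alpha would need tuning against the frame rate.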

Now I've built a reliable self-driving car in the "real world", based on an NVIDIA Jetson, which has significantly more processing power, so everything can run locally on the car; that improves latency a lot.
Next week I plan to make a video, once I get back the replacement parts that are being serviced under warranty.

But comments would still be nice until I can upload a video of the final result :)

Comments

  • Duane Degn Posts: 10,348
    edited 2020-08-30 - 00:11:00
    Very cool!

    side.jpg
    maelh wrote: »
    The hardware part was working fine,

    I rarely find hardware more difficult than software in my robotic projects.

    I'm looking forward to seeing a video of the car in action.

    Hopefully we'll see your car in the figure-8 thread sometime in the near future.

    Thanks for sharing your robotics project with us. It's always fun to see what other people are doing.

  • maelh Posts: 5
    edited 2020-08-30 - 14:58:42
    Thanks.

    Video of the simulated car driving autonomously:
    https://blog.mh-nexus.de/wp-content/uploads/2019/11/varied-track-x5-with-audio-and-fadeout-fast.webm

    Some pictures of the modified remote control:
    soldering-finished-1024x683.jpg

    I modified it to add a little socket, so you can plug jumper wires into it to attach an Arduino while the remote control remains fully functional.
    I used a fiberglass pen to expose more copper on the PCB so I could add additional wires, and drilled the holes open a bit. Later I painted over the new connections because they were so close together that they would have shorted out. It was pretty fiddly to make everything fit inside with the many additional wires.
    But now it's essentially as if the remote had a trainer port, like on some remote controls for planes or on game pads for old platforms.
    carving-filled-1024x683.jpg
    casing-of-wire-1024x683.jpg
    CleanedUpRC-1024x683.jpg
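    On the software side, driving such a "trainer port" usually comes down to sending compact commands from the laptop to the Arduino, which then pulls the remote's exposed lines high or low. The one-byte protocol below is entirely my assumption for illustration; the post doesn't describe the actual wiring or protocol:

```python
# Hypothetical one-byte command encoding for the Arduino attached to the
# remote's new socket: 2 bits steering, 2 bits throttle (assumed layout,
# not the actual protocol used in the project).

STEER = {"left": 0b00, "straight": 0b01, "right": 0b10}
THROTTLE = {"stop": 0b00, "slow": 0b01, "fast": 0b10}

def encode_command(steer: str, throttle: str) -> bytes:
    """Pack steering into the low two bits, throttle into the next two."""
    return bytes([STEER[steer] | (THROTTLE[throttle] << 2)])

def decode_command(raw: bytes) -> tuple:
    """Inverse mapping, as the Arduino firmware side would implement it."""
    value = raw[0]
    steer = {v: k for k, v in STEER.items()}[value & 0b11]
    throttle = {v: k for k, v in THROTTLE.items()}[(value >> 2) & 0b11]
    return steer, throttle
```

    A single byte per update keeps the serial link fast, which matters when steering commands are sent every frame.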
  • maelh Posts: 5
    edited 2020-08-30 - 15:14:18
    Duane Degn wrote: »
    I rarely find hardware more difficult than software in my robotic projects.
    I am a computer scientist, so hardware is usually the trickier part for me.
    Though deep learning can be tricky as well, mostly due to the computing power and data collection it requires. Iterations are slow, much slower than in normal software development, where you get fast feedback and have many debugging tools. Analyzing models is definitely still an awkward, open issue in machine learning.
    Hopefully we'll see your car in the figure-8 thread sometime in the near future.
    The simulated car does it already.
    Edit: it's not ultra precise, but that's somewhat on purpose: I wanted to see if it could drive unknown track types with minimal training, i.e., with the model having only seen perfect (= 90°) left and right turns and completely straight sections. And it does indeed generalize fine to other types of curvature. In fact, providing a few perfect, strategically chosen, representative examples works better than providing many examples that contain some erroneous steering or imperfect driving.
    https://blog.mh-nexus.de/wp-content/uploads/2019/11/endless-sign-drive-x5-with-music-and-fadeout.webm

    The "unclean" driving is the car approximating/switching between the known categories. I found it quite interesting to watch. You could get much more precise driving with regression instead of classification and training other types of curves/bends.

    But the figure-8 challenge would definitely be useful for testing the quality of encoders. That would be a tricky hardware issue again, though, due to the mechanical tolerances that plague a lot of cheap builds (including the cheap robot arms I've tried so far).

    I don't know if I have a room big enough to do the same with the real car. We'll see; I'm still waiting for some components to come back from warranty repair.
    Thanks for sharing your robotics project with us. It's always fun to see what other people are doing.
    I agree, it's nice to share and see what other people do. I'd love to share it in person and watch others building things, but since there's nothing like that around here, places like this forum work a bit, too :)

  • Well done! I'd try this haha
    Really interesting to see what others are doing, I agree