
An anti-covid disinfecting robot family project

pik33 Posts: 2,387
edited 2021-11-29 19:40 in Propeller 2

I posted this in the general discussion, but as it is powered by a P2, I am posting it here as well.

There is an official post and a short movie about the robot on our university web pages and YouTube channel.
Both in Polish, but:

  • Google Translate does a decent job of translating the post
  • YT - while not translated, you can see the robot and our team (white lamps instead of UV for safety, uncovered for the demonstration)

A post on our university web: https://we.pb.edu.pl/2021/11/18/rob-uv-autonomiczna-dezynfekcja-pomieszczen-ktora-wykorzystuje-promieniowanie-uv-c/

Google Translate: https://we-pb-edu-pl.translate.goog/2021/11/18/rob-uv-autonomiczna-dezynfekcja-pomieszczen-ktora-wykorzystuje-promieniowanie-uv-c/?_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=pl&_x_tr_pto=nui

YouTube: (embedded video)

Several details:

The main brain of the robot is a board with a slot for a P2 Edge module. The module controls 2 BLDC motors, the UVC lamps, fans, 8 ultrasonic proximity sensors and a remote controller. The robot also has a pair of speakers and an HDMI screen, both also driven by the P2. There is also an RPi4 in it, which controls the L515 lidar/depth camera. The processed proximity data goes to the P2 via UART.
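
A rough sketch of what the RPi4 side of that UART transfer could look like, in Python with pyserial - the port name, baud rate and packet layout here are placeholders for illustration, not our actual protocol:

    import struct
    import serial  # pyserial

    UART_PORT = "/dev/serial0"  # placeholder UART device on the RPi4
    BAUD_RATE = 115200          # placeholder baud rate

    def send_proximity(link, distances_mm):
        # Pack the reduced proximity data (a few distances in mm) into a
        # small frame: start byte, payload length, little-endian uint16s.
        payload = struct.pack("<%dH" % len(distances_mm), *distances_mm)
        link.write(b"\xA5" + bytes([len(payload)]) + payload)

    if __name__ == "__main__":
        with serial.Serial(UART_PORT, BAUD_RATE, timeout=1) as link:
            # e.g. 8 sector distances already reduced from the depth stream
            send_proximity(link, [1200, 950, 800, 640, 700, 880, 1020, 1500])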

Comments

  • That's super neat, Pik33.

    I like how you (subconsciously?) worked the shape of the P2 Eval board into the robot chassis : )

    How have you found the depth cameras to work with? Are they blind to things like dark velvet?

  • pik33 Posts: 2,387
    edited 2021-11-30 08:23

    The shape is not my design: the mechanical engineers did this, and I hadn't even noticed its P2 Eval shape yet.

    I got an advertisement for this camera in my mail. We tested it and of course it has its weak points. It works with 850 nm light, so it doesn't like sunlight. It also doesn't like black surfaces. It is too heavy for the P2 - it needs USB3, so an RPi4 is used to preprocess its data. The RPi4 could also do machine learning later, but not yet. For now it does simple processing, reducing the data stream from many MB/s to several kB/s and sending it to the P2 (a rough sketch of that reduction is at the end of this comment).

    Velvet (or any soft material) is another problem, this time for the ultrasonic sensors. This means the robot would not stop and would hit a black velvet wall.

    In the COVID hospital corridors there is no sunlight and there are no velvet walls or people dressed fully in black, so it should work fine there.
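
    A minimal sketch of the kind of reduction I mean, using pyrealsense2 and numpy: grab one depth frame and collapse it into a handful of sector distances that fit easily through the UART. The sector count, resolution and lack of filtering are simplifications for illustration, not the code we actually run.

        import numpy as np
        import pyrealsense2 as rs

        SECTORS = 8  # illustrative number of horizontal sectors

        def sector_minima_mm(depth_image, depth_scale):
            # Split the frame into vertical strips and return the nearest
            # valid distance (in mm) in each; zero pixels mean "no return".
            result = []
            for strip in np.array_split(depth_image, SECTORS, axis=1):
                valid = strip[strip > 0]
                nearest_m = valid.min() * depth_scale if valid.size else 0.0
                result.append(int(nearest_m * 1000))
            return result

        pipeline = rs.pipeline()
        config = rs.config()
        config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
        profile = pipeline.start(config)
        sensor = profile.get_device().first_depth_sensor()
        depth_scale = sensor.get_depth_scale()
        try:
            frames = pipeline.wait_for_frames()
            depth = np.asanyarray(frames.get_depth_frame().get_data())
            print(sector_minima_mm(depth, depth_scale))  # goes to the UART sender
        finally:
            pipeline.stop()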

  • When it comes to the RealSense L515, there is a short section about non-optimal surfaces at the following Intel publication:

    https://intelrealsense.com/optimizing-the-lidar-camera-l515-range/

    Perhaps some kind of flexible whiskers could also be used, as a last-resort obstacle avoidance solution.
