Disney (if not Stark Industries) lawyers should be on standby: British robotics expert and part-time Iron Man cosplayer James Bruton is building a prosthetic third arm that moves by itself, driven by machine learning and a Raspberry Pi Zero W. What could go wrong?
Bruton is not keen on embedding electrodes in his skull, so he turned to his other limbs as input sources for the arm. He created a wearable motion-capture suit to collect data, equipping it with various microcontrollers, including a Teensy 4.1, and Adafruit MPU-6050 inertial measurement units. A similar assembly mounted on a headband tracks the user's head.
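Bruton's own firmware isn't reproduced here, but as a rough sketch of what reading an MPU-6050 involves: the sensor reports each accelerometer axis as a 16-bit signed big-endian value, scaled at 16,384 LSB per g in its default ±2 g range. The function name and byte layout below are illustrative, not taken from Bruton's code.

```python
def accel_from_raw(raw_bytes, scale=16384.0):
    """Convert 6 raw bytes from an MPU-6050's accelerometer registers
    (X, Y, Z as big-endian 16-bit signed values) into g units.

    scale=16384.0 is the sensitivity (LSB/g) at the default +/-2 g range.
    """
    if len(raw_bytes) != 6:
        raise ValueError("expected 6 bytes (X, Y, Z as 16-bit values)")
    axes = []
    for i in range(0, 6, 2):
        value = int.from_bytes(raw_bytes[i:i + 2], "big", signed=True)
        axes.append(value / scale)
    return tuple(axes)  # (x_g, y_g, z_g)

# A sensor lying flat reports roughly 1 g on the Z axis:
flat = bytes([0x00, 0x00, 0x00, 0x00, 0x40, 0x00])  # Z = 0x4000 = 16384
print(accel_from_raw(flat))  # (0.0, 0.0, 1.0)
```

In practice the raw bytes would come over I2C; libraries such as Adafruit's CircuitPython driver handle this conversion for you.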
By feeding the sensor data to the Raspberry Pi Zero, Bruton can train the arm, which he designed and 3D printed himself, with repeated movements until it can correctly predict what it should do from the incoming sensor data.
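The article doesn't say which learning method Bruton uses, but the idea of "repeat a movement, record the sensors, then predict the matching arm command" can be sketched with something as simple as a nearest-neighbour lookup. The feature names and commands below are hypothetical.

```python
import math

def train(samples):
    """Store labelled training examples: [(feature_vector, command), ...]."""
    return list(samples)

def predict(model, features):
    """Pick the arm command whose recorded pose is closest to the
    current sensor reading (1-nearest-neighbour)."""
    _, command = min(model, key=lambda s: math.dist(s[0], features))
    return command

# Hypothetical training data: (left_leg_pitch, right_leg_pitch) in degrees,
# recorded while repeating each movement in the motion-capture suit.
model = train([
    ((40.0, 0.0), "raise_arm"),   # left leg lifted
    ((0.0, 40.0), "lower_arm"),   # right leg lifted
    ((0.0, 0.0), "hold"),         # standing still
])
print(predict(model, (35.0, 2.0)))  # raise_arm
```

A real system would use many more features (full IMU readings from several limbs) and a proper model, but the training loop is the same shape: collect labelled poses, then classify live readings against them.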
The arm mounts on a backpack and handles simple tasks well, such as rising when the left leg is lifted and lowering again when the right leg is lifted. Future plans include adding sensors to detect more movements, and possibly using electromyography through a headband, rather than subcutaneous probes, so that the arm can respond to the user's thoughts.
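The leg-lift behaviour described above amounts to a small rule: raise the arm on a left-leg lift, lower it on a right-leg lift, otherwise hold. A minimal sketch, assuming boolean leg-lift signals have already been extracted from the sensor data:

```python
def arm_command(left_leg_raised, right_leg_raised, arm_is_up):
    """Map leg-lift events to an arm command, as described in the article:
    left leg up -> raise the arm; right leg up -> lower it; else hold."""
    if left_leg_raised and not arm_is_up:
        return "raise"
    if right_leg_raised and arm_is_up:
        return "lower"
    return "hold"

# Standing with the arm down, lifting the left leg raises the arm:
print(arm_command(True, False, False))  # raise
```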
Anyone keen to follow in Bruton's footsteps can find his CAD files and code on GitHub.