
Workshop Waifu

The emotive tool holder & workshop companion


A Friend for the workshop

Named "Carry" because it is built to carry tools, the Workshop Waifu v.1.38 is a robot designed to bring a little life to the workshop. In the mech development lifecycle, it is the first attempt to control heavy-duty mech-grade motors with a computer (a Raspberry Pi 3B+). Controlling motors with a computer instead of manually opens up quite a few possibilities.

Timeline

17 April 2022

Stage 1: Motor & Chassis

The basic frame was roughed out in PVC with motors zip-tied to the frame and connected to a tractor battery to see if it was possible for the unit to move.

18 April 2022

Stage 2: Emotive Eyes

Testing basic emoting by displaying an image of eyes that randomly moves around and is briefly replaced by an image of eyelids to simulate blinking. The amount of emotion that can be conveyed by just a pair of eyes is quite astounding.
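The gaze-and-blink behaviour described above boils down to a small timing state machine. Below is a minimal sketch of that logic; the actual drawing of the eye and eyelid images (e.g. blitting with pygame) is omitted, and all timing constants are made-up values, not the robot's real settings.

```python
import random

class EyeAnimator:
    """State machine for randomly wandering eyes with occasional blinks.
    Call tick() once per display frame; it returns the pupil offset and
    whether the eyelid image should currently be shown."""

    def __init__(self, rng=None, gaze_hold=30, blink_every=120, blink_len=4):
        self.rng = rng or random.Random()
        self.gaze_hold = gaze_hold      # ticks to hold one gaze point
        self.blink_every = blink_every  # average ticks between blinks
        self.blink_len = blink_len      # ticks the eyelid image is shown
        self.tick_count = 0
        self.blink_left = 0
        self.gaze = (0, 0)

    def tick(self):
        self.tick_count += 1
        # pick a new random gaze point every gaze_hold ticks
        if self.tick_count % self.gaze_hold == 0:
            self.gaze = (self.rng.randint(-20, 20), self.rng.randint(-20, 20))
        # randomly start a blink (roughly once per blink_every ticks)
        if self.blink_left == 0 and self.rng.random() < 1.0 / self.blink_every:
            self.blink_left = self.blink_len
        blinking = self.blink_left > 0
        if blinking:
            self.blink_left -= 1
        return self.gaze, blinking
```

The display loop would simply draw the eye image at `gaze` each frame, swapping in the eyelid image whenever `blinking` is true.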

21 April 2022

Stage 3: Remote Driving Systems Test

Using a Bluetooth controller/mouse and code to translate mouse coordinates into motor motion. The brief high-pitched whirring accompanying each change in driving motion is actually a servo physically flipping a switch with three positions: motor forward, backward, and neutral.
It is much simpler to control a high-current motor via a switch and servo than through multiple relays, and more visually interesting too!
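The mouse-to-drive translation can be sketched as a pure mapping from cursor displacement to one of the three switch positions. The servo angles below are hypothetical placeholders, and the actual PWM output to the servo (e.g. via RPi.GPIO or gpiozero on the Pi) is left out.

```python
# Hypothetical servo angles (degrees) for the three-position drive switch.
SWITCH_ANGLES = {"backward": 45, "neutral": 90, "forward": 135}

def drive_state(dy, deadband=15):
    """Translate a mouse y-displacement (pixels from rest position) into
    a drive state. A small deadband keeps the servo from chattering
    around neutral as the cursor jitters."""
    if dy > deadband:
        return "forward"
    if dy < -deadband:
        return "backward"
    return "neutral"

def servo_angle(dy):
    """Angle the switch-flipping servo should be commanded to."""
    return SWITCH_ANGLES[drive_state(dy)]
```

The driving loop would poll the controller's cursor position, compute `servo_angle`, and write it to the servo's PWM channel; the servo then physically throws the motor switch.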

24 April 2022

Stage 4: Chassis Expansion

Making the Workshop Waifu much taller opens up many possibilities, including more tool storage space and more personal interaction with people.

27 April 2022

Stage 5: Head Articulation

Adding more neck/head articulation and finishing the head surface makes it look much more like a robot.

12 May 2022

Stage 6: Actuator/Servo Control

By adding a variable resistor (potentiometer) to the abdominal joint, the computer can detect the position of the linear actuator and, as a consequence, move it to a desired position. This opens up many possibilities for future mech developments, such as pre-programmed sequences. Eventually, a single button could be used to throw a punch or perform any other physically possible emotive action.
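The potentiometer feedback loop described above amounts to simple proportional control. Here is a sketch, assuming the pot is read through an external ADC (the Pi has no analog inputs, so a chip such as an MCP3008 is a common choice); the gain, limits, and simulated actuator speed are invented values for illustration.

```python
def actuator_drive(target, position, kp=0.8, max_drive=1.0):
    """One step of a proportional controller: drive the linear actuator
    toward `target` based on the potentiometer reading `position` (both
    normalized to 0..1). Returns a signed drive command clamped to
    [-max_drive, +max_drive]."""
    error = target - position
    return max(-max_drive, min(max_drive, kp * error))

def simulate(target, position, steps=50, speed=0.2):
    """Offline simulation of the loop. The real loop would instead read
    the ADC each iteration and command the actuator's driver."""
    for _ in range(steps):
        position += speed * actuator_drive(target, position)
        if abs(target - position) < 0.01:
            break
    return position
```

A pre-programmed sequence (like the one-button punch) would then just be a list of `target` positions fed to this loop one after another.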
Further aesthetic paneling is added with EVA foam/floor mat.  The material is easy to work with and very friendly for human interaction.

26 May 2022

Stage 7: Face Tracking

While testing the face tracking for the head, the eye graphics went a little wonky. Running multiple loops in Python at a relatively usable speed takes some work to get just right.
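One common way to keep multiple loops from fighting each other is to put the slow loop on its own thread and hand results over through a queue. Below is a minimal sketch of that pattern with a stand-in detector function; the real face tracker (camera capture plus OpenCV detection) and the eye renderer are out of scope here.

```python
import queue
import threading
import time

def tracker_loop(detect, out_q, stop):
    """Producer: run the (slow) face detector on its own thread so it
    cannot stall the eye-animation loop."""
    while not stop.is_set():
        face = detect()          # stand-in for the real camera/CV call
        if face is not None:
            out_q.put(face)
        time.sleep(0.01)         # yield; real code paces on camera FPS

def run_demo(n_frames=20):
    """Consumer side: collect n_frames detections, as the eye/servo
    loop would, then shut the tracker thread down cleanly."""
    out_q = queue.Queue()
    stop = threading.Event()
    frames = iter(range(n_frames))
    detect = lambda: next(frames, None)   # fake detector yielding 0..n-1
    t = threading.Thread(target=tracker_loop, args=(detect, out_q, stop))
    t.start()
    seen = []
    while len(seen) < n_frames:
        seen.append(out_q.get(timeout=2))
    stop.set()
    t.join()
    return seen
```

With this split, the eye graphics can keep animating at full frame rate and simply consume the latest face position whenever one arrives.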

30 May 2022

Stage 8: Voice Commands

The first voice command is implemented, making the robot respond to its name. Using PocketSphinx and the SpeechRecognition library, it is much easier than one would expect, and completely self-contained. No Google or cloud voice-control APIs here!
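Offline recognition with PocketSphinx is typically wired up through the SpeechRecognition package's `recognize_sphinx`. Below is a sketch with the name-matching split out so it can be exercised without a microphone; the wake word "carry" is taken from the robot's name, and the listen loop is an assumed arrangement, not the project's actual code.

```python
def heard_name(transcript, name="carry"):
    """Return True if the robot's name appears in a recognized phrase.
    Match whole tokens so e.g. 'carrying' does not trigger it."""
    return name in transcript.lower().split()

def listen_for_name():
    """Blocking listen-once loop. Requires the SpeechRecognition and
    pocketsphinx packages plus a microphone, so the import is kept
    local to this function."""
    import speech_recognition as sr  # pip install SpeechRecognition pocketsphinx
    r = sr.Recognizer()
    with sr.Microphone() as source:
        audio = r.listen(source)
    try:
        # recognize_sphinx runs fully offline via PocketSphinx
        return heard_name(r.recognize_sphinx(audio))
    except sr.UnknownValueError:
        return False  # speech was unintelligible
```

Everything runs on the Pi itself; no audio ever leaves the robot.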

05 July 2022

Stage 9: Completion

With all the code running at the same time and the aesthetic paneling mostly complete, the Workshop Waifu's function has been proven. The only thing that remains is a more stable method of locomotion, perhaps tank treads hidden under the skirt...

About

Nye Mechworks builds giant robots that anyone can make on a DIY budget!

