Saturday, October 29, 2011

Pythagoras Version 2: Steppers!

The accuracy of the servos was maddening. Since I couldn't tweak the control loop or control the velocity, going back and forth across a line would create two separate paths. It overshot corners like crazy and made the manipulator oscillate. The search began for better control.

I had a few options.
  1. Rip out the servos' circuitry and roll my own control loop
  2. Rip out the servos' circuitry and use OpenServo's circuitry
  3. Ditch servos completely and use stepper motors with gearing
With option 1, I would likely be implementing PID control with some sort of feedforward component. Option 2 was the least appealing, since I would have to adapt the circuitry to my servos, and the boards were pretty expensive. Option 3 I didn't seriously consider at first. Thinking about it, though, steppers have all the characteristics I want: I can easily control both velocity and position, all without messing with control loops. Sounds pretty good.

The downside is that I have no absolute position reference to start the counting. But that's what potentiometers are for!

Design And Build

Learning from past mistakes, I made the upper-to-lower arm length ratio much smaller, to get better precision over a larger range in the XY plane, at the expense of Z traversal. Potentiometers were linked to the output shafts through gears. I cooked up something that looked like this.

I also rotated the supports and added tabs so they would stay straighter. Later, I would find that using a more flexible metal (6061 aluminum) in this way is a bad idea.

I had a pretty convoluted way of getting both the gears and the arms onto the stepper shaft. Since the arms needed to be able to turn past the potentiometer gear, the gears had to be machined as separate parts, and I also needed a spacer to keep the arms from colliding with the potentiometer gear. The solution: a stack of parts screwed together: a 1/8" thick arm, then a 1/8" gear, then a 1/4" combination spacer and set screw holder.

SolidWorks really crapped out solving the model. Since I kept the upper arms floating to check the range of the manipulator, I kept getting overdefined errors if I tweaked the manipulator a little too far, and every move of the manipulator took the computer a while to re-solve. But I managed. As always, the design files are linked below in the Appendix. Time to find the right motors!

Stepper Motors

After much Google-Fu, I settled on some 3V, 1.6A, 233 oz-in geared stepper motors from RobotShop. They have dandy planetary gearboxes with very little backlash (< 1 degree) and a gear ratio of 57/11 to 1. Given 200 steps per revolution, after the gearbox I can get 0.35 degrees per step before microstepping. With my current eight microsteps, I get 0.04375 degrees per step. MWAHAHAHA.
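The arithmetic, for the curious:

360° / 200 steps = 1.8° per motor step
1.8° / (57/11) ≈ 0.35° per step at the gearbox output
0.35° / 8 microsteps ≈ 0.044° per microstep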

They look like this in real life.
But now, to drive the motors from a microcontroller signal: I used Allegro's A4988 stepper drivers on Pololu's breakout board. They support 8-35 V, up to 2 A of current, do really neat current control for accurate microstepping, and can microstep at 2, 4, 8, and 16 divisions.

What is microstepping? Stepper motors move in discrete steps. Microstepping allows a motor to stop at intermediate positions between the full steps, multiplying the number of steps a stepper makes per revolution. For example, my steppers are 200 steps per revolution. Microstep by two divisions, and it becomes 400. By eight, and I get 1600 steps per revolution. Of course, this comes at a cost of torque: more steps, less torque in the motor. Also, the microsteps are less accurate than the whole steps, but the loss is negligible in this case (3-5%). Since I'm driving a light load, the torque loss is acceptable too. More information on microstepping and how it's actually done is on the Wikipedia page.

Cutting the Parts!

Cutting the parts in the Invention Studio at Georgia Tech let me get higher quality cuts and thus better edges. Yay! Here are the preliminary results. You can see the arm-gear-set screw holder stack on the stepper's output shaft.
Cutting out the rest of the parts. I took the manipulator and ball joints from version 1.
The entire wiring for the drivers and pots was done point to point with small wire. Nasty, nasty stuff. I'm not proud, but it did work.
All the control signals were wired down through a ribbon cable. Yes, this also includes the analog signals from the potentiometers: running around the motors, in a ribbon cable parallel to switching digital logic signals, with no filtering and high impedance potentiometers (5 kOhm). You basically get really, really bad data. But I didn't know that yet, because I had a new control algorithm to write for the steppers.

Coming Up Next
  • Updated control for steppers
  • How to fix the crappy analog signal problem
  • Movement!
Appendix 
Solidworks models for version 2: https://github.com/aaronbot3000/deltadraw/tree/master/solidworks_models_v2 

Monday, October 24, 2011

Pythagoras v1 Drawings

Basically, a bunch of drawings v1 has done.

Pythagoras: Sending Data to the Microcontroller

All these points and nothing to do with them? Why, stream them to the microcontroller!

The Protocol

Nothing special. The microcontroller requests a new point, and the computer sends a new point. If there are no new points, the computer sends an end-of-transmission code.
The microcontroller also buffers a set of points. Currently, it is set to store 256 points before requesting more. Pretty arbitrary. So the program flow is:

while stop flag not set
  if buffer is empty
    reload buffer
    if received end transmission
      set stop flag
  draw all points in buffer
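On the computer side, the sender can be a simple loop like this. A minimal sketch using pyserial; the 'R', 'P', and 'E' bytes are made-up placeholder codes, not the actual protocol values:

import struct
import serial

def stream_points(points, port="/dev/ttyACM0"):
    """Send (x, y, z) points one at a time as the microcontroller asks for them."""
    with serial.Serial(port, 115200, timeout=1) as link:
        remaining = iter(points)
        while True:
            if link.read(1) != b"R":      # wait for a request byte from the micro
                continue
            try:
                x, y, z = next(remaining)
                link.write(b"P" + struct.pack("<fff", x, y, z))  # send one point
            except StopIteration:
                link.write(b"E")          # end of transmission
                break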


One More Thing

The Z height needs to be adjustable. Get a couple of buttons to adjust the drawing height, so the pen touches the paper instead of missing it or jamming into it.

With that done, it's picture time!
Complete with documentary quality commentary.

Pythagoras: Vectorizing Software

Having to program points into the robot by hand is annoying at best, and horribly inelegant and time consuming at worst. What's better is having the computer generate all these fine points, so all I have to do is say, "Robot! Draw!" and it does.

This program vectorizes raster images, but doesn't quite (read: at all) process vector images, such as SVG or DXF. That is for a later project.

OpenCV

OpenCV is a really dandy computer vision library originally headed by Intel. It has since been open sourced and made freely available to the public, with support for C/C++, Python, and Android, GPU acceleration, and a work-in-progress iOS port. The library implements just about every computer vision algorithm you can think of. For the vectorizer, I use Canny edge detection, contour finding, and polygonal approximation of curves.

Take Note

This guide is written assuming you know how to fiddle with OpenCV and program. If not, there are many other websites that teach you how to program, and beginner's tutorials for OpenCV that do a better job than I would.

TL; WR (Too Long, Won't Read)

Resize the image to around 800 pixels, apply Gaussian blur and Canny Edge Detection. Fiddle with thresholds and blurs until the edge image looks right. Run findContours, then approxPolyDP (polygon approximation). Increase approximation accuracy until output looks about right. Stream polygon points to robot over serial or your favorite communication protocol.

Links to source code in the appendix.

Start: Acquire Image and Resize

Why resize your source image? After all, if it's big, then more detail, great. If it's small, then faster processing, great.

If the source image is too big, say greater than 1200 pixels across, then there is too much detail. Since the image processing algorithms are run only once, the speed of processing is not an issue, but the delta robot can only plot so fast. At larger sizes, very small, insignificant features will show up, creating a large number of points to plot, slowing down the drawing and cluttering the image. Unless, of course, you want that kind of precision for a very feature-full image.

If the source image is too small, what ends up happening is basically plotting at low resolution. The manipulator positions are quantized based on the resolution of the input image: low res input, low res output. So if the input image is small, resizing it to a larger image gives better results.

I have found the optimal image size for lower precision delta robots (such as version 1) to be around 400 pixels, and for higher precision delta robots (such as version 2) to be around 800 to 1200 pixels, depending on the input image.

Finally, make it grayscale, either by loading the image as grayscale or by converting it with cvtColor (or just pulling out the green channel). CV algorithms love grayscale images. This is the source image I'm using.
yay Lena
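In code, this first step is roughly the following. A minimal sketch assuming Python and OpenCV; the 800 pixel target width is just the ballpark figure from above:

import cv2

def load_and_resize(path, target_width=800):
    """Load the image as grayscale and scale it so the width is about target_width pixels."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    scale = target_width / gray.shape[1]
    return cv2.resize(gray, None, fx=scale, fy=scale, interpolation=cv2.INTER_AREA)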


Next: Blur and Edge Detect

Exactly as it says, run Gaussian blur and then Canny edge detection on the image, to get a binary image of pure, one pixel width lines representing all the detected edges. Why blur? Because Canny edge detection will find any and all lines in the image, regardless of the content or "importance" of the line. Without blur, the edge image looks like this.
Too much blur, and you lose too much data, and get an edge image like this.
A happy medium is nice, though it differs depending on the picture. For this image I used a seven pixel radius (not sure what the sigma is though, sorry) blur to get this.
Canny edge detection also has variables to adjust the hysteresis of the found lines: a high and a low threshold. The high threshold is the minimum "strength" an edge needs to appear in the edge image at all. The low threshold is the minimum "strength" it needs to continue being a line once started; drop below the low threshold, and the line stops in the edge image.
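In OpenCV terms, this step looks roughly like this. The kernel size and thresholds are placeholders to fiddle with, not the exact values I used:

import cv2

def find_edges(gray, blur_kernel=7, low_thresh=50, high_thresh=150):
    """Blur to suppress unimportant detail, then run Canny with hysteresis thresholds."""
    blurred = cv2.GaussianBlur(gray, (blur_kernel, blur_kernel), 0)
    return cv2.Canny(blurred, low_thresh, high_thresh)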

Finally: Contour Finding and Polygon Approximation

Currently, all the edges are just white pixels on a black image. They need to be grouped into lines. This is where the function FindContours comes in. It finds all the groups of pixels that can become contours and turns them into unified contours. It has a bunch of options for how to return the contours, such as in a hierarchy showing which contours are inside other contours, or only finding the outlines of things. In this case we want all the contours in any order, so we can pass it CV_RETR_LIST to get a flat list of contours.

We aren't done yet. Contours aren't yet nice straight lines; they're still long chains of pixels. The last step is ApproxPoly, that is, polygonal approximation. Give it the list of contours, and it will give you back a list of polygons, each a list of points. Exactly what the robot needs. There is one adjustable value: the precision of the approximation. Low approximation accuracy gives very "shapy" images, such as below.
High approximation approaches an exact match to the edge image, but it can add many many extra points in the polygons. For example, the below image contains 23,000 points, while for a typical drawing I adjust for no more than 3000 points.
With some adjustment, I achieved a 2115 point image while still remaining quite faithful to the original edge image.
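Roughly, in code. This is a sketch: epsilon is the approximation accuracy knob, and the exact return signature of findContours varies between OpenCV versions:

import cv2

def edges_to_polygons(edge_image, epsilon=2.0):
    """Group edge pixels into contours, then approximate each contour as a polygon."""
    contours, _ = cv2.findContours(edge_image, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    polygons = []
    for contour in contours:
        poly = cv2.approxPolyDP(contour, epsilon, closed=False)
        polygons.append(poly.reshape(-1, 2))  # each polygon becomes a list of (x, y) points
    return polygons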
Now that we have all these points, it's just a simple matter of mapping pixel values to real world inches (or {milli,centi,}meters). Of course, make sure you maintain the aspect ratio, or you will get stretched out images like this.
One more thing to consider. If you flipped your coordinate axes like I did, with the Z axis pointing down, your images are mirrored about the Y axis. Simply flip back.
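A sketch of that mapping, assuming an 8 inch wide drawing area; the mirror flip is only needed if your Z axis points down like mine:

def pixels_to_inches(polygons, image_width, drawing_width=8.0):
    """Scale pixel coordinates to inches with one scale factor, and un-mirror about the Y axis."""
    scale = drawing_width / image_width   # same scale on both axes keeps the aspect ratio
    return [[(-x * scale, y * scale) for x, y in poly] for poly in polygons]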

One more thing

Since the robot interpolates between points, extra points will need to be added before sending: traverse points. One to traverse to the start of each polygon, and one to raise the pen at the end to prepare for traversing to the next polygon.
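Something along these lines; the pen-up and pen-down Z heights here are made-up values:

def add_traverse_points(polygons, z_draw=0.0, z_up=0.5):
    """Flatten the polygons into one point stream, adding pen-up traverse moves between them."""
    stream = []
    for poly in polygons:
        x0, y0 = poly[0]
        stream.append((x0, y0, z_up))                    # traverse to the start, pen raised
        stream.extend((x, y, z_draw) for x, y in poly)   # draw the polygon
        x1, y1 = poly[-1]
        stream.append((x1, y1, z_up))                    # raise the pen before the next one
    return stream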

In the next post

Getting the data to the robot.

Appendix

Source image Lena: http://sipi.usc.edu/database/database.php?volume=misc&image=12#top

Implemented vectorizer code in Python (warning: dirty, dirty, hacked together code): https://github.com/aaronbot3000/deltadraw/tree/master/vectorizer

Sunday, October 23, 2011

Pythagoras, Version 1

Goals

The goals for this robot are straightforward: vectorize an image, draw it on paper, and do it reasonably well.

And so...

Here I go designing a robot. After figuring out the math discussed in the previous post, I head out and model it all in SolidWorks. And it looks all fine and dandy. A complete parts list can be found at the bottom of this post, in the Appendix.


Since I was away from Tech, I had no access to their big fun machinery, and was thus forced to buy custom parts. I got the parts cut out by Big Blue Saw, all from a sheet of 1/8" 6061 aluminum. Regarding Big Blue Saw: when they say the parts cut at standard quality will have taper, they mean it. I measured 0.02" of taper on a 1/8" thick part. It's definitely noticeable.

For the motors I used pretty cheap Hitec HS-5485HB hobby RC servos. I did spend a little extra money to get some better digital servos. When sent a signal that is out of range, instead of destroying themselves, they stop. Brilliant!

For the pen I chose one of the nicer pens I have lying around, a Pilot Precise v5, but honestly, any pen will work. Later I switched it out for fear of permanently damaging my nicer pens. The pen holder is very simple. Just a hole in a plate, with the pen held in by shaft collars.

The rods are 4-40 threaded rods. Nothing fancy, though they are a little flexible. Stiffer material (carbon fiber?) for a later version.

Soon, mail shipment!
After much tapping of screw holes and filing of slots and tabs, assembly!



And then, after hacking away at steel rods with a cheap (< $2) hacksaw from Wal-Mart, the assembly is complete.

All that's left to do is to program it, right?

For motor control, I use an mbed, an ARM Cortex-M3 based prototyping board made by NXP. It is similar to Arduino in that it has its own IDE and libraries that nicely abstract the peripherals into something like i2c.send(0x3F) instead of having to deal with registers and the like (of course, at the cost of control over your microcontroller). However, the mbed has way more peripherals and memory, both RAM and Flash, and it runs at a cool 96 MHz, as opposed to the Arduino's 16-ish MHz. The downsides are:
  1. You have to be connected to the internet to program it, since it uses a nasty online IDE and compiler.
  2. Due to the online IDE and compiler, no debugger is available, just print lines to serial.
While there does exist an offline compiler for the mbed, you lose access to the libraries, so I just sucked it up and worked with their online IDE. After a bit of code hackery, first movement. Huzzah!



But something was wrong. The first images were supposed to be straight lines, and they were distinctly curved.


Curved even more than the iffy servo control would suggest. As it turns out, I should have been more careful with inlined functions. After removing the explicitly inlined function, I get successful drawing!
Yes, I am very aware the square it should be is very reminiscent of a piece of toast.

But that's okay; acceleration and deceleration of the manipulator hadn't been implemented yet. It has been since, and how the path planning works is discussed below.

How the Path Planning Works


The algorithm works by simple linear interpolation between two points given in 3-D space. It also adds in a linear acceleration and deceleration zone, near the endpoints. The zones are currently set to be 0.5" long at both ends, though that can be varied. 

What happens is I have the current position of the manipulator, labeled (X1, Y1, Z1). I then create the step using the math as follows.
Basically, I am normalizing the line into a unit vector, then multiplying the vector by a step size to nudge the current position along the line.
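In equation form (reconstructed from the description above, with (X2, Y2, Z2) the target point and s the step size):

L = sqrt((X2 - X1)^2 + (Y2 - Y1)^2 + (Z2 - Z1)^2)
(X1, Y1, Z1) <- (X1, Y1, Z1) + s * ((X2 - X1)/L, (Y2 - Y1)/L, (Z2 - Z1)/L)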

The step size is how I control the acceleration and speed. Bigger step size, higher speed, at the cost of accuracy, due to the lack of control over the servos' velocity (this will be fixed in version 2, where I use stepper motors). The "checkpoints", or the set positions of the manipulator along the drawn line look like this.

Using inverse kinematics, I determine the angle of the motors, and since they are RC servos, I simply send the angle over a PWM signal and forget about it.
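Put together, the interpolation with the acceleration and deceleration zones looks roughly like this. A sketch: the step sizes and zone length are illustrative, and send_to_motors is a hypothetical stand-in for the inverse kinematics plus PWM output:

import math

def draw_line(start, end, min_step=0.005, max_step=0.02, zone=0.5):
    """Walk from start to end in small steps, ramping the step size near both endpoints."""
    dx, dy, dz = (e - s for e, s in zip(end, start))
    length = math.sqrt(dx*dx + dy*dy + dz*dz)
    if length == 0:
        return
    unit = (dx / length, dy / length, dz / length)

    traveled = 0.0
    while traveled < length:
        # Linear ramp: small steps inside the 0.5" accel/decel zones, full steps in between.
        edge_dist = min(traveled, length - traveled)
        ramp = min(edge_dist / zone, 1.0)
        step = min_step + ramp * (max_step - min_step)

        traveled = min(traveled + step, length)
        point = tuple(s + u * traveled for s, u in zip(start, unit))
        send_to_motors(point)  # hypothetical: inverse kinematics, then PWM to the servos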


However, right now I have to program individual points into the microcontroller before it will draw them. Perhaps there is a better way of getting points to draw. Perhaps, from a vectorized image? Streamed from the computer?


Regarding Arm Lengths

After a couple of days of experimentation, I realized the arm lengths were way too long for drawing. Over the entire drawing range, the arms move very little. A high upper-to-lower arm ratio gives great range, especially in the Z direction. But that's not what I need. I need great accuracy over the X-Y plane. And so, after some guessing, I cut the arms shorter, to half their original length. Yes, those are leftover tips of arms being used as foot pads.
Protip: when you're cutting things with a cheap Wal-Mart hacksaw and things have really, really started to jam up, check whether you have broken some, if not all, of the teeth on said hacksaw.


With the shorter arms, I can draw a reasonably accurate star.
Next, real pictures!

Appendix
Parts List:
  • Custom mechanical parts cut from 1/8" thick 6061 aluminum from Big Blue Saw
  • Three hobby RC servos, Hitec HS-5485HB, from RC Superstore
  • 4-40 threaded rod from McMaster-Carr
  • Aluminum spacers from McMaster-Carr
  • Shaft collars from McMaster-Carr
  • A bunch of various 4-40 socket cap screws and nuts
  • Dubro 4-40 ball links from RC Planet
  • NXP mbed
  • 6 volt supply
  • Power filtering capacitor, around 220 uF will do
  • Four pushbuttons
Source Code and Models 

Saturday, October 22, 2011

Pythagoras: The Drawing Delta Robot: Math and Design

I decided to build this robot after I saw this delta robot on YouTube. The speed and precision amazed me, and the three arms working in coordination intrigued me.

Coincidentally, I have always wanted to make some sort of non-Cartesian plotter. So I thought, why not a drawing delta robot?


The Math

A note on convention: The "upper arms" of the robot are the sections of each of the three arms that are directly and rigidly attached to the motors. The "lower arms" are the sections of the three arms that are directly attached to the manipulator through ball joints.

First and foremost is the kinematics of the robot. I can control the upper arms of the robot, attached to the motors, but the lower arms are free moving on ball joints. The range of each lower arm traces out a nice sphere in 3-D space. With the three arms, any given set of motor angles will give exactly one or two possible manipulator positions (assuming the lower arms are long enough to reach, that is, the spheres actually intersect).

There are two directions of kinematics to solve for the robot:
  • Forward: Given the motor positions, where is the manipulator?
  • Inverse: Given a manipulator position, what are the three motor angles needed to achieve it?
In my robot, I only use inverse kinematics, but it's good to understand the forward kinematics too.

Forward Kinematics

Since the angles of the upper arms are known, the endpoints of the upper arms can be found through some trigonometry.

However, to put these coordinates in terms of the entire robot, instead of the motor, a few offsets need to be added. First is the motor's offset from the robot's origin. This changes the Y and Z terms. Second is the rotation of each motor.
We can calculate the rotation using a rotation matrix which, given an angle theta and an X and Y coordinate, gives back the rotated X and Y coordinate.
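For reference, the standard 2-D rotation by an angle theta is:

x' = x*cos(theta) - y*sin(theta)
y' = x*sin(theta) + y*cos(theta)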
And so for each arm we can construct a sphere of every position the far end of the lower arm can reach, centered where the lower arm meets the upper arm (between the ends of the two rods).

Solving for the intersection of all three spheres gives the manipulator position. The intersection of two spheres gives you a circle in 3-D space; adding the third gives you one or two points. Then you always choose the lower point.
The problem I had with this is the offsets in the hand.
I never got it quite right. My best guess was to bring the three spheres closer together by an amount equal to the size of the offset in the manipulator. However, since my algorithm didn't need forward kinematics, I never fully implemented it.

Inverse Kinematics

Inverse kinematics is a tad bit easier, solving for the intersection of circles, instead of spheres.

Each lower arm creates a sphere of possible locations, offset by a constant amount by the manipulator. Since the upper arms can only change their Y and Z coordinates by rotating about the motor, they have a fixed X. This reduces the sphere to a circle, and the inverse kinematics problem to a circle-circle intersection rather than a circle-sphere one (marked in green).
However, this simplification comes at the cost of unreachable positions. Checks must be implemented to make sure that a given point has a solution before trying to find it.

Anyways, I found the intersection of the circles by taking the line between the centers of the circles and doing the math, as described on this website.

The code snippet that does the inverse kinematics follows. The actual "meat" of the computation is relatively small.
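In rough Python form, the per-arm computation looks something like this. A sketch only, with made-up dimensions and names (upper_len, lower_len, base_r, eff_r), not the exact code; the target point is assumed to already be rotated into the arm's own frame:

import math

def arm_angle(px, py, pz, upper_len=3.5, lower_len=7.0, base_r=2.0, eff_r=1.0):
    """Motor angle for one arm, with the target point already in that arm's frame."""
    # Shift the target from the pen tip to the ball joint, and move the origin to the motor.
    y = py + eff_r - base_r
    z = pz

    # The ball joint lies on the lower arm's sphere cut by the motor's plane (x = 0):
    # a circle of radius sqrt(r_sq) around (y, z). It also lies on the circle of radius
    # upper_len swept by the upper arm around the motor at the origin.
    r_sq = lower_len**2 - px**2
    d = math.hypot(y, z)                        # distance between the two circle centers
    if r_sq < 0 or d == 0:
        raise ValueError("point out of reach")

    a = (upper_len**2 - r_sq + d**2) / (2 * d)  # distance from the motor to the radical line
    h_sq = upper_len**2 - a**2
    if h_sq < 0:
        raise ValueError("point out of reach")
    h = math.sqrt(h_sq)

    # Two intersection points; pick the one matching the robot's elbow direction.
    ey = a * y / d - h * z / d
    ez = a * z / d + h * y / d
    return math.atan2(ez, ey)                   # upper arm angle in the Y-Z plane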

An important note

When I made the code, I decided to flip the coordinate axes so I could use positive Z for the manipulator position instead of negative Z. As it turns out, this reverses your image and makes visualizing the robot tricky. If I were to redo the code, I would use negative Z, to keep it consistent with real life.


Design Considerations

The robot's goal is to draw on US letter paper (8.5" x 11"). With this in mind, I guessed the length of the lower arms, simulating the arms in SymPy (symbolic math for Python) and then modeling them in SolidWorks. The only guideline I had was to keep the upper arms shorter than the lower arms for the best range and keep the two rods of each lower arm as far apart as possible for stability. I learned a lesson from that.

In version one of the robot, the upper arms were quite long, relative to the lower arm. This made for really great range in the Z dimension. For a drawing robot, this is useless. Since the arms were so long, small movements in the motors created large movements in the manipulator. Given the low precision of the servos, I got poor results in the accuracy of the drawing. I later cut the arms shorter with a hacksaw to get better results.

For the best range of motion for drawing, when the pen is centered on the paper, the upper arms should be near horizontal, relative to the robot, as pictured. 

Appendix