I am in need of some advice. I am trying to build what I think is a not-too-standard power supply to power some stepper motors. I think I already have the stepper motor control figured out, but I'll need a supply to provide them with the appropriate voltage at their required current. They are spec'd at 3.6V/phase and 6.1A/phase (they are relatively large stepper motors). I originally thought of finding a 3.6V voltage regulator with enough current capability, but I couldn't find much of anything. I did find some adjustable regulators in stock that can supply 10A, and their adjustment range is 1.25V-4.65V, so they should be adjustable to the required 3.6V. I actually need to run two of these stepper motors, so I assumed I'd use two of these regulators. Now, the other problem is that the regulators need a 6V input, so I'd need a high-current 6V power supply. If I couldn't find one of those at a reasonable price, I might try a 6V regulator(s) approach fed from a 12V (AC-powered) power supply.
When doing these DC-DC conversions, do I need to be matching just wattage levels or current levels? If it's just wattage, then the 20A of 3.6V from the two 10A-capable regulators would be 72W combined, and therefore the 6V supply would only need to be 12A capable. And then the 12V supply (if I go that route) would be 6A. Am I thinking about this correctly?
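The wattage bookkeeping above can be sketched out as a quick calculation, assuming an ideal (lossless) DC-DC stage where power in equals power out, so current scales inversely with voltage. Real converters are not lossless, so the input supply needs some headroom on top of these figures.

```python
# Ideal-converter bookkeeping: power in = power out (real converters lose some).
V_OUT = 3.6            # regulator output voltage (motor spec)
I_OUT = 10.0 * 2       # two 10 A regulators fully loaded

p_out = V_OUT * I_OUT     # total output power: 72 W
i_at_6v = p_out / 6.0     # current the 6 V supply must source: 12 A
i_at_12v = p_out / 12.0   # current the 12 V supply must source: 6 A

print(p_out, i_at_6v, i_at_12v)
```

This matches the numbers in the post: 72W combined, 12A at 6V, 6A at 12V.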
All of the converting seems like a hokey way to do it, but I don't see a straightforward way that doesn't cost several hundred dollars from a non-standard power supply house.
Thanks for the help!
Oh yeah, one more question. Is there a way to do this with a more common power supply voltage and a few high-wattage zener diodes (or a bank of smaller ones) to drop to the needed voltage?
I think what you're trying to do is generally not going to be that easy from scratch, but I don't have any 10A stepper drivers to recommend, either. Zener diodes shouldn't be in your design except as really high-power units for transient suppression. If you cascade many regulators, the losses start to stack up. A single regulator with 80% efficiency might sound okay, but cascade three of them and you're down to about 50%. At that point, getting 72W out means you're wasting roughly another 72W, and that will make things quite hot.
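The stacking-losses point is easy to verify numerically. This is an illustrative sketch (the 80% figure is hypothetical, not from any datasheet): efficiencies of cascaded stages multiply, so three 80% stages end up around 51% overall.

```python
# Efficiencies of cascaded converter stages multiply.
eta = 0.80                  # assumed per-stage efficiency (illustrative)
total_eta = eta ** 3        # three stages in series: ~0.512, i.e. ~51%

p_out = 72.0                # watts delivered to the motors
p_in = p_out / total_eta    # watts that must be drawn from the source
p_wasted = p_in - p_out     # watts turned into heat across the three stages

print(round(total_eta, 3), round(p_in, 1), round(p_wasted, 1))
```

So at roughly 50% overall efficiency, the heat dissipated is about the same as the power delivered, which is the "wasting 72W" point above.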
I would look for a switching regulator module that can output 3.6V with an input around 12V, and then use a computer power supply to get the 12V from a wall outlet.
I hadn't even been considering the efficiencies. Was I at least on the right track thinking the conversion matches in wattage (despite the efficiency losses)?
The computer power supply is an interesting idea. I've got plenty of old ATX power supplies lying around. What would happen if I tried to use the 3.3V rail on an ATX to run the motors that need 3.6V? Would they just pull a little more current? Is there a possibility of damaging the motors? I shouldn't be running them near their max torque capability.
If you had a good enough current on a 3.3V supply, that would probably be the easiest by far. I somehow thought computer supplies might not have 3.3V, but that seems dumb now.
The power approach is valid as long as you’re talking about switching power supplies. If you’re using linear regulators or parts like zener diodes, you’re just throwing away the excess voltage without getting any extra current in return.
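The difference between the two regulator types can be shown with idealized numbers (the 6V-in/3.6V-out/10A case from earlier in the thread): a linear regulator passes the same current in as out and burns the dropped voltage as heat, while an ideal switcher conserves power and draws less input current instead.

```python
# Linear vs. switching regulator, idealized comparison.
v_in, v_out, i_out = 6.0, 3.6, 10.0

# Linear regulator: input current equals output current,
# so the voltage drop becomes pure heat in the regulator.
i_in_linear = i_out
p_heat_linear = (v_in - v_out) * i_out       # 24 W of heat

# Ideal switcher: power in = power out, so input current is lower.
i_in_switcher = (v_out * i_out) / v_in       # 6 A drawn from the 6 V input

print(p_heat_linear, i_in_switcher)
```

That 24W of heat per linear regulator (times two motors) is why the linear approach needs serious heatsinking, while a switcher mostly just needs a smaller input supply.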
Yeah, I had to check the wiki to make sure about the 3.3V, but it is there on the motherboard connector. I just don't know if that rail is meant to supply as much current as I'd need: 12.2A. That seems like a lot of current. Though apparently there are 3 separate 3.3V wires on that connector, all maybe, what, 18ga wire? I guess I could always set up a test.
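A quick back-of-the-envelope check on the wire question, assuming the load splits evenly across the three 3.3V wires (roughly true if they're the same gauge and length): each wire then carries about 4A, which is comfortably within common chassis-wiring ratings for 18ga.

```python
# Per-wire current if the 3.3 V load splits evenly across three wires
# (assumption: equal gauge and length, so roughly equal resistance).
i_total = 12.2          # total 3.3 V current needed (A)
n_wires = 3
i_per_wire = i_total / n_wires   # ~4.1 A per 18 ga wire

print(round(i_per_wire, 2))
```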
Here is a ~$30 (shipped) power supply that can do 30A at 3.3V: http://www.newegg.com/Product/Product.aspx?Item=N82E16817709011 There are a lot of power supplies on Newegg.
Man, it’s crazy how cheap computer parts are these days.
I looked through some of my spare ATX supplies and found a little mini 300W one that's spec'd to provide 18A on the 3.3V rail. So hopefully that'll work. Any predictions on what supplying 3.3V to 3.6V motors will do?
My prediction is that it will be a little slower than it could be. Let us know how it goes!
It worked! Well, at least mostly. I found a nice little unipolar driver circuit that uses IRFZ44 MOSFETs and hooked it up to one of my ATX power supplies. The little one I had must have a problem, because the voltage got pulled way down when I powered it up. I found another old one in the closet that's a more normal desktop PC size and is spec'd on the label for 20A on the 3.3V rail. Once I hooked it up, the motor turned well and changed speed and direction with the adjustments on the driver circuit. However, it was still pulling the 3.3V down considerably. Turning fast, the voltage was about 2.94V at 0.6A, but the motor had pretty low torque capacity at that high speed (which is generally how steppers work). With the speed turned all the way down, the voltage dropped and the current came up, 1.85V at 2.2A, but the torque capacity was way up...I had to hold pretty tight to stop the motor. This motor is spec'd at a holding torque of 1125 oz-in.
I'm confused, though, as to why the voltage is being pulled down so much. Even the higher 2.2A is a lot lower than the 20A capacity spec'd on the label. From the pinout section on ATX power supplies here: en.wikipedia.org/wiki/ATX there is a 3.3V sense line (on pin 13, along with one of the 3.3V wires) which they say should be connected to the 3.3V on the motherboard to sense the 3.3V level. Maybe this is some kind of compensation that maintains 3.3V when higher loads are detected? Although I was connecting to pins 1 and 13 when trying to run my motor, so you'd think that would be working. Should I try connecting to pins 1, 2, and 13 (all three 3.3V wires)?
I also monitored the power going to the driver board, which just requires 12V (there's a linear 5V regulator on the driver board); I took that 12V from the drive connectors on the power supply, which is also where I connected its ground. The 12V never wavered and was only pulling ~3.3mA. Maybe the grounds for the drives and for the motherboard connector are isolated somehow? Maybe I should try tying them together?
At least I’m getting somewhere…
I haven’t used 3.3V off a computer power supply, so I don’t know much about it. However, 20A is a lot for just a wire, so multiple wires are probably intended. Maybe you can look at a motherboard and see if those lines are obviously connected on that side.
Made some more progress. After reading through this article: wikihow.com/Convert-a-Comput … wer-Supply
I figured I should try adding the loading power resistor they say to put on the 5V line, since it seems ATX supplies need a decent load to stay regulated correctly. As if the motor load wasn't enough?
Anyway, I added a cheap wire-wound 10-ohm, 10-watt resistor to the 5V line, and now it supplies power to the motor much more consistently. The motor is powered at all speeds and has much more torque at those speeds.
Low speed holds a constant 3.31V at ~0.4A.
High speed drops to about 2.76V, and the current goes up to 4.4A.
The resistor gets pretty hot, certainly too hot to hang on to, but it's only drawing about 0.5A at 5V, so it's dissipating about 2.5 watts.
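Those resistor numbers check out with Ohm's law:

```python
# Ohm's-law sanity check on the 5 V dummy-load resistor.
v, r = 5.0, 10.0
i = v / r          # 0.5 A through the resistor
p = v * i          # 2.5 W dissipated -- well within a 10 W rating, but hot

print(i, p)
```

2.5W is only a quarter of the resistor's rating, but without a heatsink or airflow a small wire-wound part will still get uncomfortably hot at that dissipation.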
It seems like the lower voltage just reduces the available torque, but it still seems pretty high. I can't stop the 5/8" shaft by hand at low speed, but I can at high speed, though not as easily as when the voltage was lower. As for the MOSFETs on the driver, they aren't heatsinked, and I can't really even perceive them as warm to the touch.
For those that are interested, all this is being done in an attempt to add 2-axis CNC capability to my metal working lathe.