I am researching this series of your gearmotors. I don’t understand why a 100% increase in gear ratio (e.g. from 50:1 to 100:1) produces only 30% more torque (170 to 220). If it’s always the same motor with different gearboxes, I would expect those values to track nearly linearly, perhaps straying a bit due to minor additional friction. Please clue me in. Thanks.
In an ideal gearbox, the torque would double, but in reality there are nontrivial losses in the gearbox that grow with the number of gear stages.
Sorry I’m a bit slow, but I need a bit more detail. I understand more gears = more surfaces = more friction = more overall losses of efficiency, but the chart depicts an enormous loss. Let’s use the 50:1 figures as a baseline. All losses attributed to that train are factored in. Now, to achieve 100:1 we just add a single 2:1 stage. I’m having a hard time understanding how that one additional stage introduces so much friction that we lose 70% of the theoretical torque gain. 70%? Is my math way off on that?
If you double 170 units of torque, you get 340. The specs say to expect 220 units, which is 65% of 340. You could attribute the difference to a 35% loss somewhere.
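The arithmetic above can be sketched quickly. The 170 and 220 figures come from the spec chart under discussion (units as given there); the rest is just the ideal-doubling comparison:

```python
# Back-of-envelope check of the torque figures discussed above.
baseline_torque = 170              # stall torque at 50:1, from the spec chart
observed_torque = 220              # stall torque at 100:1, from the spec chart
ideal_torque = baseline_torque * 2 # a lossless extra 2:1 stage would double it

fraction_of_ideal = observed_torque / ideal_torque
print(f"ideal torque:      {ideal_torque}")            # 340
print(f"fraction of ideal: {fraction_of_ideal:.0%}")   # 65%
print(f"apparent loss:     {1 - fraction_of_ideal:.0%}")  # 35%
```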
In any case, it would be very poor engineering practice to design a mechanical system that expects to make use of the maximum rated torque.
Our listed stall torque specifications are intended as rough, conservative estimates, and some are based on theoretical calculations or extrapolations rather than actual measurements of that gearmotor at that voltage. It is possible that the relationship between the gear ratio and the torque is really a little more linear than our specs indicate, but as Jim said, one should not be planning on operating these motors anywhere near stall, so it generally is not worth putting a great deal of effort into measuring the stall torque with high precision.
Agreed on design considerations. I had no intention of operating anywhere near stall torque. I just knew from experimentation that I needed X torque to “barely turn,” so I figured I would buy something whose stall torque was roughly 3-4 times that. It was only then, when I really scrutinized those charts, that the confusion started.
Since my last post I googled gear train efficiency for various gear types. References disagree, but they all fall into the range of 92-98% per stage for spur gears. Taking the most pessimistic view (92% per stage), it would take about five additional gear stages (0.92^5 ≈ 0.66) to drop power output down to 65% of input power. So something is fishy. We’re either missing some piece of the puzzle, or those numbers were simply generated with far more pessimistic constants.
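That stage count can be checked directly. This assumes the pessimistic 92%-per-stage figure quoted above and asks how many compounded stages reach a 65% overall efficiency:

```python
import math

# How many extra gear stages at 92% efficiency each would it take to drop
# output to 65% of ideal? Efficiencies compound multiplicatively, so:
#   eta ** n = target  =>  n = log(target) / log(eta)
eta = 0.92     # pessimistic per-stage spur gear efficiency (from the references)
target = 0.65  # overall efficiency implied by the 220-vs-340 torque figures

stages = math.log(target) / math.log(eta)
print(f"stages needed: {stages:.1f}")      # ~5.2
print(f"0.92 ** 5 = {eta ** 5:.3f}")       # ~0.659
```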
At this point I think experimentation is the only way to know.