Why mm/min?
-
Hi,
This might be a simple question with many ideological, preferential, or historical reasons behind it, but is there any technical justification for using mm/min instead of mm/s as the feed rate unit?
Acceleration is typically expressed in mm/s², and stepper drivers operate in microsteps per second. From a user perspective, mm/s also feels more intuitive and tangible.
Some might argue that mm/min provides greater resolution, but since we can use floating-point values, that doesn't seem like a valid argument to me.
I’d love to hear your thoughts.
Thanks!
-
@JoA-RHU it comes from the original Gcode specification; see https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=823374
Feed rate is set in units per minute, which can be mm, inches, or degrees. When the specification was drawn up, most CNC machines were mills and other slower-moving machining tools, so inches or mm per second would have meant a lot of decimal places. Feed rates for most tooling are also specified in inches or mm per minute.

Units per second have only become more useful with the advent of faster axes, and that is still somewhat limited to 3D printers. There's no particular advantage, yet, to changing over, particularly as any conversion (usually just for the user's benefit) can be done in software, e.g. the Gcode preview in most slicers shows the speed in mm per second.
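To illustrate that last point, here is a minimal sketch of the conversion (a hypothetical helper, not taken from any particular slicer's source): software can work in mm/s everywhere and only multiply by 60 when writing the Gcode F word.

```python
def feed_word(speed_mm_s: float) -> str:
    """Convert a speed in mm/s to a Gcode F parameter (mm/min)."""
    return f"F{speed_mm_s * 60:g}"

# Example: internal speeds in mm/s, converted only at Gcode output time.
print(feed_word(50))    # F3000 -> 50 mm/s travel move
print(feed_word(0.5))   # F30   -> 0.5 mm/s slow probing move
```

The reverse (divide by 60 for display) is all a preview needs, so the unit in the file itself costs the user nothing.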
Ian