Hello, @YuriConfessor asked me to implement his robot arm, a 4-axis palletizing robot. I am currently developing it, but what is really needed is a way to test the result exactly. This means more than saying "position xyz is reached, looks ok" or "the firmware result data look ok". This thread is meant to gather ideas for implementing an as-exact-as-possible verification based on the physical result instead. That would also make it possible to find discrepancies between the firmware result and the physical result: say the firmware reports 100,100,100 but the physical result is 110,90,100 (plus information about orientation), then there is a problem in the setup, the code, or somewhere else.
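To make that mismatch check concrete, here is a minimal sketch (in Python; the function name and the 0.5 mm tolerance are made up for illustration) of the comparison I have in mind: compute the error vector between the firmware-reported position and the externally measured one, and flag it when it exceeds a tolerance.

```python
import math

def position_error(reported, measured, tolerance=0.5):
    """Compare a firmware-reported XYZ position against an externally
    measured one. Returns the error vector, its magnitude, and whether
    the deviation is within tolerance (all units in mm)."""
    error = tuple(m - r for r, m in zip(reported, measured))
    magnitude = math.sqrt(sum(e * e for e in error))
    return error, magnitude, magnitude <= tolerance

# The example from above: firmware says 100,100,100, physics says 110,90,100.
err, mag, ok = position_error((100, 100, 100), (110, 90, 100))
print(err, round(mag, 3), ok)   # (10, -10, 0) 14.142 False
```

The same idea extends to orientation by comparing the measured tool orientation against the one the kinematics model predicts.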
Let's say you have a robot/3D printer/CNC, a firmware implementation, and some G-Code that commands movements. The G-Code is executed, the axes move, and the endpoint ends up at some position and orientation.
How can we verify that the physical movement matches what the G-Code requested? How can this be implemented? Which measurement methods are possible, and can they be automated?
E.g. optical (laser based, camera sensors, ...), mechanical, or an additional MCU that measures and compares with the Duet's data. If the measurement is fast enough, it could later also be used for collision detection.
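As one building block for the automation side: RepRapFirmware reports its current position with M114, so a comparison script could poll the Duet, parse the reply, and compare it with the external sensor's reading. A rough sketch of the parsing part, assuming a reply format like `X:100.000 Y:90.500 Z:10.000 ...` (the exact format can differ between firmware versions, so treat this as a starting point):

```python
import re

def parse_m114(reply):
    """Extract axis coordinates from an M114-style position report,
    e.g. 'X:100.000 Y:90.500 Z:10.000 E:0.000 Count X:8000 ...'.
    Returns a dict like {'X': 100.0, 'Y': 90.5, 'Z': 10.0}."""
    coords = {}
    for axis, value in re.findall(r'([XYZ]):(-?\d+(?:\.\d+)?)', reply):
        # keep only the first occurrence per axis, ignoring the step counts
        coords.setdefault(axis, float(value))
    return coords

print(parse_m114("X:100.000 Y:90.500 Z:10.000 E:0.000 Count X:8000 Y:7240 Z:800"))
# {'X': 100.0, 'Y': 90.5, 'Z': 10.0}
```

The same loop could then feed both values into the error check and log every move where they disagree.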
This thread shall gather ideas, and I will add my own ideas here as well.