Job history
-
Here is a neat idea for those who are interested in generating a running list as a job/print history.
My end G-code in SuperSlicer looks like this:

```
M400
echo >>"history_job.g" state.time ^ ":file:" ^ job.file.fileName ^ ":elapsed time:" ^ job.duration
M400
M0 ; end print
```
I originally wanted to build the list after the M0 command, but the job data goes null as soon as M0 is called.
It generates a nice running list of completed jobs that look like this:
```
2022-01-08T22:53:45:file:0:/gcodes/Calibration cube_Voron_0.2.gcode:elapsed time:3816
2022-01-09T00:53:06:file:0:/gcodes/Calibration cube_Voron_0.2.gcode:elapsed time:3962
2022-01-09T18:39:19:file:0:/gcodes/Bridge calibration_Voron_0.2.gcode:elapsed time:1678
```
Anyone have any suggestions for improving the data output?
-
@alex-cr This might be a good reason to run an SBC.
A nice little front-end plugin and a database containing all the print information for each file would look really good.
-
Yeah, that would be a cool use for the SBC. I have issues with the SBC right now, as it doesn't appear to have a high enough polling rate during triggering.
If I was good enough I would consider writing a plugin for it.
-
Nice idea.
There is no real string handling capability in RRF, so formatting is limited to what you have now.
The other issue is there's no easy way to monitor the file size.
An SBC would overcome both issues.
-
I'd change the formatting to separate each field with a comma. Maybe also add the total extruded volume or length to get an idea of filament use.
Actually, one of my axioms is "data space is endless", and another is "more data is better". So I might add as many things as I could think of, and be sure to put a 64GB card in your Duet. That way, some day when you say "I wonder....", you can move the file to a real computer and do some data mining.
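Picking up the comma suggestion, one way the end G-code might be extended is something like the following (a sketch, not tested here: `job.rawExtrusion` is the commanded filament length in mm from the RRF object model, available in recent RRF 3.x firmware; the field order and the `.csv` filename are just examples):

```
M400
echo >>"history_job.csv" state.time ^ "," ^ job.file.fileName ^ "," ^ job.duration ^ "," ^ job.rawExtrusion
M400
M0 ; end print
```

One record per line with comma separators loads straight into a spreadsheet for the kind of data mining mentioned above.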
-
Just out of curiosity I ran a test.
I created a file with 1 million lines, each containing about the same data as you have saved. The result is a file of about 94MB.
RRF takes approx 0.14 seconds to open and append a line to that file.
So for practical purposes, file size growth would not seem to be a limiting or troublesome factor in this use case unless your SD card was very small.

Macro:

```
; state.upTime is in seconds and state.msUpTime is the millisecond part,
; so combine them into a single millisecond count before subtracting
var StartTime = state.upTime*1000 + state.msUpTime
echo >>"testfile.txt" "ADDED LINE in " ^ (state.upTime*1000 + state.msUpTime - var.StartTime)/1000 ^ " seconds"
echo "ADDED LINE in " ^ (state.upTime*1000 + state.msUpTime - var.StartTime)/1000 ^ " seconds"
```
Output:

```
Line 999997 2022-01-08T22:53:45:file:0:/gcodes/Calibration cube_Voron_0.2.gcode:elapsed time:3816
Line 999998 2022-01-08T22:53:45:file:0:/gcodes/Calibration cube_Voron_0.2.gcode:elapsed time:3816
Line 999999 2022-01-08T22:53:45:file:0:/gcodes/Calibration cube_Voron_0.2.gcode:elapsed time:3816
Line 1000000 2022-01-08T22:53:45:file:0:/gcodes/Calibration cube_Voron_0.2.gcode:elapsed time:3816
ADDED LINE in 0.1380000 seconds
```
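For anyone wanting to repeat the test, a file like that could be generated with an RRF meta-command loop along these lines (a sketch, since the generator macro wasn't shared; reduce the count for a quicker run):

```
; generate a 1,000,000-line test file with sample job-history data
var i = 0
while var.i < 1000000
  echo >>"testfile.txt" "Line " ^ (var.i + 1) ^ " 2022-01-08T22:53:45:file:0:/gcodes/Calibration cube_Voron_0.2.gcode:elapsed time:3816"
  set var.i = var.i + 1
```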
-
Thanks for looking at that! Great idea for a test.