Maximum G-Code file size



  • Is there a maximum allowable size for G-Code files in DWC?

    I have a 66MB file which causes DWC to repeatedly restart when it tries to display the files list.
    The list displays briefly with “scanning file” next to the big one then DWC disconnects.

    (No, I don’t know why the file is that big either. It has a lot of smooth curves so probably non-optimal design on my part)



  • I don't know what the maximum size is, but I have a few files over 100MB (one of which is 140MB) and DWC doesn't act strangely with any of them. Do you maybe have an odd character in the file name? You haven't accidentally uploaded an STL instead of a G-Code file, have you? (It's been done before.)


  • administrators

    This issue is caused by the cluster size on the SD card being small, combined with the SD card being slow and/or fragmented. Solutions:

    1. Try firmware 1.21RC3, which contains additional code to avoid this
    2. Use a faster SD card
    3. Reformat the SD card using the largest cluster size available (64KB)
    4. If the SD card is fragmented, defragment it
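    To put rough numbers on the cluster-size point above (a back-of-the-envelope sketch, not Duet firmware code; the 66MB figure is the file from the original post), here is how many cluster-chain links that file occupies at the two cluster sizes:

```python
# Rough arithmetic only: how many FAT cluster-chain links a file occupies.
FILE_SIZE = 66 * 1024 * 1024  # the 66MB file mentioned in this thread

def clusters(file_size, cluster_size):
    """Clusters (and therefore FAT chain links) the file occupies."""
    return -(-file_size // cluster_size)  # ceiling division

print(clusters(FILE_SIZE, 4 * 1024))   # 4KB clusters  -> 16896 links
print(clusters(FILE_SIZE, 64 * 1024))  # 64KB clusters -> 1056 links
```

    Sixteen times fewer links means far fewer card reads whenever the firmware has to walk the chain.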



  • Thanks - I’ll take a look at the SD card.

    Is it worth waiting for the stable release, or is 1.21RC3 pretty much free of print-killing bugs?


  • administrators

    1.21RC3 works reliably for me, except for a bug that sometimes causes DWC to disconnect after I upload a file. I can reconnect immediately. Read the release notes about changes you may need to make to your homing files.



  • I have a delta configuration so hopefully no changes are needed.

    I checked my SD card and, as you suspected, the cluster size was 4KB.

    Since this is the card that was supplied with the Duet and hadn't been reformatted, I thought I'd better let you know in case any stock needs reformatting.

    As for fragmentation, is this even an issue any more?

    Back in the stone age, when I was repairing computers made out of rocks and sharpened sticks, shifting from one track to another involved physically moving the read head and took forever (relatively speaking). Fragmentation that split sequential blocks throughout multiple tracks would cause significant performance issues.

    SD cards, on the other hand, have no moving parts, and generally speaking retrieving one block takes pretty much the same amount of time as any other, regardless of the logical track it might be located on. Caching and fault recovery by substitution of redundant tracks make the actual location of the block even less relevant (or even predictable).

    This means that, in theory, defragmentation is unnecessary and may actually reduce the life of the card by performing additional write operations (as you know, they can only sustain a limited number).

    Caching, of course, might struggle a little though I suspect most algorithms are intelligent enough to predict the next block needed.

    Small block sizes, on the other hand, would increase communications overhead significantly: many flash cards and SSDs are specifically tuned to perform far better with sequential reads than with individual 4KB reads.

    I can see that having the potential to impact performance and cause timeouts. Anyone who's ever tried to back up large numbers of files to a memory stick will have seen how massive files transfer in a matter of seconds, whereas folders containing a large number of small files can take minutes or even hours. Increasing the block size will help here.

    It should be noted that large block sizes will reduce the maximum number of small files that can be stored, though I doubt most of us will actually run into that problem even with a 4GB card.
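    For what it's worth, that small-file limit can be estimated with simple arithmetic (a sketch assuming FAT-style allocation, where every file occupies at least one cluster; real numbers will be a little lower after filesystem overhead):

```python
# Sketch only: upper bound on file count if every file takes one cluster.
def max_files(card_size, cluster_size):
    return card_size // cluster_size

CARD = 4 * 1024**3  # the 4GB card mentioned above
print(max_files(CARD, 4 * 1024))   # 4KB clusters  -> 1048576 files
print(max_files(CARD, 64 * 1024))  # 64KB clusters -> 65536 files
```

    Even the 64KB case allows tens of thousands of files, which supports the point that few of us will ever hit the limit.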


  • administrators

    Fragmentation is less of an issue with SD cards, but it still causes the cluster link table to be spread over a larger number of sectors, which increases the time taken to seek to near the end of the file and read information from there.
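    As an illustration of that seek cost (an assumption-laden sketch: FAT32 with 4-byte entries and 512-byte sectors, and a best-case contiguous file), this is roughly how many FAT sectors must be read to follow the chain to the end of the 66MB file:

```python
# Sketch: FAT sectors read to walk a file's cluster chain (FAT32 assumed).
ENTRIES_PER_SECTOR = 512 // 4  # 4-byte FAT32 entries, 512-byte sectors

def fat_sectors(file_size, cluster_size):
    chain_links = -(-file_size // cluster_size)   # ceiling division
    return -(-chain_links // ENTRIES_PER_SECTOR)  # best case: contiguous file

print(fat_sectors(66 * 1024 * 1024, 4 * 1024))   # -> 132 sectors
print(fat_sectors(66 * 1024 * 1024, 64 * 1024))  # -> 9 sectors
```

    Fragmentation scatters those chain entries over many more FAT sectors, which is why it still matters even without a moving read head.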



    That’s where the cluster size makes a big difference, though presumably you don’t have enough spare RAM to hold even one 64KB cluster, let alone the two or three you’d need to cache the cluster table for an 8GB drive.

    Working with tablets and desktops, I’ve been spoiled and had almost forgotten the joys of trying to cram an operating system and application into what is basically a glorified oven timer.
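    Rough arithmetic on that RAM point (a sketch assuming FAT32's 4-byte entries, not a statement about the firmware's actual caching): holding the entire cluster table for an 8GB card at 64KB clusters would take about half a megabyte.

```python
# Sketch: RAM needed to cache an entire FAT32 cluster table.
def fat_table_bytes(card_size, cluster_size):
    return (card_size // cluster_size) * 4  # 4 bytes per FAT32 entry

print(fat_table_bytes(8 * 1024**3, 64 * 1024))  # -> 524288 bytes (512KB)
```

    That is far more than a small microcontroller can spare, which is why the firmware reads the table from the card as needed.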

