On 06 Oct, Chris Johnson <chris@xxxxxxxxxxxxxxxxxxxxx> wrote:

> In article <5396701ce5jcgl@xxxxxxxxxxxxxxx>, Jim Lesurf
> <jcgl@xxxxxxxxxxxxxxx> wrote:

> > Otherwise I'll see if I can DIY a program to do this.

> Should be fairly simple to write a utility to read a file say 512B at
> a time and write the blocks into a second file. Then, after you have
> reached say 256 MB, close outfile1 and carry on reading blocks and
> writing them into outfile2, and continue in the same way 'til the
> input file has been completely read, i.e. EOF reached.

Agreed. I was thinking of using fread and fwrite to do it in larger
chunks. But I'm hesitating in the hope that either someone has already
done a version that runs on the ARMiniX, or that Jeff/Steffan provide a
fix or workaround. No point in my re-inventing a wheel if someone else
has already done better! :-)

Logically speaking, it would be sensible if CDVDBurn allowed a large
image to be split into chunks smaller than 2048MB. Ideally, as a
user-selectable choice. But again, the failure to copy files larger
than 1024MB shouldn't arise in the first place!

[Added later] Jeff has now told me that Fat32fs has been coded to work
for files up to 4GB - 1 byte in size. So it looks like the problem
isn't Fat32fs as such. Is there a deeper problem here with the ARMiniX
or RO5.19?

So far I've only been trying an SD card in the SD card slot. When I get
a chance I'll try a USB stick or an 'SD-USB' reader with an SD card to
see if these give the same problem.

Slainte,

Jim

-- 
Electronics  http://www.st-and.ac.uk/~www_pa/Scots_Guide/intro/electron.htm
Armstrong Audio  http://www.audiomisc.co.uk/Armstrong/armstrong.html
Audio Misc  http://www.audiomisc.co.uk/index.html

---
To alter your preferences or leave the group, visit
//www.freelists.org/list/armini-support
List-related queries to info@xxxxxxxxxxxx
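
[For anyone wanting a starting point, below is a minimal sketch in C of
the kind of splitter Chris describes, reading in larger (64 KiB) blocks
with fread/fwrite as Jim suggests. The 2047MB chunk size and the
numbered output-file naming are illustrative assumptions only, not
anything CDVDBurn or Fat32fs specifies, and the sketch has not been
tested on RISC OS.]

/* split.c -- sketch: split a large file into fixed-size chunks,
 * reading and writing in 64 KiB blocks with fread/fwrite.
 * Chunk size and output naming are illustrative assumptions. */
#include <stdio.h>
#include <stdlib.h>

#define BUF_BYTES   (64 * 1024)                 /* block size per fread/fwrite  */
#define CHUNK_BYTES (2047UL * 1024UL * 1024UL)  /* keep each chunk below 2048MB */

int main(int argc, char *argv[])
{
    static char buf[BUF_BYTES];
    FILE *in, *out = NULL;
    unsigned long written = 0;   /* bytes written to the current chunk */
    int chunk = 0;
    size_t got;
    char name[256];

    if (argc != 2) {
        fprintf(stderr, "Usage: %s <imagefile>\n", argv[0]);
        return EXIT_FAILURE;
    }
    in = fopen(argv[1], "rb");
    if (in == NULL) {
        perror("fopen input");
        return EXIT_FAILURE;
    }

    /* Read blocks until EOF, opening a new output file each time the
     * current chunk reaches CHUNK_BYTES. */
    while ((got = fread(buf, 1, BUF_BYTES, in)) > 0) {
        if (out == NULL || written >= CHUNK_BYTES) {
            if (out != NULL)
                fclose(out);
            sprintf(name, "%s.%03d", argv[1], chunk++);
            out = fopen(name, "wb");
            if (out == NULL) {
                perror("fopen output");
                return EXIT_FAILURE;
            }
            written = 0;
        }
        if (fwrite(buf, 1, got, out) != got) {
            perror("fwrite");
            return EXIT_FAILURE;
        }
        written += (unsigned long)got;
    }

    if (out != NULL)
        fclose(out);
    fclose(in);
    return EXIT_SUCCESS;
}

[The block size and chunk size are deliberately multiples of each
other, so chunk boundaries fall exactly on CHUNK_BYTES; only the final
chunk will be shorter.]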