Michael,

My guess is that you are on a 32 bit system, and the sftp utility cannot
handle any file greater than 2 gig. When you get to 2 gig, it is done.
Same thing for cp, scp and something else I can't recall. Yes, been there
done that.

The method I used when trying to overcome the 2 gig limit to copy a file
to another machine on the network (my laptop in this case) was to use
netcat. Netcat was started on my laptop listening on a port and
redirecting output to a file. Netcat was used on the linux box to send
the file to that port on my laptop. I don't recall the exact syntax, but
it was fairly simple and easy to figure out after a couple minutes with
the man page.

You may want to investigate setting up a TCP tunnel with ssh, and then
using netcat on top of it, as that would give you the security of sftp.
Or not, I haven't actually tried that bit. :)

On Tue, 15 Mar 2005 15:52:57 -0500, Kline.Michael
<Michael.Kline@xxxxxxxxxxxx> wrote:
> I'm trying to use sftp for "security" reasons, but I've got some
> tablespaces that are quite large, and they are giving me some very
> strange errors. The old "bad" ftp routine handles these just perfectly.
> The disk being sent to is more than large enough, some 60GB.
>
> Can't find much on google, but there's talk that if it's a 32-bit module
> it may not be able to handle files > 4gb. Still, these are only about
> 3GB.

--
Jared Still
Certifiable Oracle DBA and Part Time Perl Evangelist
--
//www.freelists.org/webpage/oracle-l
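
For reference, the netcat transfer described above usually looks roughly
like the following. This is only a sketch: the hostname, port number, and
file name are made up for illustration, and the listen-option syntax
differs between netcat versions, so check the man page on your systems.

    # On the receiving machine (the laptop): listen on a port and write
    # whatever arrives to a file. (Traditional netcat syntax; some
    # versions want "nc -l 5000" instead.)
    nc -l -p 5000 > users01.dbf

    # On the linux box: connect to that port and send the file.
    nc laptop.example.com 5000 < users01.dbf

    # Untested variation for the ssh tunnel idea (assumes sshd is running
    # on the laptop): forward local port 5000 on the linux box through
    # ssh to port 5000 on the laptop, then send through the tunnel.
    ssh -f -N -L 5000:localhost:5000 oracle@laptop.example.com
    nc localhost 5000 < users01.dbf

Since netcat itself will not complain if anything is dropped, running a
checksum such as md5sum on the file at both ends is a cheap way to verify
the copy.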