K.C. Chan
Trusted Contributor

wget file size limitation

Does anyone know if there is a file size limitation with wget and, if so, how do I get around it? Is there a patch for it? I got the following message when trying to download a 3 GB ISO file: "2097100K -> .......... .......... .......... .......... ....File size limit exceeded (core dumped)". Thanks.
Reputation of a thousand years can be determined by the conduct of an hour
14 REPLIES
K.C. Chan
Trusted Contributor

Re: wget file size limitation

The command is "wget -c --passive-ftp ftp://host/dir/file", where the file is about 3 GB. I've tried this with ftp via the reget command and got the same result, so I think this may be an ftpd file limitation. I am using wu-ftpd; does anyone have any idea? Thanks.
Reputation of a thousand years can be determined by the conduct of an hour
Stuart Browne
Honored Contributor

Re: wget file size limitation

Given the limited information you've given us here, I'm guessing it's the obvious reason.

'wget' uses a C long to track how far through the file it is, and on this platform a long is a signed 32-bit value.

Simply put, 2GB baby.
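
To put numbers on it (a quick shell sketch, not wget's actual code): a signed 32-bit long tops out at 2^31 - 1 bytes.

    $ echo $((2**31 - 1))              # largest value a signed 32-bit long can hold, in bytes
    2147483647
    $ echo $(( (2**31 - 1) / 1024 ))   # the same limit expressed in KB
    2097151

Note where your transfer died: around 2097100K, i.e. right on that 2 GB boundary.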

Unfortunately, their mailing lists are useless, and have no decent information in them.

Official home page (http://www.gnu.org/software/wget/wget.html) mentions nothing about a file size limit either.

And sorry, I don't feel like poking through the source code to confirm it, but that's what it looks like.

Any particular reason you are using 'wget' instead of a real FTP client (NCFTP's 'ncftpget' command comes to mind)?
One long-haired git at your service...
K.C. Chan
Trusted Contributor

Re: wget file size limitation

ftp bombs out as well when it reaches 2 GB. Any ideas?
Reputation of a thousand years can be determined by the conduct of an hour
Stuart Browne
Honored Contributor

Re: wget file size limitation

Which 'ftp'?

There's the standard 'ftp' FTP client, then there's the third-party (but usually distributed) 'ncftp' package.

There are a number of things on a Linux box that can do 'ftp'.

Also, what is the FTP server that's serving it out? It's possible that it has a limitation also, which might be worth looking into.
One long-haired git at your service...
K.C. Chan
Trusted Contributor

Re: wget file size limitation

I am using wu-ftpd-2.6.2-5 and running RH 7.3. Filesystem quota is off and the user's ulimit is unlimited. Any idea why ftp pukes at a file size of ~3 GB? Thanks.
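
For the record, this is how I checked (assuming the quota tools are installed; 'unlimited' is what I see on my box):

    $ ulimit -f    # per-process maximum file size, in blocks
    unlimited
    $ quota -v     # per-user disk quotas; reports no limits when quotas are off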
Reputation of a thousand years can be determined by the conduct of an hour
K.C. Chan
Trusted Contributor

Re: wget file size limitation

I believe the ftp client is "ftp-0.17-13". Any idea what is causing ftp to bail out on me like this? Does anyone know of an FTP utility that can get ~3 GB across the Internet without any hassle? Thanks.
Reputation of a thousand years can be determined by the conduct of an hour
Stuart Browne
Honored Contributor

Re: wget file size limitation

I don't know. Try 'ncftpget' (it should be distributed with RH 7.3).

See if it has the same issue.
One long-haired git at your service...
K.C. Chan
Trusted Contributor

Re: wget file size limitation

I did a man on ncftp, but I couldn't find anything similar to "wget -c". Is there an option for ncftpget to continue an interrupted download? Thanks.
Reputation of a thousand years can be determined by the conduct of an hour
Stuart Browne
Honored Contributor

Re: wget file size limitation

If you do 'man ncftpget', you should see the simple option of:

ncftpget host /local/path /path/to/file

That will retrieve that file. By default, it will attempt to resume.

You can also use FTP URLs:

ncftpget ftp://host/path/to/file

to do the same thing.
One long-haired git at your service...
K.C. Chan
Trusted Contributor

Re: wget file size limitation

OK, so by default it will resume the download. Does that mean that if I interrupt the download (e.g. Ctrl-C out of it) and start it again later, ncftpget will pick up where I left off, or will it re-download the file from the start? Or does it only resume on a per-session basis (i.e. only if the connection dropped and I haven't quit the session)? Could someone shed some light on this? Thanks.
Reputation of a thousand years can be determined by the conduct of an hour
Stuart Browne
Honored Contributor

Re: wget file size limitation

Ever thought of just trying it instead of asking first?

No, it's not session based. Yes, if you break out of it, it will continue where it left off.
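
For example (a sketch; host and path are placeholders):

    $ ncftpget ftp://host/path/to/file    # start pulling the file down
    ^C                                    # break out of it partway through
    $ ncftpget ftp://host/path/to/file    # run it again later; it resumes from the partial local file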
One long-haired git at your service...
david_69
Advisor

Re: wget file size limitation

What filesystem are you trying to write to?
Does the filesystem type support large files?
(Ext2 and ext3 do not support files larger than 2 GB.)
You might try scp, netpipes, netcat, or NFS.
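
For instance, a bare netcat transfer would look something like this (a sketch; receiver.host and big.iso are placeholders, and option syntax differs between netcat builds - the traditional one wants -p for the listen port):

    receiver$ nc -l -p 9000 > big.iso           # listen on port 9000, write whatever arrives to a file
    sender$   nc receiver.host 9000 < big.iso   # connect and push the file down the pipe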
fall down seven times, stand up eight
Stuart Browne
Honored Contributor

Re: wget file size limitation

Umm, david, ext2 and ext3 most certainly do support files larger than 2 GB.

The 2 GB file-size limitation was a kernel limit, not a filesystem design limit.

The late 2.2-series and the 2.4-series kernels support file sizes of up to 2 TB.
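
Userland matters too, mind you: on a 32-bit box an application has to be compiled with large-file support before it can seek or write past 2 GB. glibc will tell you the flags it wants (a sketch; this is the output on a 32-bit Linux box):

    $ getconf LFS_CFLAGS    # compile flags needed for 64-bit file offsets on a 32-bit system
    -D_FILE_OFFSET_BITS=64

That's most likely why wget and the stock ftp client fall over at exactly 2 GB even on a kernel that handles bigger files.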
One long-haired git at your service...