<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: wget file size limitation in Operating System - Linux</title>
    <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893384#M3600</link>
    <description>Ever thought of just trying it instead of asking first?&lt;BR /&gt;&lt;BR /&gt;No, it's not session based.  Yes, if you break out of it, it will continue where it left off.</description>
    <pubDate>Sat, 01 Feb 2003 06:56:41 GMT</pubDate>
    <dc:creator>Stuart Browne</dc:creator>
    <dc:date>2003-02-01T06:56:41Z</dc:date>
    <item>
      <title>wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893373#M3589</link>
      <description>Does anyone know if there is a file size limitation with wget and, if so, how do I get around it? Is there a patch for it? I got the following message when trying to download a 3 GB ISO file: "2097100K -&amp;gt; .......... .......... .......... .......... ....File size limit exceeded (core dumped)". Thanks.</description>
      <pubDate>Thu, 30 Jan 2003 20:18:24 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893373#M3589</guid>
      <dc:creator>K.C. Chan</dc:creator>
      <dc:date>2003-01-30T20:18:24Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893374#M3590</link>
      <description>The command is "wget -c --passive-ftp ftp://host/dir/file", where the file is about 3 GB. I've also tried this with ftp via the reget command and got the same result, so I think this may be an ftpd limitation. I am using wu-ftpd; does anyone have any idea? Thanks.</description>
      <pubDate>Thu, 30 Jan 2003 21:30:34 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893374#M3590</guid>
      <dc:creator>K.C. Chan</dc:creator>
      <dc:date>2003-01-30T21:30:34Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893375#M3591</link>
      <description>Given the limited information you've given us here, I'm picking it's the obvious reason.&lt;BR /&gt;&lt;BR /&gt;'wget' uses a pointer to a LONG to store where it's up to in a file.&lt;BR /&gt;&lt;BR /&gt;Simply put, 2GB baby.&lt;BR /&gt;&lt;BR /&gt;Unfortunately, their mailing lists are useless and have no decent information in them.&lt;BR /&gt;&lt;BR /&gt;The official home page (&lt;A href="http://www.gnu.org/software/wget/wget.html" target="_blank"&gt;http://www.gnu.org/software/wget/wget.html&lt;/A&gt;) mentions nothing about a file size limit either.&lt;BR /&gt;&lt;BR /&gt;And sorry, I don't feel like poking through the source code to confirm it, but that's what it seems to be.&lt;BR /&gt;&lt;BR /&gt;Any particular reason you are using 'wget' instead of a real FTP client (NCFTP's 'ncftpget' command comes to mind)?</description>
      <pubDate>Thu, 30 Jan 2003 22:35:32 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893375#M3591</guid>
      <dc:creator>Stuart Browne</dc:creator>
      <dc:date>2003-01-30T22:35:32Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893376#M3592</link>
      <description>ftp bombs out as well when it reaches 2G. Any idea?</description>
      <pubDate>Thu, 30 Jan 2003 23:20:31 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893376#M3592</guid>
      <dc:creator>K.C. Chan</dc:creator>
      <dc:date>2003-01-30T23:20:31Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893377#M3593</link>
      <description>Which 'ftp'?&lt;BR /&gt;&lt;BR /&gt;There's the standard 'ftp' FTP client, and then there's the third-party (but usually distributed) 'ncftp' package.&lt;BR /&gt;&lt;BR /&gt;There are a number of things on a Linux box that can do 'ftp'.&lt;BR /&gt;&lt;BR /&gt;Also, what FTP server is serving it out? It's possible that it has a limitation as well, which might be worth looking into.</description>
      <pubDate>Fri, 31 Jan 2003 00:20:28 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893377#M3593</guid>
      <dc:creator>Stuart Browne</dc:creator>
      <dc:date>2003-01-31T00:20:28Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893378#M3594</link>
      <description>I am using wu-ftpd-2.6.2-5 and running RH7.3. Filesystem quota is off and the ulimit for the user is unlimited. Any idea why ftp pukes on a file size of ~3 GB? Thanks.</description>
      <pubDate>Fri, 31 Jan 2003 01:09:46 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893378#M3594</guid>
      <dc:creator>K.C. Chan</dc:creator>
      <dc:date>2003-01-31T01:09:46Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893379#M3595</link>
      <description>I believe the ftp client is "ftp-0.17-13". Any idea what is causing ftp to bail out on me like this? Does anyone know of an ftp utility that would get ~3 GB across the internet without any hassle? Thanks.</description>
      <pubDate>Fri, 31 Jan 2003 01:18:30 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893379#M3595</guid>
      <dc:creator>K.C. Chan</dc:creator>
      <dc:date>2003-01-31T01:18:30Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893380#M3596</link>
      <description>I don't know.  Try 'ncftpget' (should be distributed with RH73).&lt;BR /&gt;&lt;BR /&gt;See if it has the same issue.</description>
      <pubDate>Fri, 31 Jan 2003 01:37:29 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893380#M3596</guid>
      <dc:creator>Stuart Browne</dc:creator>
      <dc:date>2003-01-31T01:37:29Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893381#M3597</link>
      <description>I did a man on ncftp, but I can't find anything similar to "wget -c". Is there an option for ncftpget to continue an interrupted download? Thanks.</description>
      <pubDate>Fri, 31 Jan 2003 03:46:15 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893381#M3597</guid>
      <dc:creator>K.C. Chan</dc:creator>
      <dc:date>2003-01-31T03:46:15Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893382#M3598</link>
      <description>If you do 'man ncftpget', you should see the simple form:&lt;BR /&gt;&lt;BR /&gt;ncftpget host /local/path /path/to/file&lt;BR /&gt;&lt;BR /&gt;That will retrieve the file. By default, it will attempt to resume.&lt;BR /&gt;&lt;BR /&gt;You can also use FTP URLs:&lt;BR /&gt;&lt;BR /&gt;ncftpget ftp://host/path/to/file&lt;BR /&gt;&lt;BR /&gt;to do the same thing.</description>
      <pubDate>Fri, 31 Jan 2003 03:59:26 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893382#M3598</guid>
      <dc:creator>Stuart Browne</dc:creator>
      <dc:date>2003-01-31T03:59:26Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893383#M3599</link>
      <description>OK, so by default it will resume the download. Does this mean that if I interrupt the download, e.g. Ctrl-C out of it, and then start it again later, ncftpget will pick up where I left off rather than re-download the file from the start? Or does it only resume on a per-session basis (i.e. only if you lost the connection and haven't quit out of the session)? Could someone shed some light on this? Thanks.</description>
      <pubDate>Fri, 31 Jan 2003 17:20:00 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893383#M3599</guid>
      <dc:creator>K.C. Chan</dc:creator>
      <dc:date>2003-01-31T17:20:00Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893384#M3600</link>
      <description>Ever thought of just trying it instead of asking first?&lt;BR /&gt;&lt;BR /&gt;No, it's not session based.  Yes, if you break out of it, it will continue where it left off.</description>
      <pubDate>Sat, 01 Feb 2003 06:56:41 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893384#M3600</guid>
      <dc:creator>Stuart Browne</dc:creator>
      <dc:date>2003-02-01T06:56:41Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893386#M3602</link>
      <description>what file system are you trying to write to?&lt;BR /&gt;Does the filesystem type support large files?&lt;BR /&gt;(Ext2 and Ext3 do not support files larger than 2GB).&lt;BR /&gt;You might try scp, or netpipes, netcat, or nfs.</description>
      <pubDate>Thu, 20 Feb 2003 18:54:02 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893386#M3602</guid>
      <dc:creator>david_69</dc:creator>
      <dc:date>2003-02-20T18:54:02Z</dc:date>
    </item>
    <item>
      <title>Re: wget file size limitation</title>
      <link>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893387#M3603</link>
      <description>Umm, david, ext2 &amp;amp; ext3 most certainly do support files larger than 2GB.&lt;BR /&gt;&lt;BR /&gt;The 2GB file-size limitation was a kernel limit, not an FS design limit.&lt;BR /&gt;&lt;BR /&gt;The late 2.2 series and the 2.4 series of kernels support file sizes of up to 2TB.</description>
      <pubDate>Thu, 20 Feb 2003 22:23:45 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-linux/wget-file-size-limitation/m-p/2893387#M3603</guid>
      <dc:creator>Stuart Browne</dc:creator>
      <dc:date>2003-02-20T22:23:45Z</dc:date>
    </item>
  </channel>
</rss>

