# File size limit exceeded?

## vfxpro

I decided to download the Fedora Core 2 DVD iso. (OK, I know, I know. Sorry for the sacrilege. I still like to follow the Red Hat flavored distros.)

```
wget ftp://mirrors.kernel.org/fedora/core/2/i386/iso/FC2-i386-DVD.iso
```

About the time the file gets to 2 gigs, wget exits with:

```
File size limit exceeded
```

The disk I'm saving to is ext3 and has over 30 gigs free. I'm running kernel-2.6.7-gentoo-r11. What gives? I can't find anything in the filesystem part of the kernel config that would prohibit large files. Do I need to enable High Memory Support? That doesn't seem like it would be the culprit. I also have quota support disabled in the kernel. (I assume that no quota == unlimited per user.)

BTW, what is the actual file size limit in ext3? I see different numbers on the net and they don't agree.

If anyone is interested I can post my config, but for now I'll spare you.

----------

## srlinuxx

I'm not sure, but it's worth investigating the /etc/security/limits.conf or

/proc/sys/fs/file-max

----------

## vfxpro

No quotas set at all in 

/etc/security/limits.conf

/proc/sys/fs/file-max is 12386

What does that mean? 12 gigs?

I was able to download the file with another client. It seems the limitation is something in wget. I might need to configure it manually and recompile. Should I report this as a bug? I could swear I've used wget to grab files this big in the past.

----------

## 4e71

 *Quote:*   

> 
> 
> /proc/sys/fs/file-max is 12386 
> 
> What does that mean? 12 gigs?
> ...

 

This is the maximum number of open files; it has nothing to do with file size.
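
For what it's worth, the two limits live in different places; here's a quick sketch of how to inspect both (on Linux):

```
# System-wide ceiling on the number of open file handles -- a count,
# not a size:
cat /proc/sys/fs/file-max

# Per-process file *size* limit, in 512-byte blocks; normally
# "unlimited". Hitting a size limit is what the shell reports as
# "File size limit exceeded":
ulimit -f
```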

I'm having the same problem transferring big files on my LAN with ftp, wget, and Opera (all fail at 2 GB).

On the server side I tried vsftpd and pure-ftpd, with the same result.

Still busy investigating...

(No problem with openssh<->sftp, so that's what I'm using at the moment.)

+L

----------

## 4e71

Well, nothing to do with the servers.

I rebooted in Windows 2000 and both wget and opera were able to download the same 3 GB file from my ftp server without a glitch.

It seems VERY STRANGE that 3 different clients (wget, the canonical ftp, and Opera) all have the same limitation.

Also strange is the Gentoo changelog for wget, because it seems this problem was already solved some time ago (but maybe the patch was later removed?):

 *Quote:*   

> 
> 
> *wget-1.9 (22 Oct 2003)
> 
>   22 Oct 2003; Seemant Kulleen <seemant@gentoo.org> wget-1.9.ebuild:
> ...

 

----------

## Rainmaker

I believe the file size limit on ext3 is 2 gig. Try copying the file from the partition it's on now to ext3... it will probably fail.

Or try

```
dd if=/dev/zero of=tempfile bs=1M count=2049
```

----------

## c4

I just pulled down the Fedora iso to my home directory. It's an ext3 partition and it handled the iso just fine:

```
bash-2.05b$ ls -l temp/*.iso

-rw-r--r--  1 mv users 4273096704 Sep  5 17:32 temp/FC2-x86_64-DVD.iso

```

I used Kasablanca v0.4. However, I have had problems with both gftp and iglooFTP while transferring files over 2.00 GB. They both stopped transferring / crashed as the downloaded file reached 2.00 GB... so as far as I can tell, it isn't an ext3 issue, but instead some limitation in the clients used?

----------

## srlinuxx

I don't reckon I ever wget'd a file larger than an iso, but I've bittorrent'd 4 and 5 gig'rs being saved to ext3.  Who's got a link to a file that big so I can do some testing?

----------

## 4e71

It really is just the clients. I checked several today (including source code, when available).

The most obvious thing is the file size: internally it is stored as a (signed) int, so on a 32-bit system it simply cannot be bigger than 0x7FFFFFFF = 2 GB.

 (hey! a reason to buy an Athlon64!  :Wink:  ) . 

"unsigned long long" should be used instead (gcc supports it and works ok on 32 bit systems) and of course other changes would have to be made.

Finally I emerged lftp, a pretty decent program that was able to download a 3 GB file without any problem.

The thing that annoyed me a bit is the fact that both Opera and wget DID work correctly on win2k. 

+L

----------

## FloppyMaster0

Well, you could try adding this to your CFLAGS and re-merging wget:

```
-D_FILE_OFFSET_BITS=64
```

This SHOULD compile the program with large file support. I haven't tried this though.

----------

## vfxpro

 *Quote:*   

> Well, you could try adding this to your CFLAGS, and re-merging wget 
> 
> ```
> -D_FILE_OFFSET_BITS=64
> ```
> ...

 

I finally had a chance to test this and yes, it seems to have fixed the problem. Now I can use wget to pull down those huge files.

----------

## OdinsDream

... All this, and I just finished writing a post about how great Unix applications are because programming philosophy dictates never building in artificial limits.

Well, so much for that. Isn't the idea to pass file operations along to the OS so this kind of thing doesn't happen? I'm recompiling wget as we speak; I too wanted to get the Fedora DVD iso.

Just thought I'd post an update:

I recompiled wget with the new compiler setting, and whether it did any good, I'm not sure. I assume it did, but there's a second issue: apparently the progress bar still has trouble with the file size. I tried resuming my download from the 2.0 GB I had already retrieved. The solution ended up being setting the progress bar to something else:

```
wget -c <url> --progress=dot:mega
```

...worked for me. I'm not so sure I'm very impressed with wget right now, though. All the statistics are negative, indicating it still doesn't understand the file size. I'm reminded of the days in Windows when you'd copy a large file and the time remaining would overflow. Are these bugs well known?

Edit Again:

The above did not work... see this thread:

https://forums.gentoo.org/viewtopic.php?t=226056&highlight=

----------

## navier-stokes

Hum, I'm having the same problem copying a 2.4 gig file over a mounted NFS share.

Strange. I'm sure I've copied >2 gig files over my LAN before...

----------

## Gentree

 *Quote:*   

>  All this, and I just got finished writing a post about how great unix applications are because programming philosophy dictates never building in artificial limits.

 

I know what you mean about wget not handling large files, but it seems to do most things well. I don't recall ever seeing the slightest problem with portage in this respect, so it was probably a good choice there.

However, it does seem poor that there were several apps that tripped up on this one.

I don't think this is an "artificial limit"; it's a bug, an oversight.

With the advent of DVD size files this issue should be addressed across the board. 

Have you posted bug reports?

 :Cool: 

----------

## Cklein

I'm having the same problem with a Mandrake dvd iso. I'll recompile wget and report what happens, just to confirm this is the issue...

----------

## StifflerStealth

I use NcFTP, and I have no troubles. It comes with a program that is basically the same as wget, but I never noticed any trouble with it. Well, you could almost call it a wget clone, but with the NcFTP core. There is another one it comes with that is strictly for putting. Or you can just use the big NcFTP client itself. I am almost considering changing my make.conf to use ncftpget instead of wget, but I don't know if portage would support it. http://www.ncftp.com/

I don't recall if it has the 2 gig issue or not. I don't think it does, but I'll DL the Fedora DVD since it is legal to DL.

-Stiff

----------

## boudie

Had the same problem downloading the Debian sarge dvd using wget. It gave me

"File size limit exceeded" at 2 gig. Resumed the download using ncftp.

Seems to work. Was fearing a kernel recompile.

----------

