# File size limit 2GB - 1, ulimit set to unlimited.

## jsedwards

I'm running an old version of Gentoo (2.6.7-hardened-r7 kernel, ~3 years old) and I just ran into a problem where it won't let any file exceed 2 gigabytes.  I checked with ulimit -a and it says the file size is unlimited.  I also called getrlimit64 at the beginning of the program (_LARGEFILE64_SOURCE is defined) and it returns -1 for both the soft and hard limits, which I assume means unlimited.

I found this old thread from 2004-2005: https://forums.gentoo.org/viewtopic-t-218447-highlight-wget+fedora+dvd.html and it seems that this same problem showed up around the same time I built this system.

Is there some limit compiled in somewhere, so it just won't let the file size go over 32 bits?

Thanks

  -Scott

----------

## barophobia

My guess is that your filesystem limits you to 2 gigs.  That shouldn't happen, though, even with a 3-year-old computer.

----------

## jsedwards

The filesystem is ext3, which handles files > 2 GB on all of my other systems; I've got an 8 GB file on one of them, also on ext3.  I'm just wondering if I did something wrong when I built the Gentoo system, like needing to set a flag somewhere at compile time to allow files > 2 GB?  Of course, if that's the case, there's really nothing I can do about it at this point.  I was just hoping it was some configuration thing that I could easily change.

----------

## eccerr0r

Is libc up to date on the machine?

You need all four: filesystem support, kernel support, libc support, and application support to handle large files...

----------

## jsedwards

 *Quote:*   

> Is libc up to date on the machine?

 

I guess that would be a no.  I haven't dared to update it, since it's 800 miles away and the last time I updated it, it wouldn't come back up after I rebooted.  Hopefully in the next month or so I'll make the trip, replace some hardware, and upgrade it.  I do have to say it's impressive that it has run this long:

```
 19:55:18 up 880 days, 20:12,  1 user,  load average: 0.00, 0.00, 0.00
```

----------

## eccerr0r

It actually scares the heck out of me when one of my machines reaches an uptime that high...

partially because it means that I forgot to upgrade it  :)

Luckily most upgrades can be done hot, but the libc image used by init probably won't change until it's rebooted... and upgrading sshd is always a scare...

That being said, you might be able to test whether you can write larger files from a chroot with a newer libc and newer apps, to see whether it's your kernel or filesystem that's not up to date.

----------

## danomac

Did you enable large single file support in the kernel?

```
Block layer  --->
   [*]   Support for Large Single Files
```

I forgot to compile this in once and had this issue.

----------

## jsedwards

 *Quote:*   

> Did you enable large single file support in the kernel?

 

I don't even see that option (CONFIG_LSF) in my kernel config, so that might just be the problem.

 *Quote:*   

> It actually scares the heck out of me when one of my machines reaches that high in uptime...
> 
> partially because it means that I forgot to upgrade it  

 

That was sort of what happened: I just never had time to do what I originally set it up to do.  I got too busy with more important things and months went by, so I finally gave up on the original idea altogether.  When I at last decided what to do with it, it was way too out of date and configured completely wrong for the new purpose.  So I decided to just let it go until I can travel there and set it up for the new use.  I have just been using it to back up files until then.  Unfortunately, one of the files I have been storing just exceeded 2 gigs.

Thanks for all of your help!
