
Re: Set nodatacow per file?



> I get the following errors when running fileflags on large (>2GB) database files:
>
> open(): No such file or directory
>
> open(): Value too large for defined data type

http://www.gnu.org/software/coreutils/faq/#Value-too-large-for-defined-data-type

"""The message "Value too large for defined data type" is a system
error message reported when an operation on a large file is attempted
using a non-large file data type. Large files are defined as anything
larger than a signed 32-bit integer, or stated differently, larger
than 2GB.

Many system calls that deal with files return values in a "long int"
data type. On 32-bit hardware a long int is 32-bits and therefore this
imposes a 2GB limit on the size of files. When this was invented that
was HUGE and it was hard to conceive of needing anything that large.
Time has passed and files can be much larger today. On native 64-bit
systems the file size limit is usually 2GB * 2GB. Which we will again
think is huge."""

