
How to deal with big files?

What can I do to open, write, and seek in a file whose size is 2 GB or more?

I know that you are limited by the int type, which is 32-bit, but programs
like gzip and zip can read and write files as big as 2 GB.

How can I break this limit in C, or even C++?
Dec 31 '07 #1
8 Replies


On Dec 31 2007, 11:59 pm, "هنداوى" <3D.v.Wo...@gmail.com> wrote:
> What can I do to open, write, and seek in a file whose size is 2 GB or more?
>
> I know that you are limited by the int type, which is 32-bit, but programs
> like gzip and zip can read and write files as big as 2 GB.
>
> How can I break this limit in C, or even C++?
In Win32 you can use the LARGE_INTEGER structure; the file APIs (for example
SetFilePointerEx()) take 64-bit offsets.
Many compilers also have built-in support for 64-bit integers, such as
__int64 in the Microsoft C++ compiler.
However, the standard C functions fseek() and ftell() take a long offset, so
they do not help when long is only 32 bits.
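For example, with the Microsoft compiler the 64-bit seek/tell pair can be
used roughly like this (a minimal sketch, assuming the Microsoft CRT; the
file name "big.dat" is just a placeholder):

#include <stdio.h>

int main(void)
{
    FILE *fp = fopen("big.dat", "rb");
    if (fp == NULL)
        return 1;

    /* _fseeki64()/_ftelli64() are Microsoft-specific versions of
       fseek()/ftell() that take a 64-bit (__int64) offset. */
    if (_fseeki64(fp, 5LL * 1024 * 1024 * 1024, SEEK_SET) == 0)  /* 5 GB */
        printf("now at offset %lld\n", (long long)_ftelli64(fp));

    fclose(fp);
    return 0;
}

Under raw Win32 the equivalent is CreateFile() plus SetFilePointerEx(), which
takes the offset as a LARGE_INTEGER.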
Dec 31 '07 #2

Some compilers provide big-file support: Microsoft and other compilers, like
lcc-win, provide special functions such as fseeki64 to cope with files bigger
than 2 GB under Windows.

But what about much more than 2 GB? Windows or Linux, for example, can deal
with far more than 2 GB, and they are written in C or C++. How do they do
that?
Dec 31 '07 #3

هنداوى wrote:
> Some compilers provide big-file support: Microsoft and other compilers,
> like lcc-win, provide special functions such as fseeki64 to cope with
> files bigger than 2 GB under Windows.
>
> But what about much more than 2 GB? Windows or Linux, for example, can
> deal with far more than 2 GB, and they are written in C or C++. How do
> they do that?
bigger than 2GB means BIGGER (greater than). This means a file of
1000 GB if you wish!
--
jacob navia
jacob at jacob point remcomp point fr
logiciels/informatique
http://www.cs.virginia.edu/~lcc-win32
Dec 31 '07 #4

> bigger than 2GB means BIGGER (greater than). This means a file of
> 1000 GB if you wish!
Is there any method available in C/C++, or in any other language, to deal
with that?

Python, for example, manages variable types by itself; in that case, is the
size effectively unlimited ("limited only by physical memory size"), or is it
also limited to 64 bits?
Dec 31 '07 #5

هنداوى wrote:
> What can I do to open, write, and seek in a file whose size is 2 GB or more?
>
> I know that you are limited by the int type, which is 32-bit, but programs
> like gzip and zip can read and write files as big as 2 GB.
>
> How can I break this limit in C, or even C++?
The size of an int isn't relevant; the relevant functions are fseek()
and ftell(), and they use long, not int.
'long' is only guaranteed to be at least 32 bits; it can be larger
than that, and on many systems it is. On systems where 'long' is a 64-
bit type, there is no problem (for now - wait a few decades and even
64 bits might not be enough).

I'm not very familiar with systems where files can be greater than
2GB, but long is too small to hold a file length, so I'll let others
tell you what to do in that case. However, in principle even when
'long' is only 32 bits, you could handle files longer than 2GB by
using fgetpos() and fsetpos() instead of fseek() and ftell(). The
interface is clumsier, but it's still workable. However, I know that
on at least some systems, the ability to work in that fashion to
handle files larger than 2GB is deliberately disabled; you have to use
special functions that are not defined by the C standard to access the
part of a long file that is after an offset of LONG_MAX.
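In principle, as long as you only need to return to positions you have
already visited, something like the following sketch works even where long is
only 32 bits (a minimal sketch; "big.dat" is just a placeholder file name):

#include <stdio.h>

int main(void)
{
    FILE *fp = fopen("big.dat", "rb");
    char buf[4096];
    fpos_t mark;

    if (fp == NULL)
        return 1;

    /* Read some data, then record the current position opaquely. */
    fread(buf, 1, sizeof buf, fp);
    if (fgetpos(fp, &mark) != 0) {
        fclose(fp);
        return 1;
    }

    /* Read further into the file ... */
    fread(buf, 1, sizeof buf, fp);

    /* ... and jump back without ever expressing the offset as a long. */
    if (fsetpos(fp, &mark) != 0) {
        fclose(fp);
        return 1;
    }

    fclose(fp);
    return 0;
}

The clumsy part is that fpos_t is opaque: you can only jump to positions you
previously saved with fgetpos(), not to an arbitrary computed offset.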
Dec 31 '07 #6

On Dec 31, 9:48 am, jameskuy...@verizon.net wrote:
> [quoted text snipped]
Hence, the FAQ:

12.25: What's the difference between fgetpos/fsetpos and ftell/fseek?
What are fgetpos() and fsetpos() good for?

A:  ftell() and fseek() use type long int to represent offsets
    (positions) in a file, and may therefore be limited to offsets
    of about 2 billion (2**31-1). The newer fgetpos() and fsetpos()
    functions, on the other hand, use a special typedef, fpos_t, to
    represent the offsets. The type behind this typedef, if chosen
    appropriately, can represent arbitrarily large offsets, so
    fgetpos() and fsetpos() can be used with arbitrarily huge files.
    fgetpos() and fsetpos() also record the state associated with
    multibyte streams. See also question 1.4.

References: K&R2 Sec. B1.6 p. 248; ISO Sec. 7.9.1, Secs. 7.9.9.1,
    7.9.9.3; H&S Sec. 15.5 p. 252.
Dec 31 '07 #7

On Dec 31, 11:16 am, user923005 <dcor...@connx.com> wrote:
> On Dec 31, 9:48 am, jameskuy...@verizon.net wrote:
> [quoted text snipped]
Thanks a lot, that's really the information I wanted.
Dec 31 '07 #8

On Dec 31 2007, 6:17 pm, "هنداوى" <3D.v.Wo...@gmail.com> wrote:
> [...]
> But what about much more than 2 GB? Windows or Linux, for example, can
> deal with far more than 2 GB, and they are written in C or C++. How do
> they do that?
There is also open() with the O_LARGEFILE flag, from the large-file-support
extensions found on POSIX systems. There may also be libraries that help with
large files.
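On Linux, for instance, the usual route (a sketch assuming a glibc-based
system; the file name "huge.dat" is a placeholder) is to request 64-bit file
offsets at compile time and then use fseeko()/ftello(), which take off_t
rather than long:

#define _FILE_OFFSET_BITS 64     /* make off_t 64-bit even on 32-bit builds */
#define _POSIX_C_SOURCE 200809L  /* expose fseeko()/ftello() */

#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    FILE *fp = fopen("huge.dat", "rb");
    if (fp == NULL)
        return 1;

    /* Seek to the 3 GB mark; the offset is computed as off_t, not long. */
    if (fseeko(fp, (off_t)3 * 1024 * 1024 * 1024, SEEK_SET) == 0)
        printf("now at offset %lld\n", (long long)ftello(fp));

    fclose(fp);
    return 0;
}

With _FILE_OFFSET_BITS=64, glibc also switches open(), lseek(), and stat() to
64-bit offsets, so the explicit O_LARGEFILE flag is rarely needed in new code.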
Jan 1 '08 #9
