Bytes | Software Development & Data Engineering Community

Too many open files

AMD
Hello,

I need to split a very big file (10 gigabytes) into several thousand
smaller files according to a hash algorithm; I do this one line at a
time. The problem I have is that opening a file in append mode, writing
the line, and closing the file is very time consuming. I'd rather have
the files all open for the duration, do all the writes, and then close them
all at the end.
The problem I have under Windows is that as soon as I get to 500 files I
get the "Too many open files" message. I tried the same thing in Delphi
and I can get to 3000 files. How can I increase the number of open files
in Python?

Thanks in advance for any answers!

Andre M. Descombes
Feb 4 '08 #1
6 replies, 8357 views
Why don't you start around 50 threads at a time to do the file
writes? Threads are effective for IO. You open the source file,
start a queue, and start sending data sets to be written to the
queue. Your source file processing can go on while the writes are
done in other threads.
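A minimal sketch of the queue-plus-worker-threads approach described above (the bucket count, file naming scheme, and the idea of hashing on the line's first comma-separated field are all illustrative assumptions, not the OP's actual format):

```python
import queue
import threading

NUM_WORKERS = 4   # illustrative; the post suggests ~50
SENTINEL = None   # placed on the queue once per worker to signal shutdown

def worker(q):
    # Each worker pulls (path, line) pairs and does the blocking append.
    while True:
        item = q.get()
        if item is SENTINEL:
            q.task_done()
            break
        path, line = item
        with open(path, "a") as f:  # open/append/close per line, as the OP did
            f.write(line)
        q.task_done()

def split_file(source, bucket_count, prefix="bucket"):
    q = queue.Queue(maxsize=1000)  # bounded, so the reader can't outrun the writers
    threads = [threading.Thread(target=worker, args=(q,))
               for _ in range(NUM_WORKERS)]
    for t in threads:
        t.start()
    with open(source) as src:
        for line in src:
            # Hypothetical keying: hash the first comma-separated field.
            bucket = hash(line.split(",", 1)[0]) % bucket_count
            q.put((f"{prefix}_{bucket:03d}.txt", line))
    for _ in threads:
        q.put(SENTINEL)
    for t in threads:
        t.join()
```

Note that with several workers appending to the same file, the relative order of lines within one output file is not guaranteed.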
Feb 4 '08 #2
On Mon, 04 Feb 2008 13:57:39 +0100, AMD wrote:
The problem I have under Windows is that as soon as I get to 500 files I
get the "Too many open files" message. I tried the same thing in Delphi
and I can get to 3000 files. How can I increase the number of open files
in Python?
On Windows XP the Microsoft C runtime limits a process to 512 open stdio
streams, including stdin, stdout and stderr, so your code is probably
failing around file number 509.

http://forums.devx.com/archive/index.php/t-136946.html

It's almost certainly not a Python problem, because under Linux I can
open 1000+ files without blinking.

I don't know how Delphi works around that issue. Perhaps one of the
Windows gurus can advise if there's a way to increase that limit from 512?
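One possibility (an assumption on my part, not something the thread confirms): the MS C runtime exposes `_setmaxstdio`, which can raise the 512-stream default up to 2048. Whether this helps Python's own file objects depends on whether the interpreter build uses CRT stdio streams underneath. A guarded sketch via ctypes:

```python
import ctypes
import sys

def try_raise_stdio_limit(new_limit=2048):
    """Attempt to raise the MS C runtime's stdio stream limit (default 512).

    Returns the limit reported by the CRT, or None on non-Windows
    platforms (where no such CRT limit applies) or on failure.
    """
    if sys.platform != "win32":
        return None
    msvcrt = ctypes.cdll.msvcrt
    result = msvcrt._setmaxstdio(new_limit)  # CRT call; max accepted value is 2048
    return None if result == -1 else result
```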

--
Steven
Feb 4 '08 #3
Jeff wrote:
Why don't you start around 50 threads at a time to do the file
writes? Threads are effective for IO. You open the source file,
start a queue, and start sending data sets to be written to the
queue. Your source file processing can go on while the writes are
done in other threads.
I'm sorry, but you are totally wrong. Threads are a very bad idea for IO-bound
operations. Asynchronous event IO is the best answer for any IO-bound
problem: select, poll, epoll, kqueue or IOCP.

Christian

Feb 4 '08 #4
AMD wrote:
Hello,

I need to split a very big file (10 gigabytes) into several thousand
smaller files according to a hash algorithm; I do this one line at a
time. The problem I have is that opening a file in append mode, writing
the line, and closing the file is very time consuming. I'd rather have
the files all open for the duration, do all the writes, and then close them
all at the end.
The problem I have under Windows is that as soon as I get to 500 files I
get the "Too many open files" message. I tried the same thing in Delphi
and I can get to 3000 files. How can I increase the number of open files
in Python?

Thanks in advance for any answers!

Andre M. Descombes
Not quite sure what you mean by "a hash algorithm", but if you sort the file
(with an external sort program) on the field you want to split on, then you
only need one file open at a time.

-Larry
Feb 4 '08 #5
On Mon, 04 Feb 2008 12:50:15 -0200, Christian Heimes <li***@cheimes.de>
wrote:
Jeff wrote:
Why don't you start around 50 threads at a time to do the file
writes? Threads are effective for IO. You open the source file,
start a queue, and start sending data sets to be written to the
queue. Your source file processing can go on while the writes are
done in other threads.

I'm sorry, but you are totally wrong. Threads are a very bad idea for IO-bound
operations. Asynchronous event IO is the best answer for any IO-bound
problem: select, poll, epoll, kqueue or IOCP.
The OP said that he has this problem on Windows. The available methods
that I am aware of are:
- using synchronous (blocking) I/O with multiple threads
- asynchronous I/O using OVERLAPPED and wait functions
- asynchronous I/O using IO completion ports

Python does not (natively) support either of the latter two, only the first.
I don't have any evidence that threads are a very bad idea, as you claim;
although I wouldn't use 50 threads as suggested above, but a few more than
the number of CPU cores.

--
Gabriel Genellina

Feb 4 '08 #6
AMD
Thank you every one,

I ended up using a solution similar to what Gary Herron suggested:
caching the output in a list of lists, one per file, and only doing the
IO when a list reaches a certain threshold.
After playing around with the threshold I ended up with faster
execution times than originally, while having a maximum of two files
open at a time! It's only a matter of trading memory for open files.
It could be that using this strategy with asynchronous IO or threads
could yield even faster times, but I haven't tested it.
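The buffering strategy described above can be sketched as follows (the threshold, bucket count, and hash-on-first-field keying are illustrative assumptions): lines are cached per output file and flushed in batches, so at most one output file is open at any moment alongside the source.

```python
from collections import defaultdict

FLUSH_THRESHOLD = 10000  # lines buffered per output file before flushing

def split_with_buffers(source_path, bucket_count, name_for_bucket):
    buffers = defaultdict(list)

    def flush(bucket):
        # Open in append mode only for the duration of one batched write.
        with open(name_for_bucket(bucket), "a") as f:
            f.writelines(buffers[bucket])
        buffers[bucket].clear()

    with open(source_path) as src:
        for line in src:
            # Hypothetical keying: hash the first comma-separated field.
            bucket = hash(line.split(",", 1)[0]) % bucket_count
            buffers[bucket].append(line)
            if len(buffers[bucket]) >= FLUSH_THRESHOLD:
                flush(bucket)

    # Flush whatever remains at the end.
    for bucket in list(buffers):
        if buffers[bucket]:
            flush(bucket)
```

Raising `FLUSH_THRESHOLD` trades memory for fewer open/close cycles, which matches the OP's observation.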
Again, many thanks for all your suggestions.

Andre M. Descombes
Feb 5 '08 #7
