Bytes | Software Development & Data Engineering Community

Memory limit reached during CLI query, or what?

Running a CLI script that fopen's a file, parses the lines into an
array, checks the array entries against a few regular expression
qualifiers (i.e. !eregi("bot",$entry)) and dumps the good entries into
a MySQL db.

The fopen'ed file is a 410MB text file with log entries formatted like
a .CSV file ("" text qualifiers and comma separators).

Running the script from a web browser in Windows98: Works fine, except
it stops about 1/10th of the way through (@~40MB).

Running the script via CLI on the RH9 server: Works fine, except I get
"Terminated" about 3/4 of the way through (@~300MB).

Am I running into a memory usage issue or what? There's no
distinction between the last entry run successfully and the one after
it in the log file dump, and everything goes into the db
fine...until the script terminates.
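For reference, here is a minimal sketch of the pattern described above (filename and regex are assumptions, and preg_match stands in for the old eregi). The point is that accumulating every line into an array keeps the entire 410MB file, plus PHP's per-element overhead, resident in memory at once:

```php
<?php
// Hypothetical reconstruction of the script described: read every
// line into an array first, then filter. The whole file is held in
// memory at once -- the likely failure mode.
$lines = array();
$fp = fopen('access_log.csv', 'r');   // hypothetical filename
while (($line = fgets($fp)) !== false) {
    $lines[] = $line;                 // accumulates the entire file
}
fclose($fp);

foreach ($lines as $line) {
    // preg_match with /i is the modern equivalent of !eregi("bot", $entry)
    if (preg_match('/bot/i', $line)) {
        continue;                     // skip disqualified entries
    }
    // ... INSERT the good entry into the MySQL db here ...
}
```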

The Windows98 machine is running 512MB/PC133 DRAM with a 120GB HDD,
connected via a full T1 to the RH9 server with the MySQL db.

The RH9 server is a production server (I know...but I'm running out of
options on where to process this behemoth text file!) running 1GB RAM
and twin 60GB HDDs.

I will probably take what I have, trim the file down to the remaining
entries, and append those to the db; however, I would like to understand
why the script is cutting out.

Thanks in advance for any illumination!
Jul 17 '05 #1
On 6 Aug 2004 13:01:04 -0700
st**********@hotmail.com (James Butler) wrote:
Running a CLI script that fopen's a file, parses the lines into an
array, checks the array entries against a few regular expression
qualifiers (i.e. !eregi("bot",$entry)) and dumps the good entries into
a MySQL db.

The fopen'ed file is a 410MB text file with log entries formatted like
a .CSV file ("" text qualifiers and comma separators).

Running the script from a web browser in Windows98: Works fine, except
it stops about 1/10th of the way through (@~40MB).


Try setting the max execution time a little higher in the script; I could
imagine that is what's stopping it. (I think the default is 30 seconds.)
E.g.:
set_time_limit(0);
to make it run until the script is done.

[snip]

Don't know about the CLI issue though. :(

Best regards,
Madsen

--
Anders K. Madsen --- http://lillesvin.linux.dk

"There are 10 types of people in the world.
Those who understand binary - and those who don't."

Jul 17 '05 #2

"James Butler" <st**********@h otmail.com> wrote in message
news:5e******** *************** ***@posting.goo gle.com...
Running a CLI script that fopen's a file, parses the lines into an
array, checks the array entries against a few regular expression
qualifiers (i.e. !eregi("bot",$entry)) and dumps the good entries into
a MySQL db.

[snip]


Very well could be a memory issue. PHP4 is rather lousy when it comes to
memory management. In my experience, stability issues start to crop up once
the amount of data processed exceeds 100 MB. Hard to tell without seeing
some code, though.

(I'm going on the assumption that you've turned off the time limit. Since
you're doing database inserts, I don't think the code would have gotten
half as far if it were on.)
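One way to check, assuming memory_get_usage() is available on your build (on PHP4 it requires compiling with --enable-memory-limit): report memory consumption every so often and watch whether it climbs steadily as lines accumulate. An illustrative stand-alone loop:

```php
<?php
// Illustrative only: watch memory climb as lines accumulate in an
// array, the way the whole-file approach does. Each element is a
// stand-in for one ~100-byte log line.
$entries = array();
for ($i = 1; $i <= 50000; $i++) {
    $entries[] = str_repeat('x', 100);
    if ($i % 10000 === 0) {
        fwrite(STDERR, "line $i: " . memory_get_usage() . " bytes\n");
    }
}
```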
Jul 17 '05 #3
"James Butler" wrote:
Running a CLI script that fopen's a file, parses the lines into an
array, checks the array entries against a few regular expression
qualifiers (i.e. !eregi("bot",$entry)) and dumps the good entries into
a MySQL db.

[snip]


I believe you can use fread and read the file a chunk at a time
(discarding each chunk once you have processed it).

"Watch out when you use readfile to read big files! Reading a file
of 6 meg will result in php using 6 megs of memory! Php might stop
your script if you cross the memory limit. You're better off using
fread when reading big files" -- from
http://ca3.php.net/manual/en/function.readfile.php
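Along those lines, a minimal sketch (filename hypothetical) that streams the file one line at a time with fgets instead of loading it all, so memory stays flat no matter how big the file is:

```php
<?php
// Stream the log line by line; only one line is in memory at a time,
// so the 410MB file never has to fit in RAM.
$fp = fopen('access_log.csv', 'r');   // hypothetical filename
if ($fp === false) {
    die("Cannot open log file\n");
}
while (($line = fgets($fp)) !== false) {
    if (preg_match('/bot/i', $line)) {
        continue;                      // drop disqualified entries
    }
    // parse the CSV fields and INSERT into MySQL here
}
fclose($fp);
```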

--
http://www.dbForumz.com/ This article was posted by author's request
Articles individually checked for conformance to usenet standards
Topic URL: http://www.dbForumz.com/PHP-Memory-l...ict137374.html
Visit Topic URL to contact author (reg. req'd). Report abuse: http://www.dbForumz.com/eform.php?p=459616
Jul 17 '05 #4
Thank you all for your thoughts. I believe you are all correct that it
is, in fact, a memory issue. I have decided to particlize the
process...using chunk files @~10MB each and processing them one at a
time, not as a batch. Yeah, that's around 40 iterations of the
process, but at least I can get the data into the database, finally.
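In case it helps anyone doing the same, the chunking itself can be done from the shell with GNU coreutils (filenames and the import script name are hypothetical):

```shell
# Split the log into chunks of at most ~10MB each. -C keeps whole
# lines together; -b would split on raw bytes and could cut a CSV
# entry in half mid-chunk.
split -C 10m access_log.csv chunk_

# Then run the import once per chunk:
for f in chunk_*; do php import.php "$f"; done
```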

I appreciate all of your help.
Jul 17 '05 #5
"James Butler" wrote:
Thank you all for your thoughts. I believe you are all correct that it
is, in fact, a memory issue. I have decided to particlize the
process...using chunk files @~10MB each and processing them one at a
time, not as a batch. Yeah, that's around 40 iterations of the
process, but at least I can get the data into the database, finally.
I appreciate all of your help.


James, did you see my comment? I don’t think you have to do all that.
Just use fread.

--
http://www.dbForumz.com/ This article was posted by author's request
Articles individually checked for conformance to usenet standards
Topic URL: http://www.dbForumz.com/PHP-Memory-l...ict137374.html
Visit Topic URL to contact author (reg. req'd). Report abuse: http://www.dbForumz.com/eform.php?p=461323
Jul 17 '05 #6

This thread has been closed and replies have been disabled. Please start a new discussion.

