Perl performance?

I'm a perl newbie, with next to no programming experience (I did a bunch
of Fortran 25 years ago, but nothing since).

I have a problem I need to solve, and I'm wondering whether perl is the
best tool. I need to log a fairly fast data stream to a file, after
adding a time stamp to the end of each line. The data is ASCII text, and
will be coming into a serial port on a laptop. The data stream is at
115,200 baud, with 64 lines per second, each line being 40 to 45
characters long. I'm not sure yet what format the line endings are.

I've successfully logged the data using a Windows terminal program, but I
really need to add a time stamp to each line, so I can sync the data up
with another data stream when I do the post processing. I think I could
use the Win32::SerialPort module to get the data into perl, have it parse
the data looking for line ends, add a time stamp to the end of each line
and log the line to a file.

Win32::SerialPort info:

http://members.aol.com/Bbirthisel/SerialPort.html
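
Something along these lines is roughly what I'm picturing (untested;
COM1, the 8-N-1 settings, the tab-separated log layout and the CR/LF
line endings are all guesses on my part, since I don't know the real
details yet):

#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;
use Win32::SerialPort;

my $port = Win32::SerialPort->new('COM1')
    or die "Can't open COM1\n";
$port->baudrate(115200);
$port->databits(8);
$port->parity('none');
$port->stopbits(1);
$port->handshake('none');
$port->write_settings or die "Bad port settings\n";
$port->read_const_time(100);    # read() returns after 100 ms even if idle

open my $log, '>>', 'datalog.txt' or die "Can't open datalog.txt: $!\n";
$log->autoflush(1);             # don't sit on buffered lines if we die

my $buffer = '';
while (1) {
    my ($count, $chunk) = $port->read(255);
    next unless $count;
    $buffer .= $chunk;
    # Peel complete lines off the front of the buffer, stamp and log them.
    while ($buffer =~ s/^(.*?)\r?\n//) {
        print {$log} "$1\t" . time() . "\n";
    }
}

The time() stamp is whole seconds, which is all I was planning on anyway,
and read_const_time() is just there so the loop doesn't block forever if
the box goes quiet.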

I realize that the time stamps will only have a resolution of one second,
but I figure that when I do the post processing I could look for the
records where the time stamp changed seconds, and then count records to
infer the time for each one. I don't need super high accuracy. Plus or
minus a half second will be more than good enough.
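
And for the post processing, the record-counting idea would look roughly
like this (again untested; it assumes the log layout from the sketch
above, and it simply spreads each second's records evenly across that
second):

#!/usr/bin/perl
use strict;
use warnings;

# Group the logged lines by their whole-second stamp (the last
# tab-separated field), then give each record an evenly spaced
# fraction of its second.
my %by_second;
open my $in, '<', 'datalog.txt' or die "Can't read datalog.txt: $!\n";
while (<$in>) {
    chomp;
    my ($line, $stamp) = /^(.*)\t(\d+)$/ or next;
    push @{ $by_second{$stamp} }, $line;
}
close $in;

for my $stamp (sort { $a <=> $b } keys %by_second) {
    my @records = @{ $by_second{$stamp} };
    my $n       = scalar @records;
    for my $i (0 .. $#records) {
        printf "%s\t%.3f\n", $records[$i], $stamp + $i / $n;
    }
}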

The only laptop I currently have available that has a serial port is a
Dell PIII 500 running Win 98. I can't put Linux on it, as it belongs to
my wife, and she needs Win 98 for some business applications. Is it
likely that a perl program would have enough performance to deal with data
at 115,200 baud, with 64 records per second?

I'm tempted to try perl, as it seems general-purpose enough that I could
use my new perl skills for all kinds of other things. But if perl likely
won't be up to the task, I'll look into using a compiled language.

Thanks for your advice,

--
Kevin Horton
Ottawa, Canada
e-mail: khorton02(_at_)rogers(_dot_)com
Mac OS X 10.2.6, Dual G4 1.42
Jul 19 '05 #1

If it's coming in on a serial port, the stream cannot be that fast by
modern standards. Any language will do fine; perl is probably easier to
start with than, say, C. You can get sub-second timestamps from the
Time::HiRes module (see CPAN), which wraps gettimeofday() portably, so
you don't need to call it via syscall yourself (perl's syscall isn't
implemented on Windows anyway).
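
For scale: 64 lines of 40 to 45 characters is under 3,000 bytes per
second, while 115,200 baud at 8-N-1 tops out around 11,520 bytes per
second, so even a PIII 500 has plenty of headroom. A minimal Time::HiRes
example, assuming the module is installed (it is bundled with perl 5.8
and is on CPAN otherwise):

use strict;
use warnings;
use Time::HiRes qw(gettimeofday);

# gettimeofday() returns (seconds, microseconds) since the epoch;
# Time::HiRes::time() gives the same value as one floating-point number.
my ($sec, $usec) = gettimeofday();
printf "stamp: %d.%06d\n", $sec, $usec;

The resolution you actually get on Win 98 will be coarser than the
microseconds field suggests, but it should still be far better than one
second.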

Jul 19 '05 #2
On Sun, 21 Sep 2003 04:51:41 -0400, Isaac Mushinsky wrote:
If it's coming in on a serial port, the stream cannot be that fast by
modern standards. Any language will do fine; perl is probably easier to
start with than, say, C. You can get sub-second timestamps from the
Time::HiRes module (see CPAN), which wraps gettimeofday() portably, so
you don't need to call it via syscall yourself (perl's syscall isn't
implemented on Windows anyway).


Yeah, 115,200 is way faster than needed, but that is what this box puts
out, so I am stuck with it unless the vendor will heed my requests that he
ratchet the speed down a bit. I have successfully captured coherent data
using HyperTerminal PE on this laptop, so I know the serial port is up to
the task.

Thanks for the pointer to the Time::HiRes module. I'll install perl on
the laptop and start playing around with it. I suspect I'll be back for
more help once I get into this.

--
Kevin Horton
Ottawa, Canada
e-mail: khorton02(_at_)rogers(_dot_)com
http://go.phpwebhosting.com/~khorton/rv8/

Jul 19 '05 #3

