Bytes IT Community

Req: Python command to append data to file?

P: 17
Hi

This is my first attempt at writing Python script - it's probably a bit ambitious, but there again, whatever doesn't kill you makes you stronger... ;-)

I'm trying to write a script for a datalogger, which will import data from a laboratory flow meter at regular user-defined intervals via a serial port connection, then save it as a CSV file which can then be opened in Excel for analysis.

The imported data is in the form of an 18-byte long comma delimited ASCII character string which I've helpfully called "data".
e.g.: 12.345,100.23,22.5

Each comma delimited value will represent a column, and each string of data will represent a separate row in the Excel spreadsheet.

So far (with some help from the Bytes community!) I've managed to import a string of data, print it to screen, and save it as a csv file using the code listed below. My "Print to screen" and "Save to file" routines are embedded in a "While" loop, which loops round repeatedly until the required (user defined) number of data strings have been collected.

I can see all the results printed to the screen, but using my "Save to file" routine in this way means that the data file is constantly being over-written, and the only data that ends up saved to the file is the last string to be collected.

I was wondering if there is a Python command along the lines of "append to file", that I could use which would allow me to write successive strings of data to the .csv file without over-writing the previous entries.

I guess the alternative is to use Python to keep adding each new data string to the end of the previous ones,
separated by a suitable "new row" character, to make one long "master string" for subsequent import into Excel. However, the "master string" may end up rather long - I'm planning to collect data at a rate of one new 18-byte string every 10 seconds for a couple of hours! Also, if the data connection or power supply were interrupted at any stage during data acquisition, all the data collected might be lost if this method were used...
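
Roughly what I have in mind is something like this (just a sketch with made-up sample readings, not actual code I've written yet):

# Rough sketch of the "master string" idea - each new reading gets tacked on
# to the end of one long string, separated by a newline, then written once.
master = ""
master = master + "12.345,100.23,22.5" + "\n"   # first reading (sample values)
master = master + "12.480,100.19,22.6" + "\n"   # second reading (sample values)
fob = open('c:/test/a.csv', 'w')                # single write at the end
fob.write(master)
fob.close()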

I'm sure this must be a fairly routine problem, but I'm not sure of the best solution, and would be very grateful for your help & advice.

Many thanks in anticipation

Dave


#Write received data to file
data = data + "\n"
print "Writing data to file 'c:/test/a.csv'"
fob = open('c:/test/a.csv', 'w')
fob.write(data)
fob.close()
Sep 15 '10 #1
6 Replies


bvdet
Expert Mod 2.5K+
P: 2,851
Minor adjustment. Open the file in "append" mode.
f = open("file_name", 'a')
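For your routine, that would look something like this (just a sketch, reusing the variable and path names from your snippet):

# Sketch only - 'a' (append) mode adds each new row to the end of the file
# instead of overwriting it; the sample string stands in for your real data
data = "12.345,100.23,22.5"
fob = open('c:/test/a.csv', 'a')
fob.write(data + "\n")
fob.close()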
Sep 15 '10 #2

P: 17
Many thanks for the rapid reply.

I got the whole thing working last night using the quick & dirty "master string" technique described above, and it seems to work fine - I can now send and receive data from the flow meter at user-defined intervals, and save the result as a .csv which opens in Excel :-)

A few quickie related questions:

1) How long can a single string be before it causes problems? Is it better practice to use the append-to-file method instead?

2) When I am reading from or writing to a file or the COM1 port, do I need to enclose every single read or write command within open and close file/port commands, or can I get away with doing this less frequently, e.g. just at the start and end of the main code?

3) Do I need to execute a flush command periodically when writing to file or COM ports?

4) How do I get Python to create a brand new file in a Windows directory using a string variable as the file name?

Sorry for the dumb questions - if there is a good tutorial online or in print that you could recommend which covers these issues at a basic level, I'd be very grateful.

Thanks again for your help

Dave
Sep 16 '10 #3

bvdet
Expert Mod 2.5K+
P: 2,851
1. The length of the string is only limited by available memory. String concatenation (string addition) is expensive. I would accumulate the strings in a list and write the data to a file once. Example:
output = []
output.append("string1")
output.append("string2")
fileObj = open("file_name", 'w')   # open once, when the data is ready to write
fileObj.write("\n".join(output))
fileObj.close()
2. If you accumulate the strings in a list, I would open and close the file each time. When a file is closed, the output buffers are flushed, assuring the data is written.

3. No. See #2.

4.
var = "file_name"
path = "C:\\dirname\\"
fileObj = open("%s%s.dat" % (path, var), 'w')
If portability is an issue, it's better to use os.path.join() to concatenate the path and file name.
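For example, something like this (a small sketch; the directory and base name are just placeholders):

import os

var = "file_name"                          # file name held in a string variable
dirname = "C:\\dirname"                    # placeholder directory
fileObj = open(os.path.join(dirname, var + ".dat"), 'w')   # creates the file
fileObj.close()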
Sep 16 '10 #4

bvdet
Expert Mod 2.5K+
P: 2,851
I reread your earlier post where you were concerned about losing your data connection. You could wrap the data acquisition in a try/except block so that, if the connection drops, the script writes whatever it has collected to disk instead of simply failing. If you are concerned about power interruption, write the data to disk each time it is collected. I doubt that opening a file, writing a small amount of data, and closing the file every 10 seconds would be noticeable.
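
Something along these lines, roughly (read_from_flow_meter() and number_of_readings are just stand-ins for your serial-port read and your user-defined reading count):

def read_from_flow_meter():                  # stand-in for the real serial read
    return "12.345,100.23,22.5"

number_of_readings = 5                       # user-defined in the real script
try:
    for i in range(number_of_readings):
        data = read_from_flow_meter()
        fob = open('c:/test/a.csv', 'a')     # append mode: earlier rows are kept
        fob.write(data + "\n")
        fob.close()                          # closing flushes the row to disk
except Exception:
    print "Connection lost - everything collected so far is already on disk"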
Sep 16 '10 #5

P: 17
Thanks again bvdet!
Really appreciate you taking the time to help out with my queries.

Have a great weekend!
Best wishes

Dave
Sep 17 '10 #6

bvdet
Expert Mod 2.5K+
P: 2,851
My pleasure. I'm glad to help.

BV
Sep 18 '10 #7
