
Fast computer problem?

I have access, for testing at my client's site, to a Win2000 computer
running A2003 retail. He recently upgraded all of his other machines to
dual-core Pentiums with 2 GB of RAM running the A2003 runtime. I believe all
current SPs for Windows and Office are installed on the fast machines.

I have 1 process/subroutine that has worked for a couple of years
without a problem. It works fine on the testing (slow) machine. The
process checks a folder for any INI files. After I process an
INI file I change its extension to RED, then search for any other INI
files in the folder.

Twice in the last week a situation has occurred where it reads the INI
file and processes the data. It's then supposed to rename the file to RED
and then find the next INI file; if none is found it exits the loop. But
in one case 1500+ duplicate records were created from 1 INI file instead
of the 1 record expected, and yesterday 775+ duplicate records were
created. It's as if it processed the INI file, did the rename, but found
the same INI file again before the rename completed, and processed it
over and over until the rename finally took and the loop exited.

In another process, used since 2005 without incident, we have a printer
driver that converts files to PDFs. When the routine is called, I create
a DLL object via CreateObject, call a couple of DLL commands, then swap
the default printer to the PDF converter, print the report, and reset
the printer.

However, on the fast computers, the PDF file was not being created; it
looked like the entire process was being bypassed. I saw I was trapping
for a specific error number, removed the trap so all errors would
display, and saw that calling the DLL code generated errors. I fixed this
problem by switching to Stephen Lebans's PDF converter, an excellent
replacement since it works without a problem.

I can run both processes all day long from the slow computer. But on
the fast computers the problems occur.

I remember that when fast computers came out, FoxPro required a patch to
slow the computer down so it could complete command instructions.

Do you have any ideas as to why a fast computer might have a problem
where slow computers don't? Do you think I should find any place in my
code that has a FileCopy or Name (DOS rename), or any other DOS-like
command, and insert a DoEvents after it?
Sep 27 '08 #1
9 Replies


On Sat, 27 Sep 2008 12:26:41 -0700, Salad <oi*@vinegar.com> wrote:

DoEvents is not going to make it better.
Consider moving the files into a Processed subfolder. Another option is
to first read the filenames into an array (using the Dir function),
then process the array.
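For instance, a minimal sketch of the array approach (strFolder and the
array bound are illustrative, not from the thread):

```vba
Dim astrFiles() As String
Dim strFile As String
Dim lngCount As Long, i As Long

ReDim astrFiles(0 To 255)           'assumed upper bound
strFile = Dir(strFolder & "*.INI")  'snapshot the folder first
Do While strFile <> ""
    astrFiles(lngCount) = strFile
    lngCount = lngCount + 1
    strFile = Dir()                 'continue the same search
Loop

'Now process the snapshot; renames during processing can't
'feed files back into the loop.
For i = 0 To lngCount - 1
    '...process and rename astrFiles(i) here...
Next i
```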

-Tom.
Microsoft Access MVP

<clip>
Sep 27 '08 #2

On Sat, 27 Sep 2008 12:26:41 -0700, Salad <oi*@vinegar.com> wrote:
And another thing: it seems you have a database design flaw if it is
possible to create unwanted duplicates. Apply a unique index.
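For instance (the table and field names here are illustrative, not from
the thread):

```vba
'Create a unique index so duplicate inserts fail outright
'instead of piling up. tblRecords/HyperlinkFile are assumed names.
CurrentDb.Execute "CREATE UNIQUE INDEX idxNoDupes " & _
    "ON tblRecords (HyperlinkFile)", dbFailOnError
```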

-Tom.

<clip>
Sep 27 '08 #3

Tom van Stiphout wrote:
On Sat, 27 Sep 2008 12:26:41 -0700, Salad <oi*@vinegar.com> wrote:

DoEvents is not going to make it better.
OK.
Consider moving the files into a Processed subfolder. Another option is
to first read the filenames into an array (using the Dir function),
then process the array.
I'll think on this.

We may use an option my client found today at MS support for
A97: disable the dual-CPU processing and restrict the computer to 1
CPU. Kind of stupid...buy a fast computer only to slow it down, but
sometimes one's gotta do what one's gotta do.
http://support.microsoft.com/kb/178650

>
-Tom.
Microsoft Access MVP
<clip>
Sep 28 '08 #4

Tom van Stiphout wrote:
On Sat, 27 Sep 2008 12:26:41 -0700, Salad <oi*@vinegar.com> wrote:
And another thing: it seems you have a database design flaw if it is
possible to create unwanted duplicates. Apply a unique index.
Maybe. I'll see if I can index hyperlinks. Or at least see if I can
FindFirst a hyperlink.

It seems weird to consider modifying a database table, perhaps splitting
a hyperlink into its parts and storing them in separate fields, for
something that's been working for years on slower computers.
>
-Tom.

<clip>
Sep 28 '08 #5

Tom van Stiphout <to*************@cox.net> wrote:
>DoEvents is not going to make it better.
It just might. "Yields execution so that the operating system can process other
events."

Tony
--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
Tony's Microsoft Access Blog - http://msmvps.com/blogs/access/
Sep 30 '08 #6

On Sep 29, 11:43 pm, "Tony Toews [MVP]" <tto...@telusplanet.net>
wrote:
Tom van Stiphout <tom7744.no.s...@cox.net> wrote:
>DoEvents is not going to make it better.

It just might. "Yields execution so that the operating system can process other
events."
<clip>
It seems like DoEvents wouldn't do anything here, because what Salad is
doing (renaming files) wouldn't raise any 'events' to be processed. I
think the safest thing would be to do what Tom describes, i.e. load
the file names into an array or collection first and use a For...loop
on that rather than a Do...loop with a Dir() in the middle of it,
especially if the Dir() process and the renaming process are somehow
happening asynchronously.

Bruce
Sep 30 '08 #7

Salad <oi*@vinegar.com> wrote in
news:J5******************************@earthlink.com:
It seems weird to consider modifying a database table, perhaps splitting
a hyperlink into its parts and storing them in separate fields, for
something that's been working for years on slower computers.
Probably it is. I suppose you could show us your code as an alternative.

I think I'd follow Tom's suggestion. But instead I might put in a loop that
waits until a DoesFileExist(The Red File) function returns True, with
maybe a counter or timer in it (the loop, not the function) so that it
doesn't run infinitely in case something screws up. Then I'd look for the
next INI file.
(Or maybe until DoesFileExist(The Ini File) returns False.)
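A minimal sketch of that guarded wait, assuming a DoesFileExist()
function such as the WizHook version below and an arbitrary retry limit
(both are illustrative, not tested advice):

```vba
Const MAX_TRIES As Long = 100   'arbitrary safety limit

Dim lngTries As Long
Name strFolder & strFiles As strFolder & strRename
Do While DoesFileExist(strFolder & strFiles)
    If lngTries >= MAX_TRIES Then
        MsgBox "Rename of " & strFiles & " never completed."
        Exit Do
    End If
    DoEvents            'give the OS a chance to finish the rename
    lngTries = lngTries + 1
Loop
```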

Have you thought about changing your sequence to Read (to a string),
Rename, Process or even Rename, Read, Process?

How are you looking for the ini files? Dir$("*.ini") or Dir("*.ini") won't
return a file a second time with subsequent calls to Dir$() or Dir().

Just in Case:

Public Function DoesFileExist2000(ByVal FilePath$) As Boolean
    With WizHook
        .Key = 51488399
        DoesFileExist2000 = .FileExists(FilePath)
    End With
End Function

BTW, I thought INI files went out with the last ice age? The Registry is
very easy to write to and read from.

Disable one of the dual cores? Surely NOT?! Better to write everything in
long hand on recipe cards ...

--
lyle fairfield
Oct 1 '08 #8

P: n/a
lyle fairfield wrote:
Salad <oi*@vinegar.com> wrote in
news:J5******************************@earthlink.com:

>>It seems weird to consider modifying a database table, perhaps splitting
a hyperlink into its parts and storing them in separate fields, for
something that's been working for years on slower computers.


Probably it is. I suppose you could show us your code as an alternative.
Here's my original code. I can look at this all day and not see
anything that would make this code fail.
...initialize/Dim variables here, then

If strFolder <> "" Then
    strSearch = strFolder & "*.INI"
    strFiles = Dir(strSearch)
    Do While strFiles <> ""
        strRename = Left(strFiles, Len(strFiles) - 3) & "red"

        'See if it originated in-house. If so, delete the file
        'and return False. If external source, return True.
        blnAppend = AppendIt(strFolder & strFiles)

        If blnAppend Then

            'Process the INI file and append a rec into the table. If
            'a rec is added, return True. If there's an error, return False.
            AppendINIData strFolder & strFiles

            'blnRename is a global variable. The value is set each time
            'a file is processed in AppendINIData. It defaults to True.
            'If an error occurs, it's set to False.
            If blnRename Then
                Name strFolder & strFiles As strFolder & strRename
            Else
                Exit Do
            End If

        End If
        strFiles = Dir(strSearch)
    Loop
    ReadIni = True
End If
>
<clip>

Have you thought about changing your sequence to Read (to a string),
Rename, Process or even Rename, Read, Process?
If there's an error I'd like to know about it and keep the file as it
is. If it can't be processed correctly it would inform the user of the
error and file name.
>
How are you looking for the ini files? Dir$("*.ini") or Dir("*.ini") won't
return a file a second time with subsequent calls to Dir$() or Dir().
As you can see above, I don't use Dir() by itself.
<clip>

BTW, I thought ini files went out with the last ice age? The Registry is
very easy to write to and to read.
The files are created by an external program. It works fine for me
though: each row is a "field name" with its value.
Disable one of the dual cores? Surely NOT?! Better to write everything in
long hand on recipe cards ...
I hate the concept of doing so. I modified my code. Before I call the
AppendIt() function I now check whether the hyperlink filename value
(contained in the INI file) already exists in the table. I do this by
splitting the hyperlink address from the text in the table. If it does
exist, I attempt to rename the INI file. If it doesn't exist, it's a
new record and I process it. Then I add the INI filename to a variable
containing the list of files I've processed in this loop. I then do a
Dir(folder & "*.INI") again and search for the next INI file. If one
exists, I check whether it's in the list of processed files; if it is, I
exit the loop, otherwise I start the processing step again. I'm hoping
these added steps will stop the problem.
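A rough sketch of that processed-list guard (the variable names are
illustrative, not Salad's actual code):

```vba
'Track files already handled this run; stop if Dir returns one again.
Dim strDone As String               'delimited list of processed files
strFiles = Dir(strFolder & "*.INI")
Do While strFiles <> ""
    'Seen this file already in this run? Then the rename is lagging; stop.
    If InStr(1, strDone, ";" & strFiles & ";") > 0 Then Exit Do
    '...check for an existing record, process, attempt the rename...
    strDone = strDone & ";" & strFiles & ";"
    strFiles = Dir(strFolder & "*.INI")
Loop
```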
Oct 1 '08 #9

Salad <oi*@vinegar.com> wrote in
news:Hp******************************@earthlink.com:
>Here's my original code.
<clip>
I suspect that the problem could be solved by changing the second

strFiles = Dir(strSearch)

to

strFiles = Dir()

(This is the standard way of using Dir and is related to Tom's point.)

And the renaming inside the loop could be replaced, after the loop, with

Name strSearch As Replace(strSearch, ".INI", ".red")

or maybe

Kill Replace(strSearch, ".INI", ".red")
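The Dir continuation pattern described above looks like this (a sketch
using the variable names from the code earlier in the thread):

```vba
'Prime Dir with the pattern once; calling Dir() with no argument
'continues the same search instead of restarting it each pass, so
'each file is returned at most once even if renames lag behind.
strFiles = Dir(strFolder & "*.INI")
Do While strFiles <> ""
    '...read and process strFolder & strFiles here...
    strFiles = Dir()    'next match in the original search
Loop
```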
--
lyle fairfield
Oct 1 '08 #10

This discussion thread is closed

Replies have been disabled for this discussion.