Bytes | Software Development & Data Engineering Community
How to speed up a code loop with INSERT INTO query?

Needing to import and parse data from a large PDF file into an Access
2002 table: I start by converting the PDF file to an HTML file. Then
I read this HTML text file, line by line, into a table using a code
loop and an INSERT INTO query. About 800,000 records of raw text.
Later, I can loop through and parse these 800,000 strings into
usable data with more code.

The problem I have is that the conversion of the text file, using a
code loop and an INSERT INTO query, runs at about 30 records per
second: nearly 8 hours total. I welcome suggestions as to how to
speed this up (either code-based and/or hardware-based).

'=====Loop import text file into table code=====
' dumps.html is a 14,000 KB text file, with 800,000 lines.
' Click() and TaDa() make sounds so I can hear the progress of the loop.
' The table EWBsRaw is local and has two fields, ID (an AutoNumber)
' and Field1 (a 255-character text field).
'===
Public Sub ImportPDFtable()
    Dim F As Integer, TheLine As Long, txtLine As String
    TheLine = 0
    F = FreeFile
    Open "G:\BOD\DEWRs\dumps.html" For Input As #F
    Do Until EOF(F)
        Line Input #F, txtLine
        TheLine = TheLine + 1
        DoCmd.RunSQL "INSERT INTO EWBsRaw ( Field1 ) SELECT '" & _
            Replace(Left(txtLine, 254), "'", "") & "' AS Expr1"
        Call Click
    Loop
    Close #F
    Debug.Print TheLine
    Call TaDa
End Sub
'=====end code=======

Aug 2 '07 #1
Why not just import the HTML file into a table using the built-in import
features? To do it manually, you would go to File - Get External Data -
Import, and select the HTML file. To do it in code, you would use
DoCmd.TransferText. On the import, your lines would automatically
truncate to 255 characters if that were the size of the field you were
importing to. You could do a global replace of the ' either before or
after the import.
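For reference, a minimal sketch of the in-code route. It assumes the file has been renamed to dumps.txt and that a saved import specification named "RawLineSpec" exists which maps each whole line into Field1 (both the spec name and the renamed file are made up for illustration):

```vba
' Sketch only: one TransferText call replaces the whole loop.
' "RawLineSpec" is a hypothetical saved import specification that
' puts each line of the file into Field1 of EWBsRaw.
Public Sub ImportViaTransferText()
    DoCmd.TransferText acImportDelim, "RawLineSpec", "EWBsRaw", _
        "G:\BOD\DEWRs\dumps.txt", False
End Sub
```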

As far as your code goes, I would expect that the delay has a lot
to do with the Replace(Left(...)) functions and the sound routine, but I
imagine you tested it without the Click.

hope this helps
-John

SaltyBoat wrote:
<original post snipped>
Aug 2 '07 #2

"SaltyBoat" <ha*****@gmail.com> wrote in message
news:11**********************@e9g2000prf.googlegroups.com...
<original post snipped>
If a query can't do it as John Welch posted, why not open EWBsRaw as a
recordset to insert the records? RunSQL needs to open and close the table
for each new record, and Open/Close can be very expensive operations. Also,
what is "Call Click", and how long does it take to run?

Air Code

Dim rsEWB As Recordset

Set rsEWB = CurrentDb.OpenRecordset("EWBsRaw")
...
Do Until ...
    ...
    rsEWB.AddNew
    rsEWB![Field1] = Replace(Left(txtLine, 254), "'", "")
    rsEWB.Update
    ...
Loop
...
rsEWB.Close
Set rsEWB = Nothing
Aug 2 '07 #3
Open a DAO recordset in append mode and use AddNew .. Update.

SaltyBoat wrote:
<original post snipped>
Aug 2 '07 #4
I'd link the text data (not import it), then loop/parse through it, then
insert the usable data once it's clean. This way the bulk raw data
stays out of the database, and the information is saved already
trimmed and clean.

On Aug 2, 12:59 pm, SaltyBoat <hall...@gmail.com> wrote:
<original post snipped>

Aug 2 '07 #5
Thanks everyone for the help and advice. I tried the link-to-HTML
and DoCmd.TransferText suggestions and found that they seem to
require the data embedded in the HTML to use an HTML table
format, which my data does not have. I then tried the advice of using
a recordset in code, versus DoCmd.RunSQL "...INSERT INTO...", and
using a recordset helps quite a bit: the speed of my record
processing more than tripled. Very good improvement! A couple of
people speculated about the delay of the Call Click sound routine;
upon testing I found it to have no perceptible hit on
performance. Again, thanks for the help, this Usenet group is great!
Here is the working, debugged 'real code' that I ended up using, which
is only slightly modified from paii, Ron's 'air code'. Thanks, Ron.

I want to speed this up even more.
I am curious whether faster hardware can help.
Is the likely bottleneck CPU processing time?
Or the speed of writes to disk?

===========begin code========
Public Sub importPDFrecordset()
    Dim rsEWB As DAO.Recordset
    Dim F As Integer, txtLine As String
    Set rsEWB = CurrentDb.OpenRecordset("EWBsRaw")
    F = FreeFile
    Open "G:\BOD\DEWRs\dumps.html" For Input As #F
    Do Until EOF(F)
        Line Input #F, txtLine
        rsEWB.AddNew
        rsEWB![Field1] = Replace(Left(txtLine, 254), "'", "")
        rsEWB.Update
    Loop
    Close #F
    rsEWB.Close
    Set rsEWB = Nothing
End Sub
============end code=========

On Aug 2, 10:30 am, "paii, Ron" <n...@no.com> wrote:
<previous post snipped>

Aug 2 '07 #6
I was thinking that you would use TransferText and just treat the file
as a text file rather than an HTML file, and have it import each
line into a single field.
If you want to do it with the recordset method and speed it up, you
might consider replacing all the ' characters with a text editor
beforehand, and then just saying rsEWB![Field1] = txtLine

SaltyBoat wrote:
<previous post snipped>

Aug 3 '07 #7
On Aug 2, 5:05 pm, SaltyBoat <hall...@gmail.com> wrote:
<previous post snipped>
The bang may be slow. You could probably speed it up a little by
declaring the field, e.g.

Dim TheField As DAO.Field

After opening the recordset:

Set TheField = rsEWB![Field1]

And later:

TheField = Replace(Left(txtLine, 254), "'", "")
or
TheField.Value = Replace(Left(txtLine, 254), "'", "")
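Pulling those fragments together, the import loop with the cached Field object might look like this. This is a sketch under the same assumptions as the earlier code (file path, table, and field names from the original post), untested:

```vba
Public Sub ImportWithCachedField()
    Dim rsEWB As DAO.Recordset
    Dim TheField As DAO.Field
    Dim F As Integer, txtLine As String
    Set rsEWB = CurrentDb.OpenRecordset("EWBsRaw")
    ' cache the Field reference once, instead of resolving
    ' the bang expression on every pass through the loop
    Set TheField = rsEWB![Field1]
    F = FreeFile
    Open "G:\BOD\DEWRs\dumps.html" For Input As #F
    Do Until EOF(F)
        Line Input #F, txtLine
        rsEWB.AddNew
        TheField.Value = Replace(Left(txtLine, 254), "'", "")
        rsEWB.Update
    Loop
    Close #F
    rsEWB.Close
    Set rsEWB = Nothing
End Sub
```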

And you might reduce time a little more by using a Regular Expression
to do the replace. This example replaces HTML tags:

Public Function RemoveTags(ByVal InputString$) As String
    ' requires that Visual Basic Script be installed
    ' VBS is installed by default with Windows
    ' it disappears only with direct action by an administrator
    Dim re As Object
    Set re = CreateObject("VBScript.RegExp")
    InputString = Trim(InputString)
    With re
        .Global = True
        .IgnoreCase = True
        .Pattern = "\<(.|\n)*?\>"
        RemoveTags = .Replace(InputString, "")
    End With
    Set re = Nothing
End Function

I think the pattern for replacing a single quote would be "\'"

Of course, you don't want to keep instantiating and releasing that RE
object for every text line, so you'd probably want to give it module
scope, or load the whole file into a string, do the replace on the
whole string, save the string to a file again (maybe a new file),
and then import the individual lines. That might be more efficient
as well.
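A sketch of that whole-file approach, assuming the file fits comfortably in memory (the output filename is made up):

```vba
Public Sub StripQuotesWholeFile()
    Dim F As Integer, s As String
    ' read the entire file into one string
    F = FreeFile
    Open "G:\BOD\DEWRs\dumps.html" For Binary Access Read As #F
    s = Space$(LOF(F))
    Get #F, , s
    Close #F
    ' one Replace over the whole string instead of 800,000 small ones
    s = Replace(s, "'", "")
    ' write the cleaned text to a new file for import
    F = FreeFile
    Open "G:\BOD\DEWRs\dumps_clean.txt" For Output As #F
    Print #F, s
    Close #F
End Sub
```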

Of course faster hardware will help. I have a 64-bit machine, and a
laptop. But I develop on an old hand-me-down clunker, an E-Machines
1742. I figure if the app is fast enough on it, it'll be fast enough
on anything. When I want something to whir I go over to the 64-bit
machine. Sometimes I sit and wait for a long time before I realize
that it completed the task so fast I didn't notice that it started.

Aug 3 '07 #8
Thanks John Welch. I just did a menu command 'File | Get External Data |
Import... | Text Files' on the HTML file renamed as a .txt file. (I am
guessing that this menu wizard uses the same underlying TransferText
method.) The total import of 907,000 lines of text took about three
minutes. This is a fantastic improvement over the eight hours it used
to take using code and a DoCmd.RunSQL INSERT INTO statement. <grin>

Also, prior to import I used a text editor to strip out the
troublesome single-quote characters, taking about five seconds.

On Aug 2, 8:32 pm, John Welch <so...@nospam.com> wrote:
<previous post snipped>

Aug 3 '07 #9

This thread has been closed and replies have been disabled. Please start a new discussion.
