Bytes | Software Development & Data Engineering Community

How to speed up a code loop with INSERT INTO query?

Needing to import and parse data from a large PDF file into an Access
2002 table: I start by converting the PDF file to an HTML file. Then
I read this HTML text file, line by line, into a table using a code
loop and an INSERT INTO query, giving about 800,000 records of raw text.
Later, I can loop through and parse these 800,000 strings into
usable data with more code.

The problem I have is that this conversion of the text file, using a
code loop and an INSERT INTO query, runs at about 30 records per
second, nearly 8 hours total. I welcome suggestions on how to speed
this up (either code-based and/or hardware-based).

'=====Loop import text file into table code=====
' dumps.html is a 14,000 KB text file, with 800,000 lines.
' Click() and TaDa() make sounds so I can hear the progress of the loop.
' The table EWBsRaw is local and has two fields: ID (an AutoNumber)
' and Field1 (a 255-character text field).
'===
Public Sub ImportPDFtable()
    Dim TheLine As Long
    Dim F As Integer
    Dim txtLine As String

    TheLine = 0
    F = FreeFile
    Open "G:\BOD\DEWRs\dumps.html" For Input As #F
    Do Until EOF(F)
        Line Input #F, txtLine
        TheLine = TheLine + 1
        ' Strip single quotes so they don't break the SQL string literal
        DoCmd.RunSQL "INSERT INTO EWBsRaw ( Field1 ) SELECT '" & _
            Replace(Left(txtLine, 254), "'", "") & "' AS Expr1"
        Call Click
    Loop
    Close #F
    Debug.Print TheLine
    Call TaDa
End Sub
'=====end code=======

Aug 2 '07 #1
Why not just import the HTML file into a table using the built-in import
features? To do it manually, you would go to File | Get External Data |
Import and select the HTML file. To do it in code, you would use
DoCmd.TransferText. On the import, your lines would automatically
truncate to 255 characters if that were the size of the field you were
importing into. You could do a global replace of the ' characters either
before or after the import.

As far as your code goes, I would expect that the delay has a lot
to do with the Replace(Left(...)) calls and the sound routine, but I
imagine you tested it without the Click.

hope this helps
-John
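The DoCmd.TransferText route might look something like the sketch below; it assumes the HTML dump has first been renamed or copied to a .txt file, and "EWBsRawSpec" is a hypothetical import specification saved beforehand via the import wizard's Advanced... button (the table and path come from the original post):

```vba
' Rough sketch, not tested against the poster's actual file.
Public Sub ImportViaTransferText()
    ' acImportDelim treats the file as plain delimited text; with a saved
    ' specification defining a single text column, each line of the file
    ' lands in one field of EWBsRaw.
    DoCmd.TransferText TransferType:=acImportDelim, _
                       SpecificationName:="EWBsRawSpec", _
                       TableName:="EWBsRaw", _
                       FileName:="G:\BOD\DEWRs\dumps.txt", _
                       HasFieldNames:=False
End Sub
```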

Aug 2 '07 #2

"SaltyBoat" <ha*****@gmail.com> wrote in message
news:11**********************@e9g2000prf.googlegroups.com...
If a query can't do it as John Welch posted, why not open EWBsRaw as a
recordset to insert the records? RunSQL needs to open and close the table
for each new record, and Open/Close can be very expensive operations. Also,
what is "Call Click", and how long does it take to run?

Air Code

Dim rsEWB As DAO.Recordset

Set rsEWB = CurrentDb.OpenRecordset("EWBsRaw")
....
Do Until ...
    ....
    rsEWB.AddNew
    rsEWB![Field1] = Replace(Left(txtLine, 254), "'", "")
    rsEWB.Update
    ....
Loop
....
rsEWB.Close
Set rsEWB = Nothing
Aug 2 '07 #3
open a DAO recordset in Append mode and use AddNew .. Update

Aug 2 '07 #4
I'd link the text data (not import it), then loop through and parse it,
then insert the usable data once it's clean. This way the bulk raw data
stays out of the database, and the information is saved already
trimmed and clean.


Aug 2 '07 #5
Thanks everyone for the help and advice. I tried the link-to-HTML
and the DoCmd.TransferText suggestions and found that they seem to
require the data embedded in the HTML to use an HTML table
format, which my data does not have. I then tried the advice of using
a recordset in code, versus DoCmd.RunSQL "...INSERT INTO...", and
using a recordset helps quite a bit: the speed of my record
processing more than tripled. Very good improvement! A couple of
people speculated about the delay of the Call Click sound routine;
upon testing, I found it to have no perceptible hit on
performance. Again, thanks for the help, this Usenet group is great!
Here is the working, debugged 'real code' that I ended up using, which
is only slightly modified from paii, Ron's 'air code'. Thanks, Ron.

I want to speed this up even more.
I am curious whether faster hardware can help.
Is the likely bottleneck the CPU processing time?
Or the speed of writes to disk?

===========begin code========
Public Sub importPDFrecordset()
    Dim rsEWB As DAO.Recordset
    Dim F As Integer
    Dim txtLine As String

    Set rsEWB = CurrentDb.OpenRecordset("EWBsRaw")
    F = FreeFile
    Open "G:\BOD\DEWRs\dumps.html" For Input As #F
    Do Until EOF(F)
        Line Input #F, txtLine
        rsEWB.AddNew
        ' Truncate to fit the 255-char field and strip single quotes
        rsEWB![Field1] = Replace(Left(txtLine, 254), "'", "")
        rsEWB.Update
    Loop
    Close #F
    rsEWB.Close
    Set rsEWB = Nothing
End Sub
============end code=========
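Since each Update in this loop is still flushed row by row, one further code-side option is to wrap the loop in DAO transactions so Jet buffers a batch of writes in memory and commits them to disk in one go. This is a hedged sketch, not from the thread, and the 5,000-row batch size is an arbitrary guess worth tuning:

```vba
Public Sub ImportPDFrecordsetBatched()
    Dim ws As DAO.Workspace
    Dim rsEWB As DAO.Recordset
    Dim F As Integer
    Dim txtLine As String
    Dim n As Long

    Set ws = DBEngine.Workspaces(0)
    Set rsEWB = CurrentDb.OpenRecordset("EWBsRaw")
    F = FreeFile
    Open "G:\BOD\DEWRs\dumps.html" For Input As #F

    ws.BeginTrans
    Do Until EOF(F)
        Line Input #F, txtLine
        rsEWB.AddNew
        rsEWB![Field1] = Replace(Left(txtLine, 254), "'", "")
        rsEWB.Update
        n = n + 1
        If n Mod 5000 = 0 Then   ' commit this batch, start the next
            ws.CommitTrans
            ws.BeginTrans
        End If
    Loop
    ws.CommitTrans               ' commit the final partial batch

    Close #F
    rsEWB.Close
    Set rsEWB = Nothing
End Sub
```

If the bottleneck is disk writes rather than CPU, batching commits like this tends to help far more than faster hardware would.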


Aug 2 '07 #6
I was thinking that you would use TransferText and just treat the file
as a text file rather than an HTML file, and have it import each
line into a single field.
If you want to do it with the recordset method and speed it up, you
might consider replacing all the ' characters with a text editor
beforehand, and then just saying rsEWB![Field1] = txtLine


Aug 3 '07 #7
The bang may be slow. You could probably speed it up a little by
declaring the field, e.g.

Dim TheField As DAO.Field

After opening the recordset:

Set TheField = rsEWB![Field1]

And later:

TheField = Replace(Left(txtLine, 254), "'", "")
or
TheField.Value = Replace(Left(txtLine, 254), "'", "")

And you might reduce the time a little more by using a Regular Expression
to do the replace. This example replaces HTML tags:

Public Function RemoveTags(ByVal InputString$) As String
' requires that Visual Basic Script be installed
' VBS is installed by default with Windows
' it disappears only with direct action by an administrator
Dim re As Object
Set re = CreateObject("VBScript.RegExp")
InputString = Trim(InputString)
With re
    .Global = True
    .IgnoreCase = True
    .Pattern = "\<(.|\n)*?\>"
    RemoveTags = .Replace(InputString, "")
End With
Set re = Nothing
End Function

I think the pattern for replacing a single quote would be "\'"

Of course, you don't want to keep instantiating and releasing that RE
object for every text line, so you'd probably want to give it modular
scope; or load the whole file into a string, do the replace on the
whole string, save the string to a file again (maybe a new file),
and then import the individual lines. That might be more efficient
as well.
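The "modular scope" idea might look something like this sketch (StripQuotes and the module variable are invented names, not from the thread): the RegExp object is created once on first use and then reused for every line.

```vba
' Module-level object so it survives between calls.
Private mRe As Object

Public Function StripQuotes(ByVal s As String) As String
    If mRe Is Nothing Then
        Set mRe = CreateObject("VBScript.RegExp")
        mRe.Global = True
        mRe.Pattern = "'"   ' a literal apostrophe; no escaping needed
    End If
    StripQuotes = mRe.Replace(s, "")
End Function
```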

Of course faster hardware will help. I have a 64-bit machine and a
laptop, but I develop on an old hand-me-down clunker, an E-Machines
1742. I figure if the app is fast enough on it, it'll be fast enough
on anything. When I want something to whir I go over to the 64-bit
machine. Sometimes I sit and wait for a long time before I realize
that it completed the task so fast I didn't notice that it started.

Aug 3 '07 #8
Thanks John Welch. I just did the menu command 'File | Get External Data |
Import... | Text Files' on the HTML file renamed as a .txt file. (I am
guessing that this menu wizard uses the same underlying TransferText
method.) The total import of 907,000 lines of text took about three
minutes. This is a fantastic improvement over the eight hours it used
to take using code and a DoCmd.RunSQL INSERT INTO statement. <grin>

Also, prior to import I used a text editor to strip out the
troublesome single-quote characters, taking about five seconds.


Aug 3 '07 #9

This thread has been closed and replies have been disabled. Please start a new discussion.

Similar topics

11
by: Markku Uttula | last post by:
I think I'm doing something wrong. I'm able to connect to Oracle just fine, execute queries and all, but I'm having serious problems with the speed :( For example, the following PHP-script on my...
17
by: Shailesh Humbad | last post by:
I just posted an article I wrote called ASP Speed Tricks. It covers techniques to optimize output of database data in HTML, for both simple tables and complex tables. More advanced ASP authors...
5
by: ArShAm | last post by:
Hi there Please help me to optimize this code for speed I added /O2 to compiler settings I added /Oe to compiler settings for accepting register type request , but it seems that is not allowed...
4
by: cameron | last post by:
I have always been under the impression that LDAP was optimized for speed. Fast queries, fast access, slower writes. I have a block of data in LDAP and in SQL. Exact same data. The query is fast...
8
by: Josué Maldonado | last post by:
Hello List, I'm importing some data from Foxpro to Postgres, there is atable wich contains aprox 4.8 million rows and it size about 830MB. I uploaded it to Postgres using dbf2pg and worked fine,...
2
by: rn5a | last post by:
In a ASP applicatiuon, the FOrm has a textbox & a select list where the admin can select multiple options. Basically the admin has to enter the name of a new coach in the textbox & select the...
1
by: staticfire | last post by:
Hi i am in need of help with an ajax related problem on my site. I've coded an active users list which shows the members and guests online. I'm sure you've all seen active user lists before, like the...
11
by: mdboldin | last post by:
I hav read on this forum that SQL coding (A) below is preferred over (B), but I find (B) is much faster (20-40% faster) (A) sqla= 'INSERT INTO DTABLE1 VALUES (%d, %d, %d, %f)' % values...
3
by: uma9 | last post by:
hi, the code below is used to insert a single record....i want to know how to insert multiple records using a "for" loop...please help Set connect = CreateObject ("ADODB.Connection")...