Bytes IT Community

ordinal position of fields in ado recordsets

I want to populate an array with values from an ado recordset
(multiple rows)

I use the absolute position of the cursor in the recordset to define
the row of my array to be populated. I have a workaround for lack of
a way to define the ordinal position of a field (incrementing a
counter variable), but it feels so primitive:

dim Fld as Field
dim rst1 as new adodb.recordset
dim varArray()
dim intfieldmarker as integer

Redim varArray(rst1.RecordCount, rst1.Fields.Count)

rst1.MoveFirst

Do While Not rst1.EOF
    intfieldMarker = 0
    For Each Fld In rst1.Fields
        varArray(rst1.AbsolutePosition, intfieldMarker) = Fld
        intfieldMarker = intfieldMarker + 1
    Next Fld
    rst1.MoveNext
Loop

If I used DAO, I could call the ordinal position of each field, and
use that to indicate the column of my array. But that method isn't
available with ADO, it seems. Does anyone know something a little
less risky than incrementing a counter?

Is it possible, for instance to store the field names in the first row
of the array, and then somehow use those fieldnames as pointers to the
column of the array? Just a thought...

Feb 14 '06 #1
24 Replies

The absolute simplest way is to use the GetRows method

Dim varArray as variant

varArray = rst1.GetRows

You then have a two-dimensional array whose first dimension is the
columns and whose second dimension is the rows (the opposite way round
to how you've defined your array).
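A quick sketch of that orientation (the subscripts below are arbitrary examples, and rst1 is assumed to be an open ADODB.Recordset):

```vba
Dim varArray As Variant
varArray = rst1.GetRows

' With GetRows, the first subscript is the field (column) and the
' second subscript is the record (row):
Debug.Print varArray(0, 0)   ' first field of the first record
Debug.Print varArray(2, 5)   ' third field of the sixth record
```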

Alternatively (to get rows by columns as you have now)

dim Fld as Field
dim rst1 as new adodb.recordset
dim varArray()
dim intfieldmarker as integer
dim lngRowCount as long

with rst1
    .MoveLast
    Redim varArray(0 to .RecordCount - 1, 0 to .Fields.Count - 1)

    .MoveFirst
    lngRowCount = -1
    Do Until .EOF
        lngRowCount = lngRowCount + 1
        For intfieldmarker = 0 To .Fields.Count - 1
            varArray(lngRowCount, intfieldmarker) = .Fields(intfieldmarker)
        Next intfieldmarker
        .MoveNext
    Loop
End With

--

Terry Kreft

Feb 14 '06 #2

"Donald Grove" <do*********@verizon.net> wrote in message
news:74********************************@4ax.com...

Why not do it in one line?
varArray=rst1.GetRows

If you really had to do something else, why would you not simply use two
counters, lngRow and lngCol, rather than referring to the
AbsolutePosition property, which is 1-based whereas your array seems to
be zero-based?

I also don't understand what you can do with DAO recordsets that you can't
with ADO. Both of these will get you the value of the first field in the
recordset assuming its name is "FirstField".
rst.Fields(0).Value
rst.Fields("FirstField").Value
Feb 14 '06 #3

Hmmm, thinking about this again, I would get the rows-by-columns version
this way:

dim rst1 as new adodb.recordset
dim varTemp as Variant
dim varArray()
dim intfieldmarker as integer
dim lngRowCount as long

varTemp = rst1.GetRows

Set rst1 = Nothing

' Then swap the array around
redim varArray(lbound(varTemp, 2) to ubound(varTemp, 2), _
               lbound(varTemp, 1) to ubound(varTemp, 1))

For intfieldmarker = lbound(varTemp, 1) to ubound(varTemp, 1)
    For lngRowCount = lbound(varTemp, 2) to ubound(varTemp, 2)
        varArray(lngRowCount, intfieldmarker) = varTemp(intfieldmarker, lngRowCount)
    Next
Next
--

Terry Kreft


Feb 14 '06 #4

On Tue, 14 Feb 2006 12:18:54 GMT, Donald Grove
<do*********@verizon.net> wrote:

And what are you going to do with that array? Iterate over it, find
items in it, sort it? All operations you can do on the recordset
itself. So keep the data in the rs, and don't create an unnecessary
copy.

-Tom.



Feb 14 '06 #5

When I want an array that emulates the recordset I generally use
GetString and Split, first on the row delimiter, and Split on the
column delimiter again on each element (row) of the array, giving me
an array of "sub" arrays, each sub array being the values of one
record.
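A minimal sketch of that approach. The delimiters are arbitrary placeholders here; they must be strings that cannot occur in the data, and rst1 is assumed to be an open ADODB.Recordset.

```vba
' Dump the whole recordset to one delimited string, then split it.
Dim strAll As String
Dim varRows As Variant
Dim varFields As Variant

strAll = rst1.GetString(adClipString, , "|", vbCrLf)
varRows = Split(strAll, vbCrLf)       ' one element per record (a trailing
                                      ' row delimiter leaves an empty
                                      ' last element)
varFields = Split(varRows(0), "|")    ' fields of the first record
```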

Of course, if one is going to do multiple scans of the data, arrays are
much quicker than recordsets (or seem to be).

In languages where arrays can be sparse (e.g. JavaScript) this can result
in amazing performance. We can number our row arrays according to the
(numerical) primary key of the recordset, assuming such exists. Then
Customers[7][10] gives us the value of the eleventh field of the
customer whose ID (PK) is 7. Is this possible in VBA? Sure, but I think
not nearly so efficiently.

Feb 14 '06 #6

On 14 Feb 2006 08:15:23 -0800, "Lyle Fairfield" <ly***********@aim.com> wrote:


When I once mentioned that I used arrays for recordset processing on the client side in web pages I
got such a supercilious reply that I've been in hiding ever since!

Feb 14 '06 #7

Life's a bitch isn't it? Got a link to the thread so we can enjoy it
again? ... or just understand more clearly what you are talking about!

Feb 14 '06 #8

On 14 Feb 2006 11:27:55 -0800, "Lyle Fairfield" <ly***********@aim.com> wrote:


1. try
http://groups.google.co.uk/group/com...5eaf96306fd87d

2. I rather over-reacted to David Fenton's reply

3. I couldn't resist a dig at XML (Codasyl in disguise)

I have a 1% finished and abandoned browser version of Access, mainly the DAO forms
model including forms design mode. Of course you have to code in Javascript! I remember I spent a
lot of the time on combo-boxes with Access-like selection, autocomplete etc. The actual treatment
of tables was very simple. But I only dealt with the simplest queries.

Feb 14 '06 #9

If you're content with Internet Explorer and have ADO installed on the
client machine you can use ADO recordsets within HTML. No need of ASP
or any Server Side application. Just declare the ADO objects as
ActiveXObjects and IE will handle them within Client Side Script.
Except for concerns about security (everything in the HTML is available
to anyone downloading the page) this is pretty fast and cool.
The arrays you mention are another way. I use them within ASP, so only
the HTML output is downloaded. But they are incredibly fast, as I think
you imply.

Feb 14 '06 #10

"Perhaps you could teach a pig to sing, but I for one would not want to
listen to it."

Oh that's classic DWF, thank you for bringing it up again, that gave me a
good laugh just when I needed it.

--

Terry Kreft
"polite person" <si*****@ease.com> wrote in message
news:60********************************@4ax.com...
Feb 15 '06 #11

Actually it is to move the records into another table. Perhaps my
method is a little roundabout, but I am a rookie at this.
I take all the new records from tableA populate the array, then use
the array to add the new records to tableB.

So you are saying just open the two recordsets and add the new data
from one to the other and skip using the array altogether?

Honestly, I have been working directly with tables and queries in
access for about 5 years. It is only in the last 6 months that I have
even learned what a recordset is, let alone what an array is. So some
of my methods are probably more complicated than they need to be.

I want to do things in the most efficient way I can, but I have to
confess that I was using an array partly because I wanted to learn
more about them... :)

On Tue, 14 Feb 2006 07:13:44 -0700, Tom van Stiphout
<no*************@cox.net> wrote:

Feb 15 '06 #12

Thanks for this. I read about the GetRows method, but I was scared to
try it. I should play around with it. And thanks for cluing me in
that the columns are the first dimension. Is that the standard style
for 2D arrays, columns as first dimension, rows as second dimension?

everyone's responses are super helpful!

On Tue, 14 Feb 2006 12:49:54 -0000, "Terry Kreft"
<te*********@mps.co.uk> wrote:

Feb 15 '06 #13

"Donald Grove" <do*********@verizon.net> wrote in message
news:tq********************************@4ax.com...

Moving records from one table to another should generally be done by
executing an SQL statement of the form INSERT INTO. This is usually not
only the easiest but also the most efficient way. There are times when
there are so many operations to be done on the records before they are
put back that it may be easier to do this with two recordsets: one
opened as forward-only read-only and the other as append-only.
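A rough sketch of that two-recordset pattern (table, field, and connection names here are hypothetical):

```vba
Dim rstSrc As New ADODB.Recordset
Dim rstDest As New ADODB.Recordset

' Source: cheapest possible cursor, since we only read forwards once.
rstSrc.Open "SELECT * FROM tblSource", cnn1, _
    adOpenForwardOnly, adLockReadOnly

' Destination: opened directly on the table for appending.
rstDest.Open "tblDest", cnn1, adOpenKeyset, adLockOptimistic, adCmdTable

Do Until rstSrc.EOF
    rstDest.AddNew
    rstDest!SomeField = rstSrc!SomeField   ' transform values here as needed
    rstDest.Update
    rstSrc.MoveNext
Loop

rstSrc.Close
rstDest.Close
```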
Feb 15 '06 #14

Ordinarily I would agree. But, I have a multiuser data system using
linked tables. I have found that linked tables make everything run
more slowly. REALLY REALLY slowly, and that using ADO for certain
basic tasks is just faster.

On Wed, 15 Feb 2006 13:11:24 +0000 (UTC), "Anthony England"
<ae******@oops.co.uk> wrote:

Feb 15 '06 #15

Donald Grove <do*********@verizon.net> wrote in
news:22********************************@4ax.com:


This appears to me to be a complete non sequitur.

SQL can be executed in ADO or DAO.

It is by far the most efficient method for moving records from one
table to another, and it will also be much faster than a recordset
or array since there is no intermediate data structure created in
memory.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Feb 16 '06 #16

So far, my understanding of dao or ado recordsets doesn't include what
you describe.

The table with the full data is accessed through cnn1, the table I
want to append records to must use a different connection, as it is
not in the same .mdb file. After reading your message, I decided to
give it a try, but I could not do it. I don't know how to write an SQL
statement that appends data from a recordset on one connection to a
recordset or table in a different location. I would love to know how,
and I haven't been able to find it in any of the books I have. Help
would be appreciated.

On Wed, 15 Feb 2006 20:26:52 -0600, "David W. Fenton"
<XX*******@dfenton.com.invalid> wrote:

Feb 16 '06 #17

Why not forget ADO Recordsets as arrays and tell us what you REALLY
want to do? Maybe we could make some useful suggestions

Feb 16 '06 #18

Like I said it's a multi-user "split" data system with the main tables
sitting on a server. But the linked tables make things run too slow.

So, I don't want linked tables. I want copies of the tables on the
client side. When a user adds a new record they are putting data into
unbound text boxes in a form, and when they "save" two things happen:

1. Using ADO, the dbase checks to see if there are new records on the
server, copies them and adds them to the user's copy of the table (in
case other users have added new records since the local copy's last
update).

2. The data for the new record is added to the server table, and THEN
to the user's copy, complete with the auto number generated in the
server table. This keeps the two tables identical.

It's the first step that has been problematic, and where an ADO insert
statement would come in quite handy, but I don't know how to do an sql
insert from a recordset on connection1 (to the server) into a table on
connection2 (the local copy).

There is a separate protocol for updating an existing record, but it's
much easier to manage because nothing needs to be added anywhere.

My users are happy, because they can cruise through the data with lots
of forms, subforms and reports, etc, and it doesn't run super slow,
because the tables aren't linked. My main interest is in making this
method as streamlined as possible.
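For step 2, one way to get the server-generated autonumber back: Jet 4.0 exposes SELECT @@IDENTITY through ADO immediately after an insert on the same connection. This is a sketch, not the poster's actual code; table, field, and connection names are hypothetical.

```vba
Dim rstID As New ADODB.Recordset
Dim lngNewID As Long

' Insert on the server connection, then ask the same connection
' for the autonumber Jet just assigned.
cnnServer.Execute "INSERT INTO tblMain (LastName) VALUES ('Smith')"
rstID.Open "SELECT @@IDENTITY", cnnServer, adOpenForwardOnly, adLockReadOnly
lngNewID = rstID.Fields(0).Value
rstID.Close

' Write the same record, including the server-assigned ID, locally:
cnnLocal.Execute "INSERT INTO tblMain (ID, LastName) VALUES (" & _
    lngNewID & ", 'Smith')"
```
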
On 16 Feb 2006 15:21:40 -0800, "Lyle Fairfield"
<ly***********@aim.com> wrote:

Feb 17 '06 #19


"Donald Grove" <do*********@verizon.net> wrote in message
news:q8********************************@4ax.com...


Why do you suppose it is that many regular posters here use (and would
recommend the use of) linked tables? If the performance is too slow for
you, do you assume they have lower standards?
I have designed a great many databases with linked tables and speed is
generally never an issue. The user clicks on the button and the data
appears. Occasionally there might be a process which needs more time than
this - say 1-5 seconds - but there is usually a valid reason for that and
my customers understand it. However, general operations and navigating
around the database are (as far as the users are concerned) pretty much
instantaneous.
If you are having speed issues with linked tables then there may be
something that can be done while retaining this structure. I would not
imagine that you could improve things by essentially abandoning the way
Access was designed to work in favour of this convoluted approach involving
two sets of data, recordsets, arrays, etc.

If you really believe you cannot work with linked tables, you could give us
an example of what sort of operation you are attempting and what sort of
speed you get. For example, to add a new record, to find a customer by
name, to return the results of some query. If I had to look through 100,000
contacts and find all of those called "John" and save the results in a
textfile - I would expect to do it in about 1 second.


Feb 17 '06 #20

Then again, maybe we can't.

Feb 17 '06 #21

Donald Grove <do*********@verizon.net> wrote in
news:e7********************************@4ax.com:


So far, my understanding of dao or ado recordsets doesn't include
what you describe.


It doesn't involve recordsets. It involves executing SQL commands
that operate directly on the data, instead of populating a recordset
with the data and then operating on the recordset.

The latter is a sequential process, one record at a time.

The former is a set operation, and is substantially faster (not to
mention much easier to code).
The table with the full data is accessed through cnn1, the table I
want to append records to must use a different connection, as it
is not in the same .mdb file. . . .
Big deal.

Don't use ADO. Just write your SQL using the IN statement to specify
data from a different MDB file.

Something like:

INSERT INTO tblLocal
SELECT tblRemote.*
FROM tblRemote IN 'c:\data\Remote.mdb'

In that case, tblLocal would need to be a linked table in the front
end where you are running the SQL.
. . . After reading your message, I decided to
give it a try, but I could not do it. I don't know how to write an sql
statement that appends data from a recordset . . .
There are no recordsets involved.

Using DAO:

Dim db As DAO.Database
Dim strSQL As String

Set db = CurrentDB()
strSQL = "INSERT INTO tblLocal SELECT tblRemote.*"
strSQL = strSQL & " FROM tblRemote IN 'c:\data\Remote.mdb'"
db.Execute strSQL, dbFailOnError
Set db = Nothing
. . . on one connection to a
recordset or table in a different location. I would love to know
how, and I haven't been able to find it in any of the books I
have. Help would be appreciated.


If you have linked tables, one for each back end database, you wouldn't
even need the IN '' parameter to specify the source MDB.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Feb 17 '06 #22

Donald Grove <do*********@verizon.net> wrote in
news:q8********************************@4ax.com:

1. Using ADO, the dbase checks to see if there are new records on
the server, copies them and adds them to the user's copy of the
table (in case other user's have added new records since the local
copy's last update).
If you have a linked table to the back end data table, then there's
no need for ADO or anything. A saved query could be used. You'd just
write a query that joined the remote and local tables with an outer
join, and appended only the records that were not in the local
table. This assumes that there's a primary key to join on. The SQL
would look something like this:

INSERT INTO tblLocal
SELECT tblRemote.*
FROM tblRemote LEFT JOIN tblLocal ON tblRemote.ID = tblLocal.ID
WHERE tblLocal.ID Is Null;

You could create a saved query that does this and just run it.

Or you could do it all in code.

No ADO needed. No recordsets needed.
2. The data for the new record is added to the server table, and
THEN to the user's copy, complete with the auto number generated
in the server table. This keeps the two tables identical.

It's the first step that has been problematic, and where an ADO
insert statement would come in quite handy, but I don't know how
to do an sql insert from a recordset on connection1 (to the
server) into a table on connection2 (the local copy).
Just ignore ADO and recordsets. All you need to do is write a query.
There is a separate protocol for updating an existing record, but
it's much easier to manage because nothing needs to be added
anywhere.


Perhaps in that case, the records should be deleted and re-appended.

I still think that your speed problems with linked tables are due to
design errors in the forms you are using. I have never, in my 10 years
of full-time Access development, found a need to do what you are
proposing.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Feb 17 '06 #23

"Anthony England" <ae******@oops.co.uk> wrote in
news:dt**********@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com:


I can only guess that there's some kind of summarizing of data being
done that is the source of the slowdown. And perhaps it's being done
in the most inefficient way possible, or with multiple joins, or
without proper filtering, or any number of things that can cause
this kind of inefficiency.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Feb 17 '06 #24


"David W. Fenton" <XX*******@dfenton.com.invalid> wrote in message
news:Xn**********************************@127.0.0. 1...

Indeed we don't know. But I somehow doubt that we would find a
well-designed database struggling against a very demanding task.
Feb 17 '06 #25

This discussion thread is closed

Replies have been disabled for this discussion.