
VBA: query a recordset or a table from memory?

P: 93
Hi

I have a large table and I need to submit 500 queries against it (each one is different).

I don't want to use Recordset.Open strSQL, as it takes too much time (each query should run in under 0.15 s, and the table has 200,000 records).

So what I'm thinking is to hold a table in memory, so that I can run SELECT queries against it. Is that possible?

Something like SELECT field1 FROM a recordset, or from a table held in memory?
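
For reference, what I'm doing now per query looks roughly like this (an ADO sketch; the connection, table and field names are just placeholders, and CurrentProject.Connection assumes Access):

' Current approach: open a new recordset for each of the 500 queries (ADO assumed).
Dim rs As ADODB.Recordset
Dim strSQL As String

strSQL = "SELECT Field1 FROM tblData WHERE Field2 = 123;"   ' placeholder query
Set rs = New ADODB.Recordset
rs.Open strSQL, CurrentProject.Connection, adOpenForwardOnly, adLockReadOnly
' ... read the results here ...
rs.Close
Set rs = Nothing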

Many thanks in advance.
Emil
Aug 10 '08 #1
5 Replies


Dököll
Expert 100+
P: 2,364
Greetings, emsik1001!

You mentioned VBA, and right away I thought of the Access forum... I know you'll get the help you need there. Do stay tuned even if you haven't heard anything yet; I'm sure you're aware we're in different time zones, and someone will see your post.

That said, I do think what you're asking is achievable. Perhaps you're interested in using a pass-through query on it - a live link to the back end.

You could then add a make-table query to satisfy the speed requirement, since you will be querying the data repeatedly.

That in turn would allow you to query the local table however many times you need (the make-table query builds a local table because, depending on your connection, hitting a pass-through directly from, say, a form may be slow).

You can then bind the resulting table to a form if you want, add a button to fire the query from the form, and load the data into the form.
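
Something roughly along these lines - just a sketch, assuming a DAO/ODBC set-up in Access; the connect string, query and table names are all placeholders:

' Create a pass-through query that runs on the back end (DAO assumed; names are made up).
Dim db As DAO.Database
Dim qdf As DAO.QueryDef

Set db = CurrentDb
Set qdf = db.CreateQueryDef("qryPassThrough")
qdf.Connect = "ODBC;DSN=MyServer;UID=user;PWD=pass;"    ' substitute your own connect string
qdf.SQL = "SELECT Field1, Field2 FROM RemoteTable"      ' this SQL runs on the server
qdf.ReturnsRecords = True

' Make-table query: pull the results into a local table you can query repeatedly.
db.Execute "SELECT * INTO tblLocalCopy FROM qryPassThrough;", dbFailOnError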

Try it if it fits your current project... Please keep an eye out for other answers on this one as well; mine may be well off what you need at this point. It worked for me once when accessing a Sybase database - I hope you can use it.

Good luck with the project!

Dököll
Aug 10 '08 #2

Expert Mod 2.5K+
P: 2,545
Hi. There is no mechanism I know of for basing SQL processing on recordset data held in memory - Execute, RunSQL and so on have no means of accessing user-defined data structures that store intermediate recordsets. Dököll is right to point you to pass-through queries - you then avoid the overhead of opening a recordset and closing it again in code each time, which you appear to be doing with the Recordset Open method. You may not need to do that in every case, though, as there are alternative ways to run queries without opening recordsets first: if you are working in Access you have DoCmd.RunSQL, and there is also the database object's Execute method to run action queries directly, as I remember.
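
To illustrate what I mean (a minimal sketch, assuming DAO in Access; the SQL, table and field names are made up):

' Run an action query directly - no recordset is opened at all (DAO assumed).
Dim db As DAO.Database
Set db = CurrentDb
db.Execute "UPDATE tblData SET Flag = 1 WHERE Field1 > 100;", dbFailOnError

' Or, from within Access itself:
' DoCmd.RunSQL "UPDATE tblData SET Flag = 1 WHERE Field1 > 100;"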

It is of course possible to open a recordset and process it one record at a time for multiple purposes in the same VBA procedure - but I am certain you would cost yourself a huge amount of work, and performance would be worse, not better.

Vendors put a lot of effort into optimising their database engines. You don't say what your back-end database is - and presumably you are using ADO recordsets, so it could be Access, SQL Server or anything else - but if native SQL queries are running too slowly, it is almost certain that the table or tables you are accessing, and the queries you are running, are not optimally set up for speedy access.

In my opinion your best bet is to ensure that your tables and queries are running under optimum conditions. This means making sure your underlying tables have properly-set primary and secondary keys - as each one of these is automatically indexed, and such indexes can speed up processing manyfold. You can also set up additional indexes on non-key fields where this would speed up processing.
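
For example, an index on a non-key field used in your criteria can be added in code or through table design view (a sketch only - the table and field names are placeholders):

' Index a non-key field that appears in query criteria (names are hypothetical).
CurrentDb.Execute "CREATE INDEX idxField1 ON tblData (Field1);", dbFailOnError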

Also, run action queries using methods such as Execute, which do not require an ADO recordset to be opened in advance.

As a further speed tip, define and use bespoke functions in VBA in place of IIfs and other speed hogs (complex calculated fields, say) in your queries. Compared to an IIf, a VBA function generally runs around twice as fast.
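
Something along these lines (a sketch only; the function, table and field names are made up for illustration):

' A bespoke VBA function used in a query in place of a nested IIf expression.
Public Function SizeBand(ByVal Amount As Double) As String
    Select Case Amount
        Case Is < 100: SizeBand = "Small"
        Case Is < 1000: SizeBand = "Medium"
        Case Else: SizeBand = "Large"
    End Select
End Function

' Then in the query: SELECT Field1, SizeBand(Field1) AS Band FROM tblData;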

Good luck.

Cheers

Stewart
Aug 11 '08 #3

Expert Mod 2.5K+
P: 2,545
Emsik1001, please do not duplicate your posts in this or any other forum. This is against site rules and wastes member time.

I have deleted the duplicate post you placed in this forum.

MODERATOR
Aug 11 '08 #4

P: 93
Thanks for the advice; I will try it today.

The table is pure numbers, nothing more. Should there be any indexing in this case? The criteria are on all fields, as the table only consists of the fields I need.
Aug 11 '08 #5

Expert Mod 2.5K+
P: 2,545
Hi. I can't imagine a table of the order of 10^5 rows that has no ordering structure in it at all, so unless it is truly unstructured numeric data there would be some benefit to indexing the fields on which you are grouping your data. Without some form of grouping or structuring, what do the rows mean? Even larger-scale numeric data (e.g. population samples) has some form of order, inasmuch as the samples will have area identifiers and so on to distinguish one row from another.

To give some idea of the usefulness of an index, a binary search on ordered, indexed data will find any match among 500,000 records in just 19 index lookups (2^19 = 524,288). A linear search on unordered data (no index) would on average have to look at around 500,000 / 2 = 250,000 rows to find a match - over 13,000 times as many lookups. (This assumes the data can be ordered or grouped in some way, but as I said, I cannot imagine that you have 10^5 rows of completely unordered data - though it is possible.)
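
Working the arithmetic through in the Immediate window, just as a quick check:

' Binary search: lookups needed = ceiling(log2(500,000)).
Debug.Print -Int(-(Log(500000#) / Log(2#)))    ' 19
' Linear search: average rows examined, and the ratio between the two.
Debug.Print 500000# / 2#                       ' 250000
Debug.Print (500000# / 2#) / 19                ' roughly 13158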

There are overheads imposed by index maintenance, but these are small compared to the cost of manipulating data on an unordered basis - or of ordering it outside the optimised query engine driving the database itself.

-Stewart
Aug 11 '08 #6
