
3265 error "Item not found in collection"

P: 49
VBA for Access, attempting to read from one table and write summary data to another.

  1. Private Sub SummaryButton_Click()
  2. Dim CurrQuality As Long
  3. Dim CurrHour As Date
  4. Dim CurrProd As Long
  5. Dim CurrEmployee As Long
  6. Dim MyDb As Database, MDrs As Recordset, MDSrs As Recordset2
  7.  
  8.  
  9. Set MyDb = DBEngine.Workspaces(0).Databases(0)
  10. Set MDrs = MyDb.OpenRecordset("MachineData", DB_OPEN_TABLE)
  11. Set MDSrs = MyDb.OpenRecordset("MachineData Summary", DB_OPEN_DYNASET)
  12.  
  13. 'Clear old summary file
  14. MDSrs.MoveFirst
  15. While Not MDSrs.EOF
  16.     MDSrs.Delete
  17.     MDSrs.MoveNext
  18. Wend
  19.  
  20.  
  21. MDrs.MoveFirst  'start at the first record
  22. CurrQuality = MDrs(Quality)
  23. CurrHour = MDrs(TimeStamp1)
  24. CurrProd = MDrs(Production)
  25. CurrEmployee = MDrs(Employee#)
  26.  
  27. While Not MDrs.EOF
  28.     CurrHour = Minute(MDrs(TimeStamp1))
  29.     CurrQuality = MDrs(Quality)
  30.         While CurrQuality = MDrs(Quality) And CurrHour = Minute(MDrs(TimeStamp1))
  31.             MDrs.MoveNext
  32.         Wend
  33.     MDrs.MovePrevious
  34.     MDSrs(StartProduction) = CurrProd
  35.     MDSrs(Machine) = MDrs(Machine)
  36.     MDSrs(StartTimeStamp1) = CurrHour
  37.     MDSrs(Quality) = CurrQuality
  38.     MDSrs(Employee#) = MDrs(Employee#)
  39.     MDSrs(EndTimeStamp1) = MDrs(TimeStamp1)
  40.     MDSrs(EndProduction) = MDrs(Production)
  41.     MDrs.MoveNext
  42. Wend
  43.  
  44. End Sub
  45.  
The error is at "CurrQuality = MDrs(Quality)". Also, if anyone knows of a command to clear a table, versus what I'm doing (deleting one record at a time), it would be appreciated.
Dec 31 '11 #1

✓ answered by Stewart Ross

11 Replies


Expert Mod 2.5K+
P: 2,545
Firstly, if you are trying to access a recordset's fields as you are doing from lines 22 onwards, you have to enclose the field name in double quotes, like this:

CurrQuality = MDrs("Quality")
Secondly, if you wish to clear all records from a table there is no need to use a recordset at all. You could simply do something like:

CurrentDb.Execute "Delete * from [YourTable] Where [SomeCondition]=[WhateverIsBeingTested]"
If you can come up with a clearer explanation of what you are trying to do I'm sure we can assist you further with specifics.

-Stewart
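
Putting the two suggestions together, a minimal sketch of how the revised routine could start might look like this (table and field names are taken from the opening post; the hourly summarising loop itself is left out, so treat it as illustrative rather than a drop-in replacement):

Private Sub SummaryButton_Click()
    Dim MyDb As DAO.Database
    Dim MDrs As DAO.Recordset

    Set MyDb = CurrentDb

    ' Clear the old summary records with a single action query
    ' instead of deleting them one at a time in a recordset loop.
    MyDb.Execute "DELETE * FROM [MachineData Summary]", dbFailOnError

    ' Field names passed to a recordset must be quoted strings.
    Set MDrs = MyDb.OpenRecordset("MachineData", dbOpenTable)
    If Not MDrs.EOF Then
        MDrs.MoveFirst
        Debug.Print MDrs("Quality"), MDrs("TimeStamp1"), MDrs("Employee#")
    End If

    MDrs.Close
    Set MDrs = Nothing
End Sub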
Jan 1 '12 #2

P: 49
You asked: “If you can come up with a clearer explanation of what you are trying to do I'm sure we can assist you further with specifics.”

Big Picture:
We are attempting to use an OPC connection on each machine to track production on the factory floor. We have a data logger to feed the MS SQL 2008 R2 database, and then I’m trying to use MS/Access as the front end for the end user.
OPC will be collecting data about every 3 seconds (seems like overkill, but we do not want the operator waiting around for a downtime code to be transmitted). We have about 120 machines on the floor, so I'm planning on using a VIEW in SQL 2008 for each machine to limit the network traffic: 1,200 records per hour per machine vs. 144,000 per hour querying without the view. More coding, but I would think it should be faster. Please tell me if I'm wrong.
We are interested in a "by hour snapshot" of each machine's production, efficiency and downtime, i.e. a process monitor. The code above is my first attempt at summarizing the data down to one line of data when the hour is up or a change in the downtime code has been found. The plan is to use the summary data to drive the reports for the end user.
Once a night, similar code will be run on the MS SQL database to append the summarized data to the history file for that day's data. Detail data will only be kept for the current day.
Down the road we want feedback to each machine letting the operator know if they are running to plan or not, and a screen at the end of the cell showing how each machine in the cell is doing in real time. Sounds like Big Brother watching, but we do try to focus on the process and not the operator most of the time.
I know a little programming, and suggestions are always appreciated. But not knowing the "grammar" of VBA is driving me crazy.

PS. Data I'm dealing with now is dummy data from the data logging software.
Jan 1 '12 #3

Expert Mod 2.5K+
P: 2,545
From what you have outlined above I am sure that it would be beneficial to compute the summary machine performance data using SQL queries instead of recordset-based processing, particularly if those queries were passed to SQL-Server itself for processing (Access terms these 'pass-through' queries, as the query is not handled by Access but instead passed through to the host database back-end server for processing).

SQL queries are likely to run much faster than recordset processing in Access, for a number of reasons. The database engine is optimised in ways that recordset processing cannot be, and if SQL Server does the processing the records are handled on the back-end server far faster than Access could manage: Access has to fetch all of its data from SQL Server before it can do anything with it, whereas a pass-through query run by SQL Server sends only the results back to Access, not the records it processed.
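
As an illustration only, a pass-through query can be set up from Access VBA through a DAO QueryDef whose Connect property points at SQL Server. The ODBC connection string, server name and summary SQL below are placeholder assumptions rather than values from this thread:

Dim qdf As DAO.QueryDef
Dim rs As DAO.Recordset

' A temporary (unnamed) pass-through QueryDef; the connection string
' and the summary SQL are placeholders for illustration only.
Set qdf = CurrentDb.CreateQueryDef("")
qdf.Connect = "ODBC;Driver={SQL Server};Server=MyServer;Database=MyDatabase;Trusted_Connection=Yes;"
qdf.SQL = "SELECT Machine, MIN(TimeStamp1) AS StartTime, MAX(TimeStamp1) AS EndTime " & _
          "FROM MachineData GROUP BY Machine;"
qdf.ReturnsRecords = True

' Only the summarised rows travel back over the network;
' the grouping is done by SQL Server, not by Access.
Set rs = qdf.OpenRecordset()
Do While Not rs.EOF
    Debug.Print rs!Machine, rs!StartTime, rs!EndTime
    rs.MoveNext
Loop
rs.Close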

What we'd need to help you is an indication of the names of the tables and fields involved, and the summary data you expect to result from the 'by hour' snapshot and the daily summary run.

A positive advantage of using SQL is that SQL focuses on the what of the processing, not the how; with recordset processing you have to work out for yourself the how part, as well as grappling with VBA syntax to get to first base. This is a very error-prone and time-consuming approach compared to preparing a summary query in SQL - or, if you are not familiar with SQL itself, in the graphical query designer in Access to begin with.

-Stewart
Jan 1 '12 #4

P: 49
I'm 95% sure I understand what you are saying: do all the data crunching on the SQL server. My original plan was to copy the viewed data for that machine locally and then crunch the data, at most 12,000 records a day per machine given a 10 hour work day.

The one thing in your suggestion I have a hard time getting my head around is that an end user needs to be able to bring up data at any time during the day. Are you going to tell me (show me) how to pass a request from Access to MS SQL 2008 to run a SQL program to fetch the data requested? If that is what you are suggesting, it will have to be multi-user friendly.

I have written some SQL queries in Oracle, far from a pro.

Do I understand your suggestion?
Jan 1 '12 #5

Expert Mod 2.5K+
P: 2,545
You are right about what I'm suggesting. Use SQL Server for what it is good at - including the back-end processing of SQL queries to do the number crunching. Access, which can present the user with a friendly forms-based front-end, would use the queries you develop as the base on which everything else is built.

Users would not be interacting directly with queries as such; for ease of use you'd create an Access form or report based on such a query to lay out the data in whatever is the most appropriate way.

Make use of Access's strengths as a system for creating user-friendly forms and reports based on simple switchboards (see, for example, the Northwind sample database supplied with Access for ideas on how users can interact via forms and reports). I am also suggesting that you make use of SQL-Server's inherent processing power to perform the back-end number crunching - it makes much more sense than spending what will be a lot of time inventing in VBA what SQL can do already.

I and other posters could probably come up with SQL queries to help start you on the road to developing your application, at least short-cutting the effort you would undoubtedly be putting in if you try to develop everything in VBA.

You may need to adopt hybrid approaches, given the scale of the processing involved (for example, creating temporary tables to store intermediate results each hour to speed up user querying), but this is not something at present that can be predicted without prototyping up a working system - which is what you have already started on with your test system data.
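
As one sketch of such a hybrid step, an hourly roll-up could be appended to an intermediate summary table with a single action query run from Access. The HourlySummary table and its columns below are hypothetical, not names from this thread:

' Hypothetical hourly roll-up: one summary row per machine per hour,
' letting the query engine do the grouping rather than a VBA loop.
Dim sSQL As String

sSQL = "INSERT INTO [HourlySummary] (Machine, SnapshotDate, SnapshotHour, RecordCount) " & _
       "SELECT Machine, DateValue(TimeStamp1), Hour(TimeStamp1), Count(*) " & _
       "FROM MachineData " & _
       "GROUP BY Machine, DateValue(TimeStamp1), Hour(TimeStamp1);"

CurrentDb.Execute sSQL, dbFailOnError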

-Stewart
Jan 1 '12 #6

P: 49
Do you know a good reference book for Microsoft SQL Server 2008 R2, so I don't have to bother anyone for every and, if, or but?

Sounds like I'm going to spend some time in that.
Jan 1 '12 #7

NeoPa
Expert Mod 15k+
P: 31,712
This may sound like a trite answer, but when I was working in T-SQL (SQL Server's SQL language) I found everything I needed on the web. I kept a page open at Transact-SQL Reference (2008 R2) and from there I could find everything I ever needed.

Otherwise, we have a pretty responsive SQL Server forum here too :-)
Jan 2 '12 #8

P: 49
Ross,

Just getting back with you. Here is the SQL Rabbit helped me out with to do the same thing I was doing above:

SELECT U1.Quality, U1.TimeStamp1 AS StartTime,
    CASE WHEN U2.PriorTimeStamp1 IS NULL THEN (
        SELECT MAX(TimeStamp1) FROM @t
    ) ELSE U2.PriorTimeStamp1 END AS EndTime
FROM (
    SELECT ROW_NUMBER() OVER (ORDER BY T1.TimeStamp1) AS S,
        T1.Quality, T1.TimeStamp1, T2.TimeStamp1 AS PriorTimeStamp1
    FROM (
        SELECT ROW_NUMBER() OVER (ORDER BY TimeStamp1) AS R, *
        FROM @t
    ) AS T1
    LEFT JOIN (
        SELECT ROW_NUMBER() OVER (ORDER BY TimeStamp1) AS R, *
        FROM @t
    ) AS T2
    ON (T1.R - 1) = T2.R
    WHERE T1.Quality != T2.Quality
        OR T2.Quality IS NULL
        OR DATEPART(MINUTE, T1.TimeStamp1) != DATEPART(MINUTE, T2.TimeStamp1)
) AS U1
LEFT JOIN (
    SELECT ROW_NUMBER() OVER (ORDER BY T1.TimeStamp1) AS S,
        T1.Quality, T1.TimeStamp1, T2.TimeStamp1 AS PriorTimeStamp1
    FROM (
        SELECT ROW_NUMBER() OVER (ORDER BY TimeStamp1) AS R, *
        FROM @t
    ) AS T1
    LEFT JOIN (
        SELECT ROW_NUMBER() OVER (ORDER BY TimeStamp1) AS R, *
        FROM @t
    ) AS T2
    ON (T1.R - 1) = T2.R
    WHERE T1.Quality != T2.Quality
        OR T2.Quality IS NULL
        OR DATEPART(MINUTE, T1.TimeStamp1) != DATEPART(MINUTE, T2.TimeStamp1)
) AS U2
ON (U1.S + 1) = U2.S
I have not put a stopwatch on it, but this code appears to run slower than what I created above. I suspect it is because of the way the data must be retrieved, row by row; we are losing SQL's power.

Here is Rabbit's quick overview:

"There's a lot going on in that SQL, let me know if you have any questions.

Here's a quick rundown. It joins the table to itself on the row to the previous row. Which can be used to see if the previous row's information differs from the current row's information.

It then takes that and joins to itself again to get the start and end data."

I'm debating whether doing the crunching in SQL is the way to go.

One other factor I didn't mention before is we have two networks at our plant:

1) Company wide network
2) Engineering network (only in our plant)

Both networks are attached to one computer; this computer is where I am planning on doing the crunching before passing the data on to the company-wide network. It is just a matter of doing the crunching in Access or in SQL.
If you still feel strongly that I should stay in SQL, I'll still look at going down this path.
Jan 8 '12 #9

Rabbit
Expert Mod 10K+
P: 12,421
The convoluted SQL that I posted is a result of having to work around the drawbacks of SQL. The problem is that while it is record based, it is not record agnostic, by which I mean a record is unaware of the other records. And as a result, when you need to compare between records, it can be a performance hog. If you put in the correct indexes, you may be able to speed up the process considerably.
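
For instance, while prototyping against a local Access copy of the detail table, an index covering the columns the query orders and compares by could be added with a DDL statement run from VBA; the index name below is an assumption:

' Hypothetical index on the ordering/comparison columns used by the query.
CurrentDb.Execute "CREATE INDEX idxMachineData_TimeQuality " & _
                  "ON MachineData (TimeStamp1, Quality);", dbFailOnError

On the SQL Server side the equivalent would be a T-SQL CREATE INDEX on the same columns.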
Jan 9 '12 #10

NeoPa
Expert Mod 15k+
P: 31,712
Rabbit: "it is not record agnostic"
Do you not mean it is record agnostic? I know this seems fussy, but I was confused, and I prefer to understand where I can.

PS. This is a very important point, of course - one that people often struggle to appreciate when dealing with data in an RDBMS.
Jan 9 '12 #11

Rabbit
Expert Mod 10K+
P: 12,421
Sorry, yes, I meant it is record agnostic.
Jan 9 '12 #12
