
Heavy Data-Import to Access

Hello,

I'm not an Access specialist, so I hope you can give me some input.

I have to import XML data (about 15 different data fields) into an Access database each night. The scripting language I have to use is ASP.

Do you think there will be a problem with about 20,000 to 30,000 records, about 99% of which have to be updated each night? Is Access strong enough for this heavy traffic?

In my opinion it is too much traffic, because I have to look at every XML record and check whether it already exists in the Access database - and that more than 20,000 times!

What are your thoughts? What will be a good start? Some hints?

Yours,
Andreas Greiner
Nov 12 '05 #1
DFS
"Innuendo" <a.*******@gmx.at> wrote in message
news:77**************************@posting.google.c om...
I have to import XML data (about 15 different data fields) into an Access database each night. The scripting language I have to use is ASP.

Do you think there will be a problem with about 20,000 to 30,000 records, about 99% of which have to be updated each night? Is Access strong enough for this heavy traffic?
Access (really the Jet db engine) is plenty strong enough.
In my opinion it is too much traffic, because I have to look at every XML record and check whether it already exists in the Access database - and that more than 20,000 times!



Sounds like you'll spend about half your time parsing XML data and the other
half looking it up in the Access db.

Describe your problem in more detail and maybe I (or cdma) can offer
suggestions. It might be the case that you can write your XML data to work
tables and execute bulk lookups/inserts against the database.


Nov 12 '05 #2
For feasibility, it really depends on the computer that is running the
application. Does it have enough memory (RAM and disk space)? Is the
speed of the processor adequate? Are other programs sapping the
processing power/memory of the Access application?

You can import the data into a temporary table (see the Access Help
article "ImportXML Method"), then run elimination queries against that
data comparing the new data to existing data and eliminating the new
data that already exists in the db. E.g.:

DELETE DISTINCTROW NewTable.*
FROM NewTable INNER JOIN OldTable
ON NewTable.ID = OldTable.ID

IOW, you'd join the two tables using the fields that uniquely identify
each record, which deletes the new data in the NewTable that already
exists in the OldTable.
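
After that elimination query, the rows left in NewTable are the genuinely new ones. A minimal follow-up sketch for appending them, assuming the same NewTable/OldTable names as above and matching column layouts:

INSERT INTO OldTable
SELECT NewTable.*
FROM NewTable;

Run it right after the DELETE, and the two bulk statements together give you "insert only what isn't there yet" without any per-record lookups.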

HTH,

MGFoster:::mgf00 <at> earthlink <decimal-point> net
Oakland, CA (USA)

Nov 12 '05 #3
Hi,

thank you for your answer.

Yes, you are right. The XML data will contain fields like first name, surname, address, birthday, email, ... - this data has to be "copied" to the Access database.

At this point I don't know whether it is a new user or an existing user. If it is a new user, I have to do an insert; if it's an existing user, I have to check the data, and if something in the XML data has changed, I have to do an update.

Your idea with work tables is good. I think this will be an advantage
for me.

What exactly do you mean by "execute bulk lookups/inserts"?
What possibilities do I have with Access for bulk lookups and inserts?

Thank you for your nice help (and excuse my "Austrian" English)

Yours,
Andreas


Nov 12 '05 #4
DFS
"Andreas Greiner" <a.*******@gmx.at> wrote in message
news:40***********************@news.frii.net...
Hi,

thank you for your answer.

Yes, you are right. The XML-Data will contain data like firstname,
sirname, adress, birthday, email, ... - this data has to be "copied" to
the access-database.

At this point I don't know, if it is a new user, or an existing user. If
it is a new user, then I'have to make an insert - if it's an existing
user, I've to control the data, and if something has changed of the
xml-data, then I've to make an update.

Your idea with work tables is good. I think this will be an advantage
for me.

What do you exactly mean with "execute bulk lookups/inserts"?
What possibilitis do I have with access and bulk lookups and inserts?
Andreas,

Instead of doing 20,000 lookups, insert all the records into a work table
and execute a few queries between the work table and the master:

1) execute updates to existing users

examples:

UPDATE Master INNER JOIN Work
ON Master.UserID = Work.UserID
SET Master.Address = Work.Address
WHERE Work.Address <> Master.Address
OR (Master.Address IS NULL AND Work.Address IS NOT NULL);

UPDATE Master INNER JOIN Work
ON Master.UserID = Work.UserID
SET Master.Email = Work.Email
WHERE Work.Email <> Master.Email
OR (Master.Email IS NULL AND Work.Email IS NOT NULL);

2) insert new users.

INSERT INTO Master (UserID, FirstName, LastName, Birthdate, Address...)
SELECT UserID, FirstName, LastName, Birthdate, Address...
FROM Work

(The above will try to insert all records, but the ones that already exist will fail on the primary key; I don't know whether that raises an error in ASP or not.) You could also try the NOT IN form below, or the join variation sketched after step 3:

INSERT INTO Master (UserID, FirstName, LastName, Birthdate, Address...)
SELECT UserID, FirstName, LastName, Birthdate, Address...
FROM Work
WHERE UserID NOT IN (SELECT UserID FROM Master);
3) empty the work table

DELETE FROM Work;
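
A variation on step 2 that often runs faster in Jet than a NOT IN subquery is an outer-join "anti-join" - a sketch using the same Master/Work names as above:

INSERT INTO Master (UserID, FirstName, LastName, Birthdate, Address...)
SELECT Work.UserID, Work.FirstName, Work.LastName, Work.Birthdate, Work.Address...
FROM Work LEFT JOIN Master
ON Work.UserID = Master.UserID
WHERE Master.UserID IS NULL;

Whichever form you use, an index on UserID in both tables makes a big difference at 20,000-30,000 rows.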
But, is this a one-time thing? You probably shouldn't have to be parsing
20,000 user records very often.

Nov 12 '05 #5
Hi,

thank you very much for your hints. I think your code examples give me a solution.

One thing more: YES, I have to do this parsing EVERY night. So that will be very heavy for Access and the tables!

Andreas
Nov 12 '05 #6
DFS

"Andreas Greiner" <a.*******@gmx.at> wrote in message
news:40***********************@news.frii.net...
Hi,

thank you very much for your hints. I think your code examples give me a solution.
You're welcome.

One thing more: YES, I have to do this parsing EVERY night.
I haven't seen your db or system, but you probably shouldn't have to go to
that much trouble. Can you not include a CreateDate and/or LastUpdateDate
field in your db or XML somewhere that will let you know the new and updated
users?
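
If such a field existed, the nightly job could pull only the changed rows into the work table - a sketch assuming a hypothetical LastUpdateDate column in the incoming data and a [LastRunDate] parameter for the time of the previous run:

SELECT UserID, FirstName, LastName, Birthdate, Address...
FROM Work
WHERE LastUpdateDate > [LastRunDate];

That way only rows that actually changed since the last run would have to be compared and written.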

So that will be very heavy for Access and the tables!
Again, it won't be a problem for the Access/Jet db engine, but it could be
time consuming depending on how you implement your solution.

Nov 12 '05 #7

