Bytes IT Community

Severe DB bloat after importing excel records

robert d via AccessMonster.com
I just imported 62000 rows twice from Excel. In my import routine, I create a
holding table and then validate the data prior to committing to the main
table. The holding table is created in the front end. All other tables are in
the backend.

After the import, response from the application was extremely slow and it
appeared to have frozen. It occurred to me that database bloat might be the
problem. So I used Ctrl-Alt-Del to close the application and checked the
front end size. It had gone from 12 MB in size to 323 MB in size.

When my front end opens, I have code that checks the size of the front end.
If it exceeds a certain size, a backup copy is made and the database close on
compact switch is set so that when the front end is closed it will
automatically compact. At 323 MB in size, it took about 5 minutes to open the
front end, which normally opens in about 30 seconds. When the database was
closed to start the compact on close, it took 5 minutes to close and compact.
After the compact, the size of the front end was 8 MB.
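A startup check like the one described might be sketched as follows (the 20 MB threshold and ".bak" backup name are illustrative assumptions; copying the front end while it is open may fail depending on how it was launched, so some setups take the backup from a launcher script instead):

```vba
' Sketch of the startup size check described above.
' Threshold and backup naming are illustrative assumptions.
Public Sub CheckFrontEndSize()
    Const MAX_BYTES As Long = 20& * 1024& * 1024&   ' 20 MB threshold

    If FileLen(CurrentDb.Name) > MAX_BYTES Then
        ' Keep a backup copy before compacting (assumes the open file
        ' can still be copied; otherwise back up from a launcher)
        FileCopy CurrentDb.Name, CurrentDb.Name & ".bak"
        ' Flag the database to compact when it is closed
        Application.SetOption "Auto Compact", True
    Else
        Application.SetOption "Auto Compact", False
    End If
End Sub
```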

So what to do, if anything? I don't think putting the holding table in the
backend will help, because I don't have a routine in place that checks the
backend and automatically compacts it (although I guess I could add one, but
what if the backend is accessed from a server by multiple users?). At least
if the problem is in the front end, it quickly (and painfully) becomes
apparent to the user.

Still, it makes the application look amateurish and will probably aggravate
users. On the other hand, it is unlikely users will ever import more than
5000 rows at a time (I was setting up the backends with 7 years of
historical data, which is why there were so many rows).

Are there any suggestions for what I can do to minimize the bloat, if
anything? The only thing I can think of is that my app does create an
instance of a temporary database with tables used to hold the results of
complex calculations. The temp database is created for each instance of the
front end that is running (so it is created on the user's PC and not on the
server where the main backend is). If I create the holding table in the temp
DB, it won't affect the more permanent front and back ends. I guess it would
still result in slow performance because the temp DB is linked to the front
end, but maybe that would only occur when tables are accessed from the temp
DB.
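Moving the holding table into a per-session temp database could be sketched like this (the temp path, table design, and names are assumptions for illustration):

```vba
' Sketch: create a per-session temp DB with a holding table inside it,
' then link that table into the front end. Names/paths are illustrative.
Public Function CreateTempHoldingTable() As String
    Dim strPath As String
    Dim dbTemp As DAO.Database
    Dim tdf As DAO.TableDef

    strPath = Environ("TEMP") & "\MyAppTemp.mdb"
    If Dir(strPath) <> "" Then Kill strPath

    ' Create the temp database on the user's PC
    Set dbTemp = DBEngine.CreateDatabase(strPath, dbLangGeneral)
    dbTemp.Execute "CREATE TABLE tblHolding " & _
        "(CustomerID LONG, ImportDate DATETIME, Amount CURRENCY)"
    dbTemp.Close

    ' Link the holding table so import bloat lands in the temp file,
    ' not in the front end or back end
    Set tdf = CurrentDb.CreateTableDef("tblHolding")
    tdf.Connect = ";DATABASE=" & strPath
    tdf.SourceTableName = "tblHolding"
    CurrentDb.TableDefs.Append tdf

    CreateTempHoldingTable = strPath
End Function
```

Since the temp file is discarded rather than compacted, any bloat it accumulates during an import simply disappears with it.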

Suggestions, thoughts, comments are appreciated.

Thanks.

--
Message posted via AccessMonster.com
http://www.accessmonster.com/Uwe/For...ccess/200511/1
Nov 16 '05 #1
7 Replies


Br@dley
robert d via AccessMonster.com wrote:
[quoted text clipped - 46 lines]

If you are doing calcs that use temporary tables then I'd certainly
suggest using local tables to do it.
--
regards,

Bradley

A Christian Response
http://www.pastornet.net.au/response
Nov 17 '05 #2

robert d via AccessMonster.com
What do you mean by "local tables"? Do you mean tables that reside in the
front end?

Br@dley wrote:
I just imported 62000 rows twice from Excel. In my import routine, I
create a holding table and then validate the data prior to committing

[quoted text clipped - 42 lines]

Thanks.


If you are doing calcs that use temporary tables then I'd certainly
suggest using local tables to do it.


--
Message posted via AccessMonster.com
http://www.accessmonster.com/Uwe/For...ccess/200511/1
Nov 17 '05 #3

Br@dley
robert d via AccessMonster.com wrote:
What do you mean by "local tables"? Do you mean tables that reside
in the front end?
Yes, or in a separate MDB.
Br@dley wrote:
I just imported 62000 rows twice from Excel. In my import routine, I
create a holding table and then validate the data prior to
committing

[quoted text clipped - 42 lines]

Thanks.


If you are doing calcs that use temporary tables then I'd certainly
suggest using local tables to do it.


--
regards,

Bradley

A Christian Response
http://www.pastornet.net.au/response
Nov 17 '05 #4

robert d via AccessMonster.com
Yes, as I mentioned in my first post I already create a Temp DB each time a
user starts the front end (tempDB is created on the user's PC where the front
end resides). The TempDB is killed at the end of the session. It seems
logical to put a temporary table there. I just wanted to get some
confirmation that this is probably the best way to handle it from people who
have more experience than I do.
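The end-of-session cleanup mentioned above might look roughly like this (the link name and path handling are illustrative assumptions):

```vba
' Sketch of end-of-session cleanup: drop the linked holding table,
' then delete the temp database file. Names are illustrative.
Public Sub KillTempDB(strTempPath As String)
    On Error Resume Next
    CurrentDb.TableDefs.Delete "tblHolding"   ' remove the link first
    On Error GoTo 0
    If Dir(strTempPath) <> "" Then Kill strTempPath
End Sub
```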

Thanks.

Br@dley wrote:
What do you mean by "local tables"? Do you mean tables that reside
in the front end?


Yes, or in a separate MDB.
I just imported 62000 rows twice from Excel. In my import routine, I
create a holding table and then validate the data prior to
[quoted text clipped - 5 lines]
If you are doing calcs that use temporary tables then I'd certainly
suggest using local tables to do it.


--
Message posted via AccessMonster.com
http://www.accessmonster.com/Uwe/For...ccess/200511/1
Nov 17 '05 #5

Br@dley
Yup, so IMO it's probably the best way to handle it.

robert d via AccessMonster.com wrote:
Yes, as I mentioned in my first post I already create a Temp DB each
time a user starts the front end (tempDB is created on the user's PC
where the front end resides). The TempDB is killed at the end of the
session. It seems logical to put a temporary table there. I just
wanted to get some confirmation that this is probably the best way to
handle it from people who have more experience than I do.

Thanks.

Br@dley wrote:
What do you mean by "local tables"? Do you mean tables that reside
in the front end?


Yes, or in a separate MDB.
> I just imported 62000 rows twice from Excel. In my import
> routine, I create a holding table and then validate the data
> prior to

[quoted text clipped - 5 lines]
If you are doing calcs that use temporary tables then I'd certainly
suggest using local tables to do it.


--
regards,

Bradley

A Christian Response
http://www.pastornet.net.au/response
Nov 17 '05 #6

lylefair
I think it would be very difficult to say "this is the best way" even if we
knew everything about the Excel range(s), the import procedure, the
front end and the back end. There is always a better way.
Since we don't know, I think it's a little more difficult than that.
In many years of programming Excel and Access through VBA, I have used a
temporary local table only once (when it was required by the group
leader); perhaps there is a good reason for doing so, but I have yet to
see it.

Nov 17 '05 #7

Br@dley
lylefair wrote:
I think it would be very difficult to say "this is the best way" if we
knew everything about the Excel range(s), the import procedure, the
front end and the back end. There is always a better way.
Since we don't know, I think it's a little more difficult than that.
One can assume my comments are based on the information given in the
post I was replying to, i.e. it sounds like the best way given the
information we have.
In many years programming Excel and Access through VBA I have used a
temporary local table only once (when it was required by the group
leader); perhaps there is a good reason for doing so but I have yet to
see it.


I tend to only use temp tables when importing data from another system
that requires a lot of 'massaging'. Some people use temp tables whenever
things start getting complicated. Sometimes a complex SQL statement can
do the same thing much more elegantly. I like to do things this way and
try to make things dynamic. Sometimes, though, it's not practical given
time constraints/budget/etc. if an elegant solution is not easily
forthcoming.
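To illustrate the point about complex SQL replacing a temp table: an interim aggregation that might otherwise be written to a holding table and then appended can often be folded into a single append query (table and field names below are invented for the example):

```vba
' Illustrative only: a staged total that might otherwise go through a
' temp table, folded into one append query. Names are invented.
Public Sub AppendCustomerTotals()
    CurrentDb.Execute _
        "INSERT INTO tblSummary (CustomerID, TotalAmount) " & _
        "SELECT CustomerID, Sum(Amount) FROM tblOrders " & _
        "GROUP BY CustomerID", dbFailOnError
End Sub
```

No intermediate table means no rows written and deleted in the front end, and therefore no bloat from that step.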
--
regards,

Bradley

A Christian Response
http://www.pastornet.net.au/response
Nov 17 '05 #8

This discussion thread is closed; replies have been disabled.