Bytes | Software Development & Data Engineering Community

SQL Server Conversion

A client is panicking about their large Access application, which
has been running smoothly with 100s of thousands of records for
quite some time. They have a big project in the next year that will
lead to a lot of use of the database and the adding of quite a lot
of new data (though I can't conceive of them adding more than
10s of thousands of records, which won't change the current
performance profile at all).

If there is a SQL Server conversion, my question is this:

1. just do an ODBC upsizing, OR

2. convert the whole damned thing to ADO.

Obviously, #1 is going to be a lot easier. Yes, I'm aware of a
number of places where I'll need to drastically alter the way the
application works (though it nowhere loads any large number of
records, of course). And I can easily think of several areas where
server-side processing will vastly improve performance.

My gut says to change as little as necessary, and just go with ODBC
linked tables and then fix all the things that are inefficient when
converted. This means a lot of stored procedures.

The one unbound form (where the highest volume of data entry takes
place) is now all done with DAO. Perhaps it would benefit from
conversion to ADO? Indeed, it is part of a UI with a bound list
subform and an unbound detail subform, so I could use a single ADO
recordset for the recordsource of both (or are form recordsources
DAO by default?), perhaps (I've thought of that one for quite some
time).
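
Something like this is the sketch I have in mind, assuming Access 2002's ability to bind an ADO recordset to a form (all object names here are invented, and updatability requires a server-side keyset cursor with optimistic locking):

```vba
' Sketch only -- untested. Share one ADO recordset between the bound
' list subform and the detail subform. Requires Access 2002 or later.
Dim cnn As New ADODB.Connection
Dim rs As New ADODB.Recordset

cnn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
         "Initial Catalog=MyDb;Integrated Security=SSPI;"
rs.CursorLocation = adUseServer          ' keyset cursor must be server-side
rs.Open "SELECT * FROM dbo.tblPerson WHERE PersonID = " & lngPK, _
        cnn, adOpenKeyset, adLockOptimistic   ' lngPK: PK chosen in the UI

Set Me.sfmList.Form.Recordset = rs
Set Me.sfmDetail.Form.Recordset = rs
```

(And to answer my own parenthetical: in an MDB, a form's Recordset property is DAO by default unless you explicitly assign an ADO recordset.)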

Any suggestions as to how to answer this question for myself? What
specific things should I look at to help evaluate the question?

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 12 '05 #1
"David W. Fenton" <dX********@bway.net.invalid> wrote in message
news:94***************************@24.168.128.74...

I can only tell you that I have managed a large project with SQL Server
using ODBC/DAO for several years and find that I have no problems at all
with performance. I mostly use ODBC linked tables, but where I see a
performance advantage, I use Pass-Throughs and Stored Procedures. You'd be
amazed just how infrequently I need to do either one though.

I find that Jet/ODBC does a very good job at passing the SQL to the server
even when I don't go out of my way to force it. When I made the conversion
I did exactly as you described. I changed from linked Access tables to
ODBC linked SQL Server tables and then (without changing anything else) I
tested the app.
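
The relink step is nothing fancy; it is roughly this sort of DAO code (the DSN and table names are placeholders, not anything from your app):

```vba
' Sketch: repoint an existing linked table at SQL Server via ODBC.
Dim db As DAO.Database
Dim tdf As DAO.TableDef

Set db = CurrentDb
Set tdf = db.TableDefs("tblDonation")
tdf.Connect = "ODBC;DSN=MySqlServerDsn;DATABASE=MyDb;Trusted_Connection=Yes"
tdf.SourceTableName = "dbo.tblDonation"
tdf.RefreshLink          ' rebuilds the link against the new back end
```
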

I did have to modify several things to "just make them work" and several
others to get performance up to par, but again, it was not so many things
as you might imagine. Once you achieve that position you can put the app
on-line and at a more leisurely pace make a closer examination of where you
might find some performance gains with additional changes.

My opinion is that you really don't need "all of the processing" on the
server so much as "all of the selecting". As long as the server is not
returning more rows/columns than necessary and bogging down the network,
any other processing will be about as fast on the client as on the server.
Nowadays, most client PCs have way more CPU than required anyway.

This strategy allows you to retain all DAO code (I've never written a line
of ADO) and stay with what's familiar. To me if this is an important app
that's been in long term use, then you will have more than enough to worry
about by introducing a new back end. Introducing a whole host of other
changes at the same time (that might be completely unnecessary) just clouds
the waters when you do need to chase a problem down.

About a year ago I went through my entire app and looked at every single
query to see which could be changed to Pass-Throughs or SPs. I found about
three (of 100 or so) that performed any better as a PT. And in my
experience SPs are more of an answer to security problems than performance
issues. They are great if you need to do a bunch of conditional
processing, use temp tables, and the like, but I just don't see many cases
where I can't pull in the required rows and do all of that same processing
in VBA just as quickly (while being much easier for me to write).

There may in fact be some advantages to ADO in NEW applications (still
waiting for some published data on this), but for a mature app already
fulfilling its requirements using DAO it just seems like an incredible
waste of time to attempt converting it.
--
I don't check the Email account attached
to this message. Send instead to...
RBrandt at Hunter dot com
Nov 12 '05 #2
It just seems to me that you could import all the Access tables to Sql
Server, establish your relationships in sql Server, add some triggers
for the data entry, but I would definitely go with ADO for the data
entry. And for error handling you will need stored procedures to commit
or rollback stuff on the sql server end (depending on how user intensive
your project is). But the robustness and performance of sql server will
more than repay you for your efforts. People at my place had me connect
them with ODBC for ad hoc stuff. This seems to work fine. But I
wouldn't automate anything with ODBC.

The rule of thumb that has worked for me between sql Server and Access
is that ODBC works pretty well for ad hoc queries. But for serious data
entry through Access or Web forms, and data crunching and automation of
data reporting, I would stick with sp's and ADO.
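
By sp's and ADO I mean roughly this pattern (the connection string, proc name, and parameters are placeholders, not from any particular app):

```vba
' Sketch: run a data-entry stored procedure through an ADO Command.
Dim cnn As New ADODB.Connection
Dim cmd As New ADODB.Command

cnn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
         "Initial Catalog=MyDb;Integrated Security=SSPI;"
With cmd
    .ActiveConnection = cnn
    .CommandType = adCmdStoredProc
    .CommandText = "dbo.usp_AddDonation"
    .Parameters.Append .CreateParameter("@PersonID", adInteger, _
                                       adParamInput, , lngPersonID)
    .Parameters.Append .CreateParameter("@Amount", adCurrency, _
                                       adParamInput, , curAmount)
    .Execute    ' the proc handles commit/rollback on the server
End With
cnn.Close
```
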

Rich

Nov 12 '05 #3
We have had good experiences with just upsizing the back end
for Access 2000 applications then using ODBC linked tables.
We got the best results by using Access 2002 for the upsizing,
just upsizing the structure, then using the SQL Server Data
Transformation Service to port the data.
In one app there are some partially unbound forms where I used
ADO to move the data between the forms and the
tables, and occasionally VBA Boolean data (checkboxes)
did not translate properly to the SQL Server Bit data type.
I wrote some SQL Server Stored Procs to do the translation
explicitly and that appears to have fixed the problem.
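
The translation procs were along these lines (table and column names invented here); the wrinkle is that Access stores True as -1, which has to be normalized to 1 for a bit column:

```sql
-- Sketch: normalize imported Access Yes/No values (True = -1) to bit.
CREATE PROCEDURE dbo.usp_FixImportedFlags
AS
BEGIN
    UPDATE dbo.tblImport
    SET IsActive = CASE WHEN ImportedFlag <> 0 THEN 1 ELSE 0 END;
END
```
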

Ragnar


Nov 12 '05 #4
ri*********@hotmail.com (Rick Brandt) wrote in
<bn*************@ID-98015.news.uni-berlin.de>:
There may in fact be some advantages to ADO in NEW applications
(still waiting for some published data on this), but for a mature
app already fulfilling its requirements using DAO it just seems
like an incredible waste of time to attempt converting it.


Well, that's certainly my thought.

But the reason to use ADO is partly to avoid having to do a lot of
server-side stuff. I'll be administering this SQL Server app, so
that won't really be an issue, but if I can do through ADO what
would take a complex stored procedure in ODBC, maybe it would be
worth it.

In other words, where I'm contemplating replacing the existing
mechanisms with ones appropriate to SQL Server, should I not
contemplate the ADO methods for doing so?

Is there a problem mixing the two in an app, using linked
tables/ODBC in some places and ADO in places where it makes things
easier?

What kind of tasks would those likely be?

I've got a number of issues here:

1a. adding new people currently uses an unbound form that collects
a little data and presents a list of possible matches (to reduce
creation of duplicates; if they don't check for dupes they can't
add the record). My algorithm for evaluating closeness of the match
to what the user puts in is complex and belongs serverside, no
doubt. In thinking it through, I guess I'd really only pass the
WHERE clause that limits results to the close matches and then do
the calculation in Access using my algorithm for evaluating
closeness. I shudder to think about doing that server-side, as the
code is fairly complex. It will only be operating on small numbers
of records (always fewer than 100 or so), so I don't see an issue.

1b. the other part of adding a person is actually doing it. In
Access I open an AppendOnly recordset, add the record, plug in the
values, cache the PK value, save the record, close the recordset,
then open a form with the newly added record loaded. A stored
procedure with input parameters and an output value of the PK would
do this just fine, and will replace the existing code quite easily
(it's already a function that adds the record and returns the PK
value).
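
In T-SQL that would presumably look something like this (names invented; SCOPE_IDENTITY() assumes the PK is an IDENTITY column):

```sql
-- Sketch of the add-a-person proc described above.
CREATE PROCEDURE dbo.usp_AddPerson
    @LastName  varchar(50),
    @FirstName varchar(50),
    @NewPK     int OUTPUT
AS
BEGIN
    INSERT INTO dbo.tblPerson (LastName, FirstName)
    VALUES (@LastName, @FirstName);
    SET @NewPK = SCOPE_IDENTITY();   -- hand the new PK back to Access
END
```
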

2. the main data person entry form is populated through a FIND
function that is based on name, and returns small groups of records
(1-100 or so, usually in the range of 3-10 records). It allows
wildcards so it's not always exact matching (and therefore, often
returning more than just 1 record). I just change the recordsource
for each search. This could surely be done with a passthrough, and
would easily replace the functionality I've already got in place,
simply by altering the SQL that is generated to be a passthrough.
Oops, no a passthrough doesn't work as a form recordsource, no? So,
it would be better as a stored procedure, I guess.
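
That is, something along these lines, though a pass-through recordsource is read-only, which may be the deciding factor for a data entry form (query, table, and form names invented):

```vba
' Sketch: rebuild a saved pass-through query for each name search.
Dim qdf As DAO.QueryDef
Set qdf = CurrentDb.QueryDefs("qryFindPerson")
' T-SQL wildcard is %, not the * that Jet uses.
qdf.SQL = "SELECT PersonID, LastName, FirstName " & _
          "FROM dbo.tblPerson WHERE LastName LIKE '" & strPattern & "%'"
Forms!frmPerson.RecordSource = "qryFindPerson"
```
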

3. in the main data entry form there is a summary calculation of
monies donated that appears at the bottom of the form. Presently
I'm using an in-memory summary recordset that has totals for each
person precalculated and then do some complicated work to update
that (having a custom collection of PKs that have totals that need
to be updated, and requerying the recordset after N additions --
messy, yes, but it improved performance a lot). This could all be
replaced with a stored procedure and would be vastly more efficient
and easy to maintain. I'd have to look into whether it makes more
sense to have return values for the subtotals, or return a
resultset and walk through that to get the values for display.
Since the number of values changes over time, it's probably better
to walk through records (unless a return value can be an array).
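
If I go the resultset route, the proc itself is trivial; T-SQL output parameters can't carry arrays, so a resultset it is (names invented):

```sql
-- Sketch: return one person's donation subtotals as a resultset.
CREATE PROCEDURE dbo.usp_PersonTotals
    @PersonID int
AS
BEGIN
    SELECT DonationYear, SUM(Amount) AS Total
    FROM dbo.tblDonation
    WHERE PersonID = @PersonID
    GROUP BY DonationYear;
END
```
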

4. the main data entry process involves creation of (but hardly any
updating of) child records of the main form. This is presently done
through a bound continuous form list of the child records linked to
an unbound subform that displays the detail. This works just fine,
as the list is noneditable, and the editable form is unbound. It
was complicated to program, though. I could easily rewrite the
retrieval and saving subroutines to operate server-side. I'd have
to decide, though, if I want to operate as I have, with Add doing
nothing but creating a new blank record and then Save updating the
existing record. I could handle that with a single stored
procedure, I guess, simply assigning the input values and if the PK
input value is null, the stored procedure will insert instead of
update. But that leads to a problem, because the OnCurrent of the
list form drives the display of the unbound detail form. I might
have to delink that. It would definitely require some rethinking of
that part of the process.
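
The single insert-or-update proc would presumably be shaped like this, with a NULL PK meaning insert (names invented):

```sql
-- Sketch of the insert-or-update proc described above.
CREATE PROCEDURE dbo.usp_SaveDonation
    @PK       int = NULL,
    @PersonID int,
    @Amount   money
AS
BEGIN
    IF @PK IS NULL
        INSERT INTO dbo.tblDonation (PersonID, Amount)
        VALUES (@PersonID, @Amount);
    ELSE
        UPDATE dbo.tblDonation
        SET PersonID = @PersonID, Amount = @Amount
        WHERE DonationID = @PK;
END
```
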

5. the main form runs a test in the OnCurrent event to check for
possible duplications. If there are possible duplicates, a
collection of the PKs of the possible duplicate records is
populated and a button RECONCILE DUPLICATES is set visible. When
that button is clicked, a form to reconcile the duplicates is
displayed, with two subforms, on the left the main record, on the
right an empty subform that is populated by a listbox at the top
which lists the PKs of the possible dups. Its OnUpdate event
populates the other subform. You can then move all the data from
one to the other (or on individual fields and subforms). I see no
reason why this duplicate reconciliation would not work as is,
since it's loading one record at a time, based on a PK. However,
the code running in the OnCurrent of the main data form would
probably be more efficient with SQL Server, though I'm not sure it
would benefit from anything server-side (could work as a stored
procedure, but probably just as easy to simply not bother).

6. the app includes a complex query-by-form interface that
currently writes data to a temp table. I see no reason to change
that, since the users click on checkmarks in the resultset to make
selections for export, etc. This QBF interface has a complex class
module that constructs the FROM and WHERE clauses dynamically based
on the criteria, and then uses that to construct the SELECT used
for the insert. How would that work in this case? Would I just pass
the dynamic SQL and assume Jet will tell the server to do the
selection and then Jet will do the insert into the temp table? This
is one where I think I'd mostly leave it alone and see if Jet is
smart enough to get it right. I see no reason why it wouldn't as
all the SQL strings are as efficient as they can be in terms of
joins and WHERE clauses. And the criteria are never based on
functions, though a few of them have wildcards (though that's no
issue except for changing the wildcard character).

OK, any comments on those things? They are the ones I can think of
off the top of my head, as somebody who plays around with SQL
Server and reads a lot about it, but hasn't been called upon to
build a full-fledged app with it.

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 12 '05 #5
rp*****@aol.com (Rich P) wrote in
<3f*********************@news.frii.net>:


You are saying that you would take an elaborate app that has been
in production use for 3 years (though not in the latest version)
and rewrite everything in it from the ground up to use ADO?

You have got to be completely nuts.

You know, there was a time not very long ago when ODBC was the only
way to connect to SQL Server from Access and people wrote lots of
very efficient applications that weren't slow at all. Maybe you
just don't know how to do that.

I'm pretty sure I can get a working app with a SQL Server back end
with ODBC a whole lot quicker than if I decided to rewrite
everything from scratch.

That said, were I starting the app right now to work against SQL
Server, then I'd go with ADO.

As to data entry, that's the one area where I'd consider ADO, at
least in one of the data entry areas (the one that's already
unbound). But I'm not sure that a stored procedure with ODBC would
work any less well.

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 12 '05 #6
"David W. Fenton" <dX********@bway.net.invalid> wrote in message
news:94***************************@24.168.128.78...
ri*********@hotmail.com (Rick Brandt) wrote in
<bn*************@ID-98015.news.uni-berlin.de>:
There may in fact be some advantages to ADO in NEW applications
(still waiting for some published data on this), but for a mature
app already fulfilling its requirements using DAO it just seems
like an incredible waste of time to attempt converting it.
Well, that's certainly my thought.

But the reason to use ADO is partly to avoid having to do a lot of
server-side stuff. I'll be administering this SQL Server app, so
that won't really be an issue, but if I can do through ADO what
would take a complex stored procedure in ODBC, maybe it would be
worth it.


But you could also just create a DAO Recordset from your ODBC link (or a
Pass-Through or an SP) and run your existing process on it. If it were a very
complex process that needed to be run against a LOT of records so that the
result of a few records could be returned, then it would make sense to build it
as an SP on the server.
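
That is, just something like this (dbSeeChanges becomes required once the SQL Server table has an IDENTITY column; names are placeholders):

```vba
' Sketch: an ordinary DAO recordset over an ODBC-linked table.
Dim rs As DAO.Recordset
Set rs = CurrentDb.OpenRecordset( _
    "SELECT PersonID, LastName FROM tblPerson WHERE LastName = 'Smith'", _
    dbOpenDynaset, dbSeeChanges)
Do While Not rs.EOF
    ' ...existing VBA processing, unchanged...
    rs.MoveNext
Loop
rs.Close
```
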

Remember that you will never pull *more* records from a SS back end than you
have been with your Jet back end. If it's been a manageable amount of traffic
with Jet it still will be with ODBC. That isn't to say that one shouldn't
optimize where it's possible, but that doesn't mean that you have to do it all
at once. You see where the most needs are and pick your battles.
In other words, where I'm contemplating replacing the existing
mechanisms with ones appropriate to SQL Server, should I not
contemplate the ADO methods for doing so?
Possibly. As stated previously, I have never used ADO. I read a lot of posts
stating that I should be, but I never see any tangible reasons. Mostly just
people who use it "because". Which is perfectly fine. If people use it only
because they are familiar with its use and not with DAO, that's a perfectly good
reason. Many such posts state either implicitly or explicitly however that ADO
is so much *better* than DAO. What is never explicit in those posts is why.
Is there a problem mixing the two in an app, using linked
tables/ODBC in some places and ADO in places where it makes things
easier?
Other than the need to disambiguate a few object types I don't know of any
reasons not to use both.

What kind of tasks would those likely be?

I've got a number of issues here:

1a. adding new people currently uses an unbound form that collects
a little data and presents a list of possible matches (to reduce
creation of duplicates; if they don't check for dupes they can't
add the record). My algorithm for evaluating closeness of the match
to what the user puts in is complex and belongs serverside, no
doubt. In thinking it through, I guess I'd really only pass the
WHERE clause that limits results to the close matches and then do
the calculation in Access using my algorithm for evaluating
closeness. I shudder to think about doing that server-side, as the
code is fairly complex. It will only be operating on small numbers
of records (always fewer than 100 or so), so I don't see an issue.
Well, writing SPs in T-SQL _is_ harder (IMO) than using VBA code, but I can also
attest that it is way easier on SS than on other platforms. I also have to
write SPs on our AS400 which uses IBM's Universal Database (close to DB2). It
is a lot closer to ANSI SQL and is not nearly as intuitive. It also doesn't
provide the nice tools that SS has for the task. After I've been working in that
for a while T-SQL and SS feels like child's play.

1b. the other part of adding a person is actually doing it. In
Access I open an AppendOnly recordset, add the record, plug in the
values, cache the PK value, save the record, close the recordset,
then open a form with the newly added record loaded. A stored
procedure with input parameters and an output value of the PK would
do this just fine, and will replace the existing code quite easily
(it's already a function that adds the record and returns the PK
value)
Yes, but again a plain ol' DAO Recordset with zero rows returned (just for
adding records) would be imperceptibly different in terms of performance.
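
Roughly like so (table and field names are placeholders, and whether the new IDENTITY value comes back through LastModified can depend on the ODBC driver version, so treat that line as an assumption):

```vba
' Sketch: append-only recordset -- no rows are fetched before the insert.
Dim rs As DAO.Recordset
Set rs = CurrentDb.OpenRecordset("tblPerson", dbOpenDynaset, _
                                dbAppendOnly Or dbSeeChanges)
rs.AddNew
rs!LastName = "Smith"
rs.Update
rs.Bookmark = rs.LastModified   ' reposition to read the new PK
lngNewPK = rs!PersonID
rs.Close
```
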
2. the main data person entry form is populated through a FIND
function that is based on name, and returns small groups of records
(1-100 or so, usually in the range of 3-10 records). It allows
wildcards so it's not always exact matching (and therefore, often
returning more than just 1 record). I just change the recordsource
for each search. This could surely be done with a passthrough, and
would easily replace the functionality I've already got in place,
simply by altering the SQL that is generated to be a passthrough.
Oops, no a passthrough doesn't work as a form recordsource, no? So,
it would be better as a stored procedure, I guess.
You can't use Pass-Throughs for sub-reports and sub-forms. They work fine for
top level forms and reports though.


Looks pretty complicated, but I still see nothing that on the face of it would
have to be rewritten. If some of these areas perform badly after the switch I
would look at them, but I wouldn't be surprised if most of it just worked.

If there are any areas that are somewhat modular, meaning that you could split them
out into a copy of the database with relative ease, then I would take one that
you think might be a good test case, put it in a separate file, redo the
whole thing using ADO, and then you can make a fair comparison to see what
benefits were realized.
--
I don't check the Email account attached
to this message. Send instead to...
RBrandt at Hunter dot com
Nov 12 '05 #7
dX********@bway.net.invalid (David W. Fenton) wrote in
news:94***************************@24.168.128.74:
If there is a SQL Server conversion, my question is this:

1. just do an ODBC upsizing, OR

2. convert the whole damned thing to ADO.


ADP?

--
Lyle
(for e-mail refer to http://ffdba.com/contacts.htm)
Nov 12 '05 #8
ri*********@hotmail.com (Rick Brandt) wrote in
<bn*************@ID-98015.news.uni-berlin.de>:
"David W. Fenton" <dX********@bway.net.invalid> wrote in message
news:94***************************@24.168.128.78...
ri*********@hotmail.com (Rick Brandt) wrote in
<bn*************@ID-98015.news.uni-berlin.de>:
>There may in fact be some advantages to ADO in NEW applications
>(still waiting for some published data on this), but for a
>mature app already fulfilling its requirements using DAO it
>just seems like an incredible waste of time to attempt
>converting it.
Well, that's certainly my thought.

But the reason to use ADO is partly to avoid having to do a lot
of server-side stuff. I'll be administering this SQL Server app,
so that won't really be an issue, but if I can do through ADO
what would take a complex stored procedure in ODBC, maybe it
would be worth it.


But you could also just create a DAO Recordset from your ODBC link
(or a Pass-Through or an SP) and run your existing process on it.
If it were a very complex process that needed to be run against a
LOT of records so that the result of a few records could be
returned, then it would make sense to build it as an SP on the
server.


That's the rule of thumb I've been using in my thinking about it in
the list of items below.

But can't ADO do quite a few things server-side that Jet/ODBC won't
do server-side?
Remember that you will never pull *more* records from a SS back
end than you have been with your Jet back end. If it's been a
manageable amount of traffic with Jet it still will be with ODBC.
That isn't to say that one shouldn't optimize where it's possible,
but that doesn't mean that you have to do it all at once. You see
where the most needs are and pick your battles.
It should be more manageable with ODBC as the indexes won't have to
be dragged across the LAN. For tables with 350K and 500K records
(the main parent and child tables, respectively) the PK index alone
is a pretty significant chunk of data. I assume, though, that Jet
pulls only as many index pages as it needs and caches what
it can. It is truly remarkable how efficient it really is.
In other words, where I'm contemplating replacing the existing
mechanisms with ones appropriate to SQL Server, should I not
contemplate the ADO methods for doing so?


Possibly. As stated previously, I have never used ADO. I read a
lot of posts stating that I should be, but I never see any
tangible reasons. Mostly just people who use it "because". Which
is perfectly fine. If people use it only because they are
familiar with its use and not with DAO, that's a perfectly good
reason. Many such posts state either implicitly or explicitly
however that ADO is so much *better* than DAO. What is never
explicit in those posts is why.


Well, that's my gut feeling as well, but I don't have anything to
back it up, other than a few people I know who were able to build
perfectly efficient and stable ODBC apps before ADO was an option.
Is there a problem mixing the two in an app, using linked
tables/ODBC in some places and ADO in places where it makes
things easier?


Other than the need to disambiguate a few object types I don't
know of any reasons not to use both.


I've been disambiguating for a long time.

And I wouldn't create a reference to ADO, anyway -- I'd use late
binding.
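For anyone following along, late binding just means declaring ADO objects As Object and creating them at runtime, so no ADO reference is needed. A minimal sketch (the connection string and table name are placeholders, not the app's actual ones):

```vba
' Late-bound ADO: no reference to the ADO library required.
Dim cnn As Object
Dim rst As Object

Set cnn = CreateObject("ADODB.Connection")
cnn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
         "Initial Catalog=MyDB;Integrated Security=SSPI;"

Set rst = CreateObject("ADODB.Recordset")
' With late binding the ADO constants aren't available, so use
' their literal values: adOpenForwardOnly = 0, adLockReadOnly = 1.
rst.Open "SELECT COUNT(*) FROM tblPerson", cnn, 0, 1
Debug.Print rst.Fields(0).Value

rst.Close
cnn.Close
```

The tradeoff is that you lose compile-time checking and the named ADO constants, in exchange for not caring which ADO version is installed on each workstation.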
What kind of tasks would those likely be?

I've got a number of issues here:

1a. adding new people currently uses an unbound form that
collects a little data and presents a list of possible matches
(to reduce creation of duplicates; if they don't check for dupes
they can't add the record). My algorithm for evaluating
closeness of the match to what the user puts in is complex and
belongs serverside, no doubt. In thinking it through, I guess
I'd really only pass the WHERE clause that limits results to the
close matches and then do the calculation in Access using my
algorithm for evaluating closeness. I shudder to think about
doing that server-side, as the code is fairly complex. It will
only be operating on small numbers of records (always fewer than
100 or so), so I don't see an issue.
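The division of labor described above -- server narrows, client scores -- might look something like this hypothetical narrowing query (table and column names are placeholders; the actual matching criteria would come from the existing algorithm):

```sql
-- The server returns a small candidate set cheaply; the complex
-- closeness scoring stays in Access code on the client.
SELECT PersonID, LastName, FirstName, BirthDate
FROM tblPerson
WHERE SOUNDEX(LastName) = SOUNDEX(@LastName)
   OR LastName LIKE @LastName + '%';
```

Since the candidate set is always under 100 rows or so, pulling it across the wire and scoring it in VBA costs almost nothing, and the hard-won scoring code doesn't have to be ported to T-SQL.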


Well, writing SPs in T-SQL _is_ harder (IMO) than using VBA code,


I certainly understand that. But if it's going to be a SQL Server
back end, I need to know how to do it, sooner rather than later.
but I can also attest that it is way easier on SS than on other
platforms. I also have to write SPs on our AS400 which uses IBM's
Universal Database (close to DB2). It is a lot closer to ANSI SQL
and is not nearly as intuitive. It also doesn't provide the nice

tools that SS has for the task. After I've been working in that
for a while, T-SQL and SS feel like child's play.
I've looked at examples in various books and mucked around with it
myself and it doesn't look all that complicated, though it's also
not a triviality.
1b. the other part of adding a person is actually doing it. In
Access I open an AppendOnly recordset, add the record, plug in
the values, cache the PK value, save the record, close the
recordset, then open a form with the newly added record loaded.
A stored procedure with input parameters and an output value of
the PK would do this just fine, and will replace the existing
code quite easily (it's already a function that adds the record
and returns the PK value)
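A sketch of the insert-and-return-PK procedure described above (table, column, and procedure names are placeholders):

```sql
CREATE PROCEDURE dbo.AddPerson
    @LastName  varchar(50),
    @FirstName varchar(50),
    @NewPK     int OUTPUT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO tblPerson (LastName, FirstName)
    VALUES (@LastName, @FirstName);
    -- SCOPE_IDENTITY() returns the identity value generated by
    -- this INSERT, unaffected by triggers on other tables.
    SET @NewPK = SCOPE_IDENTITY();
END
```

From VBA this maps naturally onto the existing function: execute the procedure (e.g. via an ADO Command with an output parameter, or a passthrough that SELECTs the new key) and return @NewPK where the DAO code currently returns the cached PK.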


Yes, but again a plain ol' DAO Recordset with zero rows returned
(just for adding records) would be imperceptibly different in
terms of performance,


But it holds locks on the server side, no? And I'd like to avoid
that.
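For reference, the zero-row approach being discussed is just dbAppendOnly against the ODBC-linked table, which fetches no existing rows, so there is nothing to lock until the new row is inserted. A minimal sketch (field values are placeholders):

```vba
Dim db As DAO.Database
Dim rs As DAO.Recordset

Set db = CurrentDb()
' dbAppendOnly: open with no existing rows, for inserts only.
' dbSeeChanges is required on ODBC tables with identity columns.
Set rs = db.OpenRecordset("tblPerson", dbOpenDynaset, _
                         dbAppendOnly + dbSeeChanges)
rs.AddNew
rs!LastName = "Smith"
rs!FirstName = "Jan"
rs.Update
rs.Close
```

Whether this holds any meaningful server-side locks compared with a stored procedure is exactly the kind of thing worth testing after the upsize rather than assuming.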
2. the main data person entry form is populated through a FIND
function that is based on name, and returns small groups of
records (1-100 or so, usually in the range of 3-10 records). It
allows wildcards so it's not always exact matching (and
therefore, often returning more than just 1 record). I just
change the recordsource for each search. This could surely be
done with a passthrough, and would easily replace the
functionality I've already got in place, simply by altering the
SQL that is generated to be a passthrough. Oops, no, a
passthrough doesn't work as a form recordsource, does it? So, it
would be better as a stored procedure, I guess.


You can't use Pass-Throughs for sub-reports and sub-forms. They
work fine for top level forms and reports though.


OK, then a passthrough for the main form's recordsource, set on
each FIND operation would work fine. I wasn't worrying about the
subforms, as I don't see any reason why those (which are filtered
on their main foreign key) wouldn't work straightaway with the ODBC
conversion.
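Mechanically, that FIND operation could be a saved pass-through query whose SQL is rewritten on each search and which serves as the form's recordsource. A sketch, where qryFindPT, the connect string, and the search value are all placeholders:

```vba
Dim qdf As DAO.QueryDef

Set qdf = CurrentDb().QueryDefs("qryFindPT")
qdf.Connect = "ODBC;DSN=MyDSN;Database=MyDB;Trusted_Connection=Yes;"
qdf.ReturnsRecords = True
' Build the search SQL server-side; T-SQL uses LIKE with %.
qdf.SQL = "SELECT PersonID, LastName, FirstName " & _
          "FROM dbo.tblPerson WHERE LastName LIKE 'Fent%'"
Set qdf = Nothing

' Reassigning the recordsource requeries the form.
Me.RecordSource = "qryFindPT"
```

Note that a pass-through is read-only; if the main form needs to be editable, a dynaset over the linked table (or an updatable view) would be needed instead.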

[further examples deleted]
OK, any comments on those things? They are the ones I can think
of off the top of my head, as somebody who plays around with SQL
Server and reads a lot about it, but hasn't been called upon to
build a full-fledged app with it.


Looks pretty complicated, but I still see nothing that on the face
of it would have to be rewritten. If some of these areas perform
badly after the switch I would look at them, but I wouldn't be
surprised if most of it just worked.


A couple of them perform badly already, so I'd like to hand those
off to the server to see if they get better.
If there are any areas that are somewhat modular, meaning that you
could split them out into a copy of the database with relative
ease, then I would take one that you think might be a good test
case, put it in a separate file, and redo the whole thing using
ADO. Then you can make a fair comparison to see what benefits were
realized.


Well, none of the above examples was at all suggesting rewriting as
ADO -- I was just discussing where to convert to something
server-side (a passthrough, a stored procedure).

Sounds like your recommendation is to upsize and test and then work
from there.

That's definitely a philosophy that's completely in line with my
usual practices.

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 12 '05 #9
Mi************@ Invalid.Com (Lyle Fairfield) wrote in
<Xn************ *******@130.133 .1.4>:
dX********@bwa y.net.invalid (David W. Fenton) wrote in
news:94******* *************** *****@24.168.12 8.74:
If there is a SQL Server conversion, my question is this:

1. just do an ODBC upsizing, OR

2. convert the whole damned thing to ADO.


ADP ?


Why?

The app exists as an MDB.

It's been tested and in use for years, with only a few small parts
with anything in them that is less than a year old.

Rewriting as an ADP means starting from scratch in an environment I
know zilch about, and one that is buggy and inconsistent.

Why in the world would you suggest converting an existing complex
application to an ADP? The only scenario I can think of would be if
it was going from the current 15-user population to, say, 1500.
That simply isn't going to happen, ever. And I'm not even sure that
in that case it would be justified, either.

You hate Jet so much that you'd suggest rebuilding the whole app
from scratch? You'd suggest that they spend another $50K on this
app to get something that I'm not even certain they need?

These are questions, so I don't know if you feel that way or not,
but if you do, well, I really don't think very highly of your
degree of intelligence if you believe such expenditures are
justified, in any case with an existing MDB of any degree of
complexity.

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 12 '05 #10
