Bytes | Software Development & Data Engineering Community

Philly-To-London via Intranet?

My Philadelphia, PA (USA) client has a branch office in London that wants to use
one of the applications I've developed for them.

Staying with an MS Access front end,
I see 4 possibilities:
--------------------------------------------------
1) Have London VPN into a PC in Philly. Virtually
a sure thing at minimal cost - just a few extra
PCs on our end.

2) Leave everything as-is and see if the front end
can connect over the intranet without hopelessly
bogging down. I'm not real hopeful, but the cost
to try it seems near zero: I just VPN to one of
their boxes and give it a shot from London.

3) Move the back end to SQL Server, but still ODBC
to the tables. I don't know enough to judge
whether this is even worth trying. Anybody?

4) Move the back end to SQL Server and replace all
the application's JET queries/ODBC connections
with ADO calls to stored procedures. This seems
like the Good-Right-And-Holy Path - but probably
more expensive than the clients will want.
---------------------------------------------------
--
PeteCresswell
Apr 6 '06 #1
Per (PeteCresswell):
I see 4 possibilities:


Oops... make it 5:
-------------------------------------------------
5) Ship the whole system (front end, back end,
execution .BAT file...) to London and let
them run it independently there - but synchronize
the DB's nightly with a one-way copy (Philly => London).

This was the client's off-the-cuff request.

I don't like it because
- Long-term it seems like an ongoing daily task
- Somebody in London is sure to add their own data
to their copy of the back end only to have it go
"poof!" when the DB is replaced overnight - and
then request changes to the app that will let them
synchronize the DB both ways.
-------------------------------------------------
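[The overwrite risk described above is easy to guard against. A minimal Python sketch of the nightly one-way push — file names are hypothetical — that refuses to clobber a replica that has changed locally since the last push:]

```python
import hashlib
import shutil
from pathlib import Path

def nightly_push(master: Path, replica: Path, last_pushed: Path) -> bool:
    """One-way copy master -> replica (Philly => London), but bail out
    if the replica was modified locally since the last push, so London's
    edits are not silently destroyed."""
    def digest(p: Path) -> str:
        return hashlib.sha256(p.read_bytes()).hexdigest()

    if replica.exists() and last_pushed.exists():
        # Replica differs from what we last shipped: someone on the
        # receiving end added data to their copy. Refuse to overwrite.
        if digest(replica) != last_pushed.read_text():
            return False

    shutil.copy2(master, replica)
    last_pushed.write_text(digest(master))  # remember what we shipped
    return True
```

[A scheduled task would call this nightly; a False return flags the "they added their own data" case for a human to resolve.]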
--
PeteCresswell
Apr 7 '06 #2
I'd try the VPN.

I had a client who wanted to have tighter security on an Access db 'cuz
they were going to VPN the branch offices into the main office. I
changed the back-end to SQL Server and all the queries to SQL pass-thru
queries (used DAO instead of ADO & an .adp). It took about 4 weeks to
convert all the queries to Views and Stored Procedures and to re-write
some set up routines, and then to transfer the data into SQL. Then it
took the Network people about 2 months to set up the VPN!!

I realized it would have been more efficient, money-wise, to just keep
the Access user security & let them VPN into the front-ends. So far the
VPN has been acceptable (at least I've not heard any complaints) - this
is a very low use DB - only sporadic heavy use at beginning of month.

You do know that each sign-in has to have their own copy of the
front-end (in a separate folder) on the server they are VPNing into?
--
MGFoster:::mgf00 <at> earthlink <decimal-point> net
Oakland, CA (USA)

(PeteCresswell) wrote:
My Philadelphia, PA (USA) client has a branch office in London that wants to use
one of the applications I've developed for them.
[rest of original post snipped]

Apr 7 '06 #3
Per MGFoster:
You do know that each sign-in has to have their own copy of the
front-end (in a separate folder) on the server they are VPNing into?


I wasn't even thinking of something as sophisticated as a server - just an extra
dedicated PC for each user.

The server thing sounds like it would break my little deployment/work table
scheme.

All work tables are in C:\Temp - as is the local copy of the front end, which
gets downloaded/updated automagically by a .BAT file.
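[The update-if-newer logic a deployment script like that typically performs can be sketched as follows — in Python here rather than batch, with hypothetical paths:]

```python
import shutil
from pathlib import Path

# Hypothetical locations: master copy on the file server, user's
# working copy in C:\Temp.
MASTER = Path(r"\\server\apps\MyApp.mde")
LOCAL = Path(r"C:\Temp\MyApp.mde")

def update_front_end(master: Path = MASTER, local: Path = LOCAL) -> bool:
    """Pull down the master front end if the local copy is missing or
    older; return True if a fresh copy was made."""
    if local.exists() and local.stat().st_mtime >= master.stat().st_mtime:
        return False  # local copy is already current
    local.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(master, local)  # copy2 preserves the master's timestamp
    return True
```

[Because copy2 preserves the modification time, the next run sees the copies as equal and skips the download.]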

OTOH, nothing's impossible.... and maybe it would be beneficial to me to bite
the bullet, come to understand how the server environment works, and re-code
the execution stuff to deal with multiple identities on a single server.

Or... is there maybe some way for the server to define a virtual C: drive for
each user?
--
PeteCresswell
Apr 7 '06 #4
(PeteCresswell) wrote:
Per MGFoster:
You do know that each sign-in has to have their own copy of the
front-end (in a separate folder) on the server they are VPNing into?

I wasn't even thinking of something as sophisticated as a server - just an extra
dedicated PC for each user.

The server thing sounds like it would break my little deployment/work table
scheme.

All work tables are in C:\Temp - as is the local copy of the front end, which
gets downloaded/updated automagically by a .BAT file.

OTOH, nothing's impossible.... and maybe it would be beneficial to me to bite
the bullet, come to understand how the server environment works, and re-code
the execution stuff to deal with multiple identities on a single server.

Or... is there maybe some way for the server to define a virtual C: drive for
each user?


I don't know if it HAS to be on a server; that's just the way the
Network people set it up.
Apr 7 '06 #5

"(PeteCresswell)" <x@y.Invalid> wrote

Others can advise you better than I about VPNing, but it seems a likely
"winner".
3) Move the back end to SQL Server, but still ODBC
to the tables. I don't know enough to judge
whether this is even worth trying. Anybody?
I've worked with Access clients using DAO/ODBC to various server databases
over WANs, since back in Access 2.0 days. When the users were connected via
a 56Kb leased line, with multiple users sharing the line, performance was
lousy. When the primary users (the ones who did most of the data
entry/updates) were put on T-1 lines shared between multiple users, they
were "happy campers". Another large user contingent, almost entirely just
reading and reporting, was on whatever kind of WAN the client corporation
had set up, but not all their locations had T-1 lines.

Larry Linson
Microsoft Access MVP


Apr 7 '06 #6
"(PeteCresswell)" <x@y.Invalid> wrote in
news:fk********************************@4ax.com:
1) Have London VPN into a PC in Philly. Virtually
a sure thing at minimal cost - just a few extra
PCs on our end.


With Windows Terminal Server this is by far the easiest thing to do.
That is the only way I'd implement branch office support of an app
with shared data these days.

Back in 1998, I had a client with offices in NYC and London (and two
outside consultants working at two separate locations in
Connecticutt). We did it with replication. We rejected
Citrix/Terminal Server as too expensive (it was coming in at
$900/remote user for software alone, not including
telecommunications costs).

Today, the NYC office would host a Terminal Server and remote
offices/users would VPN into the NYC office and run the app on the
Terminal Server. Given that the NYC office had a T1 by the year
2000, this would have worked extremely well.

I wouldn't consider any other option at all. Terminal Server is just
way too easy.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 7 '06 #7
"(PeteCresswell)" <x@y.Invalid> wrote in
news:55********************************@4ax.com:
5) Ship the whole system (front end, back end,
execution .BAT file...) to London and let
them run it independently there - but synchronize
the DB's nightly with a one-way copy (Philly => London).


Indirect replication is also an option, but Terminal Server would be
far easier to implement.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 7 '06 #8
"(PeteCresswell)" <x@y.Invalid> wrote in
news:gr********************************@4ax.com:
The server thing sounds like it would break my little
deployment/work table scheme.

All work tables are in C:\Temp - as is the local copy of the front
end, which gets downloaded/updated automagically by a .BAT file.


In a Terminal Server environment, you could just put it in the
location referred to by the %TEMP% variable, which will be in the
profile of each user.
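[The %TEMP% point in practice: each Terminal Server session resolves environment variables inside its own profile, so the same lookup yields a different folder per user. A quick Python illustration (the Access app itself would do the equivalent with Environ("TEMP") in VBA; the MDB name follows the C:\Temp\AppNameTemp.mdb convention mentioned later in the thread):]

```python
import os
import tempfile
from pathlib import Path

def work_db_path(app_name: str = "AppNameTemp") -> Path:
    """Resolve a per-user location for the work-table MDB instead of
    hard-coding C:\\Temp. On a terminal server, %TEMP% points inside
    each user's own profile, so concurrent users never collide."""
    base = os.environ.get("TEMP") or tempfile.gettempdir()
    return Path(base) / f"{app_name}.mdb"
```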

Since Win2K, no one should be placing any files anywhere but in the
approved locations. For data, that's the user profile; for programs,
that's the Program Files folder. It would seem that an Access app is a
program, so you'd put it in the Program Files folder, but that is not
a good approach, as the MDB needs to be read/write, and by default
from Win2K on, the Program Files folder is READ-ONLY for
non-administrative users. By putting your app in the Program Files
folder, you're forcing your users to run as administrative users
(which is very bad from a security standpoint) or forcing an
administrator to change the default permissions on the folder where
your app is installed. It's better to put the MDB in user space
instead.

This does make relinking somewhat more difficult, in that you have to
do it for each user installation (and multiple users of a single PC
will have multiple copies of the front end), but that is the proper
way to ensure that your app will work for people logging in as users
and not as administrators. It's the proper way to engineer
applications for Windows.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 7 '06 #9
Per David W. Fenton:
Indirect replication is also an option, but Terminal Server would be
far easier to implement.


Not sure of the distinction between "indirect" and just "replication"... but
what I heard from one of our illuminati was that when replication goes bad, it
*really* goes bad... dunno exactly what that meant, but it kind of scared me
off.
--
PeteCresswell
Apr 8 '06 #10
Per David W. Fenton:
Today, the NYC office would host a Terminal Server and remote
offices/users would VPN into the NYC office and run the app on the
Terminal Server.


Is there a reason for using a Terminal Server instead of "N" PCs in a closet
somewhere - besides unit cost of the PCs and maybe added electrical consumption?
Seems like with a Terminal Server (whatever that is.... but I'm guessing it's
some sort of PC running Windows Server...) the users would be at the mercy of
whoever configs the server and/or what applications/users are also hitting on it
besides them. Also if the server goes down, everybody's dead in the water
until somebody brings it back up or builds a new one.

My guys value independence highly. I tried to talk them into a SQL Server back
end for this thing on Day 1 and independence from IT and/or the LAN folks was
the reason they gave for rejecting it. I suspect that they'd react similarly
to a Terminal Server unless there was some reason for using it over individual
PCs.
--
PeteCresswell
Apr 8 '06 #11
"(PeteCresswell)" <x@y.Invalid> wrote in
news:u6********************************@4ax.com:
Per David W. Fenton:
Indirect replication is also an option, but Terminal Server would
be far easier to implement.


Not sure of the distinction between "indirect" and just
"replication"... but what I heard from one of our illuminati was
that when replication goes bad, it *really* goes bad... dunno
exactly what that meant, but it kind of scared me off.


Well, I don't know who that might have been, other than someone who
is truly ignorant of Jet replication.

It only goes bad if you do things wrong. This can include,
unfortunately, following recommendations in Microsoft's own
documentation (they still recommend replicating front ends as a
method of pushing out changes, even though that never works in the
long run -- replication works well only for data tables and queries,
i.e., the pure Jet objects in Access).

Replication does require knowledge to manage and care in
administration. But with that, it runs quite reliably.

As to "indirect" replication, it is a form of replication
synchronization distinct from DIRECT replication. Direct is what you
get when you synchronize from the Access user interface, and it
opens the full remote database across your network connection. This
works fine on a LAN, but on a WAN or a dialup connection, is fatally
dangerous (if the connection is dropped, at the very least, the
remote replica will lose replicability, which means you'll have to
manually recover the data changes in it and then replace it with a
new replica).

Indirect synchronization uses a process running on each end of the
synch to drop message files into a dropbox at the partner replica.
These message files detail the changes to data only, and are thus
very efficient. The process running on each machine reads and
applies the message files and then communicates with the other
machine by dropping its own messages in the remote dropbox. This
method is very safe and very efficient. However, it does require
care to keep running and has some drawbacks on servers (the
synchronizer process can't be run as a service, for instance).
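[As a toy illustration of the dropbox idea — not Jet's actual message format, which is internal to the synchronizer — each side writes small change-message files into the partner's dropbox folder, and a process on each side applies whatever messages it finds:]

```python
import json
from pathlib import Path

def drop_message(dropbox: Path, changes: dict, seq: int) -> Path:
    """Write one change message into the partner's dropbox.
    Messages carry only the changed rows, so they stay small."""
    dropbox.mkdir(parents=True, exist_ok=True)
    msg = dropbox / f"{seq:08d}.json"
    msg.write_text(json.dumps(changes))
    return msg

def apply_messages(dropbox: Path, table: dict) -> int:
    """Apply pending messages in sequence order, then delete them;
    return how many were applied."""
    applied = 0
    for msg in sorted(dropbox.glob("*.json")):
        table.update(json.loads(msg.read_text()))
        msg.unlink()
        applied += 1
    return applied
```

[The safety property falls out of the design: a dropped connection just leaves a message file undelivered, to be picked up on the next pass, instead of corrupting an open remote database.]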

Terminal Server is vastly preferable when you're dealing with fixed
offices that need to share data. Replication is very helpful for
roaming users who can't count on being able to connect reliably or
cheaply to the Internet to run the app on a Terminal Server. If the
remote users can wait to synch when they're physically in the office
and connected to the LAN, direct replication will suffice. If they
need to synch from the field, then they need indirect replication.

There's also a variety of synchronization called "Internet"
synchronization, which uses FTP rather than an SMB connection to
drop the message files in the remote dropboxes. But it is hard-wired
to depend on using IIS's FTP server on both ends of the synch, which
adds an additional dependency and a new layer of potential security
problems. Now that VPNs are pretty common, indirect replication over
a VPN running across the Internet is the easiest method for
accomplishing indirect replication, and Internet replication is
really not needed these days.

In the days when full-time Internet connections were rare and
Terminal Server cost a fortune, indirect replication was a great way
to have remote offices sharing data. But now its main purpose is
supporting roaming users who need to edit data while disconnected.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 8 '06 #12
"(PeteCresswell)" <x@y.Invalid> wrote in
news:cj********************************@4ax.com:
Per David W. Fenton:
Today, the NYC office would host a Terminal Server and remote
offices/users would VPN into the NYC office and run the app on the
Terminal Server.
Is there a reason for using a Terminal Server instead of "N" PCs
in a closet somewhere - besides unit cost of the PCs and maybe
added electrical consumption?


Well, most clients who would be in this situation would already have
a file server. If it's Win2K Server or Win2K3 Server, Terminal
Server is a built-in feature, so why allocate a bunch of PCs for
this purpose?
Seems like with a Terminal Server (whatever that is.... but I'm
guessing it's some sort of PC running Windows Server...) the
users would be at the mercy of whoever configs the server and/or
what applications/users are also hitting on it besides them. . . .
Well, yes, I guess, but only in the same sense that users of a
website are at the mercy of the website's sysadmin, or users of a
file server are at the mercy of the administrator of the file
server.

As to performance, you have to allocate sufficient bandwidth and
RAM, but that's not a real issue until you're looking to support
dozens of simultaneous users. And if you compare the incremental
cost of adding a user to a terminal server to the cost of setting up
a dedicated PC for them, then the terminal server wins by a long
shot.

It also means that administration of the Access app is centralized,
on one machine, rather than spread over a bunch of workstations.
. . . Also if the server goes down, everybody's dead in the water
until somebody brings it back up or builds a new one.
That is no different than with a file server or a web server, so I
just don't see the point in bringing it up as an issue. A Terminal
Server is going to be as reliable as any Windows server.

I do all my work for several clients remotely on their Terminal
Server, and other than the client who has only the two default admin
terminal server sessions allowed, I never have problems connecting.
The servers are up 24/7, and there are no problems. The only issue
is when the TS licenses are already maxed out, and I run into that
only on the machines that have only the default two admin sessions
(i.e., they haven't bought additional licenses). If you're
supporting multiple users from a remote office, you'd simply buy
sufficient licenses to avoid that problem. At c. $40 per user for
that, it's really not very expensive at all.
My guys value independence highly. I tried to talk them into a
SQL Server back end for this thing on Day 1 and independence from
IT and/or the LAN folks was the reason they gave for rejecting it.
I suspect that they'd react similarly to a Terminal Server
unless there was some reason for using it over individual PCs.


The SQL Server could go down.

There really is *no* difference in terms of server reliability
between a SQL Server solution and a Terminal Server solution. But
the TS solution makes administration substantially easier (the
remote users don't have to have your app installed on their
computer).

You really need to go to Albert Kallal's website and read his
thoughts on terminal server.

And you really need to read up about it. It is a very reliable
solution and extremely easy to set up and manage. It's also
remarkably inexpensive to implement, especially when you're
migrating remote users to it who already have Office installed on
their local PCs (which is required for running Office apps on the
terminal server).

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 8 '06 #13
Per David W. Fenton:
That is no different than with a file server or a web server, so I
just don't see the point in bringing it up as an issue. A Terminal
Server is going to be as reliable as any Windows server.


My concern would be that on the troubleshooting group's radar, the terminal
server might look different: only a few people affected, maybe not covered by
whatever monitoring setup they have, possibly not understood by the
rank-and-file.
--
PeteCresswell
Apr 8 '06 #14
"(PeteCresswell)" <x@y.Invalid> wrote in
news:i0********************************@4ax.com:
Per David W. Fenton:
That is no different than with a file server or a web server, so I
just don't see the point in bringing it up as an issue. A Terminal
Server is going to be as reliable as any Windows server.


My concern would be that on the troubleshooting groups radar, the
terminal server might look different: only a few people affected,
maybe not covered by whatever monitoring setup they have, possibly
not understood by the rank-and-file.


Huh?

If they have any Win2K or Win2K3 servers, they already have a
Terminal Server. The default configurations provide two licenses
that are only usable by administrators. If you add licenses, you can
have as many people using the Terminal Server as there are licenses
and resources on the server to support them.

Administering Terminal Server is trivial (though there are some
tricky gotchas with the Win2K3 license server).

I think you're assuming it's some kind of special box, whereas any
existing Win2K or Win2K3 is already a Terminal Server, by default.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 8 '06 #15
Per David W. Fenton:
I think you're assuming it's some kind of special box, whereas any
existing Win2K or Win2K3 is already a Terminal Server, by default.


I was - because it seemed like a multi-user terminal server would have to have a
lot of horsepower to handle multiple concurrent instances of MS Access. Seems
like that's a bigger load than just somebody having, say, 10 apps open on their
desktop because that person can only be exercising one app at a time - whereas
with the terminal server, it would have to be servicing everybody who is logged
on more-or-less simultaneously.
--
PeteCresswell
Apr 8 '06 #16
"(PeteCresswell)" <x@y.Invalid> wrote in message
news:87********************************@4ax.com...
[quoted text snipped]

That's true, but most PCs these days are ridiculously more powerful than the
needs of the user, so you would be surprised at the specs that will run 20 - 40
Terminal Server users. The box can't be a slouch, but it need not be Deep Blue
either.
--
Rick Brandt, Microsoft Access MVP
Email (as appropriate) to...
RBrandt at Hunter dot com
Apr 8 '06 #17
Per Rick Brandt:
That's true, but most PCs these days are ridiculously more powerful than the
needs of the user so you would be surprised at the specs that will run 20 - 40
Terminal Server users. The box can't be a slouch, but it need not be deep blue
either.


That's encouraging. Any idea what the incremental per-seat cost is?

I'm guessing there's at least one free seat by default because tech support
people VPN into users machines and I'm able to VPN into my client's machine from
home.
--
PeteCresswell
Apr 8 '06 #18
"(PeteCresswell)" <x@y.Invalid> wrote in message
news:v3********************************@4ax.com...
[quoted text snipped]


My understanding is that you get one or two free Admin connections that are not
intended to be used while a person is actually sitting at the PC (something you
wouldn't do with a TS anyway). Non-admin users have to have a CAL and a license
for whatever software they actually run on the server. If using Citrix (as we
do), then each user needs the Citrix client and a license for same.

--
Rick Brandt, Microsoft Access MVP
Email (as appropriate) to...
RBrandt at Hunter dot com
Apr 8 '06 #19
"Rick Brandt" <ri*********@hotmail.com> wrote in
news:Fs******************@newssvr27.news.prodigy.net:
[earlier quoted text snipped]
My understanding is that you get one or two free Admin connections
that are not intended to be used while a person is actually
sitting at the PC (something you wouldn't do with a TS anyway).
Non Admin users have to have a CAL and license to whatever
software they actually run on the server. If using Citrix (as we
do) then each users needs the Citrix client and a license for
same.


Ack, no this is not right.

On WinXP, the remote desktop connection cannot be used while a
person is working at the console, but a Windows Terminal Server is a
completely different animal (though the underlying technology used
for both is the same).

Starting with Windows 2000 Server, Microsoft now installs Terminal
Server as a default (NT 4 Server could not be upgraded to Terminal
Server -- you had to install a special version of NT 4 for Terminal
Services; now it's built into the base server OS), and it provides a
license for 2 TS logons. These are real terminal session logons,
meaning their sessions are independent of the console session.
Someone can be logged on at the server console and two other people
can simultaenously run independent terminal sessions. The only
limitation is that those default licenses can only be used by
someone who is in the administrators group, not by regular users.
This makes it unsuitable for allowing two regular users to run TS
sessions, because they'd have full administrative rights on the
server.

The purpose of these two included TS licenses is:

1. to allow remote administration.

2. to demonstrate the capabilities of Windows Terminal Server.

A client of mine set up a box to serve primarily as a terminal
server for 10 offsite users (though the box is also running the
Blackberry Enterprise Server, and MSDE to support the Blackberry
server -- that's a fairly low performance operation). They bought a
Compaq Xeon 2.4GHz 1U server and put an inexpensive 75GB hard drive
in it (it doesn't need much local filespace, as that's handled by a
file server with 100s of GBs of space in a RAID array), and 2GBs of
RAM. The CALs were $32 each (educational discount), and they bought
a 10-pack. They already had Open Licenses for all the copies of
Windows 2003 Server and Office that they needed, so there was no
additional cost there.

The total cost of building this box and outfitting it for use was
under $3,000, and it's a brilliant machine, with plenty of room to
grow.

They already had the bandwidth to support it, but access speeds were
nicely boosted when their ISP put in a gigabit switch for them.
Their remote sites are using 384K DSL to connect to it, and
performance is quite good. I've used broadband and dialup to connect
to it and both work very well.

The administrative cost of deploying the same apps on workstations
at the client's 5 sites would have been huge, even leaving aside the
issue of how to share the data. Terminal Server makes it incredibly
easy to support these remote users (currently, the remote users are
running one Access app and QuickBooks).

I would not hesitate to double up a file server and terminal server,
as long as the number of users of the file server is less than 25 or
so, and the number of terminal server users is 10 or fewer. I would
not want to run a terminal server on a box that was running Exchange
Server, but that's mostly because of the problems with keeping
Exchange running smoothly more than it is with the requirements of
running WTS.

Your alternative of supporting a whole bunch of boxes in a closet
somewhere is completely ludicrous by comparison. If you're going to
go that route, you could look at blade workstations instead. But
none of that makes any sense when Terminal Server is so easy to
implement.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 8 '06 #20
Per David W. Fenton:
Terminal Server is so easy to
implement.


Can you point me to someplace that describes the drive naming conventions I'll
have to deal with? I'm assuming that I won't be able to fake a drive named
"C:" for each user.
--
PeteCresswell
Apr 8 '06 #21
Bri


(PeteCresswell) wrote:
Per David W. Fenton:
I think you're assuming it's some kind of special box, whereas any
existing Win2K or Win2K3 is already a Terminal Server, by default.

I was - because it seemed like a multi-user terminal server would have to have a
lot of horsepower to handle multiple concurrent instances of MS Access. Seems
like that's a bigger load than just somebody having, say, 10 apps open on their
desktop because that person can only be exercising one app at a time - whereas
with the terminal server, it would have to be servicing everybody who is logged
on more-or-less simultaneously.


I have a client that had just upgraded their main server about the time
I was recommending they test TS. They decided to use the old
server as the TS server for the test. It was a P3-366 dual processor
with 384 MB RAM. We put 5 TS users on it and the response was quite
reasonable. Some of the heavy processing parts of the app were slower
than on their regular desktop, but these parts are used relatively
infrequently. Regular data entry and report creation was as fast as it
was on their local PCs.

--
Bri

Apr 8 '06 #22
"(PeteCresswell)" <x@y.Invalid> wrote in message
news:77********************************@4ax.com...
Per David W. Fenton:
Terminal Server is so easy to
implement.


Can you point me to someplace that describes the drive naming conventions I'll
have to deal with. I'm assuming that I won't be able to fake a drive named
"C:" for each user.


We put each user's file into their {Profile} folder. The only other "drive"
references my apps ever make are relative to the folder the MDE resides in or
UNC paths so that is not affected. They have no access to any folders on the
TS.

IIRC their own local drives show up in browse dialogs labeled as (Users C:) or
similar so that they can save or reference files on their local system.

--
Rick Brandt, Microsoft Access MVP
Email (as appropriate) to...
RBrandt at Hunter dot com
Apr 8 '06 #23
Per Rick Brandt:
We put each user's file into their {Profile} folder. The only other "drive"
references my apps ever make are relative to the folder the MDE resides in or
UNC paths so that is not affected.


I've got work tables that are ODBC-connected to C:\Temp\AppNameTemp.mdb.
--
PeteCresswell
Apr 8 '06 #24
"(PeteCresswell)" <x@y.Invalid> wrote in message
news:ha********************************@4ax.com...
Per Rick Brandt:
We put each user's file into their {Profile} folder. The only other "drive"
references my apps ever make are relative to the folder the MDE resides in or
UNC paths so that is not affected.


I've got work tables that are ODBC-connected to C:\Temp\AppNameTemp.mdb.


Couldn't you just as easily create a Temp folder in the home folder of the app?

--
Rick Brandt, Microsoft Access MVP
Email (as appropriate) to...
RBrandt at Hunter dot com
Apr 9 '06 #25
"(PeteCresswell)" <x@y.Invalid> wrote in
news:77********************************@4ax.com:
Per David W. Fenton:
Terminal Server is so easy to
implement.


Can you point me to someplace that describes the drive naming
conventions I'll have to deal with. I'm assuming that I won't be
able to fake a drive named "C:" for each user.


Huh?

User profiles are stored in the same location on Terminal Server as
in other versions of Windows, on the system drive under Documents
and Settings. There are several environment variables to choose from
for finding the user's profile location (which could also be on
a file server, if the domain is configured that way).
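A minimal VBA sketch of resolving a per-user writable folder from those environment variables (the function name and folder layout are illustrative, not from the thread):

```vba
' Resolve a per-user writable folder for the app, falling back from
' APPDATA to USERPROFILE, so nothing depends on write access outside
' the user's profile.
Public Function UserAppFolder(ByVal AppName As String) As String
    Dim strBase As String
    strBase = Environ$("APPDATA")      ' e.g. ...\Documents and Settings\<user>\Application Data
    If Len(strBase) = 0 Then strBase = Environ$("USERPROFILE")
    UserAppFolder = strBase & "\" & AppName
    ' Create the folder on first use.
    If Len(Dir$(UserAppFolder, vbDirectory)) = 0 Then MkDir UserAppFolder
End Function
```

On a Terminal Server, each session gets its own values for these variables, so every user lands in a distinct folder without any per-user configuration.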

That's the only place you should be deploying files, Terminal Server
or not. Anything else is going to be dependent on your users having
administrative permissions, or on setting custom user-level
permissions on your chosen location.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 9 '06 #26
"Rick Brandt" <ri*********@hotmail.com> wrote in
news:Fx*****************@newssvr27.news.prodigy.net:
"(PeteCresswell)" <x@y.Invalid> wrote in message
news:77********************************@4ax.com...
Per David W. Fenton:
Terminal Server is so easy to
implement.
Can you point me to someplace that describes the drive naming
conventions I'll have to deal with? I'm assuming that I won't
be able to fake a drive named "C:" for each user.


We put each user's file into their {Profile} folder. The only
other "drive" references my apps ever make are relative to the
folder the MDE resides in or UNC paths so that is not affected.
They have no access to any folders on the TS.


One could also put the front ends in a non-profile folder on a
user-writable share. For instance, you might have a second volume on
your terminal server and have a folder dedicated to storing the user
front ends, with folders named based on the user logon name, for
instance.
IIRC their own local drives show up in browse dialogs labeled as
(Users C:) or similar so that they can save or reference files on
their local system.


That's dependent on the settings they have in place on the
Remote Desktop Client when they initiate their connection to the
server. By default, local drives are *not* mapped to the remote
connection -- you have to turn that on explicitly.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 9 '06 #27
"(PeteCresswell)" <x@y.Invalid> wrote in
news:ha********************************@4ax.com:
Per Rick Brandt:
We put each user's file into their {Profile} folder. The only
other "drive" references my apps ever make are relative to the
folder the MDE resides in or UNC paths so that is not affected.


I've got work tables that are ODBC-connected to
C:\Temp\AppNameTemp.mdb.


You'd need to relink to %TEMP%\AppNameTemp.mdb for each user, as
each user needs their own copy of the front end, and of any temp
databases.
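A rough sketch of that per-user relink in DAO, assuming the work tables are Jet-linked and share a naming prefix (the "tmp" prefix is an assumption for illustration; the file name comes from the thread):

```vba
' On startup, point every work-table link at a per-user temp MDB
' under %TEMP%, so concurrent Terminal Server users don't collide.
Public Sub RelinkWorkTables()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Dim strTarget As String

    strTarget = Environ$("TEMP") & "\AppNameTemp.mdb"
    Set db = CurrentDb()
    For Each tdf In db.TableDefs
        If Left$(tdf.Name, 3) = "tmp" Then    ' work tables only
            tdf.Connect = ";DATABASE=" & strTarget
            tdf.RefreshLink
        End If
    Next tdf
End Sub
```

This assumes the temp MDB has already been created (or copied from a template) at that path before the relink runs.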

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 9 '06 #28
Per David W. Fenton:
That's the only place you should be deploying files, Terminal Server
or not. Anything else is going to be dependent on your users having
administrative permissions, or on setting custom user-level
permissions on your chosen location.


In non-terminal server situations I've had no problems with using C:\Temp -
mostly with users who are not in the Admin group. Dozens of apps.

Started doing it with Access 2.0 and never saw a reason to change.

This is all stuff that gets created on-the-fly when the app is opened and which
goes "poof!" when the app closes.

Doesn't seem like it would be all that much coding to re-link all the work
tables every time the user opens the app - and I guess it would run fairly
quickly, given that after the first time we'd mostly just check the links and not
have to re-link... but I never saw a functional reason to do it. I don't count
MS's saying that I should use the user's %Temp% as a functional reason - it needs
some "why" behind it - but now I've got a "why"...

An occasionally frustrating aspect of this is that I cannot run two instances of
one of my apps at the same time: they butt heads over the work DB.

Now that Terminal Server has reared its head, I guess it's time for me to get
on the bandwagon... and maybe I can do it so that two instances can run
concurrently -- something like %Temp%\AppNameVersionNumber\...
--
PeteCresswell
Apr 9 '06 #29
"(PeteCresswell)" <x@y.Invalid> wrote in
news:nv********************************@4ax.com:
Per David W. Fenton:
That's the only place you should be deploying files, Terminal
Server or not. Anything else is going to be dependent on your
users having administrative permissions, or on setting custom
user-level permissions on your chosen location.
In non-terminal server situations I've had no problems with using
C:\Temp - mostly with users who are not in the Admin group.
Dozens of apps.

Started doing it with Access 2.0 and never saw a reason to change.


The reason to change was the release of Win2K, which locked down the
C: drive to disallow user-level write access except to the users'
profiles. There is no C:\TEMP by default on most versions of Windows
(I think that went out with Win3.x), so you'd have to create it
anyway.
This is all stuff that gets created on-the-fly when the app is
opened and which goes "poof!" when the app closes.
It belongs in a user temp directory.

In any event, on a terminal server, you'd end up with collisions
between users, anyway, if you hardwire C:\TEMP and use the same MDB
name.
Doesn't seem like it would be all that much coding to re-link
all the work tables every time the user opens the app - and I
guess it would run fairly quickly given that after the first time
we'd mostly just check the links and not have to re-link... but I
never saw a functional reason to do it.
The reason is that you're no longer writing your apps for Win9x --
you now are writing for a version of Windows that enforces security
rules that disallow non-administrators from writing to the locations
you've been using.

Your design has been bad design since 1999 or so.
. . . I don't see
MS' saying that I should use the user's %Temp% as a functional
reason - needs some "why" behind it - but now I've got a "why"...
Microsoft has been very lax in requiring application programmers to
make their apps function under non-admin logons. QuickBooks, for
instance, still requires at least Power User level permissions on
the local machine in order to run. This is very problematic in
principle on a Terminal Server, because Power Users get certain
administrative permissions. If Intuit would tell us exactly which
permissions it needs, then we could set the security policies to
include the Users group (if there was a relevant security policy
that applied). Better still, Intuit should get with the program and
fix their apps.

Microsoft should deny QuickBooks the "Windows compatible"
certification as long as it requires elevated permissions to run.
An occasionally frustrating aspect of this is that I cannot run
two instances of one of my apps at the same time: they butt heads
over the work DB.
That is a crucial issue on Terminal Server.
Now that Terminal Server has reared its head, I guess it's time
for me to get on the bandwagon... and maybe I can do it so that two
instances can run concurrently -- something like
%Temp%\AppNameVersionNumber\...


If a single user never runs more than one version, then why have a
folder for it? Why not just the temp MDB in the user temp folder?

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 9 '06 #30
Per David W. Fenton:
Microsoft has been very lax in requiring application programmers to
make their apps function under non-admin logons. QuickBooks, for
instance, still requires at least Power User level permissions on
the local machine in order to run.


I think I've run into this on several occasions - even with Adobe Acrobat.

Every so often I try to go with the logical-sounding recommendation that my
day-to-day logon to my workstation should not have admin authority - and I get
burned every time and wind up going back to Admin.

Regarding MS's locking down C:... now that you mention it, I think one of my
apps tripped over that briefly. At the time, I just attributed it to the local
LAN/security guys going overboard.

Sounds to me like I need to get with The Program.... and I'm starting as of last
nite...
--
PeteCresswell
Apr 9 '06 #31