Bytes IT Community

Publishing Web Sites with VS 2005 - Lessons Learned

I've been using the VS2005 Publish utility on one of my projects for
about six months. I have a large site with hundreds of files,
thousands if you include the code behind files.

So I thought I'd use the publish utility and pre-compile my site, but
make it updateable. I expected this to reduce the access time for each
page. Here is what I've found.

1. If I publish a pre-compiled, updateable version of the site, it
takes 50 minutes to upload 409 files. I have a cable modem, so I don't
think the problem is my network.
2. If I change anything, like the text property in an ASP:Label
control, I have to precompile and publish again. Text properties in
ASP:Label controls are not updateable. Only HTML is.
3. If I publish a pre-compiled, updateable version of the site, I lose
all the files that my clients have uploaded, unless I download them
first.
4. My site is down for almost an hour while the publish utility
performs its "magic."
5. Every time I access a page for the first time, it has to be
compiled anyway. So every page is slow the first time it is accessed.
Compound that with a web farm where every page has to be compiled the
first time it is accessed on each machine of the farm.

So I'm wondering why I bother to precompile the site. Am I gaining
anything with a precompiled site? It's down for an hour while it's
publishing. If I make a change, I have to precompile and upload the
whole thing. I may as well put the sources up there. At least that way
I can upload a simple change without bringing the site down for an
hour.

Does anyone have other suggestions? I've tried publishing my site to a
local disk and copying it. But that has many of the disadvantages I've
already mentioned. I don't have benchmarks for the "copy web site"
utility because I gave up on that early in the development phase. But
I've used it for sites that are not pre-compiled and updateable, and
it seems to work fine.

Sep 18 '07 #1
5 Replies


Looks like a misunderstanding of "publish" to me.

Inline ...

<rm*******@galaware.com> wrote in message
news:11**********************@n39g2000hsh.googlegroups.com...
> I've been using the VS2005 Publish utility on one of my projects for
> about six months. I have a large site with hundreds of files,
> thousands if you include the code behind files.
>
> So I thought I'd use the publish utility and pre-compile my site, but
> make it updateable. I thought I would be increasing the access time
> for each page. Here is what I've found.
>
> 1. If I publish a pre-compiled, updateable version of the site, it
> takes 50 minutes to upload 409 files. I have a cable modem, so I don't
> think the problem is my network.
This could still be the cable modem: on most, download is high speed
(around 6 Mbps) while upload is low (around 256 Kbps), which creates a
bottleneck. The ISP side may also have a bottleneck.

If you use another method of uploading, you will find this is not as
big a problem. Just publish to a local directory.
> 2. If I change anything, like the text property in an ASP:Label
> control, I have to precompile and publish again. Text properties in
> ASP:Label controls are not updateable. Only HTML is.
Fairly normal, but updating the page and/or the site DLLs is all you should
have to do.
> 3. If I publish a pre-compiled, updateable version of the site, I lose
> all the files that my clients have uploaded, unless I download them
> first.
If you publish locally, instead of directly to the remote site, and then FTP
up, you avoid this problem. I NEVER publish directly to production.
> 4. My site is down for almost an hour while the publish utility
> performs its "magic."
I would not publish to production.
> 5. Every time I access a page for the first time, it has to be
> compiled anyway. So every page is slow the first time it is accessed.
> Compound that with a web farm where every page has to be compiled the
> first time it is accessed on each machine of the farm.
There are two compilation steps: compilation to byte code (IL) and JIT
compilation. Precompiling removes the first. You will still have to
compile to machine-specific code on first access, no matter what.
> So I'm wondering why I bother to precompile the site. Am I gaining
> anything with a precompiled site? It's down for an hour while its
> publishing. If I make a change, I have to precompile and upload the
> whole thing. I may as well put the sources up there. At least that way
> I can upload a simple change without bringing the site down for an
> hour.
The precompilation can make more of your source unavailable to hackers. It
also gives you the ability to obfuscate, further protecting your assemblies.
There is no reason to publish directly to a live site.
> Does anyone have other suggestions? I've tried publishing my site to a
> local disk and copying it. But that has many of the disadvantages I've
> already mentioned. I don't have benchmarks for the "copy web site"
> utility because I gave up on that early in the development phase. But
> I've used it for sites that are not pre-compiled and updateable, and
> it seems to work fine.
The discipline I use is uploading a zip and then unpacking it to a staging
directory. After that, I can move what I want. Of course, this requires some
control over folders, which you may not have.
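
That zip-and-stage step can be sketched roughly like this (a minimal sketch; the directory names and function names are my own, and the actual upload between the two steps is whatever FTP or shell access the host provides):

```python
import shutil
import zipfile
from pathlib import Path

def package_site(publish_dir: str, zip_path: str) -> str:
    """Zip the locally published site so a single file can be uploaded."""
    # shutil.make_archive appends ".zip" itself, so strip it first.
    base = zip_path[:-4] if zip_path.endswith(".zip") else zip_path
    return shutil.make_archive(base, "zip", root_dir=publish_dir)

def unpack_to_staging(zip_path: str, staging_dir: str) -> list[str]:
    """Unpack the uploaded zip into a staging directory on the server side."""
    staging = Path(staging_dir)
    staging.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(staging)
        return zf.namelist()
```

From the staging directory you can then move individual files or folders into the live site, which keeps the window in which the site is inconsistent far shorter than a 50-minute direct publish.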

Another option is to publish locally and then move up the DLLs and pages
that are changed. It has far less impact (time and otherwise) on your site.
If the page is commonly used, and you have menus, consider uploading a
sitemap while you put things in place. Downtime should be minimal.
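
Picking out "the DLLs and pages that are changed" can be automated by hashing the freshly published output against a manifest saved from the previous upload (a sketch; the manifest file and its layout here are invented, not anything VS 2005 produces):

```python
import hashlib
import json
from pathlib import Path

def build_manifest(publish_dir: str) -> dict[str, str]:
    """Map each relative file path in the publish output to its SHA-256."""
    root = Path(publish_dir)
    return {
        str(p.relative_to(root)).replace("\\", "/"):
            hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def changed_files(publish_dir: str, manifest_path: str) -> list[str]:
    """Return files that are new or different since the last recorded upload."""
    current = build_manifest(publish_dir)
    try:
        previous = json.loads(Path(manifest_path).read_text())
    except FileNotFoundError:
        previous = {}  # first upload: everything counts as changed
    changed = [rel for rel, digest in current.items()
               if previous.get(rel) != digest]
    # Record the current state for the next comparison.
    Path(manifest_path).write_text(json.dumps(current, indent=2))
    return changed
```

Only the returned files need to go over FTP, so a one-label change no longer means a 409-file upload.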

If you cannot FTP changes, I would consider a new host.

--
Gregory A. Beamer
MVP, MCP: +I, SE, SD, DBA

*************************************************
| Think outside the box!
|
*************************************************
Sep 18 '07 #2

I just use WS_FTP to upload the DLL and/or the affected .aspx files,
etc. If I'm doing lots of related work locally and can't keep track of
what I've changed, I generally do a full publish, but I thought the
publish was intelligent enough to know whether the files on the server
were newer or not.

I find that FTP is generally slow anyway.
"Cowboy (Gregory A. Beamer)" <No************@comcast.netNoSpamMwrote in
message news:%2******************@TK2MSFTNGP03.phx.gbl...
Sep 18 '07 #3

Unfortunately, it is not an "intelligent" publish solution.

Tools like Expression do try to sync files, but Visual Studio keeps track of
nothing. Publish was designed to publish locally, with other means then used
to get to production. I would not rely on EW (Expression Web), however,
until it is more ASP.NET focused (next version?).

One way to keep track of files is to watch the dates on the originals. It is
a pain, so I understand the need to upload everything sometimes.
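
Watching dates on the originals can be reduced to a recorded publish timestamp (a sketch; the `last_publish.txt` marker file is an invented convention, not something Visual Studio writes):

```python
import time
from pathlib import Path

MARKER = "last_publish.txt"  # invented marker holding the last publish time

def record_publish(site_dir: str) -> None:
    """Stamp the current time as the moment of the last full publish."""
    (Path(site_dir) / MARKER).write_text(str(time.time()))

def files_modified_since_last_publish(site_dir: str) -> list[str]:
    """List files in the site touched after the recorded publish time."""
    root = Path(site_dir)
    marker = root / MARKER
    last = float(marker.read_text()) if marker.exists() else 0.0
    return sorted(
        str(p.relative_to(root))
        for p in root.rglob("*")
        if p.is_file() and p.name != MARKER and p.stat().st_mtime > last
    )
```

Running the check before an upload gives the short list of files worth sending, at the cost of remembering to stamp the marker after every full publish.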

--
Gregory A. Beamer
MVP, MCP: +I, SE, SD, DBA

*************************************************
| Think outside the box!
|
*************************************************
"Just Me" <news.microsoft.comwrote in message
news:uM***************@TK2MSFTNGP06.phx.gbl...
Sep 19 '07 #4

On Sep 19, 9:40 am, "Cowboy (Gregory A. Beamer)"
<NoSpamMgbwo...@comcast.netNoSpamM> wrote:
Thanks for the good advice. I will publish locally and copy the
modified files and site DLLs with the "copy web site" utility. That
solves several of these problems. Even with that, it looks like the
HTML guys will have to send me their changes so that I can publish the
site for them. They use Macs and don't have VS 2005.

I'm looking at Expression at the moment, with a purchase scheduled for
next month. It sounds like it might have a better publish utility.

Sep 20 '07 #5

On Sep 20, 4:24 am, rmgala...@galaware.com wrote:
I still can't seem to publish the site to a local directory, and copy
the files via FTP to the live server. I can do it, but the site never
works. I can delete all the files on the server, and FTP the published
local directory, and it still doesn't work. If I have a simple site
with a few pages, no master pages, and no subdirectories, then I can
do this. But if I have a complex web site with master pages and
subdirectories, copying the locally published directory via FTP does
not work.

The only way I can publish the site is by publishing directly to the
live server.

I'm using master pages, user controls, themes, subdirectories (which
require authentication), and other 2.0 features. I think the master
pages and the subdirectories are causing the problem.
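
A failed FTP copy of a precompiled site is very often a partial copy: the `bin` folder's `App_Web_*.dll` assemblies and the `.compiled` placeholder files all have to arrive, or master pages and pages in subdirectories break. One quick check is to compare the local publish output against the server copy and see what never arrived (a sketch; here the "server" is just another directory, since listing the real remote tree would need host-specific FTP code, and the function name is my own):

```python
from pathlib import Path

def missing_on_server(publish_dir: str, server_dir: str) -> list[str]:
    """Report files present in the local publish output but absent remotely."""
    local = {str(p.relative_to(publish_dir)).replace("\\", "/")
             for p in Path(publish_dir).rglob("*") if p.is_file()}
    remote = {str(p.relative_to(server_dir)).replace("\\", "/")
              for p in Path(server_dir).rglob("*") if p.is_file()}
    return sorted(local - remote)
```

If this comes back non-empty (typically entries under `bin/`), the FTP step rather than the publish step is what is breaking the site; some FTP clients skip files silently or transfer .dll files in ASCII mode, which corrupts them.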
Nov 17 '07 #6

This discussion thread is closed

Replies have been disabled for this discussion.