
Publishing Web Sites with VS 2005 - Lessons Learned

I've been using the VS2005 Publish utility on one of my projects for
about six months. I have a large site with hundreds of files,
thousands if you include the code behind files.

So I thought I'd use the publish utility and pre-compile my site, but
make it updateable. I thought I would be improving the access time
for each page. Here is what I've found.

1. If I publish a pre-compiled, updateable version of the site, it
takes 50 minutes to upload 409 files. I have a cable modem, so I don't
think the problem is my network.
2. If I change anything, like the text property in an ASP:Label
control, I have to precompile and publish again. Text properties in
ASP:Label controls are not updateable. Only HTML is.
3. If I publish a pre-compiled, updateable version of the site, I lose
all the files that my clients have uploaded, unless I download them
first.
4. My site is down for almost an hour while the publish utility
performs its "magic."
5. Every time I access a page for the first time, it has to be
compiled anyway. So every page is slow the first time it is accessed.
Compound that with a web farm where every page has to be compiled the
first time it is accessed on each machine of the farm.

So I'm wondering why I bother to precompile the site. Am I gaining
anything with a precompiled site? It's down for an hour while it's
publishing. If I make a change, I have to precompile and upload the
whole thing. I may as well put the sources up there. At least that way
I can upload a simple change without bringing the site down for an
hour.

Does anyone have other suggestions? I've tried publishing my site to a
local disk and copying it. But that has many of the disadvantages I've
already mentioned. I don't have benchmarks for the "copy web site"
utility because I gave up on that early in the development phase. But
I've used it for sites that are not pre-compiled and updateable, and
it seems to work fine.

Sep 18 '07 #1
Looks like a misunderstanding of "publish" to me.

Inline ...

<rm*******@galaware.com> wrote in message
news:11**********************@n39g2000hsh.googlegroups.com...
> I've been using the VS2005 Publish utility on one of my projects for
> about six months. I have a large site with hundreds of files,
> thousands if you include the code behind files.
>
> So I thought I'd use the publish utility and pre-compile my site, but
> make it updateable. I thought I would be improving the access time
> for each page. Here is what I've found.
>
> 1. If I publish a pre-compiled, updateable version of the site, it
> takes 50 minutes to upload 409 files. I have a cable modem, so I don't
> think the problem is my network.
This could still be the cable modem: most are fast on the download side
(around 6 Mb/s) and slow on the upload side (around 256 Kb/s), which makes
uploads a bottleneck. The ISP side may also have a bottleneck.

If you use another method of uploading, this becomes much less of a
problem. Just publish to a local directory.
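
For what it's worth, the Publish utility is essentially a front end for
aspnet_compiler.exe, so you can do the same local precompile from the
command line. A rough sketch, with illustrative paths (-u keeps the output
updatable):

    rem Precompile the site to a local folder; paths here are illustrative.
    rem -v is the virtual path, -p the physical source, -u makes it updatable.
    aspnet_compiler -v /MySite -p C:\Source\MySite -u C:\Publish\MySite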
> 2. If I change anything, like the text property in an ASP:Label
> control, I have to precompile and publish again. Text properties in
> ASP:Label controls are not updateable. Only HTML is.
Fairly normal, but updating the page and/or the site DLLs is all you should
have to do.
> 3. If I publish a pre-compiled, updateable version of the site, I lose
> all the files that my clients have uploaded, unless I download them
> first.
If you publish locally, instead of directly to the remote site, and then FTP
up, you avoid this problem. I NEVER publish directly to production.
> 4. My site is down for almost an hour while the publish utility
> performs its "magic."
I would not publish to production.
> 5. Every time I access a page for the first time, it has to be
> compiled anyway. So every page is slow the first time it is accessed.
> Compound that with a web farm where every page has to be compiled the
> first time it is accessed on each machine of the farm.
There are two compilation steps: compilation to byte code (IL) and JIT
compilation to machine code. Precompiling removes the first. The JIT step
to machine-specific code still happens on each machine, no matter what.
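
On a related note, the first-hit cost is much worse if the site is deployed
with debug compilation on. A minimal web.config sketch (a standard ASP.NET
2.0 setting, not specific to Publish):

    <configuration>
      <system.web>
        <!-- debug="false" gives release-mode codegen and batch compilation -->
        <compilation debug="false" />
      </system.web>
    </configuration>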
> So I'm wondering why I bother to precompile the site. Am I gaining
> anything with a precompiled site? It's down for an hour while it's
> publishing. If I make a change, I have to precompile and upload the
> whole thing. I may as well put the sources up there. At least that way
> I can upload a simple change without bringing the site down for an
> hour.
The precompilation can make more of your source unavailable to hackers. It
also gives you the ability to obfuscate, further protecting your assemblies.
There is no reason to publish directly to a live site.
> Does anyone have other suggestions? I've tried publishing my site to a
> local disk and copying it. But that has many of the disadvantages I've
> already mentioned. I don't have benchmarks for the "copy web site"
> utility because I gave up on that early in the development phase. But
> I've used it for sites that are not pre-compiled and updateable, and
> it seems to work fine.
The discipline I use is uploading a zip and then unpackaging to a staging
directory. After that, I can move what I want. Of course, this requires some
control over folders, which you may not have.
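
A minimal sketch of that flow, assuming you have shell access on the server
and some archive tool available (all names and paths here are hypothetical):

    rem On the server, after uploading site.zip (paths are hypothetical).
    mkdir C:\staging\release1
    rem Unpack with whatever archive tool the host provides.
    unzip site.zip -d C:\staging\release1
    rem Once everything is unpacked, swap the files in with a plain copy.
    xcopy C:\staging\release1 C:\inetpub\wwwroot\MySite /E /Y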

Another option is to publish locally and then move up the DLLs and pages
that are changed. It has far less impact (time and otherwise) on your site.
If the page is commonly used, and you have menus, consider uploading a
sitemap while you put things in place. Downtime should be minimal.

If you cannot FTP changes, I would consider a new host.

--
Gregory A. Beamer
MVP, MCP: +I, SE, SD, DBA

*************************************************
| Think outside the box!
|
*************************************************
Sep 18 '07 #2
I just use WS_FTP to upload the DLL and/or the affected .aspx files, etc.
If I'm doing lots of related work locally and can't keep track of what I've
changed, I generally do a full publish, but I thought the publish was
intelligent enough to know whether the files on the server were newer or
not.

I find FTP generally slow anyway.
"Cowboy (Gregory A. Beamer)" <No************@comcast.netNoSpamMwrote in
message news:%2******************@TK2MSFTNGP03.phx.gbl...
Looks like a misunderstanding of "publish" to me.

Inline ...

<rm*******@galaware.comwrote in message
news:11**********************@n39g2000hsh.googlegr oups.com...
>I've been using the VS2005 Publish utility on one of my projects for
about six months. I have a large site with hundreds of files,
thousands if you include the code behind files.

So I thought I'd use the publish utility and pre-compile my site, but
make it updateable. I thought I would be increasing the access time
for each page. Here is what I've found.

1. If I publish a pre-compiled, updateable version of the site, it
takes 50 minutes to upload 409 files. I have a cable modem, so I don't
think the problem is my network.

This could still be the cable modem, as, on most, it is high speed
download (6mb/sec) and low upload (256k/sec). That will create a
bottleneck. The ISP side may also have a bottleneck.

If you will use another method of uploading, you will find this is not as
big of a problem. Just publish to a local directory.
>2. If I change anything, like the text property in an ASP:Label
control, I have to precompile and publish again. Text properties in
ASP:Label controls are not updateable. Only HTML is.

Fairly normal, but updating the page and/or the site DLLs is all you
should have to do.
>3. If I publish a pre-compiled, updateable version of the site, I lose
all the files that my clients have uploaded, unless I download them
first.

If you publish locally, instead of directly to the remote site, and then
FTP up, you avoid this problem. I NEVER publish directly to production.
>4. My site is down for almost an hour while the publish utility
performs its "magic."

I would not publish to production.
>5. Every time I access a page for the first time, it has to be
compiled anyway. So every page is slow the first time it is accessed.
Compound that with a web farm where every page has to be compiled the
first time it is accessed on each machine of the farm.

There is byte code compilation and JIT compilation. You remove the first
with publish. You will still have to compile to machine specific code, no
matter what.
>So I'm wondering why I bother to precompile the site. Am I gaining
anything with a precompiled site? It's down for an hour while its
publishing. If I make a change, I have to precompile and upload the
whole thing. I may as well put the sources up there. At least that way
I can upload a simple change without bringing the site down for an
hour.

The precompilation can make more of your source unavailable to hackers. It
also gives you the ability to obfuscate, further protecting your
assemblies. There is no reason to publish directly to a live site.
>Does anyone have other suggestions? I've tried publishing my site to a
local disk and copying it. But that has many of the disadvantages I've
already mentioned. I don't have benchmarks for the "copy web site"
utility because I gave up on that early in the development phase. But
I've used it for sites that are not pre-compiled and updateable, and
it seems to work fine.

The discipline I use is uploading a zip and then unpackaging to a staging
directory. After that, I can move what I want. Of course, this requires
some control over folders, which you may not have.

Another option is to publish locally and then move up the DLLs and pages
that are changed. It has far less impact (time and otherwise) on your
site. If the page is commonly used, and you have menus, consider uploading
a sitemap while you put things in place. Downtime should be minimal.

If you cannot FTP changes, I would consider a new host.

--
Gregory A. Beamer
MVP, MCP: +I, SE, SD, DBA

*************************************************
| Think outside the box! |
*************************************************

Sep 18 '07 #3
Unfortunately, it is not an "intelligent" publish solution.

Tools like Expression do try to sync files, but Visual Studio keeps track of
nothing. Publish was designed as a local publish; you then use other means
to get the output to production. I would not rely on EW (Expression Web),
however, until it is more ASP.NET focused (next version?).

One way to keep track of files is to watch the dates on the originals. It is
a pain, so I understand the need to upload everything sometimes.
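
If you go the date-watching route, one low-tech trick is to let xcopy's /D
switch pick up only the files changed since the last deployment (the date
and paths here are illustrative):

    rem Copy only files changed on or after Sep 18 2007 into an upload folder.
    xcopy C:\Publish\MySite C:\ToUpload /D:09-18-2007 /E /I /Y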

--
Gregory A. Beamer
MVP, MCP: +I, SE, SD, DBA

*************************************************
| Think outside the box!
|
*************************************************
"Just Me" <news.microsoft.comwrote in message
news:uM***************@TK2MSFTNGP06.phx.gbl...
>I just use WS_FTP to upload the ddl and or affected aspx files etc, if Im
doing lots of related work locally and cant keep track of what Ive changed
I generally do a full publish, but I thought the publish was intelligent to
know if the files were newer or not on the server.

It is generally slow using FTP anyway I find.
"Cowboy (Gregory A. Beamer)" <No************@comcast.netNoSpamMwrote in
message news:%2******************@TK2MSFTNGP03.phx.gbl...
>Looks like a misunderstanding of "publish" to me.

Inline ...

<rm*******@galaware.comwrote in message
news:11**********************@n39g2000hsh.googleg roups.com...
>>I've been using the VS2005 Publish utility on one of my projects for
about six months. I have a large site with hundreds of files,
thousands if you include the code behind files.

So I thought I'd use the publish utility and pre-compile my site, but
make it updateable. I thought I would be increasing the access time
for each page. Here is what I've found.

1. If I publish a pre-compiled, updateable version of the site, it
takes 50 minutes to upload 409 files. I have a cable modem, so I don't
think the problem is my network.

This could still be the cable modem, as, on most, it is high speed
download (6mb/sec) and low upload (256k/sec). That will create a
bottleneck. The ISP side may also have a bottleneck.

If you will use another method of uploading, you will find this is not as
big of a problem. Just publish to a local directory.
>>2. If I change anything, like the text property in an ASP:Label
control, I have to precompile and publish again. Text properties in
ASP:Label controls are not updateable. Only HTML is.

Fairly normal, but updating the page and/or the site DLLs is all you
should have to do.
>>3. If I publish a pre-compiled, updateable version of the site, I lose
all the files that my clients have uploaded, unless I download them
first.

If you publish locally, instead of directly to the remote site, and then
FTP up, you avoid this problem. I NEVER publish directly to production.
>>4. My site is down for almost an hour while the publish utility
performs its "magic."

I would not publish to production.
>>5. Every time I access a page for the first time, it has to be
compiled anyway. So every page is slow the first time it is accessed.
Compound that with a web farm where every page has to be compiled the
first time it is accessed on each machine of the farm.

There is byte code compilation and JIT compilation. You remove the first
with publish. You will still have to compile to machine specific code, no
matter what.
>>So I'm wondering why I bother to precompile the site. Am I gaining
anything with a precompiled site? It's down for an hour while its
publishing. If I make a change, I have to precompile and upload the
whole thing. I may as well put the sources up there. At least that way
I can upload a simple change without bringing the site down for an
hour.

The precompilation can make more of your source unavailable to hackers.
It also gives you the ability to obfuscate, further protecting your
assemblies. There is no reason to publish directly to a live site.
>>Does anyone have other suggestions? I've tried publishing my site to a
local disk and copying it. But that has many of the disadvantages I've
already mentioned. I don't have benchmarks for the "copy web site"
utility because I gave up on that early in the development phase. But
I've used it for sites that are not pre-compiled and updateable, and
it seems to work fine.

The discipline I use is uploading a zip and then unpackaging to a staging
directory. After that, I can move what I want. Of course, this requires
some control over folders, which you may not have.

Another option is to publish locally and then move up the DLLs and pages
that are changed. It has far less impact (time and otherwise) on your
site. If the page is commonly used, and you have menus, consider
uploading a sitemap while you put things in place. Downtime should be
minimal.

If you cannot FTP changes, I would consider a new host.

--
Gregory A. Beamer
MVP, MCP: +I, SE, SD, DBA

*********************************************** **
| Think outside the box! |
*********************************************** **


Sep 19 '07 #4
On Sep 19, 9:40 am, "Cowboy (Gregory A. Beamer)"
<NoSpamMgbwo...@comcast.net> wrote:
[snip]

Thanks for the good advice. I will publish locally and copy the modified
files and site DLLs with the "copy web site" utility. That solves several
of these problems. Even so, it looks like the HTML guys will have to send
me their changes so that I can publish the site for them; they use Macs
and don't have VS 2005.

I'm looking at Expression at the moment, with a purchase scheduled for
next month. It sounds like it might have a better publish utility.

Sep 20 '07 #5
On Sep 20, 4:24 am, rmgala...@galaware.com wrote:
[snip]

I still can't get the publish-locally-then-FTP approach to work. I can
publish the site to a local directory and copy the files to the live
server via FTP, but the site never runs. Even if I delete all the files on
the server first and FTP up the entire published local directory, it still
doesn't work. With a simple site -- a few pages, no master pages, no
subdirectories -- this approach works fine. But with a complex site that
has master pages and subdirectories, copying the locally published
directory via FTP does not work.

The only way I can publish the site is by publishing directly to the
live server.

I'm using master pages, user controls, themes, subdirectories (which
require authentication), and other 2.0 features. I suspect the master
pages and the subdirectories are causing the problem.
Nov 17 '07 #6
