
size of .dlls

Is there a point at which an ASP.NET web project becomes too big and really
should be broken down into multiple projects? For example, there is a
significant difference in .dll size between a web site with 10 code-behind
pages, 50 code-behind pages, and 150 code-behind pages.

Is there a potential for performance problems at any point? Or do the
advantages of a single project typically outweigh the advantages of having
things broken up?

Thanks in advance.
Mark
Nov 17 '05 #1
Breaking your huge DLL into multiple ones makes sense. Every time you change
even one page, the whole DLL gets recompiled and the application gets
restarted. Also, if you split it into multiple DLLs and one of them hangs, it
won't affect the others.
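For instance, here is a minimal sketch (with hypothetical project and class
names) of moving page logic into a separate class library, so that editing one
area of the site only rebuilds that area's assembly:

    // OrdersLib/OrderHelper.cs -- hypothetical class library project,
    // compiled to its own OrdersLib.dll and referenced by the web project.
    namespace OrdersLib
    {
        public class OrderHelper
        {
            // Business logic lives here rather than in the page's code-behind,
            // so changes elsewhere in the site don't force this DLL to rebuild.
            public static decimal ApplyDiscount(decimal total, decimal rate)
            {
                return total - (total * rate);
            }
        }
    }

The code-behind page then stays thin and just calls into the library, e.g.
OrdersLib.OrderHelper.ApplyDiscount(100m, 0.1m).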

"Mark" <fi**************@umn.edu> wrote in message
news:%2****************@TK2MSFTNGP09.phx.gbl...
Is there a point when the size of a asp.net web project is too big ... and
it really should be broken down into multiple projects? For example, there is a significant difference in .dll size between a web site that has 10
codebehind pages v. 50 codebehind pages v 150 codebehind pages.

Is there a potential for performance problems at any point? Or do the
advantages of a single project typically outweigh the advantages of having
things broken up?

Thanks in advance.
Mark

Nov 17 '05 #2
Technically, you should construct your assembly so that it can fit on a page
of memory. With this simple goal in mind, when the CPU executes code from your
assembly it does not have to go to main memory to fetch it, because the code
is small enough to fit into the L2 cache, and you avoid soft page faults.
Otherwise, if your assembly is too big, it has to be spread across several
pages in memory, and every time the CPU requests instructions that aren't on
the current page, you suffer a soft page fault while the appropriate page is
loaded, and so on.

If your assembly is so big that it cannot fit into the L2 cache, you will
suffer harder faults, because the CPU must now load the appropriate page from
RAM, which is orders of magnitude slower than L2 cache. God forbid your
assembly is so big it does not fit into RAM and has to be loaded off the hard
disk; you should consider taking up some other profession at that point,
because your customers will not be happy.

Measure the size of your assembly after a release build, find out from the
hardware spec how big your L2 cache is, and you will have a very good idea of
where you are and what you need to do to achieve the appropriate performance.
FYI, Microsoft prefers to optimize its products for size instead of speed for
that very reason.
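As a rough starting point, here is a minimal sketch of that measurement; the
DLL path and the 512 KB cache size are assumptions you would replace with your
own release-build path and your CPU's actual L2 size:

    // Compare a release-build assembly's file size to an assumed L2 cache size.
    using System;
    using System.IO;

    class AssemblySizeCheck
    {
        static void Main()
        {
            const long l2CacheBytes = 512 * 1024; // assumption: 512 KB L2 cache
            FileInfo dll = new FileInfo(@"bin\Release\MyWebApp.dll"); // hypothetical path

            Console.WriteLine("Assembly size: {0} bytes", dll.Length);
            Console.WriteLine(dll.Length <= l2CacheBytes
                ? "Fits within the assumed L2 cache size."
                : "Larger than the assumed L2 cache size; consider splitting.");
        }
    }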

regards
--
-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Mark" <fi**************@umn.edu> wrote in message
news:#9**************@TK2MSFTNGP09.phx.gbl...
Is there a point when the size of a asp.net web project is too big ... and
it really should be broken down into multiple projects? For example, there is a significant difference in .dll size between a web site that has 10
codebehind pages v. 50 codebehind pages v 150 codebehind pages.

Is there a potential for performance problems at any point? Or do the
advantages of a single project typically outweigh the advantages of having
things broken up?

Thanks in advance.
Mark

Nov 17 '05 #3
Alvin, I thought this was true when we were dealing with old binary DLLs. But
here the JIT compiler compiles only the block of code currently required and
executes it. A "page fault" in .NET translates to hitting code that hasn't yet
been compiled or cached by the JIT/CLR. So however big the assembly, the full
code never gets compiled and loaded at once. Probably only Microsoft knows a
good answer to this.
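To illustrate the point, here is a small sketch (RuntimeHelpers.PrepareMethod
is available from .NET 2.0 onward): the JIT normally compiles a method's IL to
native code only on the method's first call, though you can force compilation
earlier:

    // Each method is JIT-compiled on first use, not at assembly load time.
    using System;
    using System.Reflection;
    using System.Runtime.CompilerServices;

    class JitDemo
    {
        static void RarelyUsed()
        {
            Console.WriteLine("Compiled only when first needed.");
        }

        static void Main()
        {
            // Here RarelyUsed exists only as IL; it has not been JIT-compiled yet.
            // PrepareMethod forces native compilation ahead of the first call:
            MethodInfo m = typeof(JitDemo).GetMethod(
                "RarelyUsed", BindingFlags.NonPublic | BindingFlags.Static);
            RuntimeHelpers.PrepareMethod(m.MethodHandle);

            RarelyUsed(); // this call would otherwise have triggered the JIT
        }
    }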

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in
message news:O0*************@TK2MSFTNGP10.phx.gbl...
Technically, you should construct your assembly so that it can fit on a page of memory. With this simple goal in mind, when the cpu executes your
assembly or code from your assembly it does not have to go to main memory to fetch the code because the code is so small that it fits into L2 cache. You avoid soft page faults. Otherwise, if your assembly is too big it then needs to fit on several pages in memory. Everytime the cpu requests instructions
which aren't on that page, you suffer a soft page fault while the cpu loads the appropriate page and so on and so forth.

If your assembly is bigger so that it cannot fit into L2 cache, you will
suffer harder page faults as the cpu must now load the appropriate page from RAM which is infinitely slower than L2 cache. Gowd forbid that your assembly does not fit into ram because it is so big then it needs to be loaded off of hard disk, and you should consider taking up some other profession at this
point because your customers will not be happy.

Measure the size of your assembly after a release build and figure out from the hardware section, how big your L2 cache is and you will have a very good idea where you are and what you need to do to achieve the appropriate
performance. FYI, microsoft prefers to optimize their products for size
instead of speed for that very reason.

regards
--
-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Mark" <fi**************@umn.edu> wrote in message
news:#9**************@TK2MSFTNGP09.phx.gbl...
Is there a point when the size of a asp.net web project is too big ... and it really should be broken down into multiple projects? For example,

there
is a significant difference in .dll size between a web site that has 10
codebehind pages v. 50 codebehind pages v 150 codebehind pages.

Is there a potential for performance problems at any point? Or do the
advantages of a single project typically outweigh the advantages of having things broken up?

Thanks in advance.
Mark


Nov 17 '05 #4
No. Ask yourself where the jitted code sits. Wherever it sits, it will pile up
and page fault; the principle doesn't change.

--
-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Rajesh.V" <Ra***********@hotmail.com> wrote in message
news:OG**************@TK2MSFTNGP09.phx.gbl...
Alvin i thought this is true when we were dealing with old binary dll's. But here jit compiler compiles that block of code(currently reqd) and executes. Page fault in .net translates to hitting code not compile/not in cache by
jit/clr. So however big the assy, the full code never gets compiled and
loaded. So only the MS will know a good ans to this.

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in
message news:O0*************@TK2MSFTNGP10.phx.gbl...
Technically, you should construct your assembly so that it can fit on a page
of memory. With this simple goal in mind, when the cpu executes your
assembly or code from your assembly it does not have to go to main memory to
fetch the code because the code is so small that it fits into L2 cache. You
avoid soft page faults. Otherwise, if your assembly is too big it then

needs
to fit on several pages in memory. Everytime the cpu requests instructions which aren't on that page, you suffer a soft page fault while the cpu

loads
the appropriate page and so on and so forth.

If your assembly is bigger so that it cannot fit into L2 cache, you will
suffer harder page faults as the cpu must now load the appropriate page

from
RAM which is infinitely slower than L2 cache. Gowd forbid that your

assembly
does not fit into ram because it is so big then it needs to be loaded off of
hard disk, and you should consider taking up some other profession at

this point because your customers will not be happy.

Measure the size of your assembly after a release build and figure out

from
the hardware section, how big your L2 cache is and you will have a very

good
idea where you are and what you need to do to achieve the appropriate
performance. FYI, microsoft prefers to optimize their products for size
instead of speed for that very reason.

regards
--
-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Mark" <fi**************@umn.edu> wrote in message
news:#9**************@TK2MSFTNGP09.phx.gbl...
Is there a point when the size of a asp.net web project is too big ...

and it really should be broken down into multiple projects? For example,

there
is a significant difference in .dll size between a web site that has 10 codebehind pages v. 50 codebehind pages v 150 codebehind pages.

Is there a potential for performance problems at any point? Or do the
advantages of a single project typically outweigh the advantages of having things broken up?

Thanks in advance.
Mark



Nov 17 '05 #5
This is intriguing. How, then, do you balance this with the practicality of
keeping conceptual "applications" together? For example, suppose you have
seven ASP.NET applications that use the same security method, similar
connection strings/global variables, and an identical look and feel, and you
break them up into seven ASP.NET projects to keep each code-behind .dll under
the size of the L2 cache. You suddenly have 7 identical web.config files, 7
identical global.asax.cs files with forms authentication code, 7 identical
sets of user controls, etc. Not to mention you can't share your session
variables, because you have 7 different virtual directories.

You can still leverage custom controls or class libraries you've built
separately ... but looking at those "issues" listed above, I feel torn.

I would LOVE some guidance on how to deal with these issues. Thanks in
advance!

Mark
fi******@umn.edu

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in
message news:%2****************@TK2MSFTNGP10.phx.gbl...

No, ask yourself where does the jitted code sit? Where ever it sits, it will
pile up and page fault. The principle doesn't change.

-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Rajesh.V" <Ra***********@hotmail.com> wrote in message
news:OG**************@TK2MSFTNGP09.phx.gbl...

Alvin i thought this is true when we were dealing with old binary dll's.
But here jit compiler compiles that block of code(currently reqd) and
executes. Page fault in .net translates to hitting code not compile/not in
cache by
jit/clr. So however big the assy, the full code never gets compiled and
loaded. So only the MS will know a good ans to this.

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in
message news:O0*************@TK2MSFTNGP10.phx.gbl...
Technically, you should construct your assembly so that it can fit on a page
of memory. With this simple goal in mind, when the cpu executes your
assembly or code from your assembly it does not have to go to main memory to
fetch the code because the code is so small that it fits into L2 cache. You
avoid soft page faults. Otherwise, if your assembly is too big it then
needs to fit on several pages in memory. Everytime the cpu requests
instructions which aren't on that page, you suffer a soft page fault while
the cpu loads the appropriate page and so on and so forth.

If your assembly is bigger so that it cannot fit into L2 cache, you will
suffer harder page faults as the cpu must now load the appropriate page from
RAM which is infinitely slower than L2 cache. Gowd forbid that your assembly
does not fit into ram because it is so big then it needs to be loaded off of
hard disk, and you should consider taking up some other profession at this
point because your customers will not be happy.

Measure the size of your assembly after a release build and figure out from
the hardware section, how big your L2 cache is and you will have a very
good idea where you are and what you need to do to achieve the appropriate
performance. FYI, microsoft prefers to optimize their products for size
instead of speed for that very reason.

regards

-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
Mark" <fi**************@umn.edu> wrote in message
news:#9**************@TK2MSFTNGP09.phx.gbl...

Is there a point when the size of a asp.net web project is too big ... and
it really should be broken down into multiple projects? For example, there
is a significant difference in .dll size between a web site that has 10
codebehind pages v. 50 codebehind pages v 150 codebehind pages. Is there a
potential for performance problems at any point? Or do the advantages of a
single project typically outweigh the advantages of having things broken up?

Thanks in advance.
Mark
Nov 17 '05 #6
For this conceptual grouping, you can either add the applications to the same
app domain or group them into one application pool, depending on whether you
want to do the work programmatically or administratively. With seven different
applications, the only way to share information is to push that data through a
process boundary, using either remoting or a web service. For security
reasons, the data inside applications, and the context from which they run,
must remain isolated.

Custom controls or libraries don't solve the problem of marshalling data
between these process boundaries either. The basic rule is: if it is in
another process, you need to share data in a specified way, whether through
marshalling, web services, or writing files to disk.
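As a rough illustration of the remoting route, here is a minimal sketch with
hypothetical type names and port number (the shared type must derive from
MarshalByRefObject so calls are marshalled across the process boundary):

    // SharedState.cs -- a type both applications reference (hypothetical).
    using System;
    using System.Runtime.Remoting;
    using System.Runtime.Remoting.Channels;
    using System.Runtime.Remoting.Channels.Tcp;

    public class SharedState : MarshalByRefObject
    {
        private string userName;
        public string UserName
        {
            get { return userName; }
            set { userName = value; }
        }
    }

    class Host
    {
        static void Main()
        {
            // Expose a single shared instance over TCP (port 8085 is an assumption).
            ChannelServices.RegisterChannel(new TcpChannel(8085));
            RemotingConfiguration.RegisterWellKnownServiceType(
                typeof(SharedState), "SharedState", WellKnownObjectMode.Singleton);
            Console.ReadLine(); // keep the host process alive
        }
    }

A client application would then attach with
Activator.GetObject(typeof(SharedState), "tcp://localhost:8085/SharedState"),
and every property access is marshalled through the process boundary.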

regards
--
-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Mark" <mf****@idonotlikespam.cce.umn.edu> wrote in message
news:up**************@TK2MSFTNGP12.phx.gbl...
This is intruiging. How then, do you balance this with the practicality of keeping conceptual "applications" together? For example, if you have seven ASP.NET applications that utilize the same security method, similar
connection strings/global variables, and identical look/feel, and you break them up into seven ASP.NET projects to keep their codebehind .dll under the size of L2 cache, you suddenly have 7 identical web.config files, 7
identical global.asax.cs files with forms authentication code, 7 identical
sets of user controls, etc. Not to mention you can't share your session
variables because you have 7 different virtual directories.

You can still leverage custom controls or class libraries you've built
separately .... but looking at those "issues" listed above, I feel torn ....
I would LOVE some guidance on how to deal with these issues. Thanks in
advance!

Mark
fi******@umn.edu

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in
message news:%2****************@TK2MSFTNGP10.phx.gbl...

No, ask yourself where does the jitted code sit? Where ever it sits, it will pile up and page fault. The principle doesn't change.

-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Rajesh.V" <Ra***********@hotmail.com> wrote in message
news:OG**************@TK2MSFTNGP09.phx.gbl...

Alvin i thought this is true when we were dealing with old binary dll's.
But here jit compiler compiles that block of code(currently reqd) and
executes. Page fault in .net translates to hitting code not compile/not in cache by
jit/clr. So however big the assy, the full code never gets compiled and
loaded. So only the MS will know a good ans to this.

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in
message news:O0*************@TK2MSFTNGP10.phx.gbl...
Technically, you should construct your assembly so that it can fit on a page of memory. With this simple goal in mind, when the cpu executes your
assembly or code from your assembly it does not have to go to main memory to fetch the code because the code is so small that it fits into L2 cache. You avoid soft page faults. Otherwise, if your assembly is too big it then
needs to fit on several pages in memory. Everytime the cpu requests
instructions which aren't on that page, you suffer a soft page fault while
the cpu loads the appropriate page and so on and so forth.

If your assembly is bigger so that it cannot fit into L2 cache, you will
suffer harder page faults as the cpu must now load the appropriate page from RAM which is infinitely slower than L2 cache. Gowd forbid that your assembly does not fit into ram because it is so big then it needs to be loaded off of hard disk, and you should consider taking up some other profession at this
point because your customers will not be happy.

Measure the size of your assembly after a release build and figure out from the hardware section, how big your L2 cache is and you will have a very
good idea where you are and what you need to do to achieve the appropriate
performance. FYI, microsoft prefers to optimize their products for size
instead of speed for that very reason.

regards

-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
Mark" <fi**************@umn.edu> wrote in message
news:#9**************@TK2MSFTNGP09.phx.gbl...

Is there a point when the size of a asp.net web project is too big ... and it really should be broken down into multiple projects? For example, there is a significant difference in .dll size between a web site that has 10
codebehind pages v. 50 codebehind pages v 150 codebehind pages. Is there a
potential for performance problems at any point? Or do the advantages of a single project typically outweigh the advantages of having things broken up?
Thanks in advance.
Mark

Nov 17 '05 #7
Alvin,

Thank you. Reading your first sentence with its two options, are you in other
words saying that my options are:

1. Creating a single VS.NET project in which everything shares the same files.
2. Creating multiple VS.NET projects that potentially have overlapping files,
and/or share information with each other via remoting or web services.

Correct?

Secondly, doesn't all this make a strong argument for going with the first
option, simply because of the additional development time and headache
required to set up the second?

Thanks again. Your help is thoroughly appreciated.

Mark Field
fi******@umn.edu

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in
message news:u%******************@tk2msftngp13.phx.gbl...
For this conceptual grouping you can either add the applications to the app domain or you can group the applications into one application pool depending on whether you want to do the work programmatically or administratively.
With seven different applications, the only way to share information is to
push that data thru a process boundary either using remoting or a
webservice. For security reasons, the data inside applications and within
the context from which they run must remain isolated.

Custom controls or libraries doesn't solve the problem of marshalling data
between these process boundaries either. The basics is, if it is in another process, you need to share data in a specified way either thru marshalling
or webservices or writing files to disk.

regards
--
-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Mark" <mf****@idonotlikespam.cce.umn.edu> wrote in message
news:up**************@TK2MSFTNGP12.phx.gbl...
This is intruiging. How then, do you balance this with the practicality of
keeping conceptual "applications" together? For example, if you have

seven
ASP.NET applications that utilize the same security method, similar
connection strings/global variables, and identical look/feel, and you

break
them up into seven ASP.NET projects to keep their codebehind .dll under

the
size of L2 cache, you suddenly have 7 identical web.config files, 7
identical global.asax.cs files with forms authentication code, 7 identical sets of user controls, etc. Not to mention you can't share your session
variables because you have 7 different virtual directories.

You can still leverage custom controls or class libraries you've built
separately .... but looking at those "issues" listed above, I feel torn

...

I would LOVE some guidance on how to deal with these issues. Thanks in
advance!

Mark
fi******@umn.edu

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in message news:%2****************@TK2MSFTNGP10.phx.gbl...

No, ask yourself where does the jitted code sit? Where ever it sits, it

will
pile up and page fault. The principle doesn't change.

-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Rajesh.V" <Ra***********@hotmail.com> wrote in message
news:OG**************@TK2MSFTNGP09.phx.gbl...

Alvin i thought this is true when we were dealing with old binary dll's. But here jit compiler compiles that block of code(currently reqd) and
executes. Page fault in .net translates to hitting code not compile/not

in
cache by
jit/clr. So however big the assy, the full code never gets compiled and
loaded. So only the MS will know a good ans to this.

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in message news:O0*************@TK2MSFTNGP10.phx.gbl...
Technically, you should construct your assembly so that it can fit on a

page
of memory. With this simple goal in mind, when the cpu executes your
assembly or code from your assembly it does not have to go to main memory to
fetch the code because the code is so small that it fits into L2 cache. You
avoid soft page faults. Otherwise, if your assembly is too big it then
needs to fit on several pages in memory. Everytime the cpu requests
instructions which aren't on that page, you suffer a soft page fault

while the cpu loads the appropriate page and so on and so forth.

If your assembly is bigger so that it cannot fit into L2 cache, you will
suffer harder page faults as the cpu must now load the appropriate page

from
RAM which is infinitely slower than L2 cache. Gowd forbid that your

assembly
does not fit into ram because it is so big then it needs to be loaded off of
hard disk, and you should consider taking up some other profession at
this point because your customers will not be happy.

Measure the size of your assembly after a release build and figure out

from
the hardware section, how big your L2 cache is and you will have a very
good idea where you are and what you need to do to achieve the appropriate performance. FYI, microsoft prefers to optimize their products for size
instead of speed for that very reason.

regards

-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
Mark" <fi**************@umn.edu> wrote in message
news:#9**************@TK2MSFTNGP09.phx.gbl...

Is there a point when the size of a asp.net web project is too big ...

and
it really should be broken down into multiple projects? For example,

there
is a significant difference in .dll size between a web site that has 10
codebehind pages v. 50 codebehind pages v 150 codebehind pages. Is there a potential for performance problems at any point? Or do the advantages

of a
single project typically outweigh the advantages of having things broken

up?

Thanks in advance.
Mark


Nov 17 '05 #8
Yes, I'd go with option one for sure. If you are going to have such strong
dependencies between applications, it is a good idea to make these related
applications part of the same solution, as you are suggesting. That way there
is no marshalling, and access to data occurs on the same calling thread, which
is quick and cost efficient.
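In that single-solution setup, the shared pieces can live once in a class
library that every web project references directly, so the calls are plain
in-process method calls. A minimal sketch with hypothetical names:

    // SharedLib/AuthHelper.cs -- one class library project inside the solution.
    namespace SharedLib
    {
        public class AuthHelper
        {
            // Shared forms-authentication check referenced from every page,
            // instead of being duplicated across seven global.asax.cs files.
            public static bool IsValidUser(string userName, string password)
            {
                // Placeholder check; a real implementation would query your user store.
                return userName != null && userName.Length > 0
                    && password != null && password.Length > 0;
            }
        }
    }

Any code-behind can then call
SharedLib.AuthHelper.IsValidUser(txtUser.Text, txtPass.Text) directly: same
process, same calling thread, no marshalling.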
hth
--
-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Mark" <mf****@idonotlikespam.cce.umn.edu> wrote in message
news:uE**************@tk2msftngp13.phx.gbl...
Alvin,

Thank you. Reading your first sentence with two options, are you in other
words implying that my options are:

1. Creating a single VS.NET project that all share the same files.
2. Creating multiple VS.NET projects that potentially have overlapping
files, and/or share information with each other via remoting or web
servcies.

Correct?

Secondly, doesn't this all make a strong argument with going with the first simply because of all the additional development time/headache required to
set up the second option?

Thanks again. Your help is thoroughly appreciated.

Mark Field
fi******@umn.edu

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in
message news:u%******************@tk2msftngp13.phx.gbl...
For this conceptual grouping you can either add the applications to the app
domain or you can group the applications into one application pool

depending
on whether you want to do the work programmatically or administratively.
With seven different applications, the only way to share information is to
push that data thru a process boundary either using remoting or a
webservice. For security reasons, the data inside applications and within the context from which they run must remain isolated.

Custom controls or libraries doesn't solve the problem of marshalling data between these process boundaries either. The basics is, if it is in

another
process, you need to share data in a specified way either thru marshalling or webservices or writing files to disk.

regards
--
-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Mark" <mf****@idonotlikespam.cce.umn.edu> wrote in message
news:up**************@TK2MSFTNGP12.phx.gbl...
This is intruiging. How then, do you balance this with the practicality
of
keeping conceptual "applications" together? For example, if you have

seven
ASP.NET applications that utilize the same security method, similar
connection strings/global variables, and identical look/feel, and you

break
them up into seven ASP.NET projects to keep their codebehind .dll
under the
size of L2 cache, you suddenly have 7 identical web.config files, 7
identical global.asax.cs files with forms authentication code, 7 identical sets of user controls, etc. Not to mention you can't share your
session variables because you have 7 different virtual directories.

You can still leverage custom controls or class libraries you've built
separately .... but looking at those "issues" listed above, I feel torn ...

I would LOVE some guidance on how to deal with these issues. Thanks
in advance!

Mark
fi******@umn.edu

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote

in message news:%2****************@TK2MSFTNGP10.phx.gbl...

No, ask yourself where does the jitted code sit? Where ever it sits, it will
pile up and page fault. The principle doesn't change.

-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
"Rajesh.V" <Ra***********@hotmail.com> wrote in message
news:OG**************@TK2MSFTNGP09.phx.gbl...

Alvin i thought this is true when we were dealing with old binary dll's. But here jit compiler compiles that block of code(currently reqd) and
executes. Page fault in .net translates to hitting code not
compile/not
in
cache by
jit/clr. So however big the assy, the full code never gets compiled
and loaded. So only the MS will know a good ans to this.

"Alvin Bruney" <vapordan_spam_me_not@hotmail_no_spamhotmail.com > wrote in message news:O0*************@TK2MSFTNGP10.phx.gbl...
Technically, you should construct your assembly so that it can fit on
a page
of memory. With this simple goal in mind, when the cpu executes your
assembly or code from your assembly it does not have to go to main memory
to
fetch the code because the code is so small that it fits into L2
cache. You
avoid soft page faults. Otherwise, if your assembly is too big it then
needs to fit on several pages in memory. Everytime the cpu requests
instructions which aren't on that page, you suffer a soft page fault while the cpu loads the appropriate page and so on and so forth.

If your assembly is bigger so that it cannot fit into L2 cache, you
will suffer harder page faults as the cpu must now load the appropriate page from
RAM which is infinitely slower than L2 cache. Gowd forbid that your

assembly
does not fit into ram because it is so big then it needs to be loaded off
of
hard disk, and you should consider taking up some other profession at

this point because your customers will not be happy.

Measure the size of your assembly after a release build and figure out

from
the hardware section, how big your L2 cache is and you will have a
very good idea where you are and what you need to do to achieve the

appropriate performance. FYI, microsoft prefers to optimize their products for size instead of speed for that very reason.

regards

-----------
Got TidBits?
Get it here: www.networkip.net/tidbits
Mark" <fi**************@umn.edu> wrote in message
news:#9**************@TK2MSFTNGP09.phx.gbl...

Is there a point when the size of a asp.net web project is too big ...

and
it really should be broken down into multiple projects? For example,

there
is a significant difference in .dll size between a web site that has 10 codebehind pages v. 50 codebehind pages v 150 codebehind pages. Is
there a potential for performance problems at any point? Or do the advantages

of
a
single project typically outweigh the advantages of having things

broken up?

Thanks in advance.
Mark



Nov 17 '05 #9
