Data Access Interfaces, Mapping and Domain Model

I was curious to know what some developers out in the industry are doing when it comes to exposing data access logic, specifically persistence. This is assuming that you're not using an O/R framework or something that completely abstracts you from the persistence details.

Are you:
1. Simple data type interfaces, where the data layer knows nothing about the domain model. For example:

public int SaveCustomer( string fname, string lname, string ssn, string phoneNumber)

2. While this may anger the OO peeps, a DAL that knows about the domain model. For example:

public int SaveCustomer ( Customer customer )

3. Other, like generic datasets or typed datasets?

I am really interested in the decision of whether or not to allow the DAL to "know" about the domain model. I understand that a change to the model can have a rippling effect, but personally, the ease of maintenance that comes from sharing these common objects across the enterprise makes it a worthwhile decision.

Also, where are your business entities being created? Are they mapped within the BAL, or does your DAL perform the mapping? Again, from my reading, ideally the mapping should *not* occur in the DAL; instead it should be done by the business layer. But if you move forward with a domain model approach and allow your DAL to know about the domain model, you already require a reference to the domain model library. In that case, why not build a generic base DAL that knows how to build the business entities in a structured, common way? Odds are that since you are using the domain model approach, any changes to the domain model will probably affect the DAL anyway.
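To make it concrete, here is a rough sketch (class, table and column names are invented for illustration) of what I mean by option 2, with the DAL doing the mapping from a data reader into the shared entity:

using System.Data;
using System.Data.SqlClient;

// Hypothetical domain entity shared across the layers.
public class Customer
{
    public int Id;
    public string FirstName;
    public string LastName;
    public string Ssn;
    public string PhoneNumber;
}

// Option 2: the DAL accepts and returns domain objects, so it needs a
// reference to the domain model library, and the row-to-object mapping
// happens here rather than in the business layer.
public class CustomerDal
{
    private string connectionString;

    public CustomerDal(string connectionString)
    {
        this.connectionString = connectionString;
    }

    public Customer GetCustomer(int id)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT CustomerId, FirstName, LastName, Ssn, Phone " +
            "FROM Customers WHERE CustomerId = @id", conn))
        {
            cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                if (!reader.Read())
                    return null;

                // The mapping lives in the DAL.
                Customer c = new Customer();
                c.Id = reader.GetInt32(0);
                c.FirstName = reader.GetString(1);
                c.LastName = reader.GetString(2);
                c.Ssn = reader.GetString(3);
                c.PhoneNumber = reader.GetString(4);
                return c;
            }
        }
    }
}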

I think this guy is right on the money with the tradeoffs, but I like this style. Am I crazy here? The last system I worked on was built like this, and by the time we were done we knew it forward and backward and it was a breeze to work with. BUT, we did things in which the DAL created the business entities and the DAL was aware of them. In hindsight I am questioning that; at the same time, I personally thought it was easy to maintain, and I have since heard (I left there) that they are really pleased with it and the maintenance is easy.

Kevin C
Nov 16 '05 #1
Sorry, here is the link I was referring to: http://www.devx.com/vb2themax/Article/19892/0/page/2
"Kevin C" <kc@noneya.com> wrote in message news:u9**************@TK2MSFTNGP11.phx.gbl...
I was curious to know what some developers out in the industry are doing when it comes to exposing Data access logic, specifically persistence. This is assuming that your not using an O/R framework or something that completely abstracts you from the persistence details.

Are you:
1. Having simple data type interfaces and the data layer know nothing about the domain models. For example:

public int SaveCustomer( string fname, string lname, string ssn, string phoneNumber)

2. While this may anger OO peeps, does your DAL know about the domain model. For example:

public int SaveCustomer ( Customer customer )

3. Other, like generic datasets or typed datasets?

I am really interested in the decision whether or not to allow the DAL to "know" about the domain model. I understand the fact that a change to the model can have a rippling effect, but personally the maintenance ease of sharing these common objects across the enterprise makes is worthy decision.

Also, where are your business entities being created. Are they being mapped within the BAL or does your DAL perform the mapping. Again, from readings, idealistically the mapping should *not* occur in the DAL. Instead it should be done by the business layer. But if you move forward with a domain model approach and allow your DAL to know about the domain model you will already require a reference to the domain model lib. In that case, why not build a generic base DAL that know how to build the business entities in a structured common way. Odds are that since u are using the domain model approach any changes to the domain model will probably affect the DAL.

I think this guy is right on the $$$ with the tradeoffs but I like this style. Am I crazy here? The last system I worked on was built like this and by the time we were done we knew it forward and backward and it was a breeze to work with. BUT, we did things in which the DAL created the business entities and the DAL was aware of the business entities. Hind site I am questioning that; at the same time I personally thought it was easy to maintain and have since heard ( I left there ) that they are really pleased with it and the maintenance is easy.

Kevin C
Nov 16 '05 #2
Kevin,

I used to subscribe to something akin to 1 and 2, but I found that it is
very, very difficult to keep everything in sync, and the idea of going "up"
one level was bothersome to me.

However, with .NET (specifically VS.NET), I've found it much easier to
justify 3. In previous versions of VS.NET, you could create a typed dataset
based on a data source in your system. I would then pass this across all
layers for my needs, almost having each layer act as a filter as the typed
dataset moves from one layer to the next.

Now, VS.NET makes it even easier. When you create a dataset, it will not only create adapters like it did before, but it will also allow you to drag stored procedures onto the dataset. I wouldn't advocate attaching them to the data structure, but you can have them exist as separate object entities. The designer also creates interfaces on the objects it creates, which helps again.

Now, if you go the typed dataset route (or the plain dataset route), you can make your data layer even more abstract. Personally, I like to use the following pattern:

- For each table in the entity, define a standard set of stored procedures
for CRUD operations. I prefer xsp_<table name>_Insert for insert,
xsp_<table name>_Update for update and xsp_<table name>_Delete for delete.
You can define others, but for the most basic of data layers, this is the
minimum.

- Have a function in your data layer which takes the data set, determines the changes (through the GetChanges method) and creates the appropriate data adapters (based on the tables in the set). Because of the table names in the data set, you should be able to get the stored procedure information easily and construct the commands dynamically (see the sketch below).
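
Just to sketch that routine out (the xsp_<table name> naming and the one-parameter-per-column convention here are assumptions for illustration; a real version would need to be smarter about keys and parameter types):

using System.Data;
using System.Data.SqlClient;

public class GenericDataLayer
{
    private string connectionString;

    public GenericDataLayer(string connectionString)
    {
        this.connectionString = connectionString;
    }

    // Persists all pending changes in the data set using the
    // xsp_<table name>_Insert / _Update / _Delete convention.
    public void Save(DataSet dataSet)
    {
        // Work only on rows that have actually changed.
        DataSet changes = dataSet.GetChanges();
        if (changes == null)
            return;

        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            SqlTransaction tx = conn.BeginTransaction();
            try
            {
                foreach (DataTable table in changes.Tables)
                {
                    SqlDataAdapter adapter = new SqlDataAdapter();
                    adapter.InsertCommand = BuildCommand(conn, tx, table, "Insert");
                    adapter.UpdateCommand = BuildCommand(conn, tx, table, "Update");
                    adapter.DeleteCommand = BuildCommand(conn, tx, table, "Delete");
                    adapter.Update(table);
                }
                tx.Commit();
            }
            catch
            {
                tx.Rollback();
                throw;
            }
        }

        // The database is now in sync with the data set.
        dataSet.AcceptChanges();
    }

    // Builds a command for xsp_<table name>_<operation> with one parameter
    // per column, named after the column. A real implementation would be
    // smarter (e.g. pass only the key column to the delete procedure).
    private static SqlCommand BuildCommand(SqlConnection conn, SqlTransaction tx,
                                           DataTable table, string operation)
    {
        SqlCommand cmd = new SqlCommand("xsp_" + table.TableName + "_" + operation, conn, tx);
        cmd.CommandType = CommandType.StoredProcedure;

        foreach (DataColumn column in table.Columns)
        {
            SqlParameter p = new SqlParameter();
            p.ParameterName = "@" + column.ColumnName;
            p.SourceColumn = column.ColumnName;

            // Deleted rows only expose their original values.
            if (operation == "Delete")
                p.SourceVersion = DataRowVersion.Original;

            cmd.Parameters.Add(p);
        }
        return cmd;
    }
}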

Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"Kevin C" <kc@noneya.com> wrote in message
news:u9**************@TK2MSFTNGP11.phx.gbl...
I was curious to know what some developers out in the industry are doing
when it comes to exposing Data access logic, specifically persistence. This
is assuming that your not using an O/R framework or something that
completely abstracts you from the persistence details.

Are you:
1. Having simple data type interfaces and the data layer know nothing about
the domain models. For example:

public int SaveCustomer( string fname, string lname, string ssn, string
phoneNumber)

2. While this may anger OO peeps, does your DAL know about the domain
model. For example:

public int SaveCustomer ( Customer customer )

3. Other, like generic datasets or typed datasets?

I am really interested in the decision whether or not to allow the DAL to
"know" about the domain model. I understand the fact that a change to the
model can have a rippling effect, but personally the maintenance ease of
sharing these common objects across the enterprise makes is worthy decision.

Also, where are your business entities being created. Are they being mapped
within the BAL or does your DAL perform the mapping. Again, from readings,
idealistically the mapping should *not* occur in the DAL. Instead it should
be done by the business layer. But if you move forward with a domain model
approach and allow your DAL to know about the domain model you will already
require a reference to the domain model lib. In that case, why not build a
generic base DAL that know how to build the business entities in a
structured common way. Odds are that since u are using the domain model
approach any changes to the domain model will probably affect the DAL.

I think this guy is right on the $$$ with the tradeoffs but I like this
style. Am I crazy here? The last system I worked on was built like this
and by the time we were done we knew it forward and backward and it was a
breeze to work with. BUT, we did things in which the DAL created the
business entities and the DAL was aware of the business entities. Hind site
I am questioning that; at the same time I personally thought it was easy to
maintain and have since heard ( I left there ) that they are really pleased
with it and the maintenance is easy.

Kevin C
Nov 16 '05 #3
Ahh, I see. Here is something that maybe you can explain to me then. If you are using typed datasets, how do you handle master-detail relationships? For example, you have a customer detail typed dataset, CustomerData.xsd. That customer has orders, and orders are used all around the system, so you create OrderData.xsd. Now, one service interface is GetCustomerDetail; returning the simple CustomerData dataset there is easy. Then there is GetCustomerOrders. The data returned needs to be both customer info and order info. How do you do that when you have two *datasets* defined? It seems that the fact that they are defined as two separate datasets is the issue.
Of course, I could go define a new dataset that has both, but where's the benefit there? Then I just end up with a gazillion data tables... or is that the idea??

Kevin

"Nicholas Paldino [.NET/C# MVP]" <mv*@spam.guard.caspershouse.com> wrote in
message news:un**************@tk2msftngp13.phx.gbl...
Kevin,

I used to subscribe to something akin to 1 and 2, but I found that it is very, very difficult to keep everything in sync, and the idea of going "up" one level was bothersome to me.

However, with .NET (specifically VS.NET), I've found it much easier to
justify 3. In previous versions of VS.NET, you could create a typed dataset based on a data source in your system. I would then pass this across all
layers for my needs, almost having each layer act as a filter as the typed
dataset moves from one layer to the next.

Now, with VS.NET, it makes it even easier. When you create a dataset,
it will not only create adapters like it did before, but it will allow you
drag stored procedures onto the dataset. Now I wouldn't advocate attaching them to the data structure, but you can have them exist as separate object
entities. The designer also creates interfaces on the objects it creates,
which helps again.

Now, if you go the typed dataset route (or the dataset route), you can
make your data layer even more abstract. Personally, I like to follow the
following pattern:

- For each table in the entity, define a standard set of stored procedures
for CRUD operations. I prefer xsp_<table name>_Insert for insert,
xsp_<table name>_Update for update and xsp_<table name>_Delete for delete.
You can define others, but for the most basic of data layers, this is the
minimum.

- Have a function in your data layer which takes the data table, determines the changes (through the GetChanges method) and creates the appropriate data adapters (based on the tables in the set). Because of the table names in
the data set, you should be able to get the stored procedure information
easily, and construct the command dynamically.

Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"Kevin C" <kc@noneya.com> wrote in message
news:u9**************@TK2MSFTNGP11.phx.gbl...
I was curious to know what some developers out in the industry are doing
when it comes to exposing Data access logic, specifically persistence. This is assuming that your not using an O/R framework or something that
completely abstracts you from the persistence details.

Are you:
1. Having simple data type interfaces and the data layer know nothing about the domain models. For example:

public int SaveCustomer( string fname, string lname, string ssn, string phoneNumber)

2. While this may anger OO peeps, does your DAL know about the domain
model. For example:

public int SaveCustomer ( Customer customer )

3. Other, like generic datasets or typed datasets?

I am really interested in the decision whether or not to allow the DAL to
"know" about the domain model. I understand the fact that a change to the
model can have a rippling effect, but personally the maintenance ease of
sharing these common objects across the enterprise makes is worthy decision.
Also, where are your business entities being created. Are they being mapped within the BAL or does your DAL perform the mapping. Again, from readings, idealistically the mapping should *not* occur in the DAL. Instead it should be done by the business layer. But if you move forward with a domain model approach and allow your DAL to know about the domain model you will already require a reference to the domain model lib. In that case, why not build a generic base DAL that know how to build the business entities in a
structured common way. Odds are that since u are using the domain model
approach any changes to the domain model will probably affect the DAL.

I think this guy is right on the $$$ with the tradeoffs but I like this
style. Am I crazy here? The last system I worked on was built like this
and by the time we were done we knew it forward and backward and it was a
breeze to work with. BUT, we did things in which the DAL created the
business entities and the DAL was aware of the business entities. Hind site I am questioning that; at the same time I personally thought it was easy to maintain and have since heard ( I left there ) that they are really pleased with it and the maintenance is easy.

Kevin C

Nov 16 '05 #4
Kevin,

That kind of is the idea. It comes down to how you define your
relationships. For this situation, you have a number of options. I
personally would include orders in the dataset with the customers, as they
are related pretty tightly. You then only have to pass one dataset to your
data layer to be modified. This allows for a greater degree of generality
when designing your data layer.

If you feel they should be in separate data sets, then you can always
modify your data layer to accept two data sets per operation and then have
it work on that.

However, because I would always have my data layer running within some
transactional context, there is nothing that says I can't call the data
layer twice to perform the operations on each data set.

Personally, I like to indicate whether a relationship is an owned
relationship or not (if an audit occurs on a record then all children that
are related to that record are audited as well). If it is owned, then I
think that it should be in the same data set. In this case, I think that a
customer owns the orders, but that's a decision you have to make.
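
As a rough, untyped illustration of keeping the owned orders in the same data set (table and column names invented; a typed dataset generated from an .xsd gives you the same structure with strongly typed accessors):

using System.Data;

public class CustomerDataSetFactory
{
    // Builds a customer/orders data set where the orders are "owned" by
    // the customer: one DataSet, two tables, and a relation between them.
    public static DataSet CreateCustomerOrders()
    {
        DataSet ds = new DataSet("CustomerData");

        DataTable customers = ds.Tables.Add("Customers");
        customers.Columns.Add("CustomerId", typeof(int));
        customers.Columns.Add("FirstName", typeof(string));
        customers.Columns.Add("LastName", typeof(string));

        DataTable orders = ds.Tables.Add("Orders");
        orders.Columns.Add("OrderId", typeof(int));
        orders.Columns.Add("CustomerId", typeof(int));
        orders.Columns.Add("Total", typeof(decimal));

        // The relation ties the detail rows to their master row;
        // passing true also creates the foreign key constraint.
        ds.Relations.Add("Customer_Orders",
            customers.Columns["CustomerId"],
            orders.Columns["CustomerId"],
            true);

        return ds;
    }
}

// Usage: navigate from a customer row to the orders it owns, e.g.
//   foreach (DataRow order in customerRow.GetChildRows("Customer_Orders")) { ... }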
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"Kevin C" <kc@noneya.com> wrote in message
news:OC**************@tk2msftngp13.phx.gbl...
Ahh, i see. Here is a thing that maybe you can explain to me then. If
you
are using typed datasets then how do you handle master detail
relationships.
For example, you have a customer detail typed dataset, customerData.xsd.
That customer has orders, orders are all around the system so you create
CustomerData.xsd. Now one service interface is GetCustomerDetail,
returning the simple CustomerData dataset here is easy. Then there is a
GetCustomerOrders. The data return needs to be both customer info and
order
info .. how do you do that when you have 2 *dataset* defined? It seems
that
the fact that they are both defined as datasets is an issue.
Of course I could go define a new dataset than has both but where's the
benefit there. Then I just end up with a gazillion data tables .... or is
that the idea??

Kevin

"Nicholas Paldino [.NET/C# MVP]" <mv*@spam.guard.caspershouse.com> wrote
in
message news:un**************@tk2msftngp13.phx.gbl...
Kevin,

I used to subscribe to something akin to 1 and 2, but I found that it

is
very, very difficult to keep everything in sync, and the idea of going

"up"
one level was bothersome to me.

However, with .NET (specifically VS.NET), I've found it much easier
to
justify 3. In previous versions of VS.NET, you could create a typed

dataset
based on a data source in your system. I would then pass this across all
layers for my needs, almost having each layer act as a filter as the
typed
dataset moves from one layer to the next.

Now, with VS.NET, it makes it even easier. When you create a
dataset,
it will not only create adapters like it did before, but it will allow
you
drag stored procedures onto the dataset. Now I wouldn't advocate

attaching
them to the data structure, but you can have them exist as separate
object
entities. The designer also creates interfaces on the objects it
creates,
which helps again.

Now, if you go the typed dataset route (or the dataset route), you
can
make your data layer even more abstract. Personally, I like to follow
the
following pattern:

- For each table in the entity, define a standard set of stored
procedures
for CRUD operations. I prefer xsp_<table name>_Insert for insert,
xsp_<table name>_Update for update and xsp_<table name>_Delete for
delete.
You can define others, but for the most basic of data layers, this is the
minimum.

- Have a function in your data layer which takes the data table,

determines
the changes (through the GetChanges method) and creates the appropriate

data
adapters (based on the tables in the set). Because of the table names in
the data set, you should be able to get the stored procedure information
easily, and construct the command dynamically.

Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"Kevin C" <kc@noneya.com> wrote in message
news:u9**************@TK2MSFTNGP11.phx.gbl...
I was curious to know what some developers out in the industry are doing
when it comes to exposing Data access logic, specifically persistence.

This
is assuming that your not using an O/R framework or something that
completely abstracts you from the persistence details.

Are you:
1. Having simple data type interfaces and the data layer know nothing

about
the domain models. For example:

public int SaveCustomer( string fname, string lname, string ssn,

string
phoneNumber)

2. While this may anger OO peeps, does your DAL know about the domain
model. For example:

public int SaveCustomer ( Customer customer )

3. Other, like generic datasets or typed datasets?

I am really interested in the decision whether or not to allow the DAL to
"know" about the domain model. I understand the fact that a change to
the
model can have a rippling effect, but personally the maintenance ease of
sharing these common objects across the enterprise makes is worthy

decision.

Also, where are your business entities being created. Are they being

mapped
within the BAL or does your DAL perform the mapping. Again, from

readings,
idealistically the mapping should *not* occur in the DAL. Instead it

should
be done by the business layer. But if you move forward with a domain

model
approach and allow your DAL to know about the domain model you will

already
require a reference to the domain model lib. In that case, why not build

a
generic base DAL that know how to build the business entities in a
structured common way. Odds are that since u are using the domain model
approach any changes to the domain model will probably affect the DAL.

I think this guy is right on the $$$ with the tradeoffs but I like this
style. Am I crazy here? The last system I worked on was built like this
and by the time we were done we knew it forward and backward and it was a
breeze to work with. BUT, we did things in which the DAL created the
business entities and the DAL was aware of the business entities. Hind

site
I am questioning that; at the same time I personally thought it was easy

to
maintain and have since heard ( I left there ) that they are really

pleased
with it and the maintenance is easy.

Kevin C


Nov 16 '05 #5
Nicholas,
The defining of the relationships is a huge hole that I cannot seem to get filled mentally when it comes to using typed datasets. To me, defining the customer-order typed dataset is very limiting. There will be more parts of the system that need just order information. At that point do I create another typed dataset? That is not good. Now I'm maintaining two definitions of orders in my system.

Kevin

"Nicholas Paldino [.NET/C# MVP]" <mv*@spam.guard.caspershouse.com> wrote in
message news:u3**************@TK2MSFTNGP09.phx.gbl...
Kevin,

That kind of is the idea. It comes down to how you define your
relationships. For this situation, you have a number of options. I
personally would include orders in the datastet with the customers, as they are related pretty tightly. You then only have to pass one dataset to your data layer to be modified. This allows for a greater degree of generality
when designing your data layer.

If you feel they should be in separate data sets, then you can always
modify your data layer to accept two data sets per operation and then have
it work on that.

However, because I would always have my data layer running within some
transactional context, there is nothing that says I can't call the data
layer twice to perform the operations on each data set.

Personally, I like to indicate whether a relationship is an owned
relationship or not (if an audit occurs on a record then all children that
are related to that record are audited as well). If it is owned, then I
think that it should be in the same data set. In this case, I think that a customer owns the orders, but that's a decision you have to make.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"Kevin C" <kc@noneya.com> wrote in message
news:OC**************@tk2msftngp13.phx.gbl...
Ahh, i see. Here is a thing that maybe you can explain to me then. If
you
are using typed datasets then how do you handle master detail
relationships.
For example, you have a customer detail typed dataset, customerData.xsd.
That customer has orders, orders are all around the system so you create
CustomerData.xsd. Now one service interface is GetCustomerDetail,
returning the simple CustomerData dataset here is easy. Then there is a
GetCustomerOrders. The data return needs to be both customer info and
order
info .. how do you do that when you have 2 *dataset* defined? It seems
that
the fact that they are both defined as datasets is an issue.
Of course I could go define a new dataset than has both but where's the
benefit there. Then I just end up with a gazillion data tables .... or is that the idea??

Kevin

"Nicholas Paldino [.NET/C# MVP]" <mv*@spam.guard.caspershouse.com> wrote
in
message news:un**************@tk2msftngp13.phx.gbl...
Kevin,

I used to subscribe to something akin to 1 and 2, but I found that it
is
very, very difficult to keep everything in sync, and the idea of going

"up"
one level was bothersome to me.

However, with .NET (specifically VS.NET), I've found it much easier
to
justify 3. In previous versions of VS.NET, you could create a typed

dataset
based on a data source in your system. I would then pass this across
all layers for my needs, almost having each layer act as a filter as the
typed
dataset moves from one layer to the next.

Now, with VS.NET, it makes it even easier. When you create a
dataset,
it will not only create adapters like it did before, but it will allow
you
drag stored procedures onto the dataset. Now I wouldn't advocate

attaching
them to the data structure, but you can have them exist as separate
object
entities. The designer also creates interfaces on the objects it
creates,
which helps again.

Now, if you go the typed dataset route (or the dataset route), you
can
make your data layer even more abstract. Personally, I like to follow
the
following pattern:

- For each table in the entity, define a standard set of stored
procedures
for CRUD operations. I prefer xsp_<table name>_Insert for insert,
xsp_<table name>_Update for update and xsp_<table name>_Delete for
delete.
You can define others, but for the most basic of data layers, this is the minimum.

- Have a function in your data layer which takes the data table,

determines
the changes (through the GetChanges method) and creates the appropriate

data
adapters (based on the tables in the set). Because of the table names in the data set, you should be able to get the stored procedure information easily, and construct the command dynamically.

Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"Kevin C" <kc@noneya.com> wrote in message
news:u9**************@TK2MSFTNGP11.phx.gbl...
I was curious to know what some developers out in the industry are doing when it comes to exposing Data access logic, specifically persistence.

This
is assuming that your not using an O/R framework or something that
completely abstracts you from the persistence details.

Are you:
1. Having simple data type interfaces and the data layer know nothing

about
the domain models. For example:

public int SaveCustomer( string fname, string lname, string ssn,

string
phoneNumber)

2. While this may anger OO peeps, does your DAL know about the domain
model. For example:

public int SaveCustomer ( Customer customer )

3. Other, like generic datasets or typed datasets?

I am really interested in the decision whether or not to allow the DAL to "know" about the domain model. I understand the fact that a change to
the
model can have a rippling effect, but personally the maintenance ease of sharing these common objects across the enterprise makes is worthy

decision.

Also, where are your business entities being created. Are they being

mapped
within the BAL or does your DAL perform the mapping. Again, from

readings,
idealistically the mapping should *not* occur in the DAL. Instead it

should
be done by the business layer. But if you move forward with a domain

model
approach and allow your DAL to know about the domain model you will

already
require a reference to the domain model lib. In that case, why not build a
generic base DAL that know how to build the business entities in a
structured common way. Odds are that since u are using the domain
model approach any changes to the domain model will probably affect the DAL.

I think this guy is right on the $$$ with the tradeoffs but I like this
style. Am I crazy here? The last system I worked on was built like this and by the time we were done we knew it forward and backward and it was a breeze to work with. BUT, we did things in which the DAL created the
business entities and the DAL was aware of the business entities. Hind

site
I am questioning that; at the same time I personally thought it was

easy to
maintain and have since heard ( I left there ) that they are really

pleased
with it and the maintenance is easy.

Kevin C



Nov 16 '05 #6
