Bytes IT Community

UpdateDB from DataSet

Al
I have this scenario:
1. XML file with schema and data is created from SQL Server tables. XML file
contains 6 tables, some of them have rows, some of them are empty.
2. XML file is given to the person with Pocket PC.
3. XML file is used to populate DataSet on the Pocket PC.
4. User adds new rows, changes values, deletes some rows.
5. Altered dataset is saved back to XML file.
6. XML file is given back to the person who initially created it.
7. XML file is used to populate DataSet.

The only way I know to update the SQL Server tables is to go through each
record in each table.
I was wondering if there is a simpler, more compact way to bring the updated
data into the SQL Server DB?
Everything is going to be done in VB 2005.
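For concreteness, step 1 above can be sketched in VB like this; the connection string, the query, and the file name are all placeholder assumptions:

```vb
' Sketch of step 1 (server side, full .NET Framework): fill a DataSet
' from SQL Server, then write data plus inline schema to one XML file.
' connString, the query, and the file name are placeholders.
Dim ds As New DataSet("Export")
Using da As New SqlClient.SqlDataAdapter("SELECT * FROM Table1", connString)
    da.Fill(ds, "Table1")              ' repeat for the other five tables
End Using
ds.WriteXml("export.xml", XmlWriteMode.WriteSchema)  ' data + schema together
```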

Thank you
Al
Aug 4 '06 #1
4 Replies


Al
I forgot to mention that all tables in the database have primary keys.

Al

Aug 4 '06 #2


You can do the update using datasets. Perhaps not nearly as fast as
direct SQL calls, but probably easier to code/maintain.

Basically, you can reconstruct the "original" data as a dataset, then
"merge" it with the changed rows in the client's dataset, which we'll
call dsModified.

So on the client side, you can do dsModified.GetChanges to make the
dataset have only changed rows in it. (Assuming GetChanges is available
in the Compact Framework, if that's what you're using.)

Then do dsModified.WriteXml or similar to write out the XML of this
dataset.
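A minimal client-side sketch of those two steps, assuming the Compact Framework DataSet is named dsModified and "changes.xml" is the output file:

```vb
' Keep only the changed rows, then write them out to XML.
Dim dsChanges As DataSet = dsModified.GetChanges()
If dsChanges IsNot Nothing Then  ' GetChanges returns Nothing when there are no edits
    dsChanges.WriteXml("changes.xml", XmlWriteMode.WriteSchema)
End If
```

As a design note, writing with XmlWriteMode.DiffGram instead would preserve each row's RowState across the round trip, which may remove the need to re-flag rows on the server.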

Then on the server side, as you say, reconstruct the client dataset
from the XML with the usual methods.

At this point, since you just constructed a new dataset from your XML,
I believe none of the rows will be flagged as "modified", which is bad,
because a dataset's Merge method will merge *only* those rows that are
marked as Modified. This means you have to use a "For Each" to do
row.SetModified on every row. This is likely to slow things down quite
a bit. (By the way, SetModified exists only in .NET 2.0.)
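A sketch of that server-side rebuild and re-flagging; the file name is an assumption. SetModified is documented to work only on rows whose RowState is Unchanged, so AcceptChanges is called first:

```vb
Dim dsModified As New DataSet()
dsModified.ReadXml("changes.xml")  ' file name is an assumption
dsModified.AcceptChanges()         ' make every row Unchanged first

For Each tbl As DataTable In dsModified.Tables
    For Each row As DataRow In tbl.Rows
        row.SetModified()          ' requires .NET 2.0
    Next
Next
```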

Then, still on the server side -- this is another one of the slow parts --
build a dataset (called dsOriginal) that matches the schema of the
dataset you just built from XML. This can probably be done by using the
same method that built the dataset you sent to the client way back
when.

Then you can just do something like:

dsOriginal.Merge(dsModified)

Then update the data adapter used to create dsOriginal. This will write
the changes back to the database:

da.Update
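Put together, the merge and write-back might look like this, assuming daOriginal is the SqlDataAdapter (with update commands, e.g. generated by a SqlCommandBuilder) that filled dsOriginal; the table name is a placeholder:

```vb
' Merge the client's changed rows into the server-side dataset,
' then push them to the database.
dsOriginal.Merge(dsModified)
' In practice each table probably needs its own adapter with its own
' update commands; a single table is shown here for brevity.
daOriginal.Update(dsOriginal.Tables("Table1"))
```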

If the Update command doesn't seem to be working (i.e., the changes
aren't being written to the database), try checking the number of data
tables in both datasets after the merge. I had a real head-scratcher
recently where the Merge wasn't working, and it turned out that
ds.Merge was adding the modified data as a new table instead of merging
it with the existing table. I solved this by using the data table's
Merge method instead of the dataset's, since my ds contained only one
table.
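That per-table workaround can be sketched as follows, assuming both datasets use the same table names:

```vb
' Merge at the table level instead of the dataset level,
' matching tables by name.
For Each tbl As DataTable In dsModified.Tables
    dsOriginal.Tables(tbl.TableName).Merge(tbl)
Next
```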

Other people with more experience might dismiss the above technique as
slow and/or inelegant. I welcome their criticism...I am offering this
method only as an example of how I have solved this problem, and would
love to hear of faster, more elegant solutions.

adm

Aug 4 '06 #3

Al
adm,
thank you very much.
I just started working with the Compact Framework and keep running into
its huge limitations, as well as my own limited knowledge. So doing what
you are suggesting (on the Pocket PC side) is going to be a nightmare.
If I don't find anything simpler, I'll try to do the update step by step,
comparing each row in each table in the 2 datasets.
It looks like a long but easy-to-understand way. Maybe I'm wrong.

I also have another idea (probably a stupid one). What if I try this:
1. I populate DataSet1 with data from the database.
2. I populate DataSet2 with data from the client XML file.
3. I replace tables in DataSet1 with tables from DataSet2 (I do not know if
it's possible).
4. I send data from DataSet1 back to the DB.

I see the problem with deleted records, but in my applications I never
physically delete any row - I have a RowDeleted column in each table, and in
case the user wants to delete some records they are marked as deleted and
never appear in any row sets.

So everything I need is adding new rows and updating existing ones.
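The table-swap idea in steps 1-4 above could be sketched like this; the file name is an assumption, and whether the swapped rows keep states that an adapter Update will honor is untested:

```vb
Dim ds1 As New DataSet()  ' assume filled from the database elsewhere
Dim ds2 As New DataSet()
ds2.ReadXml("client.xml") ' file name is an assumption

' Replace each table in ds1 with its counterpart from ds2.
For Each tbl As DataTable In ds2.Tables
    If ds1.Tables.Contains(tbl.TableName) Then
        ds1.Tables.Remove(tbl.TableName)
    End If
    ds1.Tables.Add(tbl.Copy())  ' Copy so ds2 is left intact
Next
```

One caveat: rows loaded via ReadXml (without a diffgram) arrive with RowState Added, so an adapter Update on ds1 would try to INSERT every row; some re-flagging of row states would likely still be needed.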

Al

Aug 4 '06 #4


Help me understand your application architecture here. My impression
was that it went something like this:

1. Server (a PC) sends "original" data to client (a Pocket PC).
2. User modifies the data on the client.
3. Client re-connects to server.
4. New/modified rows are merged with original data.

Is this true? If so, most of the operations I described run on the server
side, which has the full .NET Framework, so the limitations of the client
side don't matter.

ds.WriteXml and ds.GetChanges, the two methods that need to run on the
client, are supported by the Compact Framework, as described here:
http://tinyurl.com/kmwvk

That is good news for you, and suggests that, unless I have
misunderstood something, the technique I described in my previous
message should work for you.


Aug 4 '06 #5

This discussion thread is closed; replies have been disabled.