Bytes IT Community

Working with Complex DataSet - Creating and Pass thru Transformations

I have a complex process where I need to import a large amount of data, run
some transformations on it, and then import it into a database.

The transformation involves multiple fields and multiple processes, so the
data needs to be read in one record at a time and run through the
transformation, which may create new data values; then everything is
imported into a database for storage.

I have multiple questions:
1) We used to have an internal data structure of Dictionaries and Lists that
would hold the data and pass it from one transformation or process to another
until it was time to import. Now the database is too complex, and I am looking
for ideas on how to create a dynamic structure that maps to the final database.
I was thinking of using a DataSet - is there a way to create a DataSet that
mirrors the database? There would only ever need to be one record in the
structure before it is imported into the database.
The present database has 30 tables and many parent-child relationships.

2) For large data mining, what do people use to store large, complex data
structures that have various processes applied to them before being moved
to a database?

Thanks

Jul 9 '08 #1
2 Replies


This is probably a bit too much data to process efficiently in .NET, which
isn't the best environment for crunching large volumes of data. It might be
better to load all the data directly into a set of staging tables in the
database, run a stored procedure to do the transformations, and then move
the data over to the final destination tables.
In direct answer to your questions: a DataSet can be created that holds a
number of tables structured like a database. It is also possible to hold all
the rows in the DataSet and then bulk-upload it table by table to SQL Server,
if that's the DBMS you're using. That performs a lot faster than running an
insert statement per row.
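To illustrate the DataSet idea, here is a minimal sketch that builds a
two-table DataSet with a parent-child relation in memory. The table and
column names (Customer/Order) are invented for illustration; a real 30-table
schema could instead be pulled from the database with a DataAdapter's
FillSchema. Bulk upload would then be done one table at a time, parents
before children (e.g. with SqlBulkCopy.WriteToServer against SQL Server):

```csharp
using System;
using System.Data;

class DataSetSketch
{
    // Builds an in-memory DataSet whose schema mirrors a (hypothetical)
    // two-table slice of the database: Customer 1..n Order.
    public static DataSet BuildStaging()
    {
        var ds = new DataSet("Staging");

        var customers = ds.Tables.Add("Customer");
        var custId = customers.Columns.Add("CustomerId", typeof(int));
        customers.Columns.Add("Name", typeof(string));
        customers.PrimaryKey = new[] { custId };

        var orders = ds.Tables.Add("Order");
        orders.Columns.Add("OrderId", typeof(int));
        var orderCust = orders.Columns.Add("CustomerId", typeof(int));
        orders.Columns.Add("Total", typeof(decimal));

        // Parent-child relation; this also adds a foreign-key constraint,
        // so the DataSet enforces the same referential integrity as the db.
        ds.Relations.Add("FK_Order_Customer", custId, orderCust);

        // One record flows through: parent row first, then child row.
        customers.Rows.Add(1, "Acme");
        orders.Rows.Add(100, 1, 25.50m);
        return ds;
    }

    static void Main()
    {
        var ds = BuildStaging();
        Console.WriteLine(ds.Tables.Count);                          // 2
        Console.WriteLine(ds.Tables["Order"].Rows[0]["CustomerId"]); // 1
    }
}
```

Because the relation carries a foreign-key constraint, an attempt to add an
Order row with an unknown CustomerId throws, which catches transformation
bugs before anything reaches the database.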

Does that help?
--
Ciaran O'Donnell
http://wannabedeveloper.spaces.live.com
Jul 9 '08 #2

Hello Sippyuconn,

Apart from Ciaran's suggestions, I think we may also use LINQ to SQL.

LINQ to SQL is capable of quickly building an object model based on your
complex DB table relations by "drag and drop" in its designer.
http://dotnetslackers.com/articles/c...tudio2008.aspx
Then we can update the data in the almost same way as we did for .NET
objects:
http://blogs.msdn.com/wriju/archive/...ect-model.aspx
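To make that concrete, here is a minimal sketch of the "update like plain
.NET objects" pattern. The Customer class and the sample data are invented
for illustration, and an in-memory list stands in for the designer-generated
DataContext so the shape of the code is runnable on its own:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical entity class; the LINQ to SQL designer generates one of
// these (with mapping attributes) per database table.
class Customer
{
    public int CustomerId { get; set; }
    public string City { get; set; }
}

class LinqUpdateSketch
{
    // With real LINQ to SQL this would be db.Customers on a designer-made
    // DataContext; a plain list stands in here so the sketch is runnable.
    public static List<Customer> Customers = new List<Customer>
    {
        new Customer { CustomerId = 1, City = "London" },
        new Customer { CustomerId = 2, City = "Paris" },
    };

    static void Main()
    {
        // Query and modify rows exactly as if they were ordinary objects.
        foreach (var c in Customers.Where(c => c.City == "London"))
            c.City = "Dublin";
        // Against a real DataContext you would now call db.SubmitChanges()
        // to translate the changes into UPDATE statements.

        Console.WriteLine(Customers[0].City); // Dublin
    }
}
```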

Regarding the data mining question, I'm not familiar with DM. You may want
to try the question in the microsoft.public.sqlserver.datamining newsgroup.

Regards,
Jialiang Ge (ji****@online.microsoft.com, remove 'online.')
Microsoft Online Community Support

Delighting our customers is our #1 priority. We welcome your comments and
suggestions about how we can improve the support we provide to you. Please
feel free to let my manager know what you think of the level of service
provided. You can send feedback directly to my manager at:
ms****@microsoft.com.

==================================================
Get notification to my posts through email? Please refer to
http://msdn.microsoft.com/subscripti...#notifications.

Note: The MSDN Managed Newsgroup support offering is for non-urgent issues
where an initial response from the community or a Microsoft Support
Engineer within 1 business day is acceptable. Please note that each follow
up response may take approximately 2 business days as the support
professional working with you may need further investigation to reach the
most efficient resolution. The offering is not appropriate for situations
that require urgent, real-time or phone-based interactions or complex
project analysis and dump analysis issues. Issues of this nature are best
handled working with a dedicated Microsoft Support Engineer by contacting
Microsoft Customer Support Services (CSS) at
http://msdn.microsoft.com/subscripti...t/default.aspx.
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.


Jul 9 '08 #3
