
Serialization/Compression

Hello,

We are developing an application (Windows Forms) that allows users to take
a snapshot of some database tables, save them to another set of tables
(called Model tables), and work with them. Since the data that goes into
the model tables is huge (on the order of 30000 records per table), we
envisage that we will run out of database space if more users start
hitting our application database. To solve this problem I suggested
splitting the records that go into the model tables into chunks of, say,
5000, binary-serializing each chunk, compressing it, and storing the
compressed form in a blob field in the database. Of course the application
will have to do the reverse: decompress, deserialize, and then render the
data to the GUI. This process does incur overhead because of the
intermediate operations, but I thought it was worth implementing since it
can save us at least 60-70% of space, which I guess is pretty significant.
Also, retrieving 6 records (instead of 30000) from a database that holds
the 30000 records in serialized form should be much faster.
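
Roughly, the save and load paths I have in mind look like this (a rough
sketch only; CompressChunk and DecompressChunk are my own names, and I am
assuming .NET 2.0's GZipStream and the DataTable binary remoting format):

using System.Data;
using System.IO;
using System.IO.Compression;
using System.Runtime.Serialization.Formatters.Binary;

// Serialize one chunk (a DataTable) into a compressed byte[] for the blob column.
static byte[] CompressChunk(DataTable chunk)
{
    // Binary remoting format keeps the payload far smaller than the
    // default XML-based DataTable serialization (.NET 2.0 and later).
    chunk.RemotingFormat = SerializationFormat.Binary;

    using (MemoryStream buffer = new MemoryStream())
    {
        using (GZipStream gzip = new GZipStream(buffer, CompressionMode.Compress))
        {
            new BinaryFormatter().Serialize(gzip, chunk);
        }   // disposing the GZipStream flushes the final compressed block
        return buffer.ToArray();   // ToArray still works on a closed MemoryStream
    }
}

// The reverse path when a chunk is loaded back from the database.
static DataTable DecompressChunk(byte[] blob)
{
    using (MemoryStream buffer = new MemoryStream(blob))
    using (GZipStream gzip = new GZipStream(buffer, CompressionMode.Decompress))
    {
        return (DataTable)new BinaryFormatter().Deserialize(gzip);
    }
}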

I just want to know if this approach is a good solution. Please let me know
if there is a better way of resolving this issue. The downside of this
approach shows up when the user modifies the data. The problems are as follows.

1. If the user has edited the data, I will have to find out which chunk he
has modified and serialize and compress only that portion of the data. I
don't want to re-serialize all the chunks if the user modifies only one of
them. Though I can use some kind of identifier to identify the chunk, the
process may be cumbersome (see the sketch after this list).

2. Even if the user modifies just one record, I will have to serialize and
compress 5000 records no matter what, which is kind of bad.
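
For problem 1, the dirty-chunk bookkeeping I have in mind would look
something like this (a hypothetical sketch; ChunkSize, ExtractChunk and
the table/column names are made up, and CompressChunk is from the sketch
above):

using System.Collections.Generic;
using System.Data;

const int ChunkSize = 5000;

// chunk id -> dirty flag; only dirty chunks get re-serialized on save
Dictionary<int, bool> dirtyChunks = new Dictionary<int, bool>();

void MarkRowModified(int rowIndex)
{
    dirtyChunks[rowIndex / ChunkSize] = true;   // integer division gives the chunk id
}

void SaveDirtyChunks(DataTable model)
{
    foreach (int chunkId in dirtyChunks.Keys)
    {
        DataTable chunk = ExtractChunk(model, chunkId, ChunkSize);  // hypothetical helper
        byte[] blob = CompressChunk(chunk);
        // ... UPDATE ModelChunk SET Data = @blob WHERE ChunkId = @chunkId ...
    }
    dirtyChunks.Clear();
}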

I am not sure how to tackle these problems and would greatly appreciate
your help.

Thanks a lot for the help.

Bala



Jan 27 '06 #1
Here are some questions.

1) Will the snapshots exist forever, or only while they are being worked
on? Will you be able to delete the snapshots at some point in time?
2) How many users are expected to make snapshots at around the same time?
3) Are the snapshots only for viewing, or will the user be making changes as
well?
4) If the user can make changes to a snapshot, where will those changes
persist? Will they be pushed back to the main table that the snapshot came
from?

"Bala Nagarajan" <ba********@newsgroups.nospam> wrote in message
news:%2****************@TK2MSFTNGP10.phx.gbl...
<snip>

Jan 27 '06 #2

I wrote a smart-client multiuser application that handles much less
data, but it does serialize individual data on the client as XML.

I find that to be a more balanced approach, architecturally.
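
A minimal sketch of what that client-side XML caching looks like, using
the DataSet's standard WriteXml/ReadXml support (the file path here is
just an example):

using System.Data;

DataSet model = new DataSet("Model");
// ... fill the DataSet from the database once ...
model.WriteXml(@"C:\AppData\model.xml", XmlWriteMode.WriteSchema);

// Later sessions reload the local copy instead of hitting the database:
DataSet restored = new DataSet();
restored.ReadXml(@"C:\AppData\model.xml", XmlReadMode.ReadSchema);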

Bala Nagarajan wrote:
<snip>


Jan 27 '06 #3

Peter,
Thanks for responding. Let me know if you have more questions. I really
appreciate your time.

1) Will the snapshots exist forever, or only while they are being worked
on? Will you be able to delete the snapshots at some point in time?

The snapshot will get refreshed every month; a batch process will run
monthly to refresh it.

2) How many users are expected to make snapshots at around the same time?

Around 50.

3) Are the snapshots only for viewing, or will the user be making changes
as well?

The snapshot is for viewing only. But the users can create a copy of the
snapshot (called a Model in our world). The users can insert/update/delete
data at the model level.

Thanks
Bala


"Bala Nagarajan" <ba********@newsgroups.nospam> wrote in message
news:%2****************@TK2MSFTNGP10.phx.gbl...
<snip>



Jan 30 '06 #4
Hi

Generally we do not recommend compressing whole tables and storing them in
the database, because that makes the data hard to maintain. A small error
in a compressed package will usually make the whole package unreadable, so
one corrupted blob can render a large block of data unavailable. That is
why we store data as ordinary rows and let the database maintain it: the
database has dedicated mechanisms for maintaining the data, including
backup.

Also, I am curious why you need to take a table snapshot at all. If the
snapshot is for viewing only, you can query the database directly. If the
users will make changes to the model, will the model be updated back into
the database? If not, why not just use a DataSet directly?

Best regards,

Peter Huang
Microsoft Online Partner Support

Get Secure! - www.microsoft.com/security
This posting is provided "AS IS" with no warranties, and confers no rights.

Jan 31 '06 #5
