This post is the 'sequel' ;) to my "Data Oriented vs Object Oriented
Design" post, but it can be read on its own. I will just quote the
beginning of my previous message to state the problem:
This post deals with the type (2) files described in the quote below.
New to .NET, I'm working on a WinForms client application using VS
2005 Beta 2. My data storage needs are the following:
(1) Small files (0 < length < 10 MB), containing lots of small
'objects' that need to be loaded into memory at runtime, in order to
guarantee fast access to each 'object'. Each collection of these
'objects' should bind easily to a DataGridView, AND provide
*filtering* and *sorting* capability.
(2) 'Mid-large' files (0 < length < 100 MB), containing lots of
'mid-large objects'. Such files shouldn't be fully loaded into memory,
but the class that handles them must expose random access, as well as
adding, removing, and modifying individual 'objects'.
In fact, a file of type (1) contains an index of the large 'objects'
stored in a corresponding file of type (2). Each index will be used
to fill a DataGridView, and each entry (row) will provide a
'reference' to the 'big object' stored in the corresponding type (2)
file (along with some other object-specific details that will be
displayed in the DataGridView, of course).
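By 'reference' I imagine something like the following sketch, where each index entry records the offset and length of its 'big object' inside the type (2) file, so one object can be read with a single Seek + Read instead of loading the whole file. All the names here (IndexEntry, the dummy payloads) are invented for illustration.

```csharp
using System;
using System.IO;

// Hypothetical index entry; in the real application it would also carry
// the object-specific details shown in the DataGridView.
struct IndexEntry
{
    public long Offset;  // where the payload starts in the type (2) file
    public int Length;   // payload size in bytes
}

class RandomAccessDemo
{
    // Writes two dummy payloads back to back, then reads only the second
    // one back with a single Seek + Read. Returns the bytes read.
    public static byte[] WriteAndReadBack(string path)
    {
        byte[] a = new byte[] { 1, 2, 3 };
        byte[] b = new byte[] { 9, 8, 7, 6 };
        IndexEntry entryB = new IndexEntry();

        using (FileStream fs = new FileStream(path, FileMode.Create))
        {
            fs.Write(a, 0, a.Length);     // payload of object 'a'
            entryB.Offset = fs.Position;  // index remembers where 'b' starts
            entryB.Length = b.Length;
            fs.Write(b, 0, b.Length);     // payload of object 'b'
        }

        using (FileStream fs = new FileStream(path, FileMode.Open))
        {
            fs.Seek(entryB.Offset, SeekOrigin.Begin);  // jump straight to 'b'
            byte[] buffer = new byte[entryB.Length];
            fs.Read(buffer, 0, buffer.Length);
            return buffer;
        }
    }

    static void Main()
    {
        string path = Path.GetTempFileName();
        byte[] payload = WriteAndReadBack(path);
        Console.WriteLine(payload[0]);  // prints 9, the first byte of 'b'
        File.Delete(path);
    }
}
```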
Since this is a Windows client application, using a database with a
data provider is out of the question.
My question goes straight to the point: what is the most efficient
approach to dealing with such files?
I haven't thought much about it yet. Again, serializing a List<T>
(or any object collection) would be fine, but randomly accessing an
object without loading the whole list into memory is easier said than
done.
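To show where I'm stuck: a whole serialized List<T> has to be deserialized in one go, but serializing each element separately and recording its stream offset would let a single element be read back in isolation. The sketch below does that with BinaryFormatter; it is only a proof of the idea, not a finished file format, and it doesn't address removing or resizing records.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class PerObjectDemo
{
    // Serializes each element as its own record (instead of one big
    // List<T>), recording the stream offset of every record, then reads
    // back a single element by seeking to its offset.
    public static string ReadThird(string path)
    {
        BinaryFormatter formatter = new BinaryFormatter();
        List<long> offsets = new List<long>();

        using (FileStream fs = new FileStream(path, FileMode.Create))
        {
            foreach (string item in new string[] { "first", "second", "third" })
            {
                offsets.Add(fs.Position);       // remember where this record starts
                formatter.Serialize(fs, item);  // one object per record
            }
        }

        using (FileStream fs = new FileStream(path, FileMode.Open))
        {
            fs.Seek(offsets[2], SeekOrigin.Begin);     // jump to record #2
            return (string)formatter.Deserialize(fs);  // reads just that object
        }
    }

    static void Main()
    {
        string path = Path.GetTempFileName();
        Console.WriteLine(ReadThird(path));  // prints "third"
        File.Delete(path);
    }
}
```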
I'm sure you experienced people will have some great starting ideas ;)