Bytes | Developer Community

Proper table structure and SQL Server limits

Hello all,

I am migrating a Paradox application to SQL Server. My problem is that the
existing Paradox table structure is limited. Correcting it will mean a
re-write of the application (a Delphi app). Although the record and column
limits are higher in SQL Server, it will eventually hit the wall too. So I
was wondering if anyone could offer advice on how we might best do this.

The application is essentially an evaluation tool. It is very generic and
can evaluate anything from buying a digital camera or laptop to purchasing
a house. All the attributes of whatever is being evaluated are defined as
new columns.

Example: let's say you are evaluating laptops. The core table may look like
this:

Fields:
1) Identifier
2) Processor Speed
3) Processor Cache
4) Hard Disk Size
5) LAN connection
6) Cost

So in this table, there are 5 things being measured, 6 columns in total. A
record might look like this:

Field Data:
Identifier | Speed | Cache | HD Size | LAN | Cost ...
HP | 3ghz | 2mb | 60G | Wireless | $2000

So you can see that this structure will eventually hit the wall as more and
more criteria are added. With Paradox, the maximum number of columns is 255.
In SQL Server it is 1024. Of course there is also a maximum record size,
which for SQL Server is 8060 bytes.
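
Just to make the problem concrete, here is a rough sketch of the wide
layout (using SQLite via Python purely because it's easy to run; SQL Server
DDL differs slightly, and the column names are my own invention). Every new
criterion means a schema change, marching toward the column limit:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One column per criterion, mirroring the Paradox layout.
cur.execute("""
CREATE TABLE laptops (
    identifier TEXT,
    speed      TEXT,
    cache      TEXT,
    hd_size    TEXT,
    lan        TEXT,
    cost       TEXT
)
""")
cur.execute("INSERT INTO laptops VALUES ('HP', '3ghz', '2mb', '60G', 'Wireless', '$2000')")

# Each new criterion forces an ALTER TABLE - and there is a hard ceiling
# on how many columns a table can hold.
cur.execute("ALTER TABLE laptops ADD COLUMN weight TEXT")
ncols = len(cur.execute("SELECT * FROM laptops").description)
print(ncols)  # 7 columns now
```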

What I'd prefer is a structure that has no limits on how big it can be. In
short, I need a new table structure that is long and skinny, vs very wide
(or fat):

Fields:
Identifier
Attribute
Value

So using the above example, the data might look like:

Field Data
Identifier | Attribute | Value
HP | Speed | 3ghz
HP | Cache | 2mb
HP | HD Size | 60G
HP | LAN | Wireless
HP | Cost | $2000
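
Here is the same idea sketched out so you can see the appeal (again SQLite
via Python just for illustration; the table and column names are my own):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The long/skinny layout: one row per attribute instead of one column each.
cur.execute("CREATE TABLE evaluation (identifier TEXT, attribute TEXT, value TEXT)")
rows = [
    ("HP", "Speed",   "3ghz"),
    ("HP", "Cache",   "2mb"),
    ("HP", "HD Size", "60G"),
    ("HP", "LAN",     "Wireless"),
    ("HP", "Cost",    "$2000"),
]
cur.executemany("INSERT INTO evaluation VALUES (?, ?, ?)", rows)

# Adding a new criterion is now just another row - no schema change at all.
cur.execute("INSERT INTO evaluation VALUES ('HP', 'Weight', '2.5kg')")
count = cur.execute(
    "SELECT COUNT(*) FROM evaluation WHERE identifier = 'HP'"
).fetchone()[0]
print(count)  # 6 rows for HP
```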

Obviously this structure is preferable because now I won't hit any column or
record limits, but it makes just about everything else in the application
more difficult. Fundamentally, data validation becomes difficult.

In the above example, how would one define this structure?
Identifier varchar(50)
Attribute varchar(50)
Value ???? <-- here lies my problem!!!

In some cases "value" would be an integer, sometimes numeric, sometimes
text. So then I figured I could have a number of tables as above, one for
each of the various data types: integers, floating-point numbers, free-form
text, etc. Example:

For Integer data
Identifier varchar(50)
Attribute varchar(50)
Value integer

For floating point data
Identifier varchar(50)
Attribute varchar(50)
Value numeric

For free-form data
Identifier varchar(50)
Attribute varchar(50)
Value varchar or ntext

So now I have taken what was in one table and split it across 3 or more
tables. Of course, all this data would need to be "re-connected" in a
grid-like fashion so it looks like just one row of data to the client again.
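
To show what I mean by "re-connecting", here is a rough sketch of the
three-table split and a query that pivots the rows back into one grid-like
row per identifier (SQLite via Python for illustration only; table names,
attribute names, and sample values are my own, and SQL Server would use the
same CASE-based pivot or its PIVOT syntax):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One table per value type, as described above.
cur.executescript("""
CREATE TABLE int_values  (identifier TEXT, attribute TEXT, value INTEGER);
CREATE TABLE num_values  (identifier TEXT, attribute TEXT, value NUMERIC);
CREATE TABLE text_values (identifier TEXT, attribute TEXT, value TEXT);

INSERT INTO int_values  VALUES ('HP', 'HD Size', 60);
INSERT INTO num_values  VALUES ('HP', 'Cost', 1999.99);
INSERT INTO text_values VALUES ('HP', 'LAN', 'Wireless');
""")

# Union the typed tables into one attribute/value stream, then pivot the
# attributes back into columns so the client sees a single row again.
row = cur.execute("""
SELECT identifier,
       MAX(CASE WHEN attribute = 'HD Size' THEN value END) AS hd_size,
       MAX(CASE WHEN attribute = 'Cost'    THEN value END) AS cost,
       MAX(CASE WHEN attribute = 'LAN'     THEN value END) AS lan
FROM (
    SELECT identifier, attribute, CAST(value AS TEXT) AS value FROM int_values
    UNION ALL
    SELECT identifier, attribute, CAST(value AS TEXT) AS value FROM num_values
    UNION ALL
    SELECT identifier, attribute, value FROM text_values
) AS all_values
GROUP BY identifier
""").fetchone()
print(row)
```

Note the cost of this: every value comes back as text, so the typed storage
is lost again at the point where the client actually reads the data.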

It all seems to be getting very messy - simply to overcome a column/row
size limitation. Surely there must be an elegant way to do something like
this?

thanx very much for any input.

regards,
-randall sell
Jul 20 '05 #1