Problem Inserting Unique Rows into a Table

I'm using SQL Express 2005.

I have two tables, TABLEA and TABLEB. TABLEA has the following:

col1
-----
NAMEAAA
NAMEAAA
NAMEBBB
NAMEBBB
NAMECCC
NAMECCC
NAMECCC
NAMEDDD

I want to insert rows into TABLEB so that:

fld1 fld2 fld3 fld4 fld5
---- ----- ---- ---- ----
NAMEAAA 0 NULL 0.0000 NULL
NAMEBBB 0 NULL 0.0000 NULL
NAMECCC 0 NULL 0.0000 NULL
NAMEDDD 0 NULL 0.0000 NULL
=========================================================
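In case it helps anyone reproduce this, here is roughly how things are set up (a sketch; TABLEB's exact column types are my guesses, apart from the identity key mentioned below):

CREATE TABLE TABLEA (col1 VARCHAR(50));

CREATE TABLE TABLEB (
    id   INT IDENTITY(1,1) PRIMARY KEY,  -- identity column as the key
    fld1 VARCHAR(50),
    fld2 INT,
    fld3 VARCHAR(50) NULL,
    fld4 DECIMAL(10,4),                  -- displays as 0.0000 above
    fld5 VARCHAR(50) NULL
);

-- The eight sample rows (SQL Server 2005 has no multi-row VALUES syntax):
INSERT INTO TABLEA (col1)
SELECT 'NAMEAAA' UNION ALL SELECT 'NAMEAAA'
UNION ALL SELECT 'NAMEBBB' UNION ALL SELECT 'NAMEBBB'
UNION ALL SELECT 'NAMECCC' UNION ALL SELECT 'NAMECCC'
UNION ALL SELECT 'NAMECCC' UNION ALL SELECT 'NAMEDDD';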

If I try the following, 8 rows get created in TABLEB:

INSERT INTO TABLEB
([fld1]
,[fld2]
,[fld3]
,[fld4]
,[fld5])
SELECT col1,0,NULL,0,NULL
FROM TABLEA
WHERE NOT EXISTS (SELECT * FROM TABLEB
                  WHERE TABLEB.fld1 = TABLEA.col1);

If instead I try the following, 8 rows still get created:

INSERT INTO TABLEB
([fld1]
,[fld2]
,[fld3]
,[fld4]
,[fld5])
SELECT col1,0,NULL,0,NULL
FROM TABLEA
WHERE col1 NOT IN (SELECT fld1 FROM TABLEB);
=========================================================

If I put the word DISTINCT before "col1" in my SELECT statement, I get the error, "The text, ntext, or image data type cannot be selected as DISTINCT". If I remove the "col1,0,NULL,0,NULL" from the SELECT statement, I get the error, "The select list for the INSERT statement contains fewer items than the insert list".

I would like to mention that TABLEB has an identity column as its key.

Also, if I run the above twice, the second run puts no additional rows in TABLEB. But if I then delete the 3 NAMECCC rows from TABLEB and rerun the statement, it inserts NAMECCC, NAMECCC, NAMECCC all over again!

If I remove the WHERE clause, and run the above twice, then it puts 16 rows in TABLEB.

It's almost as if the SQL language is incapable of perceiving duplicates that have been added during the session that is running.
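To make this easy to reproduce, here is the exact sequence I'm describing (a sketch; it assumes TABLEB starts out empty):

-- First run: TABLEB is empty, so all 8 TABLEA rows pass the filter,
-- duplicates included.
INSERT INTO TABLEB ([fld1],[fld2],[fld3],[fld4],[fld5])
SELECT col1,0,NULL,0,NULL
FROM TABLEA
WHERE NOT EXISTS (SELECT * FROM TABLEB WHERE TABLEB.fld1 = TABLEA.col1);

-- Second run of the same statement: every name is now in TABLEB,
-- so 0 rows are inserted.

-- But delete the duplicates for one name...
DELETE FROM TABLEB WHERE fld1 = 'NAMECCC';

-- ...and rerunning the INSERT brings back all three NAMECCC rows.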
Mar 10 '08 #1
ck9663
What's the datatype of your COL1?

What I usually do with this kind of error is try to isolate the problem. For some experts this may be simple, but I'll share the technique I use.

First, try to run your SELECT statement by itself:

SELECT DISTINCT col1,0,NULL,0,NULL
FROM TABLEA
WHERE TABLEA.col1 NOT IN (SELECT TABLEB.fld1 FROM TABLEB)
If that runs successfully, the problem is not in your SELECT statement.

Second, try inserting the result:

INSERT INTO TABLEB
([fld1],[fld2],[fld3],[fld4],[fld5])
SELECT DISTINCT col1,0,NULL,0,NULL
FROM TABLEA
WHERE TABLEA.col1 NOT IN (SELECT TABLEB.fld1 FROM TABLEB)

If you get an error here, it's not in your SELECT; it's in your INSERT.

Now try hardcoding the values:

INSERT INTO TABLEB
([fld1],[fld2],[fld3],[fld4],[fld5])
VALUES ('test',0,NULL,0,NULL)

If that insert fails, the problem lies somewhere other than your query, and you'll need a more detailed analysis.

This is an isolation technique. I usually keep at it until I've perfected the command I'm using. Some INSERT statements in T-SQL are simple, while others can look complicated, so work through it until you've nailed down the syntax for whatever you're using.
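One more thing: if DISTINCT fails because the column really is text or ntext, a common workaround is to cast it to varchar first. This is just a sketch; the 255 length is an assumption, so size it to your data:

-- DISTINCT is not allowed on text/ntext, but it works on varchar.
SELECT DISTINCT CAST(col1 AS VARCHAR(255)) AS col1, 0, NULL, 0, NULL
FROM TABLEA
WHERE CAST(col1 AS VARCHAR(255)) NOT IN (SELECT fld1 FROM TABLEB)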

Good luck.

-- CK
Mar 10 '08 #2
I will play around with this technique. The data type in question is VARCHAR. I assumed the errors I was getting were normal; I brought them up to point out the inflexibility and inscrutability of the language, and to reinforce the importance of finding a WHERE clause that would work, since my options for doing it another way are apparently limited. After searching Google for hours and finding the same simplistic (and apparently inaccurate) examples over and over, I finally stumbled on a clause that seems to do the trick as far as inserting only unique values goes (it isn't intuitively clear why it works, or even what it does). I replaced my WHERE clause with the following:

GROUP BY col1
HAVING COUNT(*) > 0;
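So the full statement that finally worked looks like this (note that the WHERE clause is gone entirely):

INSERT INTO TABLEB
([fld1],[fld2],[fld3],[fld4],[fld5])
SELECT col1,0,NULL,0,NULL
FROM TABLEA
GROUP BY col1      -- collapses duplicate names within TABLEA
HAVING COUNT(*) > 0;

As best I can tell, GROUP BY col1 collapses the duplicates within TABLEA itself, which the WHERE clause never did: NOT EXISTS and NOT IN only compare against rows already in TABLEB when the statement starts, and never compare TABLEA's rows to each other. The HAVING COUNT(*) > 0 part appears to be redundant, since every group has at least one row. And since the WHERE clause is gone, running this twice would presumably insert everything a second time, so keeping the NOT EXISTS check alongside the GROUP BY seems like the safer combination.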
Mar 11 '08 #3


