
SubQuery or Temp Table?

We were trying to remove duplicates and came up with two solutions.
One solution is similar to the one found in a book called "Advanced
Transact-SQL for SQL Server 2000" by Ben-Gan & Moreau. This solution
uses temp tables for removing duplicates. A co-worker created a
different solution that also removes duplicates, but it uses
subqueries instead of temp tables.

Theoretically, which solution would result in faster performance with
large tables? Would using temp tables perform faster when the source
table has 100,000 records, for example, or would the subquery run
more quickly in that situation?

Jul 23 '05 #1
On 11 Apr 2005 08:07:42 -0700, im*******************@yahoo.com wrote:
>We were trying to remove duplicates and came up with two solutions.
>One solution is similar to the one found in a book called "Advanced
>Transact-SQL for SQL Server 2000" by Ben-Gan & Moreau. This solution
>uses temp tables for removing duplicates. A co-worker created a
>different solution that also removes duplicates, but it uses
>subqueries instead of temp tables.
>
>Theoretically, which solution would result in faster performance with
>large tables? Would using temp tables perform faster when the source
>table has 100,000 records, for example, or would the subquery run
>more quickly in that situation?


Hi imani,

That question is impossible to answer without any information about
your table structure or the actual queries you use. Even with that
knowledge, the best answer to "which one performs best" is usually "test
them both in your environment, on your hardware and against your data".
The speed of queries depends on many factors; there is no generic
answer.

But the real question here is: why would you care? Cleaning duplicates
should always be a one-time operation - typically the kind of operation
where development time is much more important than execution time. Just
run one of your queries and be done with it, then proceed to the really
important issue: take steps to ensure you'll never have to do it again.
(No, wait - reverse that: FIRST take steps to prevent new duplicates,
then take out the existing ones).

For regular tables, the way to prevent duplicates is to find the natural
key and declare it as either PRIMARY KEY or UNIQUE.
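
For example, if a hypothetical Customers table had Email as its natural
key (the names here are made up purely for illustration), the constraint
could be added like this (assuming the existing duplicates have already
been cleaned out, or the statement will fail):

-- Reject any future row whose Email already exists in the table.
ALTER TABLE Customers
    ADD CONSTRAINT UQ_Customers_Email UNIQUE (Email);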

If you are dealing with a staging table that's used for a data import
where you receive duplicates beyond your control, then you might want to
create a UNIQUE INDEX with the IGNORE_DUP_KEY option. Absolutely *NOT*
recommended for normal tables, but for this specific situation (import
of data known to have duplicates), it might be useful. You can read
about it in Books Online. Remember that IGNORE_DUP_KEY can result in
loss of data, and that you can't control WHICH of the duplicate rows is
dropped.
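
As a rough sketch only (the staging table and column names below are
invented for illustration), the SQL Server 2000 syntax is along these
lines:

-- Staging table for imports that are known to contain duplicates.
-- Rows that would violate the unique index are discarded with a
-- "Duplicate key was ignored." warning instead of failing the insert.
CREATE UNIQUE INDEX UQ_StagingCustomers_Email
    ON StagingCustomers (Email)
    WITH IGNORE_DUP_KEY;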

Best, Hugo
--

(Remove _NO_ and _SPAM_ to get my e-mail address)
Jul 23 '05 #2
The advantage of subqueries over temp tables is that the intermediate
results do not have to be written to disk as long as there is enough
internal memory. This saves (expensive) I/O.

Temp tables have the advantage that they can be indexed, and that you
can remove duplicates in batches. Both techniques can greatly benefit
your operation.
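
Just to illustrate the temp table style (a sketch with made-up names,
assuming a hypothetical Orders table whose duplicate rows are exact
copies of each other and which has no IDENTITY column):

-- Copy one instance of each row into a temp table, then reload.
SELECT DISTINCT * INTO #OrdersClean FROM Orders;
DELETE FROM Orders;   -- or TRUNCATE TABLE, if constraints and permissions allow
INSERT INTO Orders SELECT * FROM #OrdersClean;
DROP TABLE #OrdersClean;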

But as Hugo noted, you have not posted enough information to really
answer the question. It *will* depend on your situation: the hardware,
the table sizes, the query, and possibly even the SQL Server version
and edition.

HTH,
Gert-Jan
"im************ *******@yahoo.c om" wrote:

>We were trying to remove duplicates and came up with two solutions.
>One solution is similar to the one found in a book called "Advanced
>Transact-SQL for SQL Server 2000" by Ben-Gan & Moreau. This solution
>uses temp tables for removing duplicates. A co-worker created a
>different solution that also removes duplicates, but it uses
>subqueries instead of temp tables.
>
>Theoretically, which solution would result in faster performance with
>large tables? Would using temp tables perform faster when the source
>table has 100,000 records, for example, or would the subquery run
>more quickly in that situation?

Jul 23 '05 #3
As a very gross generalization, use derived tables and subquery
expressions. The temp table model in SQL Server is highly proprietary,
so it will not port. A temp table is a separate object that has to be
materialized. A derived table gets optimized as a part of the whole
query, so it might not need to be materialized and processed as a
separate step. Unless you add them, a temp table has no constraints,
indexes, etc.
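
As a sketch of the subquery style (again with made-up names: a
hypothetical Orders table with a surrogate key OrderID, where a
"duplicate" is any repeated combination of CustomerID, OrderDate and
Amount), you keep one row per group and delete the rest:

-- Keep the row with the lowest OrderID in each duplicate group
-- and delete all the others.
DELETE FROM Orders
WHERE OrderID NOT IN
      (SELECT MIN(OrderID)
       FROM Orders
       GROUP BY CustomerID, OrderDate, Amount);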

Jul 23 '05 #4

