Stored procedure and SQL Server Agent job

I have a stored procedure that transfers/processes data from one
table to two different tables. The destination tables have a unique
constraint because the source table at times has duplicate records,
and the constraint prevents the duplicates from being carried into
the reports. When the stored procedure (which includes a cursor) is
executed through Query Analyzer, it runs fine and reports an error
every time it hits a duplicate value (as expected). It moves all the
unique values from the source to the destination tables.

However, if the same stored procedure is run as a job step in SQL
Server Agent, the behaviour is different. The job fails when it sees
the error and ends up skipping records or terminating the procedure
altogether. E.g. if there are 100 records in the source table with 10
duplicates, the stored procedure run through Query Analyzer will copy
the 90 unique records to the destination tables, but when run from
SQL Server Agent it copies just 10-15 records.

Any idea why this happens?

Feb 5 '07 #1
Never mind...

Added "SET NOCOUNT ON" to the SP, and the task now works just fine.

Feb 5 '07 #2
I see that you have resolved the issue, but generally I think it's
better to use NOT EXISTS to avoid the duplicate error in the first
place. You mention a cursor, but if the sole purpose is to copy rows
from one table to another, running a cursor is not very efficient.
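Something along these lines; I'm inventing the table and column names, since you have not posted your schema:

-- Set-based copy; table and column names are invented for this example.
INSERT INTO dbo.ReportTable (KeyCol, SomeData)
SELECT s.KeyCol, s.SomeData
FROM   dbo.SourceTable s
WHERE  NOT EXISTS (SELECT *
                   FROM   dbo.ReportTable r
                   WHERE  r.KeyCol = s.KeyCol)

Note that if the source table can contain duplicates within itself, you also need a DISTINCT or a GROUP BY on the key, or that single INSERT will still trip over the unique constraint.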
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Feb 5 '07 #3
Would NOT EXISTS add any overhead to the query? The application
logging data to SQL Server is using OLE DB and, as a test, is
currently logging nearly 10,000 rows into the table every minute. I
was wondering if NOT EXISTS would actually slow down the INSERT.

Feb 6 '07 #4
I am a little confused here. In your original post you talked about an
Agent job, and you even mentioned a cursor. Now you are talking about
an application that logs data.

But let's say that we are talking about application logging now. Then I
am curious: how come the application is attempting to log duplicates?

As for the question as such, NOT EXISTS may add some overhead, but so
will an error that has to be rolled back.

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Feb 6 '07 #5
The application logs "raw" data. The SQL Agent job (stored procedure)
adds attributes to that data and moves it to "report" tables. The
cursor was used to select the right attributes based on the data
logged.

The application is logging data from a hardware device (a PLC) that at
times generates data faster than the application can accept, hence the
duplicates: they are written to avoid any data loss during the timing
issue. There is unfortunately no way to control the duplicate problem
at the application level. Since the device is generating data so fast,
we need to just dump the raw data in a table and then copy it for
reporting purposes.
Feb 6 '07 #6
I can understand that you log everything in the raw tables. That
certainly seems like the best strategy. And my suggestion was not
that you have the WHERE NOT EXISTS in this place, but rather in
the Agent job.

As for whether the WHERE NOT EXISTS would be too costly in the Agent
job, I don't think so. After all, if you can afford a cursor, you
don't appear to be in a hurry...
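If the attribute lookup is the only thing the cursor does, that lookup can usually be folded into the same statement. A sketch, with the lookup table and all names invented, since I don't know your schema:

-- All object and column names are invented for illustration.
INSERT INTO dbo.ReportTable (KeyCol, SomeData, Attribute)
SELECT s.KeyCol, s.SomeData, a.Attribute
FROM   dbo.RawLogTable s
JOIN   dbo.AttributeLookup a ON a.LookupCode = s.LookupCode
WHERE  NOT EXISTS (SELECT *
                   FROM   dbo.ReportTable r
                   WHERE  r.KeyCol = s.KeyCol)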
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Feb 6 '07 #7
I'll give it a shot. Is it generally more efficient to check for
duplicates through T-SQL, like with WHERE NOT EXISTS, or to use
constraints on the table? I realize the latter is a little painful,
as it clogs the error logs for the job history.

Thanks, Vikram

Feb 6 '07 #8
If you want to prevent duplicates at all, you should use constraints.
And then you may use WHERE NOT EXISTS to avoid the errors. Note that
if you only use WHERE NOT EXISTS but do not have any index at all,
performance will be awful.

As for the raw log table, I would not put any constraint on it that
prevents duplicates, nor any WHERE NOT EXISTS. It seems to me that it
may be better to permit the application to log whatever it logs and
weed out the duplicates later.
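For instance, a unique constraint on the report table (invented names again) both enforces the rule and gives the NOT EXISTS check a unique index to seek on:

-- Invented names; the constraint also creates a unique index,
-- which is what keeps the NOT EXISTS check cheap.
ALTER TABLE dbo.ReportTable
   ADD CONSTRAINT UQ_ReportTable_KeyCol UNIQUE (KeyCol);

The raw log table stays as it is, with no such constraint on it.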

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Feb 7 '07 #9
Thanks Erland - I noticed the performance is much better with WHERE
NOT EXISTS, and the SQL job completes much more cleanly.
Feb 7 '07 #10
