Does anyone have any idea how to do this? I want to pass a select command a
parameter and just use that as a LIKE/IN comparison with my SQL data.
Thanks
Rob Oldfield wrote: Does anyone have any idea how to do this? I want to pass a select command a parameter and just use that as a LIKE/IN comparison with my SQL data.
Thanks
You pass it in the same way that you would for any other INSERT or
UPDATE statement.
SELECT * FROM MYTABLE
WHERE ID IN ( @param1, @param2 )
"John Bailo" <ja*****@texeme.com> wrote in message
news:Pr******************************@speakeasy.net... Rob Oldfield wrote: Does anyone have any idea how to do this? I want to pass a select
command a parameter and just use that as a LIKE/IN comparison with my SQL data.
Thanks
You pass it in the same way that you would for any other INSERT or UPDATE statement.
SELECT * FROM MYTABLE WHERE ID IN ( @param1, @param2 )
Hmm. Yes, that works, but for INs that method means that I have to know how
many items I'm going to have in the list (though I guess it'd be possible to
use 'too many' in the setup and just set any redundant ones to an impossible
value). What I was looking for was more along the lines of select * from
mytable where id in (@listparams) so that I can set @listparams to be the
single variable 'a','b','c' or whatever. I've just tried that and it
doesn't work.
And on the LIKEs, if I enter a criterion of LIKE %@param1% (if that's turned
into a hyperlink, please ignore it) in Visual Studio then the SQL statement
comes out as LIKE N'%@p1% (ditto) and I don't even get asked for a
parameter value as it's just taking it as text.
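The usual way round the LIKE problem is to keep the plain placeholder in the command text and put the % wildcards into the parameter *value* instead. A minimal sketch (the table, column and search term are made up; the SqlCommand wiring is left as comments since it needs a live connection):

```csharp
using System;

// Keep the placeholder, not the wildcards, in the command text.
string sql = "SELECT * FROM MyTable WHERE Name LIKE @param1";

// Put the % wildcards into the parameter *value* instead.
string searchTerm = "smith";
string paramValue = "%" + searchTerm + "%";

Console.WriteLine(sql);
Console.WriteLine(paramValue); // %smith%

// With a live connection this would become:
//   var cmd = new SqlCommand(sql, connection);
//   cmd.Parameters.AddWithValue("@param1", paramValue);
//   using var reader = cmd.ExecuteReader();
```

That way the value is still passed as a real parameter rather than being pasted into the SQL text.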
Rob Oldfield wrote: And on the LIKEs, if I enter a criterion of LIKE %@param1% (if that's turned into a hyperlink, please ignore it) in Visual Studio then the SQL statement comes out as LIKE N'%@p1% (ditto) and I don't even get asked for a parameter value as it's just taking it as text.
What you really want is a SQL statement builder class that:
(1) Takes in an array of values and a SQL parameter collection object.
(2) Uses StringBuilder and appends an equivalent number of @param names
(@param1, @param2, ...)
(3) Then, for each string, it .Add()s a parameter to the parameter
collection.
(4) Passes back the parameter collection.
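Sketched in C#, steps (1)-(4) might look something like this. The names are illustrative, and a simple (name, value) list stands in for the SQL parameter collection, which a real version would .Add() to instead:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// Builds "@param1, @param2, ..." for an IN clause and pairs each
// placeholder name with its value.
static (string InList, List<(string Name, object Value)> Parameters)
    BuildInClause(object[] values)
{
    var sb = new StringBuilder();
    var parameters = new List<(string, object)>();
    for (int i = 0; i < values.Length; i++)
    {
        string name = "@param" + (i + 1);
        if (i > 0) sb.Append(", ");
        sb.Append(name);                   // step (2): append @param names
        parameters.Add((name, values[i])); // step (3): one parameter per value
    }
    return (sb.ToString(), parameters);    // step (4): pass the collection back
}

var (inList, ps) = BuildInClause(new object[] { "a", "b", "c" });
string sql = "SELECT * FROM MyTable WHERE Id IN (" + inList + ")";
Console.WriteLine(sql);
// SELECT * FROM MyTable WHERE Id IN (@param1, @param2, @param3)
```

The values still travel as parameters, so the dynamic part of the SQL is only the list of placeholder names, not the data itself.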
-- http://blog.360.yahoo.com/manfrommars_43 http://texeme.com
"John Bailo" <ja*****@texeme.com> wrote in message
news:Me********************@speakeasy.net... Rob Oldfield wrote:
And on the LIKEs, if I enter a criterion of LIKE %@param1% (if that's turned into a hyperlink, please ignore it) in Visual Studio then the SQL statement comes out as LIKE N'%@p1% (ditto) and I don't even get asked for a parameter value as it's just taking it as text.
What you really want is a SQL statement builder class that:
(1) Takes in an array of values and a SQL parameter collection object.
(2) Uses StringBuilder and appends an equivalent number of @param names (@param1, @param2, ...)
(3) Then, for each string, it .Add()s a parameter to the parameter collection.
(4) Passes back the parameter collection.
OK. I knew that I could just rebuild the SQL statement on the fly, but was
hoping to avoid it. Thanks for the advice.
Rob Oldfield wrote: OK. I knew that I could just rebuild the SQL statement on the fly, but was hoping to avoid it. Thanks for the advice.
Well, an alternative would be to use a foreach loop instead of IN.
So you would just use one @param per ExecuteQuery, take the results and
put them into an array.
foreach( string s in myArrayOfLikeValues)
{
//@param = s
// ExecuteQuery
// take record, or value, add to an array
}
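Fleshed out a little, with a stand-in Query method in place of the real ExecuteQuery call (which would need a live connection), the loop might look like:

```csharp
using System;
using System.Collections.Generic;

// Stand-in for running "SELECT ... WHERE Name LIKE @param" with @param = pattern.
static string[] Query(string pattern) =>
    pattern == "%a%" ? new[] { "alpha", "gamma" } : new string[0];

string[] myArrayOfLikeValues = { "%a%", "%z%" };
var results = new List<string>();

foreach (string s in myArrayOfLikeValues)
{
    // @param = s, ExecuteQuery, then add each returned value to the array
    results.AddRange(Query(s));
}

Console.WriteLine(string.Join(", ", results)); // alpha, gamma
```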
--
The Texeme Construct, http://www.texeme.com
360, http://360.yahoo.com/manfrommars_43
John Bailo <ja*****@texeme.com> wrote: Rob Oldfield wrote:
OK. I knew that I could just rebuild the SQL statement on the fly, but was hoping to avoid it. Thanks for the advice.
Well an alternative would be to use a foreach loop instead of IN
So you would just use one @param per ExecuteQuery and take the results and put it into an Array.
foreach( string s in myArrayOfLikeValues) { //@param = s
// ExecuteQuery
// take record, or value, add to an array
}
That's certainly a potential solution if you don't care in the
slightest about performance - but generally people don't like making N
queries where 1 would do...
--
Jon Skeet - <sk***@pobox.com> http://www.pobox.com/~skeet Blog: http://www.msmvps.com/jon.skeet
If replying to the group, please do not mail me too
http://support.microsoft.com/default...b;en-us;555167
"Rob Oldfield" <bl**@blah.com> wrote in message
news:%2****************@TK2MSFTNGP12.phx.gbl... Does anyone have any idea how to do this? I want to pass a select command a parameter and just use that as a LIKE/IN comparison with my SQL data.
Thanks
Interesting. Those 'More information' headings at the bottom - are they
supposed to be hyperlinks? And do you know where to?
"W.G. Ryan eMVP" <Wi*********@gmail.com> wrote in message
news:OL*************@TK2MSFTNGP15.phx.gbl... http://support.microsoft.com/default...b;en-us;555167 "Rob Oldfield" <bl**@blah.com> wrote in message news:%2****************@TK2MSFTNGP12.phx.gbl... Does anyone have any idea how to do this? I want to pass a select
command a parameter and just use that as a LIKE/IN comparison with my SQL data.
Thanks
Jon Skeet [C# MVP] wrote: That's certainly a potential solution if you don't care in the slightest about performance - but generally people don't like making N queries where 1 would do...
Yes, I see your point...but one could question that on the SQL side as
well. I mean, under the covers, what is the SQL Server executable
really doing when you issue a command like LIKE or IN? "SQL Server" is
all C++ code ultimately, and it's probably parsing the LIKE/IN clause and
creating multiple calls to the database...so you're stuck with multiple
calls anyway. LIKE and IN are very slow, poorly-executing SQL
constructs at the outset, and whenever you can avoid SQL syntax that
uses IN you should.
So, in essence, by creating C# code you're doing multiple calls on the
client side. Also, you could thread these calls, or maybe batch them
up or write a stored procedure that takes a delimited array as a value
(as your other post suggests -- although it seems very poorly worded).
John Bailo <ja*****@texeme.com> wrote: That's certainly a potential solution if you don't care in the slightest about performance - but generally people don't like making N queries where 1 would do... Yes, I see your point...but one could question that on the SQL side as well. I mean, under the covers, what is the SQL server executable really doing when you issue a command like LIKE IN? "SQL Server" is all c++ code ultimately, and its probably parsing the LIKE IN clause and creating multiple calls to the database...
I very, very much doubt that. Databases are far smarter than that - if
you give SQL Server (or any other reasonable database) a decent amount
of information, it's likely to do *much* better than keeping that
information to yourself and making several calls.
For instance, if it needs to do a table scan, it can do that scan
*once*, checking (very quickly using a hash, probably) for each row
whether or not it's in any of the "IN" parameters. That's far quicker
than doing N table scans, having to drag each row into cache each time.
Of course, you need to add to that the difference in network
performance, latency etc.
Additionally, in terms of semantics, it can be a real pain issuing
multiple calls - if you're trying to get the distinct values for
another column, or sorting, or anything like that, you basically end up
having to do it client-side.
so you're stuck with multiple calls anyway. LIKE and IN are very slow, poorly-executing SQL constructs at the outset, and whenever you can avoid SQL syntax that uses IN you should.
Sure, but when you need it, you need it. It's like saying to avoid
joins where they're not needed - that doesn't mean you should fetch
whole tables and do the joins manually on the client side instead!
So, in essence, by creating C# code you're doing multiple calls on the client side. Also, you could thread these calls, or maybe batch them up or write a stored procedure that takes a delimited array as a value (as your other post suggests -- although it seems very poorly worded).
I'm not sure which post you're referring to, but I never suggested
taking a delimited array as a stored proc parameter. I would suggest
building the "IN" parameter in SQL:
... WHERE Foo IN (@param1, @param2, @param3, etc.)
- the bit in brackets needs to be built up dynamically, but that's far
from difficult.
I can see no advantages to issuing multiple calls in this situation
other than the *tiny* advantage of not having to build the string up
dynamically. I can see *lots* of advantages to using a dynamically
generated "IN" clause.
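For what it's worth, building that bracketed bit really is close to a one-liner in C# (the Foo column and the values here are made up):

```csharp
using System;
using System.Linq;

string[] values = { "a", "b", "c" };

// One placeholder per value: @param1, @param2, @param3
string placeholders = string.Join(", ",
    Enumerable.Range(1, values.Length).Select(i => "@param" + i));
string sql = "SELECT * FROM MyTable WHERE Foo IN (" + placeholders + ")";

Console.WriteLine(sql);
// Each value would then be attached as a parameter, e.g.
// cmd.Parameters.AddWithValue("@param" + i, values[i - 1]) for i = 1..3.
```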
Jon Skeet [C# MVP] wrote: I can see no advantages to issuing multiple calls in this situation other than the *tiny* advantage of not having to build the string up dynamically. I can see *lots* of advantages to using a dynamically generated "IN" clause.
I've been finding that running SQL statements on threads simultaneously
can have enormous speed advantages (much of my work is with DB2 on an
iSeries).
Using my method, I could thread the three LIKEs, not use the IN at
all, and run my statements simultaneously.
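As a sketch of that idea in modern C# -- the per-pattern query is simulated with an in-memory row list, since a real version would open its own connection and run a SqlCommand inside each iteration:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

// Simulated "SELECT ... WHERE Name LIKE pattern" call.
static string[] RunLikeQuery(string pattern)
{
    var rows = new[] { "alpha", "beta", "gamma" };
    return rows.Where(r => r.Contains(pattern.Trim('%'))).ToArray();
}

var patterns = new[] { "%al%", "%et%", "%mm%" };
var results = new ConcurrentBag<string>();

// Run the three LIKE queries simultaneously instead of one IN query.
Parallel.ForEach(patterns, p =>
{
    foreach (var row in RunLikeQuery(p))
        results.Add(row);
});

Console.WriteLine(string.Join(", ", results.OrderBy(r => r)));
// alpha, beta, gamma
```

A ConcurrentBag is used because the loop body runs on several threads at once.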
"John Bailo" <ja*****@texeme.com> wrote in message
news:43**************@texeme.com... Jon Skeet [C# MVP] wrote: I can see no advantages to issuing multiple calls in this situation other than the *tiny* advantage of not having to build the string up dynamically. I can see *lots* of advantages to using a dynamically generated "IN" clause.
I've been finding that running sql statements on threads simultaneously can have enormous speed advantages ( much of my work is with db2 on an iseries).
Using my method, I could thread the three LIKES, and not use the IN at all and run my statements simultaneously.
Sorry to butt back in to a thread that's going way above my head, but why
not just run some testing? You each write the code your way, and pass the
result to each other so that both can be tested in different environments?
John Bailo <ja*****@texeme.com> wrote: Jon Skeet [C# MVP] wrote: I can see no advantages to issuing multiple calls in this situation other than the *tiny* advantage of not having to build the string up dynamically. I can see *lots* of advantages to using a dynamically generated "IN" clause.
I've been finding that running sql statements on threads simultaneously can have enormous speed advantages ( much of my work is with db2 on an iseries).
Using my method, I could thread the three LIKES, and not use the IN at all and run my statements simultaneously.
And unless you've got 3 processors on the database and nothing else
using it, that could easily still run slower.
Rob Oldfield wrote: Sorry to butt back in to a thread that's going way above my head, but why not just run some testing? You each write the code your way, and pass the result to each other so that both can be tested in different environments?
Because that would -- like, work, dude.
Jon Skeet [C# MVP] wrote: And unless you've got 3 processors on the database and nothing else using it, that could easily still run slower.
What I really need to do is to go to an open source database like MySQL and
check out the parsing routines as they are mapped to the actual file
scans.
I've always been really /curious/ as to what the magic is of parsing SQL
syntax and turning it back into regular code...which is what I'm assuming
is all that a database *engine* does.
In some sense, since these days 99 percent of SQL code is done within a
scripting or bytecode language (C#, VBS, Java), what we do is take C-type
code, wrap SQL statements into it, message it to a database, which then
parses the SQL code and executes C++ statements!
The real efficiency would be to get rid of the SQL interpreter entirely and
just run c# code against the data files. To me that would be the fastest.
John,
I had errors with my data connection, so I'm jumping into this thread just to
react to your message.
I don't like (hate is the better word) the SQL expression language. However,
in my opinion you, like a lot of people, are trying to squeeze performance out
of the small amount of time needed to compile that code.
Most of the time spent reading data from a server over a data line to a
client is not the compiling part, so I would not expect too much from
writing it directly in whatever other code.
Just my thought,
:-)
Cor
John Bailo <ja*****@texeme.com> wrote: And unless you've got 3 processors on the database and nothing else using it, that could easily still run slower.
What I really need to do, is to go to an open source database like mySql and check out the parsing routines as they are mapped to the actually file scans.
I've always been really /curious/ as to what the magic is of parsing SQL syntax and turning it back into regular code...which is what I'm assuming is all that a database *engine* does.
In some sense, since these days, 99 percent of SQL code is done within a scripting or bytecode language ( c#, vbs, java ) what we do is take c-type code, wrap SQL statements into it, message it to a database, which then parses the SQL code, and executes c++ statements!
The real efficiency would be to get rid of the SQL interpreter entirely and just run c# code against the data files. To me that would be the fastest.
No. Bear in mind that the C# or Java code usually isn't running on
the same machine as the database, so you've got a separation anyway -
given that separation, it makes sense to use something which is nearer
the metal. Much as I love C#, I wouldn't use it to try to implement a
database on the same scale as SQL Server or Oracle.
Cor Ligthert [MVP] wrote: Most of the time spent reading data from a server over a data line to a client is not the compiling part, so I would not expect too much from writing it directly in whatever other code.
Good argument...however, I would turn it around on you.
The reason that it doesn't take much time is that -- as in this argument
-- we've conditioned ourselves to warp our code to encapsulate our
requests in SQL.
Whereas, much of the time, I just want to grab individual data bits on
threads and utilize them...not create a statement, then batch it up, then
get back a bunch of data, then parse it (ds(0), ds(1), etc.).
If I could go direct to a more flexible data store/object store that
more closely mapped to what I do in c# and java I think I would find
greater performance.
I would much prefer to make my business logic in an OO language as well.
I think SQL 2005 addresses these needs, but really any database that
lets you strip out the SQL interpreter would work as well.
From what I know, the really big business databases aren't relational;
they're big file systems that are manipulated by COBOL.
Jon Skeet [C# MVP] wrote: No. Bear in mind that the C# or Java code usually isn't running on the same machine as the database, so you've got a separation anyway - given that separation, it makes sense to use something which is nearer the metal. Much as I love C#, I wouldn't use it to try to implement a database on the same scale as SQL Server or Oracle.
But that's not what I said.
I'm not trying to replace the dbms with another.
I want to eliminate it entirely, read my data directly as files into C#,
and manipulate it there.
What I need is a data server as simple as an HTTP server that can take raw
requests and let me manipulate the data in the middleware.
John Bailo <ja*****@texeme.com> wrote: No. Bear in mind that the C# or Java code usually isn't running on the same machine as the database, so you've got a separation anyway - given that separation, it makes sense to use something which is nearer the metal. Much as I love C#, I wouldn't use it to try to implement a database on the same scale as SQL Server or Oracle.
But that's not what I said.
I'm not trying to replace the dbms with another.
I want to eliminate it entirely, read my data directly as files into C#, and manipulate it there.
What I need is a data server as simple as an HTTP server that can take raw requests and let me manipulate the data in the middleware.
Let me get this clear: will you have a server element separate from the
client element or not? If not, then you've eliminated one aspect which
is appealing to many, many people.
If you *are* going to separate them, then either you're suggesting
shunting all the data from one place to another (which would be much
slower in almost all cases than processing the query "in place") or you
*are* effectively replacing the DBMS with another.
It would help if you'd spell out your suggested architecture a bit more
clearly. I would be very surprised if you've got an architecture which
is significantly better than the one(s) the rest of the world uses
though...
Jon Skeet [C# MVP] wrote: It would help if you'd spell out your suggested architecture a bit more clearly. I would be very surprised if you've got an architecture which is significantly better than the one(s) the rest of the world uses though...
My architecture would basically encapsulate very simple file Open/Close
and scan operations that would message data via HTTP.
There would be no relational aspects.
All the analysis would be done back on a Smart Client...which would take
in needed data as files.
Then it would parse and map out the needed data.
Everything would be done using primitive datatypes -- no object data
until it's used for analysis or rendering.
John Bailo <ja*****@texeme.com> wrote: It would help if you'd spell out your suggested architecture a bit more clearly. I would be very surprised if you've got an architecture which is significantly better than the one(s) the rest of the world uses though...
My architecture would basically encapsulate very simple file Open/Close and scan operations that would message data via HTTP.
There would be no relational aspects.
All the analysis would be done back on a Smart Client...which would take in needed data as files.
Then it would parse and map out the needed data.
Everything would be done using primitive datatypes -- no object data until it's used for analysis or rendering.
Okay. So in order to process a table which is a gigabyte in size (which
is far from huge), you'd need to transfer the whole gigabyte across the
network, even if only a row or two matches the eventual query?
Um, no, that's not a good way to build a performant system.
John,
Kudzu and Frans had a long discussion about what I think you're talking
about now. http://groups.google.com/group/micro...5d2d117d4e5ea8
By the way, real COBOL mostly uses sequential and indexed sequential
files. It is not really a language for handling databases without some
extra expression language.
Cor
"John Bailo" <ja*****@texeme.com> schreef in bericht
news:43**************@texeme.com... Cor Ligthert [MVP] wrote:
Most of the time spent reading data from a server over a data line to a client is not the compiling part, so I would not expect too much from writing it directly in whatever other code.
Good argument...however, I would turn it around on you.
The reason that it doesn't take much time is that -- as in this argument -- we've conditioned ourselves to warp our code to encapsulate our requests in SQL.
Whereas, much of the time, I just want to grab individual data bits on threads and utilize them...not create a statement, then batch it up, then get back a bunch of data, then parse it (ds(0), ds(1), etc.).
If I could go direct to a more flexible data store/object store that more closely mapped to what I do in c# and java I think I would find greater performance.
I would much prefer to make my business logic in an OO language as well.
I think SQL 2005 addresses these needs, but really any database that lets you strip out the SQL interpreter would work as well.
From what I know, the really big business databases aren't relational; they're big file systems that are manipulated by COBOL.