Bytes | Software Development & Data Engineering Community
MSXML2.XMLHTTP

I wrote a small script that grabs two CSV files [links to the data files]
from a remote web site, parses them out and displays them in scrolling divs.
The first file has a little over 27k records; the second has fewer. It
retrieves the data pretty quickly, but it takes a while to write the page.

Is there a better alternative to this approach?
This is my page:
http://kiddanger.com/lab/getsaveurl.asp

This is the relevant code to retrieve the data:

function strQuote(strURL)
  dim objXML
  set objXML = CreateObject("MSXML2.ServerXMLHTTP")
  objXML.Open "GET", strURL, False
  objXML.Send
  strQuote = objXML.ResponseText
  set objXML = nothing
end function

I split the data into an array and then split that into a new array because
the delimiters are line feed and comma, respectively.

TIA...
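For reference, the split-then-split parsing described above looks roughly like this in JScript-style JavaScript (the other scripting flavor classic ASP supports); the CSV sample here is hypothetical, not the real report data:

```javascript
// Sketch of the nested-split parse: break the response on line feeds,
// then break each line on commas. The sample data is made up.
var csv = "domain,status\nexample.com,pendingDelete\nexample.net,redemptionPeriod";

function parseCsv(text) {
  var rows = text.split("\n");         // first delimiter: line feed
  var records = [];
  for (var i = 0; i < rows.length; i++) {
    records.push(rows[i].split(","));  // second delimiter: comma
  }
  return records;
}

var records = parseCsv(csv);
// records[0] is the header row; records[1][1] is "pendingDelete"
```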

--
Roland Hall
/* This information is distributed in the hope that it will be useful, but
without any warranty; without even the implied warranty of merchantability
or fitness for a particular purpose. */
Technet Script Center - http://www.microsoft.com/technet/scriptcenter/
WSH 5.6 Documentation - http://msdn.microsoft.com/downloads/list/webdev.asp
MSDN Library - http://msdn.microsoft.com/library/default.asp
Jul 22 '05 #1
Roland Hall wrote:
I wrote a small script that grabs two CSV files [links to the data
files] from a remote web site, parses them out and displays them in
scrolling divs. The first file has a little over 27k records; the
second has fewer. It retrieves the data pretty quickly, but it takes
a while to write the page.

Is there a better alternative to this approach?
This is my page:
http://kiddanger.com/lab/getsaveurl.asp

This is the relevant code to retrieve the data:

function strQuote(strURL)
dim objXML
set objXML = CreateObject("MSXML2.ServerXMLHTTP")
objXML.Open "GET", strURL, False
objXML.Send
strQuote = objXML.ResponseText
set objXML = nothing
end function

I split the data into an array and then split that into a new array
because the delimiters are line feed and comma, respectively.

TIA...


It's pretty tough to comment on this. You've identified the bottleneck as
the process of writing the data to the page, so the strQuote function is not
relevant, is it? What you do with the array contents seems to be more
relevant, at least to me.

Somebody (I think it might have been Chris Hohmann) posted an analysis of
different techniques for generating large blocks of html a few weeks ago
that you may find interesting.

Bob Barrows
--
Microsoft MVP -- ASP/ASP.NET
Please reply to the newsgroup. The email account listed in my From
header is my spam trap, so I don't check it very often. You will get a
quicker response by posting to the newsgroup.
Jul 22 '05 #2
"Bob Barrows [MVP]" wrote in message
news:uC**************@TK2MSFTNGP11.phx.gbl...
: Roland Hall wrote:
: > I wrote a small script that grabs two CSV files [links to the data
: > files] from a remote web site, parses them out and displays them in
: > scrolling divs. The first file has a little over 27k records; the
: > second has fewer. It retrieves the data pretty quickly, but it takes
: > a while to write the page.
: >
: > Is there a better alternative to this approach?
: > This is my page:
: > http://kiddanger.com/lab/getsaveurl.asp
: >
: > This is the relevant code to retrieve the data:
: >
: > function strQuote(strURL)
: > dim objXML
: > set objXML = CreateObject("MSXML2.ServerXMLHTTP")
: > objXML.Open "GET", strURL, False
: > objXML.Send
: > strQuote = objXML.ResponseText
: > set objXML = nothing
: > end function
: >
: > I split the data into an array and then split that into a new array
: > because the delimiters are line feed and comma, respectively.
: >
: > TIA...
: >
:
: It's pretty tough to comment on this. You've identified the bottleneck as
: the process of writing the data to the page, so the strQuote function is not
: relevant, is it? What you do with the array contents seems to be more
: relevant, at least to me.

Hi Bob. Thanks for responding.

Perhaps. I'm assuming the data is retrieved quickly, judging by the activity
light on my switch. I have not actually put timers in, which I guess would be
the next test.

:
: Somebody (I think it might have been Chris Hohmann) posted an analysis of
: different techniques for generating large blocks of html a few weeks ago
: that you may find interesting.

I searched in this NG for all of Chris's postings and didn't find anything.
Then I searched for the reference you made and didn't find anything that way
either. Here is my subroutine for parsing the data; perhaps someone will
notice something that will help speed it up.

sub strWrite(str)
  dim arr, i, arr2, j
  arr = split(str,vbLf)
  prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & _
      strURL & "</legend>")
  prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
  prt("<table style=""padding: 3px"">")
  for i = 1 to ubound(arr)
    arr2 = split(arr(i),",")
    if i = 1 then
      prt("<tr style=""font-weight: bold"">")
    else
      if i mod 2 = 0 then
        prt("<tr style=""background-color: #ddd"">")
      else
        prt("<tr>")
      end if
    end if
    for j = 0 to ubound(arr2)
      prt("<td>" & arr2(j))
    next
  next
  prt("</table>")
  prt("</div>")
  prt("</fieldset>")
end sub

These are the calls for the two files:

dim strURL
strURL = "http://neustar.us/reports/rgp/domains_in_rgp.csv"
strWrite strQuote(strURL)
strURL = "http://neustar.us/reports/rgp/domains_out_rgp.csv"
strWrite strQuote(strURL)

I made some changes to my buffer and some variables and it's noticeably
faster. It still takes about 4-5 seconds to parse the data, but I'm not sure
if that's all that bad for that amount of data.

I'm testing with two links, one on the Internet and one on my intranet. The
Internet link normally displays both almost simultaneously. The intranet page
displays the first file, then there's almost as much of a delay again before
the next one, which is what I expected.

http://kiddanger.com/lab/getsaveurl.asp Internet
http://netfraud.us/asp/rgpr.asp Intranet

I wonder whether it would be faster if I wrote everything to a single string
and then made only one write statement. Any ideas?
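The single-write idea can be sketched like this in JScript-style JavaScript: collect the fragments in an array and emit one string with a single join. The `send` callback here is a stand-in for Response.Write and is an assumption of this sketch:

```javascript
// Sketch: collect every HTML fragment in an array and emit one string
// with a single join, instead of one write call per fragment.
// `send` stands in for Response.Write; it is hypothetical.
function renderRows(lines, send) {
  var parts = [];
  for (var i = 0; i < lines.length; i++) {
    parts.push("<tr><td>" + lines[i].split(",").join("<td>"));
  }
  send(parts.join("\n"));   // one write instead of lines.length writes
}

var out = "";
renderRows(["a,b", "c,d"], function (s) { out = s; });
// out is "<tr><td>a<td>b\n<tr><td>c<td>d"
```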

--
Roland Hall
Jul 22 '05 #3
I added the record count to the legend, and now I know why the second one is
a lot faster: it has 1/10 the number of records.
Jul 22 '05 #4
Roland Hall wrote:
...Here is my subroutine for parsing the data and perhaps someone
will notice something that will help speed it up...

for i = 1 to ubound(arr)
arr2 = split(arr(i),",")
if i = 1 then
prt("<tr style=""font-weight: bold"">")
else
if i mod 2 = 0 then
prt("<tr style=""background-color: #ddd"">")
else
prt("<tr>")
end if
end if
for j = 0 to ubound(arr2)
prt("<td>" & arr2(j))
next
next


Have you tried using Replace() instead of split?

for i = 1 to ubound(arr)
if i = 1 then
prt("<tr style=""font-weight: bold"">")
else
if i mod 2 = 0 then
prt("<tr style=""background-color: #ddd"">")
else
prt("<tr>")
end if
end if

prt(Replace(arr(i),",","<td>"))
next
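Dave's Replace() trick maps directly to split/join in JScript-style JavaScript (which has no global string replace without a regex); the row below is a hypothetical example, not the real report data:

```javascript
// Turn one CSV row into table cells by replacing every comma with <td>.
// VBScript's Replace(arr(i), ",", "<td>") becomes split/join here
// (a /,/g regex replace would work equally well).
function rowToCells(line) {
  return "<td>" + line.split(",").join("<td>");
}

rowToCells("example.com,2004-12-13,pendingDelete");
// "<td>example.com<td>2004-12-13<td>pendingDelete"
```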

--
Dave Anderson

Unsolicited commercial email will be read at a cost of $500 per message. Use
of this email address implies consent to these terms. Please do not contact
me directly or ask me to contact you directly for assistance. If your
question is worth asking, it's worth posting.
Jul 22 '05 #5
Roland Hall wrote:
Somebody (I think it might have been Chris Hohmann) posted an
analysis of different techniques for generating large blocks of html
a few weeks ago that you may find interesting.


I searched in this NG for all of Chris' posting and didn't find
anything. Then I searched for the reference you made and didn't find
anything that way either.


Darn. I just tried to find it as well, and failed. ISTR that the consensus
was that adding the individual strings to an array and then using Join to
combine them was the fastest method. Combined with Dave's idea, you would
get something like this:

sub strWrite(str)
  dim arr, i
  dim arHTML(), sRow
  arr = split(str,vbLf)
  prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & _
      strURL & "</legend>")
  prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
  prt("<table style=""padding: 3px"">")
  redim arHTML(ubound(arr))
  for i = 1 to ubound(arr)
    if i = 1 then
      sRow = "<tr style=""font-weight: bold"">"
    else
      if i mod 2 = 0 then
        sRow = "<tr style=""background-color: #ddd"">"
      else
        sRow = "<tr>"
      end if
    end if
    sRow = sRow & vbCrLf & vbTab & Replace(arr(i),",","<td>")
    arHTML(i) = sRow
  next
  prt(Join(arHTML,vbCrLf))
  prt("</table>")
  prt("</div>")
  prt("</fieldset>")
end sub
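Bob's array-plus-Join pattern can be sketched in JScript-style JavaScript like this: one array entry per row, one join at the end, which avoids re-growing a string on every row. The sample lines are hypothetical:

```javascript
// Sketch of the Join technique: one array slot per row, one output string.
// lines[0] is assumed to be a title line, as in the CSV being discussed.
function buildRows(lines) {
  var arHTML = [];
  for (var i = 1; i < lines.length; i++) {
    var sRow = (i === 1) ? '<tr style="font-weight: bold">'
             : (i % 2 === 0) ? '<tr style="background-color: #ddd">'
             : "<tr>";
    arHTML.push(sRow + "\n\t<td>" + lines[i].split(",").join("<td>"));
  }
  return arHTML.join("\n");   // the VBScript Join(arHTML, vbCrLf)
}

var html = buildRows(["header", "a,b", "c,d", "e,f"]);
```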
:

Bob Barrows

--
Microsoft MVP -- ASP/ASP.NET
Please reply to the newsgroup. The email account listed in my From
header is my spam trap, so I don't check it very often. You will get a
quicker response by posting to the newsgroup.
Jul 22 '05 #6
"Dave Anderson" wrote in message
news:%2****************@TK2MSFTNGP11.phx.gbl...
: Roland Hall wrote:
: > ...Here is my subroutine for parsing the data and perhaps someone
: > will notice something that will help speed it up...
: >
: > for i = 1 to ubound(arr)
: > arr2 = split(arr(i),",")
: > if i = 1 then
: > prt("<tr style=""font-weight: bold"">")
: > else
: > if i mod 2 = 0 then
: > prt("<tr style=""background-color: #ddd"">")
: > else
: > prt("<tr>")
: > end if
: > end if
: > for j = 0 to ubound(arr2)
: > prt("<td>" & arr2(j))
: > next
: > next
:
: Have you tried using Replace() instead of split?
:
: for i = 1 to ubound(arr)
: if i = 1 then
: prt("<tr style=""font-weight: bold"">")
: else
: if i mod 2 = 0 then
: prt("<tr style=""background-color: #ddd"">")
: else
: prt("<tr>")
: end if
: end if
:
: prt(Replace(arr(i),",","<td>"))
: next
:
:
:

Thanks, Dave. I'll put a timer on it to see the difference; it's hard to tell
just by looking. I know it's hard to write this stuff off the top of your
head, especially without seeing the raw data, but I needed to make one mod to
your suggestion. There is no leading , (comma), so another <td> had to be
inserted.

prt("<td>" & replace(arr(i),",","<td>"))

Thanks for your insight. I like that a lot better than the array loop.

Roland
Jul 22 '05 #7
Roland Hall wrote:
I wrote a small script that grabs two CSV files [links to the data files]
from a remote web site, parses them out and displays them in scrolling divs.
The first file has a little over 27k records; the second has fewer. It
retrieves the data pretty quickly, but it takes a while to write the page.

Is there a better alternative to this approach?


How often are the CSV files updated at their remote site? If it's not
too frequently, then the files could be transferred to your server when
updated or by a periodically-executed script or Windows service. The
files could then be accessed locally (more quickly).

BTW 27,000 rows seems like an excessive amount of data for a user to
digest at once. Could the program present a search page (by field, by
alphabetic order, etc.) or summary page (listing categories) first? Then
the user could limit the search somewhat.

I would be tempted to periodically transfer the CSV file(s) to a local
directory and import the data into a database. Then an ASP page would
handle the search and presentation.
Jul 22 '05 #8
Roland Hall wrote:
There is no leading , (comma) so another <td> had to be inserted.

prt("<td>" & replace(arr(i),",","<td>"))


I would go even further and reach for HTML completeness:

prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")

--
Dave Anderson

Jul 22 '05 #9
Michael D. Kersey wrote:
BTW 27,000 rows seems like an excessive amount of data for
a user to digest at once...


That raises another point I forgot to address in my other post. If the
client machine is Internet Explorer, the table will be displayed all at
once, rather than line-by-line, no matter what buffering you use.

I have jobs that I occasionally run with ASP scripts, and I often set the
script up to spit out every changed record and/or every 100th record, or
something similar. I typically break the table every 10 or 20 rows by
inserting one of these: "</table><table>".

It has been my observation that IE displays nothing at all until the table
is closed, while Mozilla/Firefox/Opera will display each row as it arrives
(buffering must be off to see this in effect).
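Dave's table-breaking trick can be sketched as follows in JScript-style JavaScript; the chunk size and the `emit` callback (a stand-in for Response.Write) are assumptions of this sketch:

```javascript
// Close and reopen the table every `chunk` rows, so a browser that waits
// for </table> before rendering can paint each chunk as it arrives.
function emitChunkedTable(rows, chunk, emit) {
  emit("<table>");
  for (var i = 0; i < rows.length; i++) {
    if (i > 0 && i % chunk === 0) {
      emit("</table><table>");   // break the table so IE can render this chunk
    }
    emit("<tr><td>" + rows[i]);
  }
  emit("</table>");
}

var out = [];
emitChunkedTable(["r1", "r2", "r3", "r4", "r5"], 2, function (s) { out.push(s); });
// out contains "</table><table>" twice: before r3 and before r5
```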

--
Dave Anderson

Jul 22 '05 #10
Dave Anderson wrote on 13 dec 2004 in
microsoft.public.inetserver.asp.general:
Michael D. Kersey wrote:
BTW 27,000 rows seems like an excessive amount of data for
a user to digest at once...


That raises another point I forgot to address in my other post. If the
client machine is Internet Explorer, the table will be displayed all
at once, rather than line-by-line, no matter what buffering you use.

I have jobs that I occasionally run with ASP scripts, and I often set
the script up to spit out every changed record and/or every 100th
record, or something similar. I typically break the table every 10 or
20 rows by inserting one of these: "</table><table>".

It has been my observation that IE displays nothing at all until the
table is closed, while Mozilla/Firefox/Opera will display each row as
it arrives (buffering must be off to see this in effect).


my observation is otherwise

--
Evertjan.
The Netherlands.
(Please change the x'es to dots in my emailaddress)
Jul 22 '05 #11
"Bob Barrows [MVP]" <re******@NOyahoo.SPAMcom> wrote in message
news:%2******************@TK2MSFTNGP12.phx.gbl...
Roland Hall wrote:
Somebody (I think it might have been Chris Hohmann) posted an
analysis of different techniques for generating large blocks of html
a few weeks ago that you may find interesting.


I searched in this NG for all of Chris' posting and didn't find
anything. Then I searched for the reference you made and didn't find
anything that way either.


Darn. I just tried to find it as well, and failed. ISTR that the consensus
was that adding the individual strings to an array and then using Join to
combine them was the fastest method.


It sounds familiar but I couldn't find it either. Maybe the underpants
gnomes stole it. :) The closest thing I could come up with is this:

IsArray doesn't work with array var populated with xxx.GetRows()
http://groups-beta.google.com/group/...8211a93f83f823

Here are some older threads:

return single value in asp/sql
http://groups-beta.google.com/group/...61356799605006

logical problem
http://groups-beta.google.com/group/...9c1498e99d805e

Response.Write speed problem
http://groups-beta.google.com/group/...879828821abe40
Jul 22 '05 #12
"Bob Barrows [MVP]" wrote in message
news:%2******************@TK2MSFTNGP12.phx.gbl...
: Roland Hall wrote:
: >> Somebody (I think it might have been Chris Hohmann) posted an
: >> analysis of different techniques for generating large blocks of html
: >> a few weeks ago that you may find interesting.
: >
: > I searched in this NG for all of Chris' posting and didn't find
: > anything. Then I searched for the reference you made and didn't find
: > anything that way either.
:
: Darn. I just tried to find it as well, and failed. ISTR that the consensus
: was that adding the individual strings to an array and then using Join to
: combine them was the fastest method. Combined with Dave's idea, you would
: get something like this:
:
: sub strWrite(str)
: dim arr, i, arr2, j
: dim arHTML(), sRow
: arr = split(str,vbLf)
: prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " &
: strURL & "</legend>")
: prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
: prt("<table style=""padding: 3px"">")
: redim arHTML(ubound(arr))
: for i = 1 to ubound(arr)
: if i = 1 then
: sRow= "<tr style=""font-weight: bold"">"
: else
: if i mod 2 = 0 then
: sRow="<tr style=""background-color: #ddd"">"
: else
: sRow="<tr>"
: end if
: end if
: sRow=sRow & vbCrLf & vbTab & Replace(arr(i),",","<td>")
: arHTML(i) =sRow
: next
: prt(Join(arHTML,vbCrLf))
: prt("</table>")
: prt("</div>")
: prt("</fieldset>")
: end sub

Thanks for your help Bob. I only had to make a few adjustments.
Jul 22 '05 #13
"Dave Anderson" wrote in message
news:%2****************@TK2MSFTNGP11.phx.gbl...
: Roland Hall wrote:
: > There is no leading , (comma) so another <td> had to be inserted.
: >
: > prt("<td>" & replace(arr(i),",","<td>"))
:
: I would go even further and reach for HTML completeness:
:
: prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")

HTML completeness? I thought ending tags were no longer required? However,
wouldn't it then be:
prt("<td>" & replace(arr(i),",","</td><td>") & "</td></tr>")

Roland
Jul 22 '05 #14
"Michael D. Kersey" wrote in message
news:O%***************@TK2MSFTNGP12.phx.gbl...
: Roland Hall wrote:
: > I wrote a small script that grabs two CSV files [links to the data files]
: > from a remote web site, parses them out and displays them in scrolling divs.
: > The first file has a little over 27k records; the second has fewer. It
: > retrieves the data pretty quickly, but it takes a while to write the page.
: >
: > Is there a better alternative to this approach?
:
: How often are the CSV files updated at their remote site? If it's not
: too frequently, then the files could be transferred to your server when
: updated or by a periodically-executed script or Windows service. The
: files could then be accessed locally (more quickly).

I think they are updated once a day.

: BTW 27,000 rows seems like an excessive amount of data for a user to
: digest at once. Could the program present a search page (by field, by
: alphabetic order, etc.) or summary page (listing categories) first? Then
: the user could limit the search somewhat.
:
: I would be tempted to periodically transfer the CSV file(s) to a local
: directory and import the data into a database. Then an ASP page would
: handle the search and presentation.

These files are lists of domains being deleted and their status in the
deletion process. Sure, if you know a domain, a simple record lookup would be
great, but I believe this list is mostly unknown to those seeking it, which
is why it is only available as a CSV file.

--
Roland Hall
Jul 22 '05 #15

"Dave Anderson" <GT**********@spammotel.com> wrote in message
news:uJ**************@TK2MSFTNGP09.phx.gbl...
: Michael D. Kersey wrote:
: > BTW 27,000 rows seems like an excessive amount of data for
: > a user to digest at once...
:
: That raises another point I forgot to address in my other post. If the
: client machine is Internet Explorer, the table will be displayed all at
: once, rather than line-by-line, no matter what buffering you use.

That's what happens.

: I have jobs that I occasionally run with ASP scripts, and I often set the
: script up to spit out every changed record and/or every 100th record, or
: something similar. I typically break the table every 10 or 20 rows by
: inserting one of these: "</table><table>".
:
: It has been my observation that IE displays nothing at all until the table
: is closed, while Mozilla/Firefox/Opera will display each row as it arrives
: (buffering must be off to see this in effect).

There are only 3 columns in the first file and 2 in the second. Roughly 21k
rows in the first and 2k in the second. The counts also vary because they are
based upon the date each domain was registered. The following day could have
twice as many or half as many, but I doubt they'll vary greatly.

Currently it appears splitting it up will just slow down the process, since
retrieving the file is where most of the latency occurs. I'll probably end
up writing an app to grab the file daily, which will decrease the bandwidth
usage by almost 50%.

--
Roland Hall
Jul 22 '05 #16
Roland Hall wrote:
"Michael D. Kersey" wrote in message
news:O%***************@TK2MSFTNGP12.phx.gbl...
Roland Hall wrote:
I wrote a small script that grabs two CSV files [links to the data
files] from a remote web site, parses them out and displays them in
scrolling divs. The first file has a little over 27k records; the
second has fewer. It retrieves the data pretty quickly, but it takes
a while to write the page.

Is there a better alternative to this approach?


How often are the CSV files updated at their remote site? If it's not
too frequently, then the files could be transferred to your server
when updated or by a periodically-executed script or Windows
service. The files could then be accessed locally (more quickly).


I think they are updated once a day.

You might want to consider caching them, refreshing the cache each day.
Generate the html strings once each day and put them in SSI files.

I would consider making them filterable, either by importing them into a
database, or converting them into xml.

Bob Barrows

Jul 22 '05 #17
"Bob Barrows [MVP]" wrote in message
news:%2****************@TK2MSFTNGP14.phx.gbl...
: You might want to consider caching them, refreshing the cache each day.
: Generate the html strings once each day and put them in SSI files.

I'm not familiar.

: I would consider making them filterable, either by importing them into a
: database, or converting them into xml.

I plan on putting them in SQL.

--
Roland Hall
Jul 22 '05 #18
Roland Hall wrote:
"Bob Barrows [MVP]" wrote in message
news:%2****************@TK2MSFTNGP14.phx.gbl...
You might want to consider caching them, refreshing the cache each
day. Generate the html strings once each day and put them in SSI
files.
I'm not familiar.


SSI = server-side includes
In other words, each day, generate the html and write it into a file which
you include in your display page using <!--#include etc.
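The daily-refresh logic behind that idea can be sketched in JScript-style JavaScript without any file I/O: regenerate the expensive HTML only when the cached copy is older than a day. The cache closure and the injected clock are assumptions of this sketch, not Bob's actual setup (he suggests writing the HTML into an include file once a day):

```javascript
// Sketch: rebuild an expensive string at most once per day.
var DAY_MS = 24 * 60 * 60 * 1000;

function makeDailyCache(build) {
  var cached = null, builtAt = -Infinity;
  return function (now) {            // `now` is injected for testability
    if (now - builtAt >= DAY_MS) {
      cached = build();              // e.g. fetch the CSV and render the table
      builtAt = now;
    }
    return cached;
  };
}

var calls = 0;
var getPage = makeDailyCache(function () { calls++; return "<table>...</table>"; });
getPage(0);          // builds
getPage(1000);       // same day: served from cache
getPage(DAY_MS + 1); // next day: rebuilds
// calls is 2
```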
I would consider making them filterable, either by importing them
into a database, or converting them into xml.


I plan on putting them in SQL.

Good

Bob Barrows
Jul 22 '05 #19
Evertjan. wrote:
It has been my observation that IE displays nothing at all until the
table is closed, while Mozilla/Firefox/Opera will display each row as
it arrives (buffering must be off to see this in effect).


my observation is otherwise


From your detailed response I infer you observed a powered-off CRT. That is
most certainly "otherwise".

--
Dave Anderson

Jul 22 '05 #20
Gazing into my crystal ball I observed "Roland Hall" <nobody@nowhere>
writing in news:uA**************@TK2MSFTNGP10.phx.gbl:
"Dave Anderson" wrote in message
news:%2****************@TK2MSFTNGP11.phx.gbl...
: Roland Hall wrote:
: > There is no leading , (comma) so another <td> had to be inserted.
: >
: > prt("<td>" & replace(arr(i),",","<td>"))
:
: I would go even further and reach for HTML completeness:
:
: prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")

HTML completeness? I thought ending tags were no longer required?
However, wouldn't it then be:
prt("<td>" & replace(arr(i),",","</td><td>") & "</td></tr>")

Roland


With XHTML, all elements must be closed and lowercase, even IMG and BR,
e.g.: <img src="some.png" alt="" /> or <br />

Even with HTML, you do a lot better to use closing tags; they make things
easier to debug, etc.

--
Adrienne Boswell
Please respond to the Group so others can share
Jul 22 '05 #21
Dave Anderson wrote on 14 dec 2004 in
microsoft.public.inetserver.asp.general:
Evertjan. wrote:
It has been my observation that IE displays nothing at all until the
table is closed, while Mozilla/Firefox/Opera will display each row
as it arrives (buffering must be off to see this in effect).


my observation is otherwise


From your detailed response I infer you observed a powered-off CRT.
That is most certainly "otherwise".


A powerful, though incorrect infer-sion.
--
Evertjan.
The Netherlands.
(Please change the x'es to dots in my emailaddress)
Jul 22 '05 #22
Adrienne Boswell wrote:
Where Talal's managerial executive frowns, Waleed leaves near
islamic, corresponding networks.


What??
Adrienne, is somebody spoofing you? If not, why the Followup-To to
news.admin.net-abuse.email

Bob Barrows
--
Microsoft MVP - ASP/ASP.NET
Please reply to the newsgroup. This email account is my spam trap so I
don't check it very often. If you must reply off-line, then remove the
"NO SPAM"
Jul 22 '05 #23
Gazing into my crystal ball I observed "Bob Barrows [MVP]" <reb01501
@NOyahoo.SPAMcom> writing in news:#h**************@TK2MSFTNGP15.phx.gbl:
Adrienne Boswell wrote:
Where Talal's managerial executive frowns, Waleed leaves near
islamic, corresponding networks.


What??
Adrienne, is somebody spoofing you? If not, why the Followup-To to
news.admin.net-abuse.email

Bob Barrows


Yes, apparently, someone IS spoofing me. How odd! As a matter of fact, I
wasn't even near a computer when that post was made, I was in a taxi on the
way home from work.

It looks like HipCrime to me, and if you look at the headers you can see
the originating post is from 210.178.1.125, where my posts come from
67.102.130.26 (work) and 64.160.235.41 (home).


--
Adrienne Boswell
Please respond to the group so others can share
Jul 22 '05 #24

This thread has been closed and replies have been disabled. Please start a new discussion.
