Bytes | Developer Community

MSXML2.XMLHTTP

I wrote a small script that grabs two CSV files [links to the data files]
from a remote web site, parses them out and displays them in scrolling divs.
The first file has a little over 27k records, the second has fewer. It
retrieves the data pretty quickly, but it takes a while to write the page.

Is there a better alternative to this approach?
This is my page:
http://kiddanger.com/lab/getsaveurl.asp

This is the relevant code to retrieve the data:

function strQuote(strURL)
dim objXML
set objXML = CreateObject("MSXML2.ServerXMLHTTP")
objXML.Open "GET", strURL, False
objXML.Send
strQuote = objXML.ResponseText
set objXML = nothing
end function

I split the data into an array and then split each element into a new array
because the delimiters are line feed and comma, respectively.
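For reference, a minimal sketch of that two-level split (variable names are illustrative; it assumes the first line is a title row, as the legend code later in the thread suggests):

```vbscript
' Split the raw CSV text into rows on line feed,
' then split each row into fields on comma.
dim rows, fields, i
rows = Split(strQuote(strURL), vbLf)
for i = 1 to UBound(rows)            ' rows(0) is the title line
    fields = Split(rows(i), ",")
    ' ... emit fields(0), fields(1), ... as table cells
next
```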

TIA...

--
Roland Hall
/* This information is distributed in the hope that it will be useful, but
without any warranty; without even the implied warranty of merchantability
or fitness for a particular purpose. */
Technet Script Center - http://www.microsoft.com/technet/scriptcenter/
WSH 5.6 Documentation - http://msdn.microsoft.com/downloads/list/webdev.asp
MSDN Library - http://msdn.microsoft.com/library/default.asp
Jul 22 '05 #1
23 replies, 7173 views
Roland Hall wrote:
I wrote a small script that grabs two CSV files [links to the data
files] from a remote web site, parses them out and displays them in
scrolling divs. The first file has a little over 27k records, the
second has fewer. It retrieves the data pretty quickly but it takes
a while to write the page.

Is there a better alternative to this approach?
This is my page:
http://kiddanger.com/lab/getsaveurl.asp

This is the relevant code to retrieve the data:

function strQuote(strURL)
dim objXML
set objXML = CreateObject("MSXML2.ServerXMLHTTP")
objXML.Open "GET", strURL, False
objXML.Send
strQuote = objXML.ResponseText
set objXML = nothing
end function

I split the data into an array and then split that into a new array
because the delimiters are line feed and comma, respectively.

TIA...


It's pretty tough to comment on this. You've identified the bottleneck as
the process of writing the data to the page, so the strQuote function is not
relevant, is it? What you do with the array contents seems to be more
relevant, at least to me.

Somebody (I think it might have been Chris Hohmann) posted an analysis of
different techniques for generating large blocks of html a few weeks ago
that you may find interesting.

Bob Barrows
--
Microsoft MVP -- ASP/ASP.NET
Please reply to the newsgroup. The email account listed in my From
header is my spam trap, so I don't check it very often. You will get a
quicker response by posting to the newsgroup.
Jul 22 '05 #2
"Bob Barrows [MVP]" wrote in message
news:uC**************@TK2MSFTNGP11.phx.gbl...
: Roland Hall wrote:
: > I wrote a small script that grabs two CSV files [links to the data
: > files] from a remote web site, parses them out and displays them in
: > scrolling divs. The first file has a little over 27k records, the
: > second has fewer. It retrieves the data pretty quickly but it takes
: > a while to write the page.
: >
: > Is there a better alternative to this approach?
: > This is my page:
: > http://kiddanger.com/lab/getsaveurl.asp
: >
: > This is the relevant code to retrieve the data:
: >
: > function strQuote(strURL)
: > dim objXML
: > set objXML = CreateObject("MSXML2.ServerXMLHTTP")
: > objXML.Open "GET", strURL, False
: > objXML.Send
: > strQuote = objXML.ResponseText
: > set objXML = nothing
: > end function
: >
: > I split the data into an array and then split that into a new array
: > because the delimiters are line feed and comma, respectively.
: >
: > TIA...
: >
:
: It's pretty tough to comment on this. You've identified the bottleneck as
: the process of writing the data to the page, so the strQuote function is
not
: relevant, is it? What you do with the array contents seems to be more
: relevant, at least to me.

Hi Bob. Thanks for responding.

Perhaps. I'm assuming the data is retrieved quickly, judging by the activity
light on my switch. I haven't actually put timers in, which I guess would be
the next test.
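A rough way to put those timers in, using VBScript's Timer function (a sketch only; it assumes the strQuote and strWrite routines from this thread):

```vbscript
' Time the fetch and the page-write phases separately.
dim t0, data
t0 = Timer                        ' seconds since midnight
data = strQuote(strURL)           ' retrieve the CSV
Response.Write "fetch: " & FormatNumber(Timer - t0, 2) & "s<br>"
t0 = Timer
strWrite data                     ' parse and write the table
Response.Write "write: " & FormatNumber(Timer - t0, 2) & "s<br>"
```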

:
: Somebody (I think it might have been Chris Hohmann) posted an analysis of
: different techniques for generating large blocks of html a few weeks ago
: that you may find interesting.

I searched this NG for all of Chris's postings and didn't find anything.
Then I searched for the reference you made and didn't find anything that way
either. Here is my subroutine for parsing the data; perhaps someone will
notice something that will help speed it up.

sub strWrite(str)
    dim arr, i, arr2, j
    arr = split(str,vbLf)
    prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & strURL & "</legend>")
    prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
    prt("<table style=""padding: 3px"">")
    for i = 1 to ubound(arr)
        arr2 = split(arr(i),",")
        if i = 1 then
            prt("<tr style=""font-weight: bold"">")
        else
            if i mod 2 = 0 then
                prt("<tr style=""background-color: #ddd"">")
            else
                prt("<tr>")
            end if
        end if
        for j = 0 to ubound(arr2)
            prt("<td>" & arr2(j))
        next
    next
    prt("</table>")
    prt("</div>")
    prt("</fieldset>")
end sub

These are the calls for the two files:

dim strURL
strURL = "http://neustar.us/reports/rgp/domains_in_rgp.csv"
strWrite strQuote(strURL)
strURL = "http://neustar.us/reports/rgp/domains_out_rgp.csv"
strWrite strQuote(strUrl)

I made some changes to my buffer and some variables and it's noticeably
faster. It still takes about 4-5 seconds to parse the data, but I'm not sure
that's all that bad for that amount.

I'm testing with two links, one on the Internet and one on my Intranet. The
Internet link normally displays them almost simultaneously. The Intranet link
displays the first file, then takes almost as long a delay before the next,
which is what I expected.

http://kiddanger.com/lab/getsaveurl.asp Internet
http://netfraud.us/asp/rgpr.asp Intranet

I wonder whether it would be faster to write everything to a string and then
make only one write statement. Any ideas?
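That single-write idea would look roughly like this (a sketch; note that concatenating one long string with & inside a loop can itself be slow, which is why building an array and using Join is usually preferred):

```vbscript
' Build every table row into an array, then emit one string
' with a single Response.Write instead of thousands of writes.
dim arr, i
dim out()
arr = Split(str, vbLf)
redim out(UBound(arr) - 1)
for i = 1 to UBound(arr)             ' skip the title row at arr(0)
    out(i - 1) = "<tr><td>" & Replace(arr(i), ",", "<td>")
next
Response.Write Join(out, vbCrLf)
```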

--
Roland Hall
Jul 22 '05 #3
I added the record count to the legend and now I know why the second one is
a lot faster: it has a tenth as many records.
Jul 22 '05 #4
Roland Hall wrote:
...Here is my subroutine for parsing the data and perhaps someone
will notice something that will help speed it up...

for i = 1 to ubound(arr)
arr2 = split(arr(i),",")
if i = 1 then
prt("<tr style=""font-weight: bold"">")
else
if i mod 2 = 0 then
prt("<tr style=""background-color: #ddd"">")
else
prt("<tr>")
end if
end if
for j = 0 to ubound(arr2)
prt("<td>" & arr2(j))
next
next


Have you tried using Replace() instead of split?

for i = 1 to ubound(arr)
if i = 1 then
prt("<tr style=""font-weight: bold"">")
else
if i mod 2 = 0 then
prt("<tr style=""background-color: #ddd"">")
else
prt("<tr>")
end if
end if

prt(Replace(arr(i),",","<td>"))
next

--
Dave Anderson

Unsolicited commercial email will be read at a cost of $500 per message. Use
of this email address implies consent to these terms. Please do not contact
me directly or ask me to contact you directly for assistance. If your
question is worth asking, it's worth posting.
Jul 22 '05 #5
Roland Hall wrote:
Somebody (I think it might have been Chris Hohmann) posted an
analysis of different techniques for generating large blocks of html
a few weeks ago that you may find interesting.


I searched in this NG for all of Chris' posting and didn't find
anything. Then I searched for the reference you made and didn't find
anything that way either.


Darn. I just tried to find it as well, and failed. ISTR that the consensus
was that adding the individual strings to an array and then using Join to
combine them was the fastest method. Combined with Dave's idea, you would
get something like this:

sub strWrite(str)
    dim arr, i
    dim arHTML(), sRow
    arr = split(str,vbLf)
    prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " & strURL & "</legend>")
    prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
    prt("<table style=""padding: 3px"">")
    redim arHTML(ubound(arr))
    for i = 1 to ubound(arr)
        if i = 1 then
            sRow = "<tr style=""font-weight: bold"">"
        else
            if i mod 2 = 0 then
                sRow = "<tr style=""background-color: #ddd"">"
            else
                sRow = "<tr>"
            end if
        end if
        sRow = sRow & vbCrLf & vbTab & Replace(arr(i),",","<td>")
        arHTML(i) = sRow
    next
    prt(Join(arHTML,vbCrLf))
    prt("</table>")
    prt("</div>")
    prt("</fieldset>")
end sub

Bob Barrows

Jul 22 '05 #6
"Dave Anderson" wrote in message
news:%2****************@TK2MSFTNGP11.phx.gbl...
: Roland Hall wrote:
: > ...Here is my subroutine for parsing the data and perhaps someone
: > will notice something that will help speed it up...
: >
: > for i = 1 to ubound(arr)
: > arr2 = split(arr(i),",")
: > if i = 1 then
: > prt("<tr style=""font-weight: bold"">")
: > else
: > if i mod 2 = 0 then
: > prt("<tr style=""background-color: #ddd"">")
: > else
: > prt("<tr>")
: > end if
: > end if
: > for j = 0 to ubound(arr2)
: > prt("<td>" & arr2(j))
: > next
: > next
:
: Have you tried using Replace() instead of split?
:
: for i = 1 to ubound(arr)
: if i = 1 then
: prt("<tr style=""font-weight: bold"">")
: else
: if i mod 2 = 0 then
: prt("<tr style=""background-color: #ddd"">")
: else
: prt("<tr>")
: end if
: end if
:
: prt(Replace(arr(i),",","<td>"))
: next
:
:
:

Thanks, Dave. I'll put a timer on it to see the difference; it's hard to tell
just by looking. I know it's hard to write this stuff off the top of your
head, especially without seeing the raw data, but I needed to make one mod to
your suggestion. There is no leading , (comma), so another <td> had to be
inserted.

prt("<td>" & replace(arr(i),",","<td>"))

Thanks for your insight. I like that a lot better than the array loop.

Roland
Jul 22 '05 #7
Roland Hall wrote:
I wrote a small script that grabs two CSV files [links to the data files]
from a remote web site, parses them out and displays them in scrolling divs.
The first file has a little over 27k records, the second has fewer. It
retrieves the data pretty quickly but it takes a while to write the page.

Is there a better alternative to this approach?


How often are the CSV files updated at their remote site? If it's not
too frequently, then the files could be transferred to your server when
updated or by a periodically-executed script or Windows service. The
files could then be accessed locally (more quickly).

BTW 27,000 rows seems like an excessive amount of data for a user to
digest at once. Could the program present a search page (by field, by
alphabetic order, etc.) or summary page (listing categories) first? Then
the user could limit the search somewhat.

I would be tempted to periodically transfer the CSV file(s) to a local
directory and import the data into a database. Then an ASP page would
handle the search and presentation.
Jul 22 '05 #8
Roland Hall wrote:
There is no leading , (comma) so another <td> had to be inserted.

prt("<td>" & replace(arr(i),",","<td>"))


I would go even further and reach for HTML completeness:

prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")

--
Dave Anderson

Jul 22 '05 #9
Michael D. Kersey wrote:
BTW 27,000 rows seems like an excessive amount of data for
a user to digest at once...


That raises another point I forgot to address in my other post. If the
client browser is Internet Explorer, the table will be displayed all at
once, rather than line by line, no matter what buffering you use.

I have jobs that I occasionally run with ASP scripts, and I often set the
script up to spit out every changed record and/or every 100th record, or
something similar. I typically break the table every 10 or 20 rows by
inserting one of these: "</table><table>".

It has been my observation that IE displays nothing at all until the table
is closed, while Mozilla/Firefox/Opera will display each row as it arrives
(buffering must be off to see this in effect).
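The table-breaking trick could be sketched like this, using the prt helper from earlier in the thread (illustrative; the chunk size is arbitrary):

```vbscript
' Close and reopen the table every 20 rows so browsers that wait
' for a complete table (notably IE) can render each chunk.
for i = 1 to UBound(arr)
    prt("<tr><td>" & Replace(arr(i), ",", "<td>"))
    if i mod 20 = 0 then
        prt("</table><table>")
        Response.Flush     ' push the chunk if buffering is on;
                           ' with buffering off this is not needed
    end if
next
```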

--
Dave Anderson

Jul 22 '05 #10
Dave Anderson wrote on 13 dec 2004 in
microsoft.public.inetserver.asp.general:
Michael D. Kersey wrote:
BTW 27,000 rows seems like an excessive amount of data for
a user to digest at once...


That raises another point I forgot to address in my other post. If the
client machine is Internet Explorer, the table will be displayed all
at once, rather than line-by-line, no matter what buffering you use.

I have jobs that I occasionally run with ASP scripts, and I often set
the script up to spit out every changed record and/or every 100th
record, or something similar. I typically break the table every 10 or
20 rows by inserting one of these: "</table><table>".

It has been my observation that IE displays nothing at all until the
table is closed, while Mozilla/Firefox/Opera will display each row as
it arrives (buffering must be off to see this in effect).


my observation is otherwise

--
Evertjan.
The Netherlands.
(Please change the x'es to dots in my emailaddress)
Jul 22 '05 #11
"Bob Barrows [MVP]" <re******@NOyahoo.SPAMcom> wrote in message
news:%2******************@TK2MSFTNGP12.phx.gbl...
Roland Hall wrote:
Somebody (I think it might have been Chris Hohmann) posted an
analysis of different techniques for generating large blocks of html
a few weeks ago that you may find interesting.


I searched in this NG for all of Chris' posting and didn't find
anything. Then I searched for the reference you made and didn't find
anything that way either.


Darn. I just tried to find it as well, and failed. ISTR that the consensus
was that adding the individual strings to an array and then using Join to
combine them was the fastest method.


It sounds familiar but I couldn't find it either. Maybe the underpants
gnomes stole it. :) The closest thing I could come up with is this:

IsArray doesn't work with array var populated with xxx.GetRows()
http://groups-beta.google.com/group/...8211a93f83f823

Here are some older threads:

return single value in asp/sql
http://groups-beta.google.com/group/...61356799605006

logical problem
http://groups-beta.google.com/group/...9c1498e99d805e

Response.Write speed problem
http://groups-beta.google.com/group/...879828821abe40
Jul 22 '05 #12
"Bob Barrows [MVP]" wrote in message
news:%2******************@TK2MSFTNGP12.phx.gbl...
: Roland Hall wrote:
: >> Somebody (I think it might have been Chris Hohmann) posted an
: >> analysis of different techniques for generating large blocks of html
: >> a few weeks ago that you may find interesting.
: >
: > I searched in this NG for all of Chris' posting and didn't find
: > anything. Then I searched for the reference you made and didn't find
: > anything that way either.
:
: Darn. I just tried to find it as well, and failed. ISTR that the consensus
: was that adding the individual strings to an array and then using Join to
: combine them was the fastest method. Combined with Dave's idea, you would
: get something like this:
:
: sub strWrite(str)
: dim arr, i, arr2, j
: dim arHTML(), sRow
: arr = split(str,vbLf)
: prt("<fieldset><legend style=""font-weight: bold"">" & arr(0) & " " &
: strURL & "</legend>")
: prt("<div style=""height: 200px; overflow: auto; width: 950px"">")
: prt("<table style=""padding: 3px"">")
: redim arHTML(ubound(arr))
: for i = 1 to ubound(arr)
: if i = 1 then
: sRow= "<tr style=""font-weight: bold"">"
: else
: if i mod 2 = 0 then
: sRow="<tr style=""background-color: #ddd"">"
: else
: sRow="<tr>"
: end if
: end if
: sRow=sRow & vbCrLf & vbTab & Replace(arr(i),",","<td>")
: arHTML(i) =sRow
: next
: prt(Join(arHTML,vbCrLf))
: prt("</table>")
: prt("</div>")
: prt("</fieldset>")
: end sub

Thanks for your help Bob. I only had to make a few adjustments.
Jul 22 '05 #13
"Dave Anderson" wrote in message
news:%2****************@TK2MSFTNGP11.phx.gbl...
: Roland Hall wrote:
: > There is no leading , (comma) so another <td> had to be inserted.
: >
: > prt("<td>" & replace(arr(i),",","<td>"))
:
: I would go even further and reach for HTML completeness:
:
: prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")

HTML completeness? I thought ending tags were no longer required? However,
wouldn't it then be:
prt("<td>" & replace(arr(i),",","</td><td>") & "</td></tr>")

Roland
Jul 22 '05 #14
"Michael D. Kersey" wrote in message
news:O%***************@TK2MSFTNGP12.phx.gbl...
: Roland Hall wrote:
: > I wrote a small script that grabs two CSV files [links to the data
files]
: > from a remote web site, parses them out and displays them in scrolling
divs.
: > The first file has a little over 27k records, the second has fewer. It
: > retrieves the data pretty quickly but it takes a while to write the page.
: >
: > Is there a better alternative to this approach?
:
: How often are the CSV files updated at their remote site? If it's not
: too frequently, then the files could be transferred to your server when
: updated or by a periodically-executed script or Windows service. The
: files could then be accessed locally (more quickly).

I think they are updated once a day.

: BTW 27,000 rows seems like an excessive amount of data for a user to
: digest at once. Could the program present a search page (by field, by
: alphabetic order, etc.) or summary page (listing categories) first? Then
: the user could limit the search somewhat.
:
: I would be tempted to periodically transfer the CSV file(s) to a local
: directory and import the data into a database. Then an ASP page would
: handle the search and presentation.

These files are lists of domains being deleted and their status in the
deletion process. Sure, if you know a domain, a simple record lookup would be
great, but I believe this is a list that is mostly unknown to those seeking
it, which is why it is only available as a CSV file.

--
Roland Hall
Jul 22 '05 #15

"Dave Anderson" <GT**********@spammotel.com> wrote in message
news:uJ**************@TK2MSFTNGP09.phx.gbl...
: Michael D. Kersey wrote:
: > BTW 27,000 rows seems like an excessive amount of data for
: > a user to digest at once...
:
: That raises another point I forgot to address in my other post. If the
: client machine is Internet Explorer, the table will be displayed all at
: once, rather than line-by-line, no matter what buffering you use.

That's what happens.

: I have jobs that I occasionally run with ASP scripts, and I often set the
: script up to spit out every changed record and/or every 100th record, or
: something similar. I typically break the table every 10 or 20 rows by
: inserting one of these: "</table><table>".
:
: It has been my observation that IE displays nothing at all until the table
: is closed, while Mozilla/Firefox/Opera will display each row as it arrives
: (buffering must be off to see this in effect).

There are only 3 columns in the first file and 2 in the second. Roughly 21k
rows in the first and 2k in the second. This also varies because it is based
upon the date each domain was registered. The following day could have twice
as many or half as many, but I doubt they'll vary greatly.

Currently it appears splitting it up will just slow down the process, since
retrieving the file is where most of the latency occurs. I'll probably end
up writing an app to grab the file daily, which will decrease the bandwidth
usage by almost 50%.

--
Roland Hall
Jul 22 '05 #16
Roland Hall wrote:
"Michael D. Kersey" wrote in message
news:O%***************@TK2MSFTNGP12.phx.gbl...
Roland Hall wrote:
I wrote a small script that grabs two CSV files [links to the data
files] from a remote web site, parses them out and displays them in
scrolling divs. The first file has a little over 27k records, the
second has fewer. It retrieves the data pretty quickly but it takes
a while to write the page.

Is there a better alternative to this approach?


How often are the CSV files updated at their remote site? If it's not
too frequently, then the files could be transferred to your server
when updated or by a periodically-executed script or Windows
service. The files could then be accessed locally (more quickly).


I think they are updated once a day.

You might want to consider caching them, refreshing the cache each day.
Generate the html strings once each day and put them in SSI files.

I would consider making them filterable, either by importing them into a
database, or converting them into xml.

Bob Barrows

Jul 22 '05 #17
"Bob Barrows [MVP]" wrote in message
news:%2****************@TK2MSFTNGP14.phx.gbl...
: You might want to consider caching them, refreshing the cache each day.
: Generate the html strings once each day and put them in SSI files.

I'm not familiar.

: I would consider making them filterable, either by importing them into a
: database, or converting them into xml.

I plan on putting them in SQL.

--
Roland Hall
Jul 22 '05 #18
Roland Hall wrote:
"Bob Barrows [MVP]" wrote in message
news:%2****************@TK2MSFTNGP14.phx.gbl...
You might want to consider caching them, refreshing the cache each
day. Generate the html strings once each day and put them in SSI
files.
I'm not familiar.


SSI = server-side includes
In other words, each day, generate the html and write it into a file which
you include in your display page using <!--#include etc.
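A sketch of that daily-cache idea using the FileSystemObject (the path and the BuildHTML helper are illustrative, not from the thread; BuildHTML stands in for a strWrite that returns its output as a string):

```vbscript
' Regenerate the cached HTML at most once a day, then serve it.
const sCache = "d:\data\rgp_cache.html"   ' illustrative path
dim fso, f
set fso = CreateObject("Scripting.FileSystemObject")
if not fso.FileExists(sCache) or _
   DateDiff("d", fso.GetFile(sCache).DateLastModified, Now) >= 1 then
    set f = fso.CreateTextFile(sCache, true)      ' overwrite
    f.Write BuildHTML(strQuote(strURL))           ' fetch + format once
    f.Close
end if
Response.Write fso.OpenTextFile(sCache).ReadAll
```

With a fixed file name, a plain <!--#include file --> directive would serve the cached HTML the same way.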
I would consider making them filterable, either by importing them
into a database, or converting them into xml.


I plan on putting them in SQL.

Good

Bob Barrows
Jul 22 '05 #19
Evertjan. wrote:
It has been my observation that IE displays nothing at all until the
table is closed, while Mozilla/Firefox/Opera will display each row as
it arrives (buffering must be off to see this in effect).


my observation is otherwise


From your detailed response I infer you observed a powered-off CRT. That is
most certainly "otherwise".

--
Dave Anderson

Jul 22 '05 #20
Gazing into my crystal ball I observed "Roland Hall" <nobody@nowhere>
writing in news:uA**************@TK2MSFTNGP10.phx.gbl:
"Dave Anderson" wrote in message
news:%2****************@TK2MSFTNGP11.phx.gbl...
: Roland Hall wrote:
: > There is no leading , (comma) so another <td> had to be inserted.
: >
: > prt("<td>" & replace(arr(i),",","<td>"))
:
: I would go even further and reach for HTML completeness:
:
: prt("<td>" & replace(arr(i),",","</td><td>") & "</td>")

HTML completeness? I thought ending tags were no longer required?
However, wouldn't it then be:
prt("<td>" & replace(arr(i),",","</td><td>") & "</td></tr>")

Roland


With XHTML, all elements must be closed and lowercase, even IMG and BR,
eg: <img src="some.png" alt="" /> or <br />

Even with HTML you do a lot better to use closing tags, easier to debug,
etc.

--
Adrienne Boswell
Please respond to the Group so others can share
Jul 22 '05 #21
Dave Anderson wrote on 14 dec 2004 in
microsoft.public.inetserver.asp.general:
Evertjan. wrote:
It has been my observation that IE displays nothing at all until the
table is closed, while Mozilla/Firefox/Opera will display each row
as it arrives (buffering must be off to see this in effect).


my observation is otherwise


From your detailed response I infer you observed a powered-off CRT.
That is most certainly "otherwise".


A powerful, though incorrect infer-sion.
--
Evertjan.
Jul 22 '05 #22
Adrienne Boswell wrote:
Where Talal's managerial executive frowns, Waleed leaves near
islamic, corresponding networks.


What??
Adrienne, is somebody spoofing you? If not, why the Followup-To to
news.admin.net-abuse.email

Bob Barrows
Jul 22 '05 #23
Gazing into my crystal ball I observed "Bob Barrows [MVP]" <reb01501
@NOyahoo.SPAMcom> writing in news:#h**************@TK2MSFTNGP15.phx.gbl:
Adrienne Boswell wrote:
Where Talal's managerial executive frowns, Waleed leaves near
islamic, corresponding networks.


What??
Adrienne, is somebody spoofing you? If not, why the Followup-To to
news.admin.net-abuse.email

Bob Barrows


Yes, apparently, someone IS spoofing me. How odd! As a matter of fact, I
wasn't even near a computer when that post was made; I was in a taxi on the
way home from work.

It looks like HipCrime to me, and if you look at the headers you can see
the originating post is from 210.178.1.125, where my posts come from
67.102.130.26 (work) and 64.160.235.41 (home).


--
Adrienne Boswell
Jul 22 '05 #24
