Very Fast Multithreaded URL Fetching

Hello,

Does anyone know of an object, or how to create one, that will allow me to
fetch up to 10 URLs (containing XML-feed data) in an extremely fast,
server-side fashion?

If a request is taking longer than it should, the object would need to be
able to time out on the spot (without waiting). The timeout value would need
to be accurate to about 50 milliseconds. Usually, I would want the timeout to
be around 1.5 seconds.

Thanks in advance,
Arsen
Nov 15 '05 #1
Arsen,

I suppose the first question is: why does it need to be multithreaded?

Running it in a thread is fair enough, but a simple loop should suffice; no
point complicating something that doesn't require it ;-)

Look at the WebRequest class. You can set the timeout as a property. Feed
WebRequest.Create(uri) from values in a loop over an array of 10 that holds
your URLs.
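
A minimal sketch of that loop (the 1.5 second timeout, the example URLs, and
the error handling are just placeholders):

using System;
using System.IO;
using System.Net;

class FeedFetcher
{
    static void Main()
    {
        string[] urls = {
            "http://example.com/feed1.xml",
            "http://example.com/feed2.xml"
            // ... up to 10 feed URLs
        };

        foreach (string url in urls)
        {
            try
            {
                WebRequest request = WebRequest.Create(url);
                request.Timeout = 1500;   // milliseconds

                using (WebResponse response = request.GetResponse())
                using (StreamReader reader = new StreamReader(response.GetResponseStream()))
                {
                    string xml = reader.ReadToEnd();
                    Console.WriteLine("{0}: {1} characters", url, xml.Length);
                }
            }
            catch (WebException ex)
            {
                // A timed-out request surfaces here with ex.Status == WebExceptionStatus.Timeout.
                Console.WriteLine("{0} failed: {1}", url, ex.Status);
            }
        }
    }
}

Note that the requests run one after another, so the worst case is the sum of
the individual timeouts rather than 1.5 seconds overall.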

Tam

"Arsen V." <ar****************@emergency24.com> wrote in message
news:eU**************@TK2MSFTNGP12.phx.gbl...
Hello,

Does anyone know of an object, or how to create one, that will allow me to
fetch up to 10 URLs (containing XML-feed data) in an extremely fast,
server-side fashion?

If a request is taking longer than it should, the object would need to be
able to time out on the spot (without waiting). The timeout value would need
to be accurate to about 50 milliseconds. Usually, I would want the timeout to
be around 1.5 seconds.

Thanks in advance,
Arsen

Nov 15 '05 #2
Sorry, I should have added...

Running 10 concurrent web requests is considered rude by standard Internet
convention. Two connections per server is the accepted limit, which is why XP
is set to only allow two downloads from a host at any one time.

So a single-threaded loop through an array of URLs is your best solution ;)
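
For reference, the .NET side of that two-connection convention lives in
ServicePointManager. A quick illustrative check (the value 10 below only shows
the syntax, should you ever need to raise it):

using System;
using System.Net;

class ConnectionLimitDemo
{
    static void Main()
    {
        // 2 per host by default, matching the HTTP/1.1 two-connection recommendation.
        Console.WriteLine("Default per-host connection limit: {0}",
            ServicePointManager.DefaultConnectionLimit);

        // It can be raised if you control both ends, but be polite to public servers.
        ServicePointManager.DefaultConnectionLimit = 10;
    }
}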

Tam
"Arsen V." <ar****************@emergency24.com> wrote in message
news:eU**************@TK2MSFTNGP12.phx.gbl...
Hello,

Does anyone know of an object, or how to create one, that will allow me to
fetch up to 10 URLs (containing XML-feed data) in an extremely fast,
server-side fashion?

If a request is taking longer than it should, the object would need to be
able to time out on the spot (without waiting). The timeout value would need
to be accurate to about 50 milliseconds. Usually, I would want the timeout to
be around 1.5 seconds.

Thanks in advance,
Arsen

Nov 15 '05 #4
Hey, where did my original reply go?... OK, here it is again.

Arsen,

I suppose the first question is: why does it need to be multithreaded?

Running it in a thread is fair enough, but a simple loop should suffice; no
point complicating something that doesn't require it ;-)

Look at the WebRequest class. You can set the timeout as a property. Feed
WebRequest.Create(uri) from values in a loop over an array of 10 that holds
your URLs.

Tam

"joseph.inglis" <jo***********@btclick.com> wrote in message
news:bv**********@hercules.btinternet.com...
Sorry, I should have added...

Running 10 concurrent web requests is considered rude by standard Internet
convention. Two connections per server is the accepted limit, which is why XP
is set to only allow two downloads from a host at any one time.

So a single-threaded loop through an array of URLs is your best solution ;)

Tam
"Arsen V." <ar****************@emergency24.com> wrote in message
news:eU**************@TK2MSFTNGP12.phx.gbl...
Hello,

Does anyone know of an object, or how to create one, that will allow me to
fetch up to 10 URLs (containing XML-feed data) in an extremely fast,
server-side fashion?

If a request is taking longer than it should, the object would need to be
able to time out on the spot (without waiting). The timeout value would need
to be accurate to about 50 milliseconds. Usually, I would want the timeout to
be around 1.5 seconds.

Thanks in advance,
Arsen


Nov 15 '05 #6
There are a couple of ways of doing this. I think the simplest is to use the
HttpWebRequest object. In your loop you would do:

foreach (string url in urls)
{
    // HttpWebRequest has no public constructor; cast the result of WebRequest.Create.
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.BeginGetResponse(...);
}

That will queue up all the requests. In the call to BeginGetResponse(), you
can specify a callback that will be executed when each request finishes. In
that callback you call EndGetResponse(), which gets you a WebResponse object
with the data you want.
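
Fleshed out a little, the pattern looks roughly like the sketch below; the
FetchState class, the callback name, and the 1.5 second overall wait are only
illustrative:

using System;
using System.IO;
using System.Net;
using System.Threading;

class AsyncFetcher
{
    // Illustrative state object carried through BeginGetResponse/EndGetResponse.
    class FetchState
    {
        public HttpWebRequest Request;
        public string Url;
        public ManualResetEvent Done = new ManualResetEvent(false);
    }

    static void Main()
    {
        string[] urls = { "http://example.com/feed1.xml",
                          "http://example.com/feed2.xml" };

        FetchState[] states = new FetchState[urls.Length];
        WaitHandle[] waits = new WaitHandle[urls.Length];

        for (int i = 0; i < urls.Length; i++)
        {
            FetchState state = new FetchState();
            state.Url = urls[i];
            state.Request = (HttpWebRequest)WebRequest.Create(urls[i]);
            states[i] = state;
            waits[i] = state.Done;

            state.Request.BeginGetResponse(new AsyncCallback(ResponseCallback), state);
        }

        // Note: HttpWebRequest.Timeout does not apply to BeginGetResponse, so the
        // deadline is enforced here: wait for everything, but give up after 1.5 s.
        bool allDone = WaitHandle.WaitAll(waits, 1500, false);

        if (!allDone)
        {
            // Abort anything still outstanding so it stops holding a connection;
            // its callback will then fire with a RequestCanceled WebException.
            for (int i = 0; i < states.Length; i++)
                if (!waits[i].WaitOne(0, false))
                    states[i].Request.Abort();
        }

        Console.WriteLine(allDone ? "all responses in" : "timed out waiting");
    }

    static void ResponseCallback(IAsyncResult ar)
    {
        FetchState state = (FetchState)ar.AsyncState;
        try
        {
            using (WebResponse response = state.Request.EndGetResponse(ar))
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                string xml = reader.ReadToEnd();
                Console.WriteLine("{0}: {1} characters", state.Url, xml.Length);
            }
        }
        catch (WebException ex)
        {
            Console.WriteLine("{0} failed: {1}", state.Url, ex.Status);
        }
        finally
        {
            state.Done.Set();
        }
    }
}

If you need a per-request deadline rather than one overall deadline, one way
is to register a timer per request and call Abort() from it.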

--
Eric Gunnerson

Visit the C# product team at http://www.csharp.net
Eric's blog is at http://weblogs.asp.net/ericgu/

This posting is provided "AS IS" with no warranties, and confers no rights.
"Arsen V." <ar****************@emergency24.com> wrote in message
news:eU**************@TK2MSFTNGP12.phx.gbl...
Hello,

Does anyone know of an object, or how to create one, that will allow me to
fetch up to 10 URLs (containing XML-feed data) in an extremely fast,
server-side fashion?

If a request is taking longer than it should, the object would need to be
able to time out on the spot (without waiting). The timeout value would need
to be accurate to about 50 milliseconds. Usually, I would want the timeout to
be around 1.5 seconds.

Thanks in advance,
Arsen

Nov 15 '05 #8
Hi Eric,

Thanks for the suggestion.

Would I be able to achieve the performance of the Web Application Stress tool
using async HttpWebRequest, or do I have to use sockets in multiple threads?

Also, in your example, how do I implement the timeout (it needs to be able to
time out in 500 ms to 2000 ms)?

Also, how do I wait until either the timeout elapses or all of the async
requests complete?

Thanks,
Arsen

"Eric Gunnerson [MS]" <er****@online.microsoft.com> wrote in message
news:OI**************@tk2msftngp13.phx.gbl...
There are a couple of ways of doing this. I think the simplest is to use the
HttpWebRequest object. In your loop you would do:

foreach (string url in urls)
{
    // HttpWebRequest has no public constructor; cast the result of WebRequest.Create.
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.BeginGetResponse(...);
}

That will queue up all the requests. In the call to BeginGetResponse(), you
can specify a callback that will be executed when each request finishes. In
that callback you call EndGetResponse(), which gets you a WebResponse object
with the data you want.

--
Eric Gunnerson

Visit the C# product team at http://www.csharp.net
Eric's blog is at http://weblogs.asp.net/ericgu/

This posting is provided "AS IS" with no warranties, and confers no rights.
"Arsen V." <ar****************@emergency24.com> wrote in message
news:eU**************@TK2MSFTNGP12.phx.gbl...
Hello,

Does anyone know of an object, or how to create one, that will allow me to
fetch up to 10 URLs (containing XML-feed data) in an extremely fast,
server-side fashion?

If a request is taking longer than it should, the object would need to be
able to time out on the spot (without waiting). The timeout value would need
to be accurate to about 50 milliseconds. Usually, I would want the timeout to
be around 1.5 seconds.

Thanks in advance,
Arsen


Nov 15 '05 #10
"Arsen V." <ar****************@emergency24.com> wrote in news:
#b**************@TK2MSFTNGP10.phx.gbl:
Also, in your example, how do I implement the timeout (it needs to be able to
time out in 500 ms to 2000 ms)?


Multiple threads are much easier; use blocking sockets. Sure, async is
faster, but unless you have a real dog of a machine, even blocking sockets
can easily saturate 100 Mb Ethernet.
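
A rough sketch of that thread-per-URL, blocking approach, using blocking
HttpWebRequest calls rather than raw sockets (the 1.5 second timeouts, the
example URLs, and the helper names are just for illustration):

using System;
using System.IO;
using System.Net;
using System.Threading;

class ThreadedFetcher
{
    static void Fetch(object urlObj)
    {
        string url = (string)urlObj;
        try
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            request.Timeout = 1500;          // wait at most 1.5 s for the response to start
            request.ReadWriteTimeout = 1500; // and at most 1.5 s for stream reads

            using (WebResponse response = request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                string xml = reader.ReadToEnd();
                Console.WriteLine("{0}: {1} characters", url, xml.Length);
            }
        }
        catch (WebException ex)
        {
            Console.WriteLine("{0} failed: {1}", url, ex.Status);
        }
    }

    static void Main()
    {
        // Allow all the requests to actually run in parallel (the default is 2 per host).
        ServicePointManager.DefaultConnectionLimit = 10;

        string[] urls = { "http://example.com/feed1.xml",
                          "http://example.com/feed2.xml" };

        Thread[] threads = new Thread[urls.Length];
        for (int i = 0; i < urls.Length; i++)
        {
            threads[i] = new Thread(new ParameterizedThreadStart(Fetch));
            threads[i].Start(urls[i]);
        }

        // Each request bounds itself with Timeout/ReadWriteTimeout, so these joins return quickly.
        foreach (Thread t in threads)
            t.Join();
    }
}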
--
Chad Z. Hower (a.k.a. Kudzu) - http://www.hower.org/Kudzu/
"Programming is an art form that fights back"
ELKNews - Get your free copy at http://www.atozedsoftware.com

Nov 15 '05 #11
