
browser resends ajax request?

jhardman
Expert 2.5K+
P: 3,405
I have a web app that updates a very large file. Processing takes around 45 seconds (and that's acceptable; there might be ways to trim it down, but it is seriously a huge file). I use (old-school) AJAX, not jQuery, to call a web service that runs through the file, checks for errors, and updates a database. Everything seems to be working EXCEPT that the browser keeps re-sending the AJAX request every 30 seconds if it doesn't get a response, and my web server just starts a new thread each time.

Is this a behavior I can change or prevent?
Jan 11 '19 #1
4 Replies


gits
Expert Mod 5K+
P: 5,234
Well, apparently this can happen under some circumstances, as described here:

https://blogs.oracle.com/ravello/bew...omatic-retries

or

https://stackoverflow.com/questions/...-post-requests

It seems to me that retries can be valid browser behaviour, so I think it should somehow be handled by the software, in case the performance can't be improved enough to avoid the timeouts.
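One way to "handle it in the software" is to make the request safe to retry: have the client attach a unique ID to the job so the server can recognise a duplicate submission and skip starting a second thread. A minimal client-side sketch, assuming old-school XMLHttpRequest as in the question; the header name X-Request-Id and the endpoint /process are illustrative placeholders, not from this thread:

```javascript
// Generate a unique ID for this logical job so the server can
// de-duplicate any automatic browser retries of the same request.
var requestId = Date.now() + '-' + Math.random().toString(36).slice(2);

var xhr = new XMLHttpRequest();
xhr.open('POST', '/process', true);              // placeholder endpoint
xhr.setRequestHeader('X-Request-Id', requestId); // illustrative header name

xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    console.log('server response: ' + xhr.responseText);
  }
};

// The server would keep a short-lived record of IDs it has seen and,
// on a duplicate, report the status of the already-running job instead
// of starting a new thread.
xhr.send();
```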
Jan 21 '19 #2

jhardman
Expert 2.5K+
P: 3,405
I re-worked it to run for ten seconds at a time and respond with how far it got, then set a timer in the JavaScript to call it again, starting where it left off, if it wasn't finished. It seems pretty kludgy to me. Is that what you had in mind?
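A minimal sketch of that chunked-polling pattern, assuming a hypothetical endpoint /process-chunk that accepts an offset and returns JSON like {"offset": 12345, "done": false}; the names are illustrative, not from the original post:

```javascript
// Process the file in short server-side bursts; each response reports
// how far the server got, and the client asks for the next chunk.
function processChunk(offset) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/process-chunk?offset=' + offset, true); // placeholder endpoint
  xhr.onreadystatechange = function () {
    if (xhr.readyState !== 4) return;
    if (xhr.status !== 200) {
      console.error('chunk request failed: ' + xhr.status);
      return;
    }
    var result = JSON.parse(xhr.responseText); // e.g. { offset: 12345, done: false }
    if (result.done) {
      console.log('file fully processed');
    } else {
      // A short delay before the next call keeps the server from being hammered.
      setTimeout(function () { processChunk(result.offset); }, 250);
    }
  };
  xhr.send();
}

processChunk(0); // start from the beginning of the file
```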

Jared
Feb 6 '19 #3

gits
Expert Mod 5K+
P: 5,234
Well, to me it doesn't sound too kludgy. If it's a process that a user starts in a frontend, you want them to wait for it, and the process takes its time (and that time can't be improved), then it's a good workaround. You can even use it to inform the user about the progress, so he/she isn't left alone with an indefinite request time.

Another way would be to use WebSockets or server-sent events, which would allow you to push the finishing message to the client when everything is done. You could send a request that starts the process and returns immediately with a 'process started' message; then, when the process is done, emit a message server-side to notify the client. This is a bigger re-work, though, in most cases.
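A minimal client-side sketch of the server-sent-events variant, assuming hypothetical endpoints /start-process (kicks off the job) and /process-events (an SSE stream), plus a named 'done' event emitted by the server; all of these names are placeholders:

```javascript
// Kick off the long-running job; the server answers immediately
// with something like 'process started'.
var xhr = new XMLHttpRequest();
xhr.open('POST', '/start-process', true); // placeholder endpoint
xhr.send();

// Listen on an SSE stream for messages pushed by the server
// while the job runs.
var source = new EventSource('/process-events'); // placeholder endpoint

source.onmessage = function (event) {
  console.log('progress: ' + event.data);
};

// Assumes the server emits a named 'done' event when processing finishes.
source.addEventListener('done', function (event) {
  console.log('processing finished: ' + event.data);
  source.close(); // stop listening once the job is complete
});
```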
Feb 6 '19 #4

gits
Expert Mod 5K+
P: 5,234
To add another option that I have used with such long-running processes: when it was possible, I deliberately split the process by using parallel XMLHttpRequests. Basically, a first request just retrieved the max, and then I split the work into smaller chunks by sending an offset along with requests made in parallel. Depending on how the backend can handle that, this can lead to a performance increase (a better user experience through shorter wait times and fewer timeout issues) with such long-running processes.

It's similar to your solution, but a bit more predictable, and the parallelism may improve overall processing time. If it's a locking operation (exclusive reading of a file or session or such), the first 'determining' request could prepare the parallel processing by splitting the file/DB up front and telling the frontend how many requests will be needed to get all the chunks. Long story short: there are several ways (including yours) to handle such things and optimize them if needed.
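A minimal sketch of that parallel-chunk idea, assuming hypothetical endpoints /size (returns the total number of records as plain text) and /process-range?offset=...&limit=... (processes one slice); the endpoint names and the chunk size are illustrative:

```javascript
var CHUNK_SIZE = 1000; // illustrative chunk size

// First request: determine how big the job is.
var sizeXhr = new XMLHttpRequest();
sizeXhr.open('GET', '/size', true); // placeholder endpoint
sizeXhr.onreadystatechange = function () {
  if (sizeXhr.readyState !== 4 || sizeXhr.status !== 200) return;

  var total = parseInt(sizeXhr.responseText, 10);
  var pending = Math.ceil(total / CHUNK_SIZE);

  // Fire one request per chunk, all in parallel.
  for (var offset = 0; offset < total; offset += CHUNK_SIZE) {
    (function (start) { // IIFE captures the offset for this chunk
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/process-range?offset=' + start +
               '&limit=' + CHUNK_SIZE, true); // placeholder endpoint
      xhr.onreadystatechange = function () {
        if (xhr.readyState !== 4) return;
        pending -= 1; // count finished chunks (error handling omitted)
        if (pending === 0) {
          console.log('all chunks processed');
        }
      };
      xhr.send();
    })(offset);
  }
};
sizeXhr.send();
```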
Feb 8 '19 #5
