Two Simultaneous Ajax Requests Won't Run in Parallel

This sounds like a session blocking issue.

By default PHP writes its session data to a file. When you initiate a session with session_start(), it opens the file for writing and locks it to prevent concurrent edits. That means each request going through a PHP script that uses the same session has to wait until the previous request is done with the file.

The way to fix this is either to change the PHP session handler to not use files, or to close the session for writing as soon as you are done with it, like so:

<?php
session_start(); // starting the session

$_SESSION['foo'] = 'bar'; // Write data to the session if you want to

session_write_close(); // close the session file and release the lock

echo $_SESSION['foo']; // You can still read from the session, but writes after this point won't be saved.

ajax calls not parallel

Requests

Response

Hi,

I just ran a test of your script; the responses are in the images above. The first image is the list of AJAX requests: the green ones are completed requests and the gray one is a request still in progress. The second image shows a log of the responses in the order the requests completed (the first completed request shows on top, and so on).

In the images, 1 refers to the first AJAX call (of course I renamed it to /set-progress instead of /admin/movies/1). The subsequent AJAX calls are tagged 2-6, and /progress corresponds to /admin/movies/progress.

I presume your problem is that the subsequent requests do not run until the first one has completed. From my test, I conclude this is false: we can clearly see that the first request is still pending while the other requests have already completed. So we can say that your requests are running in parallel.

For your case, there is only one other possible issue (the least probable, provided you have enough resources to process a small request): your server is out of resources, meaning it does not have enough memory to serve the subsequent calls. If the first call goes through and exhausts the server's resources, the server cannot process further requests, blocking the subsequent calls until the first request has completed. In that case, your first call (or some other running task, process, or request) is consuming your server's resources and blocking further requests.

Regarding the comments and the previous answer claiming that the session file being locked means parallel requests can't be sent to the server, or that the server cannot accept parallel requests: this statement is also incorrect. It is true that session files are locked while the session data is being updated, but this has nothing to do with the browser sending multiple requests at once.

If the file is unavailable or does not exist during a request, Laravel simply throws a 500 error. The image below shows the response for the case where the file is not available.

File not available

If the file is locked during a write, the response (to the progress call) is simply empty. This means the file exists but its contents are unavailable at the moment. The image below shows a sample response.

File locked during write operation

If this answer does not satisfy you, please follow these links and see if they help:
Link 1 and Link 2.

My test environment:

  • PHP v7.0.13
  • Apache v2.4.23
  • Laravel v5.4
  • jQuery v3.2.1
  • Firefox v54.0.1
  • Platform Windows 10

Parallel asynchronous Ajax requests using jQuery

Try this solution, which can support any specific number of parallel requests:

var done = 5; // number of requests still outstanding
var sum = 0;

/* Use $.each so each iteration gets its own scope for `number` */
$.each([1, 2, 3, 4, 5], function(index, number) {
    $.getJSON("/values/" + number, function(data) {
        sum += data.value;
        done -= 1;
        if (done === 0) $("#mynode").html(sum);
    });
});
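The same fan-in pattern can also be expressed with promises instead of a manual countdown. A minimal sketch in plain JavaScript, where `fetchValue` is a hypothetical stand-in for the `$.getJSON` call above:

```javascript
// Hypothetical async source standing in for $.getJSON("/values/" + n, ...).
function fetchValue(n) {
    return new Promise(function (resolve) {
        // Simulate network latency; each request resolves independently.
        setTimeout(function () { resolve({ value: n * 10 }); }, 10 * n);
    });
}

// Fire all requests in parallel and aggregate once every one has resolved.
Promise.all([1, 2, 3, 4, 5].map(fetchValue)).then(function (results) {
    var sum = results.reduce(function (acc, data) { return acc + data.value; }, 0);
    console.log(sum); // 150
});
```

Promise.all resolves only after every request has finished, so there is no need to track a `done` counter by hand.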

Multiple simultaneous running ajax requests in ExtJS 4

Whether it is a singleton or not doesn't change the way Ext.Ajax works. I think this could be due to your code (did you wait for the calls to finish?).

As far as I know, I have never had this problem before when making multiple calls. The only thing that can hog the calls is the server (PHP), which doesn't process them in parallel; this causes delays and generates a pattern like this:

  • Call 1 - start
  • Call 2 - start
  • Call 1 get processed in the server and Call 2 get queued up
  • Call 1 - finished
  • Call 2 get processed in server
  • Call 2 - finished

It could be disastrous if Call 1 requires more time to process than Call 2.
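The queuing pattern above can be simulated in plain JavaScript. This is only a sketch: `serverProcess` and the timings are invented to model a server that handles one request at a time, with setTimeout standing in for processing delay.

```javascript
var log = [];

// Model a server that can only process one request at a time.
var serverBusyUntil = 0;
function serverProcess(name, workMs) {
    var now = Date.now();
    var start = Math.max(now, serverBusyUntil); // queued until the server is free
    serverBusyUntil = start + workMs;
    return new Promise(function (resolve) {
        setTimeout(function () {
            log.push(name + " - finished");
            resolve();
        }, serverBusyUntil - now);
    });
}

// Both calls start immediately, but Call 2 is queued behind Call 1.
log.push("Call 1 - start");
var call1 = serverProcess("Call 1", 30);
log.push("Call 2 - start");
var call2 = serverProcess("Call 2", 10);

Promise.all([call1, call2]).then(function () {
    console.log(log);
});
```

Even though Call 2 needs far less work than Call 1, it finishes last, which is exactly the "disastrous" case described above.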

EDIT:

I have written this little demo just for you to get a feel for how it works. Check it out :) It took me half an hour lol!

Multiple parallel request is not handled by server

Decorate your controller with [SessionState(SessionStateBehavior.ReadOnly)] attribute, then you can execute requests in parallel.

Multiple simultaneous AJAX GET requests block the server until data is returned

You can look into $.Deferred, which can be used to chain multiple AJAX calls so that they run sequentially. Your server will then receive only one AJAX call at a time, and the next will be made only after the first one finishes, leaving your server free to respond to other requests.

  • Docs: https://api.jquery.com/category/deferred-object/
  • Similar issue: How to make all AJAX calls sequential?
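The sequential pattern can be sketched in plain promises as well. Here `fetchValue` is a hypothetical stand-in for a real AJAX call:

```javascript
// Hypothetical async request, standing in for a real AJAX call.
function fetchValue(n) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve(n * 10); }, 5);
    });
}

var order = [];

// Chain the calls so each one starts only after the previous one finishes.
var sequence = [1, 2, 3].reduce(function (chain, n) {
    return chain.then(function () {
        return fetchValue(n).then(function (value) { order.push(value); });
    });
}, Promise.resolve());

sequence.then(function () {
    console.log(order); // values arrive strictly in request order
});
```

The reduce builds one long promise chain, so at most one request is in flight at any moment.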

Why does the second independent AJAX response wait until the first AJAX response?

Session blocking problem.

In exp1(), calling session_write_close() before sleep(10) should prevent the delay on other concurrent page requests.

"By default PHP writes its session data to a file. When you initiate a session with session_start() it opens the file for writing and locks it to prevent concurrent edits."

Two simultaneous AJAX requests won't run in parallel

"Long story short - call session_write_close() once you no longer need anything to do with session variables."

https://www.codeigniter.com/user_guide/libraries/sessions.html#a-note-about-concurrency


