Simultaneous Requests to PHP Script

The server, depending on its configuration, can generally serve hundreds of requests at the same time. If you are using Apache, the MaxClients configuration option is the one controlling this:

The MaxClients directive sets the limit on the number of simultaneous requests that will be served. Any connection attempts over the MaxClients limit will normally be queued, up to a number based on the ListenBacklog directive. Once a child process is freed at the end of a different request, the connection will then be serviced.



The fact that two clients request the same page is not a problem.

So:

Will the requests be queued?

No, except if:

  • there is some lock somewhere -- which can happen, for instance, if the two requests come from the same client and you are using file-based sessions in PHP: while a script is executing, the session is "locked", which means the server will have to wait until the first request is finished (and the session file unlocked) before it can open the session for the second request.
  • the requests come from the same client AND the same browser; most browsers will queue the requests in this case, even when there is nothing server-side producing this behaviour.
  • there are more than MaxClients currently active processes -- see the quote from Apache's manual just above.
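A common way around the session lock described in the first point is to call session_write_close() as soon as the script no longer needs to write to the session. A minimal sketch (the 'user_id' key is just an example):

```php
<?php
// Read what we need from the session, then release the file lock
// so a second request from the same client is not blocked.
session_start();
$userId = $_SESSION['user_id'] ?? null; // 'user_id' is an example key

// Releases the lock on the session file; $_SESSION stays readable,
// but writes made after this point will not be persisted.
session_write_close();

// Long-running work can now proceed without blocking other requests
// from the same client.
```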


Will they be ignored?

No: that would mean only one user could use a website at a time. That would not be very nice, would it?

If that were the case, I could not post this answer while you were hitting F5 at the same moment to see if someone had answered!

(Well, SO is not in PHP, but the principles are the same)


Any other possibility?

Yes ^^



Edit, after you edited the OP and the comment:

Will each request have its own script instance?

There is no such thing as a "script instance": put simply, what happens when a request to a script is made is:

  • the webserver forks another process to handle the request (often, for performance reasons, those forks are made in advance -- but this changes nothing)
  • the process reads the PHP script from disk

    • several processes can do this at the same time: there is no locking on file reading
    • the file is loaded into memory, in a distinct memory block for each process
  • the PHP file in memory is "compiled" to opcodes -- still in memory
  • those opcodes are executed -- still from the block of memory that belongs to the process answering your request



Really, you can have two users sending a request to the same PHP script (or to distinct PHP scripts that all include the same PHP file); that's definitely not a problem, or none of the websites I have ever worked on would work!

Multiple php requests simultaneously, second request doesn't start until first finishes

This is because PHP's built-in web server uses a single process by default, which prevents it from executing multiple requests simultaneously.

The built-in server supports multiple workers from PHP 7.4 onward (on platforms where fork() is available). See this commit: https://github.com/php/php-src/commit/82effb3fc7bcab0efcc343b3e03355f5f2f663c9

To use it, add the PHP_CLI_SERVER_WORKERS environment variable before starting the server. For example:

PHP_CLI_SERVER_WORKERS=10 php7.4 -S 127.0.0.1:7080 index.php

I recommend upgrading your PHP 7.2 installation if you want to utilize this feature.

PHP requests one by one or simultaneously

  1. If you have a route which is served by a controller function, there is a separate instantiation of the controller for each request. For example: if user A and user B request the same route laravel.com/stackoverflow, the controller is ready to respond to each request, independent of how many users are requesting at the same time. You can consider this a general principle of how processes work for any service: Laravel runs on PHP, and PHP spawns a process each time we need it to process a script. Similarly, Laravel instantiates the controller for each request.
  2. For the same user sending multiple requests, each will still be processed as in point 1.
  3. If you want particular requests to be processed one by one, you can queue the jobs. For example, say you want to process a payment and 5 requests are happening. The controller will take all the requests simultaneously, but the controller function can dispatch a queued job, and those jobs are processed one by one.
  4. If two persons try to request the same route which has a DB update function, you can read a nice article about optimistic and pessimistic locking.
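As a rough illustration of point 3, here is what a queued job could look like in Laravel. The ProcessPayment class and its payment ID are hypothetical, and the exact traits can vary between Laravel versions:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

// Hypothetical job: each payment request dispatches one of these onto
// a queue, and a single queue worker then processes them one by one.
class ProcessPayment implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function __construct(public int $paymentId) {}

    public function handle(): void
    {
        // Payment logic runs here, serialized by the queue worker.
    }
}
```

In the controller, which may be handling many simultaneous requests, you would call something like ProcessPayment::dispatch($paymentId);.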

How does php and apache handle multiple requests?

Requests are handled in parallel by the web server (which runs the PHP script).

Updating data in the database is pretty fast, so any update will appear instantaneous, even if you need to update multiple tables.

Regarding the mish mash: for the DB, handling 10 requests within 1 second is the same as 10 requests within 10 seconds; it won't confuse them, and will just execute them one after the other.

If you need to update 2 tables and absolutely need these 2 updates to run together without being interrupted by another update query, then you can use transactions.
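For example, with PDO the two updates can be wrapped in a transaction so they are applied atomically. A sketch where the DSN, credentials, and table/column names are placeholders:

```php
<?php
// Placeholders: adjust the DSN, credentials, and table/column names.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

try {
    $pdo->beginTransaction();
    $pdo->prepare('UPDATE accounts SET balance = balance - ? WHERE id = ?')
        ->execute([100, 1]);
    $pdo->prepare('UPDATE ledger SET total = total + ? WHERE id = ?')
        ->execute([100, 1]);
    $pdo->commit();   // both updates become visible together
} catch (Exception $e) {
    $pdo->rollBack(); // neither update is applied
    throw $e;
}
```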

EDIT:

If you don't want 2 users editing the same form at the same time, you have several options to prevent them. Here are a few ideas:

  1. You can "lock" that record for editing whenever a user opens the page to edit it, and not let other users open it for editing. You might run into a few problems if a user doesn't "unlock" the record after they are done.
  2. You can notify in real time (with AJAX) a user that the entry they are editing was modified, just like on stack overflow when a new answer or comment was posted as you are typing.
  3. When a user submits an edit, you can check if the record was edited between when they started editing and when they tried to submit it, and show them the new version beside their version, so that they manually "merge" the 2 updates.
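The check in option 3 is essentially optimistic locking. A minimal sketch, assuming an open PDO connection in $pdo and a version column on a hypothetical articles table:

```php
<?php
// Assumes $pdo is an open PDO connection; the table and column names
// are placeholders. The form carries the version the user started from.
$stmt = $pdo->prepare(
    'UPDATE articles SET body = :body, version = version + 1
     WHERE id = :id AND version = :version'
);
$stmt->execute([
    'body'    => $newBody,
    'id'      => $articleId,
    'version' => $versionFromForm,
]);

if ($stmt->rowCount() === 0) {
    // Someone else saved in the meantime: reload the record and show
    // both versions so the user can "merge" them manually.
}
```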

There probably are more solutions but these should get you started.

Can a PHP file handle multiple requests?

I haven't seen an implementation of that for HTTP requests. All I've been able to accomplish is waiting for all the requests to come back. You could do this on the command line by forking the process and sending it to the background, or you could utilize Gearman (distributed work) for that.
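That said, if the goal is to issue several HTTP requests from PHP and wait for all of them, the curl_multi functions can run them in parallel. A sketch with placeholder URLs:

```php
<?php
// Issue two HTTP requests in parallel and collect both responses.
// The URLs are placeholders.
$urls = ['https://example.com/a', 'https://example.com/b'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run all handles until every transfer has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);

foreach ($handles as $ch) {
    $body = curl_multi_getcontent($ch); // response body as a string
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```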

Allow clients to run multiple simultaneous PHP requests

Sounds like you're looking to perform an asynchronous request (AJAX), and jQuery is useful for this.

Check out the jQuery.ajax() documentation.

There are plenty of tutorials online to help you get started with jQuery, and this SO post discusses how to make Multiple ajax calls at same time.

Edit:
Since you clarified in your comments that:

  • You are already using jQuery but are running into challenges with the server allowing multiple simultaneous connections.
  • You also indicated that you already tried adding missing values to httpd.conf and restarted your Apache server.

If adding the missing settings to httpd.conf doesn't help, also check that they are not already being set elsewhere, for example through an included httpd-mpm.conf.

Your OP indicated that you have tried resolving the session conflicts using session_write_close(), but perhaps the additional requests are getting discarded/ignored rather than queued.
You could try replacing the default file-based session handler with one that calls a database, which should eliminate the problem of conflicts that arise when two processes attempt to read the same session file at the same time. This article (Saving PHP's Session data to a database) seems to cover it pretty well.
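A database-backed handler plugs in via PHP's SessionHandlerInterface. A minimal sketch, assuming a sessions table with id, data, and updated_at columns (the REPLACE INTO syntax is MySQL-specific):

```php
<?php
// Minimal database-backed session handler. The "sessions" table
// (id, data, updated_at) is an assumption, as is the MySQL syntax.
class DbSessionHandler implements SessionHandlerInterface
{
    public function __construct(private PDO $pdo) {}

    public function open(string $path, string $name): bool { return true; }
    public function close(): bool { return true; }

    public function read(string $id): string|false
    {
        $stmt = $this->pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute([$id]);
        return (string) ($stmt->fetchColumn() ?: '');
    }

    public function write(string $id, string $data): bool
    {
        $stmt = $this->pdo->prepare(
            'REPLACE INTO sessions (id, data, updated_at) VALUES (?, ?, NOW())'
        );
        return $stmt->execute([$id, $data]);
    }

    public function destroy(string $id): bool
    {
        return $this->pdo->prepare('DELETE FROM sessions WHERE id = ?')
                         ->execute([$id]);
    }

    public function gc(int $max_lifetime): int|false
    {
        $stmt = $this->pdo->prepare(
            'DELETE FROM sessions WHERE updated_at < NOW() - INTERVAL ? SECOND'
        );
        $stmt->execute([$max_lifetime]);
        return $stmt->rowCount();
    }
}

// session_set_save_handler(new DbSessionHandler($pdo), true);
// session_start();
```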

How are multiple requests to one file on a server dealt with?

The webserver (Apache, for example) is generally able to deal with several requests at the same time (the default being 200 or 400 for Apache).

If the requests correspond to read-only situations, there should be no problem at all: several processes can read the same file at the same time -- and if that file is a PHP script, several processes can execute it at the same time.

If your script is querying a database, there should not be too many problems: databases are made to deal with concurrency situations (even if reads scale better than writes, which may have to be stacked if they modify the same data).

If your script is trying to write to a file, you should put some locking mechanism in place (using flock, for example) to avoid corruption; as a consequence, each process will wait until no other process is writing to the file before writing itself.
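For example, serializing writes to a shared counter file with an exclusive flock (the file path is just an example):

```php
<?php
// Serialize writes to a shared file with an exclusive lock.
$file = sys_get_temp_dir() . '/counter.txt'; // example path

$fp = fopen($file, 'c+');  // create if missing, open for read/write
if (flock($fp, LOCK_EX)) { // blocks until no other process holds the lock
    $count = (int) stream_get_contents($fp);
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) ($count + 1));
    fflush($fp);
    flock($fp, LOCK_UN);   // release so the next process can write
}
fclose($fp);
```

With LOCK_EX, concurrent processes running this code will each wait their turn, so the counter is incremented safely.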


