Maximum Execution Time in PHP - running on Windows Server

Hello Guys,

How can I stop my PHP script from throwing up a "maximum execution time" error? I'm trying to extract information from a database of about 3,000 applicants and then export it into a CSV file. It was working fine when there were around 500 or so employees, but now that it's around 3,000, the "maximum execution time" error keeps coming up after a while. I've tried using ini_set("max_execution_time", "1000"); but it's not working. What else do you think I can do?

Please note that the website is hosted on a Windows server, so I can't use .htaccess. I don't think I'll be allowed to edit the php.ini file either.

Please, is there any way around this?

I was going to say the same thing too: 3,000 records is not much. You need to optimise your query or upgrade your server specs. And if all else fails, you can manually change the max execution time in your php.ini file, provided you are not on shared hosting.

Your environment is probably set to use safe mode, and as such you'll likely be unable to set your maximum execution time. With only 3,000 applicants you should still be able to generate a CSV export without hitting the limit. Do you mind sharing the query you're using and possibly the table schema?

Place this code at the top of your script:

set_time_limit(0);

http://www.php.net/set-time-limit for reference.
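
For example, a minimal sketch of how that might sit at the top of the export script (everything below the call is just a placeholder, not the original poster's code):

<?php
// Remove the PHP-level time limit for this request only.
// Note: this call silently does nothing when PHP runs in safe mode,
// in which case max_execution_time has to be raised in php.ini instead.
set_time_limit(0);

// ...the rest of the export script (connect, query, write the CSV)...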

set_time_limit(0) is the correct thing to do if you only want this to apply to the current script during its runtime. If not, set it in your php.ini. Passing zero removes PHP's ability to time the script out; the script is then limited only by whatever maximum the OS or web server allows.

Guys, please, is there anything like a stoppage script for PHP? One that can be used in iwp to stop the server.

3,000 records? It would even be very hard to simulate that kind of environment. The database must have been poorly designed.

@Sledjama: this platform (codenigeria) is designed for all of us to interact with each other, NOT to condemn people's work. I would rather you proffer a solution than conclude that "the database must have been poorly designed".

Solution A:

  1. set_time_limit(…)
  2. Create a .htaccess file with this line: php_value max_execution_time 200
  3. Edit php.ini and add or modify the entry: max_execution_time = 90
    http://php.net/manual/en/function.set-time-limit.php - Google is your friend.
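
On a host where the .htaccess and php.ini routes aren't available (as on the original poster's Windows server), a quick check like the sketch below can confirm whether the script-level override actually took effect; the 1000-second value is only an example:

<?php
// Try to raise the limit from inside the script itself.
set_time_limit(1000);                    // per-request override
ini_set('max_execution_time', '1000');   // equivalent ini-level override

// Confirm which value PHP is actually using for this request.
echo 'max_execution_time is now: ' . ini_get('max_execution_time');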

Solution B:

  1. Change your script to "SELECT * FROM table LIMIT $n, 100", where $n is a variable retrieved from the $_GET['n'] parameter and increases by 100 each time. Once a batch is done, the script can either do a JavaScript redirect, e.g. document.location = "script.php?n=<?php echo $n; ?>", or a PHP one: header('Location: script.php?n=' . $n); - see the sketch below.
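
A rough sketch of that idea, assuming an applicants table, placeholder connection details, and the mysqli extension (none of which come from the original poster's code):

<?php
set_time_limit(0);

// Offset for this batch, passed back to this same script via the query string.
$n     = isset($_GET['n']) ? (int) $_GET['n'] : 0;
$batch = 100;

// Placeholder connection details - replace with the real ones.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// Append to the CSV file so each batch adds its rows to the previous ones.
$fh = fopen('applicants.csv', 'a');

$result = $db->query("SELECT * FROM applicants LIMIT $n, $batch");
while ($row = $result->fetch_assoc()) {
    fputcsv($fh, $row);
}
fclose($fh);

// A full batch means there may be more rows, so redirect to process the next chunk.
if ($result->num_rows === $batch) {
    header('Location: script.php?n=' . ($n + $batch));
    exit;
}

echo 'Export complete.';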

Solution C:

  1. Look at your PHP code, your SQL query, your database layout. Something here is happening way, way too slowly. It's hard to know for sure what you're doing wrong without seeing the code…
    Tips:
    A) How complex is your SQL query? Do your WHERE or ORDER BY clauses search on indexed/primary-key columns?
    B) Are you doing one big dirty SELECT * FROM table that pulls all 3,000 records at once, when you could retrieve them in batches of perhaps 100 records? (See the sketch after this list.)
    C) Or are you doing 3,000 little queries which each retrieve a single result?
    D) Do you need all the data in all the columns? Do you have one column that is huge, like a big chunk of text, that doesn't really need to be exported to your CSV file?
    E) What are you doing once you receive the data? Are you processing it, and is that processing slow?
    F) How are you storing the data in the .csv file? Is that a slow process?
    G) Where is your SQL server, and where is your client (the script)? If your server is remote and the client accesses it over the network, this can be considerably slower, so it's better to run the client on the same machine as the server for this sort of CSV generation.
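
Putting tips B, D and F together, a batched export might look roughly like the sketch below; the applicants table, its column names, and the mysqli connection are assumptions for illustration, not the original poster's schema:

<?php
set_time_limit(0);

// Placeholder connection and schema - the real names will differ.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$fh = fopen('applicants.csv', 'w');

$offset = 0;
$batch  = 100;

do {
    // Only the columns the CSV actually needs (tip D), in batches of 100 (tip B).
    $result = $db->query(
        "SELECT id, first_name, last_name, email FROM applicants LIMIT $offset, $batch"
    );
    while ($row = $result->fetch_assoc()) {
        fputcsv($fh, $row);   // fputcsv handles quoting and escaping for us (tip F)
    }
    $offset += $batch;
} while ($result->num_rows === $batch);

fclose($fh);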

Try removing (commenting out) parts of the functionality of your SQL->CSV converter. It’ll help you find out where the bottleneck is (Reading, Processing, or Writing).
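
One way to find the bottleneck without removing code is to time each phase; the helper functions in this sketch are hypothetical placeholders for the real reading, processing and writing steps:

<?php
// Hypothetical helpers standing in for the real read, process and write steps.
$t0 = microtime(true);
$rows = read_rows_from_database();      // the SELECT part
$t1 = microtime(true);
$lines = process_rows($rows);           // any per-row massaging
$t2 = microtime(true);
write_csv_file($lines);                 // the fputcsv / fwrite part
$t3 = microtime(true);

printf("Reading: %.2fs  Processing: %.2fs  Writing: %.2fs\n",
       $t1 - $t0, $t2 - $t1, $t3 - $t2);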

Sure, I know SQL (MySQL) isn't that fast - I've had to write C++ servers that would literally buffer up to 1 GB of peak data before dumping it out slowly into SQL via queries. But SQL should be fast enough to give you the results of 3,000 records. The only time I've had experience with such slow SQL data processing is when backing up internet forums, which may have on the order of 100,000+ records.