
Sync of large wikis or with large files fails #42

Open
Alfredo-HMS opened this issue Aug 24, 2016 · 5 comments

Comments

@Alfredo-HMS

We have a fairly large wiki (25,000 files) with several large media files. To make the plugin work we have to:

  • Change the max_execution_time and memory_limit parameters in php.ini
  • In the sync plugin's admin.php, change the call @set_time_limit(30) to @set_time_limit(300) in function sync, and add a call to @set_time_limit(300) in function _getSyncList, just before the "get remote file list" comment
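
The php.ini side of the first step might look like this (a sketch; 300 seconds and 256M are the values that worked for us, tune them to your wiki's size):

```ini
; php.ini – raise the global limits for long sync runs
max_execution_time = 300   ; seconds a script may run (the default is often 30)
memory_limit = 256M        ; at least as large as your biggest media file
```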

We think it would be useful not to hard-code that time limit to 30 seconds, but to expose it as a plugin configuration parameter, and to include the second call in function _getSyncList.
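
A minimal sketch of what a configurable limit could look like in the plugin's admin.php, assuming a hypothetical 'timelimit' entry in the plugin's conf/default.php, read through DokuWiki's standard getConf() helper:

```php
<?php
// Sketch only: 'timelimit' is a hypothetical plugin setting, not one the
// sync plugin currently ships with.

// In function sync(), instead of the hard-coded @set_time_limit(30):
$limit = (int) $this->getConf('timelimit'); // e.g. 300
@set_time_limit($limit);

// In function _getSyncList(), just before fetching the remote file list:
@set_time_limit($limit);
```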

@FuzzyRoll

@Alfredo-HMS For someone not that good with PHP, how do I add a call to @set_time_limit(300) in function _getSyncList?

Would love to get this working.
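
For anyone in the same position, the edit amounts to one extra line at the top of the function in the plugin's admin.php (a sketch; the placement follows Alfredo-HMS's description above, and the parameter name is illustrative):

```php
<?php
// Sketch of the change in the sync plugin's admin.php.
function _getSyncList($sync_id) {  // existing function; signature unchanged
    @set_time_limit(300);          // added line: allow up to 300 seconds
    // get remote file list        // existing comment in the plugin
    // ... rest of the original function body stays unchanged ...
}
```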

@Alberto-13

I have a similar problem with a large list of files and have edited the admin.php file as suggested by Alfredo-HMS (I do not know whether I need to change the php.ini file as well, or to what values; I am new to PHP). I get the following error message:
"Failed to fetch remote file list. transport error - Timeout while reading headers (15.015s) ><"

Any suggestions?

@Alfredo-HMS
Author

Changing @set_time_limit changes the maximum execution time of the script. max_execution_time in php.ini sets the same limit, but globally. Changing both is somewhat redundant, but to play it safe we changed both. The time limit set in admin.php should override the global one, but check anyway what time limit you have set globally in php.ini.
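
One quick way to check the effective global values (a sketch; run it from the command line or as a throwaway script on the server):

```php
<?php
// Print the limits PHP is actually running with.
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
```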

The memory_limit in php.ini needs to be increased if you want to transfer big files, for example when you have attached large PDFs to your pages. Set it at least as large as your largest file. We set a limit of 256M and it works fine.

The message you get means the request ran into a 15-second timeout while fetching the remote file list.

@Alberto-13

@Alfredo-HMS thank you, that works!

  • php.ini: setting memory_limit to 256M;

  • admin.php: changing @set_time_limit(30) to @set_time_limit(300), and adding a call to @set_time_limit(300) in function _getSyncList;

  • and, when running the plugin on the "Wiki Synchronization" page, setting the Timeout to 300.

@eduardomozart

eduardomozart commented Nov 19, 2021

@Alfredo-HMS, thanks to your suggestions I was able to sync a secondary wiki to a primary one.
I believe this plugin should be updated to use a mechanism similar to the "SearchIndex Manager", "Move page and namespace" and "BatchEdit" plug-ins, which use AJAX to query the server and thereby avoid timeout issues.
When pushing large media files from the remote wiki, it could push them in chunks to avoid timeouts.
These approaches would reduce the need to change PHP parameters and would have another good side effect: the browser could report the progress of the synchronization in real time, letting the user follow it and cancel it at any time.
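
The chunked push could be sketched like this (assumptions: pushChunk() is a hypothetical helper that forwards one chunk to the remote wiki; the real plugin sends whole files in a single call):

```php
<?php
// Sketch of chunked media transfer to sidestep per-request timeouts.
function pushFileInChunks($path, $chunkSize = 1048576) { // 1 MiB chunks
    $fh = fopen($path, 'rb');
    $offset = 0;
    while (!feof($fh)) {
        $chunk = fread($fh, $chunkSize);
        // pushChunk() is an assumed endpoint on the receiving wiki; each
        // call transfers one chunk and stays well under any timeout.
        pushChunk($path, $offset, $chunk);
        $offset += strlen($chunk);
    }
    fclose($fh);
}
```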
