Code:
pdftk sc01.pdf dump_data | grep NumberOfPages | awk '{print $2}'
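For illustration, here is a minimal Python sketch of the same extraction, assuming pdftk is installed and a file like sc01.pdf exists; the helper name pdf_page_count is just made up here, not anything IMSLP actually uses.

Code:
import subprocess

def pdf_page_count(path):
    """Return the page count reported by `pdftk <path> dump_data`.

    Parses the "NumberOfPages: N" line, which is the same field the
    grep/awk pipeline above pulls out.
    """
    output = subprocess.run(
        ["pdftk", path, "dump_data"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in output.splitlines():
        if line.startswith("NumberOfPages:"):
            return int(line.split(":", 1)[1])
    raise ValueError(f"No NumberOfPages field in pdftk output for {path}")

if __name__ == "__main__":
    print(pdf_page_count("sc01.pdf"))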
Squid caching is way too grand for IMSLP right now. Usually there are dedicated Squid caching servers (with a ton of memory), but right now everything about IMSLP is on one server, and the first step towards multi-server is usually not in the direction of Squid caching (usually it is the separation of MySQL).

horndude77 wrote: Does Mediawiki not handle caching of pages already? I'm not too familiar with its internals. If not, I remember reading a while back about Wikipedia using Squid proxies to do some caching for them. Perhaps that's overkill, though.
Indeed... the difference is that Mediawiki handles file size caching, but of course not file page number caching... so this will have to be implemented by me. And yes, this will be the way to do caching if it is to be done; and yes, there will be a *lot* of headaches along the way :/ Especially since I'm not particularly familiar with the Mediawiki image processing code (and even less so with the huge change to it in Mediawiki 1.11)... but we'll see.

Another option is to only do it once, when the file is added. This, however, still has the disadvantage that if the file is replaced, the pages field can get out of sync. Or worse, someone can change the page count field for fun. How are you handling the file size? Perhaps this could be handled the same way (store it off in a database, I assume).
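To illustrate the idea of keying the cached count to the file itself (so a replaced file cannot leave a stale or hand-edited page count behind), here is a minimal sketch. It uses a standalone SQLite table and reuses the pdf_page_count helper from the sketch above; the table name and functions are hypothetical, not MediaWiki's actual image-metadata code.

Code:
import hashlib
import sqlite3

# Hypothetical cache table; the real thing would live in MediaWiki's
# image metadata storage rather than a standalone SQLite file.
conn = sqlite3.connect("pagecount_cache.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS page_counts (sha1 TEXT PRIMARY KEY, pages INTEGER)"
)

def file_sha1(path):
    """Checksum of the uploaded file; a replaced file gets a new key,
    so the cached page count can never silently go out of sync."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def cached_page_count(path):
    """Look up the page count by checksum, computing and storing it on a miss."""
    key = file_sha1(path)
    row = conn.execute(
        "SELECT pages FROM page_counts WHERE sha1 = ?", (key,)
    ).fetchone()
    if row:
        return row[0]
    pages = pdf_page_count(path)  # the pdftk wrapper sketched earlier
    conn.execute(
        "INSERT OR REPLACE INTO page_counts (sha1, pages) VALUES (?, ?)",
        (key, pages),
    )
    conn.commit()
    return pages

Because the count is derived from the file and stored server-side, nobody can edit it "for fun", and a re-uploaded file simply produces a fresh cache entry.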