Given that Wikidata currently grows at 3-10% **a week**, we need the Wikidata entity dumpers to keep up with that.
The changes in batch size (4eedfb48e9fdc93eea13d9fd3bd341e66c1abfbc) and https://github.com/wmde/WikibaseDataModel/pull/762 will already ease some of the pain, but given the immense growth, they can hardly offset more than four weeks of Wikidata growth.
Possible things to do:
* Create a "master dump" (or some such) from which all other dumps can be derived (this will ease the load on the DBs, but hardly reduce CPU time)
* Increase the number of runners further (from 5 currently) https://gerrit.wikimedia.org/r/383414
* Try to derive old dump formats from new ones (not easy to do, and it's unclear how much there is to gain here)
* Do more profiling and try to find more low-hanging fruit (like the examples above, or T157013)
* Switch away from PHP5 to PHP7 or HHVM (also see the related discussion at T172165)
* Find the right `--batch-size` (https://gerrit.wikimedia.org/r/384204)
* …
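To illustrate the batch-size tuning point, here is a minimal benchmark sketch. Everything in it is an assumption for illustration: `DUMP_CMD` is a hypothetical placeholder for the actual dump invocation (e.g. Wikibase's `dumpJson.php` maintenance script behind `mwscript`), it defaults to the `true` stub so the loop itself is runnable, and the candidate batch sizes are arbitrary examples, not recommendations.

```shell
#!/bin/sh
# Hypothetical sketch: time dump runs with different --batch-size values
# to find a sweet spot between DB round-trips and memory/CPU per batch.
# DUMP_CMD is a placeholder; on a real dump host it would be something
# like:  mwscript extensions/Wikibase/repo/maintenance/dumpJson.php
DUMP_CMD="${DUMP_CMD:-true}"   # stub command so this sketch runs as-is

for batch in 100 500 1000 5000; do   # candidate sizes are examples only
    start=$(date +%s)
    $DUMP_CMD --batch-size "$batch" >/dev/null 2>&1
    end=$(date +%s)
    echo "batch-size=$batch took $((end - start))s"
done
```

In a real run one would compare wall-clock time per entity as well as DB and memory load, since the fastest batch size locally is not necessarily the gentlest on the databases.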
#patch-for-review:
[x] https://gerrit.wikimedia.org/r/380628
[x] https://github.com/wmde/WikibaseDataModel/pull/762
[x] https://github.com/wmde/WikibaseDataModel/pull/764
[x] Release https://github.com/wmde/WikibaseDataModel/pull/762 and https://github.com/wmde/WikibaseDataModel/pull/764. Note: Will be deployed in `1.31.0-wmf.5`.
[x] https://gerrit.wikimedia.org/r/383414
[x] https://gerrit.wikimedia.org/r/384204
[] Conclude and revert/alter https://gerrit.wikimedia.org/r/384204
[x] https://gerrit.wikimedia.org/r/384322 (T178247)
I consider this task done once the dumps again finish no later than mid-Thursday and no longer run well into the weekend.