You can usually find me as Vulpix@user/ciencia-al-poder on IRC
Outside WMF I'm a developer and system administrator, and owner of https://www.wikidex.net/
Since MCR is part of core and ExternalStorage is part of core (T362566), why was the migration script created on the WikimediaMaintenance extension instead of being in core?
In T92795#10100600, @SD0001 wrote:Ideally yes, but it's almost impossible in today's MediaWiki architecture. Even just preparing to make an edit is a complex process that involves checking permissions, IP blocks, rate limits, Unicode compatibility, page protections (including cascading ones), and other restrictions added by extensions (spam/title blacklists, abuse filters, captchas). Extensions do not try to do all (or any) of that by themselves – they just rely on MediaWiki core. There's no way to tell core to "perform all other checks but skip the editcontentmodel right check".
That is, until someone rewrites EditPage. It's marked with a Surgeon General's Warning: prolonged exposure to this class is known to cause headaches, which may be fatal, so I wouldn't hold my breath.
File from 2016: https://upload.wikimedia.org/wikipedia/commons/9/9b/Icons8_flat_phone_android.svg (file description page) is served as Content-Type: image/x-icon
In T62399#9979538, @Scott wrote:As a suggestion for how this could be implemented:
- check the title field value on keydown/paste event
- disable button if Foo:Foo:Bar and display a warning plus "yes, use this title" override checkbox which re-enables the button
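A minimal client-side sketch of that check, assuming a doubled-prefix pattern like "Foo:Foo:Bar"; the function name and regex are illustrative, not an existing MediaWiki API:

```javascript
// Hypothetical helper: returns true when a title repeats its first
// colon-separated segment, e.g. "Foo:Foo:Bar" -- the case to warn about.
function hasDoubledPrefix(title) {
  // \1 backreferences the captured first segment before the colon
  return /^([^:]+):\1:/.test(title);
}

// Wiring sketch: re-check on every keydown/paste and toggle the button
// (the input/button objects here are assumptions for illustration).
function onTitleInput(input, button) {
  button.disabled = hasDoubledPrefix(input.value);
}
```

The "yes, use this title" override checkbox would simply re-enable the button regardless of what `hasDoubledPrefix` returns.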
Change #1043060 had a related patch set uploaded (by SD0001; author: SD0001):
[mediawiki/core@master] Don't check for editcontentmodel right while creating pages
I see a surge in backlog for WebVideoTranscodePrioritized
There's a solution that doesn't involve schema changes: auto create the page when the first comment is posted
In T10482#9737567, @Tgr wrote:For enhanced watchlists, (top) is probably not very useful, since edits are already grouped by page so most of the time it's obvious that the first of the group is the top edit.
(Only most of the time, because they are also grouped by date - T10681.)
Fixed, sorry, I didn't catch that strings are not nullable by default in PHP.
In T361993#9696193, @Fersteax_Pasique wrote:I can't run anything, my wiki is on Miraheze, a wikifarm, I have no access to my wikis files.
In T362013#9695476, @Killarnee wrote:Expected behavior is that there are no reverted and manual revert tags attached to these two edits, because in this case the following edit doesn't revert the previous edit.
Check the Job queue manual page to see how the queue works. You may need to run runJobs.php to speed up the process of deleting pages.
Oh! Sorry, I didn't notice that! That's correct indeed
I'm not sure if this is the same reason, but sometimes the update just doesn't happen, even if it's not a revert, deletion or done in quick succession.
If patroller X has a patrolling script that adds tags to some existing edit, will it work?
No. If https://gerrit.wikimedia.org/r/992763 gets merged, the right will only be granted to bots and sysops.
If an AbuseFilter triggered by an autoconfirmed user adds tags to their edit, will it work?
Yes. Extensions' actions are not affected by user rights.
Apparently, renaming an audio/video file doesn't automatically rename its transcodes. Instead, new transcodes are queued, and only when the file description page is purged. File renames should also rename the corresponding transcodes.
I've updated the patch and it should catch those corner cases, because I'm using the same logic as normalizeUserName from ActorStore
Looking at T353766, even if I have no details about the cause, I've amended the patch to check for general validity, which should also work for that other task
Apparently, you can't have the user name "ßéÿßlâÐëRèvêñgë". The first character, "LATIN SMALL LETTER SHARP S", is detected by MediaWiki as a lowercase character, and the user name is automatically converted to "SSéÿßlâÐëRèvêñgë" by the MediaWiki classes that handle revisions and insert the actor name.
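The underlying Unicode behaviour is easy to reproduce in any language with full case mapping; a quick JavaScript illustration (not MediaWiki's actual code path, which is PHP):

```javascript
// U+00DF LATIN SMALL LETTER SHARP S has no single-character uppercase
// form; full Unicode case mapping expands it to "SS", which is why
// ucfirst-style capitalization changes the name's length.
const name = 'ßéÿßlâÐëRèvêñgë';
const ucfirst = name.charAt(0).toUpperCase() + name.slice(1);
// ucfirst is 'SSéÿßlâÐëRèvêñgë' -- the stored form described above
```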
Apparently, Discord has changed the way it displays embeds and now it displays 3 images, too. It was displaying only one until today.
I'm not sure why the task was repurposed. Should I create a new one? Apparently, when Safari is unable to display transparency, it chooses black as the background, which is exactly what the repurposed task asks for. Supporting transparency should work for all browsers except Safari, which would display a black background instead.
Indeed, it works now. Closing...
Can I reopen this?
Thanks for the fix! I've tested it and it now works
In T54647#9313754, @alistair3149 wrote:As for Reader Web, it shouldn't affect the mobile footprint as they serve pages through MobileFrontend, which can strip the metadata out if needed.
I have the same issue. MediaWiki 1.39, PHP 7.4.33. I have more than 28000 *:rootjob:* keys on redis right now
Another option may be to add metadata to the pages that use the image. See https://developers.google.com/search/docs/appearance/structured-data/image-license-metadata
If anyone else encounters this problem and wants to recover the data from backup (only this data, and not roll back the entire database), follow these steps:
I'm pretty sure Nuke doesn't need to check IP addresses on Recent Changes. It should work using actor only. Looking at the code, I think usages of rc_user_text should be replaced by an actor name
In T117279#9255216, @Sollyucko wrote:I would like to have side by side available as an option on mobile, since inline diffs are sometimes an illegible mess of alternating words, e.g. an extreme case in https://en.m.wikipedia.org/wiki/Special:MobileDiff/1180439256: F38450324
In T189989#9137913, Krinkle wrote:Given that the social convention around MediaWiki, as originated on Wikipedia, is to use /doc pages I would generally suggest going along with that. It is not sustainable for the entire ecosystem to support multiple ways to do the same thing. In some cases we need to due to mutually exclusive requirements between important use cases, but I don't see that being the case here. I believe you could transition slowly/gradually, so there's no big barrier there either.
For the original request, I would suggest declining this request.
An additional benefit, in contrast to what WMF does to serve WebP instead of PNG/JPG on the CDN layer, is that files will be served with the correct file extension. Currently on WMF, some thumbnails are served as WebP with .png extension, confusing users when downloading those thumbnails to their devices for later viewing/editing or uploading them somewhere else.
In T18691#9034663, @Prototyperspective wrote:This would be especially needed on mobile where one doesn't have a TOC.
When fixing this, debugging the resulting SQL query, I noticed something strange. The default value as defined in extension.json is [ 0 ]. If I set $wgNamespacesForEditPoints = [ 0, 6, 10, 14 ], the resulting query results in [ 0, 0, 6, 10, 14 ] (notice how the value 0 is repeated)
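A hypothetical reproduction of that suspected merge behaviour (the real code is PHP; this is a JavaScript sketch, and the variable names are illustrative):

```javascript
// If the extension default is concatenated with the site setting
// instead of being replaced by it, the default 0 appears twice in
// the values fed into the SQL IN() clause.
const defaultValue = [0];                          // from extension.json
const wgNamespacesForEditPoints = [0, 6, 10, 14];  // LocalSettings value
const merged = defaultValue.concat(wgNamespacesForEditPoints);
// merged is [0, 0, 6, 10, 14] -- the duplicated 0 seen in the query
```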
No, it doesn't fix the issue...
I'm trying to change LinkSuggest to the textSelection API, but this alone doesn't solve the problem of LinkSuggest not working with CodeMirror.
Someone else posted the same error here: Topic:Xgzx1hu3sdlxsyg1
Such a configuration exists on MediaWiki 1.39, added to solve T217307
Note that, if you follow a link to Special:MyLanguage but the page hasn't been translated in your language, it currently redirects you to the base page.
In T326859#8738554, @Jdlrobson wrote:I don't think it's unreasonable to expect reconfiguration with a new release. This happens quite frequently from my experience with MediaWiki and I do not see any guidance in https://www.mediawiki.org/wiki/Stable_interface_policy prohibiting it.
I'm going to be bold and reopen this task, for multiple reasons.
I've uploaded a fix. If you can test it and run the script again against the namespaces that failed, it should fix the problem. Existing revisions should be skipped
Fixed by Func in https://gerrit.wikimedia.org/r/c/mediawiki/tools/grabbers/+/881378
In any case, the fix would be to only skip the setParentId, not the revision insertion (if that doesn't cause any other issue when inserting the revision).
That's strange. The scripts haven't changed much since they were tested on 1.37, and in fact I have successfully imported a wiki to 1.39 very recently.
While I can see a potential security problem in allowing variables to be defined, using the var keyword doesn't have any potential impact that I'm aware of, and it can be very useful for wikis that define variables in each skin's CSS file; even core skins could define variables for themes, dark mode, etc.
In T328610#8587808, @Tacsipacsi wrote:What should be done with gadgets whose usability/usefulness really depends on desktop vs mobile, not on Vector vs Minerva (e.g. Navigation-Popups-Gadget, which doesn’t work without a mouse)?
I'd prefer a maintenance script to trigger those notifications. Per T58028#3097853, the next-edit congratulatory notification may be difficult to reach once the milestone counts grow large. This would give site owners a lot of control over when those messages should be sent (once a week, once a month, etc.).
I think this request is invalid because it already supports those parameters (they existed before the task was created, as they are mentioned in rPWBCac834c998a from 2014)
I'm experiencing the same problem very occasionally on wikis using redis as the cache backend, and also on wikis with memcached backend. Both redis and memcached are fully functional and I don't see any error in redis logs.
The #content element has this rule applied:
This recent commit may fix the issue: rMW62763e5d52d519a8b08b8b178e0da34822367a08
At mediawiki.org there are lots of anons confirming fuzzied translations in bad faith, every day, and there's no easy way to revert them. I have to manually copy part of the text that was unwrapped from the span tags, search for it in the translation page, and manually add the !!FUZZY!! text. This is very time-consuming.
Note that the revision table was a 2-step migration.
In T282453#8469290, @adrelanos wrote:In any case (no need for nginx or the above config), MediaWiki core could generate different HTML based on the browser's Accept: HTTP request header. Two different HTML document versions in the cache, however, might be OK for a small wiki but too much for Wikipedia.
I hacked the cleanupUsersWithNoId.php script for MediaWiki 1.35 to fix other problems that weren't addressed by the original script: https://gist.github.com/ciencia/7cbe63b8520d4816ec36733454a2cb9a