Wikipedia:Bots/Fulfilled requests/2019
- Contributions
- Operator: Computer Fizz
- Programming language: Node.js
- Function: cleanups
- Description: maintain RfD list, stub links, and correct "External links" to "Other websites", possibly other stuff as the need arises
--Computer Fizz (talk) 01:08, 20 September 2019 (UTC)
- Could you be more specific about what you mean by "maintain"? What exactly would you be maintaining? --Auntof6 (talk) 02:05, 20 September 2019 (UTC)
- @Auntof6: The RfD page has a list of all ongoing RfDs, which is tedious because it adds an extra step to opening and closing them. This bot will manage the list instead, making things more simple (no pun intended). Computer Fizz (talk) 02:09, 20 September 2019 (UTC)
- I still don't understand. Plus, what would you be doing with "stub links"? --Auntof6 (talk) 04:38, 20 September 2019 (UTC)
- I mean this. It must be edited every time a new RfD is added or removed. Almost all of the page history is just that. As for the stub links, I plan to add them to short articles and remove them from long articles. This feature can be tweaked, removed, or added to based on whether or not the community likes it. Computer Fizz (talk) 07:26, 20 September 2019 (UTC)
- That link is to an edit of a section of the page, but I already knew what page you're talking about. If we don't edit the page, how do you see letting people find the open RfDs? Twinkle automatically adds requests, so it's not a burden to edit the page for that.
As for the "stub links", I think you mean stub templates. How would you determine what pages to add them to? For one thing, it's not just a question of how long the page is; it's also about how complete the article is. For another, even if you did go by length, you'd have to count only the actual text as displayed, not the infoboxes, navboxes, categories, comments, tables, wikimarkup, references, related pages, other websites, and probably more. As for the external links, that might be helpful if we want a bot for it, but I don't usually find too many of them when I go through taking care of them. --Auntof6 (talk) 07:51, 20 September 2019 (UTC)
- @Auntof6: I meant because the bot would edit it. Either way, it is denied. Computer Fizz (talk)
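To make the measurement problem concrete, here is a minimal Python sketch of counting only the readable prose of a page by stripping markup first. It is purely illustrative: the regexes are rough, the 2,500-character threshold is an arbitrary placeholder rather than any community standard, and a real implementation would want a proper wikitext parser.

```python
import re

def prose_length(wikitext: str) -> int:
    """Approximate length of displayed prose, ignoring non-prose markup."""
    text = re.sub(r'<ref[^>/]*>.*?</ref>|<ref[^>]*/>', '', wikitext, flags=re.S)
    text = re.sub(r'<!--.*?-->', '', text, flags=re.S)   # hidden comments
    text = re.sub(r'\{\|.*?\|\}', '', text, flags=re.S)  # tables
    # Strip templates (infoboxes, navboxes, stub tags). This simple regex
    # cannot handle nesting in one pass, so repeat until stable.
    while True:
        stripped = re.sub(r'\{\{[^{}]*\}\}', '', text)
        if stripped == text:
            break
        text = stripped
    text = re.sub(r'\[\[(?:Category|File|Image):[^\]]*\]\]', '', text)
    return len(text.strip())

def looks_like_stub(wikitext: str) -> bool:
    # Placeholder threshold; as noted above, length alone does not
    # capture how complete an article actually is.
    return prose_length(wikitext) < 2500
```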
Denied. Some of these suggested tasks, like the stub links and the "External links" to "Other websites" changes, would be content changes and are specifically not allowed to be done by bots. You also can't "do other stuff that arises"; you need to have a clear, specific task. Plus, and I don't mean this as an insult, you very often have a poor understanding of how things work here (an example of which is asking for a bot to do things that can't be done by a bot) and simply cannot be trusted with a bot flag. -DJSasso (talk) 10:43, 20 September 2019 (UTC)
- @Djsasso: I see. Thanks for being nice about it, which actually helped me feel better. But may I re-request if it's been some time and/or there are new reasons? Computer Fizz (talk) 19:58, 20 September 2019 (UTC)
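For context, the list maintenance proposed above could look roughly like the following sketch, which regenerates the list of open requests instead of editing it by hand. Everything here is an assumption for illustration: the subpage prefix, the "closed" marker, and the list page title are guesses at the page structure, and Python/Pywikibot is used for consistency with the other examples even though the request named Node.js.

```python
import pywikibot

site = pywikibot.Site('simple', 'wikipedia')

# Collect open requests by scanning RfD subpages (assumed structure).
open_requests = []
for page in site.allpages(prefix='Requests for deletion/Requests/2019/',
                          namespace=4):
    if '{{Archive top' not in page.text:  # assumed marker for closed RfDs
        open_requests.append(page.title())

# Hypothetical list page that transcludes each open request.
list_page = pywikibot.Page(site, 'Wikipedia:Requests for deletion/List')
list_page.text = '\n'.join('{{%s}}' % title for title in open_requests)
list_page.save(summary='Bot: updating list of open RfDs (sketch)')
```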
- Contributions
- Operator: Mike Peel
- Programming language: Pywikipedia
- Function: Maintain commons category links using Wikidata
- Description: The bot looks through the contents of the maintenance categories at Category:Commons category link is locally defined and Category:Commons category link is defined as the pagename (which track differences between local commons sitelinks and those on wikidata) to find cases where the locally defined category links are to non-existent categories (then either removing the commons category completely, e.g. [1], or removing the local definition to use the commons sitelink from Wikidata, e.g. [2]), or are to category redirects (then removing the local link if the redirect points to the commons sitelinked category). It runs on enwp, see en:Wikipedia:Bots/Requests for approval/Pi bot 4 for full details. I see that Template:Commons category has been synced with the latest enwp version by @Djsasso: so I figure it might be useful here too.
--Mike Peel (talk) 14:44, 14 September 2019 (UTC)
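For readers unfamiliar with the workflow, the logic described in the request might look something like the Pywikibot sketch below. This is not Pi bot's actual source: the template regex and edit summary are assumptions, the error handling is omitted, and the real bot covers more cases.

```python
import re
import pywikibot

site = pywikibot.Site('simple', 'wikipedia')
commons = pywikibot.Site('commons', 'commons')
tracking = pywikibot.Category(
    site, 'Category:Commons category link is locally defined')

for page in tracking.articles():
    text = page.text
    match = re.search(r'\{\{Commons category\|([^}|]+)\}\}', text)
    if not match:
        continue
    local = pywikibot.Category(commons, 'Category:' + match.group(1).strip())

    if not local.exists():
        # Locally defined link points at a nonexistent Commons category:
        # drop the parameter so the template uses the Wikidata sitelink.
        new_text = text.replace(match.group(0), '{{Commons category}}')
    elif local.isCategoryRedirect():
        # A real bot would handle pages with no Wikidata item here.
        item = pywikibot.ItemPage.fromPage(page)
        item.get()
        sitelink = item.sitelinks.get('commonswiki')
        target = local.getCategoryRedirectTarget()
        # Remove the local link only if the redirect resolves to the
        # category already sitelinked on Wikidata.
        if sitelink is None or sitelink.title != target.title():
            continue
        new_text = text.replace(match.group(0), '{{Commons category}}')
    else:
        continue

    page.text = new_text
    page.save(summary='Bot: removing locally defined Commons category '
                      'link in favour of the Wikidata sitelink (sketch)')
```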
- @Mike Peel: This sounds like a good idea. Does the bot log its changes? --Auntof6 (talk) 07:09, 15 September 2019 (UTC)
- @Auntof6: You can see the list of changes at en:Special:Contributions/Pi_bot - if you have a more specific type of log in mind then I can look into it? Thanks. Mike Peel (talk) 07:18, 15 September 2019 (UTC)
- @Mike Peel: Well, those edit summaries are descriptive enough. What I was thinking is that someone might want to look at just the ones where the Commons category was removed so they could try to find matches. However, we do have Special:UnconnectedPages, so it's not that important. --Auntof6 (talk) 09:14, 15 September 2019 (UTC)
- @Auntof6: Hopefully that would happen via the watchlists - I get occasional alerts that pi bot was reverted on enwp when someone fixes a bad link, presumably because they're watching the article changes. Note that UnconnectedPages is for pages that aren't linked to Wikidata, not to Commons. Thanks. Mike Peel (talk) 09:22, 15 September 2019 (UTC)
- Go ahead and make a trial run of 100 edits. -DJSasso (talk) 10:40, 16 September 2019 (UTC)
- @Djsasso: I ran a test, which made 58 edits to articles and categories, see Special:Contributions/Pi_bot. I can make the rest later if you want, but I think that demonstrates the range of edits it's likely to make. Thanks. Mike Peel (talk) 12:34, 16 September 2019 (UTC)
- Yeah that is good enough. I meant to say 50 so it works out. I will take a look through the edits when I get a chance today. And if everything is good I will flag it. -DJSasso (talk) 12:42, 16 September 2019 (UTC)
- Approved. -DJSasso (talk) 14:18, 16 September 2019 (UTC)
- @Djsasso: Thanks! The first run-through is complete, I've scheduled it to run on Monday and Friday each week from now on. Thanks. Mike Peel (talk) 20:07, 16 September 2019 (UTC)
- Contributions
- Operator: DannyS712
- Programming language: AWB
- Function: Remove broken links to portals
- Description: A lot of templates that were copied over from enwiki have links to portals, but portals don't exist here. This bot task would remove broken links to portals by editing in the template namespace. For example, Portal:Asia had a bunch of links; see https://simple.wikipedia.org/w/index.php?title=Special%3AWhatLinksHere&target=Portal%3AAsia. This would only remove links specified in the source. Thus, the links to the Asia portal would be primarily dealt with by editing the template that includes them, specifically with this edit. --DannyS712 (talk) 19:48, 4 July 2019 (UTC)
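As a rough illustration of the task: the request is for AWB, but a Python/Pywikibot sketch makes the shape of it concrete. The regex and edit summary here are assumptions, not the actual AWB settings.

```python
import re
import pywikibot

site = pywikibot.Site('simple', 'wikipedia')
portal = pywikibot.Page(site, 'Portal:Asia')

# Match [[Portal:...]] links, piped or not.
PORTAL_LINK = re.compile(r'\[\[\s*Portal:[^\]]*\]\]')

# Namespace 10 is the Template namespace: edit only the templates, as
# the request proposes, rather than every page transcluding them.
for template in portal.getReferences(namespaces=10):
    new_text = PORTAL_LINK.sub('', template.text)
    if new_text != template.text:
        template.text = new_text
        template.save(summary='Bot: removing broken portal links (sketch)')
```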
- Definitely a reasonable task. Were you intending it to be just a single manual run, or actually leaving it unattended and automated? A couple of reasons I ask: if you just intended to do a manual run to fix it up once, or every once in a while, then in the future you might want to just post a note on Wikipedia talk:AutoWikiBrowser/CheckPage and we can give you temporary AWB access to quickly take care of tasks like this; any admin can do that. The other is that I went to check how big of a problem it was and could only find about 15 instances of it on the wiki, so I just fixed them since there were so few (in most cases portal links are nulled out here so they don't link or show up; the only times they aren't is when they are direct-linked like your example above). I'm surprised I missed the one you link to above when I brought the template over; I usually remove them when they are on templates, but I must have been sleeping that day. -DJSasso (talk) 12:27, 5 July 2019 (UTC)
- @Djsasso: It would just be manually triggered once in a while. It could also be extended to apply to books, since that namespace doesn't exist here either. --DannyS712 (talk) 12:30, 5 July 2019 (UTC)
- Contributions
- Operator: Examknow (talk)
- Programming language: Python
- Function: Clearing Sandbox, Archiving Discussion Threads, Fixing Common Spelling Errors, Removing Spammed External Site Links
- Description: ExamBot can use the data from pages in the wiki and Wikidata items to perform actions that are very annoying to have to do manually. In the event that the bot goes haywire, there is an emergency stop button and a script that will revert all edits made by the bot.
--Examknow (talk) 01:20, 3 May 2019 (UTC)
- We have a bot for clearing the sandboxes and a bot for archiving discussion threads. However, I'm wondering how you were able to make a bot in Python that, while automated, fixes spelling errors and determines what links are spam. Could you elaborate on that? Thanks, 10:24, 3 May 2019 (UTC)
- Denied. We have bots that do most of these already. We also don't allow bots to make content edits, such as fixing spelling errors. Also, attempting to operate without approval doesn't look good either, especially for an operator who hasn't edited here in the past with any regularity. -DJSasso (talk) 10:40, 3 May 2019 (UTC)
- And just to add, as I looked into your editing history on other wikis: you have next to no experience, and you are blocked on en.wiki. Your chance of running a bot here is essentially zero. -DJSasso (talk) 10:56, 3 May 2019 (UTC)
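For what it's worth, the "revert all edits" safety script the request mentions could be sketched as below with Pywikibot. The account name and edit count are placeholders, and rolling back requires the rollback right; checking the latest revision first keeps the script from undoing later human edits.

```python
import pywikibot

site = pywikibot.Site('simple', 'wikipedia')
bot_account = pywikibot.User(site, 'ExamBot')  # placeholder account name

seen = set()
# User.contributions() yields (page, revid, timestamp, comment) tuples.
for page, revid, timestamp, comment in bot_account.contributions(total=500):
    if page.title() in seen:
        continue
    seen.add(page.title())
    # Roll back only where the bot made the most recent edit, so later
    # human edits are never clobbered.
    if page.latest_revision.user == bot_account.username:
        site.rollbackpage(page, user=bot_account.username)
```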