
Wikipedia:Bots/Fulfilled requests/2013

From Simple English Wikipedia, the free encyclopedia
  • BaseBot
  • Operator: Base
  • Programming language: Java, wiki-java.
  • Function: Create 5 to 15 total stub articles about Paralympic Ice sledge hockey medalists.
  • Description: Using User:LauraHale/Paralympic medalist stub and data from the International Paralympic Committee, do a test run of 5 to 10 total stub articles about Paralympic Ice sledge hockey medalists. The bot would not create any more than 15 articles at this point in time without first seeking permission here. The articles should all be inherently notable, and we would like to do a small test run on Simple English Wikipedia with the idea that if the bot can functionally create the articles in Simple English (which does not have gendered language requirements in the same way as Spanish or Ukrainian), we can then seek permission on other language projects. There is zero intent to mass bot-create articles on Simple English Wikipedia about Paralympic medalists. Just a test run. :) Small number. If there are problems, the articles can be easily nuked, as they are small in number. An article can be created on Ice sledge hockey to complement this work if the articles are kept. --LauraHale (talk) 19:41, 6 November 2013 (UTC)[reply]
  •  Not done After previous issues with automated article creation here the community does not support bots for creating articles. You are more than welcome to bring it up on Simple Talk to see if consensus has changed but I doubt the community would accept it. -DJSasso (talk) 19:47, 6 November 2013 (UTC)[reply]
  • I would have to search for it. It has been discussed ad nauseam. We had a string of users creating thousands of articles using bots, and we have had to do mass deletions of many of them (I believe it was in excess of 10,000), so the community soured very greatly on the subject and doesn't want to see us go back down that slippery slope. I am about to run out for a bit. If you want to look in the meantime, I would suggest looking in the archives of Simple Talk, as there have been many such discussions about deleting the articles created by such means. I will look when I get back if you haven't found any of them by then. As I said, the quickest route to gauging current community interest would just be to post on Simple Talk. -DJSasso (talk) 20:10, 6 November 2013 (UTC)[reply]
  • Innocent iwbot
  • Operator: Lavallen
  • Programming language: C#, DotNetWikiBot.
  • Function: removes interwiki to recently deleted pages on svwiki
  • Description: An adminbot on svwiki is collecting information from recently deleted pages. This bot removes links to these pages on the sisterwikis. For examples, see enwp. The number of edits will most likely be small on this project. The botflag is not essential, the bot runs well without it.

--Lavallen (talk) 07:26, 6 July 2012 (UTC)[reply]

  • I am surprised there is a separate bot doing this as most regular iw bots do this as part of their tasks. However it is a completely fine task to have. So run a 50 edit trial for us to see. -DJSasso (talk) 11:50, 6 July 2012 (UTC)[reply]
Ordinary iw-bots have no access to the content of deleted pages. They will detect missing pages sooner or later, but this bot will do it within 24 hours. I hope this will be of help cross-wiki when discussing notability. On svwiki we have detected many dead iw-links in RfD talks. And since many interwiki links can be an argument in an RfD talk, such dead links can be deceptive. It will take some time to do 50 edits, but I am in no hurry. -- Lavallen (talk) 17:14, 6 July 2012 (UTC)[reply]
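For readers curious what this kind of cleanup edit actually does to a page, the core operation is a small text substitution. The sketch below is a hypothetical Python rendering (the actual bot is written in C# with DotNetWikiBot; the function name and regex here are invented for illustration):

```python
import re

def remove_interwiki(wikitext, lang, title):
    """Drop a single interwiki link such as [[sv:Sida]] from page text,
    along with the newline that followed it."""
    pattern = r"\[\[%s:%s\]\]\n?" % (re.escape(lang), re.escape(title))
    return re.sub(pattern, "", wikitext)
```

Given a report from the svwiki adminbot that `Sida` was deleted, the bot would apply this edit on each sister wiki that still links to it.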

Please remove the bot flag for my bot. It has been inactive since interwiki linking is nowadays done via Wikidata. Best regards --MagnusA (talk) 16:40, 23 August 2013 (UTC)[reply]

 Done -DJSasso (talk) 17:16, 23 August 2013 (UTC)[reply]

--Pratyya (Hello!) 05:18, 29 July 2013 (UTC)[reply]

How would this process operate? I don't think we want automated welcoming. As another admin pointed out somewhere, we sometimes get new users whose changes are all vandalism. We probably wouldn't want the same kind of welcoming for those users as we would for others, if we wanted them welcomed at all. --Auntof6 (talk) 05:21, 29 July 2013 (UTC)[reply]
Just want to put it on the record here that Prattya has spoken with me over IRC, and I have made it clear that a welcoming bot is unlikely to be well-received by the community. Chenzw  Talk  05:48, 29 July 2013 (UTC)[reply]
Also for the record, I did not know about that conversation. --Auntof6 (talk) 06:29, 29 July 2013 (UTC)[reply]
It has been long established that we don't do automated welcoming on this wiki because it is impersonal and is basically spam. -DJSasso (talk) 11:51, 29 July 2013 (UTC)[reply]
(Non-administrator observation) I knew someone would come up with this idea, and I think it is actually a really good one. Over at other WMF projects, I have seen lots of bots welcome users with a kind description so that newcomers have something nice to do instead of just doing silly persistent vandalism. Perhaps we could create or edit a welcome template and add something like "please don't vandalise and/or get blocked" or "please do some help". If this type of bot task worked, and if this request was successful, then the community could update Twinkle on this wiki if there are any issues. --~ curtaintoad ~~ talk ~ 06:23, 1 August 2013 (UTC)[reply]
It is something that many people dislike about many of the wikis that do use it. I personally hate getting spammed by such automated bots. The whole purpose of a welcome message is to be a friendly person who can help them if they have questions. Just throwing a canned message at them from a bot is the opposite of feeling warm and welcoming. It is off-putting and impersonal. -DJSasso (talk) 14:09, 1 August 2013 (UTC)[reply]
 Bureaucrat note: Given that objections over such a task have been raised in the past, community approval (at ST) would be needed before the bot/trial can even be approved. Chenzw  Talk  14:12, 1 August 2013 (UTC)[reply]

Sorry guys, I was busy these days. But my logic is that we should welcome users. I saw that in most cases you don't welcome users personally, so why is it a problem with a bot? Also, the bot will welcome only those users who have contributions, and the username blacklist is enabled. But I think we shouldn't wait for contributions. I'll leave this to you, though. I'm proposing this from my experience. I created my account on 26 January 2012, but I never knew that I could edit. I started editing in August of that year. If someone or a bot had welcomed me, maybe I would have started earlier. That's why I'm saying users should be welcomed. The welcome template contains all the rules of a wiki, so an AGF user will know the policy and will edit well. If you still don't like it, then please kindly note at Wikipedia:Bots that you don't allow welcoming bots. It'll be good for the future.--Pratyya (Hello!) 14:38, 5 August 2013 (UTC)[reply]

Someone has told me not to welcome users unless they have made positive contributions. I find that quite logical and reasonable, since we wouldn't want to welcome a vandal. I almost never place welcome templates on vandals' pages. ✉→Arctic Kangaroo←✎ 15:12, 5 August 2013 (UTC)[reply]
I once started welcoming users and an administrator told me not to. I don't think that this wiki really welcomes vandals that much. --Reception123/Receptie123 (talk) 06:37, 6 August 2013 (UTC)[reply]
Automated welcoming would be less meaningful than personal welcoming. (I actually think the mass personal welcoming done by some users isn't very meaningful, either.) --Auntof6 (talk) 19:49, 5 August 2013 (UTC)[reply]
I have to agree with Auntof6. The rule of thumb most people use is don't welcome someone who hasn't edited. All you are doing is creating a talk page for someone who may never edit, and wasting space. If you see someone actively editing, then go for it. But doing mass welcoming like she said is not meaningful. As far as saying we don't allow it in WP:BOTs, we don't really need to, because we crats who handle bot requests will just deny it. Anyone who wants to create a bot really should look at the previously approved/denied bots to begin with, to see if there is a bot that has already applied to do what they want to do, and see what comments those bots got. -DJSasso (talk) 12:20, 6 August 2013 (UTC)[reply]
Perhaps we could enable the new bot but automatically send a welcome message to the newcomer from the bot once the user has made a few positive contributions. --~ curtaintoad ~~ talk ~ 06:07, 7 August 2013 (UTC)[reply]
That is something for humans to do. There is no reliable way a bot can tell positive contributions from negative contributions. Is it really that hard for editors to welcome newcomers personally? Chenzw  Talk  06:10, 7 August 2013 (UTC)[reply]
Also, marking this request as Denied. Any other proposals, ideas etc. should be posted to WP:ST, not here anymore. Chenzw  Talk  06:11, 7 August 2013 (UTC)[reply]
  • Contributions
  • Operator: Hazard-SJ
  • Programming language: Python
  • Function: Interwiki bot
  • Description: Removing interwiki links if a page has an item on Wikidata

Thanks.  Hazard-SJ  ✈  03:56, 30 June 2013 (UTC)[reply]

Does the bot check to make sure all the interwiki links on the page are on Wikidata? I know our other bots currently leave links behind as part of their process of cleaning up interwiki links for wikis that aren't yet linked on Wikidata. -DJSasso (talk) 11:39, 10 July 2013 (UTC)[reply]
Yes, it will check to ensure the link is already on Wikidata before removal.  Hazard-SJ  ✈  01:06, 11 July 2013 (UTC)[reply]
Run 50 trial edits. -DJSasso (talk) 19:59, 29 July 2013 (UTC)[reply]
Djsasso asked if it makes sure all the links (as in, multiple links) are on Wikidata. You replied in the singular, "the link". If, for example, there are 5 interwiki links on the page, does it make sure they are all in Wikidata, all 5? If they are not, and it can't add them (either the page doesn't exist or it's already in another Wikidata item), what does it do? --Auntof6 (talk) 21:33, 29 July 2013 (UTC)[reply]
It will ensure each link is on Wikidata before removal. I haven't fully coded it yet, and I probably won't be able to for the next 2 weeks, but I'm still willing, and since I have approval to do so, I can integrate adding to Wikidata. I'll give more details and run the trial as soon as I can.  Hazard-SJ  ✈  01:24, 5 August 2013 (UTC)[reply]
Sounds great, thanks! --Auntof6 (talk) 03:17, 5 August 2013 (UTC)[reply]
Approved for trial (50 edits). Chenzw  Talk  05:21, 17 August 2013 (UTC)[reply]
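The safety check discussed above — only strip the links when every one of them is already attached to the Wikidata item — can be sketched roughly as follows. This is an illustrative Python fragment, not the bot's actual code; the regex, the function name, and the shape of the sitelinks mapping (language code to page title) are all assumptions:

```python
import re

# Crude interwiki matcher: [[lang:Title]] on its own, plus trailing newline.
IW_RE = re.compile(r"\[\[([a-z-]+):([^\]|]+)\]\]\n?")

def strip_interwikis(wikitext, wikidata_sitelinks):
    """Remove interwiki links only if every one of them is present on the
    Wikidata item; otherwise leave the page untouched for a later pass."""
    links = IW_RE.findall(wikitext)
    if not links:
        return wikitext
    for lang, title in links:
        if wikidata_sitelinks.get(lang) != title:
            return wikitext  # at least one link not on Wikidata: skip page
    return IW_RE.sub("", wikitext)
```

In the scenario Auntof6 raises (5 links on the page, only some on Wikidata), this version skips the page entirely; per Hazard-SJ's reply, the real bot would instead try to add the missing links to Wikidata first.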

See also: Wikipedia:Simple_talk#Sandbox_cleaning.  Hazard-SJ  ✈  03:14, 23 May 2013 (UTC)[reply]

  • Alright, give it a trial run. Make some changes to the sandbox etc., and then let it clean the sandbox at its intended interval. It shouldn't be a problem to approve, so we only need a small handful of cleans. If you are willing, it might be handy to have you release the source code on a user sub-page of your bot, in case you too disappear and someone can easily take over. But that is in no way a requirement, just a suggestion. -DJSasso (talk) 12:07, 23 May 2013 (UTC)[reply]
  • I removed the not done template you added since you are not a crat. We may still choose to have his bot replace yours since you are not an active editor here. Active editors are preferred to run bots. -DJSasso (talk) 14:36, 23 May 2013 (UTC)[reply]
  • OK, since we've gotten an update from Riley, should I still run the trial? I could afterwards leave a check page to disable it (though edit conflicts shouldn't be a problem). Also, releasing the source code would be no problem. I have stable versions, but currently I'm moving a few things from Toolserver to Tool Labs. Though I might use the same code (as I use for sandbot tasks elsewhere) at least initially, I plan to merge everything (no problems here) for efficiency before putting it up on GitHub.  Hazard-SJ  ✈  02:24, 24 May 2013 (UTC)[reply]
    If both bots add exactly the same code, surely there shouldn't be any conflicts (like this). Is there any reason why we can't have two bots running at the same time? You and Σ do the same thing on en's sandboxes... Osiris (talk) 08:08, 24 May 2013 (UTC)[reply]
    Just so you know, it's been running for a while now, and I see that it's made a few edits now.  Hazard-SJ  ✈  21:41, 29 June 2013 (UTC)[reply]
    Don't think that two bots cleaning the sandbox will be a major issue here, so  Approved. Some redundancy is useful too. Chenzw  Talk  05:20, 17 August 2013 (UTC)[reply]
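For reference, the core logic of a sandbox-cleaning task is very simple: on a timer, compare the page against a fixed default text and reset it if anything differs. A minimal Python sketch — the header text, function name, and schedule are invented for illustration, not taken from either bot's code:

```python
SANDBOX_HEADER = ("{{sandbox}}\n"
                  "<!-- Please change only below this line. Thanks! -->\n")

def clean_sandbox(current_text):
    """Return the text to save, or None if the page is already clean.
    A scheduler (e.g. an hourly cron job) would call this for each page
    to be reset: the sandbox, Wikipedia:Introduction, and so on."""
    if current_text == SANDBOX_HEADER:
        return None  # nothing to do; avoids a null edit
    return SANDBOX_HEADER
```

Because the reset is idempotent, two bots running the same task at most waste one edit on a conflict, which is why redundancy is harmless here.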
  • Contributions
  • Operator: Ty221
  • Programming language: AWB, Python (from tools.wikimedia.pl)
  • Function: Deleting anonymous users' discussions (adding QD template), finding potential vandalism

I already have a bot flag on plwiki. Ty221 (talk) 09:35, 29 March 2013 (UTC)[reply]

I still don't quite know what your bot is going to do. We usually don't delete user discussions; also, how does your bot find vandalism? Chenzw  Talk  10:25, 29 March 2013 (UTC)[reply]
My bot will delete a discussion page when the user is anonymous and has a dynamic IP (because nobody will receive someone else's messages). My bot will find vandalism in recent changes using my library (it finds vulgar words, nonsense character series, etc.). Kind regards Ty221 (talk) 11:23, 29 March 2013 (UTC)[reply]
Delete discussions? Removal of discussions is not allowed as long as it's not vandalism. It doesn't matter whether the discussion is anonymous or not. Did I understand your task correctly?
Reversion of vandalism: we already have a bot that does that, but I could deactivate it temporarily for you to run a bot trial. Let me know when you are ready. Chenzw  Talk  11:52, 29 March 2013 (UTC)[reply]
My bot will delete dynamic IPs' discussion pages. There is a bot on plwiki with sysop permissions which removes anonymous, dynamic IPs' discussions. Look here : .
This wiki does not remove warnings, discussion etc. on the user talk pages of even dynamic IPs, not to mention that some users will still be able to read messages; dynamic IPs do not change so quickly for most ISPs. First task is Denied. If you wish to discuss this further, you will need to propose this to the community at WP:ST. Chenzw  Talk  14:42, 29 March 2013 (UTC)[reply]
  • Contributions
  • Operator: Addshore
  • Programming language: PHP
  • Function: Removing inter wiki links that can be found on wikidata (Will only run after wikidata is deployed to this language)
  • Description: Code is already running at enwiki, hewiki, itwiki, huwiki (also approved on en, he, it, hu, no, nl, sl, is, frr, sv, bn, ca, da, sr)

·Add§hore· Talk To Me! 23:09, 3 March 2013 (UTC)[reply]

Once wikidata is deployed here I will let you go for 50 trial edits just to be official. But I have seen your work on en and I am not overly concerned. -DJSasso (talk) 23:29, 3 March 2013 (UTC)[reply]
I will run a 50-edit trial once WD is live here! ·Add§hore· Talk To Me! 17:13, 6 March 2013 (UTC)[reply]
 Done as bot is now flagged as a global bot. ·Add§hore· Talk To Me! 22:07, 6 March 2013 (UTC)[reply]
Oh didn't notice you were global already...I flagged you with the local flag for now. Once all the pages are updated I will remove the local flag unless it will still be needed. -13:30, 7 March 2013 (UTC)
  • Contributions
  • Operator: Auntof6
  • Programming language: AWB
  • Function: flag uses of the words "current" and "currently" as needing to give a more specific point in time
  • Description: Many articles use the term "currently" to mean something that is happening when the word was added to the article. These quickly become outdated. I'd like to use my existing bot to add template {{when}} inline to articles that do this. This would put affected articles into Category:Vague or ambiguous time. I just did a search on "currently" and there were 7,145 results. My bot work is semi-automatic: I would verify each change. Bot is already approved, just asking approval for this additional task. --Auntof6 (talk) 01:17, 11 February 2013 (UTC)[reply]
  • I'm not a fan of drive by tagging like this. Either fix the situation or don't tag. Not to say this isn't a good faith idea. I just don't think it should be done. That being said I will leave it to other crats to yay or nay it. -DJSasso (talk) 14:23, 11 February 2013 (UTC)[reply]
  • In general it sounds like a good idea. However, I'm not much a fan of such tags at all. I'd rather have a human eye looking at those articles and fixing which can be fixed and then probably only tag those which can't be fixed right away. -Barras talk 14:32, 11 February 2013 (UTC)[reply]
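For the sake of illustration, the tagging step being proposed could look something like the regex below. This is a hypothetical Python sketch (the actual task would run through AWB, with every hit reviewed by hand before saving; the pattern and function name are assumptions):

```python
import re

# Match "current"/"currently" not already followed by an inline {{when}}.
WHEN_RE = re.compile(r"\b([Cc]urrently|[Cc]urrent)\b(?!\{\{when\}\})")

def tag_current(wikitext):
    """Append {{when}} after each use of 'current'/'currently'.
    In a semi-automated run, each change is still confirmed by a human."""
    return WHEN_RE.sub(r"\1{{when}}", wikitext)
```

The negative lookahead keeps the edit idempotent, so re-running the bot over already-tagged articles makes no further changes.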
  • RileyBot
  • Operator: Riley Huntley
  • Programming language: Python
  • Function: Clean the sandbox.
  • Description: Bot will clean the sandbox every hour. The current sandbox-cleaning bot hasn't edited in 10 days, and its operator does not seem to be active.

Riley Huntley (talk) 23:24, 20 January 2013 (UTC)[reply]

I contacted the operator to find out why the bot stopped, and they've said they're okay with the task being taken over. It also cleaned Wikipedia:Introduction and Wikipedia:Student tutorial. Can you do those too? Osiris (talk) 07:35, 21 January 2013 (UTC)[reply]
Thank you for contacting him. I would be glad to clean those pages as well. Riley Huntley (talk) 13:20, 21 January 2013 (UTC)[reply]
Excellent! You just have to wait for a bureaucrat to come along now. Osiris (talk) 23:33, 21 January 2013 (UTC)[reply]
It's fine if you run the bot here:
  • Userpage that clearly states it is a bot; it should also be possible to leave comments...
  • When I have seen the thing run fine for a certain time, flagging it is no issue. --Eptalon (talk) 20:46, 22 January 2013 (UTC)[reply]

┌─────────────────────────────────┘

Riley Huntley (talk) 09:09, 27 January 2013 (UTC)[reply]

Looks good to me, but just asking, how did you generate the list of pages? Chenzw  Talk  09:27, 27 January 2013 (UTC)[reply]
A bot trial of 15 pages has been completed; Link to log. I generate the list of pages by running the bot through categories. For this trial, I chose Category:Biology stubs. Riley Huntley (talk) 09:40, 27 January 2013 (UTC)[reply]
There is a handful of articles in that category which do not need {{italictitle}}. How do you go about it? Or rather, is this a semi-automated task? Chenzw  Talk  09:44, 27 January 2013 (UTC)[reply]
Sorry, my mistake for not clarifying. This is a semi-automated task. What the bot does is check the page with a regex to make sure the template is not already there; then I, judging by the title and text, save or skip. I sometimes also run the bot by generating a list of pages ahead of time (I would open each page individually) and then just automatically run through the list. :) Riley Huntley (talk) 09:54, 27 January 2013 (UTC)[reply]
 Approved. Chenzw  Talk  10:35, 27 January 2013 (UTC)[reply]
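The regex check Riley describes (skip pages that already carry the template) can be sketched as follows. This is an illustrative Python fragment with assumed names and pattern, not the bot's actual code:

```python
import re

# Matches {{italictitle}} or {{Italic title}}, with optional whitespace.
ITALIC_RE = re.compile(r"\{\{\s*[Ii]talic\s?title\s*\}\}")

def needs_italic_title(wikitext):
    """True if the page does not already carry the template; the final
    save/skip decision still rests with the human operator."""
    return ITALIC_RE.search(wikitext) is None
```

A page that passes this check is then shown to the operator, who judges from the title and text whether the template actually belongs.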

Policy Discussion

The way interwiki links are handled is being changed. Instead of coding them on each individual article, most of them will be maintained in Wikidata. You can read more here. In view of this, what will we need to do about the bots we have that maintain interwiki links? Discontinue them? Convert them to remove links from pages where possible? Discuss. :) --Auntof6 (talk) 14:00, 25 February 2013 (UTC)[reply]

Pywikipedia, which most of them run on, has been updated to avoid wikis which are now part of the Wikidata system, and there are some bots that will remove the links once they have all been imported to Wikidata. Likely this will make a number of the current bots inactive, as most won't bother continuing after this change. Eventually I will be removing bots that are inactive, per our normal procedure of removing inactive bots. But just to be clear, Wikidata does not automatically make interwiki bots deprecated. I should also note that we on simple have not yet been migrated over, so bots will still be operating here as normal. -DJSasso (talk) 14:57, 25 February 2013 (UTC)[reply]

Request a new bot

Bot job request -- identify long stub articles

I often find articles tagged as stubs that are really long enough not to need the tag. Does anyone have a bot that could look at all the pages tagged as stubs, and make a list of the ones that are over a certain length? That way, I (or anyone else interested) could look at the articles to see if the stub tags could be removed. It would be great if the bot could omit tables, infoboxes, and navboxes from the character count, but even including them would be helpful. Thanks. --Auntof6 (talk) 05:32, 31 March 2013 (UTC)[reply]

Stub doesn't just mean short. A stub can be quite long and still be a stub. Stub just means an article that is missing something important. AWB of course used to treat it just as a character count but I do believe that functionality was removed because it was of course incorrect. It really does need a human eye to decide if it should be a stub or not. -DJSasso (talk) 12:43, 1 April 2013 (UTC)[reply]
I agree. This would just be a starting point. I'm asking for a list, not for a bot to make the changes. --Auntof6 (talk) 13:04, 1 April 2013 (UTC)[reply]
Yeah was just clarifying for anyone that might take care of this. -DJSasso (talk) 14:14, 1 April 2013 (UTC)[reply]
The easiest way to do this might be to query the replicated database on Toolserver. Since I'm not nearly as good at this (though I know someone who is :P), do you happen to have any idea of the minimum size to include? If it turns out that a db query isn't the easiest way, I could try to write something in Python to do this.  Hazard-SJ  ✈  02:24, 18 April 2013 (UTC)[reply]
Via the db I got User:Hazard-Bot/Long stubs.  Hazard-SJ  ✈  06:21, 21 April 2013 (UTC)[reply]
Thanks! I'll take a look at those. --Auntof6 (talk) 23:51, 25 April 2013 (UTC)[reply]

┌─────────────────────────────────┘
I think there are two different concepts involved here: The first is that an article shorter than a given size (say 2k characters) is unlikely to fully treat a subject, if it is not a redirect, or has been proposed for merging. This category is probably easy to find, and can easily be handled by a bot. The other category is that a human editor, ideally an expert in the subject, identifies that an article does not cover the subjects it should, in the necessary detail. For the first case, it would also be possible to write an edit filter that tags the articles. Edit filters are different from bots in that they only trigger when an article is edited, though. Tags have the benefit that they can be filtered for in the recent changes list. They have the drawback though that they won't show the articles that aren't edited. --Eptalon (talk) 07:41, 30 June 2013 (UTC)[reply]

I started looking at these back when the list was produced. I found that most of them were only long because they had a lot of tables, infobox data, etc., so I didn't find any (so far) that could be de-stubbed. --Auntof6 (talk) 08:21, 30 June 2013 (UTC)[reply]
It's not possible to access the page text from the database, so I'd have to modify the script to load the text separately and try to exclude such things. I'll see what I can do.  Hazard-SJ  ✈  02:11, 3 July 2013 (UTC)[reply]
Well, OK, if it's not much trouble. I didn't want to cause a lot of work here. --Auntof6 (talk) 02:40, 3 July 2013 (UTC)[reply]
Please take a look at what I have so far (you'll have to sort it by "Length" for now). That length is the number of characters, and it excludes tables using the regular syntax or s-start/s-end templates, as well as any template with "box" or "list" in the title.  Hazard-SJ  ✈  01:03, 10 July 2013 (UTC)[reply]
Thanks -- the first few I checked turned up some that I think can be de-stubbed, so I'll work some more with that list in a bit. Thanks for taking the time for this! --Auntof6 (talk) 01:34, 10 July 2013 (UTC)[reply]
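The table- and template-stripping heuristic described above can be approximated with a couple of regexes. A rough Python sketch; the patterns and function name are assumptions, and like any regex approach this does not handle nested templates (Hazard-SJ's actual script also handles s-start/s-end templates, omitted here):

```python
import re

TABLE_RE = re.compile(r"\{\|.*?\|\}", re.DOTALL)          # {| ... |} wikitables
BOX_RE = re.compile(r"\{\{[^{}]*(?:box|list)[^{}]*\}\}",  # crude: no nesting
                    re.IGNORECASE | re.DOTALL)

def prose_length(wikitext):
    """Character count after stripping wikitables and any non-nested
    template mentioning 'box' or 'list' (infoboxes, navboxes, ...)."""
    text = TABLE_RE.sub("", wikitext)
    text = BOX_RE.sub("", text)
    return len(text.strip())
```

Sorting the stub list by this measure instead of raw page size is what surfaces the articles that are genuinely long on prose rather than padded with tables.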