Wikipedia:Bots/Fulfilled requests/2019

  • Contributions
  • Operator: Computer Fizz
  • Programming language: Node.js
  • Function: cleanups
  • Description: maintain the RfD list and stub links, and correct "External links" to "Other websites"; possibly other tasks as the need arises

--Computer Fizz (talk) 01:08, 20 September 2019 (UTC)

Could you be more specific about what you mean by "maintain"? What exactly would you be maintaining? --Auntof6 (talk) 02:05, 20 September 2019 (UTC)
@Auntof6: The RfD page has a list of all ongoing RfDs, which is tedious to maintain because it requires an extra step when opening and closing them. This bot will manage the list instead, making things more simple (no pun intended). Computer Fizz (talk) 02:09, 20 September 2019 (UTC)
I still don't understand. Plus, what would you be doing with "stub links"? --Auntof6 (talk) 04:38, 20 September 2019 (UTC)
I mean this. It must be edited every time a new RfD is added or removed; almost all of the page history is just that. As for the stub links, I plan to add them to short articles and remove them from long articles. This feature can be tweaked, removed, or added to based on whether or not the community likes it. Computer Fizz (talk) 07:26, 20 September 2019 (UTC)
That link is to an edit of a section of the page, but I already knew what page you're talking about. If we don't edit the page, how do you see letting people find the open RfDs? Twinkle automatically adds requests, so it's not a burden to edit the page for that.
As for the "stub links", I think you mean stub templates. How would you determine which pages to add them to? For one thing, it's not just a question of how long the page is; it's also about how complete the article is. For another, even if you did go by length, you'd have to count only the actual text as displayed, not the infoboxes, navboxes, categories, comments, tables, wikimarkup, references, related pages, other websites, and probably more. As for the external links, a bot might be helpful if we want one for that, but I don't usually find too many of them when I go through taking care of them. --Auntof6 (talk) 07:51, 20 September 2019 (UTC)
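Auntof6's point that only the displayed text should be counted can be illustrated with a naive sketch in Python. The regexes below are a rough assumption on my part, not how any actual stub-sorting bot works: real tools use a proper wikitext parser, and nested templates are not handled here.

```python
import re

def prose_length(wikitext: str) -> int:
    """Rough character count of a page's readable prose.

    Naive sketch: strips comments, refs, templates, tables, category
    links, link markup, headings, and bold/italic quotes before
    counting. Illustrative only; nested templates are not handled.
    """
    text = re.sub(r"<!--.*?-->", "", wikitext, flags=re.S)        # HTML comments
    text = re.sub(r"<ref[^>/]*>.*?</ref>", "", text, flags=re.S)  # footnotes
    text = re.sub(r"<ref[^>]*/>", "", text)                       # self-closing refs
    text = re.sub(r"\{\{[^{}]*\}\}", "", text)                    # flat templates (infoboxes, navboxes)
    text = re.sub(r"\{\|.*?\|\}", "", text, flags=re.S)           # tables
    text = re.sub(r"\[\[Category:[^\]]*\]\]", "", text)           # category links
    text = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]", r"\1", text) # keep only link labels
    text = re.sub(r"==+|'{2,}", "", text)                         # headings, bold/italic marks
    return len(" ".join(text.split()))
```

Even with all of that stripped away, a raw length threshold still says nothing about how complete the article is, which was the other half of the objection.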
@Auntof6: I meant because the bot would edit it. Either way, it is denied. Computer Fizz (talk)

  Denied. Some of these suggested tasks, like the stub links and the "External links" to "Other websites" changes, would be content changes and are specifically not allowed to be done by bots. You also can't "do other stuff that arises"; you need to have a clear, specific task. Plus, and I don't mean this as an insult, you very often have a poor understanding of how things work here (an example of which is asking for a bot to do things that can't be done by a bot) and simply cannot be trusted with a bot flag. -DJSasso (talk) 10:43, 20 September 2019 (UTC)

@Djsasso: I see. Thanks for being nice about it; that actually helped me feel better. But may I re-request if it's been some time and/or there are new reasons? Computer Fizz (talk) 19:58, 20 September 2019 (UTC)

--Mike Peel (talk) 14:44, 14 September 2019 (UTC)

  • Definitely a reasonable task. Were you intending it to be just a single manual run, or actually leaving it unattended and automated? A couple of reasons I ask. If you just intended to do a manual run to fix it up once, or every once in a while, then in the future you might want to just post a note on Wikipedia talk:AutoWikiBrowser/CheckPage and we can give you temporary AWB access to quickly take care of tasks like this; any admin can do that. The other is that I went to check how big a problem it was and could only find about 15 instances of it on the wiki, so I just fixed them since there were so few (in most cases portal links are nulled out here so they don't link or show up; the only times they aren't are when they are direct-linked like your example above). I'm surprised I missed the one you link to above when I brought the template over; I usually remove them when they are on templates, but I must have been sleeping that day. -DJSasso (talk) 12:27, 5 July 2019 (UTC)
  • Contributions
  • Operator: Examknow (talk)
  • Programming language: Python
  • Function: Clearing Sandbox, Archiving Discussion Threads, Fixing Common Spelling Errors, Removing Spammed External Site Links
  • Description: ExamBot can use data from pages in the wiki and Wikidata items to perform actions that are very annoying to have to do manually. In the event that the bot goes haywire, there is an emergency stop button and a script that will revert all edits made by the bot in case something goes wrong.
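A common pattern for this kind of emergency stop on MediaWiki wikis is a user-space "run page" the bot checks before each batch of edits, so any editor can shut it down by editing one page. A minimal sketch follows; the page title, wiki URL, and enable token are assumptions for illustration, not the actual ExamBot design.

```python
import json
import urllib.parse
import urllib.request

API_URL = "https://simple.wikipedia.org/w/api.php"  # target wiki (assumption)
STOP_PAGE = "User:ExamBot/Run"                      # hypothetical page title

def is_enabled(page_text: str) -> bool:
    """The bot edits only while the run page's first line is exactly "true".

    Any other content, or a blanked page, acts as the emergency stop.
    """
    stripped = page_text.strip()
    first_line = stripped.splitlines()[0] if stripped else ""
    return first_line.lower() == "true"

def fetch_stop_page() -> str:
    """Fetch the run page's raw wikitext via the MediaWiki API."""
    params = urllib.parse.urlencode({
        "action": "parse", "page": STOP_PAGE, "prop": "wikitext",
        "format": "json", "formatversion": "2",
    })
    with urllib.request.urlopen(f"{API_URL}?{params}") as resp:
        return json.load(resp)["parse"]["wikitext"]

# Before each batch, the bot would bail out unless the page is enabling:
#     if not is_enabled(fetch_stop_page()):
#         raise SystemExit("stop page triggered")
```

Keeping the check a pure function of the page text (`is_enabled`) makes the kill switch easy to test without touching the network.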

--Examknow (talk) 01:20, 3 May 2019 (UTC)

We have a bot for clearing the sandboxes and a bot for archiving discussion threads. However, I'm wondering how you were able to make a bot in Python that, while automated, fixes spelling errors and determines what links are spam. Could you elaborate on that? Thanks, 10:24, 3 May 2019 (UTC)
  Denied. We have bots that do most of these tasks already. We also don't allow bots to edit content, such as fixing spelling errors. Attempting to operate without approval doesn't look good either, especially for an operator who hasn't edited here in the past with any regularity. -DJSasso (talk) 10:40, 3 May 2019 (UTC)
And just to add: having looked into your editing history on other wikis, you have next to no experience, and you are blocked on en.wiki. Your chance of running a bot here is essentially zero. -DJSasso (talk) 10:56, 3 May 2019 (UTC)