Wikipedia talk:Bots/Requests for approval/Archive 2
This is an archive of past discussions on Wikipedia:Bots. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | Archive 2 | Archive 3 | Archive 4 | Archive 5 |
Can someone please deal with my request for approval?
It's been over a week since the last post to it; I have beta-tested it and it works, and all the complaints have been resolved, but at the moment nothing seems to be happening with regards to approving it, or even giving me information about anything I have to change to get it approved. — Dark Shikari talk/contribs 22:05, 10 September 2006 (UTC)
- See the official policy: WP:IAR. ;-) bogdan 22:09, 10 September 2006 (UTC)
- Only if you want to see the official policy WP:BLOCK :p Seriously though, there is a lot of backlog, but we just got several new approvers; we are working on clearing the backlog, as well as streamlining the process. — xaosflux Talk 22:15, 10 September 2006 (UTC)
- Yeah, I'm balancing trying to get the hang of the approval group responsibilities on a weekend when I'd like to do a few other things. But at least the whole process is now *faster* than it was before. -- RM 22:53, 10 September 2006 (UTC)
- Thanks for the info. I just felt like nothing was happening, but now it's nice to know why ;). And WP:IAR is policy now? It seems to have been flipping back and forth lately, but now it's not only a guideline, but policy?! This is getting insane! — Dark Shikari talk/contribs 23:37, 11 September 2006 (UTC)
Just remember that this page is open to community input, so we like to wait a few days for that and also just to be a bit sure. I'd rather not just approve anything that seems good shortly after the request. On the other hand, we should not get tied up for a week+ for each request. Voice-of-All 15:50, 12 September 2006 (UTC)
Bot Flag Pages
Ironically, after all the attention my recent RfB brought (so that I could set bot flags), the recently approved bots were not quickly flagged. Perhaps some of this is a problem with the two pages: Wikipedia:Requested bot flags and Wikipedia:Bots/Approval log. These two pages basically do the same thing. I'm going to merge these pages together, and we'll have to sort it out later. It is just too much hassle to update TWO pages AND the archive AND the requests for approval page. I'm just going to slim down the process. -- RM 02:16, 11 September 2006 (UTC)
- There are some bots that are approved but do not require a flag, and thus are not suitable to be listed at Wikipedia:Requested bot flags. --WinHunter (talk) 02:19, 11 September 2006 (UTC)
- I think the pages are a mess; I don't like the Log page at all. All approved bots are already in the archives here, and the operators summarize them on the Bots page. Too much paperwork. If flags aren't involved, we don't need to get crats involved, so the Requested bot flags page may be useful. I've updated the approved section of this request page to say approved for operation, instead of approved for flag, as not all of them will need to be flagged. Most bots can flag their operations as minor to be filterable on RC; many operators like to run multiple tasks under their bot, sometimes these require them to be flagged as normal, sometimes not; unless an editor is making a single-purpose bot, or a bot that will only ever do flaggable items, there is no need to flag it. Generally if their edits/min are going to be <3, they don't really need a flag either. — xaosflux Talk 02:36, 11 September 2006 (UTC)
- Even if it is less than 3 per minute, they should still have a flag. These types of things are additive: one or two bots isn't a problem, but add them all together and it passes the threshold. It is work that doesn't need to show up in RC normally, minor or not. Vandals can use "minor edit" as well, so the bot flag helps in all vandalism patrol. But yes, there are a few cases where not having a bot flag is great, but that is probably the exception to the rule.
- I've held off on merging the pages for now, as we can probably have a fun discussion about it. I'm going to just be bold and try a few things though, see what happens :)
- -- RM 02:48, 11 September 2006 (UTC)
- We could probably do away with the log. The bot flagging log covers a good deal of them. The rest, and the flagged ones too, should be easily queried by looking at the archives, since they just use the bot's name anyway; it's not like it's hard to find. Also, we should encourage people to link to them on the bot's userpage to make this even easier. Voice-of-All 03:03, 11 September 2006 (UTC)
- I agree Betacommand (talk • contribs • Bot) 03:16, 11 September 2006 (UTC)
- Certainly agree, perhaps a bot requirement should be a link to its approval somewhere on its user page? — xaosflux Talk 04:05, 11 September 2006 (UTC)
- That'd be a great idea, especially with the new template/transclusion system on the page. I'd like to see a link to each individual request and approval for each bot. alphaChimp(talk) 04:09, 11 September 2006 (UTC)
Sounds reasonable RM, we (the BAG) should come up with a good standard threshold for flag/no flag (other than obvious reasons for no flag). I think we could go overboard with flagging though if we're too loose. — xaosflux Talk 02:53, 11 September 2006 (UTC)
- If Wikipedia:Bots/Approval log is too much paperwork how about not using it altogether instead of merging it to some place? (tag it inactive and leave it there for historical purposes) Just like the old archives in the BRFA page. --WinHunter (talk) 03:00, 11 September 2006 (UTC)
- My standard would be "does this bot really need to be watched?" such as an antivandal bot or some other bot that deals with varying, almost human-level, complex tasks. Voice-of-All 03:03, 11 September 2006 (UTC)
- If a bot is performing such complex tasks, then it may make sense to run it without a flag, provided that it isn't performing a large number of edits frequently. But my opinion is to avoid running WITHOUT a flag as much as possible. Someone can still look at bot edits in RC if they want to, but the point is that with a flag they have a choice. -- RM 03:31, 11 September 2006 (UTC)
- If the tasks are too complex, regardless of edit amount, it should not be flagged. If it's that risky it needs to be slowed down. Voice-of-All 03:43, 11 September 2006 (UTC)
- Yes, that's pretty much what I was trying to say. Faster, larger bots need flags. Slower, complex bots shouldn't have them. -- RM 03:50, 11 September 2006 (UTC)
I've made historical/inactive the Approval log. It wasn't being used by bureaucrats anyway and it was just extra documentation. We may come up with an alternative at a later point, but for now it's just overhead. -- RM 17:26, 11 September 2006 (UTC)
- Hi all. About the delay in flagging approved bots: for most of the Bureaucrat-related duties, there's a little delay. Even in RfA, which is the more "critical" of everything Bureaucrats do, there can be some delay. That's a side effect of the fact that the Bureaucrat group is supposed to be a small one, for security reasons. Currently, with Essjay inactive, we are a little more busy than usual.
For my part, I've got all of the bot-related forums on my watchlist, but because of the high traffic, I'm seldom lucky enough to see an "approved" edit when I access my watchlist. I know it's a little more work, but posting to our Noticeboard is indeed the quickest way to get an approved bot flagged. Since I seem to be sort of the "main" Bureaucrat on the "bot flagging front" for the moment, I'd like to say to the Approvals Group members mainly, but also to all users that may be concerned: please feel absolutely free to ping me directly on my talk page or even e-mail for your bot flagging needs ;) — although if you post only in one of those places, other Bureaucrats will not know about it, and thus won't be able to carry out the request if they come around before I do. But in any case, I'm at your disposal, so that we can get approved bots flagged as soon as possible. Cheers, Redux 00:34, 12 September 2006 (UTC)
- Presuming all group members are admins (which they ought to be), best make it a fully protected page? --kingboyk 10:07, 12 September 2006 (UTC)
- Thanks! It's good to see that some 'crats are interested in keeping up with these pages; I also got a note from Nichalp about them. Perhaps we could have a SHORT page that lists bots that need flags, linking to their page where we already approved their operation AND flagging; once flagged, a crat could just mark it on the page? — xaosflux Talk 02:09, 12 September 2006 (UTC)
- Please scrap the fully protected page idea (I'm not an Admin) Betacommand (talk • contribs • Bot) 12:49, 12 September 2006 (UTC)
- Nichalp had mentioned a new forum as well. I like the idea. Something along the lines of WP:CHU, that is, succinct, with a predefined format that would include the name of the bot and links to the relevant discussion and, perhaps just to make it quicker, a diff to the edit in which an Approvals Group member gave the final endorsement (I always copy those into the "comment" section of the "makebot" tool). We don't need to protect it. We'd have instructions at the top (as we do at WP:CHU), which could include a list with the names of the members of the Approvals Group, explaining (if we decide to do it that way) that only members of the Approvals Group may post there requesting a flag for a bot. So, naturally, if someone else posts, we will deny the request immediately, or remove it from the page.
In fact, this page should have two sections: one, and by far the busier one, would be to grant flags to approved bots; the other would be one to request removal of the flags from bots that have either lost endorsement or whose operator simply decides to "debot" (this one may not be restricted to Approvals Group members, since bot operators may wish to relinquish their bots' flag themselves). Just a few ideas. Redux 13:19, 12 September 2006 (UTC)
- Perhaps you are referring to Wikipedia:Requested bot flags? -- RM 13:33, 12 September 2006 (UTC)
- Yes! I had completely forgotten about that proposal. So we just need to rekindle it, maybe "revamp it" a bit. In fact, I suggest we move the discussion to its talk page. Redux 14:13, 12 September 2006 (UTC)
- We'll continue posting to WP:BN until we finish up the backlog and have a chance to work more closely on that page. -- RM 14:29, 12 September 2006 (UTC)
- From my perspective I don't see how anything more is needed than this page. Now that there is an active approvals group we all know to watch this page. Since bots are linked in the approvals section when approved (with the needed link to the approval etc), all we have to do is watchlist this one page. I'd much prefer that to another redundant page unless I'm missing something. - Taxman Talk 02:26, 15 September 2006 (UTC)
- Cross-listing this discussion is probably counter-productive. So if you would, please see my answer to your question on BN and post a response there instead. -- RM 12:44, 15 September 2006 (UTC)
De-indent. No, no, that's not what I was referring to. What I mean is the discussion about this page, Wikipedia:Requested bot flags and Wikipedia:Approved Bots Awaiting Flags. I realize there is a merge proposal for the latter two, but what's the need for more than even one page? Wikipedia:Bots/Requests for approval serves every need I can think of. - Taxman Talk 14:13, 15 September 2006 (UTC)
Tawker's steps to bot flagging
- Approve bot
- Find a B'Crat on IRC (lots of very nice people there)
- Bot flag is assigned quickly and without much fuss. Much smarter way of doing things IMHO :) -- Tawker 20:53, 12 September 2006 (UTC)
EssjayBot II
I noticed today that EssjayBot II (BRFA · contribs · actions log · block log · flag log · user rights) is running. It seems like this bot never made it through the process and does not have a bot flag. Granted it only runs 2 - 10 edits per day, but this is a unique situation with a bot approvals group member apparently approving their own bot. At minimum we should do something about the expired request, but I'd like to get some thought on what next to do. Should we just approve the bot? What about a bot flag, should it have one? I personally say, yes, that's what the bot flag is for, since all bot edits are additive and there is no compelling reason to not hide the bot edits. Essjay is trusted and it is my understanding that the actions being performed are not controversial, problematic, or overly complex, so thus it should be flagged. -- RM 19:33, 13 September 2006 (UTC)
- I am tempted to agree with you on this one, but it must be noted Essjay (talk · contribs · blocks · protections · deletions · page moves · rights · RfA) has been inactive for quite some time now with no explanation... — FireFox (talk) 19:57, 13 September 2006
- Looks like a screw-up; the tasks page is actually talking about EssjayBot III. II was approved, IIRC. --pgk 20:16, 13 September 2006 (UTC)
- Since you approved EssjayBot II (see Approval log), perhaps you could shed some light on why it has no bot flag? -- RM 20:21, 13 September 2006 (UTC)
- (ec)It appears as if the request above was for *additional* functionality, to do user pages in addition to project talk pages (which is apparently what User:EssjayBot III is supposed to do). I don't know where this was previously discussed or how it was approved. Perhaps it wasn't approved. In any case, I don't see a problem with it at this point. The issue at hand is what to do since 1) Essjay is inactive while his bot continues to run, 2) the bot runs without a bot flag, and 3) the request above appears to be what EssjayBot III is designed to do, but not approved for, so maybe we should approve it. -- RM 20:17, 13 September 2006 (UTC)
- On second thought, we shouldn't approve EssjayBot III since it doesn't appear to be tested. -- RM 20:18, 13 September 2006 (UTC)
- I've sorted out the pages. The point in the bot flag is to not clog up recent changes with minor(ish) edits, the nature of this bot doesn't tally with that rationale, low edit count bots tend not to be flagged so they do have the transparency of appearing in RC. --pgk 20:25, 13 September 2006 (UTC)
- Well, since I've just recently been pushing my agenda on the usage of bot flags, that's why I brought it up ;-) I looked at the history when it was approved, and I see that you, Essjay, and Xaosflux all were in agreement about it having no flag. Still, at this point the changes made by the bot can be considered trivial and/or low-risk, as it has been making these changes for months now without incident, AFAIK. IMO, there is no longer any reason for it to NOT have a bot flag. Again, the point of the bot flag is not to make bot changes invisible. The point is to give people who don't care about them the ability to turn them off. That isn't to stop RC patrol from looking at bot edits to ensure their accuracy. The point is that *any* edits that do not need to be checked by vandalism patrols are good to have filtered out. I don't see why it needs to be checked by more eyes. The pages that get archived have enough eyes as it is. If a mistake were made, it would be caught quickly, and almost always because of the users of those pages, not because of RC. -- RM 20:32, 13 September 2006 (UTC)
- I don't intend to argue the point beyond this, because ultimately it makes little or no difference to me, and the owner of the bot was of course more than capable of flagging the bot. I disagree slightly with your view on this in that many people will ignore bot-flagged edits regardless of their option not to do so; thus by giving out bot flags we are giving people a facility to sweep things under a lot of people's radar. Bots like this, if working properly (and provided the user acts responsibly with the account), I agree there is probably little concern about; the concern is in the opposite circumstance, where the bot is malfunctioning or the owner is acting irresponsibly with the bot account. Given the impact of having an additional 2-10 edits show up a day versus the "risk" element, it doesn't appear to me to be a clear cut case either way. --pgk 20:42, 13 September 2006 (UTC)
- Essjay is a bureaucrat, so obviously we have *no* concern about acting irresponsibly. If that were the case, the bot flag would be the least of anyone's concern. If it was a different user I'd perhaps feel differently. Since bot edits are additive and the pages edited are high profile pages with lots of users checking the history, I think a bot flag makes sense here, but because of the low number of edits, I also do not feel like it *has* to be that way. -- RM 20:47, 13 September 2006 (UTC)
- (ec)As an aside, I was somewhat serious about my agenda. But not so much that I don't want to establish consensus. When the bot flag was first created, its purpose was to hide the rambot. My changes were not trivial in a large number of cases, but the flag was granted anyway. Anyway, I have a particular viewpoint of how the bot flag should be used. Other persons have a *totally* different perspective, perhaps from the newer Wikipedia era of not trusting anyone (so much for AGF!). In any case, there is no written policy of when a bot flag should and should not be assigned, although some discussion has already occurred, and we will at some point have to come to a consensus on this very issue, which is why I bring it up. I just think it important to keep in mind that the bot flag does not shield the bot from being checked. I myself sometimes do some RC patrol for bot edits. -- RM 20:43, 13 September 2006 (UTC)
- Arrgh, you've drawn me in. AGF is certainly not an issue here, as the owner of the bot is in a fairly trusted position, checkuser, boardvote, oversight etc. That said AGF is also not a call to shut your eyes and hope, we aren't insulting anybody or suggesting bad faith by presenting their bot edits without the bot flag, the same way we aren't with their regular edits. I'm not assuming anyone/everyone will turn rogue otherwise I'd suggest that we never bot flagged anyone and I wouldn't by default exclude bot edits in the IRC bot I run for several wikis.
- One counter-argument would be, of course, that we don't flag normal editors as bots, because part of RC is also that people may actually be genuinely interested in the changes from a non-vandalism-hunting perspective. The nature of the filtering is all or nothing; whereas I would have a fair guess that the majority of people aren't interested in seeing (say) interwiki links being added/removed, I'm not so sure I can say the same of every bot task. Part of the equation also has to be whether the edits "logically" fit as bot edits. --pgk 21:03, 13 September 2006 (UTC)
- Your points are well taken, and I apologize for "drawing you in". I would like to get the opinions of others as well, so a statement from everyone is helpful to try to determine consensus on this issue (if it exists). I'd agree with next to everything that you've said, as it is quite reasonable. I apologize for invoking AGF inappropriately. Your points in general are quite good. If I might nitpick some more, in the case of Essjay, archiving pages is to me almost on the order of interwiki links. They are being archived because no one cares about the discussion anymore. If people were still interested in seeing those changes on RC, they'd have taken a vested interest in the discussion. I can't imagine why anyone would want to know from RC whether talk pages have been archived. A watchlist entry would be a *much* better alternative than trying to find it in RC! -- RM 21:16, 13 September 2006 (UTC)
Elissonbot waiting for response after trials
I am not sure how the approval works, and I can't find any description on what you want me to do after I have done a trial run for my Elissonbot (talk · contribs · count · logs · page moves · block log), but I was approved to do a trial run which I have now completed, and added a note on the Elissonbot approval page three days ago. What to do now? – Elisson • Talk 18:14, 14 September 2006 (UTC)
- Your bot will soon be granted the bot flag and then you will be free to operate your bot as requested. So long as the task doesn't change, you won't need to bother with this approval page again. As soon as the page is approved this page will be updated to indicate the bot flag assignment. -- RM 00:53, 15 September 2006 (UTC)
Archive Templates
I have created {{Bot Top}} and {{Bot Bottom}} for archiving the discussions. They include the <noinclude></noinclude> code along with the category for the discussion; I thought that might be better than the {{Debate top}} and {{Debate bottom}} that we are currently using. Betacommand (talk • contribs • Bot) 13:38, 15 September 2006 (UTC)
AWB accounts
Hello all. I've recently created a bot account (User:Halibott) for my WP:AWB changes. While I could do all those tedious, repetitive changes with my own account, I decided it would be easier to keep them separate. That way my watchlist doesn't get bogged down and my edit history is easier to track. While AWB is by no means an automatic bot (it does not make changes automatically and all of them have to be approved first), User:Sciurinæ has pointed out to me that perhaps I would have to be approved by this body first, before I use that account. Is it needed? //Halibutt 22:02, 16 September 2006 (UTC)
- If you're approving every change manually, you don't technically need a bot flag (but having an account named "bot" isn't such a good idea in that case). If you're using the auto-save feature you do need a bot flag. Also, if you're doing a lot of edits at speed a bot flag would be a good idea so that you don't flood recent changes. --kingboyk 22:05, 16 September 2006 (UTC)
Helper script
I wrote a script[1] to help automate the approvals process for AG members. Voice-of-All 03:06, 17 September 2006 (UTC)
Template update
Just to inform everyone that I updated the template for bot requests with a new template at the top. Thought it might be very convenient for all users.
I was going to use {{SUBPAGENAME}} but if users created pages such as Wikipedia:Bots/Requests for approval/AP.BOT 2nd function it wouldn't work. Not quite sure if you make another bot page for a new function or just add the new request to the approval page. Though using {{SUBPAGENAME}} is very useful so the user only has to type the bot name once.--Andeh 16:53, 20 September 2006 (UTC)
- I've done some more tweaks to the template, and I'll probably make a few more later. At the moment, I added support to the template for alternate functions. The code below...
- *{{botlinks|Zorglbot}} ''In trial period as of 07-Sep-2006'' (''Task 1'')
*{{botlinks|Zorglbot|2}} ''In trial period as of 07-Sep-2006'' (''Task 2'')
- ...produces the following:
- Zorglbot (BRFA · contribs · actions log · block log · flag log · user rights) In trial period as of 07-Sep-2006 (Task 1)
- Zorglbot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) In trial period as of 07-Sep-2006 (Task 2)
- Hopefully this should solve the current problems with the system. -- RM 16:44, 22 September 2006 (UTC)
How large a portion is "substantial"?
I read in the policy that "bots that download substantial portions of the page database are prohibited". Out of curiosity, would roughly 20,000 articles — the physics and mathematics coverage — count as a "substantial portion"? Background to my question can be found here and here, for the morbidly curious. --Anville 21:38, 21 September 2006 (UTC)
- Does 20,000 out of 1,395,974 (~1.4%) seem like a "substantial" portion to you? It doesn't to me. -- Gwern (contribs) 22:56, 21 September 2006 (UTC)
- If it's going to be a one-time thing, there shouldn't be any problem: it's too small a set for downloading a database dump to be worthwhile. As a point of comparison, OrphanBot downloads between 2000 and 4000 pages a day through the edit interface, and 4800 pages a day through Special:Export, which doesn't seem to be having a major impact on Wikipedia's operations. --Carnildo 00:12, 22 September 2006 (UTC)
- This is an interesting question and response. I have thought about downloading a subset of Wikipedia (probably in the few times 10000 page range) for my own private uses with refreshes to my local copy every few months. I have had in mind working from the database dumps, but Special:Export might be more practical if traffic on this scale doesn't bother people. Dragons flight 00:23, 22 September 2006 (UTC)
- Off the cuff, 20,000 articles is substantial, but not detrimental. How often would this be downloaded, and how quickly? Also, once downloaded, what do you plan on doing with it? Uploading changes to 20,000 pages rapidly would certainly be an issue. — xaosflux Talk 01:09, 22 September 2006 (UTC)
- Yes, I expect this to be a one-time thing, and no, I don't plan to be uploading them again. I had figured by the percentages that it wouldn't be a massive chunk, proportionally speaking, but I didn't know if in absolute terms it would still pose an overload. (I can also distribute the downloads over time to reduce the stress.) Thanks for the information. Anville 01:22, 22 September 2006 (UTC)
- http://en.wiki.x.io/robots.txt suggests a throttle rate of 1 per second. If you're willing to spend 20000 seconds downloading, I wouldn't think it would be a problem. (20000 seconds is 5.5555555555556 hours, according to #expr). --ais523 15:54, 22 September 2006 (UTC)
- I agree. The robots.txt instructions should be followed no matter what. I can't think of any case where it would be acceptable to violate this. So long as the access is reads (as in the case of downloading) and not writes and limited to a one time action, I don't have a problem. The fact that permission was sought suggests good will. I wouldn't consider this to be precedent so that *anyone* could do this, however. -- RM 16:29, 22 September 2006 (UTC)
(deindent)I wanted to add this note: An acceptable use for downloading 20,000 articles is for some task that is going to be used to improve Wikipedia. It is perhaps not acceptable if the use is for some external use. I suppose it is a judgement call, but if I had to download 30,000 U.S. city articles, say, to fact check them and for other offline processing tasks, then it would be acceptable. But if I was downloading those same articles for another purpose unrelated to improving Wikipedia articles, then I would expect such a request to be denied. -- RM 16:47, 22 September 2006 (UTC)
- Thank you for the note; I am certainly in accord with those sentiments. (Having had to maintain servers myself, I fully sympathise with the desire to avoid overloading them!) My plan stemmed from an idea I proposed to Byrgenwulf and Hillman here and here respectively; I hope, but cannot guarantee, that it will redound to Wikipedia's benefit. Anville 18:11, 22 September 2006 (UTC)
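For illustration, here is a minimal sketch of the kind of throttled, one-time, read-only download discussed above, staying at or below the one-request-per-second rate suggested by robots.txt. It assumes the third-party Python "requests" library; the batch size and the Special:Export form fields ("pages", "curonly") are assumptions to be double-checked, not a definitive recipe.
<pre>
# Minimal sketch of a throttled, one-time, read-only download via Special:Export.
# Assumptions: the "requests" library is available, and Special:Export accepts a
# POSTed newline-separated title list via the "pages"/"curonly" form fields.
import time
import requests

EXPORT_URL = "https://en.wiki.x.io/wiki/Special:Export"
BATCH_SIZE = 50        # modest, illustrative batch size
DELAY_SECONDS = 1.0    # at most one request per second, per robots.txt guidance

def export_pages(titles, out_path):
    """Fetch current page XML for `titles` in small batches, gently throttled."""
    with open(out_path, "w", encoding="utf-8") as out:
        for i in range(0, len(titles), BATCH_SIZE):
            batch = titles[i:i + BATCH_SIZE]
            resp = requests.post(
                EXPORT_URL,
                data={"pages": "\n".join(batch), "curonly": "1"},
                headers={"User-Agent": "OneTimeExportSketch/0.1 (read-only)"},
                timeout=60,
            )
            resp.raise_for_status()
            out.write(resp.text)          # crude concatenation of the XML batches
            time.sleep(DELAY_SECONDS)     # throttle between requests
</pre>
Batching titles keeps the total request count far below one request per article while still spreading the load out over time.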
Bot Approval Status
My new bot request has not been acted on for over 2 days, despite the new approvals group members. Any chance of taking a look? (I'll reiterate my offer to help with bot approvals if need be...) Thanks. alphaChimp(talk) 01:24, 26 September 2006 (UTC)
- Well, you normally should get comments within the first day, but we like to wait a bit for some bots before approving, whether for questions, trials, or community input. There is sometimes a delay, though. Voice-of-All 03:09, 26 September 2006 (UTC)
- We could clearly use more members to the approvals group, although the addition of new members has vastly increased the speed of approvals. The next step is to perhaps establish a more stable process for adding new members. Since you (alphachimp) only just barely missed joining the approvals group, perhaps by the time policy is established there will be sufficient evidence to support adding you to the group. The discussion can be taken up at any time on Wikipedia talk:Bots/Approvals group and not just by approval group members. Feel free to add your thoughts, comments, and ideas. As for me, I don't think I really understood how time consuming approving bots can be. Good thing we don't have to rely on 1 or 2 members anymore. -- RM 03:40, 26 September 2006 (UTC)
- A whole two days doesn't seem that long to me, I haven't looked at the request, but in general I can't imagine anything being that urgent, people wait longer for page moves at times... --pgk 06:20, 26 September 2006 (UTC)
- My general feeling is that waiting is never a great thing. But that said, being a Monday with Wikipedia under heavy load, the bot wasn't/shouldn't have been operated anyway. -- RM 11:53, 26 September 2006 (UTC)
- While the approval of bots does come from the BAG, we really do care about community input. Bot approvals oftentimes run faster if they include links to other project pages where the bot was requested and/or a few supporters to come in on the proposal. Yes some bot requests are speedy, but that is not the rule. — xaosflux Talk 12:17, 26 September 2006 (UTC)
- This is an excellent point. Obviously in the case of alphachimp, he couldn't very well comment on his own bot, but *anyone* can participate in the approval process. The actual approval itself is a normally simple process of determining consensus and appropriateness and then making it official. The bulk of discussion can be performed by *anyone*. We have more need for community input in general than we do for additional approval group members. -- RM 17:51, 26 September 2006 (UTC)
- I don't see a need for more members at this time. Perhaps the procedure could be fine-tuned. I often want to wait at least a bit for comments by other BAG members or outsiders. Also, many bot owners do not respond to the trials after a while, so I am not sure what I can do there, other than wait a bit more. Voice-of-All 22:56, 26 September 2006 (UTC)
- I've taken to pinging the operator's talk page, and moving the request to expired if there's no response in a few days. — xaosflux Talk 01:43, 27 September 2006 (UTC)
- I think the existence of the BAG has discouraged comments from non-BAG users. Basically, we trust the BAG folk, so we don't comment, while the BAG is waiting for comments to provide an indication of consensus... Rich Farmbrough, 13:34 17 October 2006 (GMT).
Interwiki links
Looks like we've had our share of time managing the interwiki links transcluded onto versions of this page; is anyone familiar enough with the interwiki py framework to know if there is an 'ignore this page' tag? — xaosflux Talk 02:31, 2 October 2006 (UTC)
- Talk to Cyde; he has basically written the pywiki bot. Betacommand (talk • contribs • Bot) 13:17, 2 October 2006 (UTC)
An uncovered situation
It's a little odd no one has asked this before.
I want to get permission to run a bot to add about 1300 articles I'm writing. The situation is fairly straightforward but there's one small hitch: the bot will be run by another Wikipedian on my behalf. (BTW, I believe he's trustworthy, seeing how he's been a positive contributor to the project for at least as long as I've been around.) I'm doing it this way because I'd rather hand this work to someone who has done it before, rather than create a bot more or less from scratch & operate it myself for a one-time run. Is there anything I need to state in my request other than who will be running the bot? Do I need to create a special account for the bot to operate under? TIA, llywrch 20:34, 4 October 2006 (UTC)
- GFDL is the biggest thing to worry about here: if 1300 new articles are created, they will be licensed under the account that creates them, so a new bot account may be appropriate, with it giving attribution to you, the bot owner. If your helper is simply facilitating the technology of running "your" bot, it should be OK, though you will be responsible for anything it does. — xaosflux Talk 03:24, 5 October 2006 (UTC)
Hi guys: I was curious, does anyone object to me being on the Bot Approvals group? I've been running MetsBot for a while, and have a lot of experience with AWB and pywikipedia. I have been following this page quite a lot and have a pretty good idea about what is and what is not acceptable for a bot. —Mets501 (talk) 02:41, 8 October 2006 (UTC)
- We just went through an election (see Archive 2 above), and have not really come up with a policy/procedure for adding more members (we need one though). Guess it's time to reopen talks on that. — xaosflux Talk 19:03, 8 October 2006 (UTC)
- Yeah, I know about the election: I voted in it. I'll wait if you are in the process of designing a new procedure, otherwise we could just do it informally, or have another election. I'm in no rush, just wanted to help out if possible. —Mets501 (talk) 19:38, 8 October 2006 (UTC)
- Same goes for me. I've shied away from this page a bit since the "election" (I'm not going to lie, it was a bit discouraging), but I'd be willing to help out in any way needed. Alphachimp 03:43, 13 October 2006 (UTC)
- Personally, I'm in favour of: an existing BAG member approves, and you're in.... that makes sense to me -- Tawker 03:44, 13 October 2006 (UTC)
- I also would be willing to help when you are ready, just message me. I have skills with web bots, and an understanding of Wikipedia policy. HighInBC 20:08, 19 October 2006 (UTC)
AMABot awaiting response
Hi - I've been running a "short trial" of AMABot (which seems to have become a bit of a bigger trial, after some starting problems (caused by lightning)), and after a few days of continuous operation I posted some diffs to the request page. I'm not sure what happens next/what I need to do - any help appreciated. Martinp23 20:19, 8 October 2006 (UTC)
Admin Bots
As a recent request and RFA have involved this topic, it's become pretty clear that the Bot Policy lacks a definition of what actions can or should be allowed to be automatically performed by admin accounts. Please note it is WAY too early to have a poll or !vote on this yet, but it's certainly time to have an open discussion with the community. Please voice your opinions! Some topics that should be addressed are:
- The running of an assistance script by existing admin accounts
- The running of an automated process by existing admin accounts
- The potential for approval of single-purpose robotic admin accounts
- Accountability of such accounts
- The running of assistance scripts by such accounts
- The running of automated processes by such accounts
- Anything else
Buy-in from a supermajority of WP:BAG members should be a requirement here, as well as the support and approval of the community to amend the Bot Policy. Thanks! — xaosflux Talk 01:31, 13 October 2006 (UTC)
Discussion
- Well, first off, you need a pressing need for a bot with sysop privileges. The TawkerbotTorA wasn't one, as the vandalism from Tor proxies, as well as the advice to Chinese users, contradicted the stated need. Also, you need to assert whether something can only be done via an admin bot; if coding an extension is feasible, then go for that instead. IMO, sysopbots should be the absolute last resort. Titoxd(?!?) 01:34, 13 October 2006 (UTC)
- Assistance scripts seem absolutely OK for me. For instance, VoA's script includes a rollback all tool, a delete all page creations tool and a block all AOL button. Those are controlled by humans, they just speed up a few very tedious processes. I'd encourage development of this type of admin tool. Alphachimp 03:45, 13 October 2006 (UTC)
- Anything done by an assistance script should be treated as if it had been done the slow way. That means you'd better be sure your script isn't going to screw up and block Jimbo for seventeen years. --Carnildo 05:41, 13 October 2006 (UTC)
- What we need is a framework for approving such tasks that need to run under unique accounts. Currently any sort of proposal to run an automated tool "legally" on a separate account gets a huge share of "admin bots are scary" type !votes. Bots w/ elevated access can help us out if we provide carefully limited situations - the trick is to provide a framework that allows us to get help in the day to day operation w/o raising more fears. We need to help people see that sysop bots are not a doomsday situation; they can help us keep this place running. Anyways, I've been in discussion w/ a few people re this and there's some progress being made -- Tawker 05:10, 13 October 2006 (UTC)
- A good start would be to define what admin tasks a bot can sensibly do. Admin buttons are for:
- Blocking/Unblocking
- Deleting/Undeleting
- Protecting/Unprotecting
- All of those generally require some level of intelligence and discretion; there isn't a simple algorithm to determine when those actions should be taken. There are of course instances where a subset can be defined within the scope of a bot (e.g. blocking page move vandals, blocking dodgy usernames, undoing "expired" page protection etc.), but at the same time, all these things which can be defined as a simple algorithm could be incorporated into MediaWiki and arguably done better than a bot could hope to achieve. (Pagemove rate could be limited, bad usernames could be stopped from being created, proper expiry could be added to page protection.) The one admin bot to date, Curps', suffered many of the issues people object to in admin bots: it had false positives and was subject to scope creep (it started as a bad-username-blocking tool for a short term problem when we had a sustained attack of 100's/minute of dodgy usernames being created), and arguably reduced any urgency in dealing with those things being added to MediaWiki. Of course the demise of the bot also didn't signal the end of the world; dodgy usernames haven't been a big problem, nor have we had a huge influx of pagemove vandals. For me the way Curps' bot came about was the "right" way: a bot implemented to resolve a short term problem against the user's own account (maintaining a strong personal accountability). Where it went wrong was that the functionality wasn't absorbed into MediaWiki and the bot stopped. --pgk 08:12, 13 October 2006 (UTC)
- You missed a few admin actions:
- History review of deleted pages
- Special:Unwatchedpages
- Editing fully-protected pages and interface messages
- The last in particular doesn't seem wildly inappropriate for a bot; I can certainly imagine a bot updating MediaWiki:Recentchangestext (although I don't think we have a need for such a bot at the moment). On some other Wikimedia wikis, Special:Import is an admin action too (I'm not sure about enwiki), and I can imagine mass transwikis taking place quite easily too. --ais523 13:24, 17 October 2006 (UTC)
- Special:Import doesn't work for en wiki, though again if there was a need for so many transwikis that a bot would be required, I guess there would have to be the question of whether MediaWiki should do it better in the first place.
Editing protected pages: very few things are that urgent that they won't wait until protection is removed (it's almost always temporary), or the few problem pages can be done manually. (This is actually a problem for some of our substing bots dealing with user pages which have since been protected due to abuse by the blocked user, and which I guess will never be deprotected; though again there is the question of whether it is really that important - many of those pages will never be viewed again, certainly not often enough to concern ourselves about server load.)
And as you say, I can't think of a situation requiring automation for editing MediaWiki messages. --pgk 17:53, 17 October 2006 (UTC)
An idea was discussed a bit in the -admins channel re dealing with vandalism new pages (aka G1, the super super obvious stuff) - essentially, if a page fit a modified set of the Tawkerbot2 criteria, it would be added to a speedy delete list for a while and eventually (after that trial, and approval) speedied automatically. The rough thought had a pretty warm reception on -admins - it might be something worth exploring (I talk msg'd xaosflux, and now I post here :) (I can't see how we could integrate it into MW atm) -- Tawker 00:42, 17 October 2006 (UTC)
- Sounds reasonable. Of course this could be worked into MW; if you can code a set of rules in a bot, that same set of rules could be coded into MW to stop the page being created in the first place. The question would be could it be done better there; on first thoughts I'm not sure it could, for a couple of reasons: (i) by nature this is quite instance specific so probably best outside of the mainline code, (ii) people would just mould their nonsense around those rules until it got accepted. The latter has a risk of happening with a bot of this nature too, particularly if it deleted them quickly (though TB2's success perhaps demonstrates the intelligence (or lack thereof) of the average vandal). But early days yet; a speedy tagging bot (or a parallel bot that tagged speedies) might be interesting to see how well it performed. Alternatively it might be interesting to do something like this just listing up potentials on a page and working out later how many actually got tagged by someone, how many got missed by humans within a given timeframe, how many got tagged by humans but not the bot (perhaps a harder problem to solve, having to work through the delete log and try to work out which were speedies...) and how many shouldn't have been tagged; refine it, repeat, etc., and then we'd have some hard numbers to demonstrate. --pgk 20:07, 17 October 2006 (UTC)
- And a downside. New page patrol is more than just tagging speedies; it's also tagging stubs, cleanup etc. etc. So even if the bot tagged them, chances are someone would/should still look at them anyway, though longer term that might be an argument to let it go straight for the delete... --pgk 20:13, 17 October 2006 (UTC)
Bot Request
I would like to know if I can get somebody else's bot, or request my own bot, to give welcome messages to newcomers. I don't know any programming languages either. Is this possible?--Ageo020 (talk • contribs • count) 22:00, 16 October 2006 (UTC)
- This wouldn't be that hard to code, but the real question is whether it is really useful. Giving every new user the same generic message may take away from the welcoming aspect. Perhaps if only done for new users after x edits who have not already been welcomed...? — xaosflux Talk 23:44, 16 October 2006 (UTC)
- The idea has been rejected many times over at WP:BOTREQ. Having automatic welcomes eliminates any personality from welcoming users. It defeats any real purpose of the welcome, and would probably create a lot of needless talk pages. That's not to say that welcoming isn't good, but I would much rather have you welcome users on an individual basis (rather than just pulling the new users feed). Alphachimp 23:48, 16 October 2006 (UTC)
- I'll second this. It is an interesting idea, but not really a practical one. -- RM 00:42, 17 October 2006 (UTC)
- As already said really, it'd take away the personal touch. (MediaWiki could easily set up an initial talk page with an initial message on account creation.) It also suffers from what we get with some users who scatter-gun welcomes by going through the new user log: things such as users who have never edited, users who were rapidly blocked as vandalism-only accounts, and users with wildly inappropriate usernames all having welcome messages... --pgk 06:21, 17 October 2006 (UTC)
- I have nearly all of the user-talk messages stored in my user scripts for rapid use, except {{welcome}} and similar; IMO the personal aspect of welcomes is important. --ais523 13:14, 17 October 2006 (UTC)
- Perhaps, following on from Xaosflux's idea, we could do a bot to find users who have x edits and who haven't been welcomed, and either give them an information message, or add them to a list of unwelcomed users for humans to welcome. The problem I see with this is finding out whether a user has been welcomed if they have deleted the message from their page - perhaps we'd need to look for empty talk pages. Just a few ideas - Martinp23 18:22, 17 October 2006 (UTC)
- Or non-existent talk pages. Wikipedia:Lonely newbies? :) — xaosflux Talk 04:04, 18 October 2006 (UTC)
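For illustration, a hedged sketch of the report idea above - listing users with a few edits whose user talk page does not yet exist - using the modern MediaWiki action API (which postdates this discussion). The minimum edit count and the way candidate usernames are gathered are assumptions, and restricting the check to recently registered accounts is left out for brevity.
<pre>
# Rough sketch: given candidate usernames, report those with some edits but no
# User talk page.  Assumes the "requests" library and the current action API;
# how the candidate list is produced (e.g. from the user creation log) is up
# to the operator and not shown here.
import requests

API = "https://en.wiki.x.io/w/api.php"
HEADERS = {"User-Agent": "UnwelcomedUsersSketch/0.1 (read-only)"}

def unwelcomed(usernames, min_edits=5):
    """Return users from `usernames` with >= min_edits and no User talk page."""
    result = []
    for name in usernames:
        # Look up the user's edit count.
        user = requests.get(API, headers=HEADERS, params={
            "action": "query", "list": "users", "ususers": name,
            "usprop": "editcount", "format": "json",
        }).json()["query"]["users"][0]
        if user.get("editcount", 0) < min_edits:
            continue
        # Check whether the user talk page exists yet.
        pages = requests.get(API, headers=HEADERS, params={
            "action": "query", "titles": "User talk:" + name, "format": "json",
        }).json()["query"]["pages"]
        if any("missing" in page for page in pages.values()):
            result.append(name)
    return result
</pre>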
Read only bot
Hello, I was wondering, do I need approval for a bot that reads Wikipedia (not excessively, around 3 requests per minute) but does not write/change? I am thinking of a script that finds potentially bad edits, but in such a way that automatic changes would be bad as it is not accurate, and simply gives me (a human with an account in good standing) a list of urls to diffs.
I only ask this as it seems Wikipedia refuses clients with unknown or Perl-based user agents. I could easily spoof a common set of headers, but I want to know if passive, gentle (slow) automatic reading of Wikipedia needs special permission.
Sorry if a document already explains the answer, I have read so many and I know I have missed a few. Thanks! HighInBC 00:45, 18 October 2006 (UTC)
- How often would this be run, e.g. only when you are browsing, all day every day, or somewhere in between? For the most part, this is pretty much acceptable. — xaosflux Talk 02:23, 18 October 2006 (UTC)
- Personally I'd run this on the m:Toolserver - it's much better suited for read-only access. If it's vandal-detecting patterns, feel free to give me a shout over them; if they make sense we can throw them in an auto-revert bot :) -- Tawker 02:25, 18 October 2006 (UTC)
- Thanks for the info. In answer to Xaosflux's question, it would be run once, produce a few urls to questionable diffs, then stop. Once I have manually dealt with the diffs I may run it again if needed. So I guess what I am saying is that it will not be on Wikipedia more than I am. HighInBC 16:11, 18 October 2006 (UTC)
- This is fine, and does not need to go through bot requests; it's no more of a load than clicking on Random Article 3x a min. If this will start doing high speed pulls/recursive pulls/or continuous operations then send it in as a bot request, and as Tawker mentioned above, running from the toolserver may be an option then. — xaosflux Talk 17:19, 18 October 2006 (UTC) (Bot Approvals Group).
I've got a similar read-only question. I have a couple of things in mind, including an anti-linkspam monitor, that could use a real-time feed of recent changes. I'm aware of the IRC channel, and built a parser for that, but I'd also like to get the content of the change. Does somebody already have a richer feed? If not, I'd like to build and publish one. For that, should I get approval before I start development? In load it would be no worse than some of the vandal-fighting tools, but at one fetch per edit, it's still not small. Thanks, William Pietri 19:38, 19 October 2006 (UTC)
- Have you tried http://en.wiki.x.io/w/index.php?title=Special:Recentchanges&feed=rss or http://en.wiki.x.io/w/index.php?title=Special:Recentchanges&feed=atom already? Tizio, Caio, Sempronio 09:32, 20 October 2006 (UTC)
- Ah, thanks. That may be close enough. I had always assumed the Recentchanges RSS had the same info as the page, so I hadn't looked at it. Do I need any bot-ish blessing to hit that page very frequently? Thanks, William Pietri 15:20, 20 October 2006 (UTC)
- I don't know, but I seem to recall Brion somewhere said that a sequence of very close load requests can cause further requests not to be served. However, robots.txt contains a delay of 1 sec for crawlers, so probably he was referring to a delay of this order. Probably Tawker is one of the best people to ask, since he's running an anti-vandalism bot (which I presume uses these feeds). Tizio, Caio, Sempronio 16:03, 20 October 2006 (UTC)
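For illustration, a rough sketch of a read-only monitor that polls the Recentchanges Atom feed linked above, assuming the third-party Python "feedparser" library; the polling interval and de-duplication are illustrative choices rather than a recommendation, and fetching full diff content would still need separate, throttled requests.
<pre>
# Rough sketch of a read-only recent-changes monitor polling the Atom feed.
# Assumptions: the "feedparser" library is installed; 30 seconds is just an
# example interval, chosen to stay well within the robots.txt guidance.
import time
import feedparser

FEED_URL = "https://en.wiki.x.io/w/index.php?title=Special:Recentchanges&feed=atom"
POLL_SECONDS = 30   # poll gently rather than hitting the feed continuously

def watch_recent_changes(handle_entry):
    """Call handle_entry(entry) once for each feed entry not seen before."""
    seen = set()
    while True:
        feed = feedparser.parse(FEED_URL, agent="LinkspamMonitorSketch/0.1")
        for entry in feed.entries:
            if entry.id not in seen:
                seen.add(entry.id)
                handle_entry(entry)   # e.g. scan entry.summary for suspect links
        time.sleep(POLL_SECONDS)

# Example usage: print the title of each new change.
# watch_recent_changes(lambda e: print(e.title))
</pre>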
Bot change of function
Per consensus on Wikipedia talk:Copyright problems and an explicit request on my talk page, I have changed the behavior of DumbBOT (talk · contribs), so that unlisted copyright violations are listed together with the others rather than on a separate page. The people dealing with this backlog appear to prefer the bot doing this. Tizio, Caio, Sempronio 09:37, 20 October 2006 (UTC)
- Sounds fine. When the bot request was approved initially we wanted to be useful while not getting in the way, but if those users who are most involved in the process want otherwise, that's fine. I'll update your request to ensure that this is remembered historically. -- RM 16:34, 20 October 2006 (UTC)
Bot owners noticeboard?
Don't we need a bot owners noticeboard?--Andeh 17:25, 21 October 2006 (UTC)
- Would this be to centralize editor comments, etc? (WP:BOWN) Many bot operators are off-wiki, and rely on interwiki talk page links to their operators on other projects. — xaosflux Talk 17:35, 21 October 2006 (UTC)
- Just be a general place for bot owners to discuss current bots/jobs/ideas etc.--Andeh 17:45, 21 October 2006 (UTC)
- I don't think we really need that. We have Wikipedia:Bot requests to discuss ideas for bots, and this page to talk about other stuff. —Mets501 (talk) 18:02, 21 October 2006 (UTC)
- Yes, of course, we can have links to there from the top of the noticeboard. But there may be discussions that don't really fit on any talk page.--Andeh 15:24, 22 October 2006 (UTC)
- If you want it, be bold and create it. You don't need anyone's permission :) --kingboyk 15:43, 22 October 2006 (UTC)
- I like the idea. Bot requests is very formal, and discussions there are very specific to new proposals. A noticeboard could be an informal place to discuss all kinds of things. -- Ganeshk (talk) 18:23, 21 October 2006 (UTC)
- You're right, I changed my mind. A noticeboard would be a good thing. —Mets501 (talk) 17:47, 22 October 2006 (UTC)
- I've created one at Wikipedia:Bot owners' noticeboard. Let's get it up and running! —Mets501 (talk) 17:54, 22 October 2006 (UTC)
- Added first discussion, someone beat me to creating {{botnav}}.--Andeh 18:42, 22 October 2006 (UTC)
Bot restriction
Please note that per Wikipedia:Requests for arbitration/Marudubshinki, Marudubshinki may no longer operate bot accounts, nor use their user account as a bot. User:Bot-maru is already indef blocked and does not hold a bot flag; I've removed its entry on Wikipedia:Registered bots. — xaosflux Talk 02:15, 22 October 2006 (UTC)
Inactive Bots
Moved to Wikipedia:Bot owners' noticeboard#Inactive Bots
PlangeBot
It was brought to my notice that the said bot was approved a while ago (Wikipedia:Bots/Requests for approval/PlangeBot) but apparently it has not yet been flagged, nor is it in any of the archives. What's going on? --WinHunter (talk) 01:15, 29 October 2006 (UTC)
- Probably a f*ckup. The bot was approved. Would a friendly bureaucrat please flag it? Muchos gracias (sic). --kingboyk 12:04, 30 October 2006 (UTC)
Additional BAG member input requested
Please see Wikipedia:Bots/Requests for approval/MartinBotII 3 to help determine approvals for this request. Thanks, — xaosflux Talk 17:43, 2 November 2006 (UTC)
- And again. --kingboyk 13:26, 4 November 2006 (UTC)
Multiple Requests
I have four requests for extra tasks to make on MartinBotII, all relating to WikiProjects. To avoid spamming the BRFA page, should I put the three really simple tasks (which don't edit the mainspace) in one request, and the other (which, at most, involves putting article talk pages in categories, and which runs similarly to MathBot) in another? Or all four together? I'm not sure what the norm is for this - any input appreciated. Martinp23 14:29, 4 November 2006 (UTC)
- What are the tasks? --kingboyk 14:34, 4 November 2006 (UTC)
- I'll outline them below:
- Do the same as task 2 on MartinBotII (approved), but for other WikiProjects (per requests), using exactly the same system with just a different message, recipient and opt-out pages
- Update the Wikipedia:WikiProject Trains/Recent changes list of articles by editing its sub-pages, based on which article talk pages contain the project banner (diff)
- Produce a list of all WikiProject pages for the WikiProject Directory, displaying them in a table (diff) and making a separate list of new projects (diff).
- These are the non-mainspace (talk) editing tasks, which are fairly simple. All have been fully tested, as shown by the diffs. The other task is for WP V1.0 and does the same sort of thing as MathBot, except that not all features are implemented yet. The main difference is that, in its tables, it only lists articles which have been rated by a WikiProject and achieve a score the bot calculates from the article's rating and importance. For the tests thus far, all articles which achieved the minimum score on the WikiProjects that were tested were put into sub-pages of this page. This was only a very limited trial - when approved, the bot will add the talk pages of all those articles (and images) which meet the requirements to a category. This is the reason I've isolated this task, along with the potential for a huge number of edits (depending on overall quality, of course, but it could go into tens of thousands, spread out over weeks). Martinp23 14:49, 4 November 2006 (UTC)
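For illustration only, a minimal Python sketch of the kind of rating/importance scoring described above; the weight tables and the threshold are invented placeholders, not the values MartinBotII or MathBot actually use.

# Illustrative sketch of the rating/importance scoring described above.
# The weight tables and the minimum score are made-up placeholders; the
# real bot's values are not given in this discussion.
QUALITY_WEIGHT = {"FA": 500, "A": 400, "GA": 300, "B": 200, "Start": 100, "Stub": 50}
IMPORTANCE_WEIGHT = {"Top": 400, "High": 300, "Mid": 200, "Low": 100}
MIN_SCORE = 300  # hypothetical threshold

def score(quality, importance):
    """Combine a WikiProject quality rating and an importance rating."""
    return QUALITY_WEIGHT.get(quality, 0) + IMPORTANCE_WEIGHT.get(importance, 0)

def qualifying(assessments):
    """Yield article titles whose combined score meets the threshold."""
    for title, quality, importance in assessments:
        if score(quality, importance) >= MIN_SCORE:
            yield title

# Example: list(qualifying([("Algebra", "B", "Top"), ("Some stub", "Stub", "Low")]))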
Request for proposed bot review
I have not used or written a WP bot before (except simple read-only ones for analysis) and I have a couple of questions:
- Is there a centralized place for someone to propose (not to request) a bot that doesn't yet exist, to get the equivalent of a pre-implementation design review?
- On the assumption that this is the place (for now), my idea is to write a bot (in Perl) that can revert in two modes:
- Vandal mode: by looking for specific strings and regular expressions; and
- Linkspam mode: by looking for links to web sites that are on a blacklist
The sites that are 'guarded' would normally be a very small number (like someone's small watch list), and each could have its own custom set of 'typical' vandal strings/regexps, and/or its own set of blacklisted external link sites. All reversions would include a talk page message to the user, would follow 3RR (or less, for safety), would leave clear edit summaries, etc. Of course, once a prototype version is available, I plan to bring it here for approval (it would be run in manual mode until bulletproof), but I would appreciate getting preliminary comments prior to investing any serious effort. Thanks, Crum375 17:11, 9 November 2006 (UTC)
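As a rough illustration of the per-article customization being proposed (the actual bot would be written in Perl), here is a minimal Python sketch of the rule structure; the page name, pattern and domain are made up, and reverting, 3RR accounting and talk page messages are deliberately left out.

# Minimal sketch (in Python rather than the proposed Perl) of the
# per-article rule structure described above. Page names, patterns and
# domains are illustrative assumptions; reverting, 3RR accounting and
# user talk messages are deliberately omitted.
import re

GUARDED_PAGES = {
    "Example article": {
        "vandal_patterns": [re.compile(r"(?i)buy cheap \w+ now")],
        "blacklisted_sites": ["spam-example.com"],
    },
}

def classify_edit(page, added_text):
    """Return 'vandal', 'linkspam' or None for text added to a guarded page."""
    rules = GUARDED_PAGES.get(page)
    if rules is None:
        return None
    if any(p.search(added_text) for p in rules["vandal_patterns"]):
        return "vandal"
    if any(site in added_text for site in rules["blacklisted_sites"]):
        return "linkspam"
    return None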
- This seems similar to my second bot, which goes after edits by shared IPs and suspicious new users (likely socks) to pages on a watchlist, along with link spam revert ability. Voice-of-All 17:14, 9 November 2006 (UTC)
- Do you have a link to its specs that I can read? Is it Open Source? Crum375 17:49, 9 November 2006 (UTC)
- I read the external 'specs' available at your bot page. If it supports specific tools for EL enforcement (by specifying blacklisted linked sites per article), then I can't find it. If it supports custom regexps for vandal edit detection per article, then I can't find that either. Those are the two main missions I see for my proposed bot - customization per article for vandal and linkspam protection. My plan is to implement it in Perl. Do you, or does anyone else, see a problem with that concept? I know I myself need it as a user, but I don't want to invest the effort in it if it's already done somewhere, or if it violates some fundamental bot rule and has no chance for approval here. Any comments would be appreciated. Crum375 21:37, 9 November 2006 (UTC)
- My bot only gets 20-80 edits/day usually (right now it's at a low point). If it only watched single pages it would probably be of very little use. Voice-of-All 23:21, 9 November 2006 (UTC)
- As I noted above, the issue for me is not need, as to me personally it would be very useful to have that functionality - it would save me personally lots of menial work, which is what bots are for. I suspect that even if no one else except for me used that bot, it would still 'pay for itself' in time saved, after several months. If anyone else used it, it would be a bonus. But the real reason for my questions here is whether there is some procedural flaw that I am missing, that would preclude this proposed bot from ever being approved here, or whether someone has already built it and I missed it while going over the existing bot list. I am still waiting for an answer to either question. Crum375 23:49, 9 November 2006 (UTC)
- Per-article customisation is an interesting idea, and could in theory capture persistent vandalism on a particular article that it's not practical or "safe" to revert in general. I think VoA may well be correct, though, that doing it this way could involve a lot of work for relatively little benefit, but I don't think that's a basis on which the bot would fail to be approved (the principle being, basically, prevention of bots doing harm). Also bear in mind that for linkspam that should be blacklisted in the general case, there's m:spam blacklist, which solves the problem "at source". If you want a more formal answer from the BAG (as against from here in the peanut gallery), you might consider filing an approval request, but specifying that the trial wouldn't begin at once, assuming it wouldn't be an excessively long period to hold the request 'open' for. Alai 01:53, 11 November 2006 (UTC)
- Thanks for the detailed response. Regarding the effort/benefit ratio, I think I can live with that, and the effort is (mostly) one-time, vs. hopefully a long period of benefits. Regarding the linkspam list, I think the issue there is that some linkspam is only posted (persistently) into one article, and I am not sure if it would instantly qualify for the global blacklist, although it may eventually. Regarding the application in a 'hold mode', I think I'll take a chance on the 'peanut gallery', enough to whip up a basic prototype, which I can then submit for approval. Thanks again, Crum375 02:21, 11 November 2006 (UTC)
Scepbot
Back in June, I requested a flag for my bot (here). A day later, I added a section for possible tasks to do (e.g. template substitution) if there were no redirects to fix, but never got a response either way. I'm a bit confused, however, about whether it should run these tasks or not. Will (message ♪) 00:30, 13 November 2006 (UTC)
- Go ahead, but make sure you update pywikipedia. Betacommand (talk • contribs • Bot) 16:37, 13 November 2006 (UTC)
- Had done anyway, thanks :) Will (message ♪) 17:13, 13 November 2006 (UTC)
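For anyone curious what the redirect-fixing side of such a task involves, here is a minimal, framework-free Python sketch of spotting a double redirect from raw wikitext; the regular expression and the get_wikitext() helper are illustrative assumptions, not pywikipedia code.

# Framework-free sketch of spotting a double redirect from raw wikitext.
# The regexp and the hypothetical get_wikitext() helper are assumptions
# made for illustration; a real run would use pywikipedia, as advised above.
import re

REDIRECT_RE = re.compile(r"#REDIRECT\s*\[\[([^\]|#]+)", re.IGNORECASE)

def redirect_target(wikitext):
    """Return the redirect target named in a page's wikitext, or None."""
    m = REDIRECT_RE.match(wikitext.strip())
    return m.group(1).strip() if m else None

def is_double_redirect(wikitext, get_wikitext):
    """True if this redirect points at a page that is itself a redirect.

    get_wikitext(title) is a hypothetical callable supplied by the caller.
    """
    first = redirect_target(wikitext)
    if first is None:
        return False
    return redirect_target(get_wikitext(first)) is not None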
Interwiki link for Arabic page
Could someone who knows how please add [[ar:ويكيبيديا:الميدان/تقنية]] to the list? Thanks --Mandavi 16:11, 13 November 2006 (UTC)
- Done; the correct page to add the link to is Wikipedia:Bots/Requests for approval/Header. --ais523 16:24, 13 November 2006 (UTC)
Expiring requests?
TaeBot, TheJoshBot and Huzzlet the bot have all had no discussion for weeks; are they going to expire or stay on the list longer? --WinHunter (talk) 16:45, 13 November 2006 (UTC)
- We will move them soon. Betacommand (talk • contribs • Bot) 17:10, 13 November 2006 (UTC)
- What is the approval for TheJoshBot stalled on? At Template:Infobox Australian Place (under WP:AUSTPLACES), we have been waiting patiently for approval so the conversion can be made. SauliH 15:39, 16 November 2006 (UTC)
Trial results
I have posted the trial results of my bot here. May I request somebody in BAG to have a look. Thanks -- Lost(talk) 14:53, 17 November 2006 (UTC)
Betacommand Deletion Bot
I have blocked Betacommand for 1 week for operating an unapproved deletion bot. If I am somehow wildly mistaken and this behavior was properly authorized, please correct me.
See: WP:AN#Massive Image Deletion.
Dragons flight 08:49, 28 November 2006 (UTC)
- Just an update for everyone's information: there was no bot involved in this incident. -- RM 14:58, 28 November 2006 (UTC)