Wikipedia:Bots/Requests for approval/Ganeshbot 5
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Denied.
Operator: Ganeshk (talk · contribs)
Automatic or Manually assisted: Automatic
Programming language(s): AutoWikiBrowser and CSVLoader
Source code available: Yes, available at WP:CSV and WP:WRMS
Function overview: To create gastropod species and genera articles based on data downloaded from the WoRMS database. The bot will run under the supervision of the Gastropods project.
Links to relevant discussions (where appropriate):
Edit period(s): Weekly
Estimated number of pages affected: 500 per week
Exclusion compliant (Y/N): N/A
Already has a bot flag (Y/N): Y
Function details: The bot will create species and genera articles under the supervision of WikiProject Gastropods. Here are the steps:
- Bot operator will propose a new family that needs creating on the project talk page.
- Gastropod project members will approve the family and provide an introduction sentence. Here is an example.
- Bot operator will download the species data from WoRMS using AutoWikiBrowser and WoRMS plugin. Only accepted species will be downloaded.
- Bot operator will run AutoWikiBrowser with CSV plugin to create the articles using a generic stub template, project provided introduction sentence and the data downloaded from WoRMS.
- Bot operator will maintain a log on the User:Ganeshbot/Animalia/History page.
There are approximately 10,000 species articles yet to be created.
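The per-family workflow above (CSV in, stub out) can be sketched roughly as follows. This is only an illustration: the real work is done by AutoWikiBrowser with the CSVLoader plugin, and the column names, stub wording, and template text here are hypothetical stand-ins, not the actual WoRMS export format or the project-approved introduction sentence.

```python
import csv
import io

# Hypothetical stub template standing in for the project-provided
# introduction sentence plus a WoRMS reference.
STUB_TEMPLATE = (
    "'''{name}''' is a species of sea snail, a marine gastropod mollusk "
    "in the family [[{family}]].<ref>WoRMS record for {name}</ref>"
)

def make_stubs(csv_text):
    """Return {title: wikitext} for rows whose status is 'accepted'."""
    stubs = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["status"] != "accepted":
            continue  # unaccepted names (synonyms etc.) are skipped
        stubs[row["scientific_name"]] = STUB_TEMPLATE.format(
            name=row["scientific_name"], family=row["family"]
        )
    return stubs

# Illustrative two-row export: only the accepted name yields a stub.
data = """scientific_name,family,status
Conus ebraeus,Conidae,accepted
Conus fictus,Conidae,synonym
"""
stubs = make_stubs(data)
```

The key point of the sketch is the filter: only records WoRMS marks as accepted become articles, which is the safeguard discussed throughout this request.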
Note: This bot was approved to create a smaller set of similar stubs in March 2010. This request seeks approval to create articles for all new families approved by the Gastropods project.
Discussion
Note: For anyone new to the discussion, in Ganeshbot 4 (later amended at Wikipedia talk:Bot Approvals Group/Archive 7#Wrong way of the close a BRFA) this bot was approved to create about 580 stubs for the genus Conus. Despite stating that "about 580 stubs will be created (nothing more)",[1] Ganeshk was somehow under the impression that further approval was not required to create additional articles. When this was brought to the community's attention at various places, including WT:Bots/Requests for approval#Wikipedia:Bots/Requests for approval/Ganeshbot 4, Ganeshk stopped creating the articles without approval. Anomie⚔ 02:09, 19 August 2010 (UTC)[reply]
- I'm almost positive something like this involving plant articles or insects cratered badly when it was found the source database had errors. Are we sure this database is reliable to use? MBisanz talk 01:56, 19 August 2010 (UTC)[reply]
- From what I hear from members of the Gastropods project, WoRMS has the best experts in the world. Their database is not infallible, but overall beneficial. — Ganeshk (talk) 02:05, 19 August 2010 (UTC)[reply]
- You're probably thinking of anybot, although there were other issues that contributed to that situation becoming a major clusterfuck. Anomie⚔ 02:16, 19 August 2010 (UTC)[reply]
Needs wider discussion. I see you already informed WikiProject Gastropods. Please advertise this request at WP:VPR to solicit input from the wider community. Anomie⚔ 02:09, 19 August 2010 (UTC)[reply]
- I have posted a note at the VPR, Wikipedia:Village pump (proposals)#Species_bot. — Ganeshk (talk) 02:39, 19 August 2010 (UTC)[reply]
- I understand your concern, MBisanz, but in my understanding no database is completely immune to mistakes. Though this may be contested, WoRMS is a reasonably reliable source, considering it is gradually revised by several specialists. Information provided by WoRMS may change with time. It evolves, as does Wikipedia. We at the Gastropods project aim to observe those changes closely, so that the information contained in the gastropod articles is at least true to its source. I recognize that a large number of stub articles created in an instant can make things difficult, mainly because we are just a few active members, but then again I think this bot is very beneficial to the project, if used with caution. --Daniel Cavallari (talk) 02:30, 19 August 2010 (UTC)[reply]
How many articles can the gastropod project check with just a few active members? The bot created about 100 stubs a day for the past few months, for a total of 15,000 stubs. Have these 15,000 stubs been checked? I checked a few and found concerns. I volunteered to point out problems if I could speak directly to the gastropod family experts, but I was insulted by a gastropod member for my poor spelling, repeatedly insulted. I think the inability to work with other members of the community, the unwillingness to accept criticism, and the tendency to focus on personal insults over taxonomic issues spell disaster for this bot. The bot is either continuing to run or is being operated as an assisted account by its operator; this also makes it hard to know what the bot is doing. The operator will have to have all rules of bot operation explicitly outlined, as he took his own statement of "580 articles (nothing more)" to mean 15,000 articles. What other bot rules will be misinterpreted? —Preceding unsigned comment added by JaRoad (talk • contribs) 03:05, 19 August 2010 (UTC)[reply]
I am also concerned that gastropod members are using what they consider a "somewhat reliable" resource that is evolving through time like Wikipedia. Wikipedia is not considered a reliable source for Wikipedia articles. Writers are expected to use reliable, stable, and non-primary sources, not "somewhat reliable" sources. —Preceding unsigned comment added by JaRoad (talk • contribs) 04:20, 19 August 2010 (UTC)[reply]
- JaRoad, This link will show you that the bot has actually stopped creating articles as of 8/15/10. — Ganeshk (talk) 04:29, 20 August 2010 (UTC)[reply]
- If quality control is being questioned, I suggest that members of the gastropod project agree on an acceptable percentage of defective articles generated. Then, select and examine random articles that were produced by Ganeshbot. Determine the percentage of defectives and take it from there. Anna Frodesiak (talk) 05:29, 19 August 2010 (UTC)[reply]
- Comment Although my general reaction is very much against bot-creation of articles (I think it is crazy), I was impressed with the couple of species articles I looked at. However, I know little to nothing about gastropods (or bots). It is dismaying that the original BAG approval was so badly misunderstood: it seemed quite clear to me. I wonder, from a broader point of view, whether this is a wise thing to be doing at all. What happens when the WoRMS[2] database is enhanced or corrected? How do such changes get here? A content fork on WP is not helpful. What about changes in the WP articles: can relevant changes be fed back to WoRMS? What do the denizens of WoRMS think about all this? Similar thoughts for WikiSpecies[3] (FAQ[4]). I have seen some discussion about why this data should be going into WP rather than WikiSpecies, but since the latter is supposed to drive the former I don't understand the rationale for the data coming to WP first. What do the WikiSpecies folks think? Anyway, just my thoughts. Thincat (talk) 10:41, 19 August 2010 (UTC)[reply]
- With regard to your question about how changes on WoRMS can get here, I have plans to write a bot that will compare Wikipedia with WoRMS and list articles that will need updating. I intend to file a BRFA for that in the future. WoRMS was happy to hear that Wikipedia was using their database as a 'trustworthy' taxonomic data source. We are listed under the user list for their web services functionality. — Ganeshk (talk) 00:00, 20 August 2010 (UTC)[reply]
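The proposed comparison bot could work along these lines. This is only a sketch under assumed field names; the real WoRMS record layout and the fields extracted from article taxoboxes would differ:

```python
# Diff locally cached article data against WoRMS records and list pages
# needing human review. The dict shapes and field names are assumptions.
def find_discrepancies(wiki_records, worms_records, fields=("family", "status")):
    """Return (title, field, wiki_value, worms_value) tuples for mismatches."""
    out = []
    for title, wiki in wiki_records.items():
        worms = worms_records.get(title)
        if worms is None:
            out.append((title, "missing", None, None))  # no longer in WoRMS
            continue
        for f in fields:
            if wiki.get(f) != worms.get(f):
                out.append((title, f, wiki.get(f), worms.get(f)))
    return out

# Illustrative example: WoRMS has since changed a species' status.
wiki = {"Velutina velutina": {"family": "Velutinidae", "status": "accepted"}}
worms = {"Velutina velutina": {"family": "Velutinidae", "status": "unaccepted"}}
diffs = find_discrepancies(wiki, worms)
```

A real run would feed the discrepancy list onto a report subpage for project members rather than editing articles directly, in line with the plan described above.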
Support As I have written previously, there is unified support for this among WikiProject Gastropods members. The bot has been running since March 2010 without any problems. I would like to thank User:JaRoad, who found a "mistake" affecting 6 articles (or at most up to 10 additional articles; some think it was not even a mistake) in a highly specialized theme in the family Category:Velutinidae. The "mistake" was made by one of the WikiProject Gastropods members. It was made neither by the bot nor by the bot operator. We have remedied it and we have taken precautions. The bot specializes in creating articles on extant (living) marine gastropod species, which is only a small part of the project. The bot works systematically according to its operator's instructions. Additionally, the bot works in cooperation with WoRMS http://www.marinespecies.org/users.php (see Wikipedia listed there). That also guarantees automatic or semi-automatic updates in the future, if necessary. It may seem to other Wikipedians that nobody takes care of those generated articles. That would be an incorrect prejudice. See for example the history of "List of Conus species", where it is explicitly noted "all species checked". For example, last month one user uploaded ~1,000 encyclopedic images and added them mostly to articles started by this bot. This bot is doing exactly the same thing that human members of the WikiProject would do. There are no known real issues with this bot. Feel free to formally approve it. Thank you. --Snek01 (talk) 13:21, 19 August 2010 (UTC)[reply]
Support The core of the gastropod team stands by the accuracy of the articles, and so do I. I watched as the first batch was prepared. It was meticulously fact-checked by JoJan and others before the bot generated the stubs. The bot is an asset to the project, and ought to continue. Furthermore, the introductory statement to this page has an objectionable tone of indictment. Anna Frodesiak (talk) 13:46, 19 August 2010 (UTC)[reply]
Support I find the bot stubs to be very good, certainly as good as (or better than) stubs that are created manually by project members or other contributors. We are using the most up-to-date system of taxonomy. And yes, as Anna says, we reviewed the process very carefully over many weeks before it was put into effect, because we understand the possible dangers of mass bot generation of stubs. This is not our first experience with bot-generated stubs; a good number were created back in 2007. Thanks, Invertzoo (talk) 17:10, 19 August 2010 (UTC)[reply]
Oppose Due to the misunderstanding, there are now fifteen thousand stub articles about slugs and snails, largely unchecked, and for which there is frequently no information to be added. The aim of the WikiProject is to have a similar article for all 100,000 entries in the database. I cannot personally see any reason for this. We should have articles about gastropods that have more information about them, where the article can be fleshed out and more content added. I share the concern about the WoRMS database, and do not think that there is any need to reproduce it in Wikipedia. Elen of the Roads (talk) 18:09, 19 August 2010 (UTC)[reply]
- All of them are checked (by me or another project member) prior to creation. By the way, the task of the bot is to create fewer than 19,000 articles (according to the information at Wikipedia_talk:WikiProject_Gastropods/Archive_3#More effective importing), the majority of which are already done. There is only a need to finish the task in progress. --Snek01 (talk) 19:17, 19 August 2010 (UTC)[reply]
- Yes, but the task of the bot was only to create 600 articles, not nineteen thousand of the things. The bot was allowed to operate on the basis that the WikiProject would expand the entries - and it was only supposed to create entries on Conus sp., which are rather better documented than most. "I have checked prior to creation" does not really address the requirement to check after creation, and add further information. There is no reason to duplicate the WoRMS database in Wikipedia. Elen of the Roads (talk) 21:44, 19 August 2010 (UTC)[reply]
- Elen, I think Wikipedia has the potential to realize E. O. Wilson's vision of creating an Encyclopedia of Life, "an electronic page for each species of organism on Earth", each page containing "the scientific name of the species, a pictorial or genomic presentation of the primary type specimen on which its name is based, and a summary of its diagnostic traits.".[5][6] If the bot is creating accurate articles on species that have been reviewed at WoRMS (please note that the bot only downloads records that are marked as accepted), what is the harm in having a page for that species on Wikipedia? The page will develop over time as people come in and add additional information. The bot gives the page a good starting point. — Ganeshk (talk) 00:20, 20 August 2010 (UTC)[reply]
- Elen, but we are expanding those stubs and checking them when needed (usually the only thing that needs checking is wikilinks, and only when they point to homonyms). For example, Conus ebraeus and Conus miliaris were expanded nicely, as were others. Even your presumption "There is no reason to duplicate the WoRMS database in Wikipedia." is wrong. If there is encyclopedic content somewhere that is useful for Wikipedia, then we will normally duplicate it; for example, we duplicate some images from Flickr on Wikimedia Commons, just as we duplicate encyclopedic text content from any other free source. Look for example at the article Velutina velutina, see how it is "duplicated" from WoRMS, and tell me what you are unsatisfied with. You have written "I cannot personally see any reason for this." Is the reason that I would not be able to write such Start-class articles without Ganeshbot enough for you? Even if you still do not see any reason for this, you do not need to disagree with it, because other people consider it not only reasonable but also necessary. I have started about ~2,000 articles by myself, and I am not a bot. Of course, I have also expanded many more. I must say that starting them was quite tiresome sometimes. I would like to enjoy expanding articles as you do. Would you be so generous as to allow me to focus on expanding articles on any gastropod instead of starting them, please? --Snek01 (talk) 00:50, 20 August 2010 (UTC)[reply]
- Elen, I am sorry, but I don't understand what you mean when you say about these new stubs that "there is frequently no information to be added". On the contrary, I think every single one of them "can be fleshed out and more content added". That is the whole purpose of creating them, so that we can easily add images or more info with more references. Invertzoo (talk) 02:58, 21 August 2010 (UTC)[reply]
- If that's the case, you won't object to the bot creating articles only at the pace that you can flesh them out, and you'll be OK with finishing fleshing out the 15,000 it has already created before it is allowed to create any more. Elen of the Roads (talk) 11:07, 22 August 2010 (UTC)[reply]
- You yourself are very much against the idea of a large number of stubs, I can see that, but as far as I know, there does not appear to be a WP guideline against stubs. And, unlike many other kinds of stubby articles, these species stubs have a fact-filled taxobox and intro sentence, as well as a decent reference, so they are actually already quite rich in information, despite still being physically short. It may not seem so to you, but these stubs are already quite useful to a reader who is curious to find out more about a certain species. I also think you will find that throughout most of Wikipedia's biology coverage of individual species, especially those of invertebrates and lower plants, stubs are the norm rather than the exception. At the Gastropods project we have been creating rather similar stubs by hand for a very long time without any objections. Thanks for your interest, Invertzoo (talk) 15:42, 24 August 2010 (UTC)[reply]
Support – The bot doesn't do anything other than what we, the members of the project, have been doing manually all these years. The Gastropoda is one of the largest taxonomic classes in the animal world. Without a bot, we're facing an impossible task. The data from WoRMS are very reliable, made by the best experts in the world. You won't find a better expert anywhere to check these data, so who do you want to check them? As to the so-called mistake in Velutina, I advise the community to read the discussion at Wikipedia talk:WikiProject Gastropods#Phalium articles. The integrity of the content generated by the bot is not at stake; the bot permission is the real issue. This bot has saved the members of this project perhaps thousands and thousands of hours of work, generating all those new articles. Once an article exists, it is much easier to add information. I'm in the process of uploading to the Commons about 2,500 photos of shells of sea snails from an internet source with a license suitable for the Commons. This is an enormous job that can't be done by a bot, because each name has to be checked to ensure it is not a synonym. I cannot insert these photos into Wikipedia unless there is already an article about the genus or the species in question; otherwise, this would take me years if I have to create all those articles. For most people consulting Wikipedia about gastropods, and certainly for shell collectors, the photo is the most important part of the article. The text is more a matter for experts or knowledgeable amateurs, who understand what a nodose sculpture or a stenoglossan radula represents. JoJan (talk) 18:57, 19 August 2010 (UTC)[reply]
Support – As I see it, the bot is not a mere addendum, but a necessity. Taking into account the number of species described, we're dealing with the second most diversified animal phylum, the phylum Mollusca, and its largest class, the class Gastropoda. There are tens of thousands of extant and fossil gastropod species, and creating each one of those stubs would be an inhuman task... That's why we need a bot. WoRMS is not absolute, but it is one of the most reliable online databases available. I understand that, with proper supervision and due caution, no harm will come of Ganeshbot. Daniel Cavallari (talk) 00:10, 20 August 2010 (UTC)[reply]
Oppose as currently implemented. The lack of prior approval and poor communication skills by the bot operator and the project will continue to be a problem. The bot operator has now posted a list of hundreds of problematic articles, various types of synonyms that should be redirects rather than articles. The project members could have spent time looking for problems and readily found these, instead of fighting to protect the bot. It would have established a Wikipedia-beneficial future method for dealing with bad bot articles. These articles need to be fixed now; no bad taxonomic article should sit on Wikipedia while editors know it is bad. The bot operator created no plan for fixing these articles. Neither did the WikiProject.
In my opinion, a bot set up to scour multiple species databases at the request of a human editor could greatly benefit writers of species articles. The human editor could verify a dozen species in an hour or two, then ask the bot to create just the formatted article with taxonomy box, categories, and stub tags. This could save the human editor many hours of tedious work. The bot could get species from algae, molluscs, plants, dinosaurs. It could even be multiple bots, with a central page for requests. This would be the best of both worlds: more articles, decided by humans, tedium by bots. JaRoad (talk) 01:41, 22 August 2010 (UTC)[reply]
- Let me just say two things in response to JaRoad's comments. Firstly his assessment of our "communication skills" is based solely on his current personal perspective over the last several days, and as such it is arguably not at all relevant to the bot issue. Secondly and more importantly: if you talk to any invertebrate zoologist who is actually a taxonomist, he or she will tell you that articles or entries that use what may or may not be a synonym name are an extremely common occurrence, not only here on Wikipedia but throughout all writings on biological taxa, especially at the genus and species level. I think you will find this same issue within every large taxon of invertebrates that has not been exhaustively studied, whether the articles or entries are or were created by humans or by a bot. I would not even call these "bad" articles or "bad bot articles". The nomenclatural issues on many species of gastropods are extremely complex. First rate experts within the field very often disagree in quite polarized ways as to what the "correct" name should be for a species of gastropod. I can give you several examples if you like. There really isn't a way to simply "verify" species names as JaRoad suggests. Thank you, Invertzoo (talk) 03:13, 22 August 2010 (UTC)[reply]
The topic is the bot not me. Taxonomy is not the topic either. Editors make decisions about species validity on wikipedia. My suggestion is that only editors make these decisions. Although my suggestion is a counter proposal to this bot, this bot could make a useful tool as part of this counter proposal. I have not suggested any way to simply verify species names. JaRoad (talk) 04:49, 22 August 2010 (UTC)[reply]
- No, I am sorry, but you are quite wrong on this point, which is indeed about taxonomy and nomenclature. Editors on Wikipedia must not and can not make decisions about which species are valid; that counts as Original Research, which is not allowed here. All we can do is to cite a reliable reference to back up the use of a name as it is currently applied to a certain morphotype. The validity of a species and a species name is a weighty scientific opinion, which can only be determined by an expert researcher who knows the relevant historical primary literature well, who has consulted the relevant type descriptions in that family, and who has examined the actual type material for all of the claimed taxa, by visiting the various museums throughout the world that have the types of the relevant species and supposed synonyms and carefully examining that material. Invertzoo (talk) 15:24, 22 August 2010 (UTC)[reply]
Yes, they do. Wikipedia editors decide that WoRMS is a reliable resource and that its listing of species versus synonyms is going to be used; therefore WoRMS's listing of accepted names is a source for valid species. Then, if WoRMS is in disagreement with another secondary or tertiary source, the editor decides which of the two sources is the correct one for the name of the article, and how and why the other source earns a mention as to the controversy rather than being the name for the article. Mollusc editors have already decided that the chosen taxonomists on WoRMS will be the deciders of species names on Wikipedia; hence you have chosen to confer validity on the WoRMS set of species names, not all of which are accepted 100% by all mollusc taxonomists. This is done for all controversial species of any type of organism on Wikipedia. Maybe you only create articles about noncontroversial species.
Back to the suggestion I raised. This removes the wholesale stamp of validity on one database and returns it to where it belongs: to the editors creating the articles through secondary and tertiary resources. JaRoad (talk) 16:18, 22 August 2010 (UTC)[reply]
- This surmise of yours is wrong. The decision about articles always rests with human editors, who try to independently evaluate available information. They then make their own human decisions when one source is in disagreement with another. Things are being done exactly as you wish them to be done. --Snek01 (talk) 09:43, 23 August 2010 (UTC)[reply]
Arbitrary section break
To summarize the discussion so far:
- WikiProject Gastropods fully intends to create all these stubs anyway, and in much the same manner as the bot does. The bot just saves the tedium of actually copying and pasting the infobox and such.
- There is some concern over the accuracy of the WoRMS database, but it has been contended that the database is populated and actively maintained by experts in the field and thus should be reliable. Is there any reason to doubt this?
- There is concern that the 15,000 already-created stubs have not been reviewed by the project. Is there work on reviewing this backlog, and if so what is the progress? Is there any reason not to accept the suggestion that bot creation of more articles should wait until that backlog is taken care of?
- Note that that does not mean this BRFA should be postponed until that time, just that a condition of approval be "the bot will not start creating more articles until the existing backlog is taken care of".
- There is some concern that, as gastropod classification is changed and species are merged, no one will bother to update the many stubs created here. Is this a legitimate concern? Is this being considered?
- There is some concern that the classification system used by WoRMS is not generally accepted by mainstream scientists in the field. Is this a legitimate concern? Even if so, does the bot creation of these articles actually prevent proper weight being given to other mainstream classification systems?
Did I miss anything? Anomie⚔ 16:17, 24 August 2010 (UTC)[reply]
A few opinions:
- "...thus should be reliable. Is there any reason to doubt this?..." Again, why not do what a factory does: check random samples and set a standard for an acceptable percentage of faulty articles. Or, at least, figure out the magnitude of this problem within the 15,000 articles. We might be talking about only 30 articles.
- Tag specific groups of articles with an incertae sedis template that states something like "This is a group/clade/family of which the taxonomy may be in flux.."
- Establish a plan for the very valid concern that classifications WILL change.
- Keep producing articles. Incoming content and images will otherwise have nowhere to land.
- So, is this a debate over WoRMS and their people, or the endemic flux of the whole class? If it is the latter, then we should wait 30 years before producing stubs. We know that's not going to happen. So, even if it is the latter, produce the stubs and work around the problem.
- Anna Frodesiak (talk) 01:01, 25 August 2010 (UTC)[reply]
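Anna's factory-style sampling suggestion can be made concrete with a standard proportion estimate. A minimal sketch, with purely illustrative numbers (3 faulty articles found in a random sample of 400 stubs); the sample size and defect count are hypothetical, not measured figures:

```python
import math

def wilson_interval(defects, n, z=1.96):
    """95% Wilson score interval for the true defect proportion."""
    p = defects / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# e.g. 3 faulty articles in a random sample of 400 stubs bounds the
# plausible defect rate across all 15,000 at roughly 0.3% to 2%.
lo, hi = wilson_interval(3, 400)
```

The project could then compare the upper bound against whatever "acceptable percentage of defective articles" it agrees on, as Anna proposed above.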
- I am not at all clear as to what is supposed to constitute a "faulty article". To my mind, the great majority of all of our stubs are currently at a (relatively) good level of correctness (not that they are absolutely perfect; is there such a thing?), bearing in mind how crazy and how vast the field of gastropod malacology is, and this level of correctness applies both to those stubs that were made by hand and to those that were produced by automation. Such synonym articles as currently exist cannot really be considered "faulty", because the information is completely verifiable, even though it may not represent some kind of ultimate biological truth (if there even is such a thing). The supposed error in the few Velutina stubs is arguably not an error at all. The setup of each family is checked before the bot is run. If we are going to demand 100% accuracy in all stubs, or perhaps in all articles in the whole encyclopedia, then most work on Wikipedia will grind to a halt. We certainly do agree that it turned out the bot was not authorized to create so many stubs, and this is unfortunate, but almost all of us at the Project had no idea that the authorization was lacking. I feel it is important not to impose some kind of punitive demands as "retribution" for what was a genuine misapprehension. Thanks for your patience and understanding, Invertzoo (talk) 04:13, 26 August 2010 (UTC)[reply]
- The WikiProject is being asked to take some ownership of the issue, and to give a plausible assurance of planning and quality checking, in the terms outlined by Anna Frodesiak. "Faulty" means that when a clueful human reads and digests the article, including checking sources, that errors or strong defects are noticed. Not just a quick "that looks good", but a thoughtful appraisal of whether the article is sound and warrants inclusion in Wikipedia (Is it sufficient as it stands? Would it need significant improvement to merit being an article? How likely is it that thousands of such articles would ever be improved? Instead of articles, would the topics be better handled some other way, such as a list? Is it likely that classifications will change? How could that feasibly be handled?). Johnuniq (talk) 07:18, 26 August 2010 (UTC)[reply]
- Thank you Johnuniq for a very clear and cogent message that is also constructive and helpful in tone; that was a very welcome contribution to the discussion. Yes, the project can certainly set something up along the lines that Anna and you have suggested in terms of checking. Just so you know, Daniel and I for the last year have made our way through 6,000 of the older pre-existing stubs (many machine made dating from 2007, and many handmade from 2004 onwards) updating those stubs and fixing them up to reach a better quality and a standardized format. That work has included updating the taxonomy using the most recent overall system and many other improvements. So two of us at least are already used to working for a year with one approach to quality control. If you can give the Project some time to work out what would be the best system to check new stubs and the best system for updating taxonomy and nomenclature, and who will do what, that would be good. Unfortunately I am currently on vacation (until September 6th), so I cannot spare anywhere near as much time on here each day as I would at home. Best wishes to all, Invertzoo (talk) 16:23, 26 August 2010 (UTC)[reply]
- There are no known real issues with this bot. The generated stubs are useful, complete, and valuable as they are. Nobody has provided evidence of any problem.
- Nothing needs to be done with the generated stubs. Normal continuous checking for taxonomic updates would be fine for every species article, whether human-created or bot-created, but it is not necessary.
--Snek01 (talk) 00:26, 27 August 2010 (UTC)[reply]
- By "faulty article" I mean a small error in the taxobox or such. That's all. After all, these stubs usually contain only a single sentence stating that the subject is a gastropod, and what family it is etc. Simple.
- If 1 out of 1,000 stubs gets something wrong in the taxobox, I do not see that as a reason to stop the bot. It is doing more good than harm. Wikipedia must have an acceptable margin for error. I think, upon examination, that gastropod articles have fewer errors than general articles.
- Johnuniq wonders if such simple articles are worth existing if they consist of so little. Each species needs to be represented, even if only visited once a year. Articles get drive-by improvements from the large body of occasional users. The sooner Wikipedia has all species represented, the better. The world needs a comprehensive, centralized database. I'm thinking of the state of things in 10 years. Let's get critical mass. This whole problem of conflicting species info is related to lack of centralization.
- I would like to hear what Ganeshk says about bots handling sweeping changes to groups of articles when classifications change.
- Also, it would be nice to see an automated system for checking articles, if necessary. Anything to assist with or avoid manual checks.
- The bottom line for me is: if we deem WoRMS a good source within a reasonable margin of error, create all 100,000 articles and deal with problems en masse with bots.
- Finally, any comment on my suggestion for an incertae sedis template? Anna Frodesiak (talk) 01:35, 27 August 2010 (UTC)[reply]
- Anna Frodesiak, I am slightly concerned by your "the world needs a comprehensive, centralised database". You do realise that Wikipedia cannot fulfil this function (Wikipedia does not consider itself a reliable source). Elen of the Roads (talk) 09:23, 27 August 2010 (UTC)[reply]
- An unreliable source now. But Wikipedia is only a few years old. In a decade or two, who knows? Critical mass might be just what this class of animals needs. Anna Frodesiak (talk) 11:15, 27 August 2010 (UTC)[reply]
- Anna, to your question about the bot handling the changes, it will be difficult for the bot to update an article where the humans have done subsequent edits (the order is lost). The bot can create subpages similar to the unaccepted page to alert the human editors about discrepancies in status, taxonomy etc. — Ganeshk (talk) 11:47, 27 August 2010 (UTC)[reply]
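(As an illustration of the subpage-report approach Ganeshk describes just above: a bot cannot safely edit articles that humans have changed, but it can compare the data it used at creation time against a fresh WoRMS download and list the mismatches for human editors. This is a minimal, hypothetical sketch; the record fields, function names, and example species data are invented for illustration and are not the bot's actual code.)

```python
def find_discrepancies(created, current):
    """Compare data used at article creation with current WoRMS data.

    Both arguments map a species name to a record such as
    {"status": "accepted", "family": "Conidae"}.
    Returns a list of (species, field, old_value, new_value) tuples.
    """
    report = []
    for species, old in created.items():
        new = current.get(species)
        if new is None:
            # The name has disappeared from the source entirely.
            report.append((species, "status", old.get("status"), "removed from WoRMS"))
            continue
        for field in sorted(set(old) | set(new)):
            if old.get(field) != new.get(field):
                report.append((species, field, old.get(field), new.get(field)))
    return report

def format_subpage(report):
    """Render the report as a simple wikitext bullet list for a project subpage."""
    lines = ["Species needing human review:"]
    for species, field, old, new in report:
        lines.append("* [[%s]]: %s changed from '%s' to '%s'" % (species, field, old, new))
    return "\n".join(lines)

# Invented example data: one species' status changed since creation.
created = {
    "Conus exiguus": {"status": "accepted", "family": "Conidae"},
    "Conus imperialis": {"status": "accepted", "family": "Conidae"},
}
current = {
    "Conus exiguus": {"status": "unaccepted", "family": "Conidae"},
    "Conus imperialis": {"status": "accepted", "family": "Conidae"},
}

print(format_subpage(find_discrepancies(created, current)))
```

The point of the design is that the bot only ever writes to its own report subpage, never to the articles themselves, so human edits are never at risk.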
- But when classifications change, doesn't that usually just mean a search and replace? Anna Frodesiak (talk) 13:24, 27 August 2010 (UTC)[reply]
- It is not just a case of search and replace. Here is an example. I had to change the introduction sentence, add a new category and make other edits to accommodate the classification change. The bot cannot make these decisions. It will make a mess. — Ganeshk (talk) 13:58, 28 August 2010 (UTC)[reply]
- Ganesh is right when stating that the bot cannot handle the changes in taxonomy, only report them on a subpage. These changes have to be done manually (as I have been doing in the last few days) because there are sometimes ramifications into other genera. Every change has to be checked thoroughly. Also the new name may not have an article yet, either for itself or for the whole genus. This has complicated my task, keeping me busy for hours on one change from a synonym to the accepted valid name. That's why it's such a shame that the bot has been halted temporarily. It could have created these articles in seconds while it took me hours to do so.
- And as to the disputed need for all these stubs: I can state that these aren't really bare stubs, since they already contain a lot of information, namely the latest taxonomy (most handbooks and websites run far behind in this) and, where applicable, synonyms (again very useful for checking the validity of a name). From the moment they exist, it's easy to add the type species or even a photo. These are important characteristics, wanted by most readers of these articles (such as shell collectors). Text can be added at a later stage, and eventually it will be. Of course our ultimate goal is to add the finishing touch to each article, but that's a goal for the far future, unless a few hundred new collaborators join our project. JoJan (talk) 14:03, 27 August 2010 (UTC)[reply]
- I'm hearing two things:
- 1. The bot cannot handle changes.
- 2. The bot can create articles in seconds.
- My questions:
- How could a bot help with what you are doing right now?
- (Big picture): If the bot creates 90,000 more articles, and there are classification shifts, what then? Will we have an ocean of inaccurate articles with no automated way of fixing them? Anna Frodesiak (talk) 14:23, 27 August 2010 (UTC)[reply]
- The bot cannot help us with the changes, as this involves many things, such as deleting the talk page of the synonym (CSD G6) (which I can, as I'm an administrator) and creating new articles for a genus that was referred to (as I just did for Brocchinia). The new synonyms have to be included in the taxobox of the accepted name (and the accession date for WoRMS changed in the template). While doing so, I have sometimes noticed that there were additional new synonyms for the accepted name. These other synonyms have to be changed too. Furthermore, one has to choose between making a redirect to the already existing article of the accepted name, or moving the synonym to the not-yet-existing article of the accepted name. As you can see, this involves a lot of things that can only be done by us and not by a bot.
- I think Ganesh is best placed to answer this question. But, in my opinion, this shouldn't be too difficult for a bot to accomplish. JoJan (talk) 15:02, 27 August 2010 (UTC)[reply]
- So the answer to Anna's second question is yes, there might be a surfeit of articles needing changes, at least for a while? Ganesh and yourself both at one point seemed to be saying that it was not possible for a bot to make the changes, although having the bot make articles for all the new synonyms would be possible. —Preceding unsigned comment added by Elen of the Roads (talk • contribs)
- The answer is yes, it will take time for the changes to be fixed. But I won't call it an ocean of inaccurate articles. Out of 15,000 stubs, only 300 articles had a classification change in the last 6 months. If the Gastropod team continues to review the articles as the bot is creating them, we will not end up with a mountain of articles that need fixing. Already 30 articles out of 300 have been fixed. — Ganeshk (talk) 18:37, 28 August 2010 (UTC)[reply]
- Elen of the Roads:
- Why "...a surfeit of articles needing changes, at least for a while..."? Why just for a while?
- Why do bots make articles for new synonyms? Don't we get rid of those articles?
- Ganeshk:
- If the bot makes another 15,000 articles, won't 2% have problems, just like the first 15,000? Won't the sum total then be 30,000 articles all experiencing a 2% per six-month classification change? It seems that JoJan spent a lot of energy fixing 30 out of 300. I'm still a bit unclear about how to maintain 100,000 articles with such labour-intensive checking.
- Elen of the Roads:
- If I am talking nonsense, please say. Anna Frodesiak (talk) 22:10, 28 August 2010 (UTC)[reply]
- Anna, Wikipedia is mostly text based. That makes it difficult for computer programs (bots) to analyze and update. If MediaWiki (the software that runs Wikipedia) had some kind of database support, the bot could have easily kept the articles in sync with WoRMS. — Ganeshk (talk) 01:37, 29 August 2010 (UTC)[reply]
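(Ganeshk's point about text-based pages can be made concrete: since a wiki page gives the bot no structured way to merge its update with human changes, an article is only safe to regenerate wholesale if nobody but the bot has ever edited it. A minimal sketch of that rule, assuming a hypothetical revision-record shape; this is not the bot's actual code.)

```python
def safe_to_regenerate(revisions, bot_username="Ganeshbot"):
    """Return True only if every revision of the page was made by the bot.

    `revisions` is a list of dicts such as {"user": "Ganeshbot"}, oldest
    first. A single human edit makes automatic regeneration unsafe,
    since rewriting the page would overwrite that contribution.
    """
    return all(rev["user"] == bot_username for rev in revisions)

# Untouched bot stub: safe to rewrite from fresh WoRMS data.
print(safe_to_regenerate([{"user": "Ganeshbot"}]))  # True
# A human expanded the stub: the bot must report, not overwrite.
print(safe_to_regenerate([{"user": "Ganeshbot"}, {"user": "JoJan"}]))  # False
```

In practice a bot framework would fetch the revision list from the wiki's API; the check itself stays this simple.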
Another arbitrary section break
Overview from Wikipedia:Village_pump_(proposals)#Species_bot:
- Another five Wikipedians have shown support for this task of Ganeshbot.
- One Wikipedian (User:Shimgray) has shown support for generic articles only, while being "opposed to creating a large number of individual species articles".
- No one has expressed disagreement at the village pump.
--Snek01 (talk) 15:45, 29 August 2010 (UTC)[reply]
I would like to thank everyone for their comments (including those who have never edited any gastropod-related article and those who have never created one, and so have experience neither with this bot nor with gastropods). Let me summarize the task, so that everybody can be sure it is OK:
- 3) Additional checking and improving.
- Checks for NEW changes in the source are made yearly, half-yearly, or continuously, and those NEW changes are implemented if they are considered OK.
- Articles are improved during the normal editing process.
This describes how this has worked so far and how it works now.
Anybody can comment on any phase of this process at any time. Typical possibilities are:
- For a taxon article not yet created: "A better/updated source for the taxon EXAMPLE-TAXON-1 is EXAMPLE-SOURCE-1. Use that source instead of WoRMS."
- For already created taxon articles: "A better/updated source for the taxon EXAMPLE-TAXON-2 is EXAMPLE-SOURCE-2. Update it (manually or robotically)."
WikiProject Gastropods members will be happy to do it. Put your notice at Wikipedia talk:WikiProject Gastropods.
Note that this (formal) request for approval deals with phases 1) and 2) only. If somebody has comments on phase 3), feel free to share your opinions at Wikipedia talk:WikiProject Gastropods. Thanks. --Snek01 (talk) 15:45, 29 August 2010 (UTC)[reply]
Snek01 - unless you are a BAG editor, you can't close this. Apologies if you are a BAG editor, but the little table on the requests for approval page says that you are not. The instructions are specific that it has to be closed by a BAG editor (I would have expected one without an interest in running the bot, but it doesn't say that anywhere, so perhaps not). Elen of the Roads (talk) 17:06, 29 August 2010 (UTC)[reply]
- Elen, I don't see where Snek01 mentioned that he is closing the request and approving it. He was posting a summary of the discussion at the Village pump and on this page so far. At least that is how I read it. — Ganeshk (talk) 17:26, 29 August 2010 (UTC)[reply]
- (edit conflict) The list of BAG members is at WP:BAG, and Snek01 is not one. I'm not sure where exactly Snek01 supposedly closed the discussion (all I see is an attempted summary), but to be 100% clear: this BRFA is not closed yet, and the bot does not (yet) have permission to create any additional articles. Anomie⚔ 17:28, 29 August 2010 (UTC)[reply]
Apologies to Snek01 if I have misread his post. To be clear, I thought it was a genuine error on his part...but accept it seems to have been a genuine error on mine. Elen of the Roads (talk) 22:12, 29 August 2010 (UTC)[reply]
I have comments, but cannot post easily on this long post. Of course, I risk an incorrectly-spelled-word attack tangent, among other tangents, by gastropod project members. And I would like my concerns addressed. Wikipedia is an encyclopedia, not home to the latest taxonomy of gastropods, but to the most robustly accepted taxonomy. What gastropod project members are doing needs to be addressed more widely. JaRoad (talk) 17:40, 29 August 2010 (UTC)[reply]
Strong Oppose Looking at the Conus articles, these are all IDENTICAL! I strongly oppose any attempt to automatically create completely identical stub articles. The fact that a species exists does not mean that there needs to be an article on it that has absolutely zero unique information. That is what Wikispecies is for. Create redirects, but not a word is more useful than the genus article. Even the source database has virtually no information on these species. It is absurd to create thousands of articles with two expansion templates on them that will not (or simply cannot) be solved. And at the very least, please don't use quotation marks where they shouldn't be. The Conus species sting, not "sting." Reywas92Talk 01:25, 5 September 2010 (UTC)[reply]
- The Conus articles are similar where they need to be similar (articles on species within one genus will always be very similar when they are quite short,) but they are not identical. Each species has its own authority and date listed for the species name. This enables a researcher to go to the primary literature and find the original description, so it's very essential. Many have a list of synonyms, an important and useful feature. As you will also see if you look at (say) the first 10 articles, quite a few of the Conus species articles already have things such as images added to them, as well as the common name where applicable, distribution info, and so on. The reason "stinging" was in quote marks is because the tooth and venom apparatus of the cone snail is not what most people think of as a sting: it is not primarily defensive and does not protrude from the hind end of the animal. Instead it originates in the mouth and is applied out of the snout of the creature. It is used to immobilize prey before eating, so in several respects if you are hit by a Cone snail it's a lot more like a rattlesnake bite than a bee sting or scorpion sting. Best, Invertzoo (talk) 23:02, 13 September 2010 (UTC)[reply]
- These creatures are not notable on their own. Few meet the GNG. Existence ≠ notability. I'm sure fantastic lists could be generated for the species of each genus that incorporate the authority and date, synonyms, common names, and distribution. When someone is able to add further information, then create the article. Perhaps the bot could create redirects to the genus articles while also inserting the generic template until each can be expanded to be useful. There are over sixty thousand gastropods, and there should not be a nearly empty article for each and every one of them, or even ten thousand. And just because it is not the traditional connotation of stinging, the quotation marks are incorrect. I would suggest instead clarifying in the article what you described above, because the quotation marks are meaningless. Reywas92Talk 23:57, 13 September 2010 (UTC)[reply]
Support - It seems that the concerns have been well thought out by the users proposing this bot and that it would serve a beneficial service. I support its creation. Antarctic-adventurer (talk) 18:56, 9 September 2010 (UTC)[reply]
Support – As a WikiProject Gastropod member I fully support Ganeshbot.
Seascapeza (talk) 18:06, 12 September 2010 (UTC)[reply]
Oppose – The bot would be better off compiling a small number (possibly 1) of list articles than populating the wiki with uninformative stubs that are unlikely to be expanded much in the foreseeable future. See my comments at Wikipedia talk:WikiProject Tree of life#The Great Bot Debate. --Stemonitis (talk) 05:46, 14 September 2010 (UTC)[reply]
- All of them are being expanded and are certainly likely to be expanded. In the foreseeable future, probably tomorrow, over 4,000 stubs can be robotically expanded with description and ecology data. We have enough data for this. But first the stubs have to exist. Do 4,000 articles expanded by the end of 2010 count as the foreseeable future? --Snek01 (talk) 10:34, 14 September 2010 (UTC)[reply]
- But this is all guesswork. We cannot see into the future. There are currently no plans that I'm aware of to fill these articles with ecology, descriptions, etc. Experience suggests that the vast majority of the articles created en masse like this are not substantively expanded in the short or medium term. Most of Polbot's articles on species on the IUCN Red List (see, for instance, Category:IUCN Red List endangered species) haven't been massively altered, for instance, although they have been repeatedly modified/updated/tweaked, diverting industry that could have been used elsewhere. If you have the data to create 4000 decent articles, with meaningful data, then please submit a proposal to create those. I think we'd all support that. That does not mean, however, that 4000 one-sentence articles are equally desirable. --Stemonitis (talk) 14:41, 14 September 2010 (UTC)[reply]
- If you cannot see into the future, then that is your problem. WikiProject Gastropods members can foresee the future of gastropod-related article creation more accurately than any other Wikipedian. They know the project's plans, and they know from practical experience the advantages and disadvantages of bot-generated articles. If you want to know more than just this one task of the bot, read Wikipedia:WikiProject Gastropods (for example [7]). Experience suggests that all 15,000 articles created by Ganeshbot are considered useful, and no errors were found. Imagine that somebody suggested a bot for creating articles on towns in the USA containing six different types of information. Would there be any problem? Would somebody be asking about their immediate expansion? If you personally consider those stub articles useless, then I respect your opinion. But other people consider them useful, because they provide valuable information, and the purpose of Wikipedia is to provide information. ADDITIONALLY, they are considered very useful by members of one certain WikiProject. A project is something that is "carefully planned to achieve a particular aim". I do not think that all members of WikiProject Gastropods are so stupid as to suggest something harmful. --Snek01 (talk) 16:47, 14 September 2010 (UTC)[reply]
- Right, because when there's nothing other than names and synonyms, there aren't going to be errors. If it is correct that the articles can be robotically expanded with description and ecology data, then why isn't that part of the immediate plan? Start with a bot that will create articles with a paragraph of unique info, not a single identical sentence. As described below, comparing gastropods to American towns is a false analogy: those have millions of potential contributors and have indeed grown substantially. The only likely contributors to these are a niche WikiProject, and experience shows that obscure species stubs barely grow at all. It would be much more beneficial to create fewer quality articles at a time than thousands of substubs at once. Reywas92Talk 00:57, 15 September 2010 (UTC)[reply]
Support - If the source database is up to date, it is very useful to have the basic framework of the species pages up and running. It ensures the taxonomy is correct, taxoboxes are present, the authority is present, and synonyms are added (and also redirected?). It also ensures there is at least one reference. If the project members are confident they can expand the articles, I don't see a problem. They are the ones actually working on these articles, so why would other people frustrate these efforts? Furthermore, there is a lot of debate about bot creation, but what about AWB? See User:Starzynka and his creations. This bot is doing a far better job than the stuff he or she is creating, and nobody seems to be bothered with that. All the messed-up species and genus pages I have come across were not made by a bot using a reliable source, but by users taking a list of red-linked articles and creating pages en masse using AWB without adding any additional info or even checking whether the list they are working from is actually correct. That, in my opinion, is something everyone should oppose. Not this though, because this bot is actually doing good work. Ruigeroeland (talk) 07:50, 16 September 2010 (UTC)[reply]
Attempting to move this forward
I see three major objections to this task above:
- The reliability of WoRMS. It seems that this is best determined at Wikipedia talk:WikiProject Gastropods, as the articles are being proposed for creation. I also note the existence of WP:NOTUNANIMOUS and the fact that all except one editor seem to be ok with WoRMS as long as the articles are reviewed by a human after creation.
- The ability of the project to review the articles as they are rapidly created. This can be handled by restricting the bot to creating articles only as long as the number of articles awaiting review is not too large.
- The ability of the project to update articles as the taxonomy changes. This problem already exists anyway and will grow anyway as the articles are manually created, and it seems the project is already working on this problem. Please continue the discussion of this problem elsewhere, as it seems tangential to this BRFA.
Taking into account the concerns expressed above, I propose the following:
- The existing 15000 articles created without approval, minus any already reviewed by the project, are awaiting WikiProject review.
- Ganeshbot will not create any gastropod articles while the number of articles awaiting WikiProject review is more than 500. The review process must ensure each article is a good stub that "is sound and warrants inclusion in Wikipedia" (to quote a proposal above), and should expand the articles to at least "Start" class as much as possible.
- Creation of a set of gastropod articles by Ganeshbot will be proposed at Wikipedia talk:WikiProject Gastropods. The members of the project and any other interested editors will discuss and come to a consensus on whether the proposed set of articles is desirable, and any necessary details of content or formatting to be included.
This is basically what is proposed by GaneshK and WikiProject Gastropods, with the rate of creation automatically limited to match the project's review capacity. The identification of articles awaiting review can be done by listing them on a WikiProject subpage with editors removing articles from the list as they are reviewed, or by applying a template and/or hidden category to the articles that editors will remove as the article is reviewed. If the latter, I would also approve the bot to run through the 15000 articles already created (and only those articles) to append the template/category to any article that has not been edited by one of the WikiProject Gastropods reviewers since creation.
I am inclined to approve the bot under these terms if it seems generally acceptable. Comments? Anomie⚔ 18:55, 29 August 2010 (UTC)[reply]
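(The throttling rule in point 2 of the proposal above, that the bot must not create articles while more than 500 await review, can be sketched as follows. This is purely an illustration of the proposed limit; the function, its parameters, and the example titles are invented, not anything Ganeshbot actually runs.)

```python
def articles_to_create(pending_review, proposed, limit=500):
    """Return the slice of `proposed` titles the bot may create now.

    `pending_review` is the current number of bot-created articles still
    awaiting WikiProject review (for example, the size of a hidden
    tracking category); creation stops entirely once it reaches `limit`.
    """
    capacity = max(0, limit - pending_review)
    return proposed[:capacity]

# 480 stubs awaiting review: only 20 more may be created this run.
print(len(articles_to_create(480, ["Conus sp. %d" % i for i in range(100)])))  # 20
# Backlog over the limit: the bot creates nothing.
print(articles_to_create(600, ["Conus textile"]))  # []
```

The effect is that the creation rate automatically matches the project's review capacity, rather than relying on the operator to pause the bot manually.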
- Anomie, the accuracy of the articles was never under question. So your requirement that all of the existing articles be reviewed "that they are sound and warrant inclusion in Wikipedia" and expanded to "Start class" is unacceptable. I am requesting for an approval without any limitations on the quantity. If this is not acceptable to BAG, please reject this request. — Ganeshk (talk) 19:24, 29 August 2010 (UTC)[reply]
- Indeed, the accuracy of the articles, at the moment of creation by the bot, is not questioned. WoRMS has the best experts in the world. Checking them ourselves would be original research and a breach of Wikipedia policies. The so-called inaccuracies are actually changes in taxonomy, and this happens all the time (especially in the last few years, since genetic research has been used to determine exact positions in the taxonomy). This doesn't concern this bot, since it only creates new articles and is not concerned with changes within existing articles. JoJan (talk) 19:55, 29 August 2010 (UTC)[reply]
- If the bot is requesting approval for the creation of stubs with no limits on quantities, it's asking for the ridiculous. With this demand, there is no point in further wasting anyone's time with discussion. I request BAG close this request to allow a bot to create an unrestricted quantity of articles as "denied." JaRoad (talk) 20:00, 29 August 2010 (UTC)[reply]
The question would seem to come back to - do we want 100,000 one line + taxobox articles created automatically all in one go.--Elen of the Roads (talk) 22:18, 29 August 2010 (UTC)[reply]
- Yes, we need fewer than about 10,000 more marine gastropod articles created by the bot to finish the task. Those articles come with classification, authorities, in some cases the place where the species was described, synonyms, a proper introductory sentence, and vernacular names. All of this the bot provides. Be serious about the numbers and do not spread such scare figures, please. --Snek01 (talk) 09:57, 30 August 2010 (UTC)[reply]
- A limit to checking backlog before more stubs created is not necessary per JoJan's statement just above: ("...Indeed, the accuracy of the articles...")
- A condition that new stubs be expanded to "start" is unrealistic and unnecessary.
- Consensus on necessary details of content or formatting of proposed set of articles is fine. Consensus on desirability or necessity of proposed set of articles is not.
- An Unreviewed category is fine with me.
- Again 100,000 one line stubs is fine, and necessary. This is what Wikipedia is about: Creating a page to which content can be added. No page, no home for photos or details. Give the masses of occasional users (who otherwise won't create an article, or may create one with a flawed or missing taxobox) a place to add their content.
- JaRoad: If WoRMS as a source is fine, then there is no reason to limit quantity, provided that classification updates can be performed en masse.
- These articles will be created anyway. I made hundreds without a bot. They were far less accurate, and all needed checking. It took many, many hours. No bot, and I will have to get back to making them myself. Nobody wants that.
- If a human and not a bot were creating these, there would be no question or complaints.
- A percentage of random, newly created, accepted non-gastropod articles will contain many inaccuracies. That is considered acceptable. The margin of error in the bot stubs is relatively tiny.
- These bot articles, upon creation in the mainspace, are fine. As JoJan has stated, "... the accuracy of the articles, at the moment of creation by the bot, is not questioned." Anna Frodesiak (talk) 01:07, 30 August 2010 (UTC)[reply]
I see my attempt has failed. I'll leave it to another BAGger to figure out consensus here. Anomie⚔ 11:34, 30 August 2010 (UTC)[reply]
{{BAGAssistanceNeeded}} User:Anomie has been "inclined to approve the bot under [his own] terms". Do not create your own terms. Terms are already created, and they are called Wikipedia policies. Also, User:Anomie is welcome, because he is familiar with the subject. --Snek01 (talk) 13:36, 30 August 2010 (UTC)[reply]
- One Wikipedia policy is that bots obtain approval from the community before operating. Another Wikipedia policy is community input. It concerns me, this unwillingness to discuss the issue with the community and get input from the community, coupled with antagonism toward members of the community who attempt even to move forward with the bot. This appears to be headed for long-term insurmountable communications issues. There was clearly no authority to create the 15,000 bot articles, when the bot operator emphasized his intention to create only the articles for a single genus or family. If this bot is authorized, what will project gastropod members take that to mean? If this bot moves forward, the strictest of terms and a clear-cut understanding must be established. --JaRoad (talk) 06:23, 16 September 2010 (UTC)[reply]
One of the issues here is about the need and notability of simple stubs. So, I posted here. If consensus can be reached on that matter, we can focus on the remaining issues. I hope this helps. Anna Frodesiak (talk) 05:56, 14 September 2010 (UTC)[reply]
Hi Elen: As I have just expressed to Invertzoo, I am becoming increasingly concerned with the potential maintenance of bot-created gastropod articles. This seems to be a group in such flux, that it might be unrealistic to expect that we could keep up with the changes in taxonomy.
I am a strong supporter of bot-created stubs that contain only rudimentary information. I think that this is in keeping with the idea of Wikipedia. However, this might be appropriate only for taxonomically stable groups, but not for gastropods. Anna Frodesiak (talk) 12:47, 14 September 2010 (UTC)[reply]
- Agree there are fewer problems with e.g. creating stubs for geographical locations, as there are fewer changes outside of the most volatile areas of the world. ALSO though, there are more people likely to update e.g. geographical info (that's my/mum's/gran's hometown; I stayed there, uncle Bob lives there etc.) than there are people with enough knowledge and interest to update gastropod articles. Elen of the Roads (talk) 13:46, 14 September 2010 (UTC)[reply]
- Although the project gastropod members seem intent on up-to-the-minute taxonomy changes, I point out that this is an encyclopedia and many of the resources in WoRMS are primary research with only one publication. There should be no attempt to update gastropods minute-by-minute, this is not a taxonomy journal, it's an encyclopedia, and information should be available in published sources, not a single rewriting of a taxon.
- The community of wikipedia writers that creates the species articles on wikipedia has pretty much come to the consensus that each species is important enough to merit an article, even if it is only a stub. I prefer having the species in a list in a genus article, if there is no distinct information on the species, but the articles do not require instantaneous updating. It takes years, even in the world of gastropods, for new taxonomies to be accepted. It used to take decades. It's not necessary that someone is updating them. --JaRoad (talk) 06:23, 16 September 2010 (UTC)[reply]
It might be wise to separate the matter into three distinct parts:
- 1. bot creating gastropod articles
- 2. worth of resulting stubs (currently under discussion)
- 3. management of a large number of gastropod stubs with changing taxonomy
- 1. There appears to be substantial proof that the bot creates accurate stubs from a good source.
- 2. Some say lists would be better.
- 3. With a (very approximate) 4% annual change in taxonomy, can the project maintain updates? If not, what value would remain in the stubs? Only the authority is static in the stubs. Plus, of course, improvements such as content and images. Ganeshk previously said that bots cannot help update taxonomy. But, JoJan recently stated: "Changes in taxonomy can be reported by another bot...".
Anna Frodesiak (talk) 22:13, 14 September 2010 (UTC)[reply]
- The taxonomic changes should not affect this discussion. Taxonomic changes would also affect manually created articles. One could argue it is better to have all species, so you can implement the changes to all at once, rather than having only a few, with the risk of new ones being created by inexperienced contributors using an old classification which is probably more widely known and found on internet. Ruigeroeland (talk) 14:59, 15 September 2010 (UTC)[reply]
- Furthermore, I would suggest to let the bot at least make genus articles with all species in it. I did this manually for the moth family Tortricidae and it works like a charm. It also prevents people from putting orphan tags on every species article you make. Ruigeroeland (talk) 15:03, 15 September 2010 (UTC)[reply]
Another issue: there is a comment by Invertzoo at the WikiProject Gastropods talk page about less than ideal choices made for the bot articles: "If a few of them have things that some people argue are less than ideal, these are human choices, and were not due to the bot; plus these supposedly less than ideal things are a matter of opinion, not of fact."
I would like to know what human choices were made that contributed to "less than ideal" results in the bot articles. If there are "less than ideal" choices being made, will they continue to be made, will every comment by someone outside of Project Gastropod be dismissed as a "matter of opinion," and what, if anything, is Wikiproject Gastropod doing about this to allow all interested members of the community to weigh in? Where are these choices being made? Where were comments made about these "less than ideal" choices?
- Again, a point completely ignored multiple times: up-to-the-minute taxonomy is not encyclopedic, as it arises in the primary literature, which is not an appropriate source for an encyclopedia. Are the bot and the project members using primary research from WoRMS to create these articles? If this is the case, I request the stubs be deleted. --JaRoad (talk) 17:12, 18 September 2010 (UTC)[reply]
- "If this is the case, I request the stubs be deleted." That is plain nonsense. Would you like to put outdated info on wikipedia because that is encyclopedic? If that is the case, go work on a paper encyclopedia. The great thing about wikipedia is the fact you can include the latest information. Ruigeroeland (talk) 17:45, 18 September 2010 (UTC)[reply]
- Hmmmm. I suggest the bot not be approved and the stubs be deleted. It seems that wikiproject gastropod members, at least one and assuming Ruigeroeland is a member, have confused the encyclopedia for their personal taxonomy journal.
- Operating or authorizing the operation of a bot to create 15,000, much less 85,000 stubs for a small group of editors unconnected to the goals and means of producing an encyclopedia is irresponsible. --JaRoad (talk) 18:06, 18 September 2010 (UTC)[reply]
This was a reply to JaRoad's message before this last one, but it got caught in an edit conflict:
- Hello again JaRoad. We are all working to try to improve the encyclopedia. When you, or anyone else, finds something you feel is "less than ideal" in a group of species articles, (such as the sentence I had created a while ago for the family Velutinidae), please feel free to Be Bold and immediately change it to something better; that is the Wikipedia way. You can leave a friendly note on the Project talk page too if you like, but that is not required. As for WoRMS, it is not primary literature, it is secondary literature. It cites sources in the primary literature. See for example [8]. Best wishes to you and to all, Invertzoo (talk) 18:16, 18 September 2010 (UTC)[reply]
- Then, when you said:
- "5. There is no solid evidence that any of our new bot-generated stubs have genuine errors in them. If a few of them have things that some people argue are less than ideal, these are human choices, and were not due to the bot; plus these supposedly less than ideal things are a matter of opinion, not of fact. In our experience over the last 3 years, human-generated stubs have been found to be more likely to contain small errors (such as typos or small omissions) than the Ganeshbot stubs have been."
- ... where and how was your "less than ideal" "human choice" originally suggested, specifically the choice(s) that led to a "less than ideal" article? And how and where did the discussion about its "less than ideal" nature occur? It is not clear from the example you give, as your example does not appear to be a bot article. Knowing how the articles are created and how the choices are made helps in making sure they are without "human choices" that are "less than ideal."
- If the bot is to make these articles, then other wikipedia writers have the right to know how problems are both prevented, by discussion beforehand, and how they are corrected, by discussion after. Both how and where would be cleared up by showing this example of the "less than ideal" article and its resolution.
- I suggested, also, that a single taxonomist's database compilation of the primary literature is not quite secondary literature, unless that taxonomist is including, in the database, a review and discussion of the reasoning. Is WoRMS doing this? I cannot find these types of citations in the links included in the bot articles or in the subsequent corrections made to them. If this is the source of the information for Wikipedia articles, a review and/or discussion of the taxonomic revision, then the Wikipedia articles must cite the taxonomic revision, not just link to the taxon page at WoRMS. This would require the bot to add a line sufficient for the reader to tell what the source of the information is. Considering the dynamic nature of gastropod taxonomy, this taxonomic revision should also have more support than just the database. Or is WoRMS a single taxonomic expert compiling the most recent primary literature on a taxon, without a discussion and review? In that case, the information must be supported by an additional source, and this would require the bot to post suggested stubs, and an editor to add the correct additional information. --JaRoad (talk) 18:37, 18 September 2010 (UTC)[reply]
- Then, when you said:
- By the way, Ruigeroeland is not a member of the gastropod project. Ruigeroeland works on Lepidoptera: butterflies and moths. Invertzoo (talk) 18:22, 18 September 2010 (UTC)[reply]
- Hello again JaRoad. I am sorry but I do not find your definition of WoRMS as being a primary source acceptable within professional limits. As for the process we use to set up the bot articles, it is visible on the project page and in the archives of that page. We have tried over time to answer all of your various and numerous objections and questions, but the focus often appears to shift, which makes it difficult. I do understand that you strongly disapprove of these bot stubs, that you want them all deleted and that you want no more to be created, but frankly I feel that your grounds for requesting those actions are not really sufficient to warrant such extreme courses of action. Very few (if any) other editors seem to agree with your extreme stance on these points. On Wikipedia generally I think it is accepted that we all need to be able to tolerate a state of affairs where things are moving towards a high degree of excellence, but are not there yet. A certain amount of imperfection is acceptable as one moves along towards the goal. Thank you for your interest and for your efforts to improve the encyclopedia, Invertzoo (talk) 22:11, 18 September 2010 (UTC)[reply]
arbitrary section break
Here are the points Project Gastropods would like to make:
- We completely agree that the bot was not authorized to create all these new stubs (other than the Conus stubs.) However, this was a genuine oversight and misunderstanding on our part, not a deliberate flouting of policy. We should not be subjected to punitive measures or special restrictions as a result of this unfortunate misunderstanding. Any new suggestions as to how we should proceed at this point in time should be thought out as a completely separate issue.
- 1. There is consensus at "Tree of Life" that species are intrinsically notable and that species stubs are valuable to have for the reasons suggested: easy expansion, easy and fast adding of images and other info.
- 2. There is no Wikipedia guideline against stubs or against the number of stubs a project should have.
- 3. Thus there is no formal limit to the number of stubs a project should currently have, assuming the stubs are not full of errors. (In any case we are generating stubs by hand every day and have been for several years without a formal checking system in place for content errors; this is commonly the case in the rest of Wikipedia.)
- 4. At the Project we check the content of a planned family of bot-generated species stubs carefully before they are produced. After they are produced, a few in each family are checked by hand. It is not necessary to manually check every single one of them, as the content is standardized, and assuming the bot is running normally, the stubs within one family will all be similar in structure. The bot seems sound, and no problems with it have shown up yet, but were the bot to develop a weird glitch, that should be immediately obvious from checking a few stubs in each family. In the unlikely circumstance that a glitch were to come into effect, an automated fix should be quite easy and fast to implement.
- 5. There is no solid evidence that any of our new bot-generated stubs have genuine errors in them. If a few of them have things that some people argue are less than ideal, these are human choices, and were not due to the bot; plus these supposedly less than ideal things are a matter of opinion, not of fact. In our experience over the last 3 years, human-generated stubs have been found to be more likely to contain small errors (such as typos or small omissions) than the Ganeshbot stubs have been.
- 6. If one word is considered seriously misleading, or if quote marks did need removing around one word in a large batch of stubs, that could be changed by automated software in a matter of seconds.
- 7. If taxonomy on the family level or below subsequently becomes somewhat out of date due to revisions by experts, or if the nomenclature has been tweaked subsequent to the creation of the stub, this does not in any way invalidate the species article, and should not be considered an error.
Points explaining why it is so important and valuable to us (and to Wikipedia) to have a full set of stubs to cover the whole class of gastropods:
- 1. We are constantly finding new free images which can be added in to new stubs, that is assuming the stubs are available. JoJan currently has shell images of 2,500 species (!) that are waiting to be added to the project. A leading malacologist in Brazil has also offered to give us a large number of free images. We are constantly finding other new information (with refs) that can rapidly be added, that is, if a stub already exists. When stubs are not pre-existing, having to create a new stub by hand every time you need one is a slow process which is very wasteful of human time and energy. User:Snek01 creates a few new articles on species almost every day of the year (!) If he could use a pre-existing framework of stubs, that would enormously increase his productivity, enabling him to fix up and flesh out far more articles each day.
- 2. Wikipedia works precisely because people enjoy expanding articles and fixing them up. This is a situation where Wikipedia can really benefit from a "Be Bold" approach.
- 3. The Gastropods Project staff has expanded very significantly over the last 3 years, from 5 to 23; nine new people joined in 2009 alone! Even though not all of the 23 editors are extremely active, it does show that in another year or so we might have significantly more manpower and possibly manpower that is a lot more expert. We may have people who are quite willing to take on one whole superfamily or another large taxon and fix up all of the stubs in that taxon. We must think of the future as well as the present and we need to have faith in the overall Wikipedia process, which has proven to work so well over time.
Additional commentary:
- 1. The stubs which were generated recently are based on the website WoRMS and [9], the highest quality secondary source available to us, and one which is managed by some of the best professionals worldwide in malacology. JoJan is now in frequent email contact with WoRMS, so any ambiguities can be cleared up quickly.
- 2. As the WoRMS website corrects or updates their listings, these updates can readily be checked once a month and listed for us by a bot (this has already been done once) and the changes can then be implemented by hand by project members, as JoJan has been doing in the last week or so. When checked once a month, the number of changes will not be more than we can easily manage.
- 3. WoRMS covers only marine species; thus land snails and freshwater snails are not covered. One of our members has created a count of the stubs on extant marine gastropods that have already been created, and those that remain to be created.
- (Note: not counted 9 families of Sacoglossa with 284 species.)
Extant marine gastropods:
- done by Ganeshbot: 132 families; 15,000 articles (species + genera)
- to be done from WoRMS: 137 families; my guess is about 3,000–5,000 species, but it is certainly less than 10,000 articles
- I have tried to count families precisely, but this table serves as a very approximate overview only. It clearly shows, though, that the most diversified families are already done. --Snek01 (talk) 23:00, 14 September 2010 (UTC)[reply]
Thank you all for your patience. Invertzoo (talk) 13:43, 19 September 2010 (UTC)[reply]
- A couple of points, first about how ambiguities will be cleared up:
- "JoJan is now in frequent email contact with WoRMS, so any ambiguities can be cleared up quickly."
- Why do ambiguities have to be cleared up via e-mail with WoRMS? If the literature is not clear, then this ambiguity in the secondary literature is what should be included in the article, not the results of a personal discussion. There's no way that JoJan's e-mails with WoRMS can meet verifiability. Also, one use of the primary literature for an encyclopedia is for an editor to clear up the secondary literature. Again, if it's not clear, then it's the verifiable ambiguity which belongs, not the clarification from a personal interview via e-mail. A personal e-mail clarification is original research.
- "The threshold for inclusion in Wikipedia is verifiability, not truth—whether readers can check that material in Wikipedia has already been published by a reliable source, not whether editors think it is true."
- Second, I'm disappointed that project gastropod members cannot even see how others can question their ability to understand what was authorized and worry about future instructions. This is what was asked for:
- "Gastropod project would like to use automation to create about 600 new species articles under the genus, Conus. This task has been verified and approved by the project."
- "This task is no way close to what AnyBot did. I had clearly mentioned that about 580 stubs will be created (nothing more)."
- It's clear you were not communicating with the bot operator. I keep trying to find the discussions about the families that led to creating 15,000 articles, but there don't seem to be any, just a list with sentences. Where is the discussion on project gastropod that shows how you came to the conclusion that this authorization was for 15,000 articles, or the discussion about how you will deal with a future approval, guaranteeing you don't again misinterpret "about 580 stubs will be created (nothing more)" as authorization for 15,000 articles?
- Can you quote the numbers more precisely as to how many articles will be created? Will authorizing 5000 lead to 150,000 articles?
- Your misunderstanding is outrageous, and it is tied directly to a statement designed to reduce fears of the bot producing thousands of unwanted stubs. The other bot created far fewer articles than this bot ultimately has created.
- The vast chasm between what you initially asked for and what you eventually created concerns me still.
- Third and most important:
- If you're creating groups of identical species stubs, there's no need to create anything but a genus article with a list. When pictures are added, the article can be edited and saved under the genus name with the addition of the binomial to the taxobox and to the lead sentence, plus the picture.
We have answered all of these points at least once before, sometimes in great detail. Thank you. Can I ask you to please let everyone know what the IP address was that you edited under, quoting from your current user page (User:JaRoad) "for about 5 years", before you registered as JaRoad only 5 weeks ago? All of our histories are completely open and available for anyone to peruse, as is that of our Project; your history on the other hand is a mystery. For someone who makes such sweeping demands for deletion of articles, it would be good to be able to view your history on Wikipedia. If you have nothing to hide, I cannot imagine any reason why you would want to withhold that key piece of information. Thank you for your cooperation, Invertzoo (talk) 21:45, 19 September 2010 (UTC)[reply]
- Invertzoo, there is an aspect in which your challenge is irrelevant. It remains the case that Ganeshbot was only authorised to create 500 articles, and somehow created 15,000 instead. It also remains the case that the wider community is somewhat uncertain of the value of 15,000 articles that contain no real information - although I do accept that adding pictures is adding value, can I ask how many of Ganeshbot's articles now have pictures added? It is also potentially the case that "I am in contact with WORMS by email" is not a reliable source.....?? While project gastropod's ambition to have an article on every single slug on the planet is admirable, using a bot to create 100,000 articles without information is perhaps...of less value to the project as a whole??? And demanding JaRoad out himself really does go slightly beyond the pale, I think. Elen of the Roads (talk) 22:53, 19 September 2010 (UTC)[reply]
- JaRoad is a newbie who serves on Wikipedia as a troll, whose primary edits are not to articles, but to talk pages. I have to presume good faith, so I have no right to think that JaRoad could be, for example, an online seller of shells, a sockpuppet, or anyone else with bad intentions for Wikipedia. Also, the questions by Elen of the Roads are being repeated. Over 3,200 articles have image(s), but it is not known how many of them were created by Ganeshbot. I guess that about 1,000 articles by Ganeshbot were improved with image(s) during only 6 months, but nobody knows it precisely. Certainly it can be said that a large proportion of images, hundreds of them, could not be effectively used on Wikipedia without Ganeshbot. Anyway, this question is irrelevant for Ganeshbot approval, as are many others. The key questions are: 1) Is Ganeshbot against guidelines? NO. 2) Is Ganeshbot adding information to Wikipedia? YES. Nobody, not even Elen, can claim that Ganeshbot is creating articles without information. One of the most strategically valuable pieces of information added by Ganeshbot is its synonyms section, which prevents users from creating duplicate articles under a synonymous name (because of the existence of full-text search). Even the mere existence of a blue link instead of a red link provides valuable information in articles like the List of marine gastropods of South Africa and similar ones. For example, the first family in that list has 10 articles, all(!) of which were created by Ganeshbot; 6 of those 10 articles already have image(s), and some of those blue links redirect to an article with a different name, so later this will result in updating of the names in the list. --Snek01 (talk) 01:20, 20 September 2010 (UTC)[reply]
- I must agree with Snek. Although I don't think he is a troll, I think this has been a huge drain on the energies of the project. This has gone on long enough. We have addressed his concerns thoroughly and thoughtfully. Can a decision be rendered now? If the bot is approved, wonderful -- 2,300 images await homes. If not, I will begin generating stubs manually. JaRoad can tag them { { AfD } } individually if he likes. But I will be making a lot of them, and they will look identical to the bot-created ones. Anna Frodesiak (talk) 01:56, 20 September 2010 (UTC)[reply]
Yet another arbitrary section break
I must also agree with Invertzoo in that we have answered your questions, and agree with Elen of the Roads that demanding JaRoad out himself is inappropriate.
JaRoad: This is an excerpt from some of the text I added and you removed from your talk page. It is relevant here. (You have every right to remove text from your talk. I only wish we had the right to remove offensive text that attacks contributors from your user page.):
I see that you have a strong point of view. Perhaps a more constructive approach would be to neutrally ask other editors what they think, and achieve consensus, and also to gather information before rendering a POV. You arrived with a POV, and do not have a monopoly on the truth. I, as a member of the gastropod project, neutrally made inquiries and asked sensible questions in order to make up my mind. (I was even against the bot during my investigation.)
We have made an all-out effort to respond to your concerns. You, however, (rather single-handedly), have pushed your point of view on such matters as the credibility of WoRMS, the worth of species stubs, the accuracy of the bot's work, and the ability for the project to maintain such stubs. You have yielded or compromised on none of your initial points.
A little research and a few queries would have saved us a lot of time, considering that we have since shown those points of view to be not only inaccurate, but also largely in disagreement with the opinions of the community.
So, to expedite this matter, please respond to the answers we have provided, point by point, telling us whether you accept or reject them, and why. We all want to get back to contributing to the project. Thank you. Anna Frodesiak (talk) 09:19, 20 September 2010 (UTC)[reply]
- I would not invite JaRoad to answer every point we made above, as he has already shown himself capable of finding fault with everything we say about every aspect of what we do on Wikipedia. From what he says on his User page it appears that JaRoad is critical of a large part of the Wikipedia community (emphasis mine): "...an all out attack on me by editors whose primary purpose is not editing articles. This is a big battle on Wikipedia and always the social camp has more time to waste and is more invested in their social life making the encyclopedia writers a poor second. Too bad. Wikipedia would already be great if editing were the priority."
- Asking JaRoad to reveal his previous editing identity and contribution history is certainly not "outing" [10], outing is posting personal information such as RL name, date of birth etc. What I am asking for is simple transparency, so we can see whether or not this person has a problematic history on Wikipedia. As it says on that same WP policy page: "tracking a user's contributions for policy violations [is not harassment]; the contribution logs exist for editorial and behavioral oversight."
- And let's also be quite clear about one thing: decisions on Wikipedia are not made by showing who can hang on the longest and argue the most relentlessly; decisions here are made by assessing consensus. We cannot resolve this ourselves: a new person from Bot Approvals will have to attempt to see if some kind of consensus can be found. To quote from the policy page here [11]: "Sometimes voluntary agreement of all interested editors proves impossible to achieve, and a majority decision must be taken". It is clear that 2 editors (JaRoad and Elen of the Roads) are in favor of extreme sanctions against the Project's past and future use of Ganeshbot. But over the last few weeks and over several different talk pages, numerous other editors who are not part of our project have supported our position, either partially or completely, once the issues involved were properly explained. And numerous editors within the project support our position. We will see what transpires. Thank you. Invertzoo (talk) 13:15, 20 September 2010 (UTC)[reply]
- Invertzoo, can you please moderate your language a little. I am not in favour of extreme sanctions against anything. Ganeshbot needs a sanction (i.e. approval) to act. I am not proposing sanctions (i.e. a restriction) on its previous actions, even though these were without approval. The 15,000 slugs and snails are here - as long as they are not all eating my lettuces, let us delight in them and continue to improve the articles which Ganeshbot created. My concern is only whether Ganeshbot should be sanctioned (i.e. approved) to make any more articles at this time. Opinions do seem to differ as to the relative level of value of these articles in their original form, but with sufficient input I am sure that a consensus will form. Elen of the Roads (talk) 14:28, 20 September 2010 (UTC)[reply]
- Thanks for a more moderate response that is not aligned with JaRoad's extreme position. I would still like to know if you are in favor of transparency on Wikipedia, and whether you still consider asking JaRoad to link to his previous contribution history and talk page history as an IP contributor to be "outing"? I consider it to be a basic policy necessity. It is not possible to know if one can trust the comments of a user (who claims to have been editing for 5 years) if we have no access whatsoever to his contribution history and talk page history before August 15th of this year. When we have a history to look at, we can see if a particular user has accumulated warnings before, or even blocks, and if so based on what kind of behavior. Thank you for your time and consideration. Invertzoo (talk) 15:14, 20 September 2010 (UTC)[reply]
- I would let that line of questioning drop if I were you, as it could easily backfire on you. Take his statements at face value - if you feel you have refuted them, then simply point that out. Attacking the editor is never the way to go. Elen of the Roads (talk) 15:46, 20 September 2010 (UTC)[reply]
Invertzoo has a good point here. This is not outing. I have been stung by trolls before and this fits the pattern exactly. I'm not saying that this is what's happening here, but it's so hard to tell the difference. Knowing his history on Wikipedia would help a great deal. I am curious as to why JaRoad has been silent on this. Why not be forthcoming? It would favour him and give him credibility.
It's hard to take his statements at face value considering his pre-judgement, his tendentious posts, and the statement on his user page. He didn't arrive asking questions. He came with a POV and has stuck to that regardless of new information.
As for the bot being approved, I suggest:
- allowing this bot to produce stubs without limits on numbers
- adding as much generic information to the lede as possible relevant to the group (genus, family, etc)
- monitoring resulting stubs
- reevaluating the worth of the bot based upon the stubs' condition after several months.
Elen, how does this sound to you? You must see how dedicated we are to improving the gastropod project. It is our aim and interest to improve the project, not just to blindly make stubs and walk away. We will fill them with images and content over the years, as will others. Isn't this exactly what Wikipedia is about? Are there specific conditions you would like to see met in order to lettuce :) make these new stubs? Many thanks. Anna Frodesiak (talk) 18:23, 20 September 2010 (UTC)[reply]
- I see no objections at all to this. If it is not approved though, you could ask someone to AWB all these articles. There are some people who are creating 100s of articles with even less info without anyone even noticing, let alone objecting, using AWB. If you let them do these, they will be doing something useful for a change. :) Ruigeroeland (talk) 19:24, 20 September 2010 (UTC)[reply]
- Thank you Ruigeroeland. Invertzoo (talk) 21:12, 20 September 2010 (UTC)[reply]
I'm not the one with the extreme position, that would probably be the editor who disagrees with the value of species stubs. I've not only changed my position slightly, I indicated that change by posting comments and questions and by clarifying specifically that wikipedia has already established that species stubs are considered valuable articles. I suggested the bot make the articles from proposed lists. This was ignored. I asked questions. They were ignored. I offered help, this was insulted and belittled. I changed my position, I read others' posts, I was called a troll and hounded personally in response. How professional of project gastropod. Can't disagree with or address my points? Attack me. Hound me.
I think that project gastropod's unwillingness to compromise and their attacking those who disagree with plans means this bot will be trouble. Any one who raises issues will be insulted for their spelling, called a troll, hounded by Invertzoo. You want unlimited chaos? That could be obtained by giving this bot unlimited approval to create stubs with no community oversight except by project gastropod members.
People have already expressed disagreement about other bots creating species stubs. The issue is sensitive. It requires editors with diplomacy to be able to deal with sensitive issues. Floundering until you settle on demanding someone out themself, calling them a troll, and insulting their English does not speak of the sort of diplomacy and consideration for working with community consensus that should come with a bot with unlimited approval to create stubs. --JaRoad (talk) 05:11, 21 September 2010 (UTC)[reply]
Closing comment
This debate has reached no consensus, and as such the bot approval is defaulting to Denied. There are many reasons for closing this as no consensus, and I’ll explain why I’ve judged that to be the result below.
Firstly, some background may be useful to editors not familiar with this, for future reference. I hope this brief summary will be useful to others, and will help me collect my thoughts. This bot was originally approved to create no more than 600 pages, for the Gastropod WikiProject. A number of concerns were raised even then, and the bot was approved only after a lot of discussion, and with a limit on the speed at which it could create articles. This limit on the speed was later removed. However, following this the bot started creating thousands upon thousands of pages (~15,000), many more than the approved number. The bot was shut down, as it was no longer doing approved edits, as I, and a number of other editors, pointed out. It was decided that to continue editing, the bot would need to be approved via BRfA, and I said that to be approved, it would need community consensus first. The BRfA was submitted, and various discussions with the community took place following this.
Now my comments on actually closing this. Firstly, some of the conduct at this BRfA has been exceedingly poor, with personal attacks and a lack of willingness to work with others. This is true for both some of the supporters and the opposition. I believe this has contributed to a battleground mentality, where rather than trying to work together, users feel this is a win-or-lose situation. Also contributing to this is the absence of compromise: the supporters do not appear to want any limits on this bot, or to accept anything other than having the bot approved fully, as they proposed it. This is also true for some of the opposition, who will apparently accept nothing less than having the bot shut down completely. However, because the supporters are the ones proposing this, they need to work in cooperation with the community to reach a proposal which suits everybody (reasonably). Because this has not been done, there is no consensus.
Commenting on the task itself, there seem to be mixed feelings from the community on bot-created stubs. Some are opposed, partly due to previous bad results from such tasks, as well as the nature of this WikiProject. Since it is a relatively small project on a niche topic, there are understandable concerns that the project will struggle to maintain 25,000 articles. The project’s arguments that it can maintain this number of articles are unconvincing, since they speak of having, in the past, maintained very few articles over a very long period of time, which does not prove that the project can “keep on top” of 25,000 articles. Judging by discussions in forums other than this one, however, there does seem to be some broad community support for bot-generated stubs in general. However, few of the comments by users at those forums seem relevant to this particular bot task.
How to move forward
If the project still wishes to move forward with this bot, I would suggest forming a consensus before submitting another BRfA, using a request for comment or other appropriate venue, as BRfA isn’t particularly suited to building consensus. There has been some suggestion of running this task even without approval, and I would strongly recommend that this is not done. Using automated, or even semi-automated (such as AWB), methods for content creation requires approval from BAG, which you do not currently have. Fully manual creation of these pages is, of course, permitted. But I would suggest that rather than doing this, the project works on keeping the pages it currently has up to date, to help convince others that you are capable of maintaining these articles, rather than simply creating more pages which there is no evidence that you are maintaining. In future discussions I would remind everybody to stay cool, to listen to each other’s arguments, and to compromise. It is the lack of compromise and cooperation that has led to a lack of consensus, and, subsequently, this being closed as no consensus - Kingpin13 (talk) 08:16, 14 October 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.