Talk:Worst cases

I think Wikipedia is only a compromise. There are better solutions. The basic problem I see here is that Wikipedia is centered on a single storage and naming scheme. The end-user platform and the data engine are not cleanly separated, and the system does not integrate naturally with homepages, discussion forums, etc.

Better solution: make the database engine abstract. Make it a global filesystem in which it is only a secondary matter where articles are stored: publishers provide space for their articles themselves until the articles are copied and authorized by one wiki admin or another. A Wikipedian would then select from a number of articles, derive from them, rewrite them, and finally store them on the Wikipedia servers, including references to the original works.

The underlying system would provide linkage information. Each user stores their files on their own disks and announces them via central indexing servers. This amounts to structured search information, which should in some respects be superior to what Google offers.
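
To make this concrete, here is a minimal sketch of the announcement step. I assume the global name is simply a content hash; the index endpoint and the record format are invented for illustration only.

<syntaxhighlight lang="python">
import hashlib
import json
import urllib.request

# Hypothetical index-server endpoint; a real system would define its own.
INDEX_URL = "http://index.example.org/announce"

def global_name(data: bytes) -> str:
    """Name a document by its content hash, so the name is location-independent."""
    return hashlib.sha256(data).hexdigest()

def announce(path: str, parents: list, hosted_at: str) -> str:
    """Publish structured metadata about a locally hosted file to the index.

    The file itself stays on the publisher's own disk; only its global
    name, its provenance (the works it derives from) and its current
    location are sent to the indexing server.
    """
    with open(path, "rb") as f:
        data = f.read()
    record = {
        "name": global_name(data),
        "parents": parents,      # references to the original works
        "hosted_at": hosted_at,  # where the file can actually be fetched
    }
    req = urllib.request.Request(
        INDEX_URL,
        data=json.dumps(record).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
    return record["name"]
</syntaxhighlight>

Retrieval would then go the other way: ask the index for records whose parents include a given name, and fetch the files from wherever they happen to be hosted.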

I could elaborate on that some more. Look at online discussion forums, for example. If you write there, administrators can remove your post. Instead, it could be set up more freely: the globally referenced filesystem would provide the data, and the maintainers of a website could include a discussion thread in their forums and administer it by publishing signed blacklists, which they may use to select the articles imported into their conventional web server. Everyone may then freely choose whether they want the regulation 'enforced' by the website/forum admins or not: the client would select the displayed threads and responses according to the user's own rules.
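
A sketch of that client-side selection, with the signature check stubbed out and the data structures invented for illustration (a real client would verify the admins' cryptographic signatures before trusting a blacklist):

<syntaxhighlight lang="python">
def verify_signature(blacklist: dict) -> bool:
    """Stub: a real client would check the admins' signature here."""
    return True

def visible_posts(posts, blacklists, honour_admins: bool):
    """Return the posts this user wants displayed.

    posts         -- iterable of dicts, each with a globally unique 'name'
    blacklists    -- signed blacklists published by the forum admins
    honour_admins -- the user's own choice whether to apply them
    """
    banned = set()
    if honour_admins:
        for bl in blacklists:
            if verify_signature(bl):
                banned.update(bl["names"])
    return [p for p in posts if p["name"] not in banned]
</syntaxhighlight>

The point is that the same thread data can be rendered under the admins' rules or under none: the filtering happens entirely in the client.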

That could be carried over to websites in general. Think of Microsoft products: you could comment on them 'directly on their website', even though Microsoft would never allow you to do so, because your client could look up references (trackbacks?) on some simple indexing servers.

Implementing a free and globally referenced filesystem would also make it possible to introduce automatic support for copy licenses. It would allow for easy data replication, and thereby backups and file synchronization among your PCs at work, at home, etc., since there is no need for global filesystem consistency. Every document points to its previous versions, so conflicts arising from editing the same document twice at the same time would not be critical: a tool could later detect the 'branching' of the revisions.
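
A sketch of that branch detection, assuming each revision simply records the names of its parent versions:

<syntaxhighlight lang="python">
from collections import defaultdict

def find_branches(revisions):
    """Flag documents that were edited twice 'at the same time'.

    Each revision is a dict with a unique 'name' and a list of 'parents'
    (the versions it was derived from). Two revisions sharing a parent
    mean the history branched; a merge tool can then reconcile them.
    """
    children = defaultdict(list)
    for rev in revisions:
        for parent in rev["parents"]:
            children[parent].append(rev["name"])
    return {p: kids for p, kids in children.items() if len(kids) > 1}

# Example: 'b' and 'c' both derive from 'a', so the history branched at 'a'.
revs = [
    {"name": "a", "parents": []},
    {"name": "b", "parents": ["a"]},
    {"name": "c", "parents": ["a"]},
]
print(find_branches(revs))  # {'a': ['b', 'c']}
</syntaxhighlight>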

"Two worlds", "Wikipedia is effectively dead."[edit]

"ther[e] projects (with or without the GFDL) copy the projects' most effective attributes and ignore its mistakes."

If this happens with the GFDL, there is no worst case. This is complete success. The organizational structure has no significance, other than being a candidate for Darwinian selection (like any other organization). Another project that uses the GFDL can preserve the content completely. 69.107.82.121 04:13, 24 February 2006 (UTC)

Western imperialism

It took me a while to think beyond my spell checker, but eventually I supported the British spelling of tyre in http://en.wikipedia.org/wiki/Car_handling. That was on the more or less technical grounds that most of the innovations in that field have come from Britain. A German colleague once told me, "English would make a good world language if we could only get the Americans to speak it." 69.107.82.121 05:03, 24 February 2006 (UTC)

I heard that an American military officer actually made the obvious and fatal mistake of calling America's involvement in the Middle East a "crusade" and got fired for it. 69.107.82.121 05:03, 24 February 2006 (UTC)

Wikipedia goes on growing, but non-American editors' participation drops. Wikipedia becomes widely regarded as another US propaganda tool, and ends up a biased resource read only by American and pro-American users.

Thank goodness we now have Conservapedia to draw attention.--Orthologist 16:23, 24 May 2007 (UTC)

Regarding the need for better version comparisons between Wikipedia entries

Unfortunately, I think that the Wikipedia article on Opium is a good representative of a worst-case scenario that I am seeing more frequently on Wikipedia. The article's current draft is in many ways inferior to a draft from one year ago. For instance, the references, external links, and many of the See Also links have been lost during the last 744 edits. In fact, over more than 1600 total edits, the article has never acquired a lead section, and it retains unsourced statements and a haphazard organization imposed during the first few revisions. The lack of top-down organization can be remedied by a committed editor, but the loss of information over time is more disturbing.

The obvious fix is to use the "diff" tool to pick out the lost facts, but even though the organization is mostly the same, the "diff" tool does not properly align many of the main section headings. Nor does it seem that it can be used effectively to compare a single section as it is edited. And of course it won't tell you what was lost unless you look. It seems to me that the development of the article is generally sequential, one change after another, and any loss not rapidly reverted is permanent. In biological terms, Wikipedia articles are asexual organisms, which accumulate deleterious mutations that they cannot shed by recombination. Some asexual organisms do very well, for a time, but for Wikipedia to remain evolvable and successful, I think it needs a much stronger set of tools to allow the "hybridization" of different versions of the same article, or even of different but related articles. Certainly the effectiveness of DNA sequence alignment tools tells me that there is vast room for improvement over "diff" - it should be possible to pick out even a single sentence with a few related phrases for comparison, let alone entire sections. But Wikipedia also needs tools to allow the actual transfer of those sentences, one by one, from one draft to the other, without the editor needing to hand-check whether they have first been moved somewhere else in the document. Automatic homology searches should be made between current and old versions of articles, and old versions scoring high for unique content should be presented in a list for editors to rate and draw from.
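
To illustrate the kind of tool I mean, here is a minimal sketch that uses Python's standard difflib as a stand-in for a real sequence-alignment method. It flags sentences of an old revision that have no close match anywhere in the current revision, so a sentence that was merely moved to another section does not count as lost; the 0.8 similarity threshold is an arbitrary choice.

<syntaxhighlight lang="python">
import difflib
import re

def sentences(text: str) -> list:
    """Crude sentence splitter; good enough for a sketch."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def lost_sentences(old_text: str, new_text: str, threshold: float = 0.8) -> list:
    """Sentences of the old revision with no close match in the new one.

    Unlike a line-based diff, this compares each old sentence against the
    whole new revision, so moved material is not reported as a loss.
    """
    new_sents = sentences(new_text)
    lost = []
    for old in sentences(old_text):
        best = max(
            (difflib.SequenceMatcher(None, old, s).ratio() for s in new_sents),
            default=0.0,
        )
        if best < threshold:
            lost.append(old)
    return lost

old = "Opium contains morphine. It was traded along the Silk Road. See also: laudanum."
new = "It was traded along the Silk Road. Opium is a latex."
print(lost_sentences(old, new))
# ['Opium contains morphine.', 'See also: laudanum.']
</syntaxhighlight>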

Without these changes, I fear that there will be an ever-greater feeling that editing Wikipedia articles is worthless - that any change you make will only be lost, sooner or later, and that the quality of articles never really gets any better. The only exception to this will be articles actually written from beginning to end by single editors, but if they lose information from previous small edits then much of the advantage of a collaborative encyclopedia is lost. Mike Serfas 17:33, 5 April 2007 (UTC)

Even when it is too much to follow all the changes in an article I am interested in, I return to it from time to time, and if I notice the deletion of a valuable part, I search the edit summaries for a reason; if there is no satisfactory reason, I restore the deleted part.
Separate histories for sections would be useful. That can be achieved by making the sections separate pages and transcluding them (e.g. a section kept at Article/Section and included with {{:Article/Section}}).--Patrick 21:54, 5 April 2007 (UTC)
All these technical suggestions are exciting, as befits this illustrious group. Have the suggestions gone through the discussion-proposal route since 2007? Or has this topic been beaten dead with a "safe" American depleted-uranium-loaded horse yet? Wikipedia:User:Hilarleo- a/k/a 69.228.233.70 17:05, 23 February 2009 (UTC)