Revision as of 07:50, 19 March 2007
This is a ToDo list for Wikistats
<tasks>
[!] Check [http://wikiindex.org/index.php?title=Category:MediaWiki], [http://www.aboutus.org/Category:Main_Page], [http://www.mediawiki.org/wiki/Sites_using_MediaWiki], [http://www.google.com/search?q=inurl:Special:Statistics] for recent additions that haven't been added here yet. '''will always stay a task for all'''
[1] Can http://atwiki.com/ be parsed? Is there a list of all? | Only the ones in the iframes on the homepage. Create a table and visit them often to amend them gradually.
[1] wiki-site.com: Missing: fully automatic wiki, xml Wikitax, Entries to Index | look how many links failed, even though they come from the hostlist
[x] One single query to include in largest_html, largest_wiki, largest_csv, largest_ssv; added missing ones to csv, ssv, wiki .. only xml still has a different query
[x] Apply Uncyclo||Wikia dupe elimination at Largest to Wikitax and other output formats
[1] Accelerate MediaWiki parsing | in progress, optimizing various update scripts
[1] [[User talk:MattisManzel#List of Largest Non-Mediawikis]] (MattisManzel)
[1] gratis-wiki.com: Missing: fully automatic wiki, largest_xml
[1] [http://developer.berlios.de/projects/wikixray/ WikiXray] is now available!
[x] Delete [http://s23.org/wikistats/wikia_html.php?sort=ts_asc&th=0&lines=6 These] Wikias
[2] Fully automatic [[List_of_Wikimedia_projects]]
[2] Fully automatic [[List_of_largest_Mediawikis]]
[2] Update sources on Wikistats
[3] [[meta:User:mutante/Wikistats#Wikimedia-Portalseiten]]
[3] Keep track of Native Language Links
[3] Keep track of Angela
</tasks>
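The "Uncyclo||Wikia dupe elimination" task above could look roughly like the sketch below: wikis hosted on Wikia may also turn up in the generic MediaWiki list, so the combined "Largest" output should count each URL only once. Function names and sample data here are illustrative assumptions, not the real Wikistats code.

```python
def normalize(url):
    """Reduce a wiki URL to a comparable key (scheme and trailing slash stripped)."""
    return url.lower().removeprefix("https://").removeprefix("http://").rstrip("/")

def eliminate_dupes(mediawikis, wikia):
    """Drop entries from the generic list that are already counted as Wikia wikis."""
    wikia_keys = {normalize(w["url"]) for w in wikia}
    return [w for w in mediawikis if normalize(w["url"]) not in wikia_keys]

# Made-up sample rows standing in for the real tables:
mediawikis = [
    {"url": "http://uncyclopedia.wikia.com/", "articles": 25000},
    {"url": "http://example-wiki.org/", "articles": 1200},
]
wikia = [{"url": "http://uncyclopedia.wikia.com/", "articles": 25000}]

# The merged "Largest" list keeps the Wikia copy and drops the duplicate.
largest = eliminate_dupes(mediawikis, wikia) + wikia
```

The same dedupe pass would then feed every output format (html, wiki, csv, ssv, xml) so they stay consistent.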
== Hostlists for wiki-site.com Table ==
* http://www.wiki.co.il/active-wiki-en.html OK
* http://www.wiki.co.il/active-wiki-es.html OK
* http://www.wiki.co.il/active-wiki-it.html OK
* http://www.wiki.co.il/active-wiki-de.html OK
* http://www.wiki.co.il/active-wiki-he.html OK
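Turning the hostlist pages above into table rows could be sketched as follows. The HTML structure assumed here (plain <code>&lt;a href&gt;</code> links to each hosted wiki) is a guess for illustration; the real importer is not shown.

```python
import re

def extract_hosts(html):
    """Pull all linked wiki URLs out of a hostlist page."""
    return re.findall(r'href="(https?://[^"]+)"', html)

# Stand-in for a fetched active-wiki-*.html page:
sample = (
    '<a href="http://en.someviki.wiki-site.com/">someviki</a>\n'
    '<a href="http://en.otherwiki.wiki-site.com/">otherwiki</a>'
)
hosts = extract_hosts(sample)
```

Visiting the lists regularly and diffing the extracted URLs against the existing table would cover the "amend them gradually" part of the task.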
== Add wikis ==
http://s23.org/wikistats/addwiki/ also checks for dupes
== Failed parsing ==

== Error Codes ==
== See also ==
About the Task Extension being used: [[meta:Tasks Extension]], [[Task Extension]]
== Latest Additions ==
<wikistats type="mw-latest">100</wikistats>