Revision as of 18:34, 18 December 2005
Please drop a line on the talk page with feedback or comments on this page
What for?
Perhaps the Wikipedia/MediaWiki developers will take care of things in the next version of MediaWiki, but in the meantime smaller wikis are being blasted with spam. This wears wiki maintainers down and stifles creativity and constructiveness, since most of their energy goes into repeatedly rolling back pages to non-spammed versions and blocking IPs, which is ineffective because the attacks come from multiple IPs. This wiki page was created to help those folks who are looking to control the amount of spam on their wiki.
Requirements
SpamBlacklist
Backup before you Upgrade
Making a backup of a MediaWiki installation is basically a three-step process: copying the regular files, dumping the database, and sending both to a remote backup location.
Copy "w" directory
If you follow the standard Wikipedia way of hiding "index.php" in URLs and your web server's document root is /var/www/, the wiki will be physically installed in /var/www/w, with an alias to /var/www/wiki in your Apache config. Hence, something like:
cp -r /var/www/w /home/backup/w_20051223
would be sufficient.
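The date-stamped names used throughout this page can also be generated automatically, which keeps backups sorted chronologically and avoids typos. A minimal sketch (the date format string and variable names are just one way to do it; adjust the paths to your installation):

```shell
# Build date-stamped backup names like the ones above (e.g. w_20051223)
stamp=$(date +%Y%m%d)
dir_backup="/home/backup/w_${stamp}"
db_backup="wikidb_${stamp}.sql"
echo "$dir_backup"
echo "$db_backup"
```

You can then use these variables in the cp and mysqldump commands instead of typing the date by hand.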
Dump Database
To make a dump of the MySQL database, use the "mysqldump" command on a console. The -p switch makes it prompt for the MySQL root password.
mysqldump -u root -p wikidb > wikidb_20051223.sql
To save disk space and bandwidth you can now compress the dump file, e.g. as a .tar.gz archive:
tar zcvf wikidb_20051223.sql.tar.gz wikidb_20051223.sql
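Before deleting the uncompressed dump, it is worth checking that the archive is actually readable. A self-contained sketch of the idea, run in /tmp with a stand-in file (substitute your real dump for the echo line):

```shell
cd /tmp
# Stand-in for the real dump file (substitute your wikidb_YYYYMMDD.sql)
echo "-- MySQL dump stand-in" > wikidb_20051223.sql
tar zcf wikidb_20051223.sql.tar.gz wikidb_20051223.sql
# "tar tzf" lists the archive contents; a zero exit status means the
# archive is readable, so only then remove the uncompressed original
tar tzf wikidb_20051223.sql.tar.gz > /dev/null && rm wikidb_20051223.sql
```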
Copy to remote location
Finally, copy the files to a remote server, e.g. via scp.
scp wikidb_20051223.sql.tar.gz user@backupserver.com:/home/user/backups/
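scp already verifies the transfer itself, but for long-term backups it can be useful to store a checksum next to the archive so that later corruption can be detected. A sketch using a stand-in file in /tmp (the .md5 file name is just a convention; copy both files to the backup server):

```shell
cd /tmp
# Stand-in for the real archive produced by the tar command above
echo "archive contents" > wikidb_20051223.sql.tar.gz
# Record a checksum next to the archive before transferring it
md5sum wikidb_20051223.sql.tar.gz > wikidb_20051223.sql.tar.gz.md5
# Later (or on the remote end): exits non-zero if the file was corrupted
md5sum -c wikidb_20051223.sql.tar.gz.md5
```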