In which I download databases.
I've been having CPU load errors of late. I don't know why this is - something to do with SQL requests - but it's very frustrating. I can only assume that something is being over-zealous with its SQL calls. Therefore, I've removed any plugins that weren't immediately needed, including things like the calendar (which to my mind is just 'cruft' - who cares if a post went out on the 3rd of August? Aren't people more likely to use the search box?).
One thing I found was that the database backup script ran on both murky.org and the backup domain simultaneously - this was killed with extreme prejudice!
Plan for the future:
- Continue to tag old posts.
- Remove/Reduce categories so that I have a few main categories only
- Work on a new theme, built from scratch - I have my plans, but this isn't a quick thing.
- In the age of decent search engines, do I really need archives which are date based, category based etc? Should I not redirect date based links to some other archive, allowing for fewer cached pages?
It's possible that one cause of CPU load is the tag editor I'm using behind the scenes - it accesses many posts at once - so it's possible that the CPU load thing might sort itself out once this task is done - or when I limit the number of posts edited at once!
It could be that there's something much more fundamental and the host itself is just not up to the task of running a low-traffic site which has WP SuperCache on it. In which case, I may have to change hosts. Up until the last few days, this hasn't been a problem though, so hopefully removing a few database lookups, and tagging fewer old posts at once might make things play nice again.
Either way, Argh! Don't just tell me that there is a high load, tell me which scripts are responsible!
'the NAO wanted only limited child benefit records but was told in an e-mail from a senior business manager in March that to remove more sensitive information was too costly and complex.'
Please correct me if I'm wrong, but this should be trivial for any well set up system. In the commercial FileMaker system, one can choose which records to export. If, as is more likely, the database was SQL-based, one could make a copy of the database and then drop unwanted tables or fields. For anyone managing an IT system, this should have been trivial.
Someone who is responsible for decisions on such a large and costly database really should be able to manipulate that data easily.
For the record, the SQL syntax (after about 15 secs of research) is:
ALTER TABLE table_name DROP COLUMN column_name;
DROP TABLE table_name;
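To make the copy-then-strip approach concrete, here's a minimal sketch. All the names (database, table, column) are my own invention - the real HMRC schema obviously isn't public - but the shape of the job is only a few statements:

```sql
-- Hypothetical names throughout; the real schema isn't known.
-- Copy the table, then strip the sensitive fields from the copy:
CREATE TABLE nao_export AS SELECT * FROM child_benefit_claims;
ALTER TABLE nao_export DROP COLUMN bank_account_no;
ALTER TABLE nao_export DROP COLUMN address;
-- ...then dump nao_export and hand that over, leaving the original untouched.
```

The original data is never modified; you work on a disposable copy and export that.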
Methinks the 'it was too expensive' excuse is just so much baloney.
Update: qwghlm has a post on this too.
I've been looking for a decent application to manage a DVD collection. Korenwolf did mention something a while back (I forget the name - and I haven't got around to asking) - but if memory serves it ran on a particular computer. This is what I'd like to see:
- Secure login for multiple users
- Enter the ISBN/barcode; things like the DVD cover, actors, title, genre and so on are pulled from online sources and autofilled.
- The info should be user editable.
- There should be options to make each entry 'public', 'friends only', 'private'. This should be over-rideable on a case by case basis
- Each user should have a public facing webpage, generated automatically, showing entries with the appropriate privacy level (e.g. public only to un-logged in users)
- It should be possible to tie two or more user accounts together, so that each sees entries made in the other account as their own - unless that entry is marked 'do not share'. This'd allow a user to record DVDs that are in the house, as well as DVDs that might be elsewhere (e.g. at work).
- There should be a field for where the item is usually stored. This would be from an editable list and selected with a mouseclick (i.e. prepare a list that says 'on shelf under TV, on bookshelf in front room, whatever' and then just click)
- There should be a field detailing if the item is loaned out, and when it was loaned
- There should be a 'comment-like' section for recording impressions if the DVD is watched.
- Only the owner (and any partnered accounts) should see these fields
- The webpage should be 'clean', uncluttered, and configurable. There should be several modules which could be moved around on a per-user basis, e.g. a module which shows the last DVD watched, a module which shows who has the DVD, a module that keeps track of unwatched DVDs (minus any loaned DVDs), a straightforward list sorted by title, by genre, a module that lists loaned out DVDs (by date of loan), a module that lists DVDs that are in the 'to watch' queue and so on
- There should be two ratings fields, one for user rating, one for public rating (e.g. pulled from imdb)
- There should be text based feeds, i.e. click for a brief list, alphabetically sorted, of all DVDs; DVDs sorted by genre; DVDs in a particular genre; DVDs on loan; DVDs sorted by storage location and so on.
- Ideally, it should be a pretty standard server app, utilising php/mysql or similar
Essentially, a decent application for managing DVDs (and/or CDs, and/or books), not only managing WHAT those resources are, but keeping track of WHERE they are. This'd be really useful for me as quite often I'll see a DVD on offer and think 'do I have that, or did I just plan on getting that?' - sometimes I see this when away from home, but not away from a computer - this'd allow a quick check.
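For anyone tempted to build this, most of the wishlist above boils down to a handful of tables. A rough sketch in MySQL terms (every table and column name here is invented for illustration, not taken from any real app):

```sql
-- Illustrative schema only; all names are hypothetical.
CREATE TABLE users (
    id         INT PRIMARY KEY AUTO_INCREMENT,
    username   VARCHAR(50) NOT NULL UNIQUE,
    partner_id INT NULL                    -- tied account, if any
);

CREATE TABLE locations (
    id       INT PRIMARY KEY AUTO_INCREMENT,
    owner_id INT NOT NULL,                 -- each user edits their own list
    name     VARCHAR(100) NOT NULL         -- e.g. 'on shelf under TV'
);

CREATE TABLE dvds (
    id            INT PRIMARY KEY AUTO_INCREMENT,
    owner_id      INT NOT NULL,
    title         VARCHAR(200) NOT NULL,
    genre         VARCHAR(50),
    privacy       ENUM('public','friends','private') DEFAULT 'public',
    shared        BOOLEAN DEFAULT TRUE,    -- FALSE = 'do not share' with partner
    location_id   INT NULL,
    loaned_to     VARCHAR(100) NULL,       -- NULL = not on loan
    loaned_on     DATE NULL,
    user_rating   TINYINT NULL,            -- owner's score
    public_rating DECIMAL(3,1) NULL        -- e.g. pulled from imdb
);
```

With something like this, the privacy and sharing rules become simple WHERE clauses, and the 'loaned out' and 'storage location' modules are just different views over one table.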
Does such a thing exist? (There is dvdspot.com - but there I'd be committing data to a 3rd party, and having the ability to make backups is a good thing)
Does anyone with the skills feel like creating the sourceforge project to make this happen? (fx: hopeful grin)
Edit: Okay, I've succumbed to DVDspot - but I'd still prefer an 'own install' for various reasons. So, let's add to the above list 'will import and export data in DVDspot format'.
With thanks to Andy Budd's Page, I have finally worked out how to do decent backup/restores.
I created a text file called sqlbackup; in the file is this:

#!/bin/sh
# echo start message
echo "Backup Script Processing"

# navigate to backup dir
if cd ~/backup.sql/latest
then
    echo "...successfully navigated to backup dir"
else
    echo "could not locate backup directory"
    exit 1
fi

# echo message
echo "exporting SQL dump"

# dump the db into a .sql file
if mysqldump --user=SQLUSERNAME --password=SQLPASSWORD SQLBLOGNAME --opt | gzip -c > backup-mt.sql.gz
then
    # echo success message
    echo "SQL dump successful"
    ls -la
else
    # echo error message
    echo "mysqldump error"
    exit 1
fi

(Note I exit with a non-zero status on failure, so cron and other callers can tell something went wrong.) My crontab has:

# Backup MySQL
5 0 * * * /home/murk1e58/sqlbackup >> ~/error.log 2>&1
56 23 * * * cp ~/backup.sql/latest/backup-mt.sql.gz ~/backup.sql/daily/$(date +\%A).sql.gz
58 23 28 * * cp ~/backup.sql/latest/backup-mt.sql.gz ~/backup.sql/monthly/$(date +\%B).sql.gz

To restore the backup, one types:
gunzip backup.sql.gz
mysql --user=username --password=password sqldatabasename < backup.sql
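As a small refinement, those two steps can be combined so the uncompressed copy never touches the disk (same placeholder credentials as above):

```shell
# Stream-decompress and pipe straight into mysql; backup.sql.gz is untouched.
gunzip -c backup.sql.gz | mysql --user=username --password=password sqldatabasename
```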
All good stuff!
Of course, what I really need is a completely separate server, with similar features to this one (cpanel, command line etc) - then I could send the backup to it automatically and have it mirror this server!
In the short term, I would like to work out how to email the resulting file..... any ideas?
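One possible route - and this assumes the host has mutt installed, which many shared hosts do, whereas plain 'mail' often can't handle attachments - would be a line like this after the dump completes:

```shell
# Hypothetical recipient address; mutt's -a flag attaches a file,
# and -- separates attachments from the address list.
mutt -s "Nightly SQL backup" -a ~/backup.sql/latest/backup-mt.sql.gz -- backups@example.com < /dev/null
```

If mutt isn't available, the old uuencode-piped-into-mail trick is another option, though the result is uglier at the receiving end.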