I found a laser printer on eBay for £15 (£25 with P+P), an HP LaserJet 1320tn, if you're interested.
With my ever-increasing workload and paperwork I end up printing every day. This is why I got a laser printer; inkjet printers have driven me mad! Ink always running out, whole documents printed with lines through them, messing everything up they possibly can, not to mention how slow they are at printing.
So if you do a lot of printing, get a laser printer. If you're thinking about buying a printer, spend the extra and get a laser. I got a toner cartridge for £9.99 delivered that will print 6,500 pages! (Works out at about £0.0015 in toner per page.)
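The per-page figure is just the cartridge price divided by the rated page yield; a quick sketch of the sum:

```shell
# Toner cost per page: £9.99 cartridge rated for 6,500 pages
awk 'BEGIN { printf "£%.6f per page\n", 9.99 / 6500 }'
# prints: £0.001537 per page
```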
You may say it can only print black; this has not been a problem, and I will invest in a colour one soon (there is a large price difference for this).
I have set up a daily backup of my main Linode server. My goals:
- Completely automated
- Archive the backups
- As little traffic as possible
Using my Ubuntu home file server this has all been made possible (three 2TB drives in RAID 5).
1. No Password logins
Add a user to perform remote logins and rsync on the remote server.
For now give it a password (we will delete this later).
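A minimal sketch of that step on the remote (Ubuntu) server, assuming the backup user is called rsync to match the passwd -d rsync command later in the post:

```shell
# On the remote server, as root: create a dedicated backup user.
# "rsync" is just the username assumed here - pick any name you like.
adduser rsync   # prompts for a temporary password; set one for now
```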
Back on the local machine we need to generate an SSH key. Make sure you are the user you want to perform the backup with. (Press enter through all of the prompts.)
ssh-keygen -t rsa
To make life a little easier (especially if you're not running on the default port 22) we will make an SSH config file (~/.ssh/config).
Put your host entry in, then save and close.
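A typical ~/.ssh/config entry looks like this (the hostname, user and port here are placeholder values, so swap in your own):

```
Host backupservername
    HostName 203.0.113.10   # remote server's address (placeholder)
    User rsync              # the backup user created earlier
    Port 2222               # only needed if you are not on port 22
```

With this in place, plain `ssh backupservername` picks up the right address, user and port automatically.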
Now copy the public key to the remote server
ssh-copy-id -i backupservername
Now we can test if you can login without a password
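Using the host alias set up earlier, the test is just (the first login may ask you to confirm the host fingerprint):

```shell
# Should drop you into a shell on the remote server with no password prompt
ssh backupservername
```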
If that has worked, we can now remove the password:
passwd -d rsync
And confirm the settings in /etc/ssh/sshd_config.
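Since the account now has no password, it is worth making sure sshd only accepts keys. A sensible set of lines to confirm (these are standard OpenSSH options, not taken from the post):

```
PubkeyAuthentication yes
PasswordAuthentication no
PermitEmptyPasswords no
```

Restart sshd after editing (`service ssh restart` on Ubuntu) for the changes to take effect.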
This concludes part one; in part two we will run the rsync command and then set up archiving of the backups.
For an upcoming project I am working on, we have been experimenting with Gearman and MySQL. We are using Gearman to process lots of data in the background, giving us the ability to use multiple servers, which allows for easy scalability. Too many jobs in the queue = add more workers.
Whilst developing this we decided to test out interacting with a database through Gearman. This has its pros and cons.
Pros:
- No constant opening and closing of database connections
- Easy to add multiple MySQL servers at a later date
- Multi-threading of long, complex statements
- Easy backup system
- Background MySQL commands

Cons:
- Gearman latency time
- Connection limit = number of workers
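As a quick illustration of the worker/queue idea (not our actual code), the gearman command-line tool from the gearman-tools package can act as a worker or a client against a running gearmand; the function name wc here is arbitrary:

```shell
# Terminal 1: a worker that runs "wc -l" on each job it receives
gearman -w -f wc -- wc -l

# Terminal 2: a client that submits a job and waits for the result
gearman -f wc < /etc/passwd

# Too many jobs queued? Start more workers (terminals 3, 4, ...):
gearman -w -f wc -- wc -l
```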
Today, out of nowhere, my mail server stopped allowing outgoing mail through Postfix. Dovecot was working but Postfix was not (both use the same database for logins).
warning: SASL authentication failure: cannot connect to saslauthd server: Connection refused
warning: SASL authentication failure: Password verification failed
warning: hostname[ip]: SASL PLAIN authentication failed: generic failure
Restarting SASL fixed this problem.
service saslauthd restart
* Stopping SASL Authentication Daemon saslauthd [ OK ]
* Starting SASL Authentication Daemon saslauthd
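If this happens again, you can check saslauthd directly before blaming Postfix; testsaslauthd ships with the sasl2-bin package (the username and password here are placeholders):

```shell
# Succeeds with "0: OK ..." if saslauthd is up and the credentials check out
testsaslauthd -u someuser -p somepassword
```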
I have always loved using multiple monitors with my Windows PC. After recently getting a Mac Pro (not Retina) I was limited to using two (the Mac screen and an external display). ScreenRecycler is a program that allows you to extend your desktop to any computer (Mac/Windows/Linux).
The software works by using VNC over a network, so a good wired network is advised to avoid lag. I have recently converted my switch to all-gigabit. (Over WiFi it is almost unusable.)
You can trial ScreenRecycler for 20 minutes per connection, which is perfect for testing whether you like the setup.
Over gigabit you can't notice any real lag other than when you move windows about. I have watched iPlayer through it, only realising afterwards that the display was running over the network.
(If you are a student you can apply for a discount.)
Price: £20 (ish, converted from $29.90)
Recommended Windows VNC client: VNC Navigator
So the nice people at http://apigee.com/ sent me some API stickers. They posted them from the USA, which can't really be that cheap!
It's got me thinking how much I take APIs for granted when coding. Almost every service out there has an API in one form or another.
Twitter's API is unbelievably simple to use, and with Abraham's twitteroauth PHP library you can get online within minutes.
Facebook's API is something I have touched in the past; I need to have another play with it. They have also moved on to OAuth and thankfully got rid of Facebook Markup Language!
The PayPal API is a joke! It is not an easy task to get your head round, and applying for extended permissions is another effort.
Hosting company APIs: normally this is something you have to code by hand, working it out from their documentation. Once implemented it's a winner; the power it gives you is amazing.
Our own API: currently a friend and I are building our own API system, and we may use it to power all future websites as it adds more flexibility.
An article on how ownCloud 4's data encryption is unsafe to rely on, coming out just days after it was released: http://owncloud.org/owncloud-4-release-annoucement/ . I personally feel this feature should have been tested more before being released to the public. The problem now falls on all users who downloaded and installed ownCloud 4, many of whom will not be aware of this issue.
A very good post by Copyblogger on how to get more benefit from your blog.
A good selection of tips for bloggers.
Every week I am going to do a round-up of my favourite Twitter links (articles, pictures and videos).
Do you ever get the error that WordPress can’t upload files?
Unable to create directory /wordpress/wp-content/uploads/2012/03. Is its parent directory writable by the server?
An easy fix is to give the web server user (www-data on Ubuntu/Debian) ownership of the uploads directory:
sudo chown -R www-data:www-data /wordpress/wp-content/uploads/