So I’ve had this problem with Windows Live Messenger since its release.
In WLM 2011 (version 15.4) they introduced a new feature that integrates with Facebook, which I initially didn’t like, but now I do! The only problem was that the feature that shows Facebook galleries within WLM always failed with Error code: 0x80042f0f after displaying “Loading album via Facebook”.
After many months of it bugging me, I sorta gave up. Until now.
Whilst on holiday in South Australia, I noticed that my laptop did not exhibit the problem (even though it had when I tried it at home).
So obviously it was a problem with my internet connection (I used mobile broadband on holiday).
Today, I ran Wireshark on my LAN connection while reproducing the error, and I managed to find the conversation, which contained some interesting information. The page returned behind that error code was actually an HTTP 417 (Expectation Failed) response; a quick Google got me to this page.
Simply put, Windows Live Messenger is asking for a specific response to a header it sends. WLM sends an Expect: 100-continue header and waits for confirmation (HTTP 100 Continue) before uploading the rest of the request, but my Squid proxy (on my IPFire box) isn’t playing nice: instead of the confirmation it sends back an error, so WLM doesn’t get what it expects and fails.
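As a rough sketch of that exchange (the URL and size here are made up for illustration):

```
POST /album/upload HTTP/1.1
Host: api.example.com
Expect: 100-continue
Content-Length: 1048576

    (client pauses here, waiting for “HTTP/1.1 100 Continue”…)

HTTP/1.1 417 Expectation Failed    <- what the proxy sends instead
```

The client never gets its 100 Continue, so it gives up rather than sending the request body.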
So to fix it, we need to add a directive to the squid.conf file (‘/var/ipfire/proxy’ in IPFire).
Although you should really put it in ‘/var/ipfire/proxy/advanced/acls/include.acl’ and restart the proxy from the web interface; if you edit squid.conf directly, your change will be overwritten the next time you save anything in the web configuration page.
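If I recall the Squid option correctly, the directive in question is ignore_expect_100 (I’m going from Squid’s documented option list here; double-check it against the Squid version your IPFire build ships):

```
# tell Squid to silently ignore the Expect: 100-continue header
# instead of answering with 417 Expectation Failed
ignore_expect_100 on
```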
This tells Squid to ignore the Expect header, so it no longer sends the error, and BAM, it’s working!
I only updated this a while ago; being a router it didn’t need much in the way of power, just reliability, so it’s mainly made of old parts.
Generic desktop ATX (horizontal) case, ‘modified’ to *erhem* fit the power supply =D.
ASUS M2N SLI Deluxe
AMD Sempron 140 2.8GHz (underclocked to 1GHz) 45w Single Core, unlocked to Dual Core with NVIDIA Unleashed mode. Shows as AMD Athlon(tm) II X2 4400e.
Western Digital Caviar SE 160GB | SATAII
Western Digital Green 2TB | SATAII
2x 1GB Corsair XMS2 DDR2-800
Vantec ION2 CAN460-C 460W
ExpertColor S3 Trio64 86C764X 8MB | PCI
It’s running IPFire, which is actually based on the first Linux router/firewall distro I ran (IPCop), which was really nice, except it was stuck on Linux kernel 2.4, which is rather old and lacks many features.
I’ve had some interesting projects lately, the most notable of which is an email backup script.
My first thought was to use RoboCopy and just have it run on log-off. Unfortunately RoboCopy always re-copies a changed file in full; it has no partial-file transfer, meaning 10+ computers could be copying potentially 2GB+ files to the server every log-off; not good.
My attention was then steered towards Rsync, a tool I have used before, which uses a delta-transfer algorithm to find the changed portions of a file and copy just those bits.
I toyed with the idea of having a Linux virtual machine run the server side, though this most likely wouldn’t have worked: it would need considerable disk space, and if something went wrong, well… Linux isn’t exactly the most user-friendly of operating systems.
Then I remembered there was a Windows build of Rsync; although I had never actually managed to get it working, it seemed like the best place to start.
DeltaCopy is basically a wrapper for Rsync, running within a Cygwin environment. It’s open source, so no licensing costs, and it runs as its own service with quite a small footprint.
My first iteration of DeltaCopy involved simply mounting the remote share and doing a local copy, i.e. rsync.exe [source] [destination], which it turns out is the wrong way to go about it. Every time a file was updated, the sync took nearly three times longer than copying the whole file should have. I discovered that if you use a share, you are basically downloading a copy of the file, scanning it locally, then sending the whole file back down the line.
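Roughly, the difference between the two invocations looks like this (drive letters, paths and the server/module names are made up for illustration):

```bat
rem the slow way: the destination is a mounted share, so rsync treats it as
rem a local path and has to read the remote copy over the network to compare
rsync.exe -rt "/cygdrive/c/Users/someuser/Mail/" "/cygdrive/z/backups/someuser/"

rem the fast way: the destination is an rsync daemon module, so the server
rem scans its own copy and only the changed blocks cross the wire
rsync.exe -rt "/cygdrive/c/Users/someuser/Mail/" "backupserver::mailbackup/someuser/"
```

(DeltaCopy’s rsync.exe is a Cygwin build, hence the /cygdrive-style paths for local Windows folders.)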
The proper way of using it, apparently, is to run an rsync server (daemon) where the backups are kept, and have the rsync client connect to it. That way the server can scan its copy for changes locally without having to send it down to the client; the two ends then negotiate which bits need copying and transfer only those.
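On the server side that means defining a “module” that clients can address. A minimal sketch of the config (module name and path are my own placeholders):

```
# rsyncd.conf fragment on the backup server
[mailbackup]
    path = /srv/backups/mail
    comment = email backup area
    read only = false
```

Clients then refer to the module as server::mailbackup rather than to a mounted share.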
After much frustration and many issues (mainly typos :P) I managed to get my test platform copying from one machine to the other, taking the transfer time down from 3 minutes to 6 seconds. Impressive!
To implement the script I will need the rsync service installed on the server (which I have now done) and Group Policy to deploy both the DeltaCopy client (particularly rsync.exe) and my script to run on log-off (also done).
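A minimal sketch of such a log-off script (the install path, server name and module here are placeholders, not necessarily what the deployed script uses):

```bat
@echo off
rem log-off backup sketch - paths, server and module names are placeholders
set RSYNC=C:\DeltaCopy\rsync.exe

rem push this user's mail folder to the server's rsync module,
rem keeping each machine's copy in its own subdirectory
"%RSYNC%" -rtz "/cygdrive/c/Users/%USERNAME%/Mail/" "backupserver::mailbackup/%COMPUTERNAME%/"
```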
I have now come to the point where it should all be working, but unfortunately it keeps coming up with “Connection Timed Out”. My guess is the firewall on the server isn’t accepting the connections, so I’ll have to look into that and see what can be done.
I’ll need rsync’s port 873 (TCP) opened up, I believe. Whether that needs to be done through Group Policy or through the server’s own firewall settings, I’m not sure.
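If it comes down to opening it locally on the server, a netsh rule along these lines should do it on Vista/2008-era Windows (the rule name is arbitrary; needs an elevated prompt):

```bat
rem allow inbound rsync daemon traffic on TCP 873
netsh advfirewall firewall add rule name="rsync" dir=in action=allow protocol=TCP localport=873
```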
Anyway, that’s it for the moment.
I think I’ll post my scripts up sometime, stay tuned.
Find them right here: Download backupScripts.zip 7.7KB
You will also need DeltaCopy from Synametrics Technology