It's always cool (and extremely helpful) to be able to access your files from anywhere. There are partial solutions for this, such as cloud storage, and they do the job well, with versioning and history. However, they only work if you sync all of your files to their service, and they all have storage caps. Also, what if you wanted to sync your files to a service like Dropbox while preserving your current folder structure? There are ways to accomplish that, such as using symlinks, but that's not a very good solution. My current method is to use a custom dynamic DNS to run an FTP server from home. Let me explain.
DNS is responsible for translating domain names into IP addresses. For example, google.com points to 184.108.40.206. That IP address is static, meaning it won't change. Therefore, you can be quite certain that typing google.com will bring you to the right place.
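That translation step is easy to see from code. As a minimal sketch in PHP, gethostbyname() asks the system resolver to do exactly what DNS does; note that the real answer for google.com varies by resolver and over time:

```php
<?php
// Sketch: the translation DNS performs, from PHP. gethostbyname() asks
// the system resolver for a hostname's IPv4 address, and returns the
// hostname unchanged when resolution fails.
$ip = gethostbyname("google.com");            // needs network; answer varies
$miss = gethostbyname("nonexistent.invalid"); // ".invalid" is reserved and never resolves
echo $miss . "\n"; // prints "nonexistent.invalid" -- the lookup failed
```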
Your home IP may not be static, though. It may change often. If you wish to reach your home network, you have to memorize your IP address, and one day (maybe even an hour later!) that IP address may no longer point to your home network at all. Dynamic DNS fixes this. By registering a domain name such as example.com, you can change its A record to point to your home IP address, so typing example.com will be translated to your home IP address. When your IP changes, you need a way to update that A record to reflect the change. A while ago, there was a free service (DynDNS) that did exactly this: you just needed to set up an account with them and change some settings on your router, and then BAM! youraccount.dyndns.org would be pointing to your home IP address. Sadly, they removed the free service some months ago, so I decided to create my own custom dynamic DNS service.
First, I needed a way to change the A record of my domain. I decided to use a subdomain, so something like myhome.example.com would point home. I looked into the AJAX request cPanel sends when you change an A record manually. It didn't look complicated; the only thing that needed to change in the request was address=YOUR_IP_ADDRESS. The hard part is getting the session token needed to log in to cPanel programmatically. For that, I used cURL to make a login request to cPanel, then used curl_getinfo() to get the final URL and pattern matching to find the session ID. My code to get the session ID:
// $inf comes from curl_getinfo() on the handle used for the login request
$pattern = "/.*?(\/cpsess.*?)\/.*?/is";
$preg_res = preg_match($pattern, $inf['url'], $cpsess);
// on success, $cpsess[1] holds the session segment, e.g. "/cpsess1234567890"
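Putting the whole login flow together, here is a minimal sketch. The cPanel host, port 2083, and the "user"/"pass" form field names are assumptions based on a standard cPanel setup; check them against your own installation. The live request only fires when credentials are supplied via environment variables, and the token extraction is factored out so it can be checked against a sample URL:

```php
<?php
// Sketch of the cPanel login flow: POST credentials with cURL, follow
// the redirect, and pull the session token out of the final URL.
// Host, port, and form field names are assumptions -- verify them
// against your own cPanel login page.

// Extract the "/cpsessXXXXXXXXXX" path segment from a cPanel URL.
function extract_cpsess($url) {
    $pattern = "/.*?(\/cpsess.*?)\/.*?/is";
    return preg_match($pattern, $url, $m) === 1 ? $m[1] : null;
}

// Only attempt the live request when credentials are provided.
if (getenv("CPANEL_USER") !== false) {
    $ch = curl_init("https://example.com:2083/login/");
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query([
            "user" => getenv("CPANEL_USER"),
            "pass" => getenv("CPANEL_PASS"),
        ]),
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true, // follow the redirect into the session URL
        CURLOPT_COOKIEFILE     => "",   // keep session cookies in memory for this handle
    ]);
    curl_exec($ch);
    $inf = curl_getinfo($ch);           // $inf['url'] holds the final URL
    curl_close($ch);
    $session = extract_cpsess($inf['url']);
}

// The extraction itself is easy to check against a sample URL:
$sample = "https://example.com:2083/cpsess1234567890/frontend/index.html";
echo extract_cpsess($sample) . "\n"; // prints "/cpsess1234567890"
```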
After getting the session ID, sending the DNS change request is trivial: just use cURL. The big problem is how to run this script so it always sets the correct A record. The script would have to be triggered from home, and as often as possible. DynDNS used the router to report the IP address; since your router is always on, that method is extremely smart and power-saving. Keeping a computer on just to execute this script every 20 minutes would be overkill, so I decided to use an old Android phone I had lying around: my HTC Tattoo, still on CyanogenMod Gingerbread. I used Tasker to send an HTTP request to my script every 20 minutes. Done? Not quite.
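The server side of that HTTP request can be sketched as follows. This is an assumption about how such an endpoint might look, not the exact script: it reads the caller's IP (which is the home IP, since the phone triggers it from the home network), compares it with the last stored value, and only fires the cPanel DNS-change request when something actually changed. The file path and fallback IP are placeholders:

```php
<?php
// Sketch of the update endpoint that Tasker hits every 20 minutes.
// The caller's IP is the home network's public IP; we only send the
// cPanel A-record change when it differs from the last one we stored.
// File path and fallback IP below are placeholders.

// Decide whether the A record needs updating.
function needs_update($stored_ip, $current_ip) {
    return $stored_ip !== $current_ip;
}

$ip_file    = "/tmp/last_home_ip.txt";
$current_ip = isset($_SERVER["REMOTE_ADDR"]) ? $_SERVER["REMOTE_ADDR"] : "203.0.113.7";
$stored_ip  = is_file($ip_file) ? trim(file_get_contents($ip_file)) : "";

if (needs_update($stored_ip, $current_ip)) {
    // Here you would log in to cPanel and send the cURL request that
    // sets address=$current_ip on the A record for myhome.example.com.
    file_put_contents($ip_file, $current_ip);
    echo "updated to $current_ip\n";
} else {
    echo "no change\n";
}
```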
All I had now was the IP address of my home network. I still needed to be able to turn on my computer to reach the FTP server, and then my files. I decided to put my HTC Tattoo to good use again: I installed droid VNC server on the Tattoo, as well as Wol Wake on Lan Wan. I configured Wol to send a magic packet to my computer, which would turn it on. To send the packet, I would have to connect to the VNC server on the phone and control it; to connect to VNC, I would have to use the DDNS service I created above. To summarize, in order to access my files, I would have to:
- Connect to my Tattoo via VNC at myhome.example.com
- Use VNC to send a magic packet to my computer
- Use FTP software to connect to myhome.example.com:FTP_PORT and get my files!
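The magic packet the Wol app sends in step 2 has a very simple format: six 0xFF bytes followed by the target machine's MAC address repeated 16 times, delivered as a UDP broadcast (port 9 is conventional). A minimal sketch, with a placeholder MAC and broadcast address:

```php
<?php
// Sketch: build a Wake-on-LAN magic packet -- six 0xFF bytes, then the
// target MAC address repeated 16 times (102 bytes total). The MAC and
// broadcast address below are placeholders.

// Build the 102-byte magic packet for a MAC like "AA:BB:CC:DD:EE:FF".
function magic_packet($mac) {
    $bytes = "";
    foreach (explode(":", $mac) as $hex) {
        $bytes .= chr(hexdec($hex));
    }
    return str_repeat(chr(0xFF), 6) . str_repeat($bytes, 16);
}

$packet = magic_packet("AA:BB:CC:DD:EE:FF");
echo strlen($packet) . "\n"; // prints 102 (6 + 16 * 6 bytes)

// Sending it requires a UDP broadcast on the local network, e.g.:
// $sock = socket_create(AF_INET, SOCK_DGRAM, SOL_UDP);
// socket_set_option($sock, SOL_SOCKET, SO_BROADCAST, 1);
// socket_sendto($sock, $packet, strlen($packet), 0, "192.168.1.255", 9);
```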
It's really not as complicated as it sounds, considering that you'll then have access to all your files anytime, anywhere, even on your phone. One major thing to consider is the security risk of opening up your computer over FTP. One mitigation is to allow read-only permissions, so nobody can delete your files even if they gain unauthorized access. I use SFTP server software to serve my files; it uses SSH to create a secure connection before transferring files. This way, I can use key-based authentication instead of passwords, which makes it much safer - but harder to access "anywhere," because every device must be pre-approved. Using Linux would probably be safer still, because then sudo ufw limit ssh could help block malicious activity by rate-limiting connection attempts. Too bad I'm just too used to Windows.