Saturday, July 25, 2015

My experiment with golang and social counters

This started with my frustration while building the scroll website: the social buttons are freaking slow to load. It is even worse on pages that have a lot of scripts (like mine). The buttons usually take minutes to show up, and when they finally do, each of them renders differently from the rest. Thanks, Facebook, Twitter and Google.

So I looked around and found a few options. It looks like people have made scripts that collect the counters via AJAX (in the user's browser) and render the buttons with images. Some use data URIs to avoid triggering more requests, brilliant! Since I wanted something super simple and fast to use, I decided to have one script that does all of that: the server keeps track of the counters and embeds them in the script, so everything needed arrives in one and only one request.
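
To illustrate the idea, here is a minimal sketch of such a handler in Go. Everything in it is made up for the example (the hard-coded counts, the names), not the actual go-socialcounters code:

    package main

    import (
        "fmt"
        "net/http"
    )

    // allJS serves the bundled script with the counters already embedded,
    // so the browser needs one and only one request.
    func allJS(w http.ResponseWriter, r *http.Request) {
        // Hard-coded for the sketch; the real server would fetch these
        // from the social networks and cache them.
        counts := map[string]int64{"facebook": 123, "twitter": 45, "google": 6}

        w.Header().Set("Content-Type", "application/javascript")
        fmt.Fprintf(w, "var socialCounts = {facebook: %d, twitter: %d, google: %d};\n",
            counts["facebook"], counts["twitter"], counts["google"])
        // ...followed by whatever code renders the buttons from these values.
    }

    func main() {
        http.HandleFunc("/all.js", allJS)
        http.ListenAndServe(":8080", nil)
    }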

I needed 3 social networks and each of them has a different way to retrieve its counter, so I figured the server had to make the requests asynchronously, cache the values for some time and return the bundled js (there is a cache sketch after the list below). My goal was at most 50KB, delivered within 500ms. According to my measurements, the current scripts from the social networks are way bigger than that:

  • Facebook's sdk.js is the worst contender at about 164KB
  • Twitter's widgets.js is 107KB
  • Google is the winner here: their platform.js is only 37KB
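
The caching part can be as simple as an in-memory map guarded by a mutex. Here is a minimal sketch with hypothetical names; the real project may handle this differently:

    package main

    import (
        "fmt"
        "sync"
        "time"
    )

    type entry struct {
        count   int64
        fetched time.Time
    }

    // counterCache remembers one counter per service+URL key and
    // refetches only after the TTL has passed.
    type counterCache struct {
        mu  sync.Mutex
        ttl time.Duration
        m   map[string]entry
    }

    func newCounterCache(ttl time.Duration) *counterCache {
        return &counterCache{ttl: ttl, m: make(map[string]entry)}
    }

    func (c *counterCache) get(key string, fetch func() int64) int64 {
        c.mu.Lock()
        defer c.mu.Unlock()
        if e, ok := c.m[key]; ok && time.Since(e.fetched) < c.ttl {
            return e.count // still fresh, skip the network round trip
        }
        // Simplified: a real server would avoid holding the lock while fetching.
        count := fetch()
        c.m[key] = entry{count: count, fetched: time.Now()}
        return count
    }

    func main() {
        c := newCounterCache(5 * time.Minute)
        likes := c.get("facebook:http://example.com/", func() int64 { return 123 })
        fmt.Println(likes)
    }
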
At first I wanted to do this with Node.js because it's fairly simple and all of the output is JavaScript anyway. It turned out that making 3 async HTTP requests and merging the results together is quite complicated (callback hell, anyone?), so I switched to golang, since I had tried its channels before and they seemed like a good fit. The project went well and I have it running on both Google App Engine and Heroku at the moment.
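
The channel part is what sold me: fire one goroutine per network, then merge the results by receiving from a single channel, no callbacks needed. This is a rough sketch with placeholder example.com URLs standing in for the real counter endpoints:

    package main

    import (
        "fmt"
        "net/http"
    )

    type result struct {
        service string
        status  int
        err     error
    }

    func main() {
        // Placeholder endpoints; each network really has its own counter API.
        services := map[string]string{
            "facebook": "https://example.com/facebook",
            "twitter":  "https://example.com/twitter",
            "google":   "https://example.com/google",
        }

        results := make(chan result)
        for name, url := range services {
            // One goroutine per network: the requests run concurrently.
            go func(name, url string) {
                resp, err := http.Get(url)
                if err != nil {
                    results <- result{service: name, err: err}
                    return
                }
                resp.Body.Close()
                results <- result{service: name, status: resp.StatusCode}
            }(name, url)
        }

        // Merging is just receiving one value per request.
        for range services {
            r := <-results
            fmt.Println(r.service, r.status, r.err)
        }
    }
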
The code is pushed to GitHub too, with a simple demo site that makes use of the Heroku instance: http://hoangson.vn/go-socialcounters/. The all.js script is only 14KB (8KB gzipped) and normally finishes loading in less than a second. I also added a jQuery plugin script for advanced usage (it fetches a heavily cached JSONP file). The project uses data URIs like the other scripts I found, but those images do not look good on high resolution screens, so I try to detect SVG support and use SVG whenever possible.

I hope this project is useful for someone.

Monday, April 13, 2015

Quick DNS switching script for Mac OS X

I'm too lazy to spend 1 minute every time I have to change DNS servers, so I spent 10 minutes writing this script to do it quickly. It currently supports Google, OpenDNS and uFlix, but it can support an unlimited number of configurations.

Get the script here: https://gist.github.com/daohoangson/60ce8e0317213bc45c30

Use it like this:

switch-dns-to.sh google

You will need to enter your user password.

Tuesday, March 17, 2015

PDBS 2015

*PDBS stands for Personal Data Backup Strategy.

I have always been paranoid about data loss. That's why I try to follow best practices in backing up data:
  • 2 offline full machine backups to external HDDs, one in Hanoi, one in HCMC. I have been doing this for a few years, since I was using the ThinkPad T41 laptop (circa 2007).

    The size of this backup grows linearly with time and is about 300GB these days (with Time Machine).
  • 1 online backup via Arq to Amazon Glacier for important archival data (how I flirted with my wife, that kind of serious stuff). Before I started using Arq (early 2013), I made yearly dumps of data to DVDs.

    This backup grew slowly until I met Sylvie, then it started to explode; it is around 150+ GB now, mostly full size photos.
  • 1 online backup for work related files in Dropbox, because their apps (Windows and Mac) work really well. I believe they are the only ones who use delta sync, which is super fast; since work data changes regularly, it makes sense to use something efficient. My Dropbox account currently holds 40GB of data. I haven't paid for it though; I got the storage via their various promotional programs.
  • At least 1 alternative online backup for code. I have many public repos on github.com and just as many private ones on bitbucket.org to keep track of code, and they act as secondary backups.
  • 1 online backup for media from cameras and smartphones. I use Dropbox on all of my devices; there are about 30GB of photos currently (of the total 40GB of Dropbox data). Their mobile apps work well enough across all platforms, so it just makes sense. For each device, I also use the respective native backup service (Apple = iCloud, Google = Google+, Microsoft = OneDrive).

All in all, there have been no problems so far, but I made a change recently and moved the archival data (150GB) to OneDrive. The simple reason is that Microsoft offers 10TB of storage to Office 365 subscribers (which I have been for a few years) and OneDrive's photo browser is fairly good. Since this collection of data is 90% photos, it's great to be able to quickly find a photo when needed. Also, Arq has proven to be quite complicated and slow (probably because I never upgraded to version 4). Amazon Glacier is cheap and all, but downloading data is time consuming and/or expensive. Most of all, there is no way to browse my data online.

I started uploading to OneDrive 9 days ago (March 8th) and just finished today. I still have 9.95TB available; it's beautiful!