If you ever work with JSON on the command line, try out the json_reformat tool included in the yajl-tools package (or install it from source).
```
$ curl http://github.com/api/v2/json/user/show/powdahound
..."name":"Garret Heaton","created_at":"2009/04/04 08:36:09 -0700",...

$ curl -s http://github.com/api/v2/json/user/show/powdahound | json_reformat
...
    "name": "Garret Heaton",
    "created_at": "2009/04/04 08:36:09 -0700",
    "location": "Sunnyvale, CA",
...
```
So much nicer!
It will even tell you if there are syntax errors (as will json_verify).
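If you don't have yajl-tools handy, Python's standard library does roughly the same job. This is a sketch, not json_reformat itself: the exact indentation and error messages differ, but the idea is identical, and `json.loads` rejects bad input much like json_verify does.

```python
import json

# A fragment like the API response above, as a raw one-line string
raw = '{"name": "Garret Heaton", "location": "Sunnyvale, CA"}'

# json.loads raises an exception on malformed input, similar to json_verify
data = json.loads(raw)

# indent=4 produces readable output in the spirit of json_reformat
pretty = json.dumps(data, indent=4)
print(pretty)
```

On the command line, `python -m json.tool` works as a pipe target the same way json_reformat does.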
Amazon recently added the ability to host static sites on S3, so to try it out I made a small site comparing the different types of EC2 instances: www.ec2instances.info. It's not much of a site, but it was the only thing on my ideas list that didn't require some sort of database backend.
The setup was very simple:
- Buy the domain (name.com is so much nicer than GoDaddy by the way).
- Point the domain's nameservers at my Slicehost account.
- Add a new DNS domain in Slicehost and add a single CNAME record with a name of 'www' and data of 's3-website-us-east-1.amazonaws.com.'
- Install the latest Cyberduck (Mac). Windows users can use one of the tools here.
- Create a new S3 bucket called 'www.ec2instances.info' and configure it for static site hosting.
- Upload all my files and change their permissions to make them readable by everyone.
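In zone-file terms, the CNAME record from the DNS step above would look roughly like this (the TTL here is an assumption; Slicehost's form only asks for the name and data):

```
www.ec2instances.info.  3600  IN  CNAME  s3-website-us-east-1.amazonaws.com.
```

S3 picks the right bucket by matching the request's Host header against the bucket name, which is why the bucket has to be named 'www.ec2instances.info' exactly.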
Updating the site is easy - just select the file in Cyberduck and click the 'Edit' icon in the toolbar (or hit ⌘K) and it will automatically upload the file whenever you save. If I needed a real deploy system it'd be pretty easy to whip up something with Fabric and Boto.
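A deploy script along those lines could look like the sketch below. To be clear, this is hypothetical and not something I actually run: it assumes boto 2.x is installed, AWS credentials are available in the environment, and the bucket already exists.

```python
import os
import mimetypes

def deploy(local_dir, bucket_name):
    """Upload every file under local_dir to the S3 bucket, world-readable."""
    import boto  # boto 2.x; imported here so the sketch reads without it installed
    conn = boto.connect_s3()
    bucket = conn.get_bucket(bucket_name)
    for root, _, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            # Key names mirror the local directory layout
            key = bucket.new_key(os.path.relpath(path, local_dir))
            content_type, _ = mimetypes.guess_type(path)
            if content_type:
                key.content_type = content_type
            # policy='public-read' makes each object readable by everyone,
            # matching the permissions step above
            key.set_contents_from_filename(path, policy='public-read')

# Example usage (bucket name from this post):
# deploy('site/', 'www.ec2instances.info')
```

Wrapping that in a Fabric task would give you a one-command deploy.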
Overall it seems like a great way to host a static site on the cheap (~$1.50/year for this). The only real downside is that you can't have your root domain hit the bucket, because a CNAME must be used and CNAMEs aren't allowed at the zone apex. This means that ec2instances.info does not resolve properly. More details here.
Note: I tried to use Amazon's new Route 53 DNS service instead of my Slicehost account, but the configuration is still a bit more involved than I'd like. Hopefully they'll add it to the AWS web console soon.
(This is a repost from the HipChat Blog)
Open source projects depend on community cooperation. Successful projects have a healthy group of individuals and companies submitting code, writing documentation, and testing new features. Unfortunately it’s not always easy to contribute, because different projects use different bug trackers, version control systems, and approval processes. Package maintainers also have a hard time handling all the incoming patches in a timely manner, which frustrates contributors.
In 2005 Linus Torvalds created the Git version control system in order to solve problems he was having dealing with patches to the Linux kernel. A few years later GitHub came along with a nice web interface on top of Git, making it trivially easy to fork, patch, and contribute to projects hosted there. The standardized wiki and issue tracker features mean that many projects are set up in the same way. Once you learn how to contribute to one project on GitHub, you know how to contribute to all of them.
Unfortunately GitHub makes it so easy that I’ve found myself becoming lazy. It feels a lot harder to contribute to non-GitHub projects because it often requires signing up for their custom bug tracker, learning the patch process, and waiting longer before the patch is accepted. That extra friction is sometimes enough to prevent me from submitting a fix, and that’s not good for the project.
Ease of contribution is clearly an important factor for open source and other community-driven projects (just look at Wikipedia). As GitHub continues to grow, are more projects going to feel pressure to switch? I think they will, and I’m looking forward to it. Better software is good for everyone.