Archive for the ‘Web Services’ Category

File Under: Web Services

GitHub Brings More Git Superpowers to the Web

Creating a new file on GitHub. Image: GitHub

Code hosting giant GitHub has added a small but significant new feature to the site: the ability to create new files through the web interface. The change makes it easier for non-Git-savvy contributors to quickly and easily add files to a repository.

You’ll find the new file creation tool just to the left of a repository’s breadcrumb menu. Click the new “New File” icon and GitHub will create a new file, ask you to name it and open it in the file editor — all right within your web browser.

Couple the new file creation tool with GitHub’s existing on-site document editor and you have the plain-text aficionado’s alternative to online editing suites like Google Docs or Microsoft’s Office 365.

At the very least the ability to create new documents through the web interface makes GitHub a more full-featured blogging engine for anyone using Jekyll, Hyde or other static site generators in conjunction with GitHub.

The new file creation tool is smart too. If you try to create a new file in a repository that you don’t have access to, GitHub will automatically fork the project and help you send a pull request to the original repository with your new file (much like it does when you edit a file through the web interface).

You can also do a bit of URL hacking to automatically create new files. Just add ?filename=yournewfile.txt at the end of the URL and GitHub will pre-fill the filename field with yournewfile.txt.
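As a quick illustration of that URL pattern (the user, repository and branch names below are made-up placeholders, not anything from GitHub’s announcement):

```python
# Build a GitHub "new file" URL that pre-fills the filename field.
# "someuser", "somerepo" and "master" are hypothetical placeholders.
base = "https://github.com/someuser/somerepo/new/master"
url = base + "?filename=yournewfile.txt"
print(url)
# https://github.com/someuser/somerepo/new/master?filename=yournewfile.txt
```

Visiting a URL like that drops you straight into the editor with the filename already filled in.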

GitHub has also launched a new status site to report the current network health of the site. Should you for some reason not be able to connect to GitHub you can check the new status page to see if GitHub is down or if the problem is on your end. There’s also a new @githubstatus Twitter account you can follow for updates.

File Under: APIs, Backend, Web Services

Google Drive’s New ‘Site Publishing’ Takes on Amazon, Dropbox

Google’s demo site, served entirely by Google Drive. Image: Screenshot/Webmonkey

Google has unveiled a new feature dubbed “site publishing” for the company’s Drive cloud hosting service. Drive’s new site publishing is somewhere between a full-featured static file hosting service like Amazon S3 and Dropbox’s public folders, which can make hosted files available on the web.

Google has set up a simple demo site served entirely from Google Drive to give you an idea of what’s possible with the site publishing feature. Essentially site publishing gives your public folders a URL on the web — anything you drop in that folder can then be referenced relative to the root URL. It’s unclear from the announcement how these new features fit with Google’s existing answer to Amazon S3, Google Cloud Storage.

The API behind site publishing works a lot like what you’ll find in Amazon’s S3 offering. If you use the Drive API’s files.insert method to upload a file to Drive, it will return a webViewLink attribute. That ugly but functional URL becomes the base URL for your content. So, if you uploaded a folder named images containing a file named kittens.jpg, you could access it on the web at that base URL followed by images/kittens.jpg.
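To make the base-URL idea concrete, here’s a minimal sketch; the webViewLink value below is a hypothetical placeholder, since the real link is whatever the Drive API returns for your folder:

```python
# Sketch: files.insert returns a webViewLink for the uploaded folder;
# public files are then addressed relative to that base URL.
# The base URL below is a made-up placeholder, not the real link format.
web_view_link = "https://googledrive-example.invalid/0B-FOLDER-ID/"
kittens_url = web_view_link + "images/kittens.jpg"
print(kittens_url)
```

Anything you reference in your pages (images, stylesheets, scripts) resolves the same way, relative to that root.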

There’s one drawback, though: Drive’s site publishing doesn’t appear to support custom domains. That means it works fine for assets like images, CSS or JavaScript, but unless you don’t mind serving your site from some funky URLs, it’s probably not the best choice for hosting an entire site.

There are already numerous static file hosting solutions on the web including Dropbox and Amazon’s S3, as well as whole publishing systems that use Dropbox and S3 to host files, but for those who would prefer a Google-based solution, now you have it.

For more details on the new API see the Google Apps Developer Blog and be sure to read through the Drive SDK docs. If you need help, Google is answering questions over on Stack Overflow.

File Under: Web Services

Google Pairs Gmail, Drive for an Easy Way to E-Mail Huge Files

Attaching Google Drive files in Gmail’s new compose window. Image: Google

Google is making it easier than ever to send large files via e-mail. The company has announced a new feature that puts all your Google Drive documents just a click away from Gmail. Now you can quickly and easily attach files from Google Drive to Gmail messages.

That means you can attach files up to 10GB in size, which is some 400 times larger than what Gmail will allow you to do with typical e-mail attachments. Of course the reason files can be so large is that Google isn’t actually sending the files; it’s merely sending a link to your recipients who can then access them through Google Drive.

Gmail has a clever new feature that checks to make sure that all your recipients actually have permission to view your Drive files. The feature works a bit like Gmail’s forgotten-attachment detector — if you forget to grant permission to one of your recipients, Gmail will prompt you to do so before sending your e-mail.

Like most new features from Google, the new Google Drive integration will be rolling out to Gmail users “over the next few days.” Note that in order to get the new Google Drive attachments feature you’ll need to opt-in to the new compose window option we wrote about earlier (see that post for full details on how to get started with Gmail’s new in-window compose dialog).

WordPress Brings Bitcoin to the Blogging Masses

WordPress earns a Bitcoin merit badge. Photo: Ben Ostrowsky/Flickr.

Upgrading your blog no longer requires a credit card or PayPal account. Starting today you can raid your virtual piggy bank to pay for WordPress upgrades with the digital currency Bitcoin.

The move makes WordPress one of the largest, most reputable online services to accept the fledgling Bitcoin currency.

Bitcoin is an online currency that allows buyers and sellers to exchange money anonymously. According to a post on the WordPress blog, the appeal of Bitcoin for WordPress is that, unlike credit cards and PayPal, “Bitcoin has no central authority and no way to lock entire countries out of the network … merchants who accept Bitcoin payments can do business with anyone.”

The anonymous aspect has made Bitcoin a target for law enforcement agencies, but for WordPress it means that users living in any of the over 60 countries currently blocked by PayPal (and many credit card companies) now have a way to pay for WordPress upgrades and services.

While setting up a basic blog on WordPress.com is free, there are paid upgrades available for custom themes, custom domains or to remove ads from your site.

Bitcoin is in your WordPress. Image: Screenshot/Webmonkey.

Automattic, WordPress’ parent company, accepts Bitcoin payments through a third-party payment processor, which has now been integrated into the payment interface alongside the PayPal and traditional credit card options. WordPress is forgoing the Bitcoin “confirmations” process, which would help protect the company against fraud. Here’s an explanation from the FAQ:

We could wait for the first confirmation (typically 5-10 minutes) but we prefer to make the customer experience as smooth as possible. Making you wait for confirmations would virtually eliminate our risk but we’re confident that with digital products like ours the risk is already acceptably low.

Note that while WordPress is accepting Bitcoin payments, it may not work for everything just yet. The option to pay with Bitcoin appears to be limited to upgrade bundles at the moment. Purchasing custom themes or domains by themselves is not currently possible due to what WordPress calls “technical complications.”

WordPress adopting Bitcoin is good news for users in countries like Haiti, Ethiopia, or Kenya, which are often blocked by traditional payment systems. It’s also good news for Bitcoin supporters, who now have another very large, very legitimate company on their side.

File Under: servers, Web Services

New Arq 3 Taps Amazon Glacier for Backup Nirvana

Arq 3 makes it easy to navigate Amazon’s Glacier file storage service. Original Image: Christine Zenino/Flickr

Amazon’s Glacier file storage service costs less than a penny per gigabyte per month. It’s hard to think of a cheaper, better way to create and store an offsite backup of your files.

Of course backups are only useful if you actually create them on a regular basis. Unfortunately, getting your files into Glacier’s dirt-cheap storage requires either a manual effort on your part or some scripting-fu to automate your own system.

Back when Glacier first launched we speculated that it would be a perfect fit for a backup utility like the OS X backup app Arq. Now Arq 3 has been released and among its new features is built-in support for Amazon Glacier. Arq 3 is $29 per computer; upgrading from v2 is $15.

Arq creator Stefan Reitshamer sent over a preview of Arq 3 a while back and, having used it for the better part of a week now, I can attest that it, combined with Glacier, does indeed make for a near-perfect low-cost off-site backup solution.

Using Arq 3 with Glacier is simple. Just sign up for an Amazon Web Services account and create a set of access keys. Then fire up Arq, enter your keys and select which files you want to back up. Choose Glacier for the storage type and then make any customizations you’d like (for example, excluding folders and files you don’t want backed up).

That’s all there is to it; close Arq and it will back up your files in the background. By default Arq 3 is set to make Glacier backups every day at 12 a.m., but you can change that in the preferences.

Should disaster strike and you need to get your files out of Glacier (or S3), just fire up Arq, select the files you need and click “restore.” Arq will give you an estimate of your costs and you can adjust the download speed — the slower the download the cheaper it is to pull files out of Glacier. There’s also an open source command line client available on GitHub in the event that the Arq app is no longer around when you need to get your files back.

Estimating costs with Arq’s Glacier restore screen. Image: Screenshot/Webmonkey

Existing Arq users should note that Amazon currently doesn’t offer an API for moving from S3 to Glacier (though the company says one is in the works). That means if you want to switch any current S3 backups to Glacier you’ll need to first remove the folder from Arq and then re-add it to trigger the storage type dialog.

In order to get the most out of Arq 3 and Glacier it helps to understand how Glacier works. Unlike Amazon S3, which is designed for cheap but accessible file storage, Glacier is, as the name implies, playing the long, slow game. Glacier is intended for long-term storage that’s not accessed frequently. If you need to grab your files on a regular basis Glacier will likely end up costing you more than S3, but for secondary (or tertiary) backups of large files like images, videos or databases Glacier works wonderfully.

My backup scenario works like this: For local backups I have two external drives. One is nearly always connected and makes a Time Machine backup every night. Once a week I clone my entire drive off to the second external drive. For offsite backups I use rsync and cron to back up key documents to my own server (most are also stored in Dropbox, which is not really a backup service, but can, in a pinch, be used like one).
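An rsync-and-cron setup like that can be sketched as a single crontab entry (the schedule, remote host and paths below are hypothetical placeholders, not the actual configuration described above):

```shell
# Hypothetical crontab entry: every night at 2 a.m., mirror ~/Documents
# to a remote server over SSH. -a preserves permissions and timestamps,
# -z compresses in transit, --delete removes files no longer present locally.
0 2 * * * rsync -az --delete ~/Documents/ backup@example.com:backups/documents/
```

Note that --delete makes the remote copy an exact mirror, so a local deletion propagates on the next run — one reason a versioned backup like Time Machine or Glacier is still worth having alongside it.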

But my server was running out of space. Photo and video libraries are only getting bigger and most web hosting services tend to get very expensive once you pass the 100GB mark. That’s where Arq and Glacier come in. It took a while, but I now have all 120GB of my photos backed up to Glacier, which will cost me $1.20/month.

The only catch to using Glacier is that getting the data back out can take some time. There are also some additional charges for pulling down your data, but as noted above, Arq will give you an estimate of your costs and you can adjust the download speed to make things cheaper. The slow speeds aren’t ideal when you actually need your data, but these are secondary, worst-case-scenario backups anyway. If my laptop drive dies, I can just copy the clone or Time Machine backup drive to get my files back. The Glacier backup is only there if my house burns down or floods or something else destroys my local backups. While it would, according to Arq’s estimate, cost about $60 and take over four days to get my data out of Glacier, that would seem like a bargain when I’d have otherwise lost everything.