Archive for the ‘Security’ Category

File Under: Security, Software

Microsoft Gets Ready to Pull the Life Support on Windows XP

That’s the end of the line. Image: Johan Larsson/Flickr

Today marks the first day of the last year of Windows XP’s long and storied life.

On April 8, 2014, Microsoft will officially stop supporting Windows XP, meaning there will be no more security updates or other patches. When April 2014 rolls around, Microsoft will have supported Windows XP for more than 12 years.

Should you choose not to upgrade before next year, you will be, in Microsoft’s words, “at your own risk” in dealing with security vulnerabilities and any potential malware designed to exploit them.

According to NetMarketShare, just over 38 percent of PCs connected to the web are still running Windows XP. Given that current XP users have already ignored three OS upgrades, it seems reasonable to assume a significant number of XP diehards still won’t upgrade even now that Microsoft is no longer issuing security updates — all of which adds up to a potentially huge number of vulnerable PCs connected to the web.

NetMarketShare’s OS statistics for March 2013. Image: Screenshot/Webmonkey.

Starting around this time next year, expect black hat hackers to hold a botnet fire sale.

With so many suddenly vulnerable PCs on the web, it’s really only a matter of time before unpatched vulnerabilities are identified and exploited, which could mean a serious uptick in the amount of botnet spam or worse — imagine even a small percentage of those 38 percent of PCs being harnessed for distributed denial of service attacks.

For individual users, upgrading Windows XP shouldn’t be too difficult, barring a dependency on software that’s never been updated. If Windows 7 or 8 aren’t to your liking, there’s always Linux (I suggest starting with Linux Mint if you’re new to Linux).

Upgrading enterprise and government installations is somewhat more difficult. Microsoft puts the matter quite bluntly on the Windows blog: “If your organization has not started the migration to a modern desktop, you are late.”

The Windows blog post contains quite a few links designed to help anyone looking to upgrade, but at the enterprise/government level it may well be too late anyway. “Based on historical customer deployment data,” says Microsoft, “the average enterprise deployment can take 18 to 32 months from business case through full deployment.”

Windows XP isn’t the only Microsoft product that will be getting the heave-ho this time next year. Internet Explorer 6 on XP, Office 2003, Exchange Server 2003 and Exchange Server 2010 Service Pack 2 (newer service packs of Exchange Server 2010 are still supported) will all be cast adrift. It’s also worth noting that this affects virtual machines as well, so if you’ve got a Windows XP virtual machine for testing websites, well, be careful out there.

File Under: Security, Web Basics

Google: Here’s What to Do if Your Website Is Hacked

Chrome’s malware warning page. Image: Google.

Nothing drives away your visitors quite like a message from Google that “this site may harm your computer” or “this site may have been compromised.”

Hopefully you’ll never need it, but just in case, Google has set up a new site dedicated to helping owners of hacked websites recover.

The “Help for Hacked Sites” section of Google’s Webmaster Tools offers articles and videos to help you not only recover from a hack, but also take steps to make sure it doesn’t happen again.

Part of what makes hacked sites difficult to deal with is that developers often don’t even notice that they’ve been compromised. “Hacks are often invisible to users,” says Google in its new help section. “For example, unbeknownst to the site owner, the hacker may have infected their site with harmful code which in turn can record keystrokes on visitors’ computers, stealing login credentials for online banking or financial transactions.”

Google has an 8-step program for unhacking your site, which includes basics like identifying the vulnerability that was used to compromise your site, as well as how to request a review so Google will remove the dreaded “this site has been compromised” message from its search results.

For more info and all the details on what to do if you’ve been hacked, check out the new Help for Hacked Sites section of Google’s Webmaster Tools.

File Under: Security, Web Services

Users Scramble as GitHub Search Exposes Passwords, Security Details

Inspectocat says “never store private stuff in public places.” Image: GitHub

GitHub has temporarily shut down some parts of the site-wide search update it launched yesterday. As we mentioned in our earlier post, the new search tools made it much easier to find passwords, private ssh keys and security tokens stored in GitHub repos.

GitHub hasn’t officially addressed the issue, but it appears to be blocking some of the security-related searches that were posted earlier in this Hacker News thread.

GitHub’s status site also says that “search remains unavailable,” though in my testing searching worked just fine so long as you weren’t entering words like “RSA,” “password,” “secret_token” or the like.

Most of the passwords and other security data exposed were personal — typically private ssh keys to someone’s server or a Gmail password — which is bad enough, but at least one appeared to reveal a password for an account on Chromium.org, the repository that holds the source code for Google’s open-source web browser. Another reportedly exposed an ssh password to a production server of a “major, MAJOR website in China.”

Unfortunately for people who have been storing their private security credentials in public GitHub repos, what GitHub’s search engine revealed is nothing new. Google long ago indexed that data, and a targeted site:github.com search will turn up the same exposed security info, which makes GitHub’s temporarily crippled search a token gesture at best.
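If you want a rough sense of whether a repository you’re about to push contains anything obviously sensitive, a few lines of Python can flag the usual suspects before a search engine does. This is a minimal sketch, not a GitHub or Google tool, and the patterns below are hypothetical examples that will miss plenty of real-world secrets.

```python
# A minimal sketch: walk a working tree and flag files that look like they
# contain credentials. The patterns are hypothetical examples only.
import os
import re

SUSPICIOUS_PATTERNS = [
    re.compile(r"-----BEGIN (RSA |DSA |EC )?PRIVATE KEY-----"),
    re.compile(r"password\s*[:=]", re.IGNORECASE),
    re.compile(r"secret_token\s*[:=]", re.IGNORECASE),
]

def find_suspect_files(root="."):
    """Yield (path, pattern) pairs for files matching a suspicious pattern."""
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames[:] = [d for d in dirnames if d != ".git"]  # skip Git metadata
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="ignore") as handle:
                    text = handle.read()
            except OSError:
                continue
            for pattern in SUSPICIOUS_PATTERNS:
                if pattern.search(text):
                    yield path, pattern.pattern
                    break

if __name__ == "__main__":
    for path, matched in find_suspect_files():
        print("possible secret in {} (matched {})".format(path, matched))
```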

If you accidentally stored sensitive data on GitHub, the most important thing to do is change your passwords, keys and tokens. After you’ve created new security credentials for any exposed servers and accounts, you can go back and delete your old data from GitHub.

Given that Git, the version control system behind GitHub, is specifically designed to prevent data from disappearing, deleting your sensitive data takes more than a simple git rm. GitHub has full details on how to get your sensitive data off the site. As GitHub’s instructions say, “if you committed a password, change it! If you committed a key, generate a new one. Once the commit has been pushed you should consider the data to be compromised.”
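For a concrete sense of what that involves, here’s a rough sketch that drives the kind of history rewrite GitHub’s instructions describe, using git filter-branch from Python. The file path is a hypothetical example, and rewriting history is destructive, so treat this as an outline and follow GitHub’s own guide for the authoritative steps.

```python
# A rough sketch of purging one file from a repository's entire history.
# Run it on a fresh clone; the path below is a hypothetical example.
import subprocess

SENSITIVE_FILE = "config/secrets.yml"  # hypothetical path to the leaked file

def purge_from_history(path):
    """Rewrite every commit so `path` never existed, then overwrite the remote."""
    subprocess.check_call([
        "git", "filter-branch", "--force",
        "--index-filter", "git rm --cached --ignore-unmatch {}".format(path),
        "--prune-empty", "--tag-name-filter", "cat", "--", "--all",
    ])
    # The old commits remain on GitHub until the rewritten history replaces them.
    subprocess.check_call(["git", "push", "origin", "--force", "--all"])

if __name__ == "__main__":
    purge_from_history(SENSITIVE_FILE)
```

Even then, as the quote above stresses, treat anything that was pushed as compromised and rotate it.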

File Under: Browsers, Security

HTTPS Everywhere 3.0 Secures the Web for Firefox, Chrome Users

The Electronic Frontier Foundation (EFF) has released version 3.0 of its HTTPS Everywhere browser plugin, which will automatically redirect you to secure, HTTPS connections. HTTPS Everywhere 3.0 adds support for 1,500 more websites, roughly doubling the coverage of previous releases.

Firefox users can install HTTPS Everywhere directly from the EFF site. There’s also an alpha release available for Google’s Chrome web browser. Unfortunately, limited add-on APIs mean that HTTPS Everywhere isn’t available for other web browsers.

Once it’s installed, the HTTPS Everywhere extension makes it easy to ensure you’re connecting to secure sites by rewriting all requests to an HTTPS URL whenever you visit one of the thousands of sites HTTPS Everywhere supports.
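Conceptually, that rewriting boils down to a per-site URL upgrade. The toy sketch below illustrates the idea in Python; it is not the extension’s actual code, and the short list of HTTPS-capable hosts is a hypothetical stand-in for its much larger rulesets.

```python
# A toy illustration of the http-to-https rewriting idea -- not HTTPS
# Everywhere's actual rules, which ship as per-site rulesets in the extension.
from urllib.parse import urlsplit, urlunsplit

# Hypothetical stand-in for the extension's list of HTTPS-capable sites.
HTTPS_CAPABLE = {"www.eff.org", "en.wikipedia.org", "twitter.com"}

def upgrade(url):
    """Rewrite an http:// URL to https:// when the host is known to support it."""
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in HTTPS_CAPABLE:
        return urlunsplit(("https",) + tuple(parts)[1:])
    return url

print(upgrade("http://en.wikipedia.org/wiki/HTTPS"))
# -> https://en.wikipedia.org/wiki/HTTPS
```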

Why all the fuss about HTTPS? Well, every time you log in to a website through a plain HTTP connection, you expose your data to the world. It’s a bit like writing your username and password on a postcard and dropping it in the mailbox. Think of an HTTPS connection as an envelope to protect your postcard from prying eyes.

The problem gets a bit more complicated than just HTTPS though. Most sites already use HTTPS to handle your login info — that’s a good first step — but once you’re logged in sites often revert back to using an insecure HTTP connection.

So why doesn’t the entire web use HTTPS all the time? The answer is slightly complicated, but the primary reason is speed: encrypted responses generally can’t be cached by intermediary proxies and CDNs, which means pages may load slightly slower than they would over standard, insecure connections. For smaller sites, the cost of an HTTPS certificate is another hurdle. However, neither of those stumbling blocks has stopped Google, Facebook, Twitter, Wikipedia or the thousands of other sites large and small that now offer HTTPS connections.

The EFF is still a long way from its long-term goal of encrypting the entire web, but with more sites supporting HTTPS connections every day, the web is slowly but surely getting more secure.

File Under: Security, Web Services

Developer Quits OAuth 2.0 Spec, Calls It ‘a Bad Protocol’

After three years as lead author and editor of the OAuth 2.0 specification, Eran Hammer has stepped down from his role, withdrawn his name from the spec and even quit the OAuth working group completely, frustrated with what he now calls “a bad protocol.”

OAuth 2.0 is a rewrite of the original OAuth spec, which offers a secure way for users to grant third-party sites and apps access to their data without handing over their passwords. Google, Facebook, Twitter, and Yahoo are among the high-profile sites that have embraced OAuth in some fashion.

Unfortunately, according to Hammer, those same big names are at least partly responsible for making OAuth 2.0 the fiendishly complex and convoluted spec that it has become. Hammer is not the first to question the usefulness of OAuth 2.0. In fact, we’ve previously argued that OAuth 2.0’s complexity is hurting the spirit of API experimentation on the web.

Hammer isn’t just questioning OAuth 2.0, he’s abandoned it entirely and completely erased himself from the project, calling it “a bad protocol… bad enough that I no longer want to be associated with it.”

In Hammer’s view OAuth 2.0 is “more complex, less interoperable, less useful, more incomplete, and most importantly, less secure” than its 1.0 cousin.

The problem, according to Hammer, is the “enterprise” edge cases, which do nothing for the vast majority of developers other than make OAuth 2.0 more complex. As Hammer writes, “that is the enterprise way. The WS-* way. 2.0 provides a whole new frontier to sell consulting services and integration solutions.”

So what should you do? Well, as RSS developer Dave Winer says, “OAuth 1 is fine.” Indeed, OAuth 1.0 works and it’s much more accessible for smaller development teams — you don’t need Google’s engineering team to turn out a secure implementation of OAuth 1.0. Hammer has similar advice, writing, “if you are currently using 1.0 successfully, ignore 2.0. It offers no real value over 1.0.”
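To give a sense of how approachable OAuth 1.0 can be, here is a minimal sketch of a signed request using the third-party Python packages requests and requests_oauthlib; the endpoint URL and credential values are placeholders, not a real API.

```python
# A minimal sketch of an OAuth 1.0 signed request. The endpoint and all
# credential values are placeholders, not a real service.
import requests
from requests_oauthlib import OAuth1

auth = OAuth1(
    client_key="your-app-key",                # issued when you register your app
    client_secret="your-app-secret",
    resource_owner_key="user-token",          # granted once the user authorizes you
    resource_owner_secret="user-token-secret",
)

# Each request is signed (HMAC-SHA1 by default), so the user's password is
# never handed to, or stored by, the third-party application.
response = requests.get("https://api.example.com/1/account/verify.json", auth=auth)
print(response.status_code)
```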

Of course the departure of an editor doesn’t mean OAuth 2.0 is going away. It remains, like many other standards, under the auspices of the Internet Engineering Task Force (IETF), which also oversees protocols like SMTP and TCP/IP.