Archive for the ‘servers’ Category

File Under: servers

Apple Fails to Patch DNS ‘Cache Poisoning’ Attack

The previously hypothetical DNS cache poisoning bug you’ve no doubt heard about has made its way into the wild. That isn’t all that surprising given that there are no fewer than three publicly available exploits, which have been downloaded some ten thousand times.

What’s disturbing isn’t that the code is in the wild and potentially pointed at your DNS server. No, the problem is that, despite a concerted effort by vendors, there are still countless unpatched servers out there.

Apple in particular has failed to protect its users. Even the normally Apple-friendly TidBITS blog has called the company out for failing to patch its OS X Server software.

The really sad thing in Apple’s case is that the Internet Systems Consortium’s BIND DNS server, which is what OS X Server uses, was one of the first to receive a patch. Apple has simply declined to pass that patch on to its users, leaving them vulnerable to DNS cache poisoning and other attacks.

So how do you know if your ISP has patched its DNS server? Well, the short answer is you probably don’t. You could dig around and see if your ISP has made an announcement, or maybe call customer service (good luck with that).

Or you could just switch to a DNS service that you know is secure. It isn’t hard to do at all, and we’ve got a new OpenDNS tutorial to walk you through the few steps it takes to set up OpenDNS as your DNS server. OpenDNS isn’t affected by this latest bug, and as an added bonus it’s generally faster than what your ISP offers.
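If you’d rather make the switch by hand on a Unix-like machine, pointing your resolver at OpenDNS is a one-file edit. This is a minimal sketch: the two IP addresses are OpenDNS’s public resolvers, but the file location assumes a traditional resolver setup, and many operating systems and routers manage this file for you (in which case use their network settings instead):

```
# /etc/resolv.conf — use OpenDNS instead of the ISP's resolvers
nameserver 208.67.222.222
nameserver 208.67.220.220
```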

[via Slashdot]


Microsoft Dons Sheep Suit, Joins Open Source Foundation

Microsoft has decided to hide its teeth for a bit and has joined the Apache Software Foundation (ASF) in an attempt to convey some goodwill to the open source community. The Apache Software Foundation is one of the largest organizations in the open-source world, and Microsoft’s new platinum membership nets the ASF $100,000 annually.

Of course, there are some good reasons to question Microsoft’s motives. As Bruce Perens, a well-known open-source advocate, writes on Slashdot, “there’s much reason for caution.”

For instance, Microsoft once worked very hard to mount a legal attack designed to sue the entire open source world out of existence. Obviously that didn’t come to pass on a large scale, and Microsoft now appears to differentiate between open source (potentially benign to its interests) and Linux (a threat). Still, it isn’t hard to see why open source advocates remain wary of the company.

Perens writes:

Historically, Microsoft helped to fund SCO’s attack on Linux — we have court testimony under oath on that. They briefed HP on their plans to sue the developers of Sendmail, Linux and other programs — we have the HP memo, which HP admitted was real. Their agreement with Novell was calculated to break the spirit of the GPL without violating the letter, so they’ve shown they are happy to cheat the developer community when it’s to their advantage. More recently, they have cheated every way they could in getting Office Open XML through ISO, even having one of their executives pose as an officer of a national standards organization.

Perens suspects that Microsoft may be trying to buy its way into the open-source world in an effort to gain open-source credibility, so that when it comes time to cut deals with government agencies and large corporations, the company can pass itself off as open source friendly.

Their increased involvement in Open Source organizations means that they will be taken as a member of the Open Source community when they speak with national legislators. This is terrible for us, because it means they’ll be able to short-circuit our work to protect Open Source from software patents by speaking to government as an insider in our communities.

What do you think? Has Microsoft genuinely had a change of heart, or is this a classic case of one hand shaking with open source while the knife-wielding other hand sneaks around back?

[via Slashdot]


File Under: Programming, servers

Version Control Smackdown: Git vs Subversion vs Mercurial

Get three programmers into a discussion about which code versioning system they prefer, and you’re guaranteed to end up with some ruffled feathers. I mean, the choice is obvious, right?

The truth is that each of the most popular version control systems has its ups and downs. Git handles branches and merges well, and it’s very easy to set up a new repository. Subversion has the best set of client tools, while checkins and checkouts are faster (and smaller) with Mercurial.

Which source control system is the best? Cast your vote in the list below. If you’d like to nominate a new candidate for the title, scroll down to add it to the list. Want to argue your case? Leave a comment.



Gnip Makes Data Portability ‘Suck Less’

Data portability service Gnip (pronounced guh-nip), announced Tuesday, promises to make our Ajax-filled days even better. Developers have been scurrying for ways to efficiently consolidate and streamline data portability, and Gnip promises to do exactly that.

Web applications have typically relied on polling — requesting data over and over in cycles — to get their information. Gnip promises to change that model in favor of one that is more push-oriented. Translation: it means getting “live” data, like email or stock quotes, quicker than ever. Gnip sits in the middle as a broker, hosting the data and delivering it to clients asynchronously.
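The difference between the two models can be sketched in a few lines of code. To be clear, this is a toy illustration of polling versus push delivery in general, not Gnip’s actual API — every class and method name here is made up:

```python
# Toy contrast between poll-based and push-based (callback) delivery.
# All names here are hypothetical; this is not Gnip's API.

class FeedSource:
    """A data source that can be polled or can push to subscribers."""

    def __init__(self):
        self.items = []
        self.subscribers = []

    def publish(self, item):
        self.items.append(item)
        # Push model: notify every subscriber the moment data arrives.
        for callback in self.subscribers:
            callback(item)

    def poll(self):
        """Poll model: the caller asks repeatedly, often getting nothing new."""
        pending, self.items = self.items, []
        return pending

    def subscribe(self, callback):
        self.subscribers.append(callback)


source = FeedSource()
received = []
source.subscribe(received.append)  # push: no polling loop needed

source.publish("new stock quote")
print(received)  # the subscriber already has the item, with no waiting
```

The push model does the same work with far fewer requests: instead of every client hammering the source on a timer, the source fans data out once, when it actually changes.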

Even better, the service will cater to web developers’ preferences, delivering the data in the format of your choice: RSS, Atom, REST, XMPP and Comet are all promised to be supported.

Data providers and partners signed up so far include Digg, Flickr, Urban Dictionary, Del.icio.us and Six Apart. For now only the raw data is available; format translation and identity features will come later. Gnip has released client libraries for Perl, PHP, Java, Python and Ruby.

It’s awesome news for those in the business of making web applications. Retrieving information has historically been a taxing process for both servers and applications.

File Under: servers, Uncategorized

Search Engine Robots Agree Over Standards

How I would like to be a fly on the wall during Microsoft, Yahoo, and Google meetings like the one where they agreed upon the Robots Exclusion Protocol (REP). Do they trade barbs, quips and underhanded comments? In my imagination, the gathering is much like a Three Stooges episode.

However these meetings actually go, web developers reap the benefits when the three competitors agree, as evidenced by the announcement of a standard robots.txt protocol.

Microsoft, Yahoo, and Google each announced their involvement in the protocol over the past week along with documentation describing the protocol.

Search engines gather their information with small programs, or robots, that crawl the internet for content. When a robot finds a web server, it fetches the pages it’s allowed to reach, scans them, and categorizes them for inclusion in search results. Robots.txt is a file placed at the root of a web server that tells search engines which parts of the site they may access. If no robots.txt file is present, a robot assumes it may crawl everything on the site.

The REP standardizes how the robots.txt file is interpreted by search engines, giving web developers more control over privacy and over how their data appears in search results.
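A robots.txt file is simple enough to test from code. Python’s standard library ships a parser for the format; the sketch below uses a made-up set of rules (the domain and paths are hypothetical) to show how a well-behaved crawler decides what it may fetch:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block one directory, allow everything else.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved robot checks before fetching each URL.
print(parser.can_fetch("*", "http://example.com/private/report.html"))  # False
print(parser.can_fetch("*", "http://example.com/index.html"))           # True
```

This is exactly the check the REP standardizes: with every major search engine interpreting `Disallow` lines the same way, the answer a site owner intends is the answer every crawler computes.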

All parties benefit from the newly agreed-upon protocol because inconsistencies between search engines are erased. Now robots.txt files will be honored equally among the biggest search engines, and presumably by the rest of the web-crawling robot community.