Archive for the ‘servers’ Category

File Under: servers, Software & Tools

Opera MAMA Reports Web Standard Usage

Opera has announced the Metadata Analysis and Mining Application (MAMA), a search engine of sorts for web developers: instead of indexing pages by what they say, it indexes how they're built. The reports read like regular search results, but the focus is on things like the number of <font> tags still in use on the web, or the shocking fact that less than 5 percent of websites pass the W3C's validation test.

The wealth of data was culled from 3,509,180 URLs over 3,011,668 domains. All of this data will help you win geek bar fights over internet trivia questions like:

Q: What is the most popular web server on the internet?

A: Apache. Apache serves about 50.76 percent of the URLs surveyed, spread over 2,011,088 domains (67.72 percent of domains). Microsoft's IIS comes in second with 35.84 percent of URLs over 769,375 domains (25.91 percent).

Q: How many web developers are good enough to write code that passes W3C validation?

A: 145,009 out of 3,509,180 URLs passed validation — only 4.13 percent.

Q: Which country uses Ajax the least?

A: Japan showed the least usage of XMLHttpRequest, while Norway (Opera’s home country) exhibited the highest usage rates at 10.1 percent.

Otherwise, this is a great source of data to help drive standards forward. In many ways, standards bodies have been moving forward blindly, adding cool features as they're dreamed up. Perhaps with the plethora of data Opera provides on the web, decisions can be based on practical numbers.

You can't run searches of your own yet, but the key findings report is available on Opera's developer site, and it covers many of the most popular questions.

Luckily for us, Opera has offered to run some reports for Webmonkey.

So you tell us: What questions do you have for the MAMA oracle? We'll send them to Opera and post the answers later. Leave your questions in the comments.


File Under: servers

Apple Fails to Patch DNS ‘Cache Poisoning’ Attack

The previously hypothetical DNS cache-poisoning bug you've no doubt heard about has made its way into the wild. That isn't all that surprising, given that there are no fewer than three publicly available exploits, which have been downloaded some ten thousand times.

What's disturbing isn't that the code is in the wild and potentially aimed at your DNS server. No, the problem is that, despite a concerted effort by vendors, there are still countless unpatched servers out there.

Apple especially has failed to protect its users. Even the normally Apple-supportive Tidbits blog has called the company out for failing to patch its OS X Server software.

The really sad thing in Apple's case is that the Internet Systems Consortium's BIND DNS server, which is what OS X Server uses, was one of the first to have a patch available. Apple has simply declined to pass the patch on to its users, leaving them vulnerable to DNS cache poisoning and other attacks.
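For context, the fix vendors have been shipping is source-port randomization. A blind attacker has to forge a reply that matches the resolver's 16-bit transaction ID; randomizing the UDP source port as well multiplies the space of values that must be guessed. Here's a back-of-the-envelope sketch (the port-pool size is an illustrative assumption on our part, not a figure from any advisory):

```python
# Rough math on why source-port randomization matters.
# A blind spoofer must match every one of these values by luck.

id_space = 2 ** 16       # DNS transaction IDs are 16 bits
port_pool = 2 ** 14      # assume roughly 16,000 usable ephemeral ports

unpatched = id_space                 # fixed port: guess the ID alone
patched = id_space * port_pool       # patched: guess ID *and* source port

print(f"unpatched odds: 1 in {unpatched:,} per forged packet")
print(f"patched odds:   1 in {patched:,} per forged packet")
print(f"attack is {patched // unpatched:,}x harder after the patch")
```

In other words, the patch doesn't make the attack impossible, just thousands of times more expensive per attempt.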

So how do you know if your ISP has patched its DNS servers? Well, the short answer is you probably don't. You could dig around and see if your ISP has made an announcement. Or maybe call customer service (good luck with that).

Or you could just replace your DNS server with one you know is secure. It isn't hard to do at all, and we've got a new OpenDNS tutorial to walk you through the few steps it takes to set up OpenDNS as your DNS server. OpenDNS isn't affected by this latest bug, and as an added bonus it's generally faster than whatever your ISP uses.
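If you do make the switch, the whole change amounts to pointing your machine (or router) at OpenDNS's resolvers. On a Unix-like system that's two lines in /etc/resolv.conf; the addresses below are OpenDNS's published resolvers, and Windows users would enter the same addresses in their network adapter's DNS settings instead:

```
# /etc/resolv.conf
nameserver 208.67.222.222
nameserver 208.67.220.220
```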

[via Slashdot]


Microsoft Dons Sheep Suit, Joins Open Source Foundation

Microsoft has decided to hide its teeth for a bit and has joined the Apache Software Foundation (ASF) in an attempt to convey some goodwill to the open source community. The Apache Software Foundation is one of the largest organizations in the open-source world, and Microsoft's new platinum membership nets the ASF $100,000 annually.

Of course, there are some good reasons to question Microsoft's motives. As Bruce Perens, a well-known open-source advocate, writes on Slashdot, "there's much reason for caution."

For instance, Microsoft was once working very hard to mount a legal attack designed to sue the entire open source world out of existence. Obviously that didn't come to pass on a large scale, and Microsoft now appears to differentiate between open source (potentially benign to its interests) and Linux (a threat). Still, it isn't hard to see why open source advocates remain wary of the company.

Perens writes:

Historically, Microsoft helped to fund SCO’s attack on Linux — we have court testimony under oath on that. They briefed HP on their plans to sue the developers of Sendmail, Linux and other programs — we have the HP memo, which HP admitted was real. Their agreement with Novell was calculated to break the spirit of the GPL without violating the letter, so they’ve shown they are happy to cheat the developer community when it’s to their advantage. More recently, they have cheated every way they could in getting Office Open XML through ISO, even having one of their executives pose as officer of a national standards organization.

Perens suspects that Microsoft may be trying to buy its way into the open-source world in an effort to gain credibility, so that when it comes to deals with government agencies and large corporations, the company can pass itself off as open-source friendly.

Their increased involvement in Open Source organizations means that they will be taken as a member of the Open Source community when they speak with national legislators. This is terrible for us, because it means they’ll be able to short-circuit our work to protect Open Source from software patents by speaking to government as an insider in our communities.

What do you think? Has Microsoft genuinely had a change of heart, or is this a classic case of one hand shaking with open source while the knife-wielding other hand sneaks around back?

[via Slashdot]


File Under: Programming, servers

Version Control Smackdown: Git vs Subversion vs Mercurial

Get three programmers into a discussion about which code versioning system they prefer, and you’re guaranteed to end up with some ruffled feathers. I mean, the choice is obvious, right?

The truth is that each of the most popular version control systems has its ups and downs. Git handles branches and merges well, and it's very easy to set up a new repository. Subversion has the best set of client tools, while checkins and checkouts are faster (and smaller) with Mercurial.

Which source control system is the best? Cast your vote in the list below. If you’d like to nominate a new candidate for the title, scroll down to add it to the list. Want to argue your case? Leave a comment.



Gnip Makes Data Portability ‘Suck Less’

Data portability service Gnip (pronounced guh-nip), announced Tuesday, promises to make our Ajax-filled days even better. Developers have been scurrying for ways to efficiently consolidate and streamline data portability, and Gnip promises to do exactly that.

Web applications have typically relied on polling for data in cycles, requesting updates whether or not anything has changed. Gnip promises to change that model in favor of one that is more push-oriented. Translation: It means getting "live" data, like email or stock quotes, quicker than ever. Gnip sits in the middle as a hosted broker, collecting data from providers and pushing it out to clients asynchronously.
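The difference between the two models is easy to sketch. The snippet below is hypothetical plumbing of our own, not Gnip's actual API: the poll function has to come back and ask the source for anything new on every cycle, while the push path registers a callback once and gets handed each event as it happens.

```python
# Poll vs. push, in miniature. DataSource stands in for a
# provider like Flickr or Digg; nothing here is Gnip's real API.

class DataSource:
    def __init__(self):
        self.events = []        # everything ever published
        self.subscribers = []   # callbacks for the push model

    def publish(self, event):
        self.events.append(event)
        for notify in self.subscribers:
            notify(event)       # push: subscribers hear about it immediately

def poll(source, seen):
    """Poll model: ask for anything after position `seen`, even if empty."""
    return source.events[seen:], len(source.events)

source = DataSource()

# Push: register once, then just receive.
received = []
source.subscribers.append(received.append)

source.publish("new photo")
source.publish("new comment")

# Poll: the client only learns about data when it comes back to ask.
pending, seen = poll(source, 0)

print(received)   # pushed events arrived without any asking
print(pending)    # the poller sees the same data, but only on its schedule
```

The payoff of the push model is that nobody burns requests on empty cycles, which is exactly the waste Gnip is aiming at.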

Even better, the service will cater to web developers' preferences, translating the data into the transport format of your choice. RSS, Atom, REST, XMPP and Comet are all promised to be supported.

Data providers and partners signed up so far include Digg, Flickr, Urban Dictionary and Six Apart. For now the raw data is flowing, while format translation and identification features will come later. Gnip has provided libraries for Perl, PHP, Java, Python and Ruby.

It's awesome news for those in the business of making web applications. Retrieving information has historically been a taxing process for both servers and applications.