For developers using Flash, integrating stats has been a chore. Coders had whipped up solutions previously, but now there’s an official package released jointly by Google and Adobe at Monday’s MAX Conference.
Flash and Flex developers can use the components and libraries to track pageviews and events. In a video (embedded below) Sprout’s Matthew McNeely shows how to use Flash analytics with singer Pink’s widget creator as an example.
Several developers helped create the new ActionScript API for Google Analytics. The code is open source under the Apache 2 license, so the less-than-casual coder can dig even deeper.
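The library's two core calls boil down to counting two kinds of hits: pageviews and events. The real API is ActionScript, but the bookkeeping it does can be sketched language-neutrally in Python (all names here are illustrative, not the actual gaforflash API):

```python
# Minimal sketch of pageview/event tracking semantics.
# Hypothetical names -- the real Google Analytics for Flash
# library is ActionScript, not Python.
from collections import Counter

class Tracker:
    def __init__(self, account_id):
        self.account_id = account_id   # e.g. "UA-XXXXXX-X"
        self.pageviews = Counter()     # virtual page path -> count
        self.events = Counter()        # (category, action) -> count

    def track_pageview(self, path):
        # Flash apps have no real page loads, so a "virtual" path
        # is reported for each screen or state the user reaches.
        self.pageviews[path] += 1

    def track_event(self, category, action):
        # Events capture interactions that aren't pageviews:
        # video plays, button clicks, widget embeds, and so on.
        self.events[(category, action)] += 1

tracker = Tracker("UA-XXXXXX-X")
tracker.track_pageview("/widget/home")
tracker.track_event("video", "play")
```

The distinction matters for Flash content in particular: a widget or video player may never trigger a page load, so everything interesting about it has to be reported as virtual pageviews or events.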
Web analytics, mining visitor data for trends, has been around for about fifteen years. We web workers have attempted to find good ways to convert the numbers to something that resembles user engagement, with varying success. Now that videos are a major part of the web, new ways of measuring streams are emerging. The nascency of video analytics serves as a reminder that we still don’t have standard web stats completely figured out.
In the late ’90s, “hits” became a popular term, despite not being very useful from an analytics standpoint. “Hits” refers to the number of requests a server receives, including those for images, CSS files and other assets. The term does little to help us compare sites to one another, because the number of requests per page can vary wildly.
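To see why hits inflate so easily, count the requests in a toy access log (made-up entries): every asset fetched is a hit, but only the HTML request is a pageview.

```python
# Toy access log: every request one visitor triggers by loading
# a single page. Each asset request is another "hit."
requests = [
    "/article.html",      # the page itself: the one pageview
    "/styles/main.css",
    "/img/header.png",
    "/img/photo1.jpg",
    "/img/photo2.jpg",
    "/js/app.js",
]

hits = len(requests)  # every request counts as a hit
pageviews = sum(1 for path in requests if path.endswith(".html"))

print(hits, pageviews)  # 6 hits for a single pageview
```

A page with a dozen images reports twice the hits of a lean one for the same audience, which is exactly why the metric tells you so little.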
Technical publisher O’Reilly asks, are streams the new hits? Different sites have different definitions of a stream. For some, it means an entire video. For others, it’s a segment of video. That makes comparing a site’s streams about as useful as comparing its hits.
We’ve known that hits are a bad metric for some time, but what about other ways to analyze web traffic?
Pageviews are probably the most popular means to evaluate traffic, as well as dole out ad dollars. Much as video can be split into multiple streams, text articles are often separated into several pages. So, our beloved pageviews aren’t necessarily the best gauge of user engagement.
Visits are another common web metric, but they don’t take into account what users do while on the site. Viewing one page and immediately closing the browser counts the same as spending an hour clicking through the archives. Time per visit and pageviews per visit attempt to rectify that problem, though high values for either could also be a sign of user confusion.
Bounce rate has recently become a popular sign of a user’s lack of engagement. If a user views only a single page during a visit, they’ve “bounced.” For some types of sites, such as those selling products, this might be a good metric. Information sites might be unfairly docked if a user quickly checks a page for the latest news and then closes the window. Following a link to an external site from the first page also counts as a bounce. That means if you search Google from a toolbar and find exactly what you need on the first page you visit, you just bounced.
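All of these visit-level metrics fall out of the same session data. A sketch with made-up sessions, where each inner list is the pages seen during one visit:

```python
# Made-up sessions: each inner list is the sequence of pages
# viewed during one visit.
sessions = [
    ["/home"],                          # one page, then gone: a bounce
    ["/home", "/archive", "/post/42"],  # an engaged visit
    ["/post/7"],                        # quick news check: also a bounce
    ["/home", "/about"],
]

visits = len(sessions)
pageviews = sum(len(s) for s in sessions)
pageviews_per_visit = pageviews / visits
bounces = sum(1 for s in sessions if len(s) == 1)  # single-page visits
bounce_rate = bounces / visits

print(visits, pageviews_per_visit, bounce_rate)
# 4 visits, 1.75 pages per visit, 50% bounce rate
```

Note how the “quick news check” visit is indistinguishable, in the data, from a visitor who gave up immediately; the numbers alone can’t tell a satisfied reader from a frustrated one.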
Any metric probably has its downsides. The same will be true of video analytics, as we search for the right data. Recently Google examined brain waves to find new ways to evaluate user response to overlay ads in YouTube videos. Even though click rates are likely abysmal, the company is searching for proof that the ads are effective.
I don’t know the details behind the methods that Google calls “more technologically sensitive.” My guess is these video metrics, like the examples above, also have their problems. For example, if the brain reacts to an ad, it could just as likely be caused by the user wondering what showed up as it could by the user being engaged by the ad.
Just because metrics are imperfect doesn’t mean we shouldn’t use them at all. Both striving for better measures and being aware of the downsides of current ones are important. We’ll continue using less-than-perfect numbers because they’re better than nothing and, in some cases, all we have.
Opera has announced the Metadata Analysis and Mining Application (MAMA), a search engine aimed at web developers that reports on how markup is actually used across the web. Basically, the reports are regular search results, but focused on things like the number of <font> tags used on the web, or the shocking fact that fewer than 5 percent of websites pass the W3C’s validation test.
The wealth of data was culled from 3,509,180 URLs across 3,011,668 domains. All of it will help you win geek bar fights over internet trivia questions like:
Q: What is the most popular web server on the internet?
A: Apache, which serves about 50.76 percent of URLs, spanning 2,011,088 domains (67.72 percent). IIS follows at 35.84 percent of URLs, over 769,375 domains (25.91 percent).
Q: How many web developers are good enough to write code that passes W3C validation?
A: 145,009 out of 3,509,180 URLs passed validation — only 4.13 percent.
Q: Which country uses Ajax the least?
A: Japan showed the least usage of XMLHttpRequest, while Norway (Opera’s home country) exhibited the highest usage rates at 10.1 percent.
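That validation figure is easy to double-check from the raw counts in the report:

```python
# Raw counts from Opera's MAMA crawl, as reported above.
urls_crawled = 3_509_180
urls_passed_validation = 145_009

pass_rate = urls_passed_validation / urls_crawled * 100
print(round(pass_rate, 2))  # 4.13
```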
Otherwise, this is a great source of data to help drive standards forward. In many ways, standards bodies have been moving ahead blindly, adding cool features as they’re developed. Perhaps with the plethora of data Opera provides, decisions can be based on practical numbers.
Running searches of your own isn’t possible yet, but the key findings report is available on Opera’s developer site. It covers many of the most popular questions.
Luckily for us, Opera has offered to run some reports for Webmonkey.
So tell us: what questions do you have for the MAMA oracle? We’ll send them to Opera and post the answers later. Leave your questions in the comments.
Yahoo unveiled a free web analytics service for its e-commerce customers and others who work closely with Yahoo. Though it is not yet widely available, the pressure of Google’s similar offering means we’ll all likely have access to it eventually.
Though I haven’t used either IndexTools or its reborn version, it appears to be a powerful tool. Real-time reporting is one thing that sets it apart; Google Analytics is more useful for looking at yesterday’s traffic than today’s. Yahoo also claims to store raw traffic data rather than an aggregated version, which means you can slice and dice historical data any way you want.
“Yahoo! Web Analytics will be made available to a wide range of Yahoo!’s customers, partners, developers and advertisers in stages throughout the rest of 2008 and into 2009.”
If you don’t fit into one of those groups, you might try setting up a small search marketing campaign, which is likely the cheapest way to get invited. When Google first rolled out Analytics, an AdWords account was required.
There’s also a form to receive updates at the Web Analytics page.
Google Trends added a new website layer to its search analytics tool Friday. The new feature graphs the number of daily unique visitors to a website and compares it to other sites, offering a window into the highs and lows of web traffic.
The new layer can graph up to five different websites and displays results as far back as May 2007. It’s especially useful for spotting notable web events: for example, Webmonkey’s graph shows a spike when our site relaunched on May 19.
Included with the report is a “Regions” table showing a geographical rundown of where visitors are from. “Also Visited” and “Also Searched For” tables allow a peek into visitors’ surfing habits outside of the targeted sites. The data in all three tables covers the last 30 days.
“Trends for Websites combines information from a variety of sources, such as aggregated Google search data, aggregated opt-in anonymous Google Analytics data, opt-in consumer panel data, and other third-party market research.”
Despite typical Google vagueness, the data promises to represent a very wide stretch of web traffic, considering the amount pouring through Google search, e-mail and toolbar products — not to mention the sites that host Google Analytics code and contribute anonymous statistics. Google promises this data is truly anonymous and used only “to calibrate macro-level insights.”
The feature moves Google Trends into the space currently occupied by Compete, Alexa and Comscore. Given the reach of Google services and products, it’s a good guess that Google’s data will be more robust than the web-traffic monitoring methods of its competitors.