John Jacobsen works for the IceCube telescope project, the world’s largest neutrino detector, located at the South Pole. The project’s mission is to search for the elusive sub-atomic particles generated by violent astrophysical events: “exploding stars, gamma ray bursts, and cataclysmic phenomena involving black holes and neutron stars,” according to the project website.
Jacobsen is one of the people in charge of handling the massive amounts of data collected by IceCube. In the video, shot this week at the O’Reilly OSCON 2010 conference in Portland, Oregon, Jacobsen explains how the project collects a terabyte of raw data per hour, then sends everything to IceCube’s remote research and backup facilities over a finicky satellite hook-up.
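A terabyte per hour is easier to appreciate as a sustained bandwidth figure. A rough back-of-envelope sketch (the terabyte-per-hour rate is from the article; treating a terabyte as a decimal 10^12 bytes is an assumption):

```python
# Back-of-envelope: sustained bandwidth implied by 1 TB of raw data per hour.
# Assumes a decimal terabyte (10**12 bytes), not a tebibyte.
TB = 10**12                      # bytes
bytes_per_hour = 1 * TB
bits_per_second = bytes_per_hour * 8 / 3600
print(f"{bits_per_second / 1e9:.2f} Gbit/s sustained")  # ~2.22 Gbit/s
```

That is far more than a polar satellite window can carry continuously, which is why heavy on-site filtering and physical backups matter at the Pole.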
Antarctica is one of the least accommodating places on Earth to perform scientific research with computers. It’s the driest spot on the planet — atmospheric humidity hovers around zero — and bursts of static electricity threaten the integrity of IceCube’s data stores. The same dry air causes the server clusters’ cooling systems to break down. And when hardware fails, a replacement can take six months to arrive.