How DNA could store all the world’s data

[…] The latest experiment signals that interest in using DNA as a storage medium is surging far beyond genomics: the whole world is facing a data crunch. Counting everything from astronomical images and journal articles to YouTube videos, the global digital archive will hit an estimated 44 trillion gigabytes (GB) by 2020, a tenfold increase over 2013. By 2040, if everything were stored for instant access in, say, the flash memory chips used in memory sticks, the archive would consume 10–100 times the expected supply of microchip-grade silicon.

That is one reason why permanent archives of rarely accessed data currently rely on old-fashioned magnetic tapes. This medium packs in information much more densely than silicon can, but is much slower to read. Yet even that approach is becoming unsustainable, says David Markowitz, a computational neuroscientist at the US Intelligence Advanced Research Projects Activity (IARPA) in Washington DC. It is possible to imagine a data centre holding an exabyte (one billion gigabytes) on tape drives, he says. But such a centre would require US$1 billion over 10 years to build and maintain, as well as hundreds of megawatts of power. “Molecular data storage has the potential to reduce all of those requirements by up to three orders of magnitude,” says Markowitz. If information could be packaged as densely as it is in the genes of the bacterium Escherichia coli, the world’s storage needs could be met by about a kilogram of DNA.
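The kilogram-scale claim can be sanity-checked with a back-of-envelope calculation. The sketch below is illustrative, not the article's own arithmetic: it assumes an average DNA base pair weighs about 650 daltons and encodes 2 bits (one of four bases per position), ignoring the redundancy and error-correction overhead that real encoding schemes add.

```python
# Rough check of how much DNA the 44-trillion-GB archive would need.
# Assumptions (not from the article): ~650 Da per base pair, 2 bits per base.
AVOGADRO = 6.022e23            # molecules per mole
BP_MASS_G = 650 / AVOGADRO     # mass of one base pair, in grams
BITS_PER_BP = 2                # four bases -> 2 bits per position

bytes_per_gram = BITS_PER_BP / BP_MASS_G / 8
archive_bytes = 44e12 * 1e9    # 44 trillion gigabytes, per the article

grams_needed = archive_bytes / bytes_per_gram
print(f"{bytes_per_gram:.1e} bytes per gram")
print(f"archive fits in about {grams_needed:.0f} g of DNA")
```

At this theoretical density the whole archive fits in a few hundred grams; once practical encoding overhead and redundancy are included, the article's figure of "about a kilogram" is the right order of magnitude.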



FiveWordsForTheFuture - Sep 7, 2016 | Computing, Information Technology