Hyunjun Park

Preparing for the DNA computation paradigm shift


In “The Structure of Scientific Revolutions,” physicist and philosopher Thomas Kuhn introduced the concept of a paradigm shift, which he used to describe a fundamental change in the basic framework of thinking in natural sciences. Throughout history, however, such paradigm shifts have occurred not just in natural sciences but across the entire spectrum of human endeavor, providing solutions to problems that appeared to be insurmountable under the old paradigm.

The field of data storage and computation is a case in point. As demand for data creation, retention, and computation has only ever grown, the current computing paradigm requires enterprises to continuously build data centers the size of football fields, and nuclear power plants to power them. The real issue is not whether we have the resources and capabilities to keep building such facilities quickly enough, indefinitely; it is that the current computing paradigm is simply not compatible with a scalable solution.

DNA-based data storage and computation breaks from the old framework and offers a scalable, sustainable path forward. TechRadar reported on one company, Biomemory, that recently announced an offering that lets consumers have messages stored in DNA and shipped to them on a credit card-sized card. The DNA Data Storage Alliance also recently announced specifications covering recommended approaches for storing data in DNA.

The growing cost of AI

AI delivers innovation at a rate and pace the world has never experienced, but it comes at a substantial cost. AI generates enormous volumes of data, and machine learning models are expensive to train and maintain.

Last summer, it was reported that keeping OpenAI’s ChatGPT up and running costs more than $700,000 a day. More recently, TechRadar reported that Sam Altman is seeking up to a whopping $7 trillion to boost the industry’s capacity to produce the microprocessors needed for AI workloads.

These exorbitant costs point to a limitation of the current computation paradigm.

While the advent of the microprocessor and its exponential development over the decades is largely responsible for the world as we know it today, the basic von Neumann architecture surrounding the microprocessor hasn’t changed much since World War II. And it is this architecture, this computing paradigm, that is increasingly at odds with the ever-growing demand for data storage and computation.

DNA computation: A paradigm shift

Our cells are DNA-based computers; together they form our bodies, collectively processing trillions of operations in parallel on very little energy. Scientists have mimicked this, using synthetic DNA to store and compute digital data in laboratory settings.

Compared with existing microprocessor technologies, which process workloads serially, a significant benefit of DNA computation platforms is their ability to use enzymes or DNA probes to compute in a massively parallel fashion.

Imagine mixing a container of blue liquid with a container of red liquid. The result of this computation, a new color, appears not by serially mixing each color molecule one at a time but by mixing all of them together at once. Just as in this thought experiment, DNA computation is performed in a massively parallel manner directly on the data, without the data having to travel to a memory bank or processor to be processed.
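As a loose software analogy, and only an analogy, the Python sketch below contrasts the two styles. The arrays stand in for "containers" of molecules; nothing here models real DNA chemistry, and the values are invented for illustration.

import numpy as np

# Two "containers" of a million color molecules each (hypothetical values).
blue = np.full(1_000_000, 0x0000FF)
red = np.full(1_000_000, 0xFF0000)

# Serial, von Neumann-style: each pair of molecules visits the CPU in turn.
mixed_serial = [b | r for b, r in zip(blue, red)]

# Parallel, "mix the containers": one operation across all elements at once.
mixed_parallel = blue | red

assert int(mixed_parallel[0]) == 0xFF00FF  # purple either way; only the route differs

The parallel line is, of course, still executed by conventional silicon; the point is the model it expresses, where the operation is applied to the data where it sits rather than shuttling each item to a processor.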

Potential DNA computation application areas

DNA-based computation has the potential to generate insights from data sets that are simply out of reach for existing computers. Early application areas include search, signal processing, and machine learning.

One practical example is satellite imagery of the entire surface of the Earth. We’ll soon have decades’ worth of images taken every second of every day. Given that volume of data, a simple search using conventional technology could become prohibitively expensive; with the data stored in DNA, however, a single query probe can interrogate every strand in the pool at once, making the search as simple as running a COVID test.
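To make that concrete, here is a minimal, hypothetical simulation in Python of hybridization-style lookup: each image tile is tagged with a short synthetic DNA barcode, and the query is a probe built as the reverse complement of the target barcode, which in a real reaction would anneal to every matching strand simultaneously. The barcodes and file names are invented for illustration.

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(seq: str) -> str:
    # The strand that would hybridize (bind) to `seq` in solution.
    return "".join(COMPLEMENT[base] for base in reversed(seq))

# Hypothetical archive: satellite tiles keyed by synthetic DNA barcodes.
archive = {
    "ATCGGTAC": "tile_2031_pacific.png",
    "GGCATTCA": "tile_2031_sahara.png",
    "TTGACCGA": "tile_2032_amazon.png",
}

# One probe is synthesized per query; chemically, it would check every
# strand in the pool at once rather than scanning records one by one.
probe = reverse_complement("GGCATTCA")

hits = [name for barcode, name in archive.items()
        if barcode == reverse_complement(probe)]
print(hits)  # ['tile_2031_sahara.png']

In a physical system the loop above disappears: hybridization happens to all strands in parallel, which is why the search step stays cheap even as the archive grows.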

Other expected areas of early application are artificial intelligence, machine learning, data analytics, and secure computing. In addition, initial use cases are expected to include fraud detection in financial services, image processing for defect discovery in manufacturing, and digital signal processing in the energy sector.

Borrowing heavily from natural processes and cutting-edge synthetic biology tools, automated and scalable DNA-based computation platforms are free of the limitations of traditional electronic systems. Beyond parallelization, they offer low energy consumption, a small physical footprint, and secure computing.


This article was produced as part of TechRadar Pro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
