Why next-generation computers may not be based on silicon

Several years ago, I had the opportunity to visit Peterhof Palace in St. Petersburg, Russia. For those who might not be familiar with the palace, its signature feature is the more than 150 fountains that adorn the property. You can see some of those fountains in my photo below. The truly remarkable thing about these fountains is that they were created long before the invention of electric pumps. The fountains are powered solely by gravity, similarly to the way that the locks work in the older section of the Panama Canal.

Fountains at Peterhof Palace. Photo: Brien Posey

As amazing as the fountains are, looking at the various pipes and valves took my mind somewhere completely unexpected. I began to realize that it would be theoretically possible to construct a crude computing device that ran not on electricity, but on water. I’m not talking about using water as some sort of battery, but rather creating logic “circuits” from pipes and valves.

Think about it for a moment. The transistors inside a microprocessor act as tiny switches, either allowing or blocking the flow of electrons depending on the input they receive. Pipes and valves could be used to do something similar with water, although in a much more simplistic way. Rather than controlling the flow of electrons, you could control the flow of water and achieve a similar (albeit far less sophisticated) result. I’m not saying that such a computer would be practical or efficient, only that it would be possible to build such a machine.
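
To make the analogy concrete, here is a minimal Python sketch (a purely conceptual model, not an actual fluidic design) that treats each valve as either open (True) or closed (False): two valves placed in series along one pipe behave like an AND gate, two parallel pipes feeding a common outlet behave like an OR gate, and a diverter arrangement can play the role of NOT.

```python
# Toy model of "water logic": an open valve passes flow (True),
# a closed valve blocks it (False). This only illustrates the idea;
# it is not a design for a real fluidic computer.


def and_gate(valve_a: bool, valve_b: bool) -> bool:
    """Two valves in series: water reaches the outlet only if both are open."""
    return valve_a and valve_b


def or_gate(valve_a: bool, valve_b: bool) -> bool:
    """Two parallel pipes into one outlet: water flows if either valve is open."""
    return valve_a or valve_b


def not_gate(valve: bool) -> bool:
    """A diverter: water appears at the output only when the input valve is closed."""
    return not valve


if __name__ == "__main__":
    # Print the truth table for the series (AND) arrangement.
    for a in (False, True):
        for b in (False, True):
            print(f"valves open: {a}, {b} -> water at outlet: {and_gate(a, b)}")
```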

Of course, I am not the first person to consider the use of alternative materials in a computing device. For decades there has been talk of replacing silicon-based chips with chips made from gallium arsenide.

This idea has been around forever. My first writing gig involved covering Windows 95, which was brand new at the time. The first email that I ever received from a reader asked me if I knew how Windows 95 was similar to gallium arsenide. I can’t remember my response, but I received a reply telling me that both Windows and gallium arsenide are probably good for something, but nobody can figure out what it is.

Gallium arsenide holds the potential to make chips that are significantly faster than the ones we have today, because electrons flow much more quickly through gallium arsenide than they do through silicon. The problem, however, is that it is extraordinarily difficult to manufacture chips out of gallium arsenide. Not the least of the manufacturing challenges is the fact that a key component of gallium arsenide is arsenic, a highly poisonous substance.

Strangely enough, the discussion of building logic circuits out of alternative materials such as gallium arsenide (or water) may be a moot point. The next-generation computers of the future seem as though they are going to have almost no resemblance to the computers that we use today. At least that seems to be the case for supercomputers and other systems that handle extremely heavy computational workloads.

Quantum computing

At last year’s Ignite conference, Microsoft surprised attendees by revealing a prototype quantum computer, shown in the photo below. A quantum computer is a great example of a nontraditional computer that is not based on the use of silicon. It’s so nontraditional, in fact, that the way it works tends to be a little bit nebulous (although that’s the whole point).

Microsoft’s quantum computer prototype. Photo: Brien Posey

Before I explain what quantum computing is, I want to quickly answer the question that I receive most frequently. Just after Microsoft’s announcement last year, I had several people ask me if they should hold off on buying new hardware until the quantum computer becomes available.

The quantum computer is probably still a few years away from being ready for production use. Even then, it will be cost-prohibitive for most organizations. To the best of my knowledge, Microsoft has not discussed cost, but a price tag in the hundreds of millions of dollars is not implausible. More than likely, those who have tasks that are well suited for a quantum computer will lease time on a computing device that is owned by Microsoft or by a large research university. Incidentally, there are other companies, including Google, that are also working to develop quantum computers.

Quantum computers are absolutely nothing like normal computers and do not rely on transistor-based microprocessors. Instead, a quantum computer works by altering the state of subatomic particles and using that state as a representation of a data value.

To really explain quantum computing, I would have to delve into a full-blown discussion of quantum physics. I am guessing that if I did that, then most people would click away without reading the rest of the article. That being the case, I’m not going to go into a deep granular discussion of how quantum computing works. However, there are two basic concepts that you should be familiar with.

First, quantum computers are not based on the use of binary code. Binary code is used in today’s computers because transistors support two states (reflected as 0 and 1). Quantum computers are not based on transistors, but rather on quantum bits, or qubits. Thanks to superposition, a quantum bit is not limited to being either a 0 or a 1; it can exist in a combination of both states at the same time.

The other thing that you need to understand is that a quantum bit can represent far more information than a binary bit. One hundred quantum bits can theoretically exist in a superposition of 2^100 (roughly 1.27 x 10^30) states at once. More importantly, however, the way that quantum bits work makes massive parallelism inherent to the machine. This means that mind-bogglingly large calculations could conceivably be performed in a fraction of a second, using what amounts to a single “CPU cycle.”
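
For readers who like a bit of notation, here is a quick sketch of the standard way this is written: a single qubit is a weighted combination (superposition) of the two basis states, and a register of n qubits is described by 2^n complex amplitudes, which is where the enormous figure above comes from.

```latex
% A single qubit is a superposition of the two basis states:
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
% A register of 100 qubits is described by 2^100 amplitudes:
\[
  \lvert \Psi \rangle = \sum_{x \in \{0,1\}^{100}} c_{x}\, \lvert x \rangle,
  \qquad 2^{100} \approx 1.27 \times 10^{30}
\]
```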

DNA-based data storage

Another way in which researchers are moving away from traditional computing is the work being done on using DNA molecules for data storage. DNA molecules, when properly engineered, can store vast amounts of data. Currently, a single gram of synthetic DNA can hold 215 petabytes of data!

As you might expect, however, there are a few problems with using DNA for data storage. One problem that I have yet to hear anyone discuss is that DNA has a limited shelf life. Like other organic molecules, DNA molecules begin to break down over time. Cryogenics is currently being used to preserve DNA molecules, but even that has its limits. DNA stored at -20 degrees Celsius can be preserved for several months, while reducing the temperature to -80 degrees Celsius can extend that to a few years.

The other problem with DNA-based data storage is that using it is slow and tedious. After all, data isn’t being written to magnetic or optical media. DNA-based storage requires arranging the individual nucleotide bases (A, C, G, and T) that make up a DNA strand. Reading the data requires translating that sequence of bases back into zeros and ones. In other words, the DNA has to be sequenced in order for the data to be read. This is difficult enough when reading data sequentially, but random data access presents a monumental challenge.
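
To make the idea a bit more concrete, here is a deliberately simplified Python sketch of the encoding step, mapping each pair of bits to one of the four bases. Real schemes used in published research add error correction, avoid troublesome sequences, and differ considerably in the details, so treat this purely as a conceptual illustration rather than anyone’s actual method.

```python
# Simplified illustration of encoding binary data as DNA bases (two bits per base).
# Real DNA storage schemes add error correction, avoid long runs of the same base,
# and respect synthesis constraints; this is only a conceptual sketch.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}


def encode(data: bytes) -> str:
    """Convert bytes into a DNA sequence, two bits per base."""
    bit_string = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bit_string[i:i + 2]]
                   for i in range(0, len(bit_string), 2))


def decode(sequence: str) -> bytes:
    """Convert a DNA sequence back into the original bytes."""
    bit_string = "".join(BASE_TO_BITS[base] for base in sequence)
    return bytes(int(bit_string[i:i + 8], 2)
                 for i in range(0, len(bit_string), 8))


if __name__ == "__main__":
    message = b"Hi"
    strand = encode(message)   # "CAGACGGC"
    assert decode(strand) == message
    print(strand)
```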

Microsoft Research has achieved random data access by leveraging primers that operate at the molecular level, but even with that, the difficulties of working with DNA have kept storage capacities modest. As of February 2018, Microsoft had managed to encode and retrieve 400MB of data using DNA.

Pushing the boundaries

I think that it is really exciting to see the technology industry pushing the boundaries of what is known to be possible and exploring alternative forms of computation and storage. While we might never have a quantum computer or a DNA hard drive sitting on our desks, such technologies will undoubtedly one day solve global problems whose solutions have thus far eluded humankind.

Featured image: Shutterstock

2 thoughts on “Why next-generation computers may not be based on silicon”

  1. Nice article, though your paragraph about DNA having a limited shelf life is rather incorrect. There are numerous examples of DNA being extracted from fossils, and I am not talking about science fiction such as Jurassic Park.

    The oldest DNA ever sequenced comes from a 700,000-year-old ancestor of the horse (https://www.nature.com/articles/nature12323). That is what I would call a pretty durable “hard drive.”

    In addition, DNA does not have to be stored at -80C or -20C as you mentioned. Under dark and dry storage conditions (for example, in a closed, uncontaminated vial), it can survive thousands of years.

  2. Thanks for the insight, Ferdinand. Based on everything that I have read, DNA can be extracted from fossils, but the DNA degrades over time. One of the reasons why nobody has been able to clone a woolly mammoth yet is that each time DNA has been extracted from fossilized or frozen mammoth tissue, some of the genetic material has degraded to the point of being unusable.
