A new age of computing is on the horizon as Moore’s Law comes to an end

Moore’s Law is the observation that the number of transistors in a dense integrated circuit doubles roughly every two years. However, we are reaching the end of an era, as the rule is becoming obsolete.
With computer processors needing to become more complex yet more compact, a revolution in their technology may be on the cards. Photo: Pexels
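To put the doubling in perspective, here is a minimal Python sketch of the rule. The 1971 Intel 4004 and its 2,300 transistors serve as an illustrative baseline of our own choosing, not a figure from this article.

```python
# A minimal sketch of Moore's Law as a doubling every two years.
# The 1971 Intel 4004 baseline (2,300 transistors) is an assumed
# illustrative starting point, not data from this article.

def projected_transistors(year: int, base_year: int = 1971,
                          base_count: int = 2_300) -> int:
    """Project a transistor count assuming a doubling every two years."""
    doublings = (year - base_year) / 2
    return int(base_count * 2 ** doublings)

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,} transistors")
```

Run over five decades, the projection climbs from thousands of transistors to tens of billions, which is roughly the trajectory the chart below depicts.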

The law could be broken

This law is named after Gordon Moore, the co-founder of Intel. Technology has expanded at an extraordinary rate over the last few decades, and engineers are already squeezing every last nanometer of advancement out of silicon chips.

Initially, the doubling was great for the economics of the computer industry: it meant that equipment became more compact yet more powerful and cheaper.

However, these transistors are now becoming too tiny to produce efficiently. As demand for smaller chips grows, there is more pressure to come up with viable solutions, yet keeping up with the ever-changing market could cost billions of dollars.

The number of transistors on integrated circuit chips has multiplied year on year. Photo: ourworldindata.org via Wikimedia Commons

Change is coming

Hargreaves Lansdown shared a report by The Telegraph commenting on the industry shift. Intel itself postponed the launch of its 10-nanometer chip (a nanometer is a millionth of a millimeter) from 2016 to this year, because the tech giant struggled to get the chips working effectively.

We could see Moore’s Law be completely overturned by the middle of the next decade. Rising costs and the physical limits of transistors will force this change. It is estimated that building the machines needed to develop these minuscule parts will cost $10 billion.

Stephen Furber, Professor of Computer Engineering at the University of Manchester, says that the benefits of shrinking transistors further will diminish over the coming years.

“Below 20nm transistors cease to get more efficient or more cost-effective as you continue to shrink them,” he said, as reported by The Telegraph.

“Sure, they get smaller, and you can fit more on a chip, but the other historic benefits of shrinkage no longer apply.”

Engineers will be busy trying to find new ways to develop computer chips heading into the 2020s. Photo: Needpix

A new solution is needed

Therefore, new forms of technology have to be considered to match the needs of the market. Furber believes combining artificial intelligence with brain science could be a way to deliver on future demand.

“The explosive developments over the last 15 years in machine learning and AI have been paralleled by developments in more brain-inspired approaches; neuromorphics,” he said.

“Although developed primarily for brain science, there is growing interest in commercial applications of neuromorphics, though nothing compelling yet. If and when there is a breakthrough in our understanding of how the brain works, this should unleash another huge leap forward in machine learning and AI.”

Quantum computing may offer a solution to this issue. Quantum processors use qubits rather than the bits that standard processors work with.

Unlike bits, which can have a value of either zero or one, qubits can be both at the same time. Google recently claimed that it had achieved quantum supremacy thanks to its Sycamore processor. Microsoft has also stated that it is close to achieving the same feat.
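To make superposition concrete, here is a minimal Python sketch using NumPy. It models a single qubit as a textbook two-element state vector; this is our own illustration, not anything specific to Sycamore.

```python
import numpy as np

# A single qubit modelled as a two-element complex state vector.
# |0> and |1> are the basis states; a Hadamard gate puts the qubit
# into an equal superposition of both at once.

ket0 = np.array([1, 0], dtype=complex)           # the classical bit 0
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = hadamard @ ket0                          # equal superposition
probabilities = np.abs(state) ** 2               # Born rule

print("amplitudes:", state)                      # ~[0.707, 0.707]
print("P(0), P(1):", probabilities)              # [0.5, 0.5]
```

Measuring the qubit collapses it to zero or one with the probabilities shown, which is why quantum algorithms must steer amplitude toward correct answers before reading out.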

However, quantum computing is still in its infancy, and its practical applications remain speculative. Nonetheless, new forms of computing will be needed if the industry continues to head in the direction that it is.

What do you think would be the best way for computing to adapt now that Moore’s Law is coming to an end? Let us know your thoughts in the comment section.
