
Moore’s Law Has an Estimated End Date

Monday, March 14th, 2022

Moore’s law has been remarkably prophetic, but nothing lasts forever. When Gordon Moore predicted in 1965 that the number of transistors on a dense integrated circuit would double roughly every two years, it seemed like a very aggressive forecast. Fifty-seven years later, it has largely come to pass, with computing power doubling on about that cadence. Now, though, the tried-and-true prediction seems to be approaching its end, so we thought we’d look at how technology will continue growing once Moore’s law actually becomes obsolete.

A Background on Moore’s Law

After making his famous prediction, Moore went on to co-found Intel, which, if you know anything about computers, is probably the most important name in semiconductor technology of the past 50 years. Intel’s first microprocessor, the 1971 Intel 4004, had 2,300 transistors; today’s microprocessors have billions. It goes to show that his prediction was right on point.
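
To see just how on point it was, here is a quick back-of-the-envelope sketch in Python. The two-year doubling cadence and the 1971 starting point are our illustrative assumptions, not figures the original prediction pinned down:

# Back-of-the-envelope check of Moore's law: start from the Intel 4004's
# 2,300 transistors (1971) and double the count every two years.
def projected_transistors(start_count, start_year, year, doubling_years=2.0):
    """Transistor count projected by simple exponential doubling."""
    doublings = (year - start_year) / doubling_years
    return start_count * 2 ** doublings

print(f"{projected_transistors(2_300, 1971, 2021):,.0f}")
# ~77,175,193,600 -- tens of billions, the same order of magnitude as the
# largest chips actually shipping fifty years later.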

Why is Moore’s Law Staring at Obsolescence?

The fact is that today’s microprocessors already pack in about as much computing capability as physics allows, and the limit ultimately ties back to the speed of light. That’s right, the speed of light. The speed of light is finite and constant, and since computing is, at bottom, electrons moving through matter, it caps the number of computations a single transistor can handle and the rate at which bits (the unit by which traditional computing is measured) can flow. You cannot supersede the speed of light, so you cannot build computation that moves faster than the physical universe allows. Physicist James R. Powell has run the calculations and figures that Moore’s Law will be obsolete by 2036.
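
To make that ceiling concrete, here is a minimal back-of-the-envelope sketch. The 1 cm die width is our assumption, and this illustrates the general signal-propagation limit rather than reproducing Powell’s actual calculation:

# Light is the hard ceiling on how fast any signal can move, and it takes
# a measurable time just to cross a chip -- which bounds how fast logic
# spanning the whole die can be clocked.
SPEED_OF_LIGHT = 299_792_458        # meters per second, exact
chip_width = 0.01                   # assume a 1 cm die for illustration

transit_time = chip_width / SPEED_OF_LIGHT
print(f"Light crossing 1 cm takes {transit_time * 1e12:.1f} ps")
print(f"Cross-chip clock ceiling: {1 / transit_time / 1e9:.0f} GHz")
# ~33.4 ps and ~30 GHz. Real chips run well below this, since electrical
# signals in silicon travel slower than light and every gate adds delay.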

Couple those hardware limitations with other hurdles, such as cooling these microprocessors and the spiraling cost of fabricating ever-faster chips with billions upon billions of transistors, and it seems we have already begun to see the end of consistent growth in computing speeds.

What Happens Then?

If one thing is certain, it’s that humans will keep pressing the issue; far too many industries depend on continued computing progress. As the end of what is physically possible with microprocessors draws closer, you will see growth in what is called quantum computing.

Quantum computers compute with qubits (quantum bits) and exploit two quantum effects, superposition and entanglement. This lets them sidestep the miniaturization problems of traditional computing and, for certain classes of problems, deliver answers in minutes that would take a state-of-the-art 5 nm microprocessor decades.
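
For the curious, here is a minimal sketch of what those two effects look like, simulated classically with Python and NumPy. This is a toy illustration, not real quantum hardware or any particular quantum SDK:

import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

# Superposition: H|0> puts one qubit in an equal mix of |0> and |1>.
superposed = H @ ket0
print(np.round(np.abs(superposed) ** 2, 3))   # [0.5 0.5] measurement odds

# Entanglement: a CNOT gate after the Hadamard produces a Bell state,
# in which the two qubits' measurement outcomes are perfectly correlated.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(superposed, ket0)       # (|00> + |11>) / sqrt(2)
print(np.round(np.abs(bell) ** 2, 3))         # [0.5 0.  0.  0.5]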

With processing-intensive applications such as AI becoming more relevant throughout many industries, the continued innovation of computers is a sure thing; it’s just going to have to come about in forms that don’t look like the computers we use each day.

What are your thoughts about the long-term innovation of the computer? Do you think that the end of the microprocessor as we know it will come in the next quarter century? Leave your thoughts in the comments section below and stop back to our blog soon for more great technology content.

