We are very dependent on technology. It’s okay to admit it; this is a safe space. We promise. Even if you prefer to do things the more traditional way, it is hard to escape the allure, convenience, and security of using technology to navigate our lives.
Whether or not you have fully embraced modern technology, it is important to appreciate the steps that got us here. It took a lot of time, money, and intelligent minds to bring us to where we are today. Let’s take a moment to look at a couple of technological milestones, both past and present.
A tree fell, but no one heard
The Wall Street Journal put it best: ‘The world changed on Nov. 15, 1971, and hardly anyone noticed.’ On that day, a few brilliant minds came together to build what would pave the way to our current success: the microprocessor.
This past week marks the 50th anniversary of the literal building block of technology, the Intel 4004 microprocessor.
Frustrated with designing different chips for each project, engineers Federico Faggin, Stanley Mazor, and Ted Hoff came up with the idea of designing a more programmable chip that could be used in a variety of different products. Thus, the 4004 was born.
As the years went on, faster and more capable microprocessors were produced, leading to today, where most modern technology runs on 64-bit microprocessors.
They are everywhere!
Moore’s law states that roughly every two years, the number of transistors per chip doubles. So far, we have more or less kept pace with that trajectory. However, we can expect that pace to taper off over time, not because of any shortcoming of the microprocessor itself, but because of advances in adjacent technologies and software. Modern advancements let us make more efficient use of today’s microprocessors, so we can expect to need fewer of them in the future.
After all, microprocessors are everywhere. Our phones, computers, TVs, and even our cars rely on a decent number of processors to work correctly.
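If you’re curious what “doubling every two years” really adds up to, here’s a rough back-of-the-envelope sketch in Python. It starts from the Intel 4004’s 2,300 transistors and simply doubles that count every two years; treat it as an idealized illustration of Moore’s law, not a claim about any specific chip.

```python
# Back-of-the-envelope Moore's-law projection: transistor count doubling
# roughly every two years, starting from the Intel 4004's 2,300 transistors.
start_year = 1971
start_transistors = 2_300  # Intel 4004

def projected_transistors(year: int) -> int:
    """Transistor count if the total doubled every two years since 1971."""
    doublings = (year - start_year) / 2
    return int(start_transistors * 2 ** doublings)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{projected_transistors(year):,}")
```

Run it and the 2021 figure lands in the tens of billions, which is roughly the right order of magnitude for the largest chips shipping today, even though real-world progress has never been quite this tidy.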
It’s not rocket science; it’s harder
Do you ever wonder what insanely impressive kind of background you would need for IBM to let you touch their quantum computers? Heck, even just looking at photos of their newly minted, largest-ever quantum computer makes us feel like we are going to break it.
Even with all the explanations the internet has to offer, understanding the details behind quantum computing can be a lot to take in. IBM has a great interactive site detailing different aspects of quantum computing and why it is important to the future of technology. Most importantly, it doesn’t make your brain hurt while you’re reading it!
In a nutshell, quantum computing is the science of building computing technology that can tackle complex problems today’s supercomputers are unable to solve.
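If you’d like a small taste of what makes it different, the core idea is the qubit, which can hold a blend of 0 and 1 at the same time until it is measured. The tiny Python sketch below simulates that behaviour on an ordinary computer (no quantum hardware involved): it puts one simulated qubit into an equal superposition and “measures” it a thousand times.

```python
import numpy as np

# A single qubit starting in the |0> state, written as a 2-element vector.
state = np.array([1.0, 0.0])

# The Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
state = H @ state

# Measurement probabilities are the squared amplitudes: ~50% for 0, ~50% for 1.
probs = np.abs(state) ** 2

# Simulate 1,000 measurements of this one qubit.
rng = np.random.default_rng(seed=42)
outcomes = rng.choice([0, 1], size=1000, p=probs)
print(np.bincount(outcomes))  # roughly 500 zeros and 500 ones
```

Real quantum computers get their power from running many qubits in superposition and entangling them, which is exactly what a classical simulation like this one cannot do at scale.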
As we move into this new age of quantum computing, we can expect it to follow a similar path to the microprocessor. IBM’s quantum computer is physically large, not too dissimilar from the room-sized computers of the 1950s and 1960s. Eventually, we will all have quantum computers in our pockets, just like the smartphones that were nearly unimaginable only a few decades ago.
Keeping our fingers on the pulse
We think you can probably guess that we don’t dabble in quantum computing at Vodigy. However, we are always keeping our finger on the pulse of new and emerging technologies. We strive to provide our customers with the latest and most reliable in business technology.
Is your SMB ready to Unleash the Power of IT with Vodigy? Give us a shout!