Going to high school in the late 1980s, most of my friends and I adopted a personal style that more or less mimicked all of the heavy metal videos that MTV used to play at the time (Mötley Crüe, Judas Priest, Guns N’ Roses, Iron Maiden, etc.). I suppose it was inevitable, but there came a point when we all started getting tattoos. I ended up with a cobra tattooed on my shoulder. The design fit right in with the heavy metal image, but the snake wasn’t my first choice. My original idea was to get a tattoo that made it look as though the skin had been ripped off of my forearm. Rather than exposing bone and tissue, however, I wanted the imaginary wound to reveal a circuit board covered in microchips, kind of like what might be found inside of an android.
I abandoned that particular idea because a close friend talked me out of it. He wisely explained that while my idea might look really cool at first, the tattoo would not age well. He told me to imagine what the tattoo would look like if I had gotten it 10 years earlier. Our discussion took place somewhere around 1988, so just imagine a tattoo like the one I described, but depicting technology from 1978. Some electronic devices still in use in the late 1970s relied on vacuum tubes (look it up, kids) rather than transistors or microchips. My friend’s point was that a 1980s microchip would eventually look just as ridiculously outdated as a vacuum tube.
I am bringing up something as random as a tattoo selection from 30 years ago because the reasoning behind my decision not to get the “android tattoo” suddenly seems relevant once again. I have been reading quite a bit lately about how digital implants could end up being the next phase of the IoT revolution. I’m not talking about something as simple as an implanted RFID chip, but rather surgically implanted digital technology that augments the human body’s natural capabilities.
I recently read an academic paper that discussed the possibility of replacing human eyes with bionic eyes. The idea of using electronics to create synthetic vision isn’t new; electronic retinal implants have been used to give sight to the blind since 2013. Even so, this particular paper took electronic sight to the next level. It talked about introducing capabilities such as a digital zoom, a DVR (so that you could replay events from your life), and even text overlays that could be used for things like language translation, GPS-style navigation, and facial recognition.
I have to admit that as I read the paper, I found the idea of having intelligent synthetic vision to be extremely enticing. I immediately began thinking of how similar technologies had been portrayed in sci-fi movies such as RoboCop. But then reality began to set in.
As I started to think about what it would be like to have digitally augmented vision, I realized that RoboCop had been released in 1987, and that technology (and even our ideas about what might eventually be possible) has changed substantially since then. A modern, state-of-the-art digital vision augmentation would probably be less like what was depicted in RoboCop and more like having a surgically implanted HoloLens 2. As cool as that may sound, however, one simply cannot ignore the issue of technological obsolescence. After all, there was a time when Google Glass was considered state of the art, but it is laughable by today’s standards. Just imagine spending the rest of your life with Google Glass surgically implanted in your eyes, and you will begin to see why implanted digital enhancements might be problematic.

Although device obsolescence is bound to become a problem eventually, there are ways it might be held off for a time. For example, the device might be designed with upgradable firmware and equipped with more memory and processing power than it initially needs. That could help to futureproof the device for a while. However, digitally implanted devices have another problem that is far greater than mere obsolescence.
If a digitally implanted bioelectronic device that is designed to augment human capabilities is going to be truly useful, it will need to support Internet connectivity. At that point, a person who has such a device implanted essentially becomes a human IoT device. This is such an important consideration because connectivity opens the door to a myriad of security problems. It also makes it possible for the device to be abused by its manufacturer.
The biggest problem with having digital technology surgically implanted is the potential for the device to be hacked. Any device that supports wireless connectivity can theoretically be hacked, and this holds true even if the device is not connected to the Internet. For example, it was recently revealed that Medtronic’s heart defibrillator implants can be hacked from about 20 feet away. Although these devices are not Internet connected, they do support wireless communications that allow clinicians to configure and monitor them. If something as simple as an internal defibrillator can be hacked, just imagine what a hacker might be able to do to a more sophisticated, Internet-connected device. After all, IoT devices are not exactly known for strong security.
The other big problem with receiving a digital implant is that such devices carry a huge potential for abuse. The vast majority of today’s connected devices are known to spy on their users in some way. This spying can range from something as benign as tracking the websites you visit to something as invasive as listening to your private conversations. So with that in mind, just imagine the potential for spying if such a device were implanted into your body.
Bioelectronic device manufacturers could eventually decide to monetize device use as a way of increasing revenue. The monetization of products seems to be a major trend in the tech industry, and we have all seen ads pop up in unexpected and occasionally inappropriate places. I’m not just talking about the ads that are sometimes displayed on the Windows Start menu. I recently overheard a conversation in which someone was expressing frustration at having popup ads appear while they were trying to read an online Bible.
So with that in mind, consider the potential consequences of having digitally augmented vision. An unscrupulous vendor might decide that it is a good idea to stream ads directly into your eyes, and you might not even have a way of skipping them.
Worse yet, a manufacturer might use AI to figure out what you have been looking at in the last hour (items on a store shelf, the sticker on a new car, peeling paint on your home, etc.) and serve up ads based on those observations.
There is still a part of me that thinks that it would be really cool to have digitally augmented vision. I would love to have mega-zoom capabilities built into my eyes, or to be able to overlay GPS directions over the real world. Even so, there are some major practicality, security, and ethical issues that will have to be addressed before these types of capabilities can ever see mainstream acceptance.