As far as technology goes, this was probably bound to happen sooner rather than later, so I wasn’t too surprised to read earlier this week about Epicenter, a Swedish startup hub that offers to implant its workers and startup members with microchips the size of grains of rice. The chips function as swipe cards — opening doors and operating printers, for instance — and are injected into employees’ hands, between the thumb and index finger.


“The biggest benefit, I think, is convenience,” Patrick Mesterton, co-founder and CEO of Epicenter, says in an Associated Press story. As a demonstration for a reporter, he unlocked a door simply by waving his hand near it. “It basically replaces a lot of things you have, other communication devices, whether it be credit cards or keys.”


Epicenter, which houses more than 300 startups and innovation labs for larger companies, began implanting workers in January 2015, and about 150 workers now carry the microchips, AP reports. A company based in Belgium also offers its employees such implants, and there are isolated cases around the world where tech enthusiasts have tried the technology on their own. The implants have become so popular at Epicenter that workers stage monthly events where attendees have the option of being “chipped” for free.


The technology itself isn’t new: outside of the supply chain, the microchips are perhaps best known as the “chips” injected into pet dogs and cats as virtual dog tags for tracking lost pets. The implants use Near Field Communication (NFC), the same technology found in contactless credit cards and mobile payments. When activated by a reader a few inches away, a small amount of data flows between the two devices via electromagnetic waves. The implants are passive, meaning they have no built-in power supply, and while they hold information that other devices can read, they cannot read anything themselves.
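To make that “small amount of data” concrete: NFC tags conventionally store their payload as NDEF (NFC Data Exchange Format) records, which a reader decodes after the exchange. The sketch below parses a minimal single short NDEF Text record; the example bytes and the “door-42” payload are purely illustrative and have nothing to do with Epicenter’s actual chips, which the article describes only as key-card replacements.

```python
def parse_ndef_text_record(data: bytes) -> str:
    """Parse one short NDEF well-known Text record and return its text."""
    header = data[0]
    tnf = header & 0x07                  # Type Name Format: 0x01 = NFC Forum well-known type
    short_record = bool(header & 0x10)   # SR flag: payload length fits in a single byte
    if tnf != 0x01 or not short_record:
        raise ValueError("expected a short well-known record")
    type_len = data[1]
    payload_len = data[2]                # one byte, because SR is set
    type_field = data[3:3 + type_len]
    if type_field != b"T":               # "T" marks an NDEF Text record
        raise ValueError("not a Text record")
    payload = data[3 + type_len:3 + type_len + payload_len]
    lang_len = payload[0] & 0x3F         # low 6 bits of the status byte = language-code length
    return payload[1 + lang_len:].decode("utf-8")

# Illustrative record: header 0xD1 (MB|ME|SR set, TNF=1), type "T",
# payload = status byte 0x02 + "en" language code + the text itself.
record = bytes([0xD1, 0x01, 0x0A]) + b"T" + b"\x02en" + b"door-42"
print(parse_ndef_text_record(record))  # prints "door-42"
```

The asymmetry the article describes falls out of this design: the tag is a dumb byte store energized by the reader’s field, so all the parsing intelligence lives on the reader’s side.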


As one would expect, there are significant privacy concerns when it comes to “chipping” humans. In an interview with the Washington Post this week, however, Mesterton emphasized that the technology doesn’t allow for any kind of worker monitoring by employers.


“It doesn’t even carry that ability. It’s exactly the same as if you would use a single key card,” Mesterton wrote in an email to the Washington Post. “If a person is worried about being traced, their mobile phone or Internet search history poses a bigger threat than the chip we use ever would do.”


Nonetheless, ethical and privacy issues would certainly become a bigger concern if future microchips gain more functionality and organizations embrace the technology more broadly. That’s one reason management consultants say such chips are unlikely to show up in American workplaces anytime soon.


Michael Chui, a partner with the McKinsey Global Institute who leads its research on the impact of long-term technology trends, told the Washington Post that while there is “a broad awareness for the technical ability for this to happen,” right now there is “zero interest in actually doing it.” For starters, the business case isn’t very strong, since “smart badges” and biometric scanners can already do much of the same work. Furthermore, he adds, “there is a general creep factor about it.”


Then again, some people do believe the technology could be tried in the U.S. within a few years, at least as the next iteration of the biometric scanners used to log people in and out of work sites. Fingerprint and hand scans are already increasingly popular as a way to avoid lost or forgotten badges, prevent workers from “buddy punching” the timeclock, secure the transfer of particularly sensitive information, and tighten facility security.


What are your thoughts on microchipping humans? Do you see it gaining widespread adoption anytime soon or do you think privacy and ethical concerns will stop its development?