Sept 2019 Forbes
There is much excitement surrounding the field of brain-computer interfaces (BCI). Take, for example, recent headline-grabbing announcements from Neuralink, founded by Elon Musk, which has the long-term goal of helping to “secure humanity’s future as a civilisation relative to AI”. Then, there is Facebook’s development of wearable technology that hopes to achieve “hands-free communication without saying a word”.
While there is no guarantee that telepathy will ever exist, equally there is no guarantee that it will not. Meanwhile, companies and organisations are making tremendous advances, and we can expect more effective and widespread use of BCIs as they become more sophisticated. Hands-free control of computers, and entering data using the brain alone, represent a turning point for a number of industries, and are now seen as probable rather than improbable.
BCIs, also known as neural interfaces, connect the brain or nervous system to equipment such as digital devices or IT systems. Interfaces placed inside the brain, or body, are known as internal, invasive or implanted technologies, as opposed to external, non-invasive or wearable devices.
As immersive technology continues to advance, we will see the interfaces we use to link the physical and digital worlds virtually disappear. Currently, we associate these experiences with cumbersome headsets or interactive touchscreens. In the future, our environments, clothes, or even contact lenses will be the gateway to a new reality. It’s also possible that our very brainwaves could provide the commands needed to let AI-driven immersive systems know what we want.
However, BCIs are not just the future—they are the here and now. In fact, the basic building blocks of neural interfaces have been around for years (we already have brain-controlled artificial limbs) and with the amount of investment that neural technology is receiving, innovation is accelerating.
Neural interfaces are already widely used in medicine. The cochlear implant is the most extensively used internal interface today; worn by some 400,000 people worldwide, it allows users to experience hearing despite damage to the cochlea or inner ear. Other sensory implants, such as retinal and vestibular implants, are at a much earlier stage of development.
On the external side, one of the most mature interfaces, functional electrical stimulation (FES), has been around since the 1960s and helps people recover motor function. Outside of medicine, external interfaces are increasingly being used to play games, control equipment and enhance memory, concentration and physical performance. Beyond research, there are no known cases of internal devices being used for non-medical purposes.
In areas other than medicine, a number of forward-thinking companies are already looking at BCIs, and the gaming world is making significant advances. For example, Valve is exploring the use of brain-computer interfaces to create adaptive gameplay that responds to the emotions or ability of the player, which could be achieved by placing electroencephalography (EEG) sensors in VR headsets. EEG sensors record the brain's electrical signals and are one of the most widely used external interfaces. Historically they have been used in medicine and research, but other industries are now showing interest. Automotive companies, for example, have used EEG alongside behavioural signs of drowsy driving, such as eye-blink rate and yawning, because the electric fields produced by brain activity are a highly effective physiological indicator for assessing vigilance states.
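To make the idea concrete: one common research approach to EEG-based vigilance monitoring compares power in the theta band (associated with drowsiness) against the alpha band. The sketch below is purely illustrative, using synthetic signals rather than real EEG data, and is not the method used by any company mentioned in this article; the function names and thresholds are assumptions for the example.

```python
import numpy as np
from scipy.signal import welch

def band_power(freqs, psd, lo, hi):
    """Sum power spectral density over a frequency band (relative units)."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

def drowsiness_index(eeg, fs=256):
    """Theta/alpha power ratio; higher values often accompany drowsiness."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    theta = band_power(freqs, psd, 4.0, 8.0)    # theta band: 4-8 Hz
    alpha = band_power(freqs, psd, 8.0, 13.0)   # alpha band: 8-13 Hz
    return theta / alpha

# Synthetic single-channel signals: a 10 Hz (alpha-dominant) "alert" trace
# and a 6 Hz (theta-dominant) "drowsy" trace, each with added noise.
fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
alert = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
drowsy = np.sin(2 * np.pi * 6 * t) + 0.2 * rng.standard_normal(t.size)

print(drowsiness_index(alert, fs) < drowsiness_index(drowsy, fs))  # True
```

A production system would work on multi-channel, artifact-filtered recordings and typically combine the EEG features with the behavioural cues mentioned above, but the band-power ratio captures the core signal-processing idea.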
Trimble and Neurable have partnered to explore the use of brain-computer interfaces for the transportation and AEC industries. The two companies share a vision of using neurotechnology to support digital transformation by providing a bi-directional feedback loop, driving increased safety and productivity. Trimble and Neurable will leverage biosignals, such as brain activity combined with eye-tracking technology, to improve training efficiency, driver safety and high-risk front-line worker safety, as well as provide insights that augment the benefits of simulation and design evaluation.
Speaking with Aviad Almagor, Senior Director of Mixed Reality and BCI at Trimble, we learn that the company is exploring the use of biofeedback to identify and capture client experience during the design evaluation workflow. “The multimodal biofeedback approach fuses virtual reality (VR), electroencephalogram (EEG) and eye-tracking to provide insight into human response and enrich designers’ understanding of the potential impact of their work. The suggested solution enables quantification of the experience as part of an evidence-based design workflow.”
For Trimble, the benefits of using BCI are clear: “Using BCI as part of an evidence-based design process can help designers better understand the impact of their work, define target experiences, and design for affordances to support the occupants’ productivity and wellbeing.”
Looking to the future, Almagor said, “The disruption we should be preparing for is fusion-based; integration of ubiquitous computing, XR, BCI, and AI. This disruption will completely merge digital with the physical, transform the nature of our experiences and the way we perceive and interact with the world.”
Even though BCIs are rapidly progressing, the brain is extraordinarily complex, and many implants require open-brain surgery. Then there are the ethical questions: access to people’s thoughts could constitute an abuse of human rights, and there is already evidence that neural interfaces can be hacked.
Despite the issues, the development of interfaces that allow us to combine the intricacies of human thought with the processing might of AI is an amazing advance for humankind. We just need to make sure the technology is used in the right way, that controls are in place, and that a regulatory framework manages its impact.