In April 2016, Google announced it was starting an incubator called Area 120, its artisanal take on Y Combinator. Berent and Owens applied and got rejected, but they were pointed to X, Alphabet’s “moonshot” division, which takes on riskier, longer-term projects than Area 120. X picked up the project to supercharge sleep, and Owens started running it full-time. Berent stayed in the ads division but devoted some of his time to the project.
One of their first efforts was to launch a study with Phyllis Zee, a well-known neurologist at Northwestern University. They committed $500,000 to the experiment, in which they tried sending audio signals to earphone-wearing subjects to boost the slow waves of deeper sleep. That’s when they hit their first snag: Some participants responded as they’d hoped but others not at all, and they couldn’t figure out why.
Thinking again about the earphones from their sleep study, Berent wondered if he might be better off trying to collect brain data from the ear. That would help him observe not just sleep, but perhaps everything happening inside our heads. He discovered that a Georgia Tech professor—who, coincidentally, was the technical lead and manager of Google Glass—was working along those lines. The researcher put him in touch with United Sciences, where Konstantin Borodin was doing laser-guided earbud fittings. That company had tried to build a system to perform EEGs through the ear. It had even launched a Kickstarter campaign. But the product never shipped, and the company abandoned the effort.
Berent got in touch and arranged to get fitted for the device himself. Naturally, he tried to test it in his sleep, even though the ear molds were made of an uncomfortable hard plastic. To his delight, he was able to get some measurable brain data. Berent quickly made a deal with the company. Now the ad exec turned brain hacker was on the hook to somehow make it work.
AN EEG IS a finicky thing. In a gold-standard setup, a person’s scalp is covered in many electrodes smeared with a gooey gel to cut down on electrical noise. Once pasted to a person’s head, the electrodes can detect when huge cohorts of neurons fire together, producing signals in different frequency bands. That’s how an EEG can reveal roughly what the brain is up to—various frequencies correlate with stages of sleep, rest, or intense focus. It wasn’t obvious that Berent could do all that with only two electrodes (and no conductive goo). So he flew out to Atlanta to get some expert opinions.
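That frequency-band bookkeeping is standard signal processing. Here is a minimal Python sketch of the idea, estimating power in the classic EEG bands from a single channel; the sampling rate, band edges, and synthetic signal are assumptions for illustration, not anything from Berent's actual setup.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical single-channel, 30-second EEG trace sampled at 250 Hz.
fs = 250
t = np.arange(0, 30, 1 / fs)
# Stand-in signal: a 10 Hz "alpha" rhythm buried in noise (amplitudes in volts).
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 10e-6 * np.random.randn(t.size)

# The classic EEG frequency bands, in Hz.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

# Estimate the power spectral density, then sum it within each band.
freqs, psd = welch(eeg, fs=fs, nperseg=4 * fs)
df = freqs[1] - freqs[0]
for name, (lo, hi) in BANDS.items():
    band_power = psd[(freqs >= lo) & (freqs < hi)].sum() * df
    print(f"{name:>5}: {band_power:.2e} V^2")
```

Mapping those per-band powers onto labels like deep sleep or intense focus is where the hard part begins, especially with just two dry electrodes.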
Along with a team from United Sciences, Berent and a small group of renowned neurologists crowded into a tiny examination room at the Brain Health Center at Emory University. The head of the center, Allan Levey, was excited at the prospect of ear EEGs. “We know about our blood pressure, cholesterol, and respiratory system,” Levey says. “But the most important organ is our brain. We don’t assess that systematically.” He figured patients could get better care if they were also tracking the electrical activity inside the skull.

Levey had lured some colleagues to get fitted for earbuds with him; one professor had literally written the textbook on EEGs. But some of the scientists were skeptical. They weren’t convinced that the tiny sensors in earbuds could pick up the relatively weak electrical brain signals. If they could, though, the payoff could be huge, allowing for persistent and portable measurements. “The problem was squeezing in all the electronics that would make it work,” says Dan Winkel, an epilepsy researcher who participated in the demo.
The Emory scientists inserted their custom buds, closed their eyes … and thought. Then they turned to a computer monitor to see what data the buds had captured. “All of a sudden, the line begins to travel across the screen,” Winkel recalls—just as it would with a normal EEG setup. “I was pretty shocked, as were most of the people in the room.”
Levey told Berent that if he could eventually match the quality of a true EEG, he’d be on to something—a sort of Apple Watch for the brain. But, he added, the earbuds could be immediately put to use on an important problem: monitoring epilepsy.
Observing a seizure is a critical step in treatment, both to assess the efficacy of drugs and to predict when the next one might strike, yet there is no easy, noninvasive way to do it. A patient might spend up to a week under observation at a hospital or get electrodes surgically implanted in the brain. The latter approach is expensive and painful. But by studying individuals who have undergone it, scientists have identified patterns of brain activity that seem to predict an impending seizure. With that kind of weather forecast for the brain, patients can better plan their lives, choosing not to get behind the wheel or climb tall ladders.
Berent left Atlanta feeling optimistic. A few months later, he decided to take a three-month transfer—a bungee, in Google parlance—to work full-time for X. But just as he arrived, the sleep project got axed.

Owens quickly moved to another team. Berent, however, had to scramble to remain at X. Somehow, he had to quickly pick up the pieces of his project and make a new case for himself. In February 2018, he met with one of X’s top moonshotters, John “Ivo” Stivoric, to see if he could salvage his dream of ear EEGs. But Stivoric was more interested in a brain device that could control a computer. Such a project would fit into an existing X initiative called Intent OS, which was exploring the future of how humans and computers might interact. Perhaps the earbuds could reveal what a person was focusing on. Or provide other data useful for controlling a computer or augmented reality display. Berent was game, and the new project was dubbed Heimdallr, after the Norse god who used his keen eyesight and hearing to watch for invaders. His teammates started conducting an experiment on how they might use the earbuds to refocus a person’s attention. It involved streaming two audiobooks simultaneously, one in each ear.
Berent, however, was still obsessing over the idea of replicating medical-quality EEGs. He and his team had to figure out how to amplify more distant signals to make up for the fact that they had only two electrodes. The United Sciences prototype wasn’t quite up to snuff; it couldn’t pick up alpha waves, which occur during both sleep and wakeful periods. The X’er also had to miniaturize the electronics of a traditional EEG to fit inside the two buds.
Berent felt that with Google’s knowledge, equipment, and talent, these tasks were possible. He also had on hand 5,000 ear scans from United Sciences, which revealed that it was critical to create a tight seal—to filter out electrical noise that could erode the brain signals. He had to improve on United Sciences’ hard plastic molds. While casting about, Berent discovered a product called Tecticoat, a super-pliable, conductive coating. When he put it on the buds, suddenly the brain waves they collected became far sharper, and the earbuds far more comfortable. (Berent eventually acquired the intellectual property related to the polymer.)
Impatient with the rate of progress, Berent one day grabbed a lead from a $50,000 portable EEG machine, smeared some gel on it, and jammed it into his ear. To his relief, the electrode registered alpha waves—now he just had to make the same thing happen with his buds. A more definitive clinical test came months later, when a Heimdallr prototype performed roughly on par with an EEG.
Stivoric, who’d been skeptical of Berent’s obsession, was impressed. “One of the worst sensors in the world is an EEG sensor—there’s environmental noise, surface noise, motion of the body, and so forth,” Stivoric says. “I thought, OK, it shouldn’t work. But it does work. These signals are showing up. How is this even possible?”
On October 18, 2019, Berent took a meeting with Google’s chief economist to discuss the privacy implications of reading people’s brain waves. A few minutes in, Berent began feeling poorly. He looked at his Apple Watch, which informed him that he could be in atrial fibrillation. Berent went to the hospital for tests, and a few days later, he underwent a cardiac version of a reboot, where his heart was stopped and restarted. The experience made Berent view his work differently. To hell with Intent OS. He now realized that all he wanted was to build a device that could do for his brain what his watch had done for his heart.
ON NOVEMBER 8, 2019, Jen Dwyer was working at her desk on the third floor of Moonshot Central, inside a converted shopping mall. Dwyer, who is the team’s medical director, holds a doctorate in computational neuroscience and a medical degree, and she joined Berent’s project because of a deep-seated interest in sleep and epilepsy. “I just got really fascinated with the electrophysiological waveform,” she says, calling it “mesmerizing and beautiful.”

She opened up a file of patient data from an earbud study she’d set up at Emory, under Winkel’s supervision. As one person’s brain waves marched across her screen, a pattern caught her eye. At first the lines on the chart were neatly spaced out and undulating. “Then, all of a sudden—boom,” she says. The lines started to jump wildly, as if the calm waters of the EEG had surged into an angry sea. It was the signature of a seizure—the first time ear monitoring had detected one. The subject, who had been sleeping, probably never knew anything had happened. But both the earbuds and the implanted electrodes confirmed the event. “We all gave each other high-fives,” says Berent. “This was what we really needed.” As the study progressed, the earbuds would log more seizures, picking up 16 of the 17 that the implanted electrodes detected.
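At its crudest, the surge Dwyer saw is an amplitude anomaly. One way to express that signature in code is a threshold against a running baseline. This Python sketch is purely illustrative: the window size and threshold ratio are invented, and it is nothing like the detection pipeline NextSense or Emory actually used, which would rely on far richer features and trained classifiers.

```python
import numpy as np

def flag_amplitude_surges(eeg, fs, window_s=2.0, ratio=5.0):
    """Return onset times (in seconds) of windows whose RMS amplitude
    jumps far above the recording's overall baseline.

    A deliberately crude stand-in: real seizure detectors use spectral
    features, line length, and machine-learned models, not one threshold.
    """
    win = int(window_s * fs)
    n = len(eeg) // win
    # Root-mean-square amplitude of each non-overlapping window.
    rms = np.array([np.sqrt(np.mean(eeg[i * win:(i + 1) * win] ** 2))
                    for i in range(n)])
    baseline = np.median(rms)  # robust estimate of the "calm sea"
    return np.flatnonzero(rms > ratio * baseline) * window_s
```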
But Heimdallr was in trouble. It was still an awkward fit at X. In June 2020, Berent learned that X would stop funding the project. So he spun out an independent company. He worked out a deal where X got a stake in the new firm in exchange for the intellectual property. Five people made the jump from X to the startup, including its medical director. The team hired a new head of product who had worked on the Apple Watch. Now called NextSense and touting itself as a platform for brain-health monitoring, the company got $5.3 million in funding.
In the months since, NextSense has struck up partnerships with universities and drug companies to explore the medical uses of its earbuds. A multinational pharmaceutical firm called Otsuka hopes to use NextSense’s earbuds to assess the efficacy of medication, not only for epilepsy but for depression and other mental health issues. NextSense plans to submit its device for FDA approval this year, and Emory is conducting more studies in hopes of developing an algorithm to predict seizures, ideally hours or days in advance. (The Emory doctors are now consultants for NextSense, and have some equity in the company.)
But while the immediate uses of NextSense’s earbuds are medical, Berent hopes to eventually build a mass-market brain monitor that, if enough people start using it, can generate enormous quantities of day-to-day brain performance data. The catch, of course, is that since no one has ever done that, it’s not yet obvious what most people would get out of the information. That’s also what’s exciting. “We don’t necessarily know what we would learn because we’ve never had access to that type of data,” says Emory’s Winkel.
Berent and his team envision a multipurpose device that can stream music and phone calls like AirPods; boost local sound like a hearing aid; and monitor your brain to provide a window into your moods, attention, sleep patterns, and periods of depression. He also hopes to zero in on a few sizes that would fit a vast majority of people, to dispense with all the ear-scanning.
Far along on the NextSense road map is something unproven, and kind of wild. If artificial intelligence can decode tons of brain data, the next step would be to then change those patterns—perhaps by doing something as simple as playing a well-timed sound. “It’s almost a transformative moment in history,” says Gert Cauwenberghs, a bioengineer at UC San Diego, who licensed some of his own ear-EEG technology to NextSense. Like Berent, he is fascinated by the prospect of using audio to nudge someone into a deeper sleep state. “It’s so convenient, it doesn’t bother you,” he says, “people are wearing stuff in the ear typically anyway, right?” Yeah, but not to screw around with their brain waves.
TEN DAYS AFTER my scanning appointment, Berent introduces me to my custom set of earbuds. We are in NextSense’s Mountain View office, which consists of two cluttered rooms in a shared suite on the building’s first floor. I tuck the buds into my ears and find they fit perfectly—unlike my AirPods—and are much more comfortable than the molded hard-plastic hearing aid I sometimes wear.
Berent pulls out an Android phone and fires up NextSense’s app. It takes data from the buds and displays it on a number of charts and graphs—kind of like the display you see in a hospital room, the one where you hope that none of the lines goes flat. On the screen, I get an instant look at my brain waves, a thick green spiky line on a chart logging the amplitude. He taps to pull up different views and to flip between the two buds. “That looks like a typical EEG,” Berent says, maybe as much to reassure me that I’m normal as to assert that his product is capturing brain waves.
Another exercise had me alternate between a semi-meditative state and alertness. In my alert stage, I sat on a small orange couch—Ikea, maybe?—and looked around the room to note the busy desktops and a low bookshelf jammed with self-help volumes, medical texts, and coding manuals. On top of the unit are a turntable, two small speakers, and a life-size model of an ear; a cover of a vinyl Prince album leans against the wall. Another wall is a giant whiteboard scrawled with equations and data readings. I soon learned that moving my head to take in this scene had messed up my readings. Apparently these prototypes still have some bugs to work out.
But the most interesting test, and certainly the one that excited Berent most, involved napping. He’s still obsessed with sleep, and his company has an ongoing study on it at Emory. “We’re really able to see clear changes between sleep stages,” says Dwyer, the medical director. If the earbuds can prove themselves as snooze detectors, patients who ordinarily get dispatched to a sleep clinic might be spared the trip, says Richa Gujarati, NextSense’s head of product and strategy. With earbuds, she says, “you can send patients home for a diagnosis.”
I, however, was to nap on a small couch in the office. Berent retreated to his Jeep to do the same. I scrunched myself into a semi-fetal position and willed myself into the Land of Nod. It felt like it took half my allotted 20 minutes to doze off, but when my watch alarm started to chirp, I had definitely been out. Berent popped back in the room and congratulated me on my dozing. After uploading the data, we sat in front of his computer and watched as several graphs popped up. I could see the splotchy color fields of my spectrogram darkening around minute five or six, as sleep set in. Berent had taken a similar trajectory. But as he is a polyphasic maestro of the daytime nap, the last few minutes of his slumber produced a waveform signature that was almost a solid block of burnt orange. “It looks like I’m dead here,” he says. For comparison, Berent uploaded data from the Oura ring he wears, a sleep-tracking device. It hadn’t registered the nap.
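Those splotchy color fields are a spectrogram: the frequency-band picture from earlier, unrolled over time. Here is a minimal Python sketch using stand-in random data rather than anything from NextSense's app; in a real nap recording, deepening sleep shows up as growing power in the lowest-frequency rows.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import spectrogram

# Stand-in for a 20-minute, single-channel nap recording at 250 Hz.
fs = 250
eeg = 15e-6 * np.random.randn(20 * 60 * fs)

# Power per frequency in successive 10-second windows: the "color fields."
freqs, times, Sxx = spectrogram(eeg, fs=fs, nperseg=10 * fs)

plt.pcolormesh(times / 60, freqs, 10 * np.log10(Sxx), shading="auto")
plt.ylim(0, 30)  # most sleep-relevant EEG activity sits below ~30 Hz
plt.xlabel("Time (min)")
plt.ylabel("Frequency (Hz)")
plt.colorbar(label="Power (dB)")
plt.show()
```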
Of course, gazing at a chart of vibrant blotches wasn’t going to help me fortify my winks. That’s part of what NextSense is promising to one day deliver. But being able to so casually see what my brain was up to felt like a revelation. Just as some of us obsessively monitor our pulses and oxygen levels, we might regularly check our brain waves just to see what they’re up to. If enough of us do it, we may even figure out what they mean.