Monday, December 4, 2023

next big things in healthtech

 https://www.fastcompany.com/90978960/next-big-things-tech-health-2023


The 6 next big things in healthtech for 2023

From bioprinted pancreatic tissue to AI-backed mental health treatment plans, these technologies are changing the game in the world of health.


The healthcare sector is now thoroughly intertwined with tech innovation. Whether by freeing clinicians to spend more time on treatment or by introducing technologies that give power back to patients, tech powerhouses continue to advance the industry.

Aspect Biosystems
For bioprinting a potential cure for type 1 diabetes
Once implanted, Aspect Biosystems’ bioprinted pancreatic tissue has been able to cure Type 1 diabetes in rats. As Aspect works with the FDA on bringing its therapeutics to humans, it has garnered attention from diabetes giant Novo Nordisk, with which it inked a $2.6 billion partnership to use its bioprinting technology to develop diabetes and obesity treatments.

BrainQ Technologies
For exploring a way to improve stroke recovery in a patient’s home
Israeli company BrainQ has been making neuro rehab at home more readily accessible—and effective. BrainQ’s approach, which has earned the FDA’s Breakthrough Device designation, pairs a cloud-based therapy-delivery system with a wearable device that creates a low-energy electromagnetic field meant to improve connections in the brain as it repairs itself over a nine-week therapy window. Over the past two years, BrainQ has steadily added enrollees across 15 U.S. hospitals to its stroke recovery trial, which is assessing the efficacy of its approach.

NextSense
For being the brains behind brain health’s next-gen treatments
With in-ear technology that can collect EEG data, NextSense develops devices that can deliver personalized therapies for conditions like peripartum depression. In addition, its platform of brain-health biomarkers offers insights that can inform new treatments and drug discovery. In 2022, it began monetizing its biomarker platform, bringing in $1 million in revenue as pharma companies like Takeda and Otsuka employed it to support their development of brain-health drug treatments. It also received FDA Breakthrough designation for its PPD therapy. 

Spring Health
For taking the guesswork out of mental healthcare
Mental health care can be time-consuming and costly, with patients often having to try multiple medications, therapies, and forms of support to find what works. Spring Health offers precision mental health care, using AI and data to pinpoint what it says is exactly the right treatment fit for each person. The company says its model leads to recovery that is, on average, eight weeks faster than traditional diagnostic methods.

Suki AI
For freeing up clinicians
There’s no getting around it: Healthcare providers are burnt out. In addition to treating patients and saving lives, physicians and other medical professionals are spending hours on documentation and administration work that pulls them away from doing the work they love. Suki AI created an AI voice assistant to alleviate that administrative burden. It recently announced an integration with Epic, which makes it the only solution on the market that works directly with all major electronic health records. 

Theranica
For treating adolescent migraines
Migraines are debilitating, but most treatment options focus on adults. Theranica’s Nerivio wearable is targeting the 100 million adolescents worldwide who are impacted by migraines. The company’s FDA-approved treatment uses a smartphone app and a product that wraps around a user’s upper arm to trigger a natural process in the brain to control migraine pain. Each treatment lasts 45 minutes, and it can be used as prevention or for acute treatment at the start of an attack.

The companies behind these technologies are among the honorees in Fast Company’s Next Big Things in Tech awards for 2023. See a full list of all the winners across all categories and read more about the methodology behind the selection process.




DeepMind AI tool creates more than 700 new materials

MIT Technology Review

Week in Review

Sunday, December 3, 2023

This week’s roundup: What’s next for OpenAI. Google DeepMind’s new AI tool helped create more than 700 new materials. The X Prize is taking aim at aging with a new $101 million award. Unpacking the hype around OpenAI’s rumored new Q* model. And more.


Google DeepMind’s new AI tool helped create more than 700 new materials

by June Kim

Newly discovered materials can be used to make better solar cells, batteries, computer chips, and more.


The X Prize is taking aim at aging with a new $101 million award

by Cassandra Willyard

Any team that can restore at least a decade’s worth of muscle, brain, and immune function in older adults will claim the top prize.


The University of California has all but dropped carbon offsets—and thinks you should, too

by James Temple

It uncovered systemic problems with offset markets and recommended that the public university system focus on cutting its direct emissions instead.


Unpacking the hype around OpenAI’s rumored new Q* model

by Melissa Heikkilä

If OpenAI's new model can solve grade-school math, it could pave the way for more powerful systems.


Four ways AI is making the power grid faster and more resilient

by June Kim

From predicting EV charge times to pinpointing areas of high wildfire risk, AI is transforming our energy network.






ARTIFICIAL INTELLIGENCE

Google DeepMind’s new AI tool helped create more than 700 new materials

Newly discovered materials can be used to make better solar cells, batteries, computer chips, and more.

November 29, 2023
[Image: a robotic arm in a plexiglass-enclosed workspace filled with sample bottles. Credit: Marilyn Sargent/Berkeley Lab]

From EV batteries to solar cells to microchips, new materials can supercharge technological breakthroughs. But discovering them usually takes months or even years of trial-and-error research. 

Google DeepMind hopes to change that with a new tool that uses deep learning to dramatically speed up the process of discovering new materials. Called graph networks for materials exploration (GNoME), the technology has already been used to predict structures for 2.2 million new materials, of which more than 700 have gone on to be created in the lab and are now being tested. It is described in a paper published in Nature today.

Alongside GNoME, Lawrence Berkeley National Laboratory also announced a new autonomous lab. The lab takes data from the materials database that includes some of GNoME’s discoveries and uses machine learning and robotic arms to engineer new materials without the help of humans. Google DeepMind says that together, these advancements show the potential of using AI to scale up the discovery and development of new materials.

GNoME can be described as AlphaFold for materials discovery, according to Ju Li, a materials science and engineering professor at the Massachusetts Institute of Technology. AlphaFold, a DeepMind AI system announced in 2020, predicts the structures of proteins with high accuracy and has since advanced biological research and drug discovery. Thanks to GNoME, the number of known stable materials has grown almost tenfold, to 421,000.

“While materials play a very critical role in almost any technology, we as humanity know only a few tens of thousands of stable materials,” said Dogus Cubuk, materials discovery lead at Google DeepMind, at a press briefing. 

To discover new materials, scientists combine elements across the periodic table. But because there are so many combinations, it’s inefficient to do this process blindly. Instead, researchers build upon existing structures, making small tweaks in the hope of discovering new combinations that hold potential. However, this painstaking process is still very time-consuming. Also, because it builds on existing structures, it limits the potential for unexpected discoveries.
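To get a sense of the scale involved: even before considering element ratios or crystal structures, simply choosing which elements go into a compound already produces millions of possibilities. The figures below assume roughly 90 usable elements, a ballpark number chosen only for illustration:

```python
from math import comb

# Number of ways to choose which elements appear in a compound, assuming
# roughly 90 usable elements and ignoring ratios and crystal structures.
for k in (2, 3, 4, 5):
    print(f"{k}-element combinations: {comb(90, k):,}")
# 2-element combinations: 4,005
# 5-element combinations: 43,949,268
```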

To overcome these limitations, DeepMind combines two different deep-learning models. The first generates more than a billion structures by making modifications to elements in existing materials. The second, however, ignores existing structures and predicts the stability of new materials purely on the basis of chemical formulas. The combination of these two models allows for a much broader range of possibilities. 
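As a rough illustration only, the two generation routes might be sketched as below. The element list, the Candidate type, and both functions are hypothetical stand-ins rather than DeepMind's actual pipeline: one route perturbs known materials, the other proposes compositions from scratch.

```python
import random
from dataclasses import dataclass

# Hypothetical stand-ins for the two candidate-generation routes described
# above; GNoME's real code and data structures are not public in this form.

ELEMENTS = ["Li", "Na", "K", "Mg", "Ca", "Fe", "O", "S", "Cl"]

@dataclass
class Candidate:
    composition: dict            # element -> count, e.g. {"Li": 2, "Fe": 1, "O": 3}
    has_structure: bool = False  # True if derived from a known atomic arrangement

def mutate_known_structures(known, n):
    """Route 1: tweak existing materials by substituting one element for another."""
    out = []
    for _ in range(n):
        parent = random.choice(known)
        old = random.choice(list(parent.composition))
        new = random.choice([e for e in ELEMENTS if e not in parent.composition])
        comp = dict(parent.composition)
        comp[new] = comp.pop(old)
        out.append(Candidate(comp, has_structure=True))
    return out

def sample_random_compositions(n):
    """Route 2: propose brand-new formulas with no structural template at all."""
    out = []
    for _ in range(n):
        elements = random.sample(ELEMENTS, k=random.randint(2, 4))
        out.append(Candidate({e: random.randint(1, 3) for e in elements}))
    return out

# Both routes feed one shared candidate pool, which the stability model then screens.
known = [Candidate({"Li": 2, "Fe": 1, "O": 3}, has_structure=True)]
candidates = mutate_known_structures(known, 5) + sample_random_compositions(5)
```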

Once the candidate structures are generated, they are filtered through DeepMind’s GNoME models. The models predict the decomposition energy of a given structure, which is an important indicator of how stable the material can be. “Stable” materials do not easily decompose, which is important for engineering purposes. GNoME selects the most promising candidates, which go through further evaluation based on known theoretical frameworks.
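The screening step might look something like the sketch below, where a random placeholder stands in for GNoME's trained graph networks and a decomposition-energy threshold decides which candidates move on to more expensive theoretical evaluation. The threshold, the units, and the candidate pool here are illustrative assumptions, not values from the paper.

```python
import random

def predict_decomposition_energy(candidate) -> float:
    """Placeholder for the trained stability model; lower means less prone to decomposing."""
    return random.uniform(-0.5, 0.5)   # eV/atom, illustrative numbers only

STABILITY_THRESHOLD = 0.0              # keep candidates predicted at or below this value

def screen(candidates):
    """Keep the most promising candidates for further theoretical evaluation."""
    scored = [(predict_decomposition_energy(c), c) for c in candidates]
    scored.sort(key=lambda pair: pair[0])
    return [c for energy, c in scored if energy <= STABILITY_THRESHOLD]

candidate_pool = ["Li2FeO3", "Na3MgCl5", "K2S"]   # e.g. output of the generation step above
promising = screen(candidate_pool)                 # these would go on to theory-based checks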

This process is then repeated multiple times, with each discovery incorporated into the next round of training.
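Described as a loop, the whole procedure might be sketched roughly as follows. Every function here is a stub standing in for the steps above (candidate generation, model-based screening, and theoretical verification), and the retraining is only indicated, not implemented:

```python
import random

def generate_candidates(n):
    return [f"candidate_{i}" for i in range(n)]             # both generation routes, stubbed

def screen_with_model(pool, model):
    return random.sample(pool, k=max(1, len(pool) // 10))   # stability filter, stubbed

def verify_with_theory(shortlist):
    return [(c, random.uniform(-0.5, 0.5)) for c in shortlist]  # theory-based checks, stubbed

model, training_data = "model-v0", []
for round_idx in range(3):                    # each round's verified results feed the next
    shortlist = screen_with_model(generate_candidates(1000), model)
    labels = verify_with_theory(shortlist)
    training_data.extend(labels)
    model = f"model-v{round_idx + 1}"         # retraining on the enlarged dataset, stubbed
```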

In its first round, GNoME predicted materials' stability with a precision of around 5%, but that figure rose quickly over successive rounds of learning. In the final results, GNoME predicted the stability of structures correctly more than 80% of the time for the first model and 33% of the time for the second.

Using AI models to come up with new materials is not a novel idea. The Materials Project, a program led by Kristin Persson at Berkeley Lab, has used similar techniques to discover and improve the stability of 48,000 materials. 

However, GNoME’s size and precision set it apart from previous efforts. It was trained on at least an order of magnitude more data than any previous model, says Chris Bartel, an assistant professor of chemical engineering and materials science at the University of Minnesota. 

Doing similar calculations has previously been expensive and limited in scale, says Yifei Mo, an associate professor of materials science and engineering at the University of Maryland. GNoME allows these computations to scale up with higher accuracy and at much less computational cost, Mo says: “The impact can be huge.”

Once new materials have been identified, it is equally important to synthesize them and prove their usefulness. Berkeley Lab’s new autonomous laboratory, named the A-Lab, has been using some of GNoME’s discoveries alongside data from the Materials Project, integrating robotics with machine learning to optimize the development of such materials.

The lab is capable of making its own decisions about how to make a proposed material and creates up to five initial formulations. These formulations are generated by a machine-learning model trained on existing scientific literature. After each experiment, the lab uses the results to adjust the recipes.
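A closed loop of that kind could be sketched roughly as below. The target formula, recipe fields, and every function are hypothetical placeholders; the A-Lab's actual planning software is not public in this form.

```python
import random

def propose_initial_recipes(target, k=5):
    """Stand-in for the literature-trained model that drafts starting recipes."""
    return [{"target": target,
             "temperature_C": random.choice([700, 800, 900]),
             "hours": random.choice([4, 8, 12])} for _ in range(k)]

def run_synthesis(recipe):
    """Stand-in for robotic synthesis plus phase analysis; returns a yield fraction."""
    return random.random()

def adjust(recipe, measured_yield):
    """Nudge the recipe based on the last result (an illustrative heuristic only)."""
    step = 50 if measured_yield < 0.5 else -25
    return {**recipe, "temperature_C": recipe["temperature_C"] + step}

recipes = propose_initial_recipes("Li2FeO3")              # hypothetical target compound
results = [(run_synthesis(r), r) for r in recipes]        # try each starting recipe
best_yield, best_recipe = max(results, key=lambda pair: pair[0])

attempts = 0
while best_yield < 0.9 and attempts < 20:                 # keep refining, or give up
    best_recipe = adjust(best_recipe, best_yield)
    best_yield = run_synthesis(best_recipe)
    attempts += 1
```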

Researchers at Berkeley Lab say the A-Lab was able to perform 355 experiments over 17 days and successfully synthesized 41 of 58 proposed compounds. That works out to more than two successful syntheses a day.

In a typical, human-led lab, it takes much longer to make materials. “If you’re unlucky, it can take months or even years,” said Persson at a press briefing. Most students give up after a few weeks, she said. “But the A-Lab doesn’t mind failing. It keeps trying and trying.”

Researchers at DeepMind and Berkeley Lab say these new AI tools can help accelerate hardware innovation in energy, computing, and many other sectors.

“Hardware, especially when it comes to clean energy, needs innovation if we are going to solve the climate crisis,” says Persson. “This is one aspect of accelerating that innovation.”

Bartel, who was not involved in the research, says that these materials will be promising candidates for technologies spanning batteries, computer chips, ceramics, and electronics. 

Lithium-ion battery conductors are one of the most promising use cases. Conductors play an important role in batteries by facilitating the flow of electric current between various components. DeepMind says GNoME identified 528 promising lithium-ion conductors among other discoveries, some of which may help make batteries more efficient. 

However, even after new materials are discovered, it usually takes decades for industries to take them to the commercial stage. “If we can reduce this to five years, that will be a big improvement,” says Cubuk.

Correction: This story has been updated to make clear where the lab's data comes from.