A new application of artificial intelligence could help researchers solve medical mysteries ranging from cancer to Alzheimer’s.
It’s a 3D model of a living human cell that lets scientists study a cell’s interior structures even when they can only see its exterior and the nucleus, the largest structure inside. The model was unveiled to the public Wednesday by the Allen Institute for Cell Science in Seattle.
The technology is freely available, and Roger Brent, an investigator at the Fred Hutchinson Cancer Research Center in Seattle who was not involved in the tool’s development, has been using it for several months. He’s a big fan.
“This lets you see things with a simple microscope that are going to be helpful to researchers all over the world — including in less affluent places,” Brent says.
“This is incredibly cool,” says Greg Johnson, one of the scientists at the Allen Institute who helped create the cell model. By giving scientists a relatively easy and inexpensive way to compare the internal organization of healthy and unhealthy cells, he says, the model should speed efforts to figure out what goes wrong in diseases like cancer.
The model, known as the Allen Integrated Cell, was developed using artificial intelligence. A computer program trained through machine learning studied images of tens of thousands of live human stem cells. Some of the cells had been genetically altered to make internal structures, such as mitochondria, visible. Others were unaltered cells, viewed through a standard laboratory microscope.
Over time, the computer learned to look at an image of a typical cell and figure out its internal organization.
“So, when I give the model an image of a cell and the nucleus, we can predict where all of those structures are,” Johnson says.
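The training-and-prediction loop Johnson describes, learning from paired brightfield and fluorescence images and then predicting structure from brightfield alone, can be illustrated with a toy sketch. This is a deliberately simplified per-pixel linear stand-in, not the Allen Institute's actual approach (which relies on deep neural networks), and every function name here is hypothetical:

```python
import numpy as np

def extract_patches(img, k=3):
    """Gather the k x k neighborhood around every pixel (edge-padded)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    # One column per neighbor position; one row per pixel.
    return np.stack([padded[i:i + h, j:j + w].ravel()
                     for i in range(k) for j in range(k)], axis=1)

def fit_predictor(brightfield_imgs, fluorescence_imgs, k=3):
    """Least-squares fit from brightfield patches to fluorescence pixels."""
    X = np.vstack([extract_patches(b, k) for b in brightfield_imgs])
    X = np.hstack([X, np.ones((X.shape[0], 1))])  # bias term
    y = np.concatenate([f.ravel() for f in fluorescence_imgs])
    weights, *_ = np.linalg.lstsq(X, y, rcond=None)
    return weights

def predict(brightfield_img, weights, k=3):
    """Predict a fluorescence image from brightfield alone."""
    X = extract_patches(brightfield_img, k)
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return (X @ weights).reshape(brightfield_img.shape)
```

The point of the sketch is the workflow, not the model: once trained on labeled examples, the predictor needs only an ordinary unlabeled microscope image as input.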
The use of artificial intelligence has several advantages over other methods of studying the inside of cells, Johnson says.
Techniques that rely on genetic alteration, special dyes, or intense light tend to change or damage the very cells a scientist is trying to study, he says. Also, those techniques rely on costly equipment that is not available in every lab.
Until now, there haven’t been good alternatives, says Graham Johnson (no relation to Greg Johnson), a computational biologist and medical illustrator at the Allen Institute.
He recalls the frustration when he was in high school and college of trying to study cells with a conventional microscope.
“You’d see a black-and-white clear object and it had lots of smaller objects inside of it that were moving around and changing shape and doing all sorts of mysterious things,” Graham Johnson says. But there was no good way to identify those shapes or understand what they were doing.
Later, as a medical illustrator, he had access to the technologies that could identify structures inside cells. But even then, it was only possible to look at two or three components at the same time.
“You really need to be able to see all these components and see how they work together,” he says. And the new cell model should help make that possible.
It’s already proving useful to Brent, who is studying yeast cells to understand the factors that cause some cells to become diseased while others stay healthy. His lab has already developed a version of the Allen Institute model that can show the internal structures of yeast cells.
“My hope,” he says, “would be that by the end of the summer we can have some beautifully trained models helping other researchers here.”
The hardware needed to run the models really is affordable, Brent says — his yeast model is powered by three inexpensive graphics processors designed to run video games.
Within a decade or so, Brent says, cell phones will be powerful enough to run the 3D cell models. And that might mean that even in developing nations, a basic lab could look inside a cancer cell to determine which treatment would work best.