By Torie Wells

As clinicians work tirelessly to improve cancer treatment on a more personalized level, they are partnering closely with engineers who are enabling vastly improved medical imaging.

“In order to do precision medicine, you need to see better,” said Pingkun Yan, assistant professor of biomedical engineering at Rensselaer. “If you cannot see, you can’t do anything.”

Yan’s expertise in imaging will support researchers from the University of Texas Southwestern, led by Jing Wang, associate professor of radiation oncology. Wang’s team is conducting a clinical trial of stereotactic body radiation therapy (SBRT), an approach that delivers high doses of radiation directly to a tumor.

Their partnership, funded by the National Institutes of Health (NIH), seeks to improve radiation therapy for patients with high-risk prostate cancer.

Multiple clinical trials have shown that high doses of radiation to prostate tumors can result in improved cancer outcomes, Yan said, but delivery of that radiation must be localized and precise to protect other healthy tissue nearby.

One of the challenges with SBRT is that the prostate can move and deform during treatment delivery. Ensuring that the correct dose reaches the right location requires a reliable and accurate tumor-tracking method. But traditional ultrasound technology isn’t sensitive enough to distinguish a prostate tumor from the surrounding healthy tissue.

That’s where Yan comes in. He and his team will develop an imaging method to help researchers distinguish between the healthy tissue and the tumor, so they can more accurately administer the radiation doses.

More specifically, Yan will integrate SBRT with a temporal enhanced ultrasound method (TeUS) that he previously developed in collaboration with the University of British Columbia, Queen’s University, and the NIH. TeUS combines a series of ultrasound images over time, so that researchers and doctors can visually separate the tumor from the healthy organ.

“The tumor and the healthy tissue move a little differently. By observing that area over time, we extract a difference,” he said.

Deep-learning techniques developed by Yan’s team will make this approach possible.

“We could obtain these images in the past, but didn’t have a good tool to analyze those images. With deep learning, with artificial intelligence, we are now able to decode the information and make it usable,” Yan said.
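The core idea can be illustrated in miniature: treat each tissue location’s intensity across a sequence of ultrasound frames as a time series, extract a temporal signature that no single frame reveals, and train a classifier on it. The sketch below is purely illustrative, using synthetic signals, a spectral feature, and a one-parameter logistic model as stand-ins for real TeUS data and the deep network used in the actual research.

```python
import numpy as np

# Synthetic stand-in for TeUS data: 200 tissue locations, each observed
# over 64 ultrasound frames. Labels are hypothetical ground truth.
rng = np.random.default_rng(0)
n, t = 200, 64
labels = rng.integers(0, 2, n)          # 1 = "tumor-like", 0 = "healthy"
frames = np.arange(t)

# Assumption for illustration: tumor-like and healthy tissue exhibit
# slightly different micro-motion frequencies, buried in frame-level noise.
freq = np.where(labels == 1, 0.30, 0.20)
series = np.sin(freq[:, None] * frames) + rng.normal(0.0, 1.0, (n, t))

# Temporal feature: spectral power near the "tumor" frequency. This is the
# kind of over-time signature a single ultrasound frame cannot show.
spectrum = np.abs(np.fft.rfft(series, axis=1))
feature = spectrum[:, 3]                # FFT bin closest to 0.30 rad/frame

# Tiny logistic classifier trained by gradient descent (a toy stand-in
# for the deep-learning model that decodes the temporal information).
x = (feature - feature.mean()) / feature.std()
w, b = 0.0, 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))
    w -= 0.5 * np.mean((p - labels) * x)
    b -= 0.5 * np.mean(p - labels)

p = 1.0 / (1.0 + np.exp(-(w * x + b)))
acc = np.mean((p > 0.5) == labels)
print(f"training accuracy on synthetic data: {acc:.2f}")
```

The point of the sketch is the pipeline shape, not the model: the discriminative signal lives in how each location changes across frames, which is why a sequence of images carries information that one image does not.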

Yan also recently received a Bench-to-Bedside award, also from the NIH, to focus on improving cancer detection through ultrasound imaging.

He hopes the results of these collaborative, interdisciplinary projects will improve treatment for all patients.