Paige, a healthcare tech disruptor, is working with Microsoft to build the world’s largest image-based AI model for digital pathology and oncology

Microsoft to build the world’s largest image-based AI model to fight cancer

  • Paige is working with Microsoft to build the world’s largest image-based AI model for digital pathology and oncology.
  • The model assists in capturing the subtle complexities of cancer.
  • Other tech companies have also been working on AI models to improve cancer research and detection.

An Artificial Intelligence (AI) model analyzes datasets to find patterns and make predictions. Over the years, advances in computing and the growth of available data have accelerated the development and deployment of AI models. Today, AI models can approximate aspects of human judgment in tasks such as image recognition and language understanding.

To perform at its best, an AI model needs to be trained, and often retrained, on relevant data. Microsoft’s AI Builder, for example, is a platform for applying AI models to business processes. What makes AI Builder stand out is that users can build custom models tailored to their needs, or choose a prebuilt AI model for common business scenarios.

But vast amounts of unlabeled data remain untapped, and the next wave of AI models, known as foundation models, will be trained on it. Foundation models are widely seen as the future of AI, since they can be adapted to new tasks with minimal fine-tuning.

“What makes these new systems foundation models is that they, as the name suggests, can be the foundation for many applications of the AI model. Using self-supervised learning and transfer learning, the model can apply the information it’s learned about one situation to another.

“While the amount of data is considerably more than the average person needs to transfer understanding from one task to another, the end result is relatively similar: you learn to drive in one car, for example, and without too much effort, you can drive most other cars—or even a truck or a bus,” IBM explains in a blog post on foundation models.
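The transfer-learning idea IBM describes can be sketched with a toy example. Everything below is hypothetical for illustration: the “pretrained” feature extractor here is just a fixed random projection standing in for a backbone learned via self-supervision, and only a small task-specific head is trained on the new labeled data.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" backbone: in real transfer learning these weights come from
# self-supervised training on a large corpus and are kept frozen here.
W_frozen = rng.normal(size=(4, 8))

def extract_features(x):
    """Frozen backbone: maps raw inputs to a shared representation."""
    return np.tanh(x @ W_frozen)

# Small labeled dataset for the *new* downstream task (toy data).
X = rng.normal(size=(64, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Only this lightweight head is trained for the downstream task.
w_head = np.zeros(8)

def predict(x):
    """Logistic-regression head on top of the frozen features."""
    return 1 / (1 + np.exp(-extract_features(x) @ w_head))

def log_loss():
    p = predict(X)
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

loss_before = log_loss()
for _ in range(200):  # gradient descent on the head only; backbone untouched
    grad = extract_features(X).T @ (predict(X) - y) / len(y)
    w_head -= 0.5 * grad
loss_after = log_loss()
```

Because the backbone stays frozen, adapting to the new task means fitting only a handful of head parameters, which is why foundation models can be reused across applications with comparatively little labeled data.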

AI models can help with healthcare. (Image – Shutterstock)

The healthcare AI model

According to a report published by Stanford University, medical AI models often rely on a single input modality, such as medical images, clinical notes, or structured data like ICD codes. Health records, however, are inherently multimodal, spanning many different data types. Foundation models address this by combining multiple modalities during training.

“The ability to represent multiple modalities from medical data not only leads to better representations of patient state for use in downstream applications, but also opens up more paths for interacting with AI. Clinicians can query databases of medical imaging using natural language descriptions of abnormalities or use descriptions to generate synthetic medical images with counterfactual pathologies,” said the report.
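The natural-language querying the report describes can be illustrated with a minimal sketch of a shared embedding space. The vectors and report texts below are invented for illustration; a real system would use learned image and text encoders that project both modalities into the same space.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical joint embedding space: a multimodal foundation model encodes
# images and text into the same vector space, so descriptions of findings
# land near the images that show them.
DIM = 16
report_texts = ["no abnormality", "small nodule, left lung", "pleural effusion"]
text_vecs = rng.normal(size=(3, DIM))

# Stand-in for image embeddings: each stored scan is embedded close to the
# text of its own report (the property the joint training objective targets).
image_vecs = text_vecs + 0.05 * rng.normal(size=(3, DIM))

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def query(description_idx):
    """Return the index of the stored image most similar to a text query."""
    q = text_vecs[description_idx]
    sims = [cosine(q, v) for v in image_vecs]
    return int(np.argmax(sims))
```

In this setup a clinician-style query such as “small nodule, left lung” retrieves the matching scan by nearest-neighbor search in the shared space, which is the mechanism behind querying imaging databases with natural-language descriptions.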

Meanwhile, Google has also reported progress in large language models (LLMs) for the medical field. Unlike some other LLM use cases, applications of AI in the medical field require the utmost focus on safety, equity, and bias to protect patient well-being. Google’s Med-PaLM is a version of PaLM that has been tuned for the medical domain.

Med-PaLM was the first to obtain a passing score (>60%) on U.S. medical licensing-style questions. This model not only answered multiple choice and open-ended questions accurately, but also provided rationale and evaluated its own responses.

“Recently, our next iteration, Med-PaLM 2, consistently performed at an expert doctor level on medical exam questions, scoring 85%. This is an 18% improvement from Med-PaLM’s previous performance and far surpasses similar AI models,” Google wrote in a blog post.

AI models have been actively used for cancer research and treatment. (Image – Shutterstock)

An AI model for cancer treatment

AI models have been actively used for cancer research and treatment. For example, Google has partnered with the Mayo Clinic to explore how AI can support the tedious, time-consuming process of planning radiotherapy, a common treatment used in more than half of cancer cases in the U.S.

Another example comes from MIT and the Mass General Cancer Center in the U.S., whose team has developed and tested an AI tool known as Sybil.

According to a report by ABC News, Sybil was trained on low-dose chest computed tomography scans, recommended for those between ages 50 and 80 who either have a significant history of smoking or currently smoke.

For patients undergoing screening for lung cancer, Sybil can look at an image and accurately predict the risk of a patient developing lung cancer within six years.

In another development, Paige, a healthcare tech disruptor, is working with Microsoft to build the world’s largest image-based AI model for digital pathology and oncology.

Paige developed the first large foundation model using over one billion images from half a million pathology slides across multiple cancer types. With Microsoft, it is now building a new AI model that is orders of magnitude larger than any image-based AI model in existence today, configured with billions of parameters. The model helps capture the subtle complexities of cancer and serves as the cornerstone for the next generation of clinical applications and computational biomarkers that push the boundaries of oncology and pathology.

The potential of the Paige/Microsoft model is unprecedented.

“Paige has been at the forefront of innovation since its inception, and by combining Microsoft’s expertise and enormous compute power with Paige’s deep expertise in AI, technology, and digital pathology, we strongly believe we will significantly advance the state-of-the-art in cancer imaging. Through the development of this model, we will help improve the lives of the millions of people who are affected by cancer every day,” said Razik Yousfi, SVP of Technology at Paige.

In the next phase of development, Paige will incorporate up to four million digitized microscopy slides across multiple types of cancer from its unmatched petabyte-scale archive of clinical data. Microsoft will provide its advanced supercomputing infrastructure so that Paige can train the technology at scale and ultimately deploy it to hospitals and laboratories across the globe using Azure.

Thomas Fuchs, Dr.Sc., Founder and Chief Scientist of Paige, explained that by realizing the potential of generative AI at an unprecedented scale, the Paige model collaboration with Microsoft is a milestone in the history of oncology.

“It opens a window into the microscopic world with extraordinary fidelity, allowing for not only much higher accuracy but completely novel capabilities,” he said.

Speaking to CNBC, Fuchs also highlighted that the workflow for cancer diagnosis has not changed: pathologists still examine a piece of tissue on a glass slide under a microscope. The method is tried and true, but when pathologists miss something, the consequences for patients can be dire.

As a result, Paige has been working to digitize pathologists’ workflow to improve accuracy and efficiency within the specialty. The company has received Food and Drug Administration approval for its viewing tool, FullFocus, which lets pathologists examine scanned digital slides on a screen instead of relying on a microscope. Paige has also built an AI model that helps pathologists identify breast, colon, and prostate cancer when it appears on screen.