BioCentury
ARTICLE | Product Development

Tokenization: lowering barriers to evidence generation 

Linking data sources via tokenization will change the way clinical data are collected and interpreted

November 3, 2023 2:54 PM UTC

Beyond the promise of digital endpoints to measure clinical phenotypes that cannot be captured today, another set of digital tools promises to help fill gaps in data and extend evidence generation far beyond trial completion.

By stitching together disparate sources of information about an individual’s health into a comprehensive and longitudinal picture, the combination of tokenization and data linkage could inform every stage of clinical development — from filling holes in baseline data to tracking long-term outcomes with far fewer patients lost to follow-up...
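The linkage mechanism described above can be illustrated with a minimal sketch: each data holder derives the same irreversible token from a patient's normalized identifiers, so records from disparate sources can be joined without exchanging the identifiers themselves. The function, key, and field names below are illustrative assumptions only; commercial tokenization services rely on proprietary matching schemes, not this exact approach.

```python
import hmac
import hashlib

def make_token(first_name: str, last_name: str, dob: str, secret: bytes) -> str:
    """Derive a deterministic, irreversible token from normalized identifiers.

    Hypothetical sketch: normalization makes the token robust to formatting
    differences, and the keyed hash prevents recovering identifiers from it.
    """
    normalized = "|".join(s.strip().lower() for s in (first_name, last_name, dob))
    return hmac.new(secret, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

# Assumed shared keying material between the parties doing the linkage.
SECRET = b"example-shared-key"

# A trial record and an external claims record for the same person,
# entered with different capitalization and spacing.
trial = {"first": "Jane", "last": "Doe", "dob": "1980-01-02", "outcome": "responder"}
claims = {"first": " JANE ", "last": "doe", "dob": "1980-01-02", "dx": "E11.9"}

token_trial = make_token(trial["first"], trial["last"], trial["dob"], SECRET)
token_claims = make_token(claims["first"], claims["last"], claims["dob"], SECRET)

# Records link on matching tokens, yielding a longitudinal view
# without either side seeing the other's raw identifiers.
linked = {**trial, **claims} if token_trial == token_claims else None
```

In this toy example the two tokens match despite the formatting differences, so the trial outcome and the claims diagnosis can be combined into one longitudinal record.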