From Reverse Phase Chromatography to HILIC: Graph Transformers Power Method-Independent Machine Learning of Retention Times.
Liquid chromatography (LC) is a cornerstone of analytical separations, but comparing retention times (RTs) across different LC methods is challenging because of variations in experimental parameters such as column type and solvent gradient. Nevertheless, RTs are a powerful complement to tandem mass spectrometry (MS2) data: they can reduce false-positive rates in metabolite annotation, differentiate isobaric species, and improve peptide identification. Here, we present Graphormer-RT, a novel graph transformer that performs the first single-model, method-independent prediction of RTs. We use the RepoRT data set, which contains 142,688 reverse phase (RP) RTs (from 191 methods) and 4,373 hydrophilic interaction liquid chromatography (HILIC) RTs (from 49 methods). Our best RP model (trained and tested on 191 methods) achieved a test-set mean absolute error (MAE) of 29.3 ± 0.6 s, comparable to the performance of the state-of-the-art model, which was trained on only a single LC method. Our best-performing HILIC model achieved a test MAE of 42.4 ± 2.9 s. We expect that Graphormer-RT can serve as an LC "foundation model", with transfer learning reducing the amount of training data needed to build highly accurate "specialist" models for method-specific RP and HILIC tasks. These frameworks could enable machine-driven optimization of automated LC workflows, improved filtering of candidate structures using predicted RTs, and in silico annotation of unknown analytes in LC-MS2 measurements.
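To illustrate the evaluation setup described above, the minimal sketch below pools test predictions from multiple LC methods and reports the mean absolute error in seconds. This is not the authors' code: a trivial per-method mean-RT baseline stands in for a trained Graphormer-RT model (which would instead condition on the molecular graph together with column and gradient descriptors), and the SMILES strings, method identifiers, and retention times are invented purely for the example.

```python
# Minimal sketch (not the authors' code) of how a method-independent RT model
# is scored: predictions are made per (analyte, LC method) pair, and the
# mean absolute error (MAE) is reported in seconds with all methods pooled.
# The "model" here is a per-method mean-RT baseline standing in for a trained
# Graphormer-RT; all data values below are illustrative, not from RepoRT.

from collections import defaultdict
from statistics import mean

# (smiles, method_id, retention_time_s) -- toy records spanning two methods
train = [
    ("CCO",        "RP_method_0001",    310.0),
    ("CC(=O)O",    "RP_method_0001",    205.0),
    ("c1ccccc1O",  "HILIC_method_0042", 540.0),
    ("OCC(O)CO",   "HILIC_method_0042", 610.0),
]
test = [
    ("CCN",        "RP_method_0001",    290.0),
    ("OC(=O)CO",   "HILIC_method_0042", 585.0),
]

# Fit the baseline: mean RT per method. A learned model would instead take
# the analyte structure and LC-method descriptors as input.
rts_by_method = defaultdict(list)
for _, method, rt in train:
    rts_by_method[method].append(rt)
baseline = {method: mean(rts) for method, rts in rts_by_method.items()}

# Pooled test MAE in seconds, the metric quoted in the abstract.
mae = mean(abs(baseline[method] - rt) for _, method, rt in test)
print(f"Baseline test MAE: {mae:.1f} s")
```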