Independent, greedy collection of data events using simple heuristics results in massive over-sampling of prominent data features in large-scale studies, compared with what should be achievable through "intelligent", online acquisition of such data. As a result, the data generated are more aptly described as a collection of a large number of small experiments than as a true large-scale experiment. Achieving "intelligent", online control, however, requires tight interplay between state-of-the-art, data-intensive computing infrastructure and analytical algorithms. In this paper, we propose a Software Architecture for Mass spectrometry-based Proteomics coupled with Liquid chromatography Experiments (SAMPLE) to develop an "intelligent" online control and analysis system that significantly enhances the information content obtained from each sensor (in this case, a mass spectrometer). By analyzing data events online as they are collected and applying decision theory to optimize the collection of events during an experiment, we aim to maximize the information content generated by each experiment, using pre-existing knowledge to guide the dynamic acquisition of events.
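To make the contrast between greedy, intensity-driven acquisition and decision-theoretic event selection concrete, the following is a minimal sketch, not the authors' actual algorithm: candidate events are ranked by the expected information gain of sampling them (here, the entropy of a Bernoulli "is this feature novel?" belief derived from pre-existing knowledge) rather than by raw intensity, so already well-characterized prominent features are deprioritized. All names, tuple layouts, and the novelty-prior encoding are illustrative assumptions.

```python
import math


def expected_information_gain(prior_novelty):
    """Shannon entropy of a Bernoulli 'is this feature novel?' belief.

    A feature whose novelty is uncertain (prior near 0.5) has high
    entropy, so sampling it is expected to be most informative; a
    feature already sampled many times before (prior near 0 or 1)
    contributes little new information.
    """
    p = min(max(prior_novelty, 1e-9), 1.0 - 1e-9)  # avoid log(0)
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))


def select_events(candidates, budget):
    """Pick up to `budget` events with the highest expected gain.

    `candidates` is a list of (event_id, intensity, prior_novelty)
    tuples, where prior_novelty encodes pre-existing knowledge about
    the feature (a hypothetical input in this sketch). Note that
    intensity, which a greedy heuristic would rank by, is ignored.
    """
    ranked = sorted(candidates,
                    key=lambda c: expected_information_gain(c[2]),
                    reverse=True)
    return [event_id for event_id, _, _ in ranked[:budget]]
```

For example, given a very intense but well-known feature (prior 0.99), a weak but uncertain one (prior 0.5), and a feature almost certainly uninformative (prior 0.01), this selector spends its single acquisition slot on the uncertain feature, whereas a greedy intensity ranking would resample the prominent one.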