How to establish a relevant quality processing procedure?
At the end of the course, you will be able to:
- Understand the importance of processing.
- Describe the main steps of MS data processing.
- Design a processing procedure.
Knowledge of LC-MS data pre-processing.
The preceding course section left you with a list of ions (the variableMetadata file) and their corresponding intensities (the dataMatrix file). What you may want now is to extract relevant information from these tables. However, your data may not yet be suitable for statistical analysis. So what should you do to ensure the quality of your tables? This course section will show you why you should consider processing your data, and what the usual quality steps are.
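As a starting point, the two tables can be loaded and cross-checked against each other. This is a minimal sketch using pandas; the toy data stands in for the real tab-separated variableMetadata and dataMatrix files, and the ion names and columns are hypothetical.

```python
import io
import pandas as pd

# Toy data standing in for the real files (in practice you would read
# dataMatrix.tsv / variableMetadata.tsv with pd.read_csv(..., sep="\t")).
# dataMatrix: one row per ion, one column per injection.
data_matrix = pd.read_csv(io.StringIO(
    "ion\tsample1\tsample2\tblank1\n"
    "M151T42\t1200.0\t1350.0\t15.0\n"
    "M287T88\t80.0\t95.0\t70.0\n"
), sep="\t", index_col=0)

# variableMetadata: one row per ion, with m/z, retention time, etc.
variable_metadata = pd.read_csv(io.StringIO(
    "ion\tmz\trt\n"
    "M151T42\t151.03\t42.1\n"
    "M287T88\t287.05\t88.4\n"
), sep="\t", index_col=0)

# A first sanity check: both tables must describe the same ions,
# in the same order, or later filtering steps will silently misalign.
assert (data_matrix.index == variable_metadata.index).all()
```

Keeping the two tables aligned by ion identifier is what allows quality filters computed on the dataMatrix to be applied consistently to the variableMetadata.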
- Processing data to make it relevant
- Handling signal drift and batch effect
- Filtering according to quality indicators
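The signal-drift step above is often handled by fitting a trend on the pooled QC samples across injection order and dividing it out. This is a deliberately simplified sketch with a linear fit and toy numbers (real pipelines commonly use a LOESS fit per ion and per batch; the injection design here is hypothetical).

```python
import numpy as np

# Toy example: intensities of one ion across 8 injections, with QC pools
# injected at regular intervals (hypothetical design).
injection_order = np.arange(8)
intensity = np.array([100., 102., 96., 95., 92., 90., 86., 85.])
is_pool = np.array([True, False, False, True, False, False, False, True])

# Fit a linear trend on the pools only; since pools are identical aliquots,
# any trend in their intensities reflects instrumental drift.
slope, intercept = np.polyfit(injection_order[is_pool],
                              intensity[is_pool], deg=1)
trend = slope * injection_order + intercept

# Divide out the drift, rescaling to the mean pool intensity so the
# corrected values stay on the original scale.
corrected = intensity / trend * intensity[is_pool].mean()
```

After correction, the pool intensities should be noticeably more stable than before, which is precisely the quality indicator used in the next filtering step.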
- The question of data filtering and correction must be addressed in all projects, even though in some cases it may lead to the decision to take no action on the data. In particular, blank filtering, pool variation study and signal drift correction are common aspects to consider when dealing with LC-MS.
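The blank filtering and pool variation checks mentioned above can be sketched as two per-ion filters on the intensity matrix. The thresholds here (3x the blank level, 30% coefficient of variation) are common rules of thumb, not fixed standards, and the toy values are hypothetical.

```python
import numpy as np

# Toy matrix: rows = ions, columns = injections of each type.
samples = np.array([[1200., 1350., 1280.],   # ion A: well above blank level
                    [80.,   95.,   88.  ]])  # ion B: close to blank level
blanks  = np.array([[15., 12.],
                    [70., 75.]])
pools   = np.array([[1250., 1260., 1240.],   # ion A: stable in pools
                    [60.,   120.,  30.  ]])  # ion B: highly variable

# Blank filter: keep ions whose mean sample signal is at least
# 3x the mean blank signal (a common rule of thumb).
keep_blank = samples.mean(axis=1) >= 3 * blanks.mean(axis=1)

# Pool variation filter: keep ions whose coefficient of variation
# across pooled QC injections is below 30%.
cv = pools.std(axis=1, ddof=1) / pools.mean(axis=1)
keep_cv = cv < 0.30

# An ion survives only if it passes both filters.
keep = keep_blank & keep_cv
```

The resulting boolean mask would then be applied to both the dataMatrix and the variableMetadata, so the two tables stay aligned.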
- Remember that depending on your context (type of samples, protocol specificities...) specific filters/normalisations may be needed, independently of the standard ones.
- Once you have applied your custom processing procedure, your tables are ready for the statistical analysis step!
- Dunn, W.B., Broadhurst, D., Begley, P., Zelena, E., Francis-McIntyre, S., Anderson, N. (…) Goodacre, R. (2011). Procedures for large-scale metabolic profiling of serum and plasma using gas chromatography and liquid chromatography coupled to mass spectrometry. Nature Protocols, 6:1060-1083. https://doi.org/10.1038/nprot.2011.335
- van der Kloet, F.M., Bobeldijk, I., Verheij, E.R., Jellema, R.H. (2009). Analytical error reduction using single point calibration for phenotyping. Journal of Proteome Research, 8:5132-5141. https://doi.org/10.1021/pr900499r