Let's talk about lab water
When you are prescribed a new medicine, you instinctively trust that it’s not only effective but also safe. That’s because before any medicine even reaches the pharmacy shelf, it has passed stringent quality control tests to comply with regulations such as the FDA’s current good manufacturing practices (CGMPs).(1) But what do these quality control tests involve, and why does purified water play such a crucial role in ensuring their effectiveness?
Quality control testing is essential in the pharmaceutical industry to ensure that medicines are safe to use and perform therapeutically as expected. The World Health Organization defines it as “the sum of all procedures undertaken to ensure the identity and purity of a particular pharmaceutical.”(2) This can range from simple chemical experiments that confirm a pharmaceutical’s identity (for example, thin-layer chromatography and infrared spectroscopy) to the more demanding requirements of a national pharmacopoeia, the official reference source for the purity and quality specifications of a pharmaceutical product. Raw materials, including active pharmaceutical ingredients (APIs) and excipients such as water, must be rigorously tested for impurities throughout the manufacturing process, and quality control laboratories must also test the finished product before it is distributed.
Currently, novel medicinal agents are being discovered at an accelerated rate. This is without doubt fantastic news for our health, but it can be challenging for quality control testing to keep pace with this boom in innovation. As such, ever more rigorous and sophisticated analytical methods are being developed to monitor and control pharmaceutical purity, such as liquid chromatography-tandem mass spectrometry (LC-MS/MS).
A range of microbiological tests and toxicity analyses, mainly using liquid chromatography, cell culture, and molecular biology techniques, have been designed to detect a variety of impurities ranging from solids to volatiles and everything in between. These tests include, for example, sterility testing, antimicrobial efficacy testing, microbial limits testing, bioburden determination, endotoxin testing, environmental monitoring and identification, and cleaning/sterilization validation of reusable medical devices.
Water is a crucial component of all quality control testing: it is used in sampling, filtration, and culturing, as well as in buffer preparation and for washing and rinsing equipment. However, because water is such a powerful solvent, it can dissolve a wide range of impurities and is easily contaminated. Much laboratory water comes from a mains supply, which can contain a variety of contaminants, including ions, organic compounds, dissolved gases, and microorganisms. Some microorganisms may also proliferate in laboratory water treatment components and in storage and distribution systems.
Using impure water can have a detrimental impact on quality control analysis because water contaminants can not only damage your laboratory equipment but also distort your data. For example, dissolved ions can affect routine molecular biology techniques, such as the polymerase chain reaction, and microorganisms, such as bacteria and their associated nucleases and endotoxins, can also affect sensitive cell culture assays and interfere with biochemical reactions. As such, using contaminated laboratory water could affect the validity and accuracy of your quality control data and impair regulatory approval of your product.
Quality control laboratories must therefore ensure they are using suitably purified water by performing regular water analysis. Ideally, water production, storage, and distribution systems should be designed, installed, validated, and maintained so that laboratory water of an appropriate quality is reliably produced.
Laboratories must also make sure to use the appropriate grade of water for their quality control tests to minimize contamination during analysis. For example, ultrapure water (Type I+) must be used for highly sensitive endotoxin analysis, liquid chromatography, and cell culture, while apyrogenic water (Type II+ or Type II) can be used for general chemistry and microbiological analysis.
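If your laboratory logs water-quality readings electronically, the short Python sketch below illustrates one way such a check could work: measured resistivity and TOC readings are compared against the grade recommended for each test. The application-to-grade mapping follows the paragraph above, but the numeric limits and the function and variable names are illustrative assumptions only, not pharmacopoeial specifications, so always defer to your own pharmacopoeia and supplier documentation.

```python
# Minimal sketch (not from the article): check that purified water meets the
# grade recommended for a given quality control application.
# The numeric resistivity and TOC limits below are illustrative assumptions,
# not authoritative specifications.

from dataclasses import dataclass


@dataclass
class WaterSpec:
    grade: str
    min_resistivity_mohm_cm: float  # resistivity in MΩ·cm at 25 °C
    max_toc_ppb: float              # total organic carbon, parts per billion


# Illustrative specifications (assumed values).
SPECS = {
    "Type I+": WaterSpec("Type I+", min_resistivity_mohm_cm=18.2, max_toc_ppb=10.0),
    "Type II": WaterSpec("Type II", min_resistivity_mohm_cm=1.0, max_toc_ppb=50.0),
}

# Application-to-grade mapping taken from the text above.
REQUIRED_GRADE = {
    "endotoxin analysis": "Type I+",
    "liquid chromatography": "Type I+",
    "cell culture": "Type I+",
    "general chemistry": "Type II",
    "microbiological analysis": "Type II",
}


def water_is_suitable(application: str, resistivity_mohm_cm: float, toc_ppb: float) -> bool:
    """Return True if the measured readings meet the grade recommended for the application."""
    spec = SPECS[REQUIRED_GRADE[application]]
    return (resistivity_mohm_cm >= spec.min_resistivity_mohm_cm
            and toc_ppb <= spec.max_toc_ppb)


if __name__ == "__main__":
    # Example: water at 18.2 MΩ·cm with 4 ppb TOC, intended for an LC run.
    print(water_is_suitable("liquid chromatography", 18.2, 4.0))  # True
```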
Why not read our troubleshooting guide to ensure you are using the right grade of water for your quality control experiments?
References
(1) https://www.fda.gov/drugs/developmentapprovalprocess/manufacturing/ucm169105.htm
(2) http://www.who.int/medicines/areas/quality_safety/quality_assurance/control/en/