Ode to a 10-year-old mass spectrometer

This year we celebrate the 10th anniversary of one of our QTOFs. It is not among the oldest mass spectrometers we have in the lab, but it is special in many ways. The anniversary evoked both retrospective and prospective reflections, which I wanted to share in the hope of hearing about the experiences of others.

The first ten years

The birthday boy is an Agilent QTOF 6550 that was installed in 2012. The serial number reveals that it was one of the very first units produced worldwide and, if I remember correctly, the first to be installed in Europe a decade ago. Our main challenge was to rapidly analyze the ca. 500 well-known intermediates of primary metabolism in studies with hundreds to thousands of samples.

[Photo: immaculate as the first day! Produced in Singapore in January 2012 (1201), s/n B001.]

We needed a setup that would combine scanning speed, sensitivity, and robustness. These three objectives are mutually competing: (i) speed comes at the cost of sensitivity; (ii) robust operation calls for the injection of diluted samples (to avoid all kinds of saturation phenomena), which, in turn, increases the demand for sensitivity; etc. We ended up employing flow injection analysis (FIA) for most of our applications and – only recently – ballistic LC-MS gradients. MS acquisition is dominated by fast MS1 full scans to obtain the best possible representation of sample complexity. MS2 scans are included only sporadically, as targeted scans for specific precursors of interest. We extended the same modus operandi to studies that aim to profile chemicals beyond metabolites, such as the canonical biomarker hunts in liquid biopsies.
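
To give a feel for why this FIA strategy scales to studies of this size, here is a minimal back-of-the-envelope sketch in Python; the scan rate, plug width, and cycle time are illustrative assumptions, not our exact method settings.

    # Back-of-the-envelope FIA throughput estimate.
    # All three parameters are illustrative assumptions.
    scan_rate_hz = 10      # assumed MS1 full-scan acquisition rate
    plug_width_s = 15      # assumed width of the FIA sample plug
    cycle_time_s = 60      # assumed injection-to-injection cycle time

    spectra_per_sample = scan_rate_hz * plug_width_s
    samples_per_day = 24 * 3600 // cycle_time_s

    print(f"~{spectra_per_sample} MS1 spectra to average per sample")  # ~150
    print(f"up to ~{samples_per_day} injections per day")              # 1440

With assumptions in this ballpark, a single instrument can turn around well over a thousand injections per day, which is what makes 1000+ sample studies routine.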

In 2012, the 6550 represented a major leap forward compared to what we had in the lab and to all alternatives on the market. It had a hexabore transfer capillary and ion funnels that ensured a massive ion current to the detector. The detection of very low-abundance species (close to the baseline) was exceptionally reproducible and allowed us to quantify many more compounds in diluted samples. Speed, resolution, and accuracy were never limiting. It quickly became the workhorse for all discovery studies, regardless of the size or type of samples. This is best conveyed by some numbers.

Over the past decade, we injected >1 million samples on the same 6550. We have run >100 studies with more than 1000 samples each, up to 80’000 samples in a single study. We were able to analyze virtually all kinds of samples and matrices. The results were published in >200 papers (the vast majority appearing after 2013 and citing our original Analytical Chemistry publication). The 6550 proved to deliver top performance in routine operation. It enabled us to conceive previously unthinkable projects (some examples: [1][2][3]) and to complete studies of virtually any size with minimal resources and manpower. It became absolutely normal for all of our PhD students and postdocs to perform untargeted metabolomics experiments with 1000+ samples. Thanks to the 6550, we were no longer limited by data generation, but by ideas and by our capacity to interpret data.

If you wonder about robustness: maintenance was within the norm, considering the throughput. We had to replace the detector 3-4 times, about every 2-3 years, and fit a new transfer capillary about once per year. Yes, we had to clean a lot. On average, we vented ten times a year for routine cleaning, which adds up to about 100 vent-and-pump-down cycles over ten years. There are a few unpublished tricks that we employ to ensure robust operation. One is to add NH4F to the mobile phase. Fluoride has been shown to improve electrospray ionization (Yanes et al. 2011). It also suppresses unwanted adducts and has a wonderful cleaning effect on the ESI spray chamber. Aqueous fluoride is known to be corrosive, and we were worried it could “melt” the MS optics or electronics (e.g. the ion funnel) of the 6550. Well, it doesn’t. [Note: a glass-free LC system is needed with F!] To date, the 6550 still performs like a charm. Just this week, I injected ca. 6000 samples of all kinds: plant extracts, root exudates, faecal samples, liver biopsies, plasma, and even goldbears for a fun project with kids. I didn’t have to tune or clean; I just kept filling the autosampler with 96-well plates for four days.

What makes it essential to us

The fascinating thing about the 6550 is that even after 10 years, it remains THE best-in-class instrument for untargeted FIA-like analyses. This is based on dozens of tests we have done on almost all high-res MS instruments produced over the past decade. In brief, the 6550 eclipses all competitors because it combines sensitivity, full dynamic range, speed, and robustness (of operation and quantification). I will not share detailed results in this post, or explain why others do worse. In most cases, the gap can be attributed to the design of the instrument, the type of detector, or the embedded electronics. For example, there are different types of TOF detectors (check Photonis or ETP for some commonly used products) with heterogeneous properties and weaknesses. There are fundamentally different ways and depths of counting ions in TOF instruments (ADCs, TDCs, hybrids). For some instruments, one can simply buy better electronics to obtain better specs (e.g. the FTMS Booster series). Speed is the key, but costs quickly become prohibitive. I will speculate below on additional reasons that might explain the differences. The key point is that in our tests with real samples, we notice striking differences in dynamic range, noise, stability of signals and mass accuracy, jitter, ageing, etc., all of which are key for high-throughput, untargeted metabolomics analysis. In our view, the 6550 remains the best instrument for rapid analysis of metabolism. We own several high-end, high-resolution instruments from different MS vendors, but they are used for special applications – not the mainstream.

This conclusion is specific to our use case, and we don’t pretend that it generalizes to other types of analyses. Nevertheless, the technical longevity of the 6550 is unique. I can’t think of a second example where we could state the same about a different MS and application. We have about 10 additional mass specs in the lab, 30 if we count the instruments of our colleagues in proteomics. Some are older than 10 years. Generally, we perceive that instruments are quickly overtaken by the next generation: after 5 years, top instruments become mid-class. We are used to rapid obsolescence. This was not the case for the 6550, which holds its own and keeps the lead in the face of so many new generations of instruments from multiple MS vendors.

To the next ten years!

Rapid untargeted metabolome analyses are the lifeblood of our research. They account for 90% of the MS analyses we perform in a year (the rest being LC-MS, targeted analyses, lipidomics, and proteomics). The 6550 perfectly meets the analytical challenge, and a single 6550 is sufficient to handle our average yearly load of 100’000 injections. Rationally, there is no argument for decommissioning the system and replacing it with something new.
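
A quick sanity check (assuming ~250 working days per year): 100’000 injections correspond to roughly 400 injections per working day, i.e. about one every 3-4 minutes – comfortably within the >1000 injections per day that a one-minute FIA cycle could sustain, as sketched above.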
It’s already odd to celebrate the 10th anniversary of an instrument, but I’d love to keep it up and running for the next ten years and celebrate the 20th with hundreds of additional papers.

Ironically, the 6550 is no longer produced, even though there is no equivalent alternative. Technical support will naturally end in 2030. We have collaborators in academia and industry who are replicating our setup. They are literally scavenging the last units from stocks all around the globe. I am keeping an eye on the second-hand market, looking for a good bargain before we run out of spare parts. This should hopefully get us to 2032 to celebrate the 20th anniversary and perhaps many more papers.

Lessons and thoughts

As I already wrote, 10 years is a long period of time in the MS world. In the meantime, most vendors brought 2-3 generations of new high-resolution instruments to the market. How come they aren’t any better than the 6550? To some, this question may sound silly after explaining that the 6550 offers the best combo of sensitivity, robustness, dynamic range, etc. In fact, I don’t care about the 6550 in particular. We own one: it’s great and doesn’t need a replacement. The question is meant to be more generic: how do we explain that, in an area of vigorous technological development by many companies, a 10-year-old instrument is still on top?

I have been thinking extensively about possible answers because they directly affect my strategy for the future. In the background, I am continuously reflecting on which technologies or developments will be needed to drive metabolic research in the biomedical or life sciences. On average, the innovative ideas that we manage to publish have a history of 3-5 years of focused development and validation. We are cooking up novel ideas that will hopefully see the light of day within 2-3 years. For these ongoing activities, we currently have all the MS we need, but only because I started much earlier to think about opportunities on the longer time horizon of 5-10 years. Now is the right time to think about the next two or three generations of projects and, hence, hardware. Understanding what makes an instrument a long-term investment or a short-lived hype is absolutely essential to drive sustainable and continuous innovation.

These are some of the generic answers I came up with. None of this is specific to any one vendor, even though I may mention some examples to illustrate the case.

#1: Some high-res mass specs were not designed for metabolomics

All high-resolution MS instruments have been developed with a clear focus in mind. For many top vendors, the big market is in the space of peptides, proteins, or large biologics – NOT in small molecules or lipidomics. As developments take several years and are governed by economic factors, we see technologies that work beautifully in other areas being repackaged for our field… with mediocre and short-lived results. Some of the newest toys miss the mark in real-life applications.

I sometimes (and increasingly) hear from vendors that “the numbers are changing” and that the big markets are shifting, but I haven’t seen many cases in which the changing wind affected the agenda of the R&D units. The development teams of MS vendors are surprisingly small and rely on the exceptional expertise of a few. Their agendas tend to be full for the next 4 years. [To put it provocatively: it would be great if more of the money that customers pay for instruments went into expanding the R&D teams and less into marketing.]

We also have to realize that the academic world still has very little weight in these processes, even though the total number of instruments is considerable. The problem is that the academic landscape is massively fragmented: many players, many newcomers, and many wild ideas. It’s a vast jungle where it’s really difficult to find consensus and recognize trends early enough to invest time in ad-hoc developments.

#2: Tendency to combine more things in a single instrument

There is a general trend toward larger instruments that combine different functionalities: tribrids, ion mobility before and after collision cells, etc. They add options and possibilities, but all come at a price: time losses to trap and move ions, ion losses, increased space-charge effects and undesired fragmentation, etc. We had some bad surprises with some of the fancier and substantially more expensive instruments performing much worse than simpler and older QTOFs on canonical MS1 or DDA-MS2 workflows. The 6550 turned out to be a match made in heaven for our mainstream applications, and all subsequent variants with more gimmicks could not compensate for the loss of sensitivity, speed, or dynamic range.

There is a natural tradeoff between flexibility and performance. If the budget allows for only 1-2 instruments, the real needs must be assessed. In our lab, flexibility matters in 2% of the studies, typically to obtain a one-off fancy figure for a technical paper. Performance matters in 98% of the applied studies we do. The increase in functionalities is a major factor driving instrument and maintenance costs. Choosing the right instrument for the purpose has saved us a lot of money and nerves.

#3: Disconnection from real applications

I have observed that vendors are increasingly pushing novel instruments to the market without knowing what they are good for. They somehow wait for users to tell them. Like most application-minded users, when I read about a new instrument I immediately think of fields or cases where it is likely to have a competitive edge. Then I talk to representatives of MS companies and discover that they hadn’t thought about any of this. This strategy has a caveat: if a product wasn’t developed with an eye on the major final applications, it is very likely that some elements of the value chain will be missing. It comes across as an unfinished act. Once we buy an instrument and it is installed, we are asked to report all the small things that we miss. The chances that these gaps get filled within the same generation of instruments are near zero, because all attention is dedicated to the next product(s) in the pipeline. As customers, we are left to right-click on figures to export spectra manually, use 3 different pieces of software to obtain high-resolution data, and so on…

I am deeply convinced that it would take little “extra glue” to fully unleash the power of a technology when it hits the market. It seems that good opportunities are being missed. I never felt this was because of a lack of goodwill, but rather a consequence of siloed structures and thinking.

#4: New isn’t always better

I have experienced a sizeable number of launches of so-called new instruments that eventually turned out to be merely cosmetic improvements on a previous generation. There is continuous pressure to bring novel products to the public, to be associated with dynamism and innovation, but it’s simply not possible to make headlines every year with genuinely new ideas or products. In some cases, we even saw a regression in performance, associated, for example, with a redesign aimed at lowering production costs (and increasing revenue), or with a transition to a new software ecosystem that unifies the different instrument types of the same vendor.

It should be obvious to all that there are economic (a.k.a. shareholder value) and marketing factors that determine when or how frequently a new instrument is launched. However, we should also not be surprised that some instruments lose traction quickly after the initial launch hype at ASMS.

My personal takes:

  • Engage vendors on your exact needs and applications. Provide the big picture!
  • Flexibility always comes at a price in performance. Even if you have a generous budget, evaluate the options.
  • If you are planning to buy an instrument, focus on the profile that will enable you to obtain the next one (or the funding).
  • Mistrust new technology: it might take a couple of [not-necessarily-free] software upgrades to fully exploit it.
  • Forget specs and focus on the full solution. Or: forget products and focus on productivity. The fanciest instrument will be useless if you don’t have the software to extract/mine the data, or if you need to find and hire three engineers/chemists/informaticians to operate it.

I would love to hear your thoughts or experience!

This Post Has 5 Comments

  1. Galano

    I don’t have experience with high-res machines, but thanks for the read, because we have to buy one soon.

  2. Pekka

    Excellent perspective. We installed our 6550 in Lyon, France in February 2012 (SG1150B002, so a very early version apparently manufactured in 2011). It has seen a massive amount of human plasma and serum samples over the past ten years, and is still going strong.

    1. Xavier Goujon

      Hey Pekka, nice to read it keeps performing well! Cheers

  3. Galano

    And don’t hesitate to tell us what to buy these days…😅

  4. Juli

    We (Metabolomics Unit at the University of Lausanne) would like to sell the same system, a 6550 iFunnel Q-TOF (coupled to a 1290 UHPLC) – don’t hesitate to contact us if interested. It was installed back in 2016 and has been maintained by Agilent. We would like to free up some lab space for the installation of a new HRMS system.
