Mass Spectrometric Musings
Last Saturday, I met up with my favorite skepchick Rebecca and we took the train north of Boston to meet up with some friends for a Newtonmas Party hosted by a skeptic friend of ours. On the way out the door, I grabbed some light reading for the trip, as I wasn’t sure Rebecca and I would end up on the same train. Fortunately, we did meet up and spent the ride gossiping. Rebecca also made fun of my “light reading for the train.” The book I selected was Geochemistry of Non-traditional Stable Isotopes, a fascinating little volume that I imagine I’ll read cover-to-cover before the year’s end. Really, though, Rebecca, it is light reading… I mean, it’s far better than other books I could have grabbed, such as Reaction Mechanisms of Inorganic and Organometallic Systems or Thermodynamics of Minerals and Melts.
As I have a paper due tomorrow on lithium isotopes, my post today is going to have to be about isotopes. For those of you who need a quick review, isotopes of an element are produced because of differences in the numbers of protons and neutrons in the nucleus. Thus, isotopes of an element have slightly different masses that can lead to small, but important, differences in the behavior of an element. For instance, water sitting in a glass will become isotopically heavy over time as lighter 16-O and 1-H evaporate preferentially over heavier 17-O, 18-O, 2-H, and 3-H.
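As a rough illustration of that evaporation effect, here is a minimal Python sketch of Rayleigh distillation, where the isotope ratio of the liquid left in the glass evolves as R/R0 = f^(α−1) with f the fraction of water remaining. The fractionation factor below is an illustrative round number, not a measured value:

```python
def delta_remaining(f, alpha=0.9907):
    """Per-mil enrichment of the remaining water relative to its start.

    alpha is the vapor/liquid fractionation factor (< 1, since the
    escaping vapor is depleted in the heavy isotope); the value here
    is illustrative, not a measured one.
    """
    return (f ** (alpha - 1.0) - 1.0) * 1000.0

# The less water remains, the isotopically heavier it gets:
for f in (1.0, 0.5, 0.1):
    print(f"{f:>4} remaining -> {delta_remaining(f):+6.2f} per mil")
```

The point of the sketch is just the direction of the effect: as f shrinks, the remaining water grows steadily heavier.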
Many physical, chemical, and biological processes can lead to isotopic fractionation. Studying ratios of both stable and radioactive isotopes can provide important constraints on these processes. The study of radioactive isotope systems, such as potassium-argon and uranium-thorium, can also be used in the dating of rocks and archaeological samples.
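The dating arithmetic behind those radioactive systems is, at its simplest, one logarithm: measure how much daughter has accumulated per remaining parent, and the decay constant converts that ratio into an age. A minimal sketch, assuming no initial daughter and a closed system (and ignoring complications like the branching decay that a real K-Ar calculation must handle):

```python
import math

def age_years(daughter_parent_ratio, half_life_years):
    """Age from a measured daughter/parent ratio: t = ln(1 + D/P) / lambda."""
    lam = math.log(2.0) / half_life_years  # decay constant, 1/years
    return math.log(1.0 + daughter_parent_ratio) / lam

# Sanity check: when D/P = 1, exactly one half-life has elapsed.
print(age_years(1.0, 1.25e9))  # ~1.25 billion years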
I started reading Geochemistry of Non-Traditional Stable Isotopes for my research paper on lithium isotopes, but I’ve found myself reading bits and pieces of the other sections of the book as well. This book is basically about all the new stable isotope systems (lithium, magnesium, zinc, selenium, et cetera) that can now be studied because of recent advances in mass spectrometry. The isotopes themselves aren’t really that interesting, but their applications are. Scientists will be able to learn so much more about paleoclimate, volcanology, ocean circulation, biological cycles, et cetera from studying these isotopes.
This little book on stable isotopes is opening my eyes to how recent developments in mass spectrometry are revolutionizing isotope geochemistry. I couldn’t care less how my car works, but I care a great amount about how mass spectrometers work, for some reason. I’m weird that way. Basically, a mass spectrometer uses electric and magnetic fields to separate isotopes and measure their ratios and concentrations.
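To make “electric and magnetic fields” a bit more concrete, here is a sketch of the core physics in a magnetic-sector instrument: an ion of mass m and charge q is accelerated through a potential V, enters a field B, and bends along a radius r = mv/(qB), so heavier isotopes follow larger radii and land on different collectors. The voltage and field strength below are illustrative round numbers, not the specs of any real machine:

```python
import math

E_CHARGE = 1.602176634e-19  # elementary charge, C
AMU = 1.66053906660e-27     # atomic mass unit, kg

def bend_radius(mass_amu, accel_volts, b_tesla, charge_state=1):
    """Radius of an ion's circular path in a magnetic sector, in meters.

    From the energy balance qV = (1/2) m v^2 and the force balance
    r = m v / (q B); note r scales as sqrt(m).
    """
    m = mass_amu * AMU
    q = charge_state * E_CHARGE
    v = math.sqrt(2.0 * q * accel_volts / m)
    return m * v / (q * b_tesla)

# Lithium's two isotopes separate onto distinct paths at 4 kV and 0.5 T:
for mass in (6.015, 7.016):
    print(f"{mass} u -> r = {bend_radius(mass, 4000.0, 0.5):.4f} m")
```

Because r goes as the square root of the mass, quadrupling the mass exactly doubles the radius, which is what makes collector placement by mass possible.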
These days, the coolest mass spectrometer on the block is the multicollector inductively coupled plasma mass spectrometer (MC-ICP-MS). ICP-MS has many advantages, the greatest of which is that you often don’t have to go through weeks of painstaking chemistry to separate out an element of interest from a sample. In the old days, geochemists had to labor for days in the lab, slaving away over intricate glass columns and probably getting various cancers from the acids and other toxic stuff they used in their chemistry. These days, the modern geochemist can dissolve a rock powder or other sample, dilute the solution with some weak nitric acid, and use an argon plasma torch to almost completely ionize the sample. No chemistry involved, often! Just simple dissolution.
To make life even easier, you can hook up a laser to your ICP-MS. Besides being able to say that you “work” with lasers, which is always cool, the advantage is that you don’t even need to dissolve your sample! In many cases, you don’t need to prepare your sample at all. You can stick your rock, bone, wood, or even flesh/hair/skin sample in the little box with the laser, and the laser does all the work. Poof! Your sample is ionized. Turned into plasma. Twenty minutes later, you have constrained the chemical and isotopic composition of your sample. The analysis time has gone from twenty weeks, perhaps, to twenty minutes.
While I oversimplify the ease of laser ICP-MS (you have to worry about matrix effects, for instance), the technology is powerful. As a young geochemist, I marvel at this technology. Yet I realize that many of the older geochemists, such as my advisor, are somewhat skeptical of this new technology.
Partly, I think, it’s a generation gap. Many geochemists claim that their old techniques of laborious, back-breaking, cancer-inducing chemistry and last-generation mass spectrometers still produce better numbers. In a sense, that’s true. Many of the older techniques do still produce the most precise numbers. Yet for many problems you don’t need super-high precision to answer your questions. Why do all the work, then? I think some of the older geochemists are just bitter they had to work for twenty weeks to get the same numbers their grad students get in twenty minutes. However, for the problems which do require high precision, the new mass spectrometers are catching up. There are still some tricky isotope systems which require laborious chemistry, but even these systems may become easier to measure with time.
For some researchers, I recognize there may also be financial restrictions. The new machines can easily approach a million dollars in cost, and not all labs can afford that. The old machines and techniques may have to do, in many cases.
While I am in general all for the new technology and less labwork, I do have to agree with my advisor and other older geochemists I’ve spoken with that the ease with which isotopic data can be generated on these new machines is a little dangerous. These days, mass spectrometers are becoming like magic boxes which produce isotope numbers. You load your sample, press a button on the computer, and walk away. After your coffee break, you have your data, more or less. However, today’s graduate students do not necessarily appreciate the nuances and potential pitfalls of the technology, so troubleshooting is challenging.
Also, the rate at which data is being produced is somewhat dangerous. What good are a hundred numbers, after all, if you have no idea what they mean? Maybe they only had ten numbers in the old days, but they really thought about those ten numbers.
I talk about mass spectrometers because these are the toys which I play with, but there are many other examples of scientific technology that is both world-changing and dangerous. As good scientists, we should embrace modern technology. But, as good skeptics, we should be careful about the magic boxes of the modern world. At the very least, we should appreciate that collaboration and clear communication between professionals in various specialties will become increasingly important as these magic boxes become fancier and fancier.
And speaking of magic boxes, I think it’s time to go and microwave my dinner in the magic box in the kitchen.
This was fun too… just… uhhhh… eye-opening.
A post on the Skepticality forum recently raised an interesting question about isotopes, namely: why do the individual particles decay when they do?
We know statistically that x% will decay over y time period, but what is the cause of that final tipping point that makes an atom decide to purge itself of unwanted neutrons? As far as I am aware, none of the usual four forces in the universe are to blame, so what is?
One of the true mysteries of cause and effect…
Whoops. Buggered up the URL reference to Skepticality.
The isotope fairy.
Now I understand why primitive cultures came up with gods of various happenings. Explaining what you don't know by placing a fairy at the end is very comforting.
So people don't understand Christmas trees? ;)
Thad, I don't think that thinking about radioactive decay in terms of forces is the way to go. Personally, I think about it in terms of movement and the Gaussian distribution of how fast particles are moving in the nucleus.
There are many different kinds of decay, but take alpha decay as an example. Alpha decay involves the emission of an alpha particle (two protons and two neutrons). In a large atom's nucleus, there are many protons and neutrons moving around. They're packed in there tightly. Even so, most of the time those protons and neutrons are not going to collide with each other and form an alpha particle, but every so often they do. And every so often, one of those alpha particles is moving fast enough that it can escape the nucleus.
So, maybe probability is the best way to think about it… I was learning about probability and Heisenberg uncertainty once, and my teacher mentioned that there was a probability– very slight, but still there– that we could walk through a wall. If enough of us tried for long enough, eventually one of us would just walk through a wall!
So, maybe radioactive decay is like that… although it may not sound intuitive, every so often one of those alpha particles can "walk through" the wall of the nucleus…
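Both pictures above – the fast-moving alpha particle and the particle that "walks through" the wall of the nucleus – boil down to the same thing: each nucleus has a tiny, fixed, memoryless chance of decaying in any given instant. That assumption alone produces the familiar exponential decay curve, as a quick Monte Carlo sketch shows (the atom count and per-step probability are arbitrary illustrative numbers):

```python
import random

def simulate_decay(n_atoms, p_per_step, n_steps, seed=42):
    """Count survivors over time when every atom has the same fixed
    probability of decaying in each time step (no memory, no aging)."""
    rng = random.Random(seed)
    survivors = [n_atoms]
    alive = n_atoms
    for _ in range(n_steps):
        alive = sum(1 for _ in range(alive) if rng.random() > p_per_step)
        survivors.append(alive)
    return survivors

counts = simulate_decay(100_000, 0.05, 40)
# With a 5% chance per step, roughly half the atoms remain after
# ln(2)/ln(1/0.95) ~ 13.5 steps -- a half-life emerges even though
# no cause is ever assigned to any individual decay.
```

No simulated atom "knows" anything about the others or about how long it has existed, yet the population as a whole decays on a well-defined schedule.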
Okay, the entire human race has to put everything down and start trying to walk through walls right this instant.
'Cause that's too cool not to put the whole world on hold.
Mmmm. Mass Specs.
I've been using MALDI-TOF for the last couple of years for DNA analysis of single nucleotide polymorphisms (SNPs). It's much faster than the old methods, especially if you have lots of samples. I've managed over half a million samples in 5 years, but the machine could have done almost 8 million in that amount of time (if I had enough samples).
Anyone want to trump me on this?
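For scale, that 8-million-sample capacity works out to a sample roughly every twenty seconds, assuming the instrument ran around the clock; a back-of-the-envelope check:

```python
# Rough throughput implied by ~8 million samples in 5 years of
# continuous operation (illustrative arithmetic only).
seconds_in_5_years = 5 * 365 * 24 * 3600
seconds_per_sample = seconds_in_5_years / 8_000_000
print(round(seconds_per_sample, 1))  # ~19.7 s per sample
```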
Well done, Evelyn. Your little essay is very accessible to this layman. You have some skill as a writer as well as a scientist, clearly.
Your point about not taking technology for granted is a good one. For someone, especially a scientist or technician, to accept any data simply because the "magic box" in question spat it out is not good. If, on the other hand, they have a grasp of how the thing works and how it relates to their field of expertise, their understanding of each enhances both.
The proverb about a tool being only as good as its user? That certainly applies all the more to such high tech tools as you describe.
If I had read this earlier in the day, I would have enthusiastically jumped into calculating the rate of isotopic weight increase using Boltzmann statistics and kinetic theory. Sadly, the evening is wearing on and my second round of caffeine doesn't seem to be kicking in. Curse you, fallible body! When the Technological Singularity rolls around, I will so be leaving you behind to upload into my shiny new software form.
Technological Singularity. . . hmm. . . Sounds like a fancy word for living with computers and never having a date.
This may seem a silly question, but how does the laser/computer in the ICP-MS tell the difference between the sample and the container in which the sample resides? Is there reliance on there being enough sample that this hopefully isn't a problem?
If one put a sample of the container in to analyse, what would happen then? No reading?
Anyone else have one of those "whuvhuh?" moments here where you begin to grasp just how much smarter someone is than you are?
Like one second you're really well-adjusted, and the next you feel you should be in one of those wooden playpens, knocking a cup against the bars and playing with toy trucks in a sandbox?
A belated welcome to the world of skepchick blogging, and best of luck on your paper on isotopes!
Another great post, Evelyn! I agree with wright about your skills as a writer. Please include a link the next time one of your papers gets published. It would be fun to read it.
I am well familiar with the "magic box" phenomenon. As a post-doc, I remember a certain spectrophotometer whose data was analyzed using a program written by a former grad student. You just clicked the cursor on a couple of peaks and troughs in the data and – poof – the relevant numbers (index, etc) for your sample came out. Very simple. Very quick. Problem was, none of the students could tell me how the data was calculated. They had no idea how the program worked! I finally managed to track down the source code (BASIC, with NO comments!!) and found that it contained some serious approximations that made the output highly suspect at least half of the time. Did they care? Nope! It was what they were used to, and doing the calculations the correct way was way more work than they were willing to put in. So while I went ahead and did all of my analysis the "hard" way, they just kept using the program and crossing their fingers that the errors wouldn't be a problem! Arrggh! So much for my mentoring skills!
Sadly, I see the same problem crop up in the industrial R&D lab where I work now. Just submit your sample to "Analytical", and wait for that wonderful (potentially meaningless) data to pour in!
Sorry, I became a little carried away with today’s article. Tomorrow’s will be shorter and more fun. Promise. :-)
Mass spectrometers *are* cool, though… okay, ’nuff said.
Reminds me of quantum tunnelling… ah, the wacky life of atoms…
You said that "isotopes of an element are produced because of differences in the numbers of protons and neutrons in the nucleus." But from what I understand, it is only the number of neutrons that changes. If you change the number of protons, then you get a different element. Is this correct, or are these exceptions that I'm not aware of?
Psamathos – you're correct – the number of protons determines the element… it's the variation in the number of neutrons that determines which isotope…
This reminds me of my talks with my advisor in grad school. I worked in General Relativity, and I used the MathTensor package for Mathematica to solve tensor equations and generate code. My supervisor was the really old-school, pen-and-paper type. He was extremely suspicious of the fact that he could give me a problem and I could solve it in a couple of hours (including time to understand the problem and put it in a form that MathTensor could solve) where it would normally take weeks with careful pen-and-paper calculations. He didn't trust the computer to come up with the answer, and in truth MathTensor is very temperamental and you do have to be careful. But my point is that a) I trust the computer a lot more than I trust myself not to make any mistakes, and b) once I had the solution in an electronic form I could build a model, generate data, visualize them, and see if things behaved in the way they were supposed to. I could look at the data and ask: Is this the way we expect a black hole to behave? If the behavior was not what I expected, I could recheck my computations. Maybe this could lead to finding that stupid sign error (note, it was always me that made a mistake/typo, not a computer error), or maybe it could lead to new insight.
My point is that more and faster information is a good thing, but you should always do a reality check on your results.
Regarding walking through walls, wouldn't such an experiment lead to a lot of people getting halfway through the wall and then disintegrating, along with the wall, in a violent unclear reaction? Wouldn't this in fact be the outcome for millions and billions of people before even _one_ single person actually got through the wall?
And on a different but connected note, in the Norwegian movie "The Sky Is Falling Down (Himmelfall)" the main character keeps bouncing a tennis ball against the wall and explains that he's waiting for just such an event, where every particle lines up just right so the ball will go through the wall.
_nuclear_ reaction. :D
I liked the original, "unclear reaction".
Yes, my apologies. Isotopes of an element are created because of differences in neutrons, not in protons. I guess that I've been thinking about nucleosynthesis too much these days… you can form different isotopes by sticking more protons in a nucleus (p-process nucleosynthesis), but of course that changes the element.
Fortunately, it's just a blog entry and not my paper due today. :-)
See "The Men Who Stare at Goats" by Jon Ronson for a description of an American military officer who tried to walk through walls. He may manage it one of these days…
How very, very cool. When I took my stable isotopes class I had to run one of those vacuum lines to do my project… so, in the words of the typical old fart who I swore I'd never become: if ~I~ had to, then ~YOU~ should have to as well… :)
I do have a serious question, though: are these new machines as temperamental and prone to breaking as the old ones? Last time I checked, the main qualification for working in an isotope lab was the ability and willingness to disassemble the machine and clean/fix it.
Great post, keep 'em coming (no pressure or anything)
I know little of isotopes (apart from what I saw in high school chemistry), but I do know about the problem of reliance on technology.
Today, there are so many things that make life easier, but only a limited number of people understand them well enough to improve on them, or fix them if they are broken.
Working with computers, I know enough about the various parts to replace a bad part, or even fix a mechanical problem with, say, a DVD drive. I also know enough to program a small application. But I know very little about electronics or the machine code that's used to operate the hardware, so my knowledge is stuck somewhere in the middle. Without people who know how a DVD drive works, and who know how to write drivers to make it run, my little application isn't going to do much.
But to get back to my original thought, it's always been my opinion that you should at least learn to do things manually before switching over to the computer software package that does exactly the same thing, only faster and more accurately. This way, you at least know what the software is doing – including what any potential error messages actually refer to – and in case the power goes down or the software crashes or something, you can at least continue working manually until the problem is fixed.
Take photography for example. Today, more than ever, people are taking pictures every day. Yet how many people actually know how to use an old mechanical camera, with a roll of film in it? And if they do, how many could actually develop their film and make prints?
There are people who have dust particles on their lens and think the "orbs" they just photographed are spirits. Long exposures resulting in light streaks become "ghosts flying by". The silliest camera tricks are no longer recognized for exactly what they are. It's appalling. People are taking bad pictures and they don't even realize it's them, not "the paranormal", that's responsible.