It’s great when experts disagree.
There is a brilliant New Yorker piece "The Obama Memos" that describes the conflicting advice that Obama constantly gets from his cabinet. What is it like when the world's top economists are giving you diametrically opposite advice on handling the recession? How do you decide on the Bin Laden raid or action against Gaddafi, when Hillary Clinton and Joe Biden are persuasively arguing in opposite directions? Thanks to the year that I devoted to this paper, I have a great feel for such conflicting expert advice.
Before we get to that, a little background is necessary. In February of 2012, I was delirious because I thought I had made an incredible scientific discovery. Three years into my postdoc, I was still wrapping up my graduate work, and I accidentally stumbled into a surprising result: I had strong evidence that a heterochromatin-forming silencing complex in yeast was actively present at expressed euchromatic loci. I was thrilled and excited, and so was my Ph.D. advisor Jasper, who e-mailed me that I was opening exciting new doors of discovery. Debbie, a graduate student in Jasper's lab, was going to start experiments to follow up on my computational surprises. Oh, the high that this gave. This is the kind of state that makes being in the science world worth it. Discovering something truly new. Fantasizing about the possible meaning and the potential for a hugely important paper in a year or two.
The problem was that I was a little worried. Not that I did not trust the localization signals; quite the opposite. The signals I detected were so beautiful and strong - you only dream about crystal-clear results like this. Most of the time, experiments and analyses give weak and disappointing signals that you have to interpret and then build fragile arguments on top of. It's messy. Not this result. This one was screaming at you. It was almost too good to be true. And that's why I worried. I am scared of results in science that are too clear.
As the three of us speculated about the likely new biology, I kept anxiously thinking of how to test for an artifact. I figured out a way. I spent a few nights analyzing published datasets from other labs. This is depressing work. You are not looking for new biology, not answering any interesting question. You are doing grunt coding and analysis, database manipulations, and graphing of other people's data, just to make sure that what you are already thrilled about is real.
I finished processing the controls. I graphed the results. And the results were clear once again. Except this time, they simply and clearly screamed, "You are looking at an artifact." The high I described before? Now invert it. And subtract some from it. There was nothing new. Nothing to follow up. No new biology. I had just killed my own discovery. I had to e-mail the Berkeley lab to cease all work on this instantly. It's not the thousands of dollars wasted on reagents, it's not that I let Jasper down, it's not the wasted time on my part or the student's. It's the sharp and painful knife wound. This is no slow cut. It's a high-quality Japanese Santoku knife, slicing right through and wounding you. There's no hope. No alternative explanation. You know instantly that everything was fake.
Then, I asked myself why no one had ever thought of the artifact that I found. I have a lot of experience in this kind of analysis, yet I trusted the signal at first. The Berkeley lab uses this technique all the time. Suddenly, I realize that I have uncovered a problem in a molecular biology technique that has been in widespread use for over a decade. I am convinced that this is no yeast problem - this is an inherent problem of the technique, used in yeast, flies, worms, mice, human cancer and stem cell lines - used in everything. I start re-evaluating one science paper after another. Surprising big result after result that I am convinced is an artifact.
And here comes the advice part. I knew that I needed to pause my main postdoc project and quickly publish a paper on the artifact, to alert the community. I presented this to a number of labs and professors to make sure that I was not missing anything. And the advice that I got from brilliant Harvard and MIT researchers, including famous professors, made me dizzy.
- Person A: This is going to be a downer of a paper. If you can figure out the exact cause and offer a solution to the artifact, terrific. If not, I wouldn't publish it because it will upset people.
- Person B: This will distract you from your main project; forget about this.
- Person C: Forget your main project as it's very risky; do this instead, spend however long it takes, and you'll be able to get a faculty job from this paper.
- Person D: All techniques have biases and errors. Why waste your time on this? You risk becoming a "technique person," and that is bad for your career. If you do decide to pursue this, publish it in an obscure journal where no one sees it.
- Person E: Do this quickly, no more than a year, get it out as a public service announcement, and get back to your main research as fast as you can.
- Person F: Very admirable. It does look like a serious problem. But people who published wrong interpretations because of this artifact will just ignore your paper.
- Person G: Don't bother going through a traditional journal. Put this on a blog.
- Person H: You should try to publish it in Cell.
The incredible thing is that all of this advice came from really smart people whom I respect and admire. Top-notch scientists, each and every one of them. The beauty of it is that once the dizziness subsides, the contradictory expert advice becomes liberating. You suddenly realize that you are the true expert on this paper, and you have to trust your instincts. Very liberating.