Sunday, June 24, 2007

Nice.

This was written as a response to this post over at Science Professor. Then I noticed that she's apparently taken me off her blogroll.

Interesting. I wonder what that's about.


***

Briefly, in this post she writes about what to do when someone she's pretty sure is a competitor sends an email asking for help on a method her own lab has struggled with but eventually solved (and hasn't published yet).

This is one I can see from both sides... sort of.

In my field there is a lot of perceived competition when, in reality, personal tastes strongly influence what actually gets done.

That means two different groups can be working on "the same thing" and do totally different experiments. There's plenty to do and everyone has different skills.

So it's really rare that two groups are doing exactly the same experiments exactly the same way.

Then you might as well help each other, advance science, and all that? Right?

Except that I have been on the other end, trying to ask people for tips, usually not on unpublished work (I'm usually too out of the loop to know to ask until the paper comes out). More often I'm asking about things they've published... with necessary details lacking.

It's hard to know whether it's deliberate or accidental when people publish sloppy & incomplete methods sections.

I've witnessed both: the PI who is distracted and doesn't read the manuscript carefully, and the PI who tells the student/postdoc to omit certain particulars on purpose.

But sometimes I write to the authors asking for clarification, and I can't get an answer out of these people.

I've gotten all kinds of excuses, my favorite of which is "I can't remember how we did that, it was so long ago."

Yeah, there's this amazing invention called writing it down???

I can only guess that they

a) think I'm trying to scoop them?
b) think I'll find out they faked all their claims?

Why else act that way?

So invariably they're either paranoid (for whatever reason) or too lazy to share, and I end up moving slower than I'd like, reinventing the wheel and wondering how we can enforce adherence to the scientific method (write your paper so that ANYONE can reproduce your results COMPLETELY).


9 Comments:

At 5:02 PM, Blogger Ψ*Ψ said...

I agree with the parenthetical statement at the end. It would also be helpful (though discouraging and it probably wouldn't look very good for the authors) to see statements like "This reaction is somewhat capricious."

 
At 2:23 AM, Anonymous Anonymous said...

You're still on FDP's blogroll. I know because that's how I just came to your blog. You are Ms. PhD, sixth from the top.

 
At 5:24 AM, Anonymous Anonymous said...

What the freak -- doesn't anyone else keep a LAB NOTEBOOK?!

Of course no one can remember details from some time ago. That's why we take and keep notes on the process and results of each experiment, trial, or run. For those who aren't familiar with it, Kanare's Writing the Laboratory Notebook is a fine example of the how and why. (He's a chemist, but it was equally useful for my behavioural studies, and would be for just about anything else.)

 
At 9:46 AM, Anonymous Anonymous said...

I hate manuscripts that have incomplete methods sections. As a grad student, I wasted a year trying to develop a protocol that was supposedly reported in full in a journal. We tried contacting the author, and all he was willing to do was perform the method in his lab for a very large fee. He was unwilling to fill us in on the gaps in the method.

 
At 10:48 AM, Anonymous Anonymous said...

Hmm, I still see this blog listed under the links on Science Professor.

 
At 7:25 PM, Blogger Sherrianne said...

I've been reading your blog and I'm enjoying your comments.

I'm a female graduate student trying to get done.

I'm trying to meet other bloggers so I'm tagging you for a meme.

Hope you don't mind!

http://nonscientificobservations.blogspot.com/

 
At 10:52 AM, Blogger Average Professor said...

Months ago, I got an email from another researcher at a federal lab, asking me for more details on a method I'd presented at a meeting last year but haven't yet published.

I was happy to oblige, for several reasons. One, I really care about advancing the state of the art, and I am not dumb enough to think I'm such a genius I can do it all myself. Two, my notes on why I made some of the choices I did were a little bit sketchy. Sloppy, yes, but I was in a hurry to finish the project before the meeting so that I actually had some results to present. So I looked at this request as a good opportunity to revisit the thing, explain myself in plain language, and then I'd have it for my own records and for the still-languishing journal article I need to write on it. It also provided me with another prod to move that article closer to the front burner so that I can publish on it and convince them to cite me when they publish their work.

I ran into that guy's supervisor at a conference not too long ago, and he thanked me very enthusiastically for my clarity and helpfulness, and suggested a tantalizing opportunity for collaboration (involving them providing me with some great high-cost data that I am not even close to having the infrastructure to collect myself).

I'm sure there are times when sharing backfires, but there are also times when it turns out like a fairy tale.

I prefer to be an idealist with the expectation that getting burned is always a possibility, as opposed to a pessimist who always turns away relationships that could turn out to be really valuable.

 
At 7:28 PM, Blogger Julep said...

I think incomplete reporting can result from many things, including: laziness re: bad methods, laziness re: record keeping, laziness re: accurate and complete reporting, bad methods that a PI purposefully wants to keep under wraps, sloppiness of grad students or whoever screwed up, general suspicion and wariness re: other people replicating research, trying to play one-upmanship, or just general bitchiness.

 
At 1:41 PM, Blogger Joolya said...

I know I was kind of appalled early in my graduate career when I was first attempting to do an experiment described in a published paper that no one else in my lab had done: the methods section said, "We prepared the hydrogels with or without x mg/ml aprotinin." And then it never mentioned again whether the reported data came from gels that contained or did not contain aprotinin - potentially important information in a gel invasion assay! Not to mention, aprotinin is not cheap (especially at the concentration they reported), so if it was not essential to the experiment I didn't want to have to use it. Was this a clerical error? Or were they fucking with me? I'll never know. But since then I take ALL published methods sections with a shaker of salt!

 
