Statistics Continued

Interestingly enough, after last week’s post, there is a brilliant article in the BBC magazine about doctors and their understanding of statistics.

Gerd Gigerenzer is one of those names in statistics I trust. His discussion of risk is fascinating. Take the example mentioned in the article:

As a doctor, you know the following facts to be true:

 

  1. The probability that a woman has breast cancer is 1% (“prevalence”)
  2. If a woman has breast cancer, the probability that she tests positive is 90% (“sensitivity”)
  3. If a woman does not have breast cancer, the probability that she nevertheless tests positive is 9% (“false alarm rate”)

 

 

A 50 year old female patient, who has no other symptoms of breast cancer, has a routine mammogram and tests positive. Alarmed, she asks you what her risk is. Which of the following is the best answer?

  • nine in 10
  • eight in 10
  • one in 10
  • one in 100

 

If, like me, you read this at lunch, box of strawberries in hand and one eye on your MOOC numbers, you probably said ‘nine in ten’. In fact the answer is ‘one in ten’. Why is this the case?

Well, first remember that if there are a hundred random women in a room, the prevalence of the disease in the population suggests that one of them will have breast cancer. Second, remember that if we test the same hundred women, roughly nine of the ninety-nine who don’t have the disease will test positive anyway, and the one woman who does have the disease has a 90% chance of testing positive (meaning it’s possible she won’t test positive at all).

So with no other symptoms to go on, and remembering that it’s likely that 10 of our hundred random women would test positive (one because she does have cancer and the other nine because they get false positives), the best estimate of whether this patient has cancer is actually one in ten. She might be the true positive. But nine times out of ten she’s the false positive.
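If you prefer to see the arithmetic written down, here’s a minimal sketch of the same calculation in R, using only the three numbers from the article (the variable names are my own):

```r
# Numbers from the article
prevalence  <- 0.01   # P(cancer) in the population ("prevalence")
sensitivity <- 0.90   # P(tests positive | cancer) ("sensitivity")
false_alarm <- 0.09   # P(tests positive | no cancer) ("false alarm rate")

# Overall chance that a randomly chosen woman tests positive
p_positive <- prevalence * sensitivity + (1 - prevalence) * false_alarm

# Chance she actually has cancer, given a positive test (Bayes' theorem)
(prevalence * sensitivity) / p_positive
# [1] 0.09174312   -- roughly one in ten
```

It’s exactly the same logic as counting women in a room, just written out.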

 

It’s an excellent teaching opportunity, and the maths makes sense when you think about it, but it’s keeping the two populations separate in your head that makes it difficult.

In other news, I picked up Andy Field’s ‘Discovering Statistics Using R’ and I’m really enjoying it so far.

How To Teach Me Statistics

A few weeks ago I was swearing at my computer and had to go buy a Twix bar from the canteen to calm myself. There was some frantic chocolate scoffing that afternoon.

The source of my irritation? Statistics. I am not a great wielder of statistical power, but I am very interested in their dark arts. This leads to the common situation where I know I’m doing something wrong, such as using stepwise regressions to build models, defaulting to frequentist rather than Bayesian methods, and over-relying on p-values to communicate scientific results, but I just don’t know how to do it better.

I’m expecting there are three reactions to that sentence. The first is “I don’t have a clue what any of that means”. Don’t worry, my grasp of it is very shaky, and it’s not something I’ve ever been taught. It’s something I’ve discovered through hanging out with statisticians.

The second is “Man, I have that exact same problem, but every time I try to learn how to do it, I can’t figure it out.” My friends, we are in the same boat. I do not feel I have enough statistical training to tackle these problems.

And lastly, the third kind of person is reading that and thinking “Well obviously the answer is *string of gibberish*”.

I have had good stats teachers, but they are sadly few and far between, and there are a lot of poor stats teachers who get in there in the meantime and deeply confuse me. I have a lot of good friends who try to teach me and I end up glazing over. What I mean to say is that the following is not personal – and it’s as much a criticism of myself as of those who have tried to teach me . . .

Loads of statistically savvy people are willing to teach; they just don’t seem to get it through to me. So seeing as I’m supposed to be quite good at this education malarkey, here’s my guide to teaching me statistics.

 

Make Sure We’re Speaking a Common Language

Yes, we really have to start with the basics here. Statistical language is incomprehensible to me. And that’s because we’re all taught differently.

As an example, I refer to response variables as ‘y’ and explanatory variables as ‘x’. A good friend of mine refers to explanatory variables as ‘y’ and response variables as ‘a’ or ‘b’. This causes huge confusion whenever we ask one another stats questions off the cuff.
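To R’s credit, its formula notation at least fixes one convention: the response always goes on the left of the tilde and the explanatory variables on the right, whatever letters you and your friends prefer to call them. A tiny sketch (the data frame and column names below are made up purely for illustration):

```r
# Response on the left of ~, explanatory variables on the right.
# 'my_cows', 'weight', 'diet' and 'parity' are hypothetical names.
model <- lm(weight ~ diet + parity, data = my_cows)
summary(model)
```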

And the common language refers to more than just making sure I understand what your big formulas are saying. This is what the homepage of R looks like. R is a sophisticated and free statistical tool that we should all be using. I’ve seen more intuitive GeoCities layouts. This is written by and for coders and I have to explain how to extract a zip file to some of my colleagues.

Why are you writing your R manual, or your page about your fancy new statistical technique? Are you trying to share it with others who think like you? Fine, carry on. Are you trying to improve the statistical techniques used by frustrated, busy scientists who haven’t had more than a few weeks of stats CPD a year?

Use your words.

Now the R Book is a good start for people wanting to learn R, but I still wish it had been written by Andy Field, whose Discovering Statistics book is still my favourite bible, even though I don’t use SPSS anymore. If you’ve read both, you’ll see the difference in style is extreme, and I think it’s because, as a social scientist, Field has a better grasp of how people think. (Although speaking of GeoCities sites . . . I still love the book!)

Edited to Add: I lie! Andy Field has written an R textbook, which I have just bought! Thanks to Comparatively Psyched for the heads up! 

 

Teach Me Something I Can Use

This may seem to contradict what I said further up, but if you’re trying to teach me, say, an alternative method to a stepwise regression, don’t just give me a dataset and tell me the code to run.

Tell me how to arrange my dataset in the way it’s needed. Ask me questions about my data – get me thinking about the complexities of the experiment I designed. And then tell me the code to run. Don’t forget to walk me through the output. For example, the documentation for the lars package in R explains how I can run a least angle regression on a sample dataset. Great. I can copy and paste that code ad libitum. Can I get it to work on my data? Even though, to the best of my knowledge, I’ve arranged it in the same way? Nope.
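To be concrete, here’s roughly what that gap looks like, sketched in R. The first half is the kind of example the lars documentation gives (the diabetes data ships with the package, already in the right shape); the second half is the part I actually need spelled out, with a hypothetical data frame standing in for my own messy data:

```r
library(lars)

# The documentation-style example: 'diabetes' comes with lars, its 'x' is already
# a numeric matrix and its 'y' a numeric vector, so this "just works".
data(diabetes)
fit <- lars(diabetes$x, diabetes$y, type = "lar")

# My data never looks like that. lars wants a numeric predictor matrix, not a data
# frame full of factors and NAs. 'my_data' and 'outcome' are made-up names here.
complete <- na.omit(my_data)
x <- model.matrix(outcome ~ ., data = complete)[, -1]  # expand factors, drop intercept
y <- complete$outcome
my_fit <- lars(x, y, type = "lar")
plot(my_fit)  # and then please walk me through what this plot is actually telling me
```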

Get me to work through the whole process and you’ll show me where your new method fits into my life.

 

What’s the Application?

I recently sat through a stats seminar where someone was showing off a new method. In the same presentation they briefly glossed over ternary plots as a way of showing off new data.

Applied scientists work in a world that judges us on the number of papers we produce and the impacts our papers have. That is literally how we get our baseline funding.

I don’t disagree that there are lots of problems with publishing but you’re asking me to relearn how I think about statistics, and then to communicate all this in a real-world paper with real-world data (that doesn’t always play nicely). If you’re asking somebody to use an amazing new technique, you’re asking them to get that past reviewers (who more often than not will not know your new stats).

If you have a great technique but it won’t actually give me a conclusion that I can use to improve animal welfare, then it’s not going to help me. And related to this . . .

 

What does it Mean?

The truth of the matter is that the statistical tests we commonly use are ‘plug and play’. We get into the habit of checking the things we want to look at and noting the laundry list of caveats in a footnote.

Walk me through an example of what my results mean. If you’ve got me using my own data, tell me if this result confirms or denies my hypothesis, show me why, give me some indication of the next step.

I’m amazed at how many people don’t do this when trying to explain stats to me. You’re interested in the method, I get that. I’m fascinated by recording aggression in groups, but there’s a time and a place to discuss this, or just to tell you what aggression means.

 

Don’t Assume I’m Stupid

I see this all the time when statisticians are trying to teach something to scientists. They spend a very long time on the basics because our fundamentals are so scattered. This is not the most helpful approach. The other method I often see: when I say I don’t understand, or even hesitate, the statistician repeats what they’ve said, more slowly and slightly louder.

We’re not stupid. Try teaching us a complex problem in an environment we’re familiar with (i.e. with our own data) and you’ll be surprised how many fundamental skills we’ll pick up because of it. To use a simple analogy, if you wanted to teach me how to maintain a car, wouldn’t you be better off showing me how to take an engine apart rather than build one from scratch?

Don’t spend half our time explaining the problem to me – I get that there is a problem with the statistics I already use; it’s why I’ve sought you out. Is a finer understanding of the theory really going to help me use this test in future?

 

Finally – Why Are You Teaching Me?

This blog post sounds very whiny. Trust me, I know.

I know I should have learned all this earlier in my career. I know I should use R every day until I’m fluent. I know I shouldn’t be using all these out-of-date stats. But the sad truth is that I haven’t, I don’t and I can’t.

I want to change, and I need the great community of statisticians to help me. So if you’re a statistician who wants to help me and people like me, this is how I’d suggest doing it.

Good luck!

Fifty Thoughts You’ll Have Writing a Grant Proposal

As an early stage researcher, there are a number of thoughts that will cross your mind as you try to write a grant. Allow me to demonstrate.

  1. That grant is totally relevant to me!
  2. Oh, but the deadline is next month.
  3. Still, that’s so relevant. Damn did they write this call for me?
  4. OH MY GOD LOOK HOW MUCH MONEY IS IN THAT POT
  5. I could get a mortgage if I got this grant.
  6. That’s Han Solo levels of wealth right there.
  7. That’s it. I’m applying. What’s there to lose, right?
  8. Well I might lose my early stage researcher status.
  9. Is there a thing about that?
  10. Why are all these guidelines so long? Jesus.
  11. Ok, I’m probably still eligible for early stage researcher status if I apply.
  12. Is this idea going to work? I will have to bend it a bit.
  13. What are the buzzwords of the month?
  14. Can I link this to climate change?
  15. I can probably link this to climate change.
  16. I’m really not sure if this idea is going to work.
  17. Hell, I’m going to email Betsy about it, Betsy’s pretty cool, she’ll tell me more.
  18. Unless she decides she wants to apply for it.
  19. Maybe I could email Colin, Colin’s always really helpful.
  20. I should probably bring this up at a team meeting.
  21. But I’ve only got a month till the deadline.
  22. Fuck it. Write a grant and take it to the boss. What can go wrong?
  23. Hmm. Will they accept that the aim of this project is to keep me in booze for the foreseeable future?
  24. Damn, only 700 words to write my outline, that’ll be tough.
  25. Um.
  26. What is my aim again?
  27. Why is this so hard? I wrote a bloody thesis. I write for a living. God. Maybe a cup of tea will help.
  28. 600 words to go.
  29. I’ll do a literature search! I’m amazing at literature searches.
  30. Wow, quite a few people have done this.
  31. Okay, that is in fact my idea.
  32. They did it in the eighties.
  33. Well I’ll update the research.
  34. I should have emailed Betsy.
  35. Okay I can write this.
  36. I survived the PhD, I can survive being a real academic.
  37. Mendeley, why are you so awesome?
  38. Okay it’s done! It only took me . . . oh my god the deadline is tomorrow.
  39. Quick! What does the boss think?
  40. What do you mean, the grant doesn’t fund overheads?
  41. DAMN IT.
  42. Maybe we can repurpose it for a Fellowship?
  43. Yeah, that experiment is way too expensive, I should take it out.
  44. And Colin says this other experiment won’t work because he tried it twenty five years ago and never published the non-results.
  45. I can see that mortgage slipping through my fingers.
  46. Oh yeahhhhhh, technician time.
  47. No I did not budget technician time.
  48. Yes the project would pretty much demand a whole three technicians.
  49. What do you mean the uni already put a proposal in to this grant?!?!?!
  50. Fuck it. Write it up as a M.Sc project for next year and hope you get a good student.

Retrospectively

I’ve been doing a lot of navel gazing lately, professionally speaking of course, because June is a month of anniversaries for me. Most recently, June marks one year since walking out of my PhD viva and being called Doctor. It marks five years since finishing my undergraduate program. It marks ten years since my last day in high school. And it marks my twenty-eighth birthday. Navel gazing has been rife. I have a mounting concern that I will never be a real adult.

With that being said, I feel like it’s a good time to take stock of my career, particularly as I was recently reminded of how hard it is for final-year PhD students to see anything other than the doom and gloom that surrounds them in that period of their lives. So this is my attempt to show you that one day you’ll feel good again.

Earlier this month I was supposed to be converting some slides for our MOOC when I was sucked into the ThesisWhisperer blog, taken there by a link and then unable to stop clicking through the stories. It reminded me just how awful I felt when I was finishing up. I felt defeated, utterly, and handing over the thesis was nothing like the victory I thought it would be when I started.

I was sick. I handed in my PhD thesis covered in chicken pox blisters (unbelievably, the third time I’d had the infection). In the six months that ran up to my submission date I had been constantly ill with sore throats, migraines and repeated colds. My insomnia had never been so bad, I cried in our work’s canteen, and I was so ready to walk away from the office and never return.

Except I was back the next week because I’d scored a three week contract. Despite my conviction that I was out, I couldn’t turn down the money. That led to a month’s contract. Then a three month contract, then a six month contract, and now I have guaranteed paycheques up until the end of March.

 

 

The Valley of Shit

The ThesisWhisperer blog talks about the Valley of Shit, and I can remember my valley vividly. It lasted from roughly December 2012 to May 2013, when I handed in.

I’m a competitive person. I like to be the best, and I’d work for nothing if people told me I was wonderful (please don’t tell my HR department). My PhD was the first time I’d ever had to confront the fact I wasn’t the best. My PhD made me confront the fact that not only was I not the best, I wasn’t even in the top percentiles. That was a hard, hard lesson to learn.

Approximately a month before I submitted, my PhD’s key paper was rejected from a journal because of one reviewer’s comments (the worst paper they’d ever read, they couldn’t believe my coauthors had deigned to put their name on it). I cried in the cafeteria in front of my bemused supervisor. She told me I’d need to develop thicker skin, which seemed absolutely impossible.

This month another paper of mine was rejected from a journal (although the comments I admit were much more positive and it was rejected from a very well respected journal that was a bit of a long shot). It barely registered on my radar.

I think this is a big part of the Valley of Shit. Everything feels like the end of the world. I remember being on the phone to my mother and asking her if she would still love me when I failed. Which is ludicrous, of course, but still something I felt I needed to ask. So, yes, the Valley of Shit exists. I clearly lost all perspective in this period of my life.

 

 

The Plateaus of Okay

My viva was a long and arduous one that resulted in remarkably few corrections, at least from my point of view. A few months after I’d submitted my corrections and the University’s Senate agreed I could be awarded the degree of PhD, I got my six month contract extension.

One morning I was in the shower, washing my hair, and I felt a distinct sense of unease. It took a moment but I realised what was unnerving me: I had nothing to worry about. For so long I’d been thinking of the PhD and finally there was nothing to be fretting over. What could I think about instead?

I think I ended up reading the shampoo bottle. It took a while to relearn the art of the shower daydream.

It takes a long time to adjust to being on the Plateau of Okay. There are little things, like not wanting to take all your holiday days because you want to be invaluable. There are big things, like fretting over the fact I still don’t have a postdoc and I’m moving further away from research and into education instead. The thing about the Plateau is that you have the space to remember how to cope with these challenges.

Just before Christmas I was offered an interview for a job that I didn’t really want. The interview was at an inconvenient time and in an inconvenient place. But it was a full-time, permanent position with a higher salary than I’m on right now. After some deliberation I declined the interview, and felt sick for the rest of the day.

In the Plateau you start to make your choices based on what you want, rather than what you’re frightened of. And that in itself is terrifying. I’ve turned down a few jobs and interviews because they’re not quite what I want, and I still wonder if that was the right thing to do. I’ve also been turned down for jobs I thought were perfect for me, and that is what the pub and your friends are for. In the Plateau, it’s not about losing the fear, but recognising you have choices again. You’re no longer trudging endlessly, you can go in any direction.

It’s pretty intoxicating.

 

 

The Peaks of Happiness

This month I won some project money (a small amount, certainly no postdoc, but still). I have enjoyed what I’ve been doing thoroughly. I’ve booked a holiday with all those holiday days I didn’t use last year. And I got my longest contract extension yet.

When I was reading the ThesisWhisperer I realised I was at the Peak of Happiness. All the things that upset me about academia are obstacles to deal with in a few months’ time (like the next contract extension, my lack of paper output this year, how I’m supposed to do grown-up things like buy a house or a pet when I don’t know where I’ll be next year . . .). I was feeling truly elated.

This time last year I could not have believed that I would be this happy.

A peak means there must be another valley further on. I know I’ll need another contract extension, that there are still grants that need to be won, and that if I want to leave those parts of my life behind I’ll have to sacrifice the parts of academia I love. You can’t just stay on the peaks of life, but you can hope the plateaus keep climbing, which is what I have decided to do. I’m not afraid of the deep dark valleys right now, because they inevitably end. As the poet said, this too shall pass.

 

 

But most importantly of all, in a few months time I’ll be going to my high school class’s ten year reunion. I guess I could introduce myself as a pet psychiatrist.

Badger Fortnight – The Solution?

But Jill, a fortnight is two weeks not three.

Shut up, that’s what.

 

This week I want to discuss two main studies – the first by Torgerson and Torgerson (2010) and the second a 2008 article in the Veterinary Record. We have talked about why the disease is a problem and why the cull hasn’t worked, so the question becomes: what now?

The Torgersons start off with the claim that Defra’s continual fight against bovine tuberculosis is a misplaced use of public resources and we should just chill on the whole thing.

What’s their reasoning?

They start by going into the details of the few cases Britain has had of humans developing bovine TB. They note that between 1993 and 2003 there were only 315 human cases of Bovine TB, and only 14 of those were in British nationals born after 1960.

Molecular investigation found that only 10 of the 25 spoligotypes of the bovine TB present in infected humans were actually present in contemporary UK cattle. They describe two cases from Gloucestershire where on-farm transmission from cattle to humans was likely. A third case in Cornwall where a veterinary nurse was infected was considered to have more likely come from her dog. (Interestingly, cats and ferrets are also known vectors of bovine TB and I know I’ve had more cats sneeze into my mouth than badgers, and I’ve probably worked with more badgers than most . . . Ragg et al, 1995). The more infamous six cases which sprouted up in Birmingham featured a UK national with a ‘history’ of drinking unpasteurised milk at home and abroad. And four of these six patients were likely immunocompromised.

Historically, Bovine TB did not come from cattle-to-human airborne transmission, but through milk. And as we pasteurise all our milk nowadays, the Torgersons conclude this risk is now negligible. I want to take a moment to say that I have anecdotally observed a strange counterculture of people who love unpasteurised milk (in fact it is a topic of conversation that seems to leap up whenever I tell people I work with dairy cows). Unpasteurised milk drinkers are a little like foodies who insist you’re using the wrong kind of spice, and I’m often asked if I drink unpasteurised milk myself – once I was fairly certain my optician wouldn’t sell me glasses until I converted, but I digress. Seeing as milk makes me ill at the best of times, the thought of drinking it unpasteurised ‘gies me the boak’, as we say in Glasgow. Unpasteurised cheese is another matter . . .

Where was I?

Oh yes. The conclusion of the Torgersons’ paper (are they brothers, or did they just think they would be epic scientific partners?) is that our hypervigilant position on bovine tuberculosis in the UK is a waste of public resources. They don’t see a reason for spending so much money on a disease which so rarely affects humans.

The Veterinary Record article doesn’t quite agree, but following the randomised badger culling trial in 2007, they too realised that badger culling was not the way forward (yes, we were having this discussion seven years ago). In the article they propose:

  • More frequent testing of cattle using combined tests to detect active disease.
  • Research on post-movement cattle testing.
  • Research into a vaccine for cattle and badgers, with immediate use as soon as it’s developed.
  • Research into the disease.
  • Get farmers to understand the need for greater on-farm biosecurity.

 

Really, this article, back in 2008, was proposing the oldest solution: identify, research, prevent.

So why aren’t we there yet? Well seven years in research and pharmaceuticals is not a lot of time. Defra’s old website has a page on cattle vaccinations and it points out that the EU prohibits vaccinating cattle against TB (because being vaccinated makes some cattle test positive for TB, ergo herds cannot be declared TB free because the vaccine may be masking infection. The EU prohibits trade of TB infected cattle).

The BCG vaccination is not brilliantly effective in cattle, so we either need a better vaccine, or to use that vaccine to protect some of the herd and reduce the number of cattle we need to cull. But it’s expensive and hampers trade with the EU.

This post will be published a week after I voted in the European elections. I can tell you I didn’t vote for UKIP or anything like that (I’m a good left winger who lives in Scotland; you get three guesses on my vote and the first two don’t count), but the role of EU legislation in our Bovine TB problem can’t be ignored. The Farmers Guardian reports that the European Commission doesn’t expect a vaccine to be around until 2023.

There is definitely something to be said for better biosecurity measures on farms. There are some brilliant farmers, and there are some poorer farmers, and coughs and sneezes spread diseases. We have known this since Koch came up with his postulates. The good farmers resent being told what they already know and the poor farmers resent being told to do better. We come back to my old hobby horse – how do you communicate that science to a varied audience?

And finally the TB test – if we can find a test that can reliably discriminate between infected animals, active infections, vaccinated animals and TB-free animals, we can still trade with the EU. These things all take time, money, and a little bit of luck.

We won’t find the solution to the bovine TB problem on a welfare scientist’s hobby blog. The answer is not badger culling. Nor is it, as the Torgersons suggest, just letting the disease roam free. If we want to trade with the EU we need to deal with it.

 

Just wait till Defra finds out cats transmit TB . . .

 

Badger Fortnight: TB

For the next couple of weeks I am dedicating Fluffy Sciences to the noble badger. Why, you ask? Well because the other day I ended up reading Defra’s independent panel report on the UK badger cull and the whole thing made me grumpy.

As someone who works both in animal welfare and in the agricultural industry, with a soft spot in my heart for cattle, I have heard a lot about Bovine TB and badgers in the past few years. I’m going to spend the next few posts telling that story, and where better to start than with Bovine TB itself? After all, without this insidious disease, badgers would be fondly remembered from The Animals of Farthing Wood, or as the noble lords of Redwall. Instead they’re synonymous with James May and the word ‘cull’. An interesting turn of events.

So. Bovine TB, the villain in our tale. What are you?

When you’re reading or watching some trashy historical drama and the heroine coughs into her handkerchief, staining it gently with blood, you know she’s not long for this world. Satine, I’m looking at you. That disease is Tuberculosis, or consumption, if you’re still feeling gothic.

It’s a famous disease in science because of Robert Koch, who formulated the well-known Koch’s Postulates, a set of rules for identifying the causative pathogen of a disease. While still very much remembered, they have since been supplanted by other criteria better able to handle things like parasites and even non-active infections.

  1. The microorganism must be found in abundance in all organisms suffering from the disease, but should not be found in healthy organisms.
  2. The microorganism must be isolated from a diseased organism and grown in pure culture.
  3. The cultured microorganism should cause disease when introduced into a healthy organism.
  4. The microorganism must be reisolated from the inoculated, diseased experimental host and identified as being identical to the original specific causative agent.

Koch identified the agent of TB, Mycobacterium tuberculosis, and he received a Nobel Prize for his troubles. Interestingly enough, he received his prize even though for years he’d been convinced that Bovine TB and human TB were not similar, and it was his results that forced him to reevaluate this position.

Fast forward a couple of hundred years and this disease, which at one point was causing 25% of the deaths in the world, was on the run. Almost all Brits have a peculiar little scar on their left upper arm, the BCG (Bacillus Calmette–Guérin) scar. I have had some Americans quiz me about it and it turns out Americans never had a mass BCG immunisation. You guys missed out on some quality arm punching in school.

If you’ll think back to your school days, you’ll remember a thirteen-year-old you suspiciously watching a nurse inject you just under the skin on your arm. A few days later they inspect the mark and then decree whether or not you will receive the BCG vaccination.

It’s often said that if you react to the skin test it means you are already immune to TB, but this isn’t quite true. It means that your body reacted to the tuberculin in the test (in a healthy person, a red blister bigger than 15mm in diameter), which may be because of a previous vaccination or because your body already harbours a TB infection.

Regardless, those who escaped the skin blister get called back for the vaccine. The injection goes between the layers of the skin (large ulcerated BCG marks are often an indication of an accidental subcutaneous BCG). As vaccinations go, it’s quite painful (I remember it being more uncomfortable than three successive rabies vaccinations), and it’s not helped by the traditional teenage sport of punching people in the vaccination spot.

Large keloid scars can form, although these days the scars are not so prominent thanks to improved techniques. I did try to take a picture of mine for you, but it is tiny and barely shows up. The mass BCG vaccination program has recently been suspended in the UK as the disease is now considered very rare. Only at-risk groups are vaccinated now.

 

 

Despite the fact that antibiotic-resistant TB is on the rise (and that this is frightening), in the UK we manage the disease in humans fairly well. Other countries, such as India, haven’t been as successful as we have in using the BCG vaccinations – it seems that the disease is harder to manage in equatorial regions, for reasons I’ll not speculate about here.

However, Bovine TB is caused by the very similar pathogen Mycobacterium bovis, which can cause TB in humans if they drink the unpasteurised milk of an infected animal, or if they inhale aerosol droplets (coughs, spit and sneezes) from an infected cow.

Being someone who has been coughed, spat and sneezed on by various cows, I’m not particularly worried about this myself: even though Bovine TB can cross the species barrier, TB itself rarely becomes an active infection in the person carrying it.

So why do we worry about it in cows? Well we have a strange double standard here. When we test cows, we use a skin test very similar to the one we use in humans. And, like in some humans, there is a reaction to this skin test.

And those cows which react to the skin test must be culled. You’ll remember that only a few paragraphs higher up I mentioned that a reaction doesn’t mean immunity, and it doesn’t mean there is an active infection; it just means the animal is reactive. Yet we cull those animals specifically to prevent the spread of the disease.

You can sell the meat of the culled animal if you want, because cooking meat kills the Mycobacterium, but you cannot tolerate a TB cow on your farm.

And this isn’t considering false negatives and false positives in the test, as no diagnostic test is perfect.
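And that matters, because the same arithmetic as the breast cancer screening example from an earlier post applies just as brutally here. Purely for illustration, with invented numbers (these are not the real performance figures for the skin test), the same sketch in R:

```r
# Invented numbers for illustration only -- NOT the real skin test figures
prevalence  <- 0.005   # assumed proportion of truly infected cattle
sensitivity <- 0.80    # assumed P(reactor | infected)
specificity <- 0.998   # assumed P(non-reactor | not infected)

p_reactor <- prevalence * sensitivity + (1 - prevalence) * (1 - specificity)

# Chance a reactor is genuinely infected...
(prevalence * sensitivity) / p_reactor
# ...and expected false reactors per 10,000 uninfected cows tested
(1 - specificity) * 10000
```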

The fear is that Bovine TB will either infect people or reactivate a latent TB infection. Because of this fear we cull any cows infected with TB. Defra have also produced what they optimistically call a leaflet (at 21 pages), ‘What happens if TB is identified in your herd’, which reminds me of those Simpsons leaflets like ‘So You’ve Ruined Your Life’.

 

The moment a cow reacts to the skin test the herd is classed as suspect and moving cattle out of and into the herd is restricted. Any reacting animals must be isolated from non-reacting animals and culled. The milk is to be dumped.

The reactor cows are tested post mortem, but even if your reactors show no clinical signs of TB, your herd status is still Officially TB Free Suspended. You will need a clear test (or two clear tests if TB was found in your reacting cows) for your Official TB Free status to be reinstated.

If you have a TB farm you also have restrictions on where you can spread your slurry and on what you can do if your cows die on farm.

It is, in short, a huge hassle, resulting in cow deaths and loss of farm profits, as well as posing a health risk to humans.

 

It is with these facts that the government turned its attention to badgers, and that, dear readers, we will discuss next week.

Granny and the Blackfish

New title for a band: Granny and the Blackfish.

 

TheDodo.Com reports that Granny, a 103-year-old orca, has been spotted off Canada. Just another example of wild, long-living orcas. And another spit in the face to SeaWorld.

How the Elephant Got Her Trunk

When I was very small I had a beautiful picture-book edition of Rudyard Kipling’s just-so stories. One I particularly remember is the tale of how the Elephant Got Her Trunk. She was staring at her beautiful nose in a pool every day until one day a crocodile swam up underneath her reflection, grabbed her nose, and pulled and pulled and pulled. The little elephant struggled for so long her nose was stretched all the way down to the ground before the crocodile finally released her. And that, children, is how the elephant got her trunk.

It’s silliness of course. We all know that evolution acts upon populations through random genetic mutations, and that whatever happens to the individual, so long as it doesn’t stop them reproducing, doesn’t matter. All you need to do is procreate; after that the genetic material is mixed and the next round of mutations starts.

For the most part, these are the rules of evolution. But all rules have exceptions, and sometimes evolution works through a different mechanism, that of epigenetics.

You don’t come to FluffySciences to find out about cell mechanisms and inheritance (if you are coming here for that we need to have a conversation about our relationship) but you do come here for the real-world explanations. Epigenetics results in a kind of Just So story where a stressful event can result in a change that can be passed down to the next generation. And the Just So story we use to illustrate this isn’t so pleasant.

In the Netherlands in 1944 there was a Hongerwinter, when the Nazis cut off food and fuel transports in the river areas. Tens of thousands of people starved to death. Can you imagine the endless hunger, the enemy soldiers in your streets, the cold? At one point the daily calorie allowance was less than 600 calories.

The immediate effects of this horrible famine were obvious. Pregnant women gave birth to very small babies, children lost the ability to digest wheat, and Audrey Hepburn developed anaemia. But what happened next?

In Painter et al’s 2008 study (which is open access and you can read here) they investigated the outcomes for the mothers who starved (F0), their sons and daughters who were born during the famine (F1) and their grandchildren (F2). Using a combination of historical health records, interviews and health checks, they investigated whether the ill health of the grandchildren could be attributed to what the mothers experienced.

Between 07/01/45 and 08/12/45, children in this cohort were being born to mothers who had, during one of the 13-week periods of gestation, an average daily calorie allowance of 1000 calories. To put this in context, the typical pregnant woman needs 2200 calories per day to maintain both her body and her baby’s growth. In the Dutch cohort study, children who were born between 01/11/43 and 06/01/45 or between 09/12/45 and 28/02/47 were either born before the famine truly began or conceived after the worst was over. They act as controls. They are from the same population, with similar mothers, a similar time period, even similar psychological stresses. They just don’t have that crippling 1000-calorie-a-day limitation during the important parts of the baby’s gestation.

The researchers monitored the F1 generation’s (remember, that’s the sons and daughters) weight, BMI and socio-economic status, and ran blood tests to look at how they cope with sugar and at their good and bad cholesterol levels. They did all this when the sons and daughters were 58 years old.

Then they asked about the F2 (grandchildren). Were the grandchildren premature, on time or late? What did they weigh at birth? Were they twins? How many kids? What order? How many girls? How many boys? How healthy are the grandchildren?

At this point the researchers know what has caused the ill health that F1 have suffered – it was the famine. The question is, has this ill health, which was entirely due to a short term environmental challenge experienced in utero, been inherited by the grandchildren? The two categories of disease they were most interested in were cardiovascular/metabolic diseases and psychiatric diseases.

They used a variety of mixed models (which allow you to include multiple children of the same parent, something that would otherwise be a case of pseudoreplication), regression models, and some other statistical tests that can cope with non-parametric (i.e. real-world) data to investigate these questions.
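For anyone curious what a ‘mixed model’ looks like in practice, here is a minimal sketch in R using the lme4 package. The data frame and column names (‘famine’, ‘birth_weight’, ‘f1_exposed’, ‘mother_id’) are hypothetical stand-ins, not the study’s actual variables:

```r
library(lme4)

# One row per F2 grandchild. Siblings share a mother, so they are not independent
# observations; the random intercept for mother_id handles that pseudoreplication.
model <- lmer(birth_weight ~ f1_exposed + (1 | mother_id), data = famine)
summary(model)
```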

Their results showed that the F1 generation were smaller at birth, and as adults they had higher blood sugar levels two hours after eating than those who hadn’t suffered through the famine. So far, so expected. The children of F1 women were also born smaller (though they weighed the same) if the F1 woman had been in utero during the famine. This is the really interesting thing. The environmental effect was inherited. Also, the F2 children of the F1 generation who had been exposed to famine earlier in their gestation (i.e. when they were forming) were more likely to have poor health due to ‘other’ causes. Finally, and the one I consider to be most interesting, the F2 generation of those who had suffered in the famine were more prone to being fat babies.

The last point interests me the most. I’m working on a project looking at prenatal effects in farm animals and the way we talk about this phenomenon is to say that the offspring ‘samples’ the mother’s environment in early gestation. For example, if the offspring is receiving little food and lots of stress then it should prepare itself to be born into a world where resources are unpredictable and scarce. Whatever cellular changes it switches on to do this make it predisposed to obesity and heart disease in a normal environment, and can be passed on to its own offspring.

If you remember, last week I ranted about nature vs nurture; this is why. This significant change in baby fat in the famine group is not genetics – the population is not genetically different. It’s not the environment, because the famine was long over when these kids were in the womb. It’s the historical environment and the changes it produced. Remember: never use Nature Vs Nurture, kids.

There’s still plenty we don’t know about epigenetics – how many generations can these effects last for? How cumulative can the total effect be? And when it comes to prenatal effects, what is the mechanism by which the offspring samples the mother’s environment? As a relatively new field, it’s sexy and cool and a lot of people are into it. Expect a lot more information about it in the future.

Just So.

The Anthropomorphism High Horse

I rarely read a piece of scientific journalism and think “what absolute tosh”, in part because I tend not to use the word ‘tosh’ and in part because I know that science journalism involves digesting and reconfirming a complex idea. It’s not easy.

But this article had me gnashing my teeth. It’s a summary of a paper by Ganea et al 2014 [in press pdf download – only link I can find]. The essence of the paper is this: children who grow up in urban environments (in this case pre-school age children from Boston and Toronto) are not exposed to animals. When they’re given anthropomorphic stories about unfamiliar animals (cavies, handfish and oxpeckers), they will agree with statements that attribute complex emotions to those animals, but not with statements which attribute human physical capabilities, e.g. talking, to the animals. The conclusion is that anthropomorphic animal stories inhibit a child’s ability to learn animal facts.

The science I think is interesting – it is the conclusion, and the bandying about of the word ‘anthropomorphism’, that get my goat. Let me rant at you.

The article’s author says:

Setting aside the shades of grey as to whether non-human animals have analogues for things like friends, the findings suggest that for young kids, “exposure to anthropomorphized language may encourage them to attribute more human-like characteristics to other animals than exposure to factual language.”

 

 

This anthropomorphism spectre infuriates me at times. Let me put it this way: one of the questions asked of the children was “do oxpeckers have friends?” I’m asked relatively frequently if cows have friends, and if I want to answer that question accurately, I have to dance around terminology and use baffling scientific language to answer it in a way that means ‘yes, but I can’t really say that because I’m a scientist’.

Cows have preferential associations within their herd. Being with these other individuals makes them more capable of physiologically coping with stressful events (Boissy & Le Neindre, 1997) such as being reintroduced to the milking herd (Neisen et al, 2009), being milked (Hasegawa et al, 1997), or feed competition (Patison et al, 2010a). They will preferentially engage in social interactions with these preferred associations, and these associations go on for longer than with other animals (Faerevik et al 2005, Patison et al, 2010b).

How do you explain this to a 2-5 year old child from Boston without using the word ‘friend’ or any synonym of it? Is it any wonder a child might reasonably assume that animals can have friends? Is it wrong to say that an animal can have a friend?

My irritation here lies with the writer of the article saying children believed ‘falsehoods’ about animals, based on anthropomorphism. We get one link, to a website I can’t access being based in the UK, to research which might suggest animals are similar to us in some ways. Then we move on to a paper I’ve referenced before talking about how dogs’ guilty looks are based on our behaviour (Hecht et al, 2012). The underlying assumption is still that animals are so different from us that children are wrong to believe that animals have the capacity for friendship and caring.

Now I’m fascinated by dogs for precisely this reason. They are so excellent at communicating with us, and reading us, that they are almost as in-human as they are in-animal. They’re so adept at this that they’re a possible model for human-child behaviour. I wouldn’t necessarily use dogs as an example of how the rest of the animal kingdom thinks if I was very worried about making cross-species comparisons.

Anthropomorphism is either the attribution of human characteristics to animals, in which case it cannot be used pejoratively. For example, to say “This cow has eyes” would be anthropomorphic.

Or anthropomorphism is the inappropriate attribution of human characteristics to animals, in which case you must carefully consider why the characteristic is inappropriate when given to animals. It is not anthropomorphic in this case to say “This cow feels fear”, because fear, as we understand it, is an evolutionary mechanism to increase your chances of survival, it has physiological and behavioural components and the cow meets all of these. Ergo, this cow feels fear, and that is not an inappropriate characteristic.

Much as I lament the fact urban children have very little contact with the natural world, and I think this is a major issue for animal welfare, food sustainability, and the mental health of the children, I don’t fully agree with the paper’s conclusions, or the writing up in the Scientific American blog.

Firstly, the study found that all children learned new facts regardless of whether they read the anthropomorphic story or the non-anthropomorphic story. The results appear to indicate to me that there was less fact-retention with the anthropomorphic story (and while I’m not a psychologist, I have worked with children and I do now work in education, and I wonder if the anthropomorphic story, being similar to entertainment, signalled ‘you do not need to pay attention here’ to the kids. This does not appear to be discussed in the paper).

Secondly, the study found that the children who had anthropomorphic stories told to them were more likely to describe animals in anthropomorphic terms immediately afterwards. Now again, I’m no psychologist, but after I went to see Captain America I was partially convinced I was a superhero. It faded after the walk home. I’d like to know more about how long this effect lasts before I declared anthropomorphic stories damaging to children’s learning.

Thirdly, the Scientific American article presents some ‘realistic’ and ‘anthropomorphised’ images of the animals side by side. This is not what happened in the paper. In the first experiment the children were shown ‘realistic images and factual language books’ or ‘realistic images and anthropomorphic language books’. The second study used ‘anthropomorphic images and factual language’ and ‘anthropomorphic images and anthropomorphic language’. The upshot of this is that the realistic image condition was not directly compared to the anthropomorphic image condition, regardless of how it seems when you read the Scientific American article.

The paper says at one point:

This reveals that, like adults, young children seem to have a less clear conception of differences between humans and other animals in regard to mental characteristics, as opposed to behaviors. However, exposure to anthropomorphized language may encourage them to attribute more human-like characteristics to other animals than exposure to factual language.

 

 

Well there’s little wonder about that because even we scientists don’t have a particularly clear conception of the mental differences between humans and other animals. The paper itself is interesting and well worth a read, but it falls into the trap of thinking about anthropomorphism as a wholly negative thing. If I was a reviewer I’d suggest Serpell (2002) as an excellent starting point for a more balanced view of the phenomenon.

And I’d also suggest they watch this video before assuming that kids are daft for thinking animals feel emotions.

 

The Selfie Cancer

Have you taken your make-up free selfie yet? Or are you rolling your eyes at the very thought? The split between the two camps is pretty much 50:50 on my Facebook wall.

Let me start with this. I am terrified of cancer. There are few diseases that frighten me. As a biologist, and with a family that comes in the medical flavour, I tend to view disease with more fascination than fear. But this obsession with the mechanics of the body breaks down when I’m confronted with the C word.

It’s not that I have bad experiences with cancer. The worst thing cancer has done to me is present a few non-malignant tumours in close family members, causing a few months of unease until the offending lump was excised. A grandfather died of an unknown primary tumour, a quick decline after a surprise diagnosis. And a grandmother died of cancer before too many memories of her formed. Family lore says her radiation badge from her days working as a nurse in radiology was too often blackened, and that she ignored the signs for too long. I’ve been told we have the same hair.

But it still frightens me. Is it the chaotic nature of the disease? Cells which divide forever, heedless of the proper order of the body? Yes, I am a bit of a control freak. Is it the way it lurks? The lumps and bumps that might seem normal. Is it that, despite being shown by a nurse and looking at the diagrams, I’m still very unclear on whether I’m doing a breast exam right? Is it the vestigial cultural taboo of the C word?

But as a scientist, cancer holds other problems for me. If you forced me to give you my contribution to the world’s scientific knowledge I’d tell you I enhanced our understanding of how personality affects animal behaviour. Anonymous internet commenters have asked me why I didn’t spend my time curing cancer instead.

Build a Large Hadron Collider – why didn’t you spend that money curing cancer?

Define our theory of physics – why don’t you use that time to cure cancer?

Launch a telescope into space – shouldn’t you be curing cancer?

Work in cancer research – shouldn’t you be curing cancer faster?

I’m sure most of my fellow scientists will have had this accusation levelled at them once or twice. Never mind that markets don’t work like this, that scientific progress requires more than one discipline of study. Never mind that I’d be useless in a lab because my natural talents lie towards the empathy and big-picture-view that make me a good ethologist. Why don’t we all go cure cancer right now?

Here I direct you to another wonderful science communicator: Jorge Cham. As the creator of PhDComics.com he has plenty to say on the experience of being a scientist. When he visited a cancer centre he had to ask: why were they listening to him and not off curing cancer?

Please do visit that link. It’s one of the most informative links I’ll ever point you towards. To call this monster simply ‘Cancer’ provides a smoke screen that disguises the true problem. There are many, many cancers and there are many, many hurdles on the way to curing, or even treating, those many, many cancers.

This brings us to the selfie trend. Take a photo of yourself without makeup to raise awareness of cancer. The Telegraph reports the trend has already raised a million pounds. The Independent editorialises the death of vanity. And Closer magazine thinks we’re all missing the point (they helpfully tell me the point is to donate money).

It’s always easy to criticise. I’ll start my criticisms by saying this no-make-up selfie bandwagon sensationalises women who choose not to wear make-up. It is somehow ‘brave’ to appear as you do when you wake up in the morning. I have apparently been subjected to ‘horror’ if I’m to believe the self-deprecating captions on each selfie.

This is perhaps what offends me the most about this whole trend. I’m an avid selfie taker and I wear make up perhaps once a month. Last weekend I posted about five make-up free selfies in the course of a football match. Is this horrendous to you? Am I brave? No, I am not.

Because I am afraid of cancer.

And this is why the selfie craze is brave, just not quite in the way people might think. It’s not brave because you contravene some ridiculous preconception of beauty. It’s brave precisely because you frighten me. You remind me there is a terrifying disease out there. Stephanie Boyce is brave for reminding me that the disease is survivable.

The bravery is the same bravery that prompts people to stand on the street collecting money for cancer research. It’s an irritant. They know I don’t want to hear about it, they know I don’t want to confront the fear today, but still they ask for money.

When we talk about a ‘cancer awareness’ campaign it may seem like we’re implying there are people out there who are somehow unaware of our plague. Nobody is unaware of cancer. But there is still a desire to sweep the disease under the rug. It is so big, so complex and so terrifying that it’s easier to think that if all scientists simply put their heads together we’d have it kicked in a week.

It bothers me that your make-up less face is worthy of comment. It bothers me that we picked this method of getting people talking. But it bothers me that cancer is still so prevalent.

What’s the bigger evil? The disease, or being reminded that it exists?