Today we went to try out our Everyday Peace Indicators. Previously we had crowd-sourced the indicators from the locality through focus groups with men, women and youth. We then took the indicators to a number of parishes in a locality to see how much they chimed with the population. The day was a great lesson in doing research for real.
The easy part was designing the research on paper. The very difficult part is trying to operationalise – in the real world – what we thought was a good research design.
Some of the problems we faced today were:
• Turning up to our randomly chosen ‘target villages’ in which all of the adults were absent for the day. In one case they were all attending a reburial service (for people killed in the war) and in another they were all at a ‘land wrangle’ meeting. Of course, we did not know this and so turned up and ‘no one was at home’.
• Turning up to a village, introducing yourself to the ‘head man’, which is the protocol that all of the research guidebooks recommend, and realizing that he was very, very drunk. What do you do?
• Relying on mobile phones for the collection of survey data and then experiencing the batteries running out.
• Working out who actually lives in a certain locality. Often people live in a collection of huts, and it is not clear which huts belong to the same family.
• Trying to interview an individual in a communal setting. During the day, most people seem to sit outside their huts in communal areas where there is shade. The interviewee’s answers then become a focus for debate among the extended family, which may impair their ability to be candid.
• Turning up to an interview with the survey questions in the WRONG local language. We thought we had this one covered, but then chose to interview a family who were IDPs/refugees displaced by the war. Our NGO partners could not speak their language well enough to complete the survey.
Of course all of these ‘problems’ have ‘solutions’ or measures we can take to overcome them. But these take time and are unanticipated. They forced us to think on our feet (no bad thing), and make all sorts of compromises that would send methodological purists into fits of anxiety. It really ate into the time we had. We had hoped to conduct about fifteen interviews today. We completed five.
Our research is taking place in the real world. Not in some laboratory, or using a dataset that someone else has collected. These problems are the ones faced by many NGOs, INGOs, international organisations and academics in their research and yet … and yet … we rarely hear about these problems. Why not? Is it because they only conduct research in perfect environments? Or is it because they tend to mask many of the practical difficulties that they face in order to give the impression that their research is robust and trustworthy? Maybe other research projects suffer similar problems, but I rarely seem to read about this. It makes me wonder.
Is this some sort of dirty little secret that a lot of researchers keep to themselves?
You can find out more about our project at everydaypeaceindicators.com and sign up for updates. We also have a Facebook page.
Amen, brother. This is not just for research but for every activity in the field. Your blog post sounded like business as usual. Methodological purists cannot function in a field setting where nothing goes according to plan. Kudos for your flexibility and ingenuity–and adapting to a reality that normal people face in their daily lives outside of developed countries. I envy your trip talking face to face with these people. Have fun and I can’t wait to hear the results.
Thanks Sharon. And it is good to know that this applies to the practitioner world too. Maybe we just have to stop pretending to be superman/woman. R
That’s not a dirty little secret. That’s a research diary :). All my research went like that when I was in the field, no matter where I was. My favourites were streets with no names, where there were four houses all numbered 61, all at different points on the street (the address I was looking for). This is the norm in most of the places I’ve worked; but the pressure to bring back ‘valid’ results that conform to standards taken from the natural sciences, or risk not getting published, or getting discredited for being honest, and thereby risking all chance of future funding, made me decide not to reveal the kind of challenges you’re describing. So like many others I have known, in NGOs and INGOs as well as academia, I didn’t report that I hadn’t got repeat interviews because respondents were drunk or hiding; I didn’t report that my (male) interpreter had started answering the questions I wanted female respondents to answer, because he was a man and therefore knew best. I didn’t report that I had had to use informal translation that undoubtedly left doubt about what a respondent had said and meant. I didn’t report that I legged it from an interview when the skinny guy pointed his pistol at my head. I didn’t report that my desire for accurate translation had put my other interpreter in harm’s way. At least I learned from that one.
The real world doesn’t figure in ‘social sciences’ research methods. Few ‘respected’ research methods books acknowledge how different the field is from the farm. If this was understood, accepted, owned, we would be left with two outcomes. Either accept imperfect knowledge garnered from imperfect places in imperfect conditions; or accept equally imperfect (in different, other ways) knowledge garnered from other, routine imperfect places in different, but equally imperfect conditions. These sanitary deciders of which research is OK are blind to, or ignore, the inevitable conclusion, which is that all social sciences research can only churn out imperfect research (because all research on social subjects can only ever be imperfect). It’s all differently imperfect but we have little better; and when we try to innovate, that innovation gets blocked because it’s new and untried. So we are left with either being honest and succumbing to a hegemonic notion of research methods’ validity, or we can conceal the difficulties that reveal the truer nature of the worlds we research. Our choice? Not as long as academia is run like a business.
Ha! A street with four houses numbered 61! More seriously, I worry about the messages we send to our PhD students. We give the impression that they must be superman/woman in their research and then set them up for a fall when things don’t go to plan.
I once was chastised for being honest about some of my ‘dirty research secrets’ at a conference panel…. I basically presented my concerns about feeling there was some bravado or exaggeration in my respondents’ answers about their overall role in a given process (expat peacebuilders), and I dared to suggest that there were perhaps some gender dynamics involved when I, a young female peace researcher, interviewed senior male military officials. I was trying to reflect on how or if this impacted my project, my data, my analysis: how does one know when one is being told the truth in an interview setting, how do power dynamics impact social research? I thought these were valid methodological questions to be asked. Wrong (apparently). I was told I was ‘being disrespectful to people who had given me their time’. I also once saw a graduate student being told to not waste their time with such reflexivity, just present the data. Nice. Being honest is not generally rewarded in my experience. So, it is nice to have more folks speaking up on this point; I wish it was not treated with such disdain in many academic circles. Thanks for the post!
Very interesting! I think one of the (many) flaws in academic culture is that we are supposed to think that we know everything (this is particularly a male thing as well). That immediately negates the whole point of research: that our starting point should be not knowing – an epistemology of not-knowing and wonder.
I answered a question at an event on Friday with ‘I don’t know’ – because I genuinely knew nothing about the question topic. There was a palpable sense that I was meant to give a long-winded answer even if I knew nothing about the topic.
How did we get ourselves into this position?
I’ve taught research methods…and also been subjected to very bad research methods courses….so what I attempt to do is provide a core of research practice knowledge which hopefully gives a bit of stability when research plans make contact with messy, frustrating, entertaining reality. Reflexivity is important – particularly when you have to examine how a pottering academic might affect the research environment – but reflexivity can be the hardest thing to get across in classes, mistakenly seen as navel gazing when actually it’s learning from practice.
The need to present ‘unproblematic’ research methods in papers/projects seems in part a function of simple human concern about highlighting one’s own shortcomings, perhaps understandable in a critical research environment; more worryingly, it might be a response to extra-sensitive ethics panels in the university and the need for grant-making bodies to winnow down to the necessary shortlist. Deal honestly with potential problems and you end up as a tick in the ‘no’ box.