when it comes to research, my belief system is built on three cornerstones:
- (truly) ‘objective’ research does not exist – we can only aim to get as close to it as possible by limiting our human fault lines.
- everybody is biased – lucky for me, the swedish academic tradition taught me that this is okay, as long as you wear your biases transparently for the outside world to see and don’t pretend to be objective, because: see point number one.
- the first step to avoiding bias is admitting that you are biased – as with so many human problems, the first step is admitting that you have one. the good news: after a while, one realises that ‘bias’ is about the least harmful ‘problem’ one can have – if (and this is central!) one is aware of it. only by being aware of our biases and up front about them can we strive towards scientifically valuable research.
so, naturally, ‘bias’ is a common word in my vocabulary (one which regularly drives the men of my family up the walls). i cannot help but think about it all the time. and obviously, i am doing my best to avoid my own biases. yet, since i am working on a topic far removed from my own cultural background (i am a woman working in a ‘men’s world’; i am young, working mostly with elderly men; and i come from central europe, while i study the middle east), it is not surprising that i constantly run headfirst into all different sorts of biases and difficulties in understanding.
at the moment, i am particularly struggling with the need to remain as ‘neutral’ as possible in the methodological questions and choices for my upcoming fieldwork.
methodology – a drain to most undergraduate students and a pain even to more experienced researchers – is a subject often perceived as very dry. for me (call me a nerd, that’s fine), methodology is in fact one of the most exciting parts of research. well, maybe not ‘exciting’ – but it is the part that makes me feel most like a legitimate academic 🙂 methodology, after all, is the art of finding the right tool to dig for the answers you are craving. so, here’s tip one:
tip one: methodology is an art – celebrate it!
the more you think about methodology as a mere ‘box’ you need to tick when writing your paper, or as a dry spell in your writing (let’s be fair, writing the methodology section has more to do with being mechanical and precise than with writing anything remotely thrilling), the more you will be tempted to rush through your methodological considerations. this is where your first fault (and bias) lies.
instead, when you confront methodology, think of yourself as an artist, a designer, an inventor! imagine you are one of the first men (or women) to discover something like a nail. how exciting to now get creative and develop a tool that could help you put the nail into the wall. sure, you can drive the nail into the wall with a brick, with a stone, or even with a screwdriver. but the brick might crumble and cover the floor with dirt, with the stone you end up hurting your fingers, and the screwdriver, while successfully putting the nail into the wall, has missed its original purpose: it was made to drive screws, not to be banged against nails. eventually – if you just put enough thought and maybe even some trial and error into it – you will end up developing ‘the hammer’.
you will see: by challenging yourself to develop your own methodological ‘hammer’ for your research, you will start going beyond the dry basics you learn in class – and this is where methodology becomes actual fun.
tip two: you need more than one tool
maybe it was my teacher, or maybe it was my own fault for not listening or not understanding – either way, it took me until master’s level (trust me, i was as shocked as you are) to realise that in research, you might speak of ‘one method’, but you actually look for two tools:
- one tool for data gathering
- and one tool for data analysis
with both of these tools, you have to ask yourself whether a quantitative or a qualitative approach is more useful to your research. the answer to this question is provided by your research question – what is it that you are looking for? what is it that you want to know? is what you want to know about depth of meaning (qualitative research), about comparing variables (like level of education and chances of finding employment), or about cause and effect (both quantitative research)?
if you find that your answer to the quantitative–qualitative question for your two methodological tools (asking yourself what exactly you want to do, or are able to do) differs significantly from what your research question suggests, you might want to rethink your research question.
but be aware that you can also mix methods. if, for example, you are interested in a quantitative analysis, you can still gather data with a qualitative tool, should that be easier or more applicable to the context you are researching. you just need to be careful to make sure that your qualitative gathering process adheres to the standards required for quantitative analysis. but there is nothing stopping you – let’s stay with the metaphor – from developing ‘your hammer’ just as you need it.
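for the programmatically inclined, the question-to-approach logic above can be sketched as a tiny lookup. this is purely my own illustration (real projects are rarely this clean-cut, and mixed methods muddy the mapping further) – the three question types are the ones named in the text.

```python
# a minimal sketch (my own illustration, not a formal taxonomy): mapping
# what a research question asks for to the approach that usually fits it.
QUESTION_TYPES = {
    "depth of meaning": "qualitative",
    "comparing variables": "quantitative",
    "cause and effect": "quantitative",
}

def suggest_approach(question_type: str) -> str:
    """suggest a starting approach for each of your two tools
    (data gathering and data analysis)."""
    if question_type not in QUESTION_TYPES:
        raise ValueError(f"unknown question type: {question_type!r}")
    return QUESTION_TYPES[question_type]

print(suggest_approach("depth of meaning"))  # qualitative
```

the point of the sketch is only that the research question, not personal preference, drives the choice – and that you answer it twice, once per tool.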
tip three: spend time on developing your indicators
last but not least, i want to share an insight i had today: many people tell you that data analysis and writing up take a lot of time. that is true! what they don’t tell you very often, though, is that the step from writing your first methodological approach (in order to apply for fieldwork, for example) to the actual design of that approach takes a lot of time too!
it is one thing to know what you want to do and which tools you want to use. it is a completely different ball game to figure out how exactly you are going to do it. think of this (metaphorical) example: your chosen tool is not a hammer, but a bicycle. you read tons about how to ride a bike and how it functions. you know all about the quirks of human balance and the possible dangers of sidewalks. but, as anyone who has ever learned to ride a bike knows, practice is very different from theory. and it has much less to do with ‘facts and figures’ than with ‘feeling’ the right balance.
to translate the metaphor into an academic example: let’s say you know that you are working with a qualitative approach in both data gathering and data analysis. you chose semi-structured interviews as your data gathering tool and thematic analysis as your data analysis tool. so, which questions are you going to ask? and in which order? you want to keep it even less structured? well, what are the topics you want to address? you might know your main issue, theme or concept. but how do you ask about it?

if you are interested in whether someone is republican or democrat, you can just ask this question straight out, sure. but what does this tell you? not much. so what if you want to know how liberal a person is – what do you do then? you need to identify variables (or ‘indicators’) that can help you figure out how liberal your interviewee is. but how do you make sure that you have the right indicators? how do you make sure that you did not forget some relevant elements? and (this is the question i currently struggle with the most) how do you avoid your own biases in developing these indicators? because, de facto, everything you define a priori has more to do with you – you as an individual, you as a researcher and you as a social being – than with the interviewee’s reality.

maybe there are ‘generally acknowledged definitions’ of what constitutes ‘being liberal’ and you can start from there. maybe you start with the dictionary. but what if you are confronted with a different social context (imagine one in which the categories ‘liberal’ and ‘conservative’ are not present, or not perceived as relevant?) or with a different language (one in which the direct translation has a different meaning than your original concept of interest)? all of these questions need to be considered before actually gathering data.
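to make the indicator problem concrete, here is a hypothetical sketch (every concept, indicator and prompt below is invented by me for illustration – this is not a validated instrument): a small indicator ‘codebook’ in which each a-priori indicator records the assumption it builds in, so the researcher’s bias at least stays visible instead of hidden inside the instrument.

```python
# a hypothetical codebook sketch – all indicators and prompts are invented
# for illustration. the point is the structure: every a-priori indicator
# carries an explicit note on the assumption it encodes, keeping the
# researcher's bias transparent rather than buried in the interview guide.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str                 # what this indicator is meant to capture
    interview_prompt: str     # how you would ask about it
    built_in_assumption: str  # the bias this indicator encodes a priori

codebook = {
    "liberal": [
        Indicator(
            name="role of the state",
            interview_prompt="who should decide how people live their lives?",
            built_in_assumption=(
                "presumes a state/individual divide that may not be "
                "meaningful in the interviewee's social context"
            ),
        ),
        Indicator(
            name="attitude to social change",
            interview_prompt="should things stay as they are, or change?",
            built_in_assumption=(
                "presumes 'change' reads as progressive, which a direct "
                "translation may not preserve"
            ),
        ),
    ],
}

def list_assumptions(concept: str) -> list[str]:
    """surface the biases built into a concept's indicators, up front."""
    return [ind.built_in_assumption for ind in codebook.get(concept, [])]

for note in list_assumptions("liberal"):
    print("-", note)
```

writing the assumption column is, of course, the hard part – but forcing yourself to fill it in for every indicator before fieldwork is one practical way of wearing your biases transparently, in the spirit of the three cornerstones above.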
so, the tip is simple: plan for time to get into the nitty-gritty of your methodological tools – otherwise you will end up with built-in biases that even extensive reflection afterwards can no longer rectify.