Category Archives: Uncategorized

DTQR special issue available!

The special issue of Qualitative Inquiry based on papers from the 2015 ICQI conference is now available. Here is the abstract of our introduction, Digital Tools for Qualitative Research: Disruptions and Entanglements:

In this introduction to the special issue on digital tools for qualitative research, we focus on the intersection of new technologies and methods of inquiry, particularly as this pertains to educating the next generation of scholars. Selected papers from the 2015 International Congress of Qualitative Inquiry special strand on digital tools for qualitative research are brought together here to explore, among other things, blogging as a tool for meaning-making, social media as a data source, data analysis software for engaging in postmodern pastiche and for supporting complex teams, cell phone application design to optimize data collection, and lessons from interactive digital art that pertain to the use of digital tools in qualitative research. This collection disrupts common conceptions (and persistent misconceptions) about the relationship between digital tools and qualitative research and illustrates the entanglements that occur whenever humans intersect with the nonhuman, the human-made, or other humans.

Thank you to all of the authors for their hard work on these papers! They include Jessica MacLaren, Lorena Georgiadou, Jan Bradford, Caitlin Byrne, Amana Marie LeBlanc, Jaewoo Do, Lisa Yamagata-Lynch, Karla Eisen, Cynthia Robins, Judy Davidson, Shanna Rose Thompson, Andrew Harris and Kristi Jackson.

Report on ICQI 2017

Many thanks to everyone who helped make ICQI 2017 a success! We had another full two days of digital tools sessions, along with several pre-conference workshops. Stay tuned for a full report from Kristi Jackson, SIG chair, and in the meantime here are some photos from the event.

*We are especially happy to welcome the following folks as part of the organizing team for ICQI 2018: Caitlin Byrne, David Woods, Daniel Turner, Liz Cooper, Christian Schmeider, Leslie Porreau, and Maureen O’Neill. Thank you all!

**Proposals are generally due at the end of November for the following May conference – join us next year, and don’t forget to tag your submission as part of the Digital Tools for Qualitative Research special interest group!


Another full slate of digital tools presentations this year!


Christiane Page from the Qualitative Data Repository presents on the use of QDAS for archiving data.


Kristi passes on the SIG leadership to Caitlin!


Anne Kuckartz from MAXQDA speculates on the integration of quant and qual data


Gerben Moerman shares his software for engaging in collaborative research


Several sessions this year focused on negotiating social media as qualitative researchers


One featured event was a Wiki Hack of the qualitative research wikipedia pages.


Building the DTQR community


Kristi proposes guidelines for reporting the use of QDAS in research reports.

Wikipedia hack!

One of our most exciting events coming up at ICQI is the Digital Tools Wikihack. We hope to see you there!


ICQI Digital Tools Wikihack

Join us in updating the Wikipedia entries related to qualitative research: A Hands-on Experience. You are invited to learn more about Wikipedia and how you, as a qualitative researcher, could participate in the generation, editing, and critique of the information available to the larger world about qualitative inquiry.

Event information

  • Date: Friday, May 19, 2017
  • Time: 11am-12:20pm
  • Location: University of Illinois at Urbana–Champaign, Architecture Building, Room 205 (2nd floor), 608 E. Lorado Taft Dr., Champaign, IL 61820 / More accessibility information about the building can be found here
  • Cost: Free, but please register (below)
  • Host: Ricker Library
  • Please bring a laptop and power cord with you

What to expect

We will provide tutorials for Wikipedia newcomers and an overview of the resource. Bring your laptop, power cord, and ideas for entries that need updating or creation. Stop by for a little bit or stay for the whole afternoon. No Wikipedia editing experience necessary!

If you are unfamiliar with Wikipedia, you might try this training module, which will help explain a lot of things, including how to add your signature (by the way, signatures are created by saving four tildes in a row: ~~~~).

See you at ICQI!

The final program is out, and you can find all of our Digital Tools SIG sessions!


Here are some important SIG events – hope to see you there!

Thursday, May 18

  • 7:30pm: Digital Tools for Qualitative Research social!
    • Joe’s Brewery, 706 5th Street (a 10-minute walk from the Illini Union)
    • The first drink is on us (while supplies last!), thanks to our generous sponsors: Atlas.ti, Dedoose, MAXQDA (Verbi), NVivo (QSR), QDAMiner (Provalis), Qualitative Data Repository, Queri, Quirkos, and Transana.

Friday, May 19

  • 11am – 12:20pm: Digital Tools for Qualitative Research Plenary
    • Join us in updating the Wikipedia entries related to qualitative research: A Hands-on Experience.  You are invited to learn more about Wikipedia and how you, as a qualitative researcher, could participate in the generation, editing, and critique of the information available to the larger world about qualitative inquiry.
      • 205 Architecture (a room just across the hall from the Ricker Library of Architecture and Art).
      • We strongly urge all who are coming to the Wikihack to bring their own computers, if possible.

Saturday, May 20

  • 11am -12:20pm: Digital Tools for Qualitative Research SIG meeting: We will elect officers, review our budget, collect feedback and plan for next year.
    • There will be time in the agenda for brief announcements and a table for the distribution of materials for attendees (feel free to bring announcements and/or handouts)

Call for paper proposals: IJSRM on Digital Methods

We were quite excited to see this call for papers:

We are seeking to propose a special issue for the International Journal of Social Research Methodology, an interdisciplinary journal dedicated to exploring methodological developments in international contexts. The proposed special issue, Digital qualitative methods in social research, seeks to explore how qualitative research is being undertaken in, on and with the digital.

Whilst there has been a growing awareness and literature of the ethical parameters around doing research online (cf. Eysenbach & Till, 2001; Rodham & Gavin, 2006) and of the means for conducting particular types of online work, such as ‘netnography’ (Kozinets, 2002; Langer & Beckman, 2005) or online surveys (Sue & Ritter, 2011; Evans & Mathur, 2005), qualitative digital methods as a whole remains an area which is expanding, growing and evolving alongside the technologies and platforms with which it engages. Those engaging in digital work are often navigating new paths, utilising reflexive practice to understand the intersection of qualitative work and digital settings, and it is these experiences and practices that this proposed special issue seeks to explore.

We are then seeking submissions for this proposed special issue, which if successful would be guest edited by Dr Chris Till and Dr Esmée Hanna (Leeds Beckett University). We are seeking abstracts of no more than 200 words outlining the premise of the paper and key arguments, including the main qualitative digital approach or perspective the paper will engage with. Papers will be selected based on originality, contribution to understanding of qualitative digital methods and international relevance. Papers from all career stages are welcome, and the process will be as supportive as possible to facilitate the involvement of PhD students and ECR. Abstracts will form part of the proposal to be sent to the journal editors and will be published if the special issue is selected.

Topics of papers may include, but are not limited to:

  • Theoretical engagements with ontological and/or epistemological bases of particular digital methods/methodologies
  • Critical and evaluative syntheses of existing qualitative digital methodological approaches
  • A demonstration of, and critical engagement with, an innovative digital method
  • Reflections on ethical issues in the use of particular digital methods

Please send your abstract and contact details to both and by the 31st May 2017. If you have any questions please contact us at the email addresses above.


To heck with Heidegger…

To Heck with Heidegger? Using QDAS for Phenomenology

Heidegger warned that the use of technology is dehumanizing, and prominent phenomenological methodologists like van Manen (2014) have claimed that QDAS packages “are not the ways of doing phenomenology” (p. 319). Those involved in the development of QDAS, on the other hand, sometimes regard hesitance to use digital tools in qualitative research as due to misconceptions or lack of experience. In this post I stake out a middle ground with advice for phenomenologists wary of technology but eager to take advantage of its strengths. This post is based on a recent article in Forum: Qualitative Social Research (Sohn, 2017).


By Brian Kelleher Sohn

Brian is a recent graduate of the Learning Environments and Educational Studies Ph.D. program at The University of Tennessee, Knoxville. He is a phenomenologist in the field of education. He is currently working on a book about phenomenological pedagogy in higher education.

Max van Manen (2014), a primary source for many human sciences phenomenologists, claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. He claims that using “special software” may facilitate thematic analysis in such genres as grounded theory or ethnography, “but these are not the ways of doing phenomenology” (p. 319). He goes on to say that coding, abstracting, and generalization cannot produce “phenomenological insights” (p. 319). The appropriate tools, however, are absent from his and other phenomenological methodologists’ instructions. Must we use pencil and paper? Record interviews in wax? Write our findings with a typewriter?

Phenomenologists’ concerns about technology and its effects on the researcher are well articulated (if extreme) in Goble, Austin, Larsen, Kreitzer, and Brintnell (2012). Framing their concerns with McLuhan’s (2003) “medium is the message” (p. 23) and Heidegger’s (2008) views on technology as dehumanizing, they argue that “through our use of technology we become functions of it” (§1). If I have a hammer, I am likely to become one-that-hammers, in the hyphen-prolific wording of students of Heidegger. The right tool for the job, as we say, makes it easy. But ease, in some sense, is the problem for Goble et al. We all know there’s a difference between writing a letter and writing an email: so what might be lost when we use QDAS instead of (more) manual methods?

“Nothing, you luddites!” QDAS users and developers might respond. Every now and then I notice unbridled enthusiasm in the pro-QDAS camp almost as uncritical as what van Manen and Goble et al. write about technology. Davidson and di Gregorio (2011) argued that the processes of qualitative research, no matter the genre, involve disaggregation and recontextualization of data. Any difference between the genres is attributed to “residue” (p. 639) from battles for legitimacy rather than from genuine concerns or criticisms. Time to get with the times!

I wasn’t sure about getting with the times, but I knew I needed a structure to support my phenomenological dissertation study (Sohn, 2016). Below I’ll share some of the strategies I used to reconcile my concerns with technology and my desire to finish my research and graduate.

As a well-mentored phenomenologist, I had participated tangentially in dozens of research studies (e.g., Bower, Lewis, Wright, & Kavanagh, 2016; Franklin, 2013; Dellard, 2013; Worley & Hall, 2012) as part of an interdisciplinary phenomenology research group. In this group we developed research questions, conducted and analyzed bracketing interviews, read and analyzed interview transcripts, and critiqued each other’s preliminary findings. For the most part, the primary investigator in these studies did not use QDAS. It is through this group that I developed an appreciation for face-to-face interaction to conduct phenomenological analysis. I worked with this group for my dissertation, but unlike many of my research group colleagues, I also used the QDAS program MAXQDA. In the space I have here, I’ll talk about three major areas of concern for phenomenologists and what I did to address them.


Seeing words on a screen is not the same as hearing them in an interview or other form of data, like an audio recording of a class session. Goble et al. (2012) felt their study participants were turned into zeros and ones when their transcripts were uploaded to QDAS, a Matrix-type nightmare for those who wish to maintain the embodied and cognitive aspects of their work. My solution to this problem was to go back to the audio recordings. I had to listen to them multiple times in auditing transcripts and to identify the passages relevant to my second-order analysis of the data. This immersion helped me see the participants’ data as living and breathing (sometimes kind of heavily). MAXQDA and other packages allow synchronization of the audio recordings with the transcript so that you can easily listen while reading and analyzing.

Becoming a Tool

Dehumanization can run two directions—towards the participants and back at the researcher. Goble et al. (2012) worry that research studies, rather than opportunities for wonder and discovery, become problems to be solved with QDAS. Instead of artists, we can become functionaries of a capitalist-driven university system that demands results (and publications). This problem is much bigger than QDAS, but when using QDAS, one may feel the pressures of efficiency interfere with quality analysis and writing. For me, when I went to write my results chapter, I thought, “Oh, boy! All the hard work of coding and conceptualizing and memoing and logbooking in MAXQDA will pay off now!” So I started copying and pasting my chapter into existence. I soon found myself with writer’s block, uninspired despite looming deadlines. After realizing I was on autopilot, I returned to some motivational readings from Merleau-Ponty and was able to get back on track. He describes the goal of phenomenological writing as “establishing [the phenomenon] in the writer or the reader as a new sense organ, opening a new field or a new dimension to our experience” (Merleau-Ponty, 1968 [1964], p. 182). I had to return to MAXQDA as a support for my writing, rather than its driver.


Maintaining wonder—challenging what is known through a careful examination of bias and positionality—is a key component of the phenomenological attitude. QDAS programs facilitate the ability to find literal similarities across documents, and the speed and efficiency may lead to superficial connections between study participants. A faint echo can be magnified when the coded segments pile up. I had to be diligent in my bracketing efforts, both inside and outside MAXQDA. Every four to six weeks I took printed transcripts to the research group for discussion, insight, confirmation, and contestation. In these sessions other members had the opportunity to help me examine what I thought I knew about my study at a granular and broad level. These times I spent outside of MAXQDA were also opportunities for distance from my data. We need immersion, we need to dwell in the words of participants, but without stepping back, it is difficult to successfully engage in the abductive thinking required for high-quality phenomenological insights.


As Schuhmann (2011) says, the QDAS user interface “adds a layer of interpretation to qualitative analysis as one has to know how to ‘read’ a software package” (§2). This additional layer, the interface between user and QDAS platform, is where the following recommendations may best serve researchers (and more recommendations can be found in Sohn, 2017). In my case, I used MAXQDA to code and immerse myself in my study without feeling the participants had been atomized into cyborgs. I used memo and logbook features without turning my thoughts into restrictive categories. When I did feel as if an over-reliance on MAXQDA was hurting my study, I returned to the research group and reread phenomenological writings to reignite my motivation for producing the report. In order to maintain the phenomenological attitude while using QDAS, I recommend the following: keep your feet inside and outside the study and be diligent and exhaustive in bracketing.



Bower, K., Burnette, T., Lewis, D., Wright, C., & Kavanagh, K. (2016). “I had one job and that was to make milk”: Mothers’ experiences expressing milk for their very-low-birth-weight infants. Journal of Human Lactation, 33(1), 188–194. DOI: 10.1177/0890334416679382

Davidson, J. & di Gregorio, S. (2011). Qualitative research and technology: In the midst of a revolution. In N. Denzin & Y. Lincoln (Eds.), Handbook of qualitative research (4th ed., pp. 627-643). London: Sage.

Dellard, T. J. (2013). Pre-service teachers’ perceptions and experiences of family engagement: A phenomenological investigation. (Doctoral dissertation, University of Tennessee, Knoxville). Retrieved from

Franklin, K. A. (2013) Conversations with a phenomenologist: A phenomenologically oriented case study of instructional planning (Doctoral dissertation, University of Tennessee, Knoxville). Retrieved from

Goble, E.; Austin, W.; Larsen, D.; Kreitzer, L. & Brintnell, S. (2012). Habits of mind and the split-mind effect: When computer-assisted qualitative data analysis software is used in phenomenological research. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 13(2), Art. 2,

Heidegger, M. (2008 [1977]). The question concerning technology. In D. Farrell (Ed.), Basic writings (pp. 311-341). New York: Harper Perennial.

McLuhan, M. (2003 [1964]). Understanding media: The extensions of man. Toronto, ON: McGraw Hill.

Merleau-Ponty, M. (1968 [1964]). The visible and the invisible (ed. by C. Lefort, transl. by A. Lingis). Evanston, IL: Northwestern University Press.

Schuhmann, C. (2011). Comment: Computer technology and qualitative research: A rendezvous between exactness and ambiguity. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 12(1),

Sohn, B. (2016). The student experience of other students. Doctoral dissertation, University of Tennessee, Knoxville, USA,

Sohn, B. (2017). Phenomenology and Qualitative Data Analysis Software (QDAS): A Careful Reconciliation. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 18(1), Art. 14,

van Manen, Max (2014). Phenomenology of practice: Meaning-giving methods in phenomenological research and writing. Walnut Creek, CA: Left Coast Press.

Worley, J., & Hall, J. M. (2012). Doctor shopping: A concept analysis. Research and Theory for Nursing Practice, 26(4), 262-278.


Create your publication strategy course

You spent a lot of time researching and writing a thesis, dissertation, or capstone project. How can you convert your academic work into a career asset?

Join Dr. Janet Salmons (author of Doing Qualitative Research Online and Qualitative Online Interviews) and Dr. Helen Kara for their Create Your Publication Strategy course Feb 13-March 31.

Select “Digital Tools user” and get a $25 discount. (We are using the honor system– there is no code to enter.)

Here is a little background information:

  • Create Your Publication Strategy is not a writing course, it’s a “what do I do with my writing?” course. We now have many options for publishing our work: from blogs to peer-reviewed articles, book chapters, books with a publisher, self-published e-books, case studies, or manuals. Which mix of options aligns with academic or professional career goals? The course is designed to help participants reflect on their goals and set priorities, and then to create a plan that maps out types and timing for publications that will help them move forward.

  • This is a course for people who have completed (or almost completed) doctoral degrees. Participants have written dissertations and/or theses as well as other significant academic and/or professional papers, reports, or articles. Through exercises, discussion, and instructor feedback they will carefully review these pieces of writing to assess their publication potential and to determine steps needed to update existing work or to develop new work based on their findings.
  • This is not a MOOC. While the course is flexible and self-paced, participants can share ideas with an intimate learning community of other academic writers, and receive feedback from Dr. Helen Kara and Dr. Janet Salmons. Two live online meetings will be held during the course.
  • The community will continue beyond the 6-week course through an email list and a community of practice. Helen and Janet will share resources and occasional webinars with course alumni.

We are holding a tweetchat next week to discuss publication strategies for academic writers. You can also find more about publication strategy in this SAGE Methodspace post: Research > Publication > Impact (You Might Need a Strategy for That).

Using spreadsheets as a qualitative analysis tool

Don’t have access to qualitative data analysis software? This week’s blog post provides some tips on how to prepare your transcripts to work within Microsoft Excel or Google Sheets.


By Dustin De Felice and Valerie Janesick

Are you in the middle of a research project? Or are you just starting and thinking about the possible tools you may need during analysis? Given the variety of choices out there, we discuss one choice not often thought of in qualitative research: spreadsheets. While there are a number of spreadsheet programs or apps available, we focus on the steps needed to prepare your files for two of them: MS Excel and Google Sheets. Either of these spreadsheets is a good option, though each one has its own advantages (view our handout for a visual overview). In De Felice and Janesick (2015), we outline steps to take with Excel. For this discussion, we focus on preparing your transcripts so that they are ready to be imported into a spreadsheet. We recommend starting with the technique described by Meyer and Avery (2009), who provide an effective and simple technique for importing transcription files into Excel (version 2003). They recommend using Excel because it handles large amounts of data (numeric and text), it provides the researcher with ways of adding multiple attributes during the analysis process, and it allows for a variety of display techniques that include various ways of filtering, organizing and sorting.

In De Felice and Janesick (2015), we recommend using Excel because the format mirrors many phenomenological methods (Giorgi, 2009; Moustakas, 1994). In addition, spreadsheet programs are powerful, yet not as overwhelming as other data analysis tools. In De Felice and Janesick (forthcoming), we extend our recommendation beyond just the use of Excel because there are a number of features available when utilizing Google Sheets. One benefit of Sheets is that it is widely available and most features are free. It allows for collaborative opportunities (including same time/same sheet editing). This feature is available within Excel, but there are a number of barriers that can limit this functionality (e.g. cost, different versions, etc.). If you are in the process of choosing between Excel and Sheets, we encourage you to read this blog post. In general, we recommend either spreadsheet for most projects, though there are some compelling reasons to use one over the other.

In terms of design, we recommend asking yourself this question before transcribing: “What unit do you have in mind when you think about your analysis?” (Meyer & Avery, 2009). The answer to this question can help you decide how to best use either program. For example, will you use the turns in a conversation as a unit of measure, or will you use the phrase or sentence level as a unit? Of course, there are other ways to analyze the data, so we recommend starting with Meyer and Avery and their discussion on the “codable unit” (2009, pp. 95-96). Before describing some of our steps, we would like to note that these steps are most effective for a researcher conducting and transcribing interviews (from audio).

Once you have identified that codable unit, you need to establish some conventions within your transcriptions to ensure your work will import. In Excel and Sheets, you can use these as separator characters: Tabs, Commas or Custom (essentially any single symbol including a hard tab as shown in figure 1). These separator characters will dictate how your text will fit into the cells and across the rows and down the columns. While there are ways of changing these separator characters, it is best to establish a convention within your transcriptions and stick with it throughout your research project.
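To make the separator convention concrete, here is a minimal sketch (the transcript text and function name are hypothetical, not from De Felice and Janesick) that splits each transcript line on the first colon so that every turn in the conversation becomes one row, with a speaker column and an utterance column, ready for Excel or Google Sheets to import as CSV:

```python
import csv
import io

def transcript_to_rows(text, sep=":"):
    """Split each transcript line on the first separator character,
    yielding one (speaker, utterance) row per conversational turn."""
    rows = []
    for line in text.splitlines():
        if sep in line:
            speaker, utterance = line.split(sep, 1)
            rows.append((speaker.strip(), utterance.strip()))
    return rows

# A toy transcript using the turn in conversation as the codable unit
transcript = """Interviewer: How did you prepare your data?
Participant: I typed everything into a plain text file."""

rows = transcript_to_rows(transcript)

# Write the rows out as CSV, which both Excel and Google Sheets import directly
buf = io.StringIO()
csv.writer(buf).writerows(rows)
print(buf.getvalue())
```

Note that splitting only on the first occurrence of the separator (as above) keeps any later colons inside the utterance intact in a single cell, which is one reason to pick a separator convention early and stick with it.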

Figure 1: Screenshot of the options available for importing a file into Google Sheets.


This import step is necessary because spreadsheets are not the easiest tools to use when transcribing. Instead we recommend using a word processor to create your transcripts in a text (.txt) format. In figure 2, we provide a screenshot from within a word processor. In the background of the screenshot, we have an example transcription that used a turn in conversation as the codable unit and the colon as a separator character.

Figure 2: Screenshot of Google docs file that we downloaded as plain text (.txt).


We have used different word processors, and our preference for working with transcriptions is MS Word. We provide detailed steps in De Felice and Janesick (2015) on p. 1585. Our main reason for this choice is that there is no way to show formatting marks in Google Docs (see Figure 3 for a list of the formatting marks available in MS Word; these are not visible in Google Docs). These formatting marks are essential in properly preparing the transcription for importation. There is hope for an eventual add-on to correct this oversight in Google Docs, but there is currently no workaround.

Figure 3: Screenshot of formatting marks within MS Word.


Once your text file is ready for import, you can use the spreadsheet as an analysis tool. For phenomenological studies, we outline a few ways of using Excel in this capacity in De Felice and Janesick (2015). However, we recommend throughout your research project that you keep in mind this fantastic advice from Meyer and Avery: “All research projects (and researchers) are not the same. What works for one project may not be best for another” (2009, p. 92). We offer the same advice for our suggestions here.
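As a rough illustration of why the filtering and sorting capacities mentioned above matter for analysis, the sketch below (hypothetical data and names, not from the cited articles) mimics what a spreadsheet column filter does once each row carries a codable unit plus an analyst-added code attribute:

```python
# Each dict stands in for one spreadsheet row: the codable unit plus attributes
rows = [
    {"speaker": "Participant", "text": "I felt lost at first.", "code": "disorientation"},
    {"speaker": "Participant", "text": "The group discussions helped.", "code": "support"},
    {"speaker": "Participant", "text": "I had no idea where to start.", "code": "disorientation"},
]

def filter_by_code(rows, code):
    """A spreadsheet column filter in miniature: keep rows matching one code."""
    return [r for r in rows if r["code"] == code]

# Gathering all segments coded "disorientation" across the transcript
for row in filter_by_code(rows, "disorientation"):
    print(row["text"])
```

In a real spreadsheet this is simply a filter on the code column, but the principle is the same: once turns and attributes live in rows and columns, regrouping the data by code takes seconds.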



De Felice, D., & Janesick, V. J. (2015). Understanding the marriage of technology and phenomenological research: From design to analysis. The Qualitative Report, 20(10), 1576-1593. Retrieved from

Giorgi, A. (2009). The descriptive phenomenological method in psychology. A modified Husserlian approach. Pittsburgh, PA: Duquesne University Press.

Meyer, D. Z., & Avery, L. M. (2009). Excel as a qualitative data analysis tool. Field Methods, 21, 91-112. DOI: 10.1177/1525822X08323985

Moustakas, C. (1994). Phenomenological research methods. Thousand Oaks, CA: Sage.


Five-Level QDA

As we look ahead to ICQI 2017, we are sharing a few more reflections on this year’s conference. Here, Christina Silver and Nicholas Woolf describe their work on Five-Level QDA.


Christina Silver and Nicholas H. Woolf, Five-Level QDA℠

In the Digital Tools Stream at ICQI 2016 we presented two papers discussing our work to develop and implement the Five-Level QDA℠ Method, a CAQDAS pedagogy that transcends methodologies, software programs and teaching modes. The first, called “Operationalizing our responsibilities: equipping universities to embed CAQDAS into curricula”, was presented by Christina in the opening plenary session. The second, called “Five-level QDA: A pedagogy for improving analysis quality when using CAQDAS”, was presented jointly by Nicholas and Christina. Here we briefly summarize these two papers. You can find out more about the Five-Level QDA method and our current work by visiting our website.

Responsibilities for effective CAQDAS teaching in the digital age

There is an expanding range of digital tools to support the entire process of undertaking qualitative and mixed methods research, and current generations of students expect to use them (Paulus et al., 2014), whatever their disciplinary, methodological or analytic context. Although many researchers use general-purpose programs to accomplish some or all of their analysis, we focus on dedicated Computer Assisted Qualitative Data AnalysiS (CAQDAS) packages. CAQDAS packages are now widely used and research illustrates that uptake continues to increase (White et al. 2012; Gibbs, 2014; Woods et al. 2015). However, there’s little evidence that their use is widely embedded into university curricula. There may be several reasons for this (Gibbs, 2014), including the difficulty of attending to diverse learner needs, which are affected by learners’ methodological awareness, analytic adeptness and technological proficiency (Silver & Rivers, 2015).

These issues highlight the importance of developing effective ways of embedding CAQDAS teaching into university curricula. This long-standing issue has been debated for as long as these programs have been available, and it is widely agreed that the appropriate use of digital technologies must be taught in the context of methodology (Davidson & Jacobs, 2008; Johnston, 2006; Kuckartz, 2012; Richards, 2002; Richards & Richards, 1994; Silver & Rivers, 2015; Silver & Woolf, 2015). However, few published journal articles illustrate teaching of CAQDAS packages as part of integrated methods courses (recent exceptions are Paulus & Bennett, 2015; Bourque & Bourdon, 2016; and Leitch et al., 2016). Several reflective discussions regarding the integration of methodological, analytical and technological teaching are insightful and useful (e.g., Carvajal, 2002; Walsh, 2003; Davidson & Jacobs, 2008; di Gregorio & Davidson, 2008), as are concrete examples of course content, modes of delivery, course assignments and evaluation methods (e.g., Este et al., 1998; Blank, 2004; Kaczynski & Kelly, 2004; Davidson et al., 2008; Onwuegbuzie et al., 2012; Leitch et al., 2015; Paulus & Bennett, 2015; Bourque & Bourdon, 2016; Jackson, 2003). However, these writers provide varying degrees of detail about instructional design and are contextually specific, focusing on the use of a particular CAQDAS program, a disciplinary domain, and/or a particular analytic framework. Their transferability and pedagogical value may therefore be limited where there is an intention to use different methodologies, analytic techniques and software programs.

There are clearly challenges and lack of guidance in the literature for teaching qualitative methodology, analytic technique and technology concurrently. Although the challenges are real they need not be seen as barriers. A pedagogy that transcends methodologies, analytic techniques, software packages and teaching modes could prompt a step-change in the way qualitative research in the digital environment is taught (Silver & Woolf, 2015). The Five-Level QDA method  is designed as such a pedagogy with the explicit goal of addressing these challenges.

The Five-Level QDA Method: a CAQDAS pedagogy that spans methodologies, software packages and teaching modes

The Five-Level QDA method (Woolf, 2014; Silver & Woolf, 2015; Woolf & Silver, in press) is a pedagogy for learning to harness CAQDAS packages powerfully. The phrase “harness CAQDAS packages powerfully” isn’t just a fancy way of saying “use CAQDAS packages well”, but means using the chosen software from the start to the end of a project, while remaining true throughout to the iterative and emergent spirit of qualitative and mixed methods research. It isn’t a new or different way of undertaking analysis, but explicates the unconscious practices of expert CAQDAS users, developed from our experience of using, teaching, observing and researching these software programs over the past two decades. It involves a different way of harnessing computer software from a taken-for-granted or common sense approach of simply observing the features on a computer screen and looking for ways of using them.

The principles behind the Five-Level QDA Method

The core principle is the need to distinguish analytic strategies – what we plan to do – from software tactics – how we plan to do it. As uncontroversial as this sounds, strategies and tactics are commonly treated as synonyms or near-synonyms in everyday language. This leads, unconsciously, to the methods of a QDA and the use of the CAQDAS package’s features being considered together as a single process of what we plan to do and how we plan to do it. A consequence is that the features of the software drive the analytic process, to either a small or a large degree.

The next principle is recognizing the contradiction between the nature of QDA, which is to varying degrees iterative and emergent, and the cut-and-dried, predetermined-steps nature of computer software. When this contradiction is not consciously recognized, either the strategies are privileged, with the consequence that the software is not used to its full potential throughout a project, or the tactics are privileged, with the consequence that the iterative and emergent aspects of a QDA are suppressed to some degree. When the contradiction is consciously recognized, however, it becomes necessary to reconcile it in some way. One approach is a compromise, or trade-off, in which the analytic tasks of a project are raised to a more general level and expressed as a generic model of data analysis in order to more easily match the observed features of CAQDAS packages (terms in italics have a specific meaning in Five-Level QDA).

The Five-Level QDA method, following Luttwak’s (2001) five-level model of military strategy, takes a different approach: it reconciles the contradiction between strategies and tactics by placing it in a larger context in order to transcend it. Regardless of research design and methodology, there are two levels of strategy – the project’s methodology and objectives (Level 1), and the analytic plan (Level 2) that arises from those objectives. There are similarly two levels of tactics – the straightforward use of software tools (Level 4) and the sophisticated use of tools (Level 5). We use the term tools in a particular way: we are not referring to software features, but to ways of acting on software components – things in the software that can be acted upon. Whereas CAQDAS packages have hundreds of features, they have far fewer components, typically around 15-20.

The critical middle level between the strategies and tactics is the process of translation (Level 3). Rather than raising analytic tasks to the level of software features, each analytic task is lowered to the level of its units, which are then matched, or translated, to the components of the CAQDAS package. This ensures that the process always moves in a single direction – from analytic strategies to software tactics, and never the other way around – so that the analytic strategies drive the analytic process, not the available features of the software. Because translation operates at the level of individual analytic tasks, the method is relevant across methodologies and software programs. Figure 1 provides an overview of the Five-Level QDA method.

Figure 1. Five-Level QDA Chart

Two levels of strategy >>>>> translated to >>>>> two levels of tactics

Level 1 – Objectives: the purpose and context of a project, usually expressed as research questions and a methodology

Level 2 – Analytic plan: the conceptual framework and resulting analytic tasks

Level 3 – Translation: translating from analytic tasks to software tools, and translating the results back again

Level 4 – Selected tools: straightforward choice of individual software operations

Level 5 – Constructed tools: sophisticated use of software by combining operations or performing them in a custom way
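The translation step at Level 3 can be pictured as a matching exercise: breaking an analytic task into units and checking each unit against the small set of components a package offers. A minimal sketch in Python, in which every task-unit and component name is invented for illustration (none are taken from the Five-Level QDA materials or any particular CAQDAS package):

```python
# Hypothetical sketch of Level 3 "translation": matching the units of an
# analytic task to the components of a CAQDAS package. All names below
# are invented for illustration only.

# Packages have hundreds of features but far fewer components (~15-20).
COMPONENTS = {"code", "memo", "document group", "query", "link"}

def translate(task_units):
    """For each unit of an analytic task, report whether a tool can be
    selected directly (the unit maps to a single component) or must be
    constructed (components combined or used in a custom way)."""
    plan = {}
    for unit, needed in task_units.items():
        if len(needed) == 1 and needed[0] in COMPONENTS:
            plan[unit] = f"selected tool: {needed[0]}"
        else:
            plan[unit] = "constructed tool: combine " + " + ".join(needed)
    return plan

# An invented analytic task: "compare how participants talk about trust",
# broken into units. Direction of action: strategy first, tactics second.
task = {
    "identify talk about trust": ["code"],
    "compare across participants": ["document group", "query"],
}
print(translate(task))
```

The point of the sketch is the single direction of action: the task units come first, and the software components are consulted only to serve them, never the reverse.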

Several people at our presentations gave us feedback that this chart implies the process is linear and hierarchical, which is of course misleading, because all forms of QDA are to varying degrees iterative and emergent. Since ICQI, therefore, we have created a diagram that we hope more accurately reflects the iterative, cyclical nature of the process (Figure 2).

Figure 2. Five-Level QDA diagram


Tools for teaching the Five-Level QDA Method

To illustrate translation in the context of learners’ own projects we have developed two displays: Analytic Overviews (AOs) and Analytic Planning Worksheets (APWs). AOs provide a framework for the development of a whole project, described at the strategy levels (Levels 1 and 2), whereas APWs scaffold the process of translation, facilitating the skill of matching the units of analytic tasks to software components in order to work out whether a software tool can be selected or needs to be constructed. We don’t have space here to illustrate translation in detail, but teaching the Five-Level QDA method via AOs and APWs is discussed further in Silver & Woolf (2015) and Woolf & Silver (in press).

Implementing and researching the Five-Level QDA Method

The Five-Level QDA method provides an adaptable method for teaching and learning CAQDAS. Since 2013 we’ve been using it in our own research and teaching in many different contexts, and we’ve used feedback from our students and peers to refine the AOs and APWs. We’re also working with several universities to improve the provision of CAQDAS teaching, using the Five-Level QDA method in different learning contexts.

We believe that the Five-Level QDA method is adaptable to a range of instructional designs, including different face-to-face workshop designs and remote learning via textbooks and complementary video tutorials (Woolf & Silver, in press). This is because it provides a framework through which qualitative and mixed methods research and analysis can be taught at the strategies level within the context of digital environments. Instructional designs that teach qualitative methodology and analytic techniques via the use of CAQDAS packages illustrate that qualitative methodology and technology need not be introduced to students separately (e.g., Davidson et al., 2008; Bourque & Bourdon, 2016; Leitch et al., 2016). The Five-Level QDA method presupposes that it is not possible to adequately teach technology without methodology, but we would also argue that in our increasingly digital environment it is less and less acceptable to teach methodology without technology. The intentionally separate emphasis given to analytic strategies and software tactics within a single framework enables methodology and technology to be taught concurrently within an instructional design that is adaptable to local contexts, as well as serving as a method for researchers to harness CAQDAS in their own projects.

We are currently evaluating our work and welcome opportunities to work with others to implement and further research the effectiveness of the Five-Level QDA method in different contexts.


References

Blank, G. (2004). Teaching Qualitative Data Analysis to Graduate Students. Social Science Computer Review, 22(2), 187–196.

Bourque, C. J., & Bourdon, S. (2016). Multidisciplinary graduate training in social research methodology and computer-assisted qualitative data analysis: a hands-on/hands-off course design. Journal of Further and Higher Education, 1–17.

Carvajal, D. (2002). The Artisan’s Tools. Critical Issues When Teaching and Learning CAQDAS. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 3(2), Art. 14.

Davidson, J. (n.d.). Thinking as a Teacher: Fully Integrating NVivo into a Qualitative Research Class.

Davidson, J., & Jacobs, C. (2008). The Implications of Qualitative Research Software for Doctoral Work. Qualitative Research Journal, 8(2), 72–80.

Davidson, J., Jacobs, C., Siccama, C., Donohoe, K., Hardy Gallagher, S., & Robertson, S. (2008). Teaching Qualitative Data Analysis Software (QDAS) in a Virtual Environment: Team Curriculum Development of an NVivo Training Workshop. In Fourth International Congress of Qualitative Inquiry (pp. 1–14).

Este, D., Sieppert, J., & Barsky, A. (1998). Teaching and Learning Qualitative Research With and Without Qualitative Data Analysis Software. Journal of Research on Computing in Education, 31(2), 17.

Gibbs, G. R. (2014). Count: Developing STEM skills in qualitative research methods teaching and learning.

Jackson, K. (2003). Blending technology and methodology. A shift towards creative instruction of qualitative methods with NVivo. Qualitative Research Journal, 3(Special Issue), 15.

Johnston, L. (2006). Software and Method: Reflections on Teaching and Using QSR NVivo in Doctoral Research. International Journal of Social Research Methodology, 9(5), 379–391.

Kaczynski, D., & Kelly, M. (2004). Curriculum Development for Teaching Qualitative Data Analysis Online. QualIT 2004: International Conference on Qualitative Research in IT and IT in Qualitative Research, 9.

Leitch, J., Oktay, J., & Meehan, B. (2015). A dual instructional model for computer-assisted qualitative data analysis software integrating faculty member and specialized instructor: Implementation, reflections, and recommendations. Qualitative Social Work.

Onwuegbuzie, A. J., Leech, N. L., Slate, J. R., Stark, M., Sharma, B., Frels, R., … Combs, J. P. (2012). An exemplar for teaching and learning qualitative research. The Qualitative Report, 17(1), 646–647.

Paulus, T. M., & Bennett, A. M. (2015). “I have a love–hate relationship with ATLAS.ti”: integrating qualitative data analysis software into a graduate research methods course. International Journal of Research & Method in Education, 1–17.

Silver, C., & Rivers, C. (2015). The CAQDAS Postgraduate Learning Model: an interplay between methodological awareness, analytic adeptness and technological proficiency. International Journal of Social Research Methodology, 1–17.

Silver, C., & Woolf, N. H. (2015). From guided-instruction to facilitation of learning: the development of Five-Level QDA as a CAQDAS pedagogy that explicates the practices of expert users. International Journal of Social Research Methodology, 1–17.

Walsh, M. (2003). Teaching Qualitative Analysis Using QSR NVivo. The Qualitative Report, 8(2), 251–256.

White, M. J., Judd, M. D., & Poliandri, S. (2012). Illumination with a Dim Bulb? What do social scientists learn by employing qualitative data analysis software (QDAS) in the service of multi-method designs? Sociological Methodology, 42(1), 43–76.

Woods, M., Paulus, T., Atkins, D. P., & Macklin, R. (2015). Advancing Qualitative Research Using Qualitative Data Analysis Software (QDAS)? Reviewing Potential Versus Practice in Published Studies using ATLAS.ti and NVivo, 1994–2013. Social Science Computer Review, 0894439315596311.

Woolf, N. H., & Silver, C. (in press). Qualitative analysis using ATLAS.ti: The Five-Level QDA Method. London: Routledge.

Woolf, N. H., & Silver, C. (in press). Qualitative analysis using MAXQDA: The Five-Level QDA Method. London: Routledge.

Woolf, N. H., & Silver, C. (in press). Qualitative analysis using NVivo: The Five-Level QDA Method. London: Routledge.