Transparency, QDAS, and Complex Qualitative Research Teams

by Judith Davidson

Judith Davidson is an Associate Professor in the Research Methods and Program Evaluation Ph.D. Program of the Graduate School of Education, University of Massachusetts Lowell.  She is a founding member of the ICQI Digital Tools Special Interest Group.  She is currently working on a book about complex qualitative research teams, which she is writing in NVivo!

Qualitative researchers increasingly find themselves working as members of a complex research team. Multiple members, multiple disciplines, geographically dispersed—these are just some of the forms of diversity that we face in our research endeavors.  Many of these research teams employ Qualitative Data Analysis Software (QDAS).

While there are many reasons to use QDAS in complex team research, the one I wish to discuss here is its support for transparency: making clear how results were reached, or demonstrating the process of interpretation in a way that makes the conclusions believable. Transparency has long been held up as a virtuous and important notion in qualitative research, but as with many things in our field, most descriptions of it concern individually conducted research, not team-based research projects. Moreover, our considerations of transparency have not yet come to grips with qualitative research conducted with QDAS.

As part of the 2016 ICQI Digital Tools strand and a panel examining issues related to QDAS use with complex teams, I presented a paper titled “Qualitative Data Analysis Software Practices in Complex Qualitative Research Teams:  Troubling the Assumptions about Transparency (and Portability)” (Davidson, Thompson, and Harris, under review) that sought to get at some of the issues that arise at the nexus of complex teams, qualitative research, QDAS, and transparency.

Our paper applied Jackson’s notion of transparency-in-motion (Jackson, 2014) to the methodological process of a complex team project in which we had been engaged, Building a Prevention Framework to Address Teen “Sexting” Behaviors, or the ‘Sexting Project’ (Harris et al., 2013; Davidson, 2014). Jackson’s ideas were derived from a study of ‘lone ranger’ researchers: doctoral students using QDAS in their own, individual research work. In contrast, the goal of our paper was to demonstrate how Jackson’s descriptive categories of transparency-in-motion (triage, show, and reflect) are enacted by real teams working under the real-world restrictions that teams often face in trying to use QDAS. In the article, we follow the development of one finding from the Sexting Project, the continuum of sexting, to show how QDAS use wove in and out of the stages of triage, show, and reflect as this concept evolved for the research team (Davidson, 2011).

The Sexting Study was conducted by a multi-disciplinary team located at three institutions of higher education in three regions of the United States. Focus group data about views of teen sexting were collected from three separate audiences at these locations: teens, teen caregivers, and those who worked with and for teens. It was one of the first qualitative research studies conducted on the topic of sexting. All data collected for the study were organized in an NVivo database maintained by the lead site.

The following table gives a brief overview of the discussion from the paper.

Sub-categories of Jackson’s notion of transparency-in-motion (2014), with applications from the Sexting Project (Harris et al., 2013) illustrating the malleable relationship of transparency and QDAS:

  • Triage (emphasize, sort, classify): In NVivo, Davidson and Thompson coded focus group responses to “Why do youth sext?” and noticed differences with regard to relationship, peer group, and gender.
  • Show (share, illustrate, hold up): In full team meetings, Davidson and Thompson used NVivo to examine the responses to this question and to dig down into the differences noted.
  • Reflect (examine content, negotiate meaning): The full group reflected on the data and developed the notion of “the continuum of sexting.” Davidson and Thompson then returned to NVivo to recode the responses as points on this continuum: mutual interest, self interest, and intent to harm.

Analysis of our process demonstrated that NVivo supported transparency-in-motion (triage, show, and reflect) through deep individual analysis with the tool and broader episodic analysis with the full research team. Despite differential access to NVivo among team members (only the lead site had a site license), the tool offered all researchers better opportunities for working with the data and visualizing relationships within it. The QDAS tool could play this role under these restrictive circumstances because senior leadership, knowledgeable and experienced in the use of the tool, could support full-group opportunities to use and think with it.

These findings indicate that Jackson’s notion of transparency-in-motion has relevance to both individual researchers and research teams. Using the key criteria of triage, show, and reflect, researchers were able to manage the use of QDAS as they continuously worked toward transparency-in-motion under less-than-ideal conditions. Discussions of transparency have long been prominent in qualitative research, but we suggest that in today’s post-modern/post-structural world, transparency-in-motion may be a more useful perspective for qualitative researchers to adopt.

 

References

Davidson, J. (2011). Giving birth to theory in qualitative research: Adolescents and the sexting continuum. Brown bag presentation, Qualitative Research Network, University of Massachusetts Lowell, November 9, 2011.

Davidson, J. (2014).  Sexting: Gender and Teens.  The Netherlands: Sense Publications.

Davidson, J., Thompson, S., Harris, A., (under review).  Qualitative Data Analysis Software Practices in Complex Qualitative Research Teams:  Troubling the Assumptions about Transparency (and Portability).

Harris, A., Davidson, J., Letourneau, E., Paternite, C., & Miofshky, K. T. (2013, September). Building a prevention framework to address teen “sexting” behaviors (189 pp.). Washington, DC: U.S. Department of Justice, Office of Juvenile Justice & Delinquency Prevention.

Jackson, K. (2014).  Qualitative Data Analysis Software, Visualizations, and Transparency: Toward an Understanding of Transparency in Motion.  Paper presented at the Computer Assisted Qualitative Data Analysis conference, May 3, 2014.  Surrey, England.

 

Points of view or opinions in this document are those of the author and do not necessarily represent the official position or policies of the U.S. Department of Justice, which funded the project.


Call for papers: ICQI2017

And community updates!

Thank you to all of the presenters at the 2016 International Congress of Qualitative Inquiry (ICQI)! We welcomed 49 presenters to our Special Interest Group (SIG) on Digital Tools for Qualitative Research (DTQR), and thanks to our 14 sponsors, we gave away 30 books and software licenses during the raffles. We once again had a wonderful conference experience, and look forward to seeing many of you again in 2017.

ICQI 2017

  • Plan now to submit a paper proposal to our DTQR SIG for 2017. Submissions are due December 1, and guidelines are available on the ICQI website (http://icqi.org/). See our prior mini-programs for ideas.
  • If you are interested in helping us review proposal submissions, serving as a session chair, organizing the program, or planning any special events, please let us know.
  • There will be several DTQR-related workshops in 2017 – offered by Sharlene Hesse-Biber, Kristi Jackson, Anne Kuckartz, Trena Paulus & Jessica Lester. Check them out! http://icqi.org/home/workshops/
  • During our SIG meeting at ICQI 2017 we will propose a leadership structure and hold elections for officers. Positions will include SIG chair (currently Kristi Jackson), program chair (currently Judith Davidson) and outreach/marketing chair (currently Trena Paulus). Position descriptions will be posted on the Digital Tools website prior to the conference for your review and consideration and we will send a comprehensive message regarding this and other conference events a few weeks prior to ICQI 2017.

News and notes

  • Our special issue of Qualitative Inquiry on DTQR, based on the 2015 conference papers, is now in press. We will announce its publication through our social media outlets (see below).
  • Send us your news! What are you publishing? What other conferences are you attending? What workshops are you hosting? We are happy to share this with the community via our social media outlets.
  • We have launched the DTQR blog series and welcome your contributions. Want to share what you’ve been working on, thinking about or theorizing? Contact daniel@quirkos.com with your idea.
  • We enjoyed seeing many of you at the KWALON conference in Rotterdam in late August (http://www.kwalon.nl/kwalon-conference-2016), where we continued conversations about the future of QDA software. The final session, on interoperability, was videotaped: https://www.youtube.com/watch?v=sU2hv4N6d6I
  • We would love updates from anyone who attended the Qualitative Methods and Research Technologies Research Summit in Cracow, September 1-3 (http://www.esa-cracow.pl/register.html). Consider contacting daniel@quirkos.com with your idea for a blog post.

Upcoming events

Follow us on social media:

Teaching qualitative research online

In this age of digital tools, we are using digital tools not only to do our research, but to teach the next generation of scholars. In this month’s blog post, Kathryn Roulston and Kathleen deMarrais from the University of Georgia describe their foray into the world of teaching qualitative research online.

***

Teaching qualitative research online

Kathryn Roulston and Kathleen deMarrais

“I hate online teaching!” That is a sentiment that we have heard expressed by some teachers in higher education. In addition to the workload involved in teaching online, many faculty members are skeptical about the technical support they will receive for learning how to teach online, as well as the outcomes of online education for their students. Although we too have experienced our own doubts at times, we have voluntarily developed online coursework for the Qualitative Research program at the University of Georgia. We have learned much from this journey into online education, and are happy with the outcomes for our students.

To begin, we started with courses that we had taught for many years in face-to-face contexts. Over a period of years, we developed the online course content as we taught hybrid versions of our courses that involved both face-to-face and online instruction. In 2014, we began to offer the core courses of our on-campus certificate in Interdisciplinary Qualitative Studies in a fully online format. Judith Preissle and Kathleen deMarrais’s notion of “qualitative pedagogy” informs our approach to online teaching. That is, we approach teaching in the same way we approach doing qualitative research: by being responsive, recursive, reflexive, reflective, and contextual.

We have attended to how students respond to and engage with online coursework by conducting a research study (with Trena Paulus and Elizabeth Pope) on graduate students’ perceptions and experiences of online delivery of coursework. Over the past two years, 18 students from four courses have been interviewed. We have also looked at naturally occurring interaction during the courses. Below we share some of what we have found to date.

Why do students choose online coursework?

Convenience, flexibility, and accessibility. Students take online coursework because they perceive it to be convenient and accessible for commuters and distance students. On-campus students who are not commuters also choose online coursework because not having to attend face-to-face classes allows more flexibility in individual schedules. We have found that even students who have had poor prior experiences in online courses, or who prefer face-to-face coursework, will choose online coursework if it is thought to be more convenient or accessible.

Learning through reading and writing. Some students prefer taking coursework online rather than face-to-face, because they appreciate the intensive reading and writing involved. Further, since students in our courses are required to post weekly, more reticent students are more visible than they might otherwise be in face-to-face contexts. Students are also able to revisit online resources, something that is appreciated by English language learners especially.

What are the challenges of learning online?

Managing the weekly schedule. To be successful in the online learning context, students must learn to manage the weekly schedule in new ways. Students in our coursework are expected to log in to the Learning Management System (LMS) several times a week; if they do not, they can easily fall behind and feel overwhelmed.

Course design and organization. Understanding what is expected each week and locating relevant materials may be an initial challenge for students. This is aided when courses use a repetitive design in which content materials are organized in a standardized format with expectations for what is required each week clearly identified. Students are expected to access content in modules that are organized similarly, and post and respond to one another’s work at the same time each week. Although several of our students reported that they perceive this kind of routinization as somewhat tedious, it appears to help the majority of students to accomplish tasks successfully.

Mastering content. Students view learning about approaches to qualitative inquiry and how to design and conduct qualitative research on a topic of individual interest as challenging. Since courses we have taught are delivered asynchronously, students need to let instructors know when they are unclear about what is expected of them. Several students have reported that they are reluctant to ask questions of the instructor online – although multiple forums are provided for them to ask questions related to content, assignments, or technical issues. Students also have opportunities to meet with instructors in an online meeting room to discuss questions, although not all students take advantage of these meetings.

Interacting with others online. Students report that they want to engage and connect with the instructor and other students in authentic ways online. Some students perceive that classmates do not always post in authentic ways. In online learning contexts, misunderstandings can easily occur, so both instructors and students must learn how to communicate effectively in asynchronous formats. Most of all, students appreciate constructive feedback from their instructors and classmates, and several have told us that they have formed friendships that extended beyond the life of a course.

How do students assess the learning outcomes of qualitative coursework delivered online?

Overall, students assess the coursework that we have taught online positively and the learning outcomes as equivalent to coursework taught face-to-face. Students are able to meet their learning goals with respect to learning about methods used by qualitative researchers (e.g., interviewing; observing; document analysis; data analysis); how to design a qualitative study; the relationship of theory in doing research; assessment of quality; and writing up findings from qualitative studies.

What helps students to be successful in online learning contexts?

Students report that effective learning is facilitated by:

  • Clear organization of course content;
  • Use of a variety of multi-media resources, including screencasts and videos, audio-enhanced presentations, quizzes, course readings and so forth;
  • Timely feedback from the instructor; and
  • Course projects that contribute to individual research interests.

What have we learned about teaching qualitative research online?

Learning about our students’ perspectives has helped us to understand the value of:

  • Modeling and encouraging students to be authentic in how they represent themselves online (e.g., posting personal photos and/or videos, organization of synchronous online meetings, and sharing of personal details about one’s off-line life).
  • Providing specific guidance to students in being reflexive with respect to their learning preferences and management of their course schedule in order to develop self-directed learning skills.
  • Structuring activities and assignments in ways that build upon one another across the course sequence.
  • Providing clear instructions and content organization to assist students as they navigate the Learning Management System to locate materials and complete assigned tasks.
  • Delivering reminders in multiple modes (e.g., email, videos, news items etc.) to assist students in managing the weekly schedule.
  • Encouraging students to be reflective about how they learn and interact with others in the online environment.
  • Attending to students as individuals, even as we also consider the wider context in which any particular course is being taught.

As teachers of qualitative inquiry, we have been challenged to adapt to changes in student populations with whom we work and innovate in how we teach. Learning to teach online has prompted us to think carefully about how we teach qualitative inquiry and what we expect students to know and be able to do as a result of engaging in our courses. We continue to be excited about learning to teach online and how we might engage students in digital spaces. We know that we still have much to learn about effective online teaching.

For more information on qualitative pedagogy, see:

Preissle, J., & deMarrais, K. (2011). Teaching qualitative research responsively. In N. Denzin & M. D. Giardina (Eds.), Qualitative inquiry and global crisis (pp. 31-39). Walnut Creek, CA: Left Coast Press.

Preissle, J., & deMarrais, K. (2015). Teaching reflexivity in qualitative research: Fostering a life style. In N. Denzin & M. D. Giardina (Eds.), Qualitative inquiry and the politics of research (pp. 189-196). Walnut Creek, CA: Left Coast Press.

For more information on the University of Georgia’s Online Graduate Certificate in Interdisciplinary Qualitative Studies, visit our website.

Speculating on the future of digital tools

We are happy to announce the publication of an article in Qualitative Inquiry:

Speculating on the Future of Digital Tools for Qualitative Research

Judith Davidson, Trena Paulus, and Kristi Jackson

Development of digital tools for qualitative research over the past 20 years has been driven by the development of qualitative data analysis software (QDAS) and the Internet. This article highlights three critical issues for the future of digital tools: (a) ethics and its challenges, (b) archiving of qualitative data, and (c) the preparation of qualitative researchers for an era of digital tools. Excited about the future and the possibilities of new mash-ups, we highlight the need for vibrant communities of practice where developers and researchers are supported in the creation and use of digital tools. We also emphasize the need to be able to mix and match across various digital barriers as we engage in research projects with diverse partners.

Launching the DTQR blog series

One outcome of the community conversations at ICQI was the decision to launch a series of blog posts related to this year’s presentations. If you are interested in contributing a post, let us know at DigitalTools@queri.com. This month’s post is by Kristi Jackson of Queri, Inc. Let us know what you think!

DETERMINISM VS. CONSTRUCTIVISM: THE POLARIZING DISCOURSE REGARDING DIGITAL TOOLS FOR QUALITATIVE RESEARCH AND HOW IT THREATENS OUR SCHOLARSHIP

The Power Broker, written in 1974 by Robert Caro, won a Pulitzer Prize as a biography of one of the most prolific and polarizing urban planners in American history, Robert Moses. Known as the “master builder,” his racism and elitism were evident in much of his work. He publicly resisted the move of black veterans into Stuyvesant Town, a Manhattan residential area created specifically to house World War II veterans. He also set his sights on the destruction of a playground in Central Park to replace it with a parking lot for the expensive Tavern-on-the-Green restaurant. Although he was never elected to any public office, he once held twelve titles simultaneously, including NYC Parks Commissioner and Chairman of the Long Island State Park Commission (Caro, 1974).

In his piece, “Do artifacts have politics?” Langdon Winner (1980) describes the 204 bridges over the parkways of Long Island that were built under the supervision of Moses. Some of these bridges had only nine feet of clearance. According to a claim by a co-worker, Moses purposely built the low bridges to prevent buses from traveling through the area’s parkways. This privileged white, upper-class owners of automobiles, who could fit their vehicles under the bridges, while restricting access for poorer people (often blacks), who tended to ride buses. This is one of Winner’s examples of the social determination of technology: that there are explicit and implicit political purposes in the histories of architecture and city planning.

Because I am here to talk about QDA Software and Digital Tools for Qualitative Research – industries with builders, owners, developers, consultants and trainers – you may be starting to see where I am going with this. While we QDA Software experts like to think we are building bridges – that is, building access – our critics sometimes insist that we are being obstructionist. Before coming back to my story of Robert Moses, let me define my two key terms. As a caveat, I am setting aside some of the diverse understandings and controversies around these terms.

  1. Determinism presumes that all events, including human actions and choices, are the results of particular causes. This approach to knowledge isolates elements and tends to define them as causes or results.
  2. Constructivism presumes that views of ourselves and our world are continually evolving according to internal processes of self-organization. These views are not the result of exposure to information or dedicated adherence to principles like Occam’s Razor. Rather, our views of ourselves and our world expand, contract, plateau, and adjust from within.

From my insider perspective, experts in QDA Software tend to emphasize constructivist perspectives. Researchers carve out their own paths in the use of the software for a particular study. As Miles and Huberman (1994) argued, the flexible, recursive and iterative capabilities of software provide unprecedented opportunities to challenge researcher conceptualizations. Lyn and Tom Richards (1994) stated that as they began developing NUD*IST, their analysis entailed a constant interrogation of themes. They also stated that one of their primary goals was to allow for a diverse range of theories and methodologies to be applied, and for these to be adjustable over time.

From my vantage point, many critics who argue that QDA Software compromises the qualitative research process are invoking the language of determinism. The particular flavor is one of technological determinism. The most common critiques include the tendency for this genre of software to

  • Distance researchers from the data (Agar, 1991).
  • Quantify the data (Welsh, 2002; Hinchliffe, Crang, Reimer & Hudson, 1997).
  • Homogenize the research process (Barry, 1998; Coffey, Holbrook & Atkinson, 1996).
  • Take precedence over researcher choices (Garcia-Horta & Guerra-Ramos, 2009; Schönfelder, 2011).
  • Lull researchers into a false sense of confidence about the quality of their work (MacMillan & Koenig, 2004; Schwandt, 2007).

The point is that when it comes to the debates between critics and supporters of QDA Software, a markedly different perception of freedom is in play. While critics often frame QDA Software with the language of determinism, the advocates often frame it with the language of constructivism. To the former, the software limits personal agency by standardizing processes; to the latter, the software expands options and promotes diversity.   

Caro received a Pulitzer for his scholarship on Robert Moses, and Winner has been cited widely. But Bernward Joerges (1999), suspicious of some of the characterizations of Moses, carefully examined the historical details and presents us with another picture. The regional planner who claimed that Moses purposely built the low bridges to limit access did so almost 50 years after the event, through his own deduction after measuring the bridge height. Next, Kenneth Jackson, a historian and encyclopedia editor of New York history, repeatedly had his students research themes and episodes in Caro’s thirteen-hundred-page biography and found that many were “doubtful or tendentious.” Most tellingly, correspondence with US civil engineers noted that commercial traffic such as buses and trucks was prohibited from such parkways anyway, so building more than 200 unnecessarily high bridges would have been fiscally irresponsible. Finally, Moses constructed the Long Island Expressway alongside the parkways; it did not restrict commercial traffic and provided access to the beaches for any vehicle.

So, what are we to make of these different characterizations of Robert Moses and what do they have to do with determinism, constructivism and the scholarship about QDA Software?

Thomas Gieryn (1983, 1999) introduced boundary work as the activity of scholars who purposefully attempted to demarcate science from non-science. Gieryn argued that boundary work among scientists was part of an ideological style that functioned to promote a public image of some ways of knowing over others and was a key factor in the ascendance of the scientific method. He argued that this boundary work could occur in any discipline, including history (such as my description of the bridges of Robert Moses) and research (such as my descriptions of QDA Software). What I am pointing to, as part of my conclusion, is that most scholars view the bridges of Robert Moses as the contested entity: the bridges are the artifacts that either liberate human behavior, on one side, by facilitating agency, or control human behavior, on the other, by limiting it.

However, as Joerges (1999) notes in his assessment of the many different ways the bridges have been described for different purposes, the artifact is the telling of the story, not the bridges themselves. The bridge story has been used over and over because it is handy, and because it tells well. In my rendering, the telling of the bridge story is an allegory for the many deterministically laden critiques of another artifact: QDA Software. Together, these critiques amount to a one-sided, inaccurate view of how the software limits agency and limits those of us who use it. It is a story told over and over as part of the professional boundary work of those who critique the software, just as we use constructivist language to promote it.

But as a QDA Software expert, I am aware of our own boundary work in the way we sometimes talk of qualitative researchers who do not use QDA Software. Unflattering descriptions like:

  • Rigid
  • Luddites
  • Lazy
  • Afraid
  • Behind the times
  • Wildly uninformed about the capabilities of the software

In contrast, most of them would probably use the same constructivist language to define their own work as we do: Flexible, diverse, etc.

As a larger community of qualitative researchers, we often use the language of determinism to critique the “other,” by invoking fairly simplistic cause-and-effect characterizations such as the unflattering ones I just unleashed. This is a form of boundary work that both camps use to allegedly protect the values of freedom, diversity and individual agency against the oppressive, homogenizing other. After all, that is, collectively, what we qualitative researchers tend to do when we perform boundary work with quantitative researchers. It is a strategy we’ve honed over the years to help demarcate and protect our approach to knowledge.

So, you see, the lesson regarding these determinist versus constructivist discourses may be less about epistemologies and more about the language we use to engage in boundary work: to other-ize each other. This has never led to any positive, innovative changes in practice in any field, and it won’t in ours either. Our challenge is to find new avenues for scholarship that minimize the boundary work and maximize the collaborative work.

References
Agar, M. (1991). The right brain strikes back. In N. G. Fielding & R. M. Lee (Eds.), Using computers in qualitative research (pp. 181-194).
Barry, C. A. (1998). Choosing qualitative data analysis software: Atlas/ti and Nud*ist compared. Sociological Research Online, 3(3). Retrieved from http://www.socresonline.org.uk/socresonline/3/3/4.html
Bourdieu, P. (1977). Outline of a theory of practice. New York: Cambridge University Press. Translated by Richard Nice.
Caro, R. A. (1974). The Power Broker. New York: Knopf
Coffey, A., Holbrook, B., & Atkinson, P. (1996). Qualitative data analysis: Technologies and representations. Sociological Research Online, 1(1), http://www.socresonline.org.uk/1/1/4.html
Garcia-Horta, J. B., & Guerra-Ramos, M. T. (2009). The use of CAQDAS in educational research: Some advantages, limitations and potential risks. International Journal of Research & Method in Education, 32(2), 151-165.
Gieryn, T. F. (1999). Cultural boundaries of science: Credibility on the line. Chicago: The University of Chicago Press.
Gieryn, T. F. (1983). Boundary-work and the demarcation of science from non-science: Strains and interests in professional ideologies of scientists. American Sociological Review, 48, 781-795.
Henderson, K. (1999). On line and on paper: Visual representations, visual culture, and computer graphics in design engineering. Cambridge: MIT Press.
Hinchliffe, S. J., Crang, M. A., Reimer, S. M.,  & Hudson, A. C (1997). Software for qualitative research: 2. Some thoughts on ‘aiding’ analysis. Environment and Planning A, 29(6), 1109 – 1124.
Joerges, B. (1999). Do politics have artifacts? Social Studies of Science, 29(3), 411-431.
MacMillan, K., & Koenig, T. (2004). The wow factor: Preconceptions and expectations for data analysis software in qualitative research. Social Science Computer Review, 22(2), 179-186.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks: Sage Publications.
Richards, L., & Richards, T. (1994). From filing cabinet to computer. In A. Bryman & R. G. Burgess (Eds.), Analyzing qualitative data (pp. 146-172). London: Routledge.
Schwandt, T. A. (2007). The Sage dictionary of qualitative inquiry (3rd ed.). Thousand Oaks: Sage Publications.
Schönfelder, W. (2011). CAQDAS and qualitative syllogism logic: NVivo 8 and MAXQDA 10 compared. Forum: Qualitative Social Research, 12(1), Article 1.
Tesch, R. (1990). Qualitative research: Analysis types and software tools. London: Falmer Press.
Welsh, E. (2002). Dealing with data: Using NVivo in the qualitative data analysis process. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 3(2).
Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121-136.

 

Report from the International Congress of Qualitative Inquiry 2016

The digital tools special interest group (DTQR) at ICQI, chaired by Kristi Jackson (QUERI) and co-chaired by Trena Paulus (University of Georgia) and Judy Davidson (University of Massachusetts-Lowell), coordinated eight panels with a total of 35 papers this year on a variety of topics such as the use of QDA software for facilitating complex teamwork, teaching qualitative inquiry online, and exploring gender and identity in digital spaces.

Approximately 100 conference participants attended the digital tools sessions over the two days. The opening plenary included presentations by Kakali Bhattacharya on virtual ethnography (Kansas State University), Harriett Green on the digital humanities (University of Illinois), Christina Silver on Five-Level QDA (QDA Services), and Kristi Jackson on new discourses for digital tools (QUERI).

We had an engaging final mash-up working meeting for the SIG on Saturday, so stay tuned here for further developments. For now, here are some photos from the event.


Opening plenary speakers Kristi Jackson, Trena Paulus (discussant), Christina Silver, Kakali Bhattacharya, and Harriett Green.


Nick Woolf and Christina Silver present their Five-Level QDA Training Model


Participants mingle in between Digital Tools SIG sessions

 


Faculty and students from the University of Georgia presented on teaching qualitative research online, with Judy Davidson serving as discussant


Johnny Saldana serenades the opening session with his digital tools song


David Woods talks about complex teamwork and QDAS

 

Community Events at ICQI

Community Events
International Congress of Qualitative Inquiry (ICQI)
University of Illinois, Champaign-Urbana
May 18-21, 2016
http://icqi.org/registration/


  • If you are attending the International Congress of Qualitative Inquiry, join us at our three community events.
  • Each event focuses on our Special Interest Group: Digital Tools for Qualitative Research.
  • We have over 25 items to raffle at these community events, including QDA Software, books, and more! (Locations will be provided when the final program is made available by ICQI).

 Thursday, May 19

7:00 – 9:00 p.m.

Meet & Greet at the Opening Midwest BBQ, with a raffle at 8:00pm
(look for the blue balloons)

 Friday, May 20

8:00 – 9:20 a.m.

The Construction and Use of Digital Tools for Qualitative Research: Challenges on the Horizon for Qualitative Research, our opening Panel.

 Saturday, May 21

11:00 a.m. – 12:20 p.m.

SIG Mashup: A Working Meeting, where we will review our activities for the last year and discuss future plans.

 We will send a full schedule of our SIG sessions, including abstracts,
as soon as the final ICQI program is available.