Research goals and questions exercise

See the attached list of research goals and research questions.


1) Match the research goal to the research question(s) and 

2) identify them as either qualitative or quantitative (no mixed methods yet), and 

3) explain WHY it is so. 

Use the attached table: paste the goals and questions into it and provide your answers.


Look for specific keywords to help you differentiate between qualitative and quantitative, and remember that the “why” answer is vital.

Look at pages 10 and 14-15 of the attached McGregor document for some of the "whys," but use short sentences to explain them (don't just copy and paste the whys).

SAGE Research Methods

Understanding and Evaluating Research: A Critical Guide

Author: Sue L. T. McGregor
Pub. Date: 2019
Product: SAGE Research Methods
DOI: https://dx.doi.org/10.4135/9781071802656
Methods: Theory, Research questions, Mixed methods
Disciplines: Anthropology, Education, Geography, Health, Political Science and International Relations, Psychology, Social Policy and Public Policy, Social Work, Sociology
Access Date: January 11, 2023
Publishing Company: SAGE Publications, Inc.
City: Thousand Oaks
Online ISBN: 9781071802656

© 2019 SAGE Publications, Inc. All Rights Reserved.

Overview of Research Design and Methods

Learning Objectives

• Distinguish between methodology, method, and research design
• Compare and contrast research design as logical and as logistical
• Appreciate the difference between reconstructed logic (quantitative) and logic-in-use (qualitative), especially an emergent research design
• Explain the five logics of mixed methods research designs
• Appreciate the link between research inquiry and research design
• Describe the purpose and importance of the Methods section of a research report
• Compare and contrast qualitative and quantitative intellectual inquiries
• Identify the major reporting components (subheadings) of qualitative and quantitative research reports
• Compare and contrast the most agreed-to approaches and terms about research integrity, rigor, and quality that are used in each of the three methodologies, and learn the attendant strategies to meet the standard for the specific research methodology

Introduction

This chapter focuses on the constructs of research design and the Methods section in a research paper. Research design is a larger construct than methods, to be explained shortly. But within a research paper, once the authors have stated the research question, developed an introduction to the study, and presented a review of the literature (and maybe a theoretical framework), their next step is to provide a description of the strategies used to collect and analyze data pursuant to the research question—that is, their methods. This chapter provides a generic discussion of methods, followed by much more detail in Chapter 9 (qualitative methods) and in Chapter 10 (quantitative and mixed methods).

As a caveat, a detailed discussion of how to use specific methods is beyond the scope of this overview chapter, or even this book. There is no attempt to explain how to do a survey, conduct a scientific experiment, prepare a case study, or engage in ethnographic research where researchers immerse themselves in the lives of the participants. That being said, the general discussions in Chapters 9 (qualitative) and 10 (quantitative and mixed methods) will address the basic conventions pursuant to preparing, conducting, and reporting these types of research, which entails identifying common methods.

This generic chapter will begin with a discussion of the larger construct of research design, including the link between research design and research inquiry, research design as logical and logistical, and the most common research designs organized by the three methodologies: qualitative, quantitative, and mixed methods. The conversation then shifts to a general overview of methods (distinguished from methodology). The purposes of the Methods section are identified, followed by general introductions to the major differences between qualitative and quantitative inquiries, the major reporting components (subheadings) of each of these research reports, and the topic of rigor and quality in each of the three methodologies.

Etymology and Definition of Methods and Research Design

Method is Greek methodus, "mode of inquiry or investigation." It stems from meta, "after," and hodos, "a travelling, a pursuit, a way of teaching or going" (Harper, 2016). In effect, method refers to investigating or inquiring into something by going after or pursuing it, especially in accordance with a plan. It involves techniques, procedures, and tasks used in a systematic, logical, and orderly way (Anderson, 2014). Within the context of conducting and reporting research, it is the stage wherein researchers design instruments, apparatus, or procedures, gain site access (if relevant), obtain a sample, and then collect and analyze data from that sample (or the entire population) (Johnson & Christensen, 2012). As was discussed in Chapter 2, this book distinguishes between method and methodology, with the latter connoting the philosophical underpinnings of the study.

The other term used in this chapter is research design. Research is French recercher, "to search." In the context of this book, it refers to the accumulation of data that are interpreted, leading to new knowledge. Design is Latin designare, "to mark out, devise, choose, designate." A design can be defined as a plan used to show the workings of something before it is made or created. It can also mean the underlying purpose of something, in this case, the search for knowledge (Anderson, 2014; Harper, 2016). From a technical stance, the research design refers to the overall strategy that researchers choose to integrate the different components of their study in a coherent and logical way, thereby ensuring they can effectively address the research question using the new knowledge created from the study (Labaree, 2016). Research design also entails logic (Yin, 1984), to be discussed shortly.


Research Design

Many disciplines mistake research design for methods (de Vaus, 2001). This section explains how this book distinguishes between these terms, respecting the lack of consensus in the scholarly arena on their usage. Research design is a larger construct than method. Per above, methods refer to technical procedures, techniques, or steps taken to obtain information and analyze data for a study. A design is a plan made before something is done (Anderson, 2014). Designing research is a process that entails both logic (thinking and reasoned judgments) and logistics (doing), with logic coming first, inherently shaping logistics (methods) (Yin, 1984).

Research Inquiry and Research Design

The logic and thinking that researchers use to design their research is affected both by (a) the methodology (which shapes the research questions and all assumptions underlying the effort) and (b) the type of research inquiry they are conducting. In short, (a) exploratory research strives to reach a greater understanding of a problem, usually laying the groundwork for future studies; (b) descriptive research seeks more information so as to accurately describe something in more detail, creating a fuller picture by mapping the terrain; and (c) explanatory research seeks to connect ideas to understand causal inferences (explain relationships) (de Vaus, 2001; Suter, 2012; Yin, 1984). These approaches apply to both quantitative and qualitative research methodologies (except explanatory), with qualitative research also seeking to (d) illuminate meaning and subjective experiences and (e) understand processes and structures (Blaxter, 2013; Shank & Brown, 2007).

Articulating Research Purpose in Research Design

Each of these five types of research inquiry represents the deeper purpose of the study (the problem), or the reasons for doing it, which is why Yin (1984) said research design is logical (i.e., it entails reasoned judgments). Each type of inquiry offers a different reason for why the study is needed (e.g., to describe, explore, find meaning, or theorize). Authors must not confuse research purpose (reason for inquiry) with methodology, research design, research question, or methods (see Example 8.1). When identifying the nature of their research inquiry, they can use headings in their paper such as Justification for the Study, Importance of the Study, or Objectives of the Study (Newman, Ridenour, Newman, & DeMarco, 2003). A clearly stated research purpose will help readers formulate a realistic set of expectations about a study and better ensure they evaluate the quality of the study's design within the context of the author's purpose (Knafl & Howard, 1984) (see Chapter 6).

Example 8.1 Research purpose versus question

The problem is the deeper, more complex reason why the researcher is conducting the study (e.g., to explore, describe, explain, or find meaning). Newman et al. (2003) recounted a quantitative study in which the research question was incorrectly presented as the research problem: "What is the effect of making a substantial reduction in class size on student achievement?" The researchers erroneously characterized class size as the problem when in fact students' lack of achievement was the problem and the reason why this explanatory study was needed (i.e., to explain). In this study, reducing class size was but one solution to increasing student achievement. By losing focus on what the real problem was (lack of achievement), the researchers designed an inappropriate study if they wanted to explain it. An unfortunate consequence of authors neglecting to clearly state their purpose and problem is that some readers may uncritically accept their results and change their practice when they should not.

Research Design as Logical and Logistical

Research designs guide the methods decisions that researchers must make during their studies, and they set the logic by which interpretations are made at the end of studies (Creswell, 2008). To further appreciate the link between research design logic and method, authors can consider this metaphor. Before builders or architects can develop a work plan or order building materials, they must first establish the type of building required, its uses, and the needs of the occupants; that is, they must think about their entire build and justify any design decisions they make. Their work plans (methods) to construct the building then flow from this logic (i.e., their reasoned judgments about the build). The same idea holds for a study's research design (de Vaus, 2001).

Research design as logic concerns researchers thinking about what sorts of data are needed to answer the research question, including what methods might be most appropriate to generate those data. The type of research inquiry (i.e., the purpose behind the research) shapes the overall structure of the study, especially the methods (Kotler, 2000; Newman et al., 2003). Research design as logical equates to a blueprint with specific, sequenced (sometimes iterative) steps to be completed to bring the plan to closure (i.e., the methods, which are the focus of this chapter). Research design as logistical refers to the work plan developed by the researcher to collect evidence and analyze data to answer the research question and respect the type of research inquiry (the logic). Logistical means planning and organizing to make sure things are where they need to be so an activity or process can happen effectively (Anderson, 2014). The logic affects the logistics (methods), and the logistics reflect the logic (Yin, 1984) (see Figure 8.1).

Quantitative Research Design Logic

Quantitative research uses a predetermined, fixed research plan based mostly on reconstructed logic. This logic of research is based on organizing, standardizing, and codifying research into explicit rules, formal procedures, and techniques so others can follow the same linear plan and reconstruct the study. This is the logic of "how to do research" and is highly organized and systematic (Jarrahi & Sawyer, 2009; Neuman, 2000). The type of research inquiry determines the research design created using this logic (see Table 8.1). Should researchers create a cross-sectional design (collect data once from one sample), a repeated cross-sectional design (collect data once from different samples), a longitudinal design (collect data from one sample over time), a one-subject design, an experimental design, a case study, or some other design (Kotler, 2000)?


    Figure 8.1 Research Design as Logic and Logistical

Table 8.1 Three Types of Research Inquiries, With Examples of Quantitative Research Designs

Exploratory Research Inquiry: cross-sectional design; case study design
Descriptive Research Inquiry: cross-sectional design; longitudinal design; case study design
Explanatory Research Inquiry: cross-sectional design; experimental design; case study design


Qualitative Research Design Logic

The research designs included in Table 8.1 (based only on reconstructed logic) do not adequately represent "the logic and processes of qualitative research [which] lacks such an elaborate typology into which studies can be pigeonholed" (Maxwell, 2008, p. 214). Maxwell (2008) said "this does not mean that qualitative research lacks design" (p. 215). Instead, qualitative research requires a broader and less restrictive concept of research design, in which researchers use "'logic-in-use' [as well as] 'reconstructed logic' [to accommodate the] 'design in use' [principle]" (p. 216). This is called an emergent research design, wherein the original plan changes as the research unfolds, meaning it is nonlinear (Creswell, 2009) (discussed in Chapter 9). Regardless, the end result is data that are then analyzed, interpreted, and discussed, leading to conclusions, implications, and recommendations (de Vaus, 2001; Suter, 2012; Yin, 1984).

As a final caveat, de Vaus (2001) explained that researchers should not equate a particular logistical method with a particular research design logic. It is also erroneous to equate a particular research design with either quantitative, qualitative, or mixed methods approaches. Instead, authors need to bear in mind the link between (a) the purpose of the research (logical inquiry) and (b) their research design (both logic and logistics) (Yin, 1984) and then introduce their Methods section accordingly (see Examples 8.2 and 8.3).

Example 8.2 Quantitative research design and method

This exploratory, quantitative research inquiry employed a cross-sectional research design. Data were collected from a purposive sample using the survey method, specifically a piloted questionnaire designed for this study. Descriptive statistics were used to analyze the data using Minitab software, and the results were reported using frequencies, percentages, and means (averages).

Example 8.3 Qualitative research design and method

This qualitative research inquiry employed an emergent research design, using the phenomenological method. Data were collected from a snowball sample of individual participants by way of interviews. The data were thematically analyzed, and findings were reported using quotes and a supportive narrative.


Mixed Methods Research Design Logic

Authors of mixed methods studies should avoid rhetorical logic, meaning they should not assume that one strand of data is only there to embellish their analysis of the other strand and is not really considered to be a necessary part of their analytical interpretation or argument. Mixed methods explanations and interpretations require more challenging logics. Mason (2006) identified five logics, one being rhetorical. Parallel logic assumes each strand has its own logic (see above), and authors would run these in parallel and report two different sections, one for each strand. A third approach is corroborative logic, which concerns itself with data triangulation. Researchers would strive to use data from each strand to corroborate each other (confirm or give support). If researchers use an integrative logic, they likely choose this at the beginning of the research design process so they can intentionally link insights from both data streams to get a better picture of the whole phenomenon (see Chapter 10).

Mason (2006) identified multidimensional logic as the most challenging type of mixed methods logic. "The argument is that different methods and approaches have distinctive strengths and potential which, if allowed to flourish, can help [researchers] understand multi-dimensionality and social complexity. . . . The logic imagines 'multi-nodal' and 'dialogic' explanations which are based on the dynamic relation of more than one way of seeing and researching. This logic requires that researchers factor into their accounts the different ways of asking questions and of answering them" (pp. 9-10). It differs from the other logics, which assume data integration rather than data intersection. The latter "involves a creative tension between the different methods and approaches, which depends on a dialogue between them" (p. 10). This dialogue cannot occur without everyone involved embracing a logic that respects multiple dimensions and points of view (researchers themselves and research methodologies, with attendant assumptions, as discussed in Chapter 2).

Review and Engagement

When critically reading a research report, you would

□ Ascertain whether the authors used the term research design when introducing their methods, without confusing the two concepts
□ Make sure they shared their thinking and reasoning about how best to answer their research questions (i.e., explained the logic used when creating their research design, especially what type of data were needed to answer their research questions)
□ Per the above, determine if they included a section titled Justification for or Importance of the Study
□ Determine if they properly referred to reconstructed (deductive) logic (quantitative) or logic-in-use (qualitative emergent research design) or if they referenced mixed methods logics
□ Determine if they clarified their research design (see Table 8.2)
□ Determine if they explicitly stated the type of research inquiry they employed (exploratory, descriptive, explanatory, meaning seeking, or understanding processes and structures)

Most Common Research Designs

Table 8.2 summarizes the most common research designs for each of qualitative, quantitative, and mixed methods studies, discussed in much more detail in Chapters 9 and 10. These approaches to designing research differ because of methodological distinctions, discussed in more detail in the second part of this overview chapter.

Table 8.2 Main Types of Qualitative, Quantitative, and Mixed Methods Research Designs

Qualitative Research Designs (involve changing tactics over the course of the study):

• Interpretive—insights from interpreting data change the research design
• Investigative—traces out a phenomenon in its natural field setting
• Participatory—research design is codeveloped with participants
• Illuminative—strategically focuses on one aspect of research design
• Instrumentation—study creates a new data collection instrument
• Sensitization (descriptive)—sensitizes readers to participants' situation
• Conceptualization (theory building)

Quantitative Research Designs (involve adhering to a formal plan with no deviation):

• Descriptive—describes what actually exists, as well as its frequency, and then categorizes the information
• Correlational—examines whether a change in a variable (no manipulation) is related to change in another
• Comparative—measures variables that occur naturally in existing groups, then compares them to determine their influence on the dependent variable
• Experimental—manipulates independent variables, measures changes in the dependent variable (experiment and control), and infers causal links
• Quasi-experimental—employs an experimental and control design using existing groups, then cautiously infers causation
• Predictive exploratory—determines how variables may be used to develop data-based models of a phenomenon
• Survey (nonexperimental)—examines an already-occurred event in naturally occurring groups

Mixed Methods Research Designs (involve some prioritized combination of strategy and tactics):

• Use qualitative methods to explain quantitative data (words to explain numbers)
• Use quantitative methods to further explain qualitative data (numbers to explain words)
• Use both methods to achieve triangulation

Methods

The discussion now turns from the construct of research design to that of methods, which are understood to include instrument development and apparatus, sampling, data collection, and data analysis, differing for each of the three methodologies used to shape this book: qualitative, quantitative, and mixed methods. This chapter provides a generic discussion of methods, followed by Chapter 9 (qualitative methods) and Chapter 10 (quantitative and mixed methods).

Methodology Versus Methods

Many disciplines use the word methodology to refer to methods (Schneider, 2014). This section explains how this book uses these terms, respecting the lack of consensus in the scholarly arena on their usage. This book clearly distinguishes between methodology and methods (see Chapter 2). Methodology (ology) is focused on what is involved in creating new knowledge and refers to the branch of philosophy that analyzes the principles and axioms of research. The word method refers to a system of strategies used to obtain information for a study.

Many disciplines' use of the word methodology to refer to methods (Schneider, 2014) most likely occurs because the empirical (quantitative) research paradigm is so prevalent. Given its dominance, authors tend to deem it unnecessary to identify it as a methodology per se, leaving that term for the data collection and analysis procedures. While respecting this convention, this book assumes that when authors are reporting on instrument development, apparatus, sampling, data collection, and data analysis, they are reporting methods, not methodology (Bryman, 2008). Consequently, this chapter employs the term methods for the strategies to obtain information for a study, an aspect of research that is deeply informed by methodology (the creation of knowledge) (see Chapter 2, Table 2.1 and Figure 2.3).

Review and Engagement

When critically reading a research report, you would

□ Determine whether the authors used the terms methodology and methods but did not confuse them
□ Ascertain if they clarified their methodology before presenting their methods
□ Check to see if they provided enough information for you to judge the appropriateness of the selected method(s) against the implicitly or explicitly stated methodology

Purpose and Importance of the Methods Section

Some scholars feel that the Methods section is the most important part of a research paper (Azevedo et al., 2011; Kallet, 2004). It fulfills several key roles. In this section of their paper, authors have an opportunity to convince readers they have fully documented all of the steps undertaken to collect and analyze data for their study. With sufficient information, readers can rest assured that authors have carefully and systematically thought about their methods, indicating they are clear-thinking and competent researchers who do high-quality work. In particular, in quantitative research, readers need sufficient information to enable them to reproduce the procedures and get similar results (called replicability and reliability). In qualitative work, readers need sufficient information to enable them to determine if the methods and findings are relevant to, and can be adopted in, their context (called dependability) (Dillinger, 2011; Labaree, 2016; Shon, 2015; VandenBos, 2010).

In their Methods section, authors should review previous literature pursuant to the design they will be implementing and openly discuss and debate measurement issues and strategies so they can improve on previous work. Their own Methods section should clearly set out a well-articulated set of procedures that can be consistently reapplied (quantitative) or appropriately adopted in another context (qualitative). By making their measurement choices explicit, authors help readers decide if the study was done well or needs improvement (Harris, 2014).

Following this convention, the Methods section serves the purpose of fostering ongoing debate about how to improve measurement instruments and research procedures, whether qualitative or quantitative. Authors should try to avoid using or perpetuating inconsistent measures and procedures because this creates discontinuity in the literature about the particular phenomenon being measured (Choudhuri, Glauser, & Peregoy, 2004; Harris, 2014). As examples, Harris (2014) noted that scholars have developed 200 ways to measure self-esteem, 16 ways to measure aspiration, and hundreds of instruments to measure quality of life, and they have not settled on how to measure gender identity or prejudice. These are examples of discontinuities perpetuated in the literature.

Authors may choose to select from and adapt previous attempts to measure a phenomenon, and if so, they must provide a solid rationale for their method choices. Using this rationale, readers can critically evaluate the study's overall quality (Dillinger, 2011; Labaree, 2016; VandenBos, 2010) (see Chapter 1). As explained in Chapter 2, quantitative and qualitative research embrace different notions of what counts as knowledge, reality, logic, and the role of values. These philosophical differences determine what data are collected and how, and how these data are analyzed and reported.

Major Differences Between Qualitative and Quantitative Intellectual Inquiry

Table 8.3 portrays the main differences between the qualitative and quantitative approaches to scholarship and to academic inquiry (Ary, Jacobs, & Sorensen, 2010; Choudhuri et al., 2004; Creswell, 2009; Driessnack, Sousa, & Mendes, 2007a; Johnson & Christensen, 2012; Patton, 2002; Rolfe, 2006; Suter, 2012). Authors should write their Methods section using language and vocabulary reflective of the approach that informed the inquiry in their study. This narrative would reflect each methodology's respective assumptions about reality, truth, the role of values, the importance of context, the role and voice of the researcher, the applicability of variable manipulation, logics, and so on. Critical readers can use this narrative (its presence or absence) to draw conclusions about the quality of the scholarship. In a mixed methods study, authors would use this information as appropriate when addressing each strand of their research design: qualitative and quantitative.

Table 8.3 Main Differences Between Qualitative and Quantitative Intellectual Inquiry

Qualitative Inquiry | Quantitative Inquiry

• Assumes reality is socially constructed and subjective | Assumes there is an objective reality ready to be discovered
• Appreciates complexity and multiple truths | Favors parsimony and assumes a single truth
• Research is value bound, and the researcher's values are accounted for | Research is value neutral, and the researcher's values are muted
• The researcher is the primary instrument (observations, interviews) | Uses inanimate instruments (scales, questionnaires, checklists, tests)
• Contextualizes findings and applies ideas across contexts | Generalizes results from a sample to a population
• Portrays natural settings and contexts | Manipulates and controls variables
• Few participants, many variables | Few variables, many subjects
• Understands the insider's view | Presents the objective outsider's view
• Human behavior is situational | Human behavior is regular
• Interprets human behavior in context | Predicts human behavior
• Seeks empathetic understanding of perspectives and exploration | Provides causal explanations and predictions
• Widely, deeply examines phenomena | Narrowly tests specific hypotheses


• Focuses on quality, essence, and nature | Focuses on quantity (how much)
• Presents the world as seen by participants | Presents social facts devoid of context
• Uses inductive then deductive logic | Uses deductive then inductive logic
• Searches for patterns and looks for complexity | Analyzes discrete components looking for the norm
• Uses purposive sampling | Uses random sampling
• Single cases or small samples | Large samples with statistical power
• The research design is emergent and evolving | The research design is predetermined
• Data are words, images, and categories | Data are numbers (minor use of words)
• Nonlinear, iterative, and creative analysis | Linear, standardized, and prescribed analysis
• Thematic, patterned analysis of data | Statistical analysis of data
• Power in rich descriptions and detail | Statistical power
• Reports are written in expressive, holistic language (thick descriptions) | Reports are written in precise, conventional, abstract language
• Some studies create theory from the findings | Uses theory to ground the study and interpret results
• Generates understandings from patterns | Tests hypotheses that are born from theory
• Faces conceptual complexity | Faces statistical complexity


• Strives for trustworthy, credible data | Strives for reliable and valid data

Major Components (Report Subheadings) of Qualitative and Quantitative Research

Table 8.4 compares the basic stages or major components of both quantitative and qualitative research methods and provides the typical subheadings authors would use to report their respective methods for a study. Purposefully using these headings greatly facilitates others' ability to critically read the Methods section of a research report. If authors fail to explicitly indicate which methodology informed their study, readers can take cues from their subheadings. Absence of these subheadings (or, worse yet, of the content relevant to each stage) raises avoidable red flags about the study's integrity and quality. These headings are used in Chapters 9 and 10 to organize the discussion of how to report both qualitative and quantitative research or their strands within a mixed methods study.

Table 8.4 Basic Steps (Report Subheadings) of Qualitative and Quantitative Methods

Qualitative Methods (NOTE: These steps are not always linear and sequential)

• Site selection and access (gaining access to the site from which the sample will be drawn)
• Sampling (people, artifacts from the site[s])
• Ethical considerations
• Role of the researcher (responsible self-accounting, privileges the sample's voice)
• Data collection (from site participants, with the researcher as the key data collection instrument, yielding piles of raw data: words, pictures, graphics)
• Thick and deep (re)presentation of the data (detailed accounts of the research context and participants' experiences)
• Data analysis (thematic, patterned examination of the thick data, often done in concert with data collection)
• Account for trustworthiness (along several criteria)
• Data security and management
• Limitations of emergent research design

Quantitative Methods (NOTE: These steps are linear and sequential)

• Instruments, apparatus, and/or procedures (tools to collect data)
• Sampling (people, animals, artifacts from which data are collected)
• Ethical considerations
• Data collection (from the sample using the aforementioned tools, yielding a pile of raw data: numbers)
• Data analysis (statistically examine the pile of raw data to determine its essential features, done after data collection)
• Account for validity, reliability, and generalizability
• Data security and management
• Limitations of predetermined research design (normally follows the Discussion section)

Review and Engagement

When critically reading a research report, you would

□ Ascertain whether the authors used language and vocabulary reflective of the research inquiry approach that informed their study (see Table 8.3)

□ Determine if they used methodology-specific headings to organize their Methods section (see Table 8.4) and fully accounted for and shared their research design logic and logistics

□ If subheadings are missing, determine if the authors at least included pertinent details for each stage of their respective methodology's research design

Integrity of Research Designs

Both quantitative and qualitative researchers have to ensure, respectively, that their data are reliable (can be replicated or adopted) and valid (they measured what was intended to be measured), with results generalizable to those outside the study or adoptable in another setting. Differences in the philosophical assumptions between quantitative, qualitative, and mixed methods approaches, however, mean researchers tend to employ different terminology for these key aspects of a study's rigor or quality (Ary et al., 2010). And, although very consistent in quantitative research, nomenclature for issues of rigor is not consistent in qualitative research (Suter, 2012) and has a unique twist in mixed methods (Teddlie & Tashakkori, 2006).

Integrity of Qualitative and Quantitative Research Designs

Table 8.5 provides an overview of the most agreed-to approaches and terms used by both types of researchers, and of the attendant strategies to meet the standard for the specific research methodology (Ary et al., 2010; Creswell & Miller, 2000; Guba, 1981; Johnson & Christensen, 2012; Lincoln, 1995; Nahrin, 2015; Newman, Newman, & Newman, 2011; Shenton, 2004). Table 8.5 addresses issues of unbiased and trustworthy data, objectivity and subjectivity (confirmability), internal validity and credibility, external validity and transferability, reliability and dependability, and generalization and authenticity (representing quantitative and qualitative, respectively). These are discussed in more detail in Chapters 9 and 10.

Table 8.5 Comparison of Criteria to Ensure High-Quality Quantitative and Qualitative Research

Quantitative (Positivistic, Empirical, Deterministic) | Qualitative (Postpositivistic, Naturalistic, Interpretive, Critical)

Striving for unbiased data (quantitative): Results are true if no bias was introduced, made possible if the researcher's personal preferences, prejudices, and opinions are held at bay during the entire research process. Strategies: judiciously address issues of internal validity to ensure that the study design, implementation, and data analysis are bias free, yielding high levels of evidence of cause and effect (or association); employ representative and random sampling techniques; account for missing and incomplete data; acknowledge funding sources.

Striving for trustworthy data (qualitative): Data must be truly transparent and open to critical thinking by the reader; trust means acceptance of the truth of a statement. Strategies: triangulation (multiple sources of data); member checks; saturation during data collection; peer review or expert consultations; audit trail (a detailed record of the researcher's decisions, with reasons); thick descriptions; plausible alternatives; account for negative cases; prolonged engagement in the field.

Objectivity (quantitative): Empirical research is said to be value free, meaning the research process should not be influenced by the researcher's emotions, preferences, or personal prejudices. Researchers are supposed to dispassionately engage in research from a stance of value neutrality, thereby ensuring the truth is found. Judgments about the evidence should not coincide with the researcher's orientation (despite that science is not really neutral; relative value neutrality is more likely than absolute neutrality). Strategies: embrace the tenets of the scientific method and empirical inquiry; do not distort research or let one's values intrude by drawing on personal worldviews, motives, self-interest, or customs or by capitulating to external pressures (researchers are especially vulnerable to value intrusion during the interpretation and discussion stage).

Confirmability (qualitative; subjectivity): Refers to the researcher's neutrality when interpreting data (i.e., self-awareness and control of one's bias); appreciating that values are central to the research process, researchers still have to be sure their findings can be confirmed or corroborated by others (i.e., that their values did not take over). It is the extent to which findings are shaped by the respondents themselves rather than by the researcher's bias. Strategies: reflexivity (self-critique and disclosure of what one brings to the research, especially one's predispositions); audit trails; method triangulation; peer review and debriefing.

Internal validity (quantitative): This refers to the integrity of the research design. The word internal pertains to the inner workings of the research process, designed and conducted to ensure that the researcher measured what was intended to be measured (producing strong, valid data instead of weak, invalid data). Also, the research design should follow the principle of cause and effect. There are seven major threats to internal validity (i.e., measuring something other than what was intended): (a) contamination by an extraneous event (history effect); (b) participants aging or tiring (maturation effect); (c) loss of subjects or attrition between testing (mortality effect); (d) sensitizing subjects with a pretest (testing effect); (e) extremely high or low pretest scores (statistical regression effect); (f) subjects not carefully assigned to test groups (selection bias effect); and (g) unreliability of an assessment instrument (instrumentation effect). Strategies: take the steps necessary to mitigate threats to internal validity (e.g., account for contamination, maturation, attrition, sampling size, group formation and assignment, instrumentation alignment, and testing sensitization).

Credibility (qualitative; credible to the participants): Did the researchers create a faithful accounting of people's lived experiences (i.e., an accurate representation of their reality, from their perspective)? Did the researchers get a full answer to their research question? Also, can others have confidence in the truth shared by the researchers (i.e., in their observations, interpretations, and conclusions)? The latter requires strong evidence, clear logic, valid data, and ruling out alternative explanations. Strategies: member checks; detailed, thick descriptions (and lots of quotes); triangulation (methods and data); peer review and debriefing; extended and prolonged fieldwork; researcher reflexivity to mitigate invalidity; cross-case comparisons.

External validity (quantitative; asserted by the researcher): Does the truth (conclusions) from the study hold in situations outside the study? Researchers have to ask, "How similar is my study to the situation I want to generalize to?" (meaning make a broad statement from a specific case). If too dissimilar, their results and conclusions are not externally valid; that is, they do not hold true for other situations (based on statistical assumptions). Strategies: judiciously choose an appropriate research design protocol (especially sample size and bias). Then, before asserting that the results are valid in other populations, situations, and conditions, researchers must recognize, consider, and report on factors that mitigate these assertions, notably any interactions (a) among treatment and subjects, settings, and history as well as (b) between subjects and settings. Researchers often temper their assertions by setting out study limitations.

Transferability (qualitative; determined by the user): Refers to the degree to which findings can be applied or transferred to other contexts or settings (that is, used more widely by others). It is the researcher's responsibility to provide accurate, detailed, and complete descriptions of the context and the participants so that users of the study can determine if the findings and conclusions apply (are transferable) in their context (based on similarity of deep descriptors). Strategies: cross-case comparisons; literature comparisons; detailed, thick descriptions; researcher reflexivity to mitigate invalidity; state study limitations (account for selection, setting, and history effects that might make the study unique to only a single group [i.e., not transferable]).

Reliability (quantitative; of the instrument and methods): Refers to the extent to which someone else can follow the research design with the same sample and get the same results. Are the methods reproducible and consistent, and is sufficient information provided so others can repeat the approach and procedures? To what extent are variations controlled? The reliability of the instrument depends on six types of validity: (a) face validity (subjects think the test is measuring what it is supposed to measure); (b) expert judges think the test is valid; (c) test items actually contain the content being measured; (d) a new test compares well with a previously validated test (concurrent validity); (e) taking a test is a good prediction of the score when the test is taken again in the future (predictive validity); and (f) construct validity (a mix of all of the others: did the test measure the intended higher-order construct and nothing else related to it, determined by how the variables are operationalized?). Strategies: standardized administration of the instrument or procedure; internal consistency (i.e., ensure instrument items are actually measuring the underlying construct, reflected in Cronbach's alpha); increase the number of test items; use objective scoring; test-retest; ensure that two different forms of one test measure the same thing.

Dependability (qualitative): Related to reliability, researchers have to responsibly provide sufficient information so others can repeat the research design protocol in their context but not necessarily get the same results. It refers to the stability of findings over time and in changing research contexts (i.e., others can rely [depend] on the study). The latter means the findings, conclusions, and interpretations must be supported by the data. Note that credibility ensures dependability. Strategies: audit trail; triangulation; rich documentation; intra- and intercoder or observer agreement; approach and procedures are appropriate for the context and can be documented.

Generalizability (quantitative; breadth of applicability): Researchers want to make broad statements from their specific case (they used a small random sample from a whole population). They want their conclusions to hold for others not in their study. Based on statistical assumptions, generalizability refers to the extent to which results and conclusions can be applied to people, settings, or conditions beyond those represented in the study. Strategies: account for external validity.

Authenticity (qualitative; realness for participants): Researchers want to make specific statements about only the people they studied (how the latter see their world). So, authenticity refers to the extent to which participants' voices and agency are ensured, and it strives for assurances that the researcher has represented all views of all participants (authentic means "original, genuine, undisputed"). Strategies: collaboration with participants; member checking; researcher reflexivity (involves self-critique and disclosure of what one brings to the research).
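Table 8.5's reliability row mentions internal consistency "reflected in Cronbach's alpha." As an illustrative aside (the computation below is the standard textbook formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), not something drawn from McGregor's text), a minimal sketch with hypothetical item scores:

```python
# Sketch of Cronbach's alpha for a k-item scale (standard formula; the
# function names and sample data here are hypothetical illustrations).

def variance(values):
    """Population variance of a list of numbers."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def cronbach_alpha(item_scores):
    """item_scores: list of k lists, one per scale item, each holding one
    score per respondent. Returns k/(k-1) * (1 - sum(item variances) /
    variance(respondents' total scores))."""
    k = len(item_scores)
    n = len(item_scores[0])
    totals = [sum(item[j] for item in item_scores) for j in range(n)]
    summed_item_var = sum(variance(item) for item in item_scores)
    return (k / (k - 1)) * (1 - summed_item_var / variance(totals))

# Three items that agree perfectly across three respondents yield alpha = 1.0:
print(cronbach_alpha([[1, 2, 3], [1, 2, 3], [1, 2, 3]]))  # 1.0
```

Values near 1 indicate that the items covary strongly, that is, that they plausibly measure the same underlying construct, which is exactly the "internal consistency" strategy the table pairs with Cronbach's alpha.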

Integrity of Mixed Methods Research Designs

Discussed in more detail in Chapter 10, mixed methods authors have to be concerned with reporting both design rigor and interpretative rigor because mixed methods research depends on integrating data from both qualitative and quantitative strands. Interpretative rigor evaluates the validity of the conclusions and comprises three standards: (a) Interpretative consistency occurs when inferences follow from the findings or results rather than coming out of the blue. (b) Theoretical consistency means inferences are consistent with known theories. (c) Integrative efficacy occurs when meta-inferences (integrating initial strand-specific inferences into inferences that apply across the entire data set) adequately incorporate inferences that stem from both the qualitative and quantitative phases of the study; that is, neither is privileged when discussing the outcomes of the study (Teddlie & Tashakkori, 2006).

Technical Aspects of Reporting Methods

In addition to length and organizational logic and approaches, several grammatical conventions (e.g., person, tense, and voice) inform the preparation of the Methods section of a research paper (Lynch, 2014). Each is discussed in turn.

Length

Suter (2012) suggested that the Methods section is one of the longest sections of a research proposal (in the range of five or so pages) but one of the shortest sections of a research article. Fox (2013) explained that while the findings of a qualitative paper constitute 30%–40% of the paper, the Methods section is shorter (10%) and requires that authors employ a concise, tight, logical writing style. In a quantitative report, authors should devote about 10%–15% of their paper to the Methods section (Thomas, 2011). The length of this section in a mixed methods study depends on which strand was prioritized, qualitative or quantitative.

Lynch (2014) clarified that the length of a qualitative Methods section is dictated by how much detail is required to describe site selection, access, sampling, data collection, and analytical procedures. Authors also have to make available an audit trail (detail) that readers can follow to access the researchers' thinking while they implemented and adjusted their emergent research design. The same principle of detail holds for a quantitative paper. The quantitative Methods section should be detailed enough that (a) it can be repeated by others because its essential characteristics have been recounted (reliability) and (b) readers can judge whether the results and conclusions are valid (i.e., did the study measure what it intended to measure?) (Kallet, 2004). More specifically, detail means those things that could logically be expected to influence the results. "Insufficient detail leaves the reader with questions; too much detail burdens the reader with irrelevant information" (American Psychological Association, 2001, p. 18).

In all three methodologies, authors have to ensure that readers can follow what was done and judge its rigor and quality. Labaree (2016) opined that authors should assume readers possess a basic understanding of the method. This assumption means authors do not have to go into great detail about specific procedures; rather, they should focus on how they "applied a method, not on the mechanics of doing the method." The accepted convention is to provide adequate citations to support the choice and application of the methods employed in the study. Authors need to know their audience and decide how much detail is appropriate (Goodson, 2017; Harris, 2014). If they are reporting a new procedure they developed for their study, more detail is justified, but they should avoid the recipe (step-by-step) approach (The Writing Center, 2014b).

In summary, when deciding on the length of their Methods section, authors have to take "into account the difficult balance between completeness (sufficient details to allow . . . verification [of rigor]) and brevity (the impossibility of describing every technical detail and the need to strictly follow the guidelines/instructions for authors provided by journals and recommendations regarding word count limits)" (Azevedo et al., 2011, p. 232).


Organizational Logic and Approaches

Suter (2012) observed that readers "can easily get lost in a disorganized maze that purports to describe or manage data" (p. 461). Journal editors commonly reject manuscripts due to errors or omissions in the Methods section (Boylorn, 2008; Hesson & Fraias-Hesson, 2010b). To offset this possibility, authors need to choose an organizational framework for their Methods section that "effectively make[s] sense of data and convince[s] the reader that the plan for data management [collection, and analysis] is meaningful, structured, and coherent" (Hesson & Fraias-Hesson, 2010b, p. 461).

"The organization of the method section depends on the author's presentation logic" (Rocco & Plakhotnik, 2011, p. 167). (a) The most common approach is chronological, meaning authors arrange the discussion of their method in the order that things occurred. (b) Sometimes, in order to describe a complex aspect of their research design, authors may have to shift to a most-to-least-important structure within the chronological approach. (c) Another common organizational pattern is general-to-specific (Boylorn, 2008; Hesson & Fraias-Hesson, 2010b; Labaree, 2016). (d) Authors can also organize their Methods section using the major components of their research design, identified with subheadings, taking direction from Table 8.4 for qualitative and quantitative reports (Boylorn, 2008; Hesson & Fraias-Hesson, 2010b; Rocco & Plakhotnik, 2011).

Objective Versus Subjective Writing

When preparing quantitative papers, authors are encouraged to use descriptive writing so they can ensure concise, adequate, logical, and detailed descriptions of their methods (Goodson, 2017; Labaree, 2016; Rocco & Plakhotnik, 2011). Goodson (2017) explained that, as ironic as it sounds, when using descriptive writing, authors strive to be objective and avoid subjective judgments of what happened during the sampling or data collection stages. She provided these examples (p. 177):

Example 8.4 Descriptive (objective) writing: "After examining the pictures, the researcher asked each child to select the picture they [sic] wanted to discuss."

Example 8.5 Nondescriptive (subjective) writing: "The very young second-graders examined three interesting pictures the researcher presented to them. After the children spent a lot more time than planned examining the pictures, the researcher asked each child to select the picture they [sic] wanted to discuss." (The words bolded in the original represent the writer's subjective judgments of what happened during data collection.)

The use of subjective writing is more allowable in qualitative papers as long as researchers have addressed their inherent biases by (a) engaging in reflexivity (i.e., continuous examination and explanation of how they influenced the research process) and (b) creating an audit trail by which readers can trace the author's cognitive decisions pursuant to putting the research plan into action (Blaxter, 2013). Per Example 8.5, if it were a qualitative study, the children spending longer than anticipated examining the pictures may have been a key moment that shaped the researcher's understanding of the phenomenon as experienced by those living it.

Person, Tense, and Voice

The different use of language (i.e., person, tense, and voice) in the Methods section of quantitative and qualitative papers reflects the different epistemological and axiological assumptions of these two broad approaches to research (Lynch, 2014).

Person

Authors of quantitative papers conventionally write in third person to reflect the objective nature of the scholarship. The Methods section of qualitative papers is often written in a much more subjective tone, employing second, and even first, person because the author (researcher) is the main data collection instrument and is intimately involved in the implementation of the research design plan (Boylorn, 2008; Hesson & Fraias-Hesson, 2010a; Johnson & Christensen, 2012). First person is now acceptable in the social sciences but less so in the natural sciences (The Writing Center, 2014b). "This rhetorical choice brings two scientific values into conflict: objectivity versus clarity" (The Writing Center, 2014b, p. 7). The scientific community has yet to reach a consensus about which style should be used, meaning authors should at least consult the journal's preferred style manual or its Guidelines for Authors (see Chapter 5).


Tense

Regardless of the research methodology, the Methods section is written in the past tense because the work has already happened (Boylorn, 2008; Hesson & Fraias-Hesson, 2010a; Kallet, 2004; Labaree, 2016). There are a few exceptions. Sentences describing standard procedures commonly used by others are written in the present tense (e.g., "This assessment instrument is often used in studies focused on student intelligence") (Lynch, 2014). Also, authors should avoid the imperative (e.g., "Add 5 grams of the solid to the solution") because it sounds like a recipe, which is to be avoided. A narrative structure using past tense is preferred to a step-by-step, recipe model (The Writing Center, 2014b).

Voice

Authors of quantitative papers are encouraged to use passive voice because it places the focus on what was done, not who did it. Occasionally, the passive voice is used with a by phrase, naming the agent as well as the action (e.g., "The survey was administered by the high school principal") (Boylorn, 2008; Hesson & Fraias-Hesson, 2010a). While passive voice should always be used in quantitative papers, authors of qualitative papers can consciously choose which voice they will use (Boylorn, 2008). Normally, authors of qualitative papers employ active voice, which focuses on who did the action. This writing strategy makes sense because "qualitative research recognises, and even foregrounds, the role played by individuals—the researcher, the informants and other participants" (Lynch, 2014, p. 33).

Example 8.6 Passive voice: Stress was applied to the rubber segments in gradually increasing increments. [focus on what was done]

Example 8.7 Active voice: Both the researcher and the participants (we) examined the graffiti on the walls of the community hall. [focus on who did something]


Review and Engagement

When critically reading a research report, you would

□ Judge if the Methods section was long enough to recount what the authors did to sample, collect, and analyze data to answer their research question

□ Ascertain if, overall, they organized their Methods section in such a way that it is meaningfully structured, providing a coherent overview that is understandable (i.e., nothing is missing or inadequately explained)

□ Confirm that quantitative authors used objective writing, avoiding subjective judgments of what happened during the sampling or data collection and analysis stages

□ Determine if they followed the recommended conventions for tense, voice, and person for the methodology

Final Judgment on Research Design and Methods Section

Taking all of the Review and Engagement criteria into account, what is your final, overall judgment of the research design and Methods section of the paper you are critically reading?

Chapter Summary

This chapter addressed the very complicated issues of research design and what is involved in reporting the methods employed to sample, collect, and analyze data to answer the research question for a particular study. It began with a discussion of the larger construct of research design, including (a) the link between research design and research inquiry, (b) research design as logic and logistical, and (c) the most common research designs organized by the three methodologies: qualitative, quantitative, and mixed methods (see Table 8.2). The conversation then shifted to a general overview of methods (distinguished from methodology), acknowledging more detailed coverage to follow in Chapters 9 and 10. The purposes of the Methods section were




identified, followed by general introductions to (a) the major differences between qualitative and quantitative inquiry, (b) the major reporting components (subheadings) of each type of research report, and (c) the topic of rigor and quality in each of the three methodologies. The chapter wrapped up with an overview of the basic grammatical and organizational conventions of reporting and writing up the Methods section of a research paper.

    Review and Discussion Questions

1. Based on the approach used in this book, how do methods differ from methodologies? How do methods differ from the research design?

2. Distinguish between research design as logical and as logistical (Figure 8.1).

3. How is the research design tied with the type of research inquiry?

4. What are the main differences between qualitative and quantitative inquiry and their approach to scholarship (see Table 8.3)? Which of these aspects of scholarly inquiry did you struggle with the most, and why?

5. Compare qualitative research design logic with quantitative research design logic.

6. Identify the basic steps for conducting and reporting both qualitative and quantitative studies, commenting on the issue of linearity and sequentiality. Which method do you feel most comfortable with, and why? How did these differ from approaches to designing mixed methods studies?

7. Explain to someone else in plain language the basic differences in reporting the methods used for qualitative, quantitative, and mixed methods studies.

8. How do qualitative, quantitative, and mixed methods studies differ in how they deal with quality and rigor in their research design (see Table 8.5)?

9. Summarize the conventions for length, tense, person, and voice when preparing the Methods section of a research paper, depending on the methodology.




Research Goals and Research Questions: Qualitative or Quantitative?

Given that you now know the philosophical differences between qualitative and quantitative research, you should be able to distinguish between those types of research goals. See the attached list of research goals and research questions. 1) Match each research goal to its research question(s), 2) identify each as either qualitative or quantitative (no mixed methods yet), and 3) explain WHY it is so. Use the table below to cut/paste the goals and questions into and provide your answers. Look for specific keywords to help you differentiate between qualitative and quantitative, and remember that the “why” answer is vital.

Example 1
Research Goal: The goal of this study was to examine the relationships of transformational leadership and organization climate with working alliance in a children’s mental health service system.
Research Question: The current study investigates whether organizational factors such as climate and leadership can also be related to the quality of interactions between providers and clients.
Research Type: Quantitative
Reasons for Research Type: The research question asks how multiple variables are related to one another; keywords include “examine relationships” and “be related to.”

Example 2
Research Goal: The purpose of the current study is to explore how teachers may shape girls’ aspirations towards STEM careers.
Research Question: What is the role of teacher talk in girls’ pursuit of and attitudes towards STEM careers?
Research Type: Qualitative
Reasons for Research Type: The keyword “to explore” is more commonly used in non-hypothesis-driven qualitative inquiry. In addition, “teacher talk” is hard to quantify because it concerns participants’ language and words, which suggests interviews as the likely method.

Research Goals

    1. The goal of this study is to investigate whether leaders’ well-being, in the form of positive affect and job stress, can be explained by leader-member exchange (LMX) quality at the group level of analysis.

    2. What is the process of negotiating and reaching consensus within a particular social structure?

3. The purpose of this study is to explore how spousal carers of people with MS interpreted their lived experience with their partner, the way in which they assigned meaning to being in such a situation, and the skills and knowledge they have developed to live with their situation.

    4. The purpose of this study was to investigate decision-making experiences and the social psychological processes family member surrogates use for health care decisions as they related to decision making with and for a terminally ill family member.

    5. The purpose of this study is to examine the extent to which leaders’ and teams’ goals work together to affect a range of outcomes when their teams fail to regulate (i.e., when they focus exclusively on one particular type of goal). We explicitly focused on learning and performance goals because this distinction is perhaps the most obvious and salient type of goal tension in work organizations. 

    6. What role does friendship play in girls’ developing sense of self? Specifically, does girls’ friendship provide a form of resilience as they transition from childhood to adolescence?

    7. This study will examine the roles of experiential opportunities, organization-initiated cross-cultural experiences (i.e., those found in leadership development programs) and non-work cross-cultural experiences.

8. The goal of this study is to analyze the conditions under which women are promoted to top leadership positions and to explore the challenges they face post-promotion.

    Research Questions

    1. What do caregivers define as successful day-to-day experience?

    2. How do girls describe the development of their sense of self during transition from childhood to adolescence?

    3. Does group-level analysis of leader-member exchange explain leaders’ psychological states of leader well-being, in the form of positive affect and job stress?

    4. After promotion, do female leaders experience a lack of support and/or challenges to their leadership?

    5. What barriers do caregivers have in their everyday life?

    6. Do leaders’ relatively immutable personality characteristics (i.e., the Big Five) affect global leadership competencies?

7. How did legal family-member surrogates balance honoring self-determination with the imperatives of caring for an ill family member?

    8. What are the potential benefits of having leaders and work teams pursue compensatory goals?

    9. How do stakeholders enter into negotiations and what motivates and constrains them during this process?
