Question one: How did the authors develop the list of 20 primary studies summarized in Table 1? How did they end up reviewing only 18 of these?
Notably, the authors developed the list of 20 primary studies by combining existing systematic reviews with additional searches for primary studies. More precisely, the list was built on the systematic literature review conducted by Just (2012). That review evaluated articles, retrieved from PubMed, on the effectiveness of literature search training for medical residents and students. Goodman et al. (2014) point out that Just's review narrowed the pool to 15 articles; 14 of these were published between 1998 and 2011, while one was an unpublished dissertation. According to Goodman et al. (2014), the inclusion criterion Just used in selecting the 15 articles was whether a study employed objective assessments of search skills and reported the statistical significance of its findings. Studies that relied solely on knowledge tests or on trainees' self-assessed skills were excluded.
Additionally, five more articles were found through other approaches. First, Goodman et al. (2014) indicate that one article was found by replicating Just's search strategy for studies published between January 2012 and September 2013. Second, a reference search of previously completed reviews identified one more article (Goodman et al., 2014). Third, three more articles were identified through reference searches of articles relevant to the current review. These additions brought the total to the 20 articles summarized in Table 1.
However, two studies were excluded because they conducted test-retest assessments after training without administering pre-tests. Specifically, one study reported poor test-retest reliability in the assessment of search skills between the end of training and three months after training (Goodman et al., 2014). The other study reported high consistency in the quality of searches three years after training, but a considerable decrease in the number of relevant articles retrieved by those searches (Goodman et al., 2014).
Why do the authors move on to review Debowski et al. (2001) and Wood, Kakebeeke, Debowski, and Frese (2000) after their primary review?
Markedly, Debowski et al. (2001) and Wood et al. (2000) were reviewed after the primary review because they were the only studies that manipulated training characteristics for bibliographic search. Because such studies are vital for developing specific, actionable recommendations about instructional methods for teaching search skills, it was important to review them separately and assess the relevance of their content to the study (Goodman et al., 2014).
One of the additional studies, Wood et al. (2000), used a “2 by 2 mixed experimental design with enactive exploration as the between-subjects factor and search task as the within-subjects factor” (Goodman, Gary, & Wood, 2014, p. 337). What were the significant results of the analysis of variance (ANOVA)?
According to Goodman et al. (2014), the ANOVA indicated significant within-subject effects of activity, and of the feedback-intervention-by-activity interaction, on search strategy quality. In other words, the quality of search strategies over time and across searches differed considerably among the feedback interventions. The ANOVA also indicated significant within-subject effects on performance: as Goodman et al. (2014) point out, performance levels changed significantly as participants progressed through the four activities. Moreover, the rate of change depended on the feedback intervention, since the findings show a significant feedback-by-task interaction.
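The review reports these results as F statistics rather than raw data, but the logic of testing a within-subject effect can be illustrated with a minimal sketch. The scores below are entirely hypothetical (five participants across four sequential search activities), and the function implements a standard one-way repeated-measures ANOVA, not the authors' full 2 by 2 mixed analysis.

```python
import numpy as np

def rm_anova_oneway(scores):
    """One-way repeated-measures ANOVA F test.

    scores: 2-D array, rows = participants, columns = within-subject
    conditions (here, four sequential search activities).
    Returns (F, df_conditions, df_error).
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    ss_total = ((scores - grand) ** 2).sum()
    # Variation attributable to the within-subject factor (activities)
    ss_cond = n * ((scores.mean(axis=0) - grand) ** 2).sum()
    # Variation between participants, removed from the error term
    ss_subj = k * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_err = ss_total - ss_cond - ss_subj
    df_cond, df_err = k - 1, (n - 1) * (k - 1)
    f_stat = (ss_cond / df_cond) / (ss_err / df_err)
    return f_stat, df_cond, df_err

# Hypothetical search-strategy quality scores: 5 participants x 4 activities
scores = np.array([[3, 4, 5, 6],
                   [2, 4, 5, 7],
                   [3, 5, 6, 6],
                   [4, 5, 6, 8],
                   [2, 3, 5, 6]])
f_stat, df_cond, df_err = rm_anova_oneway(scores)
print(f"F({df_cond}, {df_err}) = {f_stat:.2f}")
```

The key design feature this sketch captures is why repeated measures matter: variation between participants is partitioned out of the error term, so the F test is sensitive to changes across activities within each participant.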
Take a moment to reflect on your process in completing this exercise. What was more difficult or less difficult than you expected, given how you construed your abilities in this area prior to the directed analysis? How did the challenges you faced, or lack thereof, affect how you feel about continuing with your doctoral studies? Finally, list three concrete steps you will take to raise your skills in research methodology to an expert level.
Following the exercise above, determining how the primary studies were selected, and why some studies were excluded, was easier than expected prior to the directed analysis. Because the minor challenges encountered were easily overcome, the researcher was strongly motivated to continue with the doctoral studies. Further, the researcher will follow three steps to improve skills in research methodology: reviewing various case studies on bibliographic search training to learn new approaches; undertaking varied practice to promote discriminant learning and the development of adaptive skills; and reviewing the literature on interacting elements to strengthen skills in bibliographic search and the steps of evidence-based management (EBMgt) (Goodman & O'Brien, 2012).
- Debowski, S., Wood, R. E., & Bandura, A. (2001). Impact of guided exploration and enactive exploration on self-regulatory mechanisms and information acquisition through electronic search. Journal of Applied Psychology, 86, 1129–1141.
- Goodman, J. S., & O’Brien, J. (2012). Teaching and learning using evidence-based principles. In D. M. Rousseau (Ed.), The handbook of evidence-based management: Companies, classrooms, and research (pp. 309–336). New York: Oxford University Press.
- Goodman, J. S., Gary, M. S., & Wood, R. E. (2014). Bibliographic search training for evidence-based management education: A review of relevant literatures. Academy of Management Learning & Education, 13(3), 322-353.