4. Critical Thinking

December 1st, 2016

Definition

Critical thinking is the ability to identify an issue, dilemma, or problem; frame it as a specific question; explore and evaluate information relevant to the question; and integrate that information into the development of a resolution. An advanced manifestation of critical thinking is evidence-based practice: the conscientious, explicit, and judicious use of current best evidence about practice, the creation of policy, and the conduct of research.


Knowledge Areas

Through participation in this program, a participant will know:

  • The cognitive hierarchy of critical thinking: knowledge, comprehension, application, analysis, synthesis, and evaluation.
  • Basic statistics and epidemiology, qualitative and quantitative research, systematic reviews, and meta-analyses.
  • The levels of evidence used in the guidelines of the U.S. Preventive Services Task Force.


Skills

Basic. Through participation in this program, a participant will:

  1. Use population data to assist in determining the needs of a population for the purposes of designing programs, formulating policy, and conducting research or training.
  2. Formulate a focused and important practice, research, or policy question.

Advanced. With more experience and building on the basic skills, MCH leaders will:

  1. Apply important evidence-based practice guidelines and policies in their field.
  2. Identify practices and policies that are not evidence-based but are of sufficient promise that they can be used in situations where actions are needed.
  3. Translate research findings to meet the needs of different audiences.
  4. Discuss various strategies, including supportive evidence, for the implementation of a policy.


Educational Experiences

  1. Critique of the current literature
  2. Review and synthesis of the current literature
  3. Participate in a journal club
  4. Grant writing or analysis of an existing grant
  5. Program planning and program analysis
  6. Review of a manuscript submission
  7. Policy analysis
  8. Thesis or capstone research
  9. Review how a clinical guideline was developed
  10. Identify an area of uncertainty and why published reviews/data do not apply
  11. Develop an evidence-based rationale for a policy or guideline
  12. Take a statistics course


Resources/Assessment Tools – 4. Critical Thinking

Key Documents

Bloom, B., Engelhart, M., Furst, E., Hill, W., & Krathwohl, D. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: Longmans, Green.

Huitt, W. (1998). Critical thinking: An overview. Educational Psychology Interactive. Valdosta, GA: Valdosta State University. http://chiron.valdosta.edu/whuitt/col/cogsys/critthnk.html (accessed August 13, 2004).

Scriven, M., & Paul, R. (1992, November). Critical thinking defined. Handout given at the Critical Thinking Conference, Atlanta, GA. Defines critical thinking as “the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action.”



Assessment Tools

Inclusion Criteria –

To be considered for initial inclusion in this web site, the materials had to meet several criteria:

  • the material needed to focus on one or more of the skills listed for a particular competency
  • the material needed to describe either a measurement instrument or theory that could support the creation of such an instrument
  • the material had to be publicly available, that is, not a commercial item available only for purchase
  • the material needed to include either psychometric information about its properties as a measure or, particularly in the case of material found only on the Web, a high degree of face validity

Copyright and Use Issues –

The materials initially described were identified for consideration by MCH interdisciplinary training programs. Many of these materials are copyrighted and thus may not be copied, distributed, transmitted, or published without the express written permission of the copyright owner. It is the responsibility of each user to ascertain whether materials may be freely used or whether such permission is needed.

Portfolios

Portfolios are collections of information that can be used to evaluate MCH knowledge in action.

Portfolios include materials prepared by a learner to demonstrate learning in response to a plan. There is increasing evidence of the utility of portfolios for assessment of learning and for competency assurance in health care.

For a portfolio to be effective, it should include:

  • a learning plan that contains specific goals and objectives
  • materials that demonstrate achievement relative to the learning plan
  • learner reflections
  • learner and faculty evaluations of the material

The ACGME, in its draft Toolbox of Assessment Methods, provides some information about the properties and uses of portfolios for assessment.

Information at:
http://www.acgme.org/outcome/assess/toolbox.asp

There is no shortage of information about evidence-based practice, whether in medicine, education, or policy. Many journal articles, books, and online courses teach principles for locating and evaluating evidence. However, there appear to be few measures designed to assess the evidence-related skills of learners. In their review article on the effectiveness of teaching EBM skills, Norman and Shannon (1998) noted that most of the studies reviewed used written pre-post tests consisting of multiple-choice, short-answer, or true-false questions about knowledge and/or skills as measures of effectiveness.

Information at:
Norman, G. R., & Shannon, S. I. (1998). Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: A critical appraisal. Canadian Medical Association Journal, 158(2): 177-181.


Netting the Evidence: ScHARR

Netting the Evidence, created by the School of Health and Related Research (ScHARR) at the University of Sheffield, UK, is an extensive set of resources for teaching and learning effective ways to search for and appraise evidence. Although the content is primarily about evidence-based medicine, the principles are applicable to evidence-based policy and practice.

Information at:
https://www.google.com/cse/home?cx=004326897958477606950:djcbsrxkatm


Journal clubs

Journal clubs can provide a venue not only for sharing new information but also for teaching and assessing evidence-based techniques and ways of understanding the uses of data, statistical techniques, research methodologies, and the limits of generalization. Pearce-Smith (2006) has shown that participation in a journal club can result in increased appraisal skills.

Information at:
Pearce-Smith, N. (2006). A journal club is an effective tool for assisting librarians in the practice of evidence-based librarianship: A case study. Health Information and Libraries Journal, 23: 32-40.


Ennis, Critical Thinking Assessment

In his review of critical thinking assessments, Ennis (1993) concluded that general testing of higher-order thinking may be important even though it does not address subject-specific concerns. A number of proprietary tests of critical thinking have been developed over the years, including the California Critical Thinking Skills Test, the California Critical Thinking Disposition Inventory, and the Watson-Glaser Critical Thinking Appraisal. Several of these measures have been used extensively with health care learners, particularly nurses. While educators expect learners' critical thinking abilities to increase over the course of their education, results obtained with these instruments have been inconsistent.

Information at:
Ennis, R. H. (1993). Critical thinking assessment. Theory into Practice, 32(3): 179-186.


Newer Assessment Methods

A recent review of the literature shows that several newer assessment methods are being developed and tested as more holistic ways of assessing the development of critical thinking. These include:

Simulation:

Simulation techniques are designed to train learners by placing them in complex situations that contain elements of uncertainty. Working through the situation, usually in a team, learners apply knowledge, deal with incomplete information, solve problems, and interact with other team members. Debriefing after the simulation, often by reviewing a video recording of the performance, and reflection are key components of the technique.

The cost of creating and providing simulated experiences is likely to be considerable. Thus, the practical implementation of this technique for training and assessment may depend on working collaboratively with centers where such programs currently exist.

Concept maps:

Concept mapping is a way of providing a graphical representation of one's thinking. It can be a useful technique for deconstructing complex situations, giving the learner the opportunity to parse his or her thinking while synthesizing relevant information about a case or case study. Recognizing discrete entities and creating linkages between them provides a way of analyzing the complexity often found in patient care or organizational situations. The learner presents the resulting concept map to a faculty member or mentor, who may probe further with questions about relationships between entities, gaps, needs, and so on. A variety of mechanisms for scoring concept maps have been created; a simplified illustration of one possible structural score appears after the citation below. Hsu and Hsieh (2005) used concept maps created before and after training to demonstrate an increase in learners' sophistication in critical thinking over time.

Information at:
Hsu, L., & Hsieh, S. (2005). Concept maps as an assessment tool in a nursing course. Journal of Professional Nursing, 21(3): 141-149.
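
To make the idea of concepts, linkages, and structural scoring concrete, the following Python sketch represents a concept map as concepts joined by labeled links and computes a toy score from the number of propositions and of highly connected concepts. It is illustrative only: the class, fields, and weights are assumptions invented for this example and do not reproduce the scoring mechanism used by Hsu and Hsieh (2005) or any other published rubric.

# Illustrative sketch only: a toy structural score for a concept map.
# The fields and weights below are assumptions, not a published rubric.
from dataclasses import dataclass, field


@dataclass
class ConceptMap:
    concepts: set[str] = field(default_factory=set)
    # Each link is (source concept, linking phrase, target concept).
    links: list[tuple[str, str, str]] = field(default_factory=list)

    def add_link(self, source: str, phrase: str, target: str) -> None:
        self.concepts.update({source, target})
        self.links.append((source, phrase, target))

    def score(self) -> int:
        # One point per labeled proposition, plus a bonus for concepts
        # linked to more than two others -- a rough proxy for integration
        # across different branches of the map.
        degree = {concept: 0 for concept in self.concepts}
        for source, _, target in self.links:
            degree[source] += 1
            degree[target] += 1
        propositions = len(self.links)
        cross_links = sum(1 for d in degree.values() if d > 2)
        return propositions + 5 * cross_links


# Fragment of a map a learner might draw for a case study.
cmap = ConceptMap()
cmap.add_link("low birth weight", "is associated with", "preterm birth")
cmap.add_link("preterm birth", "risk reduced by", "adequate prenatal care")
cmap.add_link("adequate prenatal care", "limited by", "transportation barriers")
cmap.add_link("low birth weight", "risk increased by", "maternal smoking")
print(cmap.score())  # prints 4: four propositions, no highly connected concepts

A fuller rubric would typically also credit hierarchy levels and the validity of each linking phrase, which requires faculty judgment rather than simple counting.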

Online interactive cases:

The modified essay question (MEQ) format has been a popular method for testing in medical education for decades. This format presents the learner with a scenario that unfolds over time. As information from each segment of the scenario is disclosed, the learner is asked questions about what he or she is thinking and would do next. Each successive segment provides the learner with more information about what is actually happening and asks the learner to incorporate that information into her or his thinking. The MEQ has frequently been used as an oral examination.

The availability of computer technology has made the development of online MEQ-type cases possible. One example of such cases is CLIPP (Computer-assisted Learning in Pediatrics Program). While the content of these cases is not relevant to MCH leadership, the modules can serve as an example of online interactive material that has demonstrated efficacy.

Information at: http://www.clippcases.org/
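
As a rough sketch of how an unfolding, MEQ-style case can be structured for online delivery, the following Python example stores a case as an ordered list of segments, each pairing newly disclosed information with the questions a learner must answer before the next segment is revealed. The case content, class names, and flow are hypothetical illustrations; they are not drawn from the CLIPP modules or any published MEQ implementation.

# Illustrative sketch only: an MEQ-style case as staged segments.
# The case content and structure are hypothetical examples.
from dataclasses import dataclass


@dataclass
class Segment:
    disclosure: str        # new information revealed at this stage
    questions: list[str]   # asked before the next disclosure


def run_case(segments: list[Segment]) -> list[str]:
    """Walk through the case one segment at a time, collecting free-text
    answers before the next piece of information is disclosed."""
    answers = []
    for number, segment in enumerate(segments, start=1):
        print(f"--- Segment {number} ---")
        print(segment.disclosure)
        for question in segment.questions:
            # In a full system the response would be stored for later
            # comparison with model answers or for faculty review.
            answers.append(input(f"{question}\n> "))
    return answers


case = [
    Segment(
        "A 16-year-old presents to a school-based clinic reporting fatigue.",
        ["What additional history would you gather, and why?"],
    ),
    Segment(
        "She mentions that she has missed her last two menstrual periods.",
        ["How does this new information change your thinking?",
         "What would you do next?"],
    ),
]

if __name__ == "__main__":
    run_case(case)

Capturing the learner's answer at each stage before further disclosure is what lets faculty see how new information is, or is not, incorporated into the learner's reasoning.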



