
How to present patient cases

Mary Ni Lochlainn, foundation year 2 doctor,1 and Ibrahim Balogun, healthcare of older people/stroke medicine consultant1

1East Kent Foundation Trust, UK

A guide on how to structure a case presentation

This article contains:

  • History of presenting problem
  • Medical and surgical history
  • Drugs, including allergies to drugs
  • Family history
  • Social history
  • Review of systems
  • Findings on examination, including vital signs and observations
  • Differential diagnosis/impression
  • Investigations
  • Management

Presenting patient cases is a key part of everyday clinical practice. A well delivered presentation can facilitate patient care and improve efficiency on ward rounds, and it also serves as a means of teaching and of assessing clinical competence.1

The purpose of a case presentation is to communicate your diagnostic reasoning to the listener, so that he or she has a clear picture of the patient’s condition and further management can be planned accordingly.2 To give a high-quality presentation you need to take a thorough history. Consultants make decisions about patient care based on information presented to them by junior members of the team, so the importance of accurately presenting your patient cannot be overemphasised.

As a medical student, you are likely to be asked to present in numerous settings. A formal case presentation may take place at a teaching session or even at a conference or scientific meeting. These presentations are usually thorough and have an accompanying PowerPoint presentation or poster. More often, case presentations take place on the wards or over the phone and tend to be brief, using only memory or short, handwritten notes as an aid.

Everyone has their own presenting style, and the context of the presentation will determine how much detail you need to put in. You should anticipate what information your senior colleagues will need to know about the patient’s history and the care he or she has received since admission, to enable them to make further management decisions. In this article, I use a fictitious case to …



Forms, Templates & Examples: AAMC, ACGME, GRA & MedHub Conference Presentations

  • Using the self-study to further GME institutional integration: From notification to implementation
  • Polish In Action : Form The Best APE and Make It Shine
  • Stanford GME Revised APE Summary and Metrics
  • APE Guidebook v2024
  • Red-Yellow-Green: A Visual Scorecard Reporting on Program Quality
  • Report Card Database & Template 
  • Scorecard Handout
  • Putting it All Together: Standardizing the CCC Process via a Strategic Resident Evaluation toolkit (zip file)
  • Putting it All Together: Standardizing the CCC Process via a Strategic Resident Evaluation (slide deck)
  • Beyond Collaboration: Real-time Integration with the C-Suite
  • APES to Self-Studies - Seamlessly Putting the Accreditation Puzzle Pieces Together
  • Trend Analysis Workbook
  • GuideBook 2.1
  • Deadlines & Timeline Sample
  • Trend Analysis Template
  • Milestone Evaluations: Discovery of Some Thought Provoking Rating Trends
  • Using Organizational Change Tools to Meet the Challenges of the Changing GME World
  • Using Data Strategically to Streamline Coordinator Work Products and Maximize Program Outcomes
  • Modified Using Data Strategically to Streamline Coordinator Work Products and Maximize Program Outcomes
  • Deadlines 2016-2017
  • GME Housestaff Survey
  • Program Evaluation by Residents
  • Program Evaluation by Faculty
  • Summative Evaluations
  • Semi-Annual Trainee Evaluation by Program Director
  • How to Create a Successful Game Plan for APES and Self-Studies
  • How to Use Milestones to Provide an Evidence-based Early Warning System for Identifying the Academically Vulnerable Resident
  • Continuing the Conversations…Techniques for Difficult Encounters
  • Coordinators: Mission NOT Impossible – Using Data to make You and Your Program Shine!
  • Lessons Learned from NAS: The Need for an Institutional Curriculum for GME Professionals
  • Mission NOT Impossible – Using Data to make You and Your Program Shine!
  • Using C-Suite Business Models and Tools to Address Issues faced by DIOs/GME in the New Era of Healthcare Reform
  • Voice of the Coordinator: Your experience on what works and doesn’t work with CCC meetings
  • Voice of the Director: Your experience on what works and doesn’t work with CCC meetings
  • Coordinators and Clinical Competency Committees: How to Streamline and Support the Work of your Program’s CCC
  • Meaning beyond numbers: The power of qualitative inquiry for program assessment 
  • The Basics of Effective Evaluations in the Era of NAS 
  • GME Lean Streamlining
  • Duty Hour Requirements

Institutional Report Card

  • Site Visit 102 and the DIOs role
  • Designing GME Evaluations
  • Resident Perceptions and Program Quality
  • Eliminating Bias from Evaluation Instruments
  • Institutional Report Card and Decision-making
  • Teaching the Competencies
  • Patient Physician Communication C-I-CARE
  • Streamlining the Evaluation Process
  • Using Diverse Data Sources - Some Surprising Findings on the Drivers of Resident Wellness
  • GME Housestaff Survey Template
  • 2017-2018 GME Housestaff Survey
  • CLER Template
  • Analyzing the Dynamics of GME Change with Competitive Analysis Tools & Applying Change Management Methodology to Build Resilient Programs
  • Building Resident Resiliency: The Challenge of Feedback and the Millennials
  • Data Driven Tools to Facilitate Evidence-based Decision Making Supporting Program Resiliency
  • Using MedHub and Process Streamlining to Make Organizing & Documenting GME / Accreditation Data Easy
  • A3 Template
  • Alumni Survey Template

Annual Program Evaluation (APE) & Program Evaluation Committee (PEC)

  • Annual Program Evaluation Data Checklist
  • APE Documentation Templates (Sign-in Sheet & Agenda, Meeting Minutes, and Approval of Action Plan)
  • APE Prep Instructions for Program Directors (Step-By-Step Instructions & APE Checklist)
  • A Quick Method to Analyze Program Evaluations
  • APE Powerpoint Presentation Example
  • Program Evaluation Committee Policy Template
  • Program Improvement Action Plan Template
  • Program Improvement Action Plan (courtesy Yuen So, MD, Neurology)
  • Program Improvement Meeting Agenda/Minutes (courtesy Harchi Gill, MD, Urology)
  • Away Rotation Application

Clinical Competency Committee (CCC) & Milestones

  • Clinical Competency Committee Policy Template

Conditional Formatting

  • Video Explanation: Resident Performance Profile Tool (ACGME 2014)
  • The Practice Excel File that Accompanies the Video (Refer to tab PGY3 for practice): Resident Performance Profile Tool
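If you want to reproduce this kind of red-yellow-green shading programmatically rather than by hand in Excel, the sketch below applies a three-colour scale to a column of milestone ratings using openpyxl. It is a minimal illustration only: the file name, sheet name, cell range, and 1-5 rating scale are assumptions, not part of the Stanford tool.

# Minimal sketch: red-yellow-green conditional formatting on a column
# of milestone ratings, in the spirit of the Resident Performance
# Profile tool. File, sheet, and range names are hypothetical.
from openpyxl import Workbook
from openpyxl.formatting.rule import ColorScaleRule

wb = Workbook()
ws = wb.active
ws.title = "PGY3"

ws["A1"], ws["B1"] = "Resident", "Milestone score"
scores = [("Resident A", 2.5), ("Resident B", 3.5), ("Resident C", 4.5)]
for row, (name, score) in enumerate(scores, start=2):
    ws.cell(row=row, column=1, value=name)
    ws.cell(row=row, column=2, value=score)

# Red (low) through yellow (midpoint) to green (high), matching an
# assumed 1-5 milestone rating scale.
rule = ColorScaleRule(
    start_type="num", start_value=1, start_color="FFC7CE",  # red
    mid_type="num", mid_value=3, mid_color="FFEB9C",        # yellow
    end_type="num", end_value=5, end_color="C6EFCE",        # green
)
ws.conditional_formatting.add("B2:B4", rule)
wb.save("performance_profile_demo.xlsx")

Opening performance_profile_demo.xlsx shows each score shaded from red through yellow to green, the same visual idea as the scorecard resources listed above.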

Eligibility and Selection of Residents and Interview Policy

Final Evaluation

  • Final Evaluation Template

GME House Staff Survey

  • GME House Staff Survey Form

Institutional Report Card Template

Leave of Absence (LOA)

  • Program-specific LOA Policy
  • Institution - LOA Policy
  • Leave of Absence Form

On Call Coverage

On-Call Coverage/Moonlighting Request Form

Milestone 2.0 Update

GME's Milestone 2.0 Video Guide -  Click Here to Watch

GME's Milestone 2.0 Workshop Recording -  Click Here to Watch

Program Curriculum

  • Curriculum and Objectives Template (courtesy David Cornfield, MD, Pediatrics Critical Care)
  • Curriculum, Goals and Objectives Example (courtesy Alice Edler, MD, MPH, MA (Education), Former Pediatric Anesthesia Program Director and GME Faculty Fellow)
  • Writing Curriculum: Goals, Objectives, Assessment and ACGME Competencies

Program Expansion/Funding

Program Expansion/Funding Application

Program Letter of Agreement (PLA)

Program Letter of Agreement (PLA) Template - Standard Away [Kaiser, LPCH, SCVMC, Etc.]

Program Letter of Agreement (PLA) Template - Non-Standard Away

Program Letter of Agreement (PLA) Template - Visiting

Program Letter of Agreement (PLA) Template - Veterans Affairs

Moonlighting

Program-specific Moonlighting Policy

Recruitment

  • Program-specific Recruitment Policy
  • Institution- Recruitment Policy

Residency Training Agreement Guideline

Resident/Fellow Probation

Notice of Placement on Probation

Sharps Training

Sharps Checklist

  • Sleep Pods for Strategic Napping

Supervision

  • Program-specific Supervision Policy  (courtesy Iris Gibbs, MD, Radiation Oncology)
  • Program-specific Supervision Policy  (courtesy Lois L. Bready, MD @ UTSW)
  • Protocol defining common circumstances requiring faculty involvement Template
  • Institution - Supervision Policy

Transitions of Care

  • Program-specific Handover/Transfer Policy
  • Institution - Handover/Transfer Policy
  • Transitions of Care - Timed Training
  • Evaluation: Transitions in Patient Care – Sign-Out Evaluation
  • Program Specific Wellness Policy Template  
  • Program-specific Work Hours Policy
  • Protocol for Remaining Beyond Scheduled Work Period
  • Institution - Work Hours Policy

Template: Clinical Evaluation Report

Template Download

If you are a user of  Formwork, our eQMS software , you can save a lot of time by choosing “QMS” on the top menu and “OpenRegulatory Templates” on the left menu, and then opening the relevant folder to find this template ready to load into Formwork.

If, for some mysterious reason, you’re using a different QMS Software, you can also simply download this template – specifically, as Word (.docx), PDF, Google Docs or Markdown file. Scroll down for a preview!

The  template license  applies (don’t remove the copyright at the bottom, don’t re-use this for commercial purposes).

Unsure how to get started and how to get your EU MDR medical device certified? We’ve already helped hundreds of companies with their MDR compliance. Book a free 30-minute consulting call and let’s discuss how you can get your compliance done efficiently.

1. Purpose and Scope

According to Regulation (EU) 2017/745, Article 61 and Annex XIV, the evaluation of clinical performance, safety and clinical benefit must be based on clinical data and is required for all medical device classes. The clinical evaluation report, and the clinical data on which it is based, verify the clinical safety and performance of the [device name].

A clinical evaluation plan [Reference] is in place, and this clinical evaluation report has been prepared in accordance with that plan.

2. Definitions

Definition / Abbreviation | Description
MDR | Regulation (EU) 2017/745
[…] | […]

3. Product Information

Manufacturer:
Product name:
Product models:
CE marking:
Classification:

3.1 Intended Use

Add intended use.

3.2 Patient Population

Add patient population

3.3 Intended Medical Indication

Add intended medical indication

3.4 Contraindications

If none, state as follows: There are no known specific situations that contraindicate the use of this device.

3.5 Operating Principle

Offer a detailed overview of the device, encompassing its name, models, sizes, and components across hardware, software, and accessories. Clearly categorize the device, such as a biological artificial aortic valve, and outline its physical and chemical attributes, technical specifications, and mechanical traits. Specify sterilization methods, radioactivity considerations, and operational principles. Detail materials used, particularly those in contact with the patient, and any inclusion of medicinal substances, animal tissues, or blood components. Incorporate a visual representation, and note the device’s class, global market entry, and specific product configurations. Highlight innovative features relevant to ongoing assessments and address unmet medical needs. Provide concise step-by-step application procedures, elucidate performance in different modes, and describe the device’s workflow.

3.6 User Profile

Describe the typical user of the software. Some ideas could be: Qualifications, prior training (for your software), technical proficiency, time spent using the software.

3.7 User Environment Including Hardware / Software

Describe the typical use environment. What sort of devices is this running on? Does the software only run on one device or multiple devices? Is it loud and chaotic like in an emergency ward? How’s the lighting? Also, add other software or hardware which is required by your device. Most commonly, apps require users to have a smartphone with a compatible operating system (iOS / Android).

4. Clinical Benefits

Describe the intended clinical benefit(s) of the device.

5. Clinical Claims

All claims can be found in the table below. These claims will be thoroughly examined as part of the literature search in the clinical evaluation.

No. | Claim | Source | Reference
1 | Our device reduces procedure time by 20% | Website / promotional material | Usability study / literature analysis (addressed in clinical evaluation report) / verification and validation / PMS data; PMCF data

If there are no claims: No claims require validation through the clinical evaluation.

6. Context of the Medical Device

6.1 Developmental Context

Provide an overview of the device’s developmental context, including its current market presence in Europe or other countries, the duration of its presence, and the quantity of devices placed on the market. Consider incorporating information from relevant publications to enrich this chapter.

6.2 State of the Art

Outline the state of the art and the medical alternatives to the device. Summarise guidance documents, common specifications, or health technology assessment reports that could help describe the state of the art. Usually, review articles provide a broad overview of the state of the art and medical alternatives.

7. Clinical Evidence

Clinical evaluation is an ongoing process, conducted throughout the life cycle of an MDSW. Both favorable and unfavorable data considered in the clinical evaluation shall be included in the technical documentation.

Three key components will be taken into account when compiling clinical evidence:

Valid clinical association

  • Demonstrate that it corresponds to the clinical situation, condition, indication or parameter defined in the intended purpose of the MDSW

Technical performance

  • Demonstration of the MDSW’s ability to accurately, reliably and precisely generate the intended output, from the input data.

Clinical performance

  • Demonstration of a MDSW’s ability to yield clinically relevant output in accordance with the intended purpose

7.1 Valid Clinical Association

Valid clinical association is understood as the extent to which the MDSW’s output (e.g. concept, conclusion, calculations), based on the inputs and algorithms selected, is associated with the targeted physiological state or clinical condition. This association should be well founded or clinically accepted. The valid clinical association of an MDSW should demonstrate that it corresponds to the clinical situation, condition, indication or parameter defined in the intended purpose of the MDSW.

Example: An MDSW that detects heart arrhythmia by analysing auscultation sounds obtained by a digital stethoscope requires demonstrating a valid clinical association between abnormal cardiac sounds and heart arrhythmia. Evidence supporting valid clinical association can be generated e.g. through literature research, professional guidelines, proof-of-concept studies, or the manufacturer’s own clinical investigations/clinical performance studies. The [device name] is intended to […]. This is a well-established and clinically accepted procedure. The valid clinical association will be demonstrated with:

  • Technical standards
  • Professional medical society guidelines
  • Systematic scientific literature review
  • Clinical investigations
  • Published clinical data (e.g. Summary of Safety and Clinical Performance (SSCP) / Summary of Safety and Performance (SSP), registries and databases from authorities)

7.1.1 Systematic scientific literature review

Chosen source for the literature search is PubMed. The table lists the search terms used and the number of results; a scripted way to retrieve these counts is sketched after the selection criteria below.

No. | Search category | Search term | No. of results
1 | Clinical association / state of the art | […] | […]
Describe the total number of results and the number of duplicate publications.

Different filters and exclusion and selection criteria have been used.

Justify the chosen filters, especially the timeframe and any limitation to certain evidence levels:
  • Publication Dates
  • Article types
  • Text availability

Exclusion criteria

  • Publications about animal trials
  • Publications in a language other than English
  • Publications published before [Date]
  • Publications with no abstract
  • Duplicates identified in more than one search category
  • Publications with the following content are generally not relevant:

Selection criteria

  • Publications describing or focusing on the use of the medical device under evaluation
  • Publications describing the use of an equivalent device
  • Publications describing or focusing on comparative literature of medical alternatives and state of the art of the medical device under evaluation
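As referenced above, search-term hit counts can also be retrieved programmatically from PubMed via the NCBI E-utilities esearch endpoint, rather than transcribed by hand. The sketch below is a minimal example of that approach; the search term is a hypothetical placeholder, and a production script should identify itself to NCBI (email/API key) per their usage policy.

# Minimal sketch: query PubMed hit counts per search term via the
# NCBI E-utilities esearch endpoint. Terms are placeholders for the
# ones defined in the table above.
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

search_terms = {
    1: "heart arrhythmia AND auscultation",  # hypothetical example term
}

for number, term in search_terms.items():
    response = requests.get(
        ESEARCH_URL,
        params={"db": "pubmed", "term": term, "retmode": "json", "retmax": 0},
        timeout=30,
    )
    response.raise_for_status()
    count = response.json()["esearchresult"]["count"]
    print(f"Search {number}: {term!r} -> {count} results")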

7.2 Technical Performance

According to MDCG 2020-1, technical performance is the demonstration of the MDSW’s ability to accurately, reliably and precisely generate the intended output from the input data. Evidence supporting technical performance can be generated through verification and validation activities, e.g. unit-level, integration and system testing, or by generating new evidence through the use of curated databases, curated registries, reference databases or previously collected patient data. Technical performance is confirmed by the examination and provision of objective evidence that the MDSW specifications conform to user needs and intended uses, and that the implemented requirements can be consistently fulfilled. For example, performance verification and validation in the intended computing and use environments can be characterised by the demonstration of: availability, confidentiality, integrity, reliability, accuracy (resulting from trueness and precision), analytical sensitivity, limit of detection, limit of quantitation, analytical specificity, linearity, cut-off value(s), measuring interval (range), generalisability, expected data rate or quality, absence of unacceptable cybersecurity vulnerabilities, and human factors engineering. Summarize the relevant tests, validations and verifications to demonstrate that the medical device accurately and consistently meets the intended purpose in real-world usage. Add subchapters if necessary.
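As one concrete illustration of the unit-level verification evidence mentioned above, the following is a minimal automated test for a hypothetical MDSW output function. The function, reference values and tolerances are invented for illustration; real verification tests would trace back to the MDSW’s specified requirements.

# Minimal sketch of unit-level verification for a hypothetical MDSW
# output: a Bazett-corrected QT interval. Names and values are
# illustrative, not from any real device specification.
import math
import unittest

def qt_corrected(qt_ms: float, rr_s: float) -> float:
    """Hypothetical MDSW output: Bazett-corrected QT interval (ms)."""
    return qt_ms / math.sqrt(rr_s)

class TestQtCorrected(unittest.TestCase):
    def test_accuracy_against_reference_value(self):
        # Reference: QT = 400 ms at RR = 0.64 s gives QTc = 500 ms.
        self.assertAlmostEqual(qt_corrected(400.0, 0.64), 500.0, places=6)

    def test_reproducibility(self):
        # The same input must always produce the same output (precision).
        results = {qt_corrected(380.0, 1.0) for _ in range(100)}
        self.assertEqual(len(results), 1)

if __name__ == "__main__":
    unittest.main()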

7.3 Clinical Performance

Validation of the clinical performance is the demonstration of an MDSW’s ability to yield clinically relevant output in accordance with the intended purpose. The clinical relevance of an MDSW’s output is a positive impact on the health of an individual, expressed in terms of measurable, patient-relevant clinical outcome(s), including outcome(s) related to diagnosis, prediction of risk or prediction of treatment response(s), or related to its function, such as screening, monitoring, diagnosis or aid to diagnosis of patients, or on patient management or public health. An example of clinical performance evidence is a retrospective study on previously obtained data. Generate evidence showing that your SaMD has been tested in your target population and for your intended use, and that users can achieve clinically meaningful outcomes through predictable and reliable use. Note that, in line with the provisions of MDR Article 61 (1), the level of clinical evidence required should be appropriate in view of the device claims and characteristics. For medical devices where the demonstration of conformity with GSPRs based on clinical data is not deemed appropriate (MDR Article 61 (10)), the manufacturer shall duly substantiate in the technical documentation why it is adequate to demonstrate conformity based on the results of non-clinical testing methods alone, including bench testing, preclinical evaluation and usability assessment. This means that where your device does not produce clinical data, you can use bench testing and usability to demonstrate clinical performance. Summarize the clinical performance data.

7.3.1 Equivalent device

If no equivalence is claimed: No equivalent device could be identified. General guidance, including a detailed comparison table, is provided in MDCG 2020-05. Read this guidance and use the table to demonstrate equivalence if applicable.

8 Risk Management

A risk analysis, conducted in compliance with EN ISO 14971, is currently documented in:

  • SOP Risk management
  • Risk Management Plan
  • Risk Analysis
  • Risk Management Report

8.1 Known Hazards and Risks

List hazards/risks associated with the medical device.

8.2 Known Side-Effects

If applicable, please list/describe side-effects.

8.3 Precautions and Warnings

List precautions and warnings.

8.4 Usability Engineering

Please provide a summary of the usability engineering, derived either from separate documents or from the risk management. An example of a conclusion might be: The evaluation of the usability in accordance with IEC 62366-1 confirms that the design adequately reduces the risk of use error as far as possible, that the design is adequate for the intended users, and that the information materials supplied by the manufacturer for the intended users are suitable.

8.5 Additional risks identified in the literature

PubMed has been searched for risks that might be associated with the use of the medical device.

Describe here the search for risks and usability-related risks associated with the use of the device. State the filters used in your search.

Risks | Search term | No. of results
Risks associated with the device | […] | […]
Usability-related risks | […] | […]

List the publications in Annex Literature accordingly.

Summary of identified risks

A general patient benefit has been identified and proven within the literature. However, some possible complications have been reported in the literature. Limit your focus to risks that are directly or indirectly linked to the medical device. Risks related solely to the procedure, without any interaction with the medical device under evaluation, are not pertinent to this chapter or the risk-benefit assessment.

Literature ref. | Risks / Complications | Considered in risk management?

  • Not one of the outlined risks pertained to an overarching product issue or design flaw. The examination of pertinent publications did not unveil any safety concerns. Additionally, the literature review did not uncover any risks that have not already been addressed in the existing risk management documentation.
  • Summarise the literature regarding usability.
  • In the absence of usability information: A review of the literature did not uncover any additional insights regarding the usability aspects associated with the use of the [device name]. Furthermore, there is no indication in the literature of any overarching product issues or design flaws related to usability.

8.6 Conclusion of Risk Management

Example of text might be: Risk control measures were established and executed in accordance with the Risk Management Plan. These implemented measures are predominantly aligned with the adherence to relevant standards. Furthermore, technical control and monitoring measures were introduced and successfully validated for efficacy. The risk management process validates the adequacy of information materials provided by the manufacturer, ensuring that risk mitigation measures are accurately addressed in the Instructions for Use (IFU). Following the successful implementation of these risk control measures, both the remaining individual risks and the overall residual risks were evaluated as acceptable [Reference the Risk Management Report].

9 Post-Market Surveillance Data

Present available post-market surveillance data and delineate its significance in assessing the clinical performance and safety of the relevant medical device. If applicable, reference post-market surveillance reports or periodic safety update reports, focusing on conclusions that are relevant to the device’s clinical performance and safety. [Manufacturer] has implemented a post-market surveillance (PMS) system to promptly identify new risks not previously recognized during the extended market experience. This commitment ensures the immediate execution of corrective and preventive actions, as detailed in <reference to the post-market surveillance system>. This section further consolidates insights gained from the medical device under evaluation and/or its equivalent devices, utilizing internal and external databases. The strategy for identifying pertinent reports is tailored to each database. Add or remove subchapters as needed. If possible, align the timeframe for database searches with that of the literature search. If an excessive number of potentially relevant results arise, opt for a restricted timeframe, with justification.

9.1 Internal Vigilance System

Summarise the data regarding sales numbers and complaints.

9.2 Additional Post-Market Clinical Follow-Up Data

PMCF is planned and conducted to proactively collect and evaluate clinical data with the aim of confirming the clinical safety and performance throughout the expected lifetime of the device, ensuring the continued acceptability of identified risks and detecting emerging risks on the basis of factual evidence. Summarise the data regarding PMCF of the device under evaluation & reference the post-market clinical follow-up plan.

9.3 Relevant Device Registers

Please summarise internal and external register data. If no internal device register is available, example of text might be: The manufacturer has not implemented an internal device register.

9.4 BfArM Database

Sources: https://www.bfarm.de/SiteGlobals/Forms/Suche/EN/Expertensuche_Formular.html?nn=708434&cl2Categories_Format=kundeninfo

The following search terms have been used:

Please state the timeframe of the search if restricted. The search led to <xxx> results, of which only <xx> refer to the [device name] or its equivalent devices.

Ref. | Issue Date | Device | Description / Action | Relevant?

9.5 Swissmedic Database (if applicable)

Summarize the search in this database and use the structure provided in the BfArM example.

9.6 MHRA Database (if applicable)

9.7 FDA MAUDE Database (if applicable)

9.8 FDA Recall Database (if applicable)

9.9 Summary and Conclusion of PMS Data

Offer a condensed overview of post-market surveillance data, incorporating considerations on risk management and usability. Enumerate identified risks aligned with the evaluation, ensuring comprehensive coverage of all risk management aspects. Specifically, focus on assessing use errors and the design of the user interface. Include details about the user profile and usage environment, if applicable. Noteworthy complications from the MHRA, Swissmedic, BfArM, and FDA databases include:

  • List all relevant general complications

Crucially, the scrutiny of post-market surveillance data revealed no risks unaddressed in the risk management discussion. The assessment of clinical data provides further reinforcement of the safety and performance of the device under evaluation.

10. Benefit Risk Assessment

Provide an overview of the risks and benefits of the medical device and come to a final conclusion on why the probable benefits outweigh the potential risks. The following is an example evaluation of the acceptability of the benefit-risk ratio: Based on the findings of the clinical data review as well as of the risk analysis, it can be inferred that the probability of a patient experiencing a substantial benefit when using the [device name] significantly outweighs the probability of suffering harm due to a residual risk of the device.

11. Summary & Conclusion

Executive Summary: This clinical evaluation represents a methodologically rigorous ongoing process encompassing the collection, assessment, and analysis of clinical data for the <medical device>. The report synthesizes preclinical, non-clinical, and clinical data from diverse sources, presenting crucial information about the device’s intended purpose. A comprehensive literature search, yielding a sufficient number of relevant publications (n=xx), underscores the safety and performance of the <medical device>, with identified publications meeting satisfactory quality standards. The evidence supports the intended purpose, clinical performance, and benefits as outlined in informational materials. No safety-related complaints, unaddressed risks, or usability concerns were identified beyond those addressed in risk management. Market experience, involving more than xxx units sold worldwide since xxxx, provides valuable insights. Safety-related complaints (xx) were reported, and a thorough search within clinical experience databases (MHRA, BfArM, Swissmedic, and FDA) revealed no unevaluated risks or usability aspects. Residual risks were deemed acceptable in the final risk management report, with the benefits outweighing these residual risks. The clinical evaluation affirms compliance with relevant safety and performance requirements (Regulation (EU) 2017/745, Annex I, clauses 1 and 8). Overall, the clinical safety, performance, and benefits demonstrate that the <medical device> aligns with current knowledge and technological standards.

Conclusions: The clinical evaluation confirms that the <medical device> complies with current knowledge and technological standards, is suitable for its intended purpose and users, and offers substantial clinical benefits, outweighing potential adverse effects. Evaluated clinical data, aligned with Regulation (EU) 2017/745, are scientifically sound and comprehensive, supporting the device’s conformity. The analysis of literature, clinical data, and risk factors indicates that patient benefits significantly surpass the risk of residual harm, rendering further clinical investigations unnecessary. A planned PMCF strategy, considering the clinical evaluation report’s results, defines the process and frequency of activities. In summary, the clinical safety, performance, and benefits showcased in this evaluation confirm that the <medical device> adheres to the relevant general safety and performance requirements (Regulation (EU) 2017/745, Annex I, clauses 1 and 8).

A1 References

The following table lists all relevant publications, provides a summary of the content and lists the appraisal.

Ref. No. | Title | Summary | Indication / Application | Risks | Named Device | Benefits | Usability

A2 Selection of Literature Search Results

The following table lists all identified publications, the decision for potential relevance and final relevance.

Use the literature assessment Excel sheet and copy-paste the first columns in here.

No. | References | Potentially relevant? | Relevant after reading the full text?

A3 Document References

Reference documents that you used in the CER (Risk management, Usability, bench testing summaries) here.

A4 Qualification of authors

Provide qualification and experience of the evaluators (e.g. author, reviewer and/or approver) to demonstrate that the responsible person fulfils the requirements for the accomplishment of clinical evaluations.

Template Copyright openregulatory.com. See template license.

Please don’t remove this notice even if you’ve modified contents of this template.



VCOM Students

  • Clinical Forms

Evaluation Forms

  • Mid-Rotation Evaluation Form (PDF)
  • Third Year Preceptor Evaluation Form (PDF)
  • Fourth Year Preceptor Evaluation Form (PDF)

Case Report Grading Rubric (PDF)

Research Project Report Grading Rubric (PDF)

Excused Absence Forms

  • OMS 3 and 4 Request for an Unplanned Excused Absence (PDF)
  • OMS 3 and 4 Request for a Planned Excused Absence (PDF)

Report Forms

  • Occupational Exposure to Bloodborne Pathogens Report Form (PDF)

Request Forms

  • Request for Rotation Change (PDF)
  • Request to Attend Non-VCOM Clinical Rotation (PDF)  (Virginia Campus)
  • Request to Attend Non-VCOM Clinical Rotation (PDF)  (Carolinas Campus)
  • Request to Attend Non-VCOM Clinical Rotation (PDF)  (Auburn Campus)
  • Request to Attend Non-VCOM Clinical Rotation (PDF)  (Louisiana Campus)
  • Third Year Elective Request (DOC)  (Virginia Campus)


Evaluating Oral Case Presentations Using a Checklist

How do senior student-evaluators compare with faculty?

Kakar, Seema P. MD; Catalanotti, Jillian S. MD, MPH; Flory, Andrea L. MD; Simmens, Samuel J. PhD; Lewis, Karen L. PhD; Mintz, Matthew L. MD; Haywood, Yolanda C. MD; Blatt, Benjamin C. MD

Dr. Kakar is assistant professor, Department of Medicine, The George Washington University School of Medicine and Health Sciences, Washington, DC.

Dr. Catalanotti is assistant professor, Department of Medicine, The George Washington University School of Medicine and Health Sciences, Washington, DC.

Dr. Flory is assistant professor, Department of Medicine, The George Washington University School of Medicine and Health Sciences, Washington, DC.

Dr. Simmens is research professor, Department of Epidemiology and Biostatistics, The George Washington University School of Public Health and Health Services, Washington, DC.

Dr. Lewis is director of administration, Clinical Learning and Simulation Skills (CLASS) Center, The George Washington University School of Medicine and Health Sciences, Washington, DC.

Dr. Mintz is associate professor, Department of Medicine, The George Washington University School of Medicine and Health Sciences, Washington, DC.

Dr. Haywood is assistant dean for student and curricular affairs, The George Washington University School of Medicine and Health Sciences, Washington, DC.

Dr. Blatt is professor, Department of Medicine, The George Washington University School of Medicine and Health Sciences, Washington, DC.

Funding/Support: None.

Other disclosures: None.

Ethical approval: This study was approved by the institutional review board of The George Washington University.

Previous presentations: The abstract of an earlier version of this article was presented at the May 2011 Northeastern Group on Educational Affairs meeting, Washington, DC, and at the November 2011 Research in Medical Education Conference, Denver, Colorado.

Correspondence should be addressed to Dr. Kakar, Department of Medicine, The George Washington University School of Medicine and Health Sciences, 2150 Pennsylvania Ave., NW, Washington, DC 20037; e-mail: [email protected] .

Purpose 

Previous studies have shown student-evaluators to be reliable assessors of some clinical skills, but this model has not been studied for oral case presentations (OCPs). The purpose of this study was to examine the validity of student-evaluators in assessing OCPs by comparing them with faculty.

Method 

In 2010, the authors developed a dichotomous checklist. They trained 30 fourth-year medical students (student-evaluators) to use it to assess 170 second-year medical students’ OCPs in real time during a year-end objective structured clinical examination. Ten faculty physicians then scored videos of a random sample of these OCPs. After discarding items with poor faculty reliability, the authors assessed agreement between faculty and student-evaluators on 18 individual items, total scores, and pass/fail decisions.

Results 

The total score correlation between student-evaluators and faculty was 0.84 ( P < .001) and was somewhat better than the faculty–faculty intraclass correlation ( r = 0.71). Using a 70% pass/fail cutoff, faculty and student-evaluator agreement was 74% (Kappa = 0.46; 95% CI, 0.20–0.72). Overall, student-evaluator scores were more lenient than faculty scores (72% versus 56% pass rates; P = .03).
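For readers who want to run the same kind of agreement analysis on their own checklist data, the sketch below computes the statistics reported here (total-score correlation, percent agreement at a 70% pass/fail cutoff, and kappa) in Python. The scores are fabricated stand-ins, not the study’s data.

# Minimal sketch: rater-agreement statistics for checklist scores.
# The arrays below are fabricated example data.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Fraction of 18 checklist items scored "done" by a student-evaluator
# and by a faculty rater for each presenter (hypothetical values).
student = np.array([0.89, 0.72, 0.61, 0.94, 0.67, 0.78, 0.50, 0.83])
faculty = np.array([0.83, 0.67, 0.56, 0.89, 0.72, 0.72, 0.44, 0.78])

r, p = pearsonr(student, faculty)      # total-score correlation
student_pass = student >= 0.70         # 70% pass/fail cutoff
faculty_pass = faculty >= 0.70
agreement = np.mean(student_pass == faculty_pass)
kappa = cohen_kappa_score(student_pass, faculty_pass)

print(f"r = {r:.2f} (p = {p:.3f})")
print(f"pass/fail agreement = {agreement:.0%}, kappa = {kappa:.2f}")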

Conclusions 

Senior student-evaluators were able to reliably assess second-year medical students’ OCP skills. The results support the use of student-evaluators for peer assessment of OCPs in low-stakes settings, but evidence of leniency compared with faculty assessment suggests caution in using student-evaluators in high-stakes settings. Extending peer assessment to OCPs provides a practical approach for low-resource evaluation of this essential skill.



Developing a Beginner’s Guide to Writing a Clinical Case Report: A Pilot Evaluation by Junior Doctors

Samson O. Oyibo

1 Diabetes and Endocrinology, Peterborough City Hospital, Peterborough, GBR

Introduction

Writing a case report increases one’s knowledge about a particular disease condition, demonstrates intellectual curiosity and commitment to scientific inquiry and the ability to follow through on scholarly projects. Despite several articles and journal-specific instructions published concerning case report writing, none have been evaluated by their intended audience. The aim of this study was to get junior doctors to evaluate an online presentation as part of the process of developing a beginner’s guide to writing a clinical case report.

Materials and methods

In response to our previous studies an online presentation concerning how to write a clinical case report was provided for junior doctors. Junior doctors were invited by email to look at the online presentation and complete an online evaluation form thereafter. The questions were adapted from the Evaluation Form for Teaching and Presentations provided by the Joint Royal Colleges of Physicians Training Board. Data was analysed both quantitatively and qualitatively.

Results

Sixty-five doctors looked at the presentation and completed the online evaluation form. All agreed that the objectives of the presentation were identified and met. Sixty-four (98.5%) agreed that it was effective and clear. Sixty percent indicated that they found the information and instructions useful. An additional 13.85% found the whole presentation useful without specifying any aspect. Eight percent found the summary slide useful, 4.62% found the case selection criteria slide useful, and 4.62% found the permission and patient consenting slide useful. Twenty percent would like the inclusion of examples of good abstracts and case reports, 13.85% would like more teaching sessions, and 13.85% would like improvements to the slide-presentation format. Overall, 64 junior doctors (98.46%) remarked that the presentation was good, very good or excellent.

Conclusions

This study has demonstrated the importance of evaluation of teaching material by junior doctors while developing a beginner’s guide to writing a clinical case report. Once the above action points and limitations have been taken into account, further repeat evaluations by junior doctors need to be undertaken while developing a robust beginner’s guide to writing a clinical case report.

Having an article published in a peer-reviewed medical journal is important for career progression in several medical specialties. Although enhancement of their curriculum vitae has been cited as a motivation to getting published, a keen interest in the subject is a more important reason stated by doctors [ 1 ]. Writing up a case report increases one’s knowledge about a particular disease condition, demonstrates intellectual curiosity and commitment to scientific inquiry and the ability to follow through on scholarly projects [ 2 ].

In a previous study, we demonstrated that junior doctors feel that medical article publishing is an effective teaching method but little was done to help them bridge the gap between getting an interesting case and getting published [ 3 ]. In a follow-up study, we highlighted the importance of establishing a medical article publishing club for junior doctors based on action points from the previous study. Junior doctors said that the medical article publishing club contributed to learning, education and publishing skills [ 4 ].

In response to action points from the above-mentioned studies an online PowerPoint presentation was provided for junior doctors on “a guide to writing a clinical case report”. The main objective of this study was to obtain junior doctors’ evaluation of the online presentation, with the ultimate aim of making improvements and developing a robust and user-friendly guide to writing clinical case reports.

The online presentation

As an action point from a previous study, an online PowerPoint presentation, “A guide to writing a clinical case report”, was made for junior doctors to aid them in writing clinical case reports. It consisted of 18 PowerPoint slides, from the title slide to the bibliography slide, and was made available on our institution’s educational website for all junior doctors to use. The PowerPoint presentation is shown in Figure 1.

[Figure 1: The 18-slide PowerPoint presentation, “A guide to writing a clinical case report”.]

Study participants

Junior doctors in our healthcare institution were invited by email to look at the online PowerPoint presentation and complete an online evaluation form thereafter. There was also the facility to download the presentation. Invited doctors were given four weeks to respond while a reminder invitation email was sent every week for the same four-week period.

Study design

As part of the email, a web-based evaluation form was administered to junior doctors so that they could evaluate the online PowerPoint presentation after going through it. The evaluation form distribution and data collection were carried out over a four-week period. Ethics approval was sought through the Research & Development department of our institute; the study did not require ethical approval because it was registered with our Quality, Governance and Compliance Department as a Quality of Education Improvement Project. Participants were assured of strict anonymity and confidentiality during this study.

Evaluation questionnaire

The evaluation questionnaire was prepared online using SurveyMonkey [5]. The questions were adapted from the Evaluation Form for Teaching and Presentations provided by the Joint Royal Colleges of Physicians Training Board [6]. The questionnaire contained six questions: (1) were the objectives of the online presentation identified, (2) were the objectives met, (3) was the delivery of the presentation effective and clear, (4) what aspects of the presentation were useful, (5) any suggestions for improvement, and (6) overall, what is your evaluation of the online presentation. Questions 1-3 required a “yes” or “no” answer. Questions 4 and 5 were open-ended questions requiring input into a comment box. Question 6 required an answer from “very bad”, “poor”, “fair”, “good”, “very good” or “excellent”. A web-link to the questionnaire was sent via email to participants.

Data analysis

The responses to questions 1, 2, 3, and 6 were analyzed and presented as whole numbers (and percentages). The answers to questions 4 and 5 were transcribed verbatim and analyzed qualitatively by thematic analysis [7, 8]. The data were reviewed for initial codes and subthemes, from which themes relating to what was found useful and to suggestions for improvement were developed. The raw data, subthemes and themes were continuously reflected upon to ensure the credibility and trustworthiness of this survey [9].
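As a small illustration of how coded free-text answers become the counts and percentages reported below, the following sketch tallies one theme label per respondent. The labels and counts loosely mirror the themes reported in this study and are for illustration only.

# Minimal sketch: tally coded themes into counts and percentages.
# Labels/counts are illustrative stand-ins for the coded responses.
from collections import Counter

coded_responses = (
    ["information and instructions"] * 39
    + ["whole presentation"] * 9
    + ["summary slide"] * 5
    + ["other"] * 12
)

n = len(coded_responses)
for theme, count in Counter(coded_responses).most_common():
    print(f"{theme}: {count} ({count / n:.2%})")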

Results

There were 65 respondents to the invitation emails. Therefore, 65 junior doctors looked at the presentation and completed the online evaluation form.

Objectives, clarity and effectiveness

All 65 respondents (100%) agreed that the objectives of the presentation were identified, and all 65 (100%) agreed that the objectives were met. Sixty-four respondents (98.5%) agreed that the presentation was effective and clear. This is shown in Table 1.

Questions concerning the presentation (N = 65) | Yes | No
Question 1: Were the objectives of the presentation identified? | 65 | 0
Question 2: Were the objectives met? | 65 | 0
Question 3: Was the delivery of the presentation effective and clear? | 64 | 1

Useful aspects and suggestions for improvement

The answers to questions 4 and 5 were analysed thematically. The raw data (answers to both questions along with the thematic analysis) used to support the findings of this study has been deposited in the Harvard Dataverse and is freely accessible [ 10 ]. The main themes derived from the analysis are presented here.

Question 4 - What Aspect of the Presentation Was Useful?

All respondents answered question 4, and several major themes emerged from the thematic analysis. Thirty-nine respondents (60%) indicated that they found the information and instructions provided in the presentation useful (e.g., they highlighted the stepwise approach, breakdown, and the clear, concise and systematic structure of the information provided). Nine respondents (13.85%) indicated that they found the whole presentation useful without specifying any aspect. Five respondents (7.69%) indicated that they found the summary slide useful. Three respondents (4.62%) indicated that they found the case selection criteria slide useful. A similar number of respondents (4.62%) indicated that they found the permission and patient consenting slide useful. One respondent particularly found the abstract slide useful. Two respondents indicated that the subject/topic was useful. Two respondents made abbreviated text comments that could not be deciphered, while one respondent indicated that the presentation was “a bit vague”.

Question 5 - Any Suggestions for Improvement?

Sixty-two respondents answered question 5, and several major themes emerged from the thematic analysis. Thirteen respondents (20%) indicated that they would like the inclusion of examples of good abstracts and case reports. Nine respondents (13.85%) indicated that they would like more presentations and teaching sessions (e.g., workshop sessions, online sessions and circulation of the presentation to more junior doctors and medical students). Nine respondents (13.85%) indicated that the slide-presentation format could be improved (e.g., add more colour, make the slides more interactive, less crowded, less rushed, or shorter). Thirty respondents (46.15%) indicated “nil” or “none” in response to the question. Two respondents simply gave praise (e.g., good job, well done), one respondent made an abbreviated text comment that could not be deciphered, and another left the question blank.

Overall evaluation of the presentation

Sixty-four respondents (98.46%) remarked that the presentation was good, very good or excellent. One respondent remarked that the presentation was poor. This is shown in Table 2.

Overall evaluation of presentation | Number of junior doctors (%)
Excellent | 36 (55.38%)
Very good | 20 (30.77%)
Good | 8 (12.31%)
Fair | 0
Poor | 1 (1.54%)

Discussion

Formal training and adequate mentorship are key ingredients required to help junior doctors with writing and presenting case reports. The impact of lacking these factors was highlighted in a previous study looking at the perceptions of fourth-year medical students on writing case reports [11]. In that study, medical students indicated that lack of formal training and lack of mentorship were significant barriers to writing and presenting cases. There are several journal-specific guides and instructions on how to write clinical case reports, but despite this, junior doctors still find it difficult to write up a case report. This emphasizes the importance of mentorship and training, which could be provided by a curriculum-based medical article publishing club or forum and should include an easy-to-follow guide to writing case reports for junior doctors. While developing such a guide, it is important that there is continuous evaluation by the junior doctors. Evaluation should be a continuous and periodic process, as it helps teachers and learners to improve the teaching-learning process.

There are several articles and journal-specific instructions published concerning writing clinical case reports but there is scarcity of reports of evaluation of these published guides and instructions by their intended audience. A guide to writing case reports directed at junior doctors in a user-friendly format and evaluated by junior doctors may go a long way in helping junior doctors write up clinical case reports. Such a guide can be included in the junior doctors’ teaching curriculum alongside an adequate mentorship program.

Action points from this pilot study

This study has demonstrated the importance of evaluation of teaching material by the intended learners, the junior doctors in this case. Junior doctors found the PowerPoint presentation about a “guide to writing a clinical case report” useful. In particular: the layout of the instructions, the information about permission and patient consenting, the information about case selection criteria, and the summary slide at the end of the presentation. The junior doctors also suggested ways of improving the presentation, namely, inclusion of examples and illustrations of good abstracts and case reports, adding colour to the presentation and making it more interactive and providing more teaching sessions and presentations on the topic of writing clinical case reports. These factors will be taken into account while making the improvements to this guide.

Limitations

This study has some limitations that should be acknowledged. First, this study assumes that everyone who looked at the presentation went on to complete the evaluation form. We have no way of knowing how many junior doctors looked at the presentation without going on to complete the online evaluation form. There are various forms of page-view/download counters that can be used to access this data when arranging future studies. Second, the results of this pilot study may not be generalizable as the sample size (respondents) makes up 25% of the total junior doctor population in just one healthcare institution. However, this was a pilot study. Third, the invited population of doctors are employees within the same healthcare establishment as the organiser of the study. Therefore, any non-responder or responder bias based on this cannot be ruled out. A sample size including junior doctors from different healthcare institutions would limit this bias.

Conclusions

This study has demonstrated the importance of evaluation of teaching material by junior doctors while developing a beginner’s guide to writing a clinical case report. Once the above action points and limitations have been taken into account and improvements made, further repeat evaluations by junior doctors will need to be undertaken while developing a robust beginner’s guide to writing a clinical case report.

Acknowledgments

The author would like to thank all the junior doctors who participated in this evaluation study.

The content published in Cureus is the result of clinical experience and/or research by independent individuals or organizations. Cureus is not responsible for the scientific accuracy or reliability of data or conclusions published herein. All content published within Cureus is intended only for educational, research and reference purposes. Additionally, articles published within Cureus should not be deemed a suitable substitute for the advice of a qualified health care professional. Do not disregard or avoid professional medical advice due to content published within Cureus.

The authors have declared that no competing interests exist.

Human Ethics

Consent was obtained from all participants in this study. This study did not require ethical approval because it was registered with our Quality, Governance and Compliance Department as a Quality of Education Improvement Project. Participants were assured of strict anonymity and confidentiality during this study.

Animal Ethics

Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue.


12 Free Presentation Evaluation Forms (What to Include)

A presentation evaluation form is a document used by an evaluator to analyze and review a particular presentation.

The form allows you to give structured feedback to the presenter about their presentation. Additionally, it can be used whenever you want to rate an individual’s presentation skills. Assessments are an important means for individuals to improve themselves, and you must therefore provide the presenter with accurate feedback regarding their presentation. This will enable them to make the necessary adjustments and enhance their presentation skills.

Furthermore, the feedback form allows you to judge whether the presenter comprehensively covered all the important topics and answered questions appropriately. An elaborate presentation should be able to give clear insights into the chosen topics. For example, if the presentation is about the advantages and values of using the company’s products and services, everyone present during the presentation should be able to clearly understand the products and their market valuation.

The form used to evaluate presentations, its purpose, the evaluation criteria, and some helpful assessment advice will all be covered in this article.

Download Free Form Templates

A presentation evaluation form should be comprehensive as it is meant to provide the presenter with honest reviews of their performance. To ensure you have a form that is thorough, you should use a template to prepare it. That will make it easier for you to create a proper form.

Also, it will ensure that you have all the required sections and details. You can access and download these templates for free from below:


Purpose of Presentation Evaluation Form

An evaluation form allows you to give a critical review and evaluation of a presentation. Different aspects of the presentation are judged as part of the evaluation; this includes the presenter’s effectiveness and efficiency in imparting information, body language, enthusiasm, volume, modulation, ease of flow, clarity of speaking, and the presenter’s overall preparedness.
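To make these judged aspects concrete, here is a minimal sketch of how an evaluation could be captured as a structured record with an overall score. The field names and the 1-5 scale are illustrative choices, not a standard rubric.

# Minimal sketch: one evaluator's ratings as a structured record.
# Field names and the 1-5 scale are illustrative assumptions.
from dataclasses import dataclass, fields

@dataclass
class PresentationEvaluation:
    """One evaluator's ratings, each on an illustrative 1-5 scale."""
    information_delivery: int
    body_language: int
    enthusiasm: int
    volume_and_modulation: int
    ease_of_flow: int
    clarity_of_speaking: int
    preparedness: int

    def overall_score(self) -> float:
        # Simple unweighted mean of all rated aspects.
        ratings = [getattr(self, f.name) for f in fields(self)]
        return sum(ratings) / len(ratings)

review = PresentationEvaluation(4, 3, 5, 4, 3, 4, 5)
print(f"Overall: {review.overall_score():.1f} / 5")

In practice a form would also carry free-text comment fields alongside the numeric ratings, since the constructive feedback discussed below is the part the presenter acts on.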

Therefore, after you have reviewed the presentation, you should share your comments with the presenter. They can use this feedback to understand what they need to do to improve their overall performance. Furthermore, your feedback form should be easy to understand and should convince the presenter to take action towards improving their confidence and appearance.

Also, you may give suggestions to help the presenter improve their emotional control during presentations; this is an effective way of convincing and persuading the audience.

A well-drafted review will allow you to give your opinion without sabotaging the presenter’s confidence. Therefore, feedback has to be constructed positively but must also provide clear instructions about those areas that need improvement.

3 Different Kinds of Presentation Evaluations

An effective way of helping individuals give powerful and informative presentations is by informing them of how their presentations will be evaluated.

Here are three techniques you can use to evaluate a presentation:

Self-evaluation

One of the most effective ways of improving someone's presentation skills is to let them judge their own performance by rating their presentations. Some presenters can give accurate, insightful reviews of what they did well and where they need to improve; others will find it difficult to evaluate themselves.

Asking a presenter questions about their performance will enable you as an evaluator to assist them in self-evaluation. You can ask them how they think they performed, what they think they have accomplished, what they gained before, during, and after the performance, and what they think they could have done differently during the process of presenting.

Peer evaluation

Assessment by peers encourages the presenters to provide feedback on each other’s performances. For instance, if you are a teacher, you can ask your students to give their opinions about their classmates’ performances. Peer evaluation is an effective way of helping the students to differentiate between a perfect and an average presentation.

Also, this will allow them to be more attentive as they observe and learn how to present their projects effectively. You can distribute forms to each student to give their feedback. Then, you can request that they give the forms to the presenter at the end of the presentation.

Professional evaluation

Professional evaluations of presentations are usually conducted by an experienced assessor, such as a teacher. As the evaluator, you give your comments verbally instead of recording them on an evaluation form. In most cases, you discuss the presentation as soon as it ends, which gives the presenter immediate feedback.

To professionally evaluate a presentation, you can ask for a copy of it beforehand. This will give you enough time to review the contents and be prepared to provide a comprehensive assessment. As a result, you will be able to help the presenter improve their future presentations.

Evaluation Criteria for Presentation

A presentation is judged on six criteria. The individual or group presenting their work must have the required skills to present their content effectively.

Below are the six abilities that you must assess as part of the evaluation:

Ability to analyze the audience

You need to assess if the presenter understands their audience based on the following:

  • Whether their content was tailored and relevant or just generic
  • If the pitching was done correctly
  • If they used proper language
  • If they used terminology that the audience understood
  • If they engaged their audience
  • If their audience seemed focused or distracted.

If the presenter understands the audience, they will most likely have a great presentation. As an evaluator, you must determine if the presenter researched their audience and was able to handle any challenges they encountered during their presentation.

Ability to develop a structured presentation

You need to determine if the presenter has a structured presentation that makes the content persuasive. The message alone cannot be impactful if it lacks a logical flow and structure of ideas. You should judge if the presentation was clear, easy to follow, and had a narrative or story-like flow with a clear beginning and conclusion. 

Also, you need to check if the transitions used between sections were smooth, if the presenter used relevant visual aids such as PowerPoint slides or handouts, and finally, if it had a clear call to action section at the end. 

A proper and clear structure is important if the presenter wants their message to impact the audience. It should have a clear start, flow smoothly, build momentum, and have a powerful ending without losing the audience’s attention at any point.

Ability to engage the audience

The presenter must also have the ability to engage the audience. If the presenter properly analyzes the audience, they will most likely be able to connect with them. This is a significant factor that distinguishes a great presentation from a poor one. Ascertain if the presenter had content that the audience would find interesting. 

Also, you need to check if the presenter’s method of delivery was effective. The presenter should be able to build a rapport with the members of the audience, use proper gestures and body language, and speak clearly and confidently with proper intonation in a conversational tone.

Ability to prepare effective slides

The ability to prepare slides that effectively convey the intended message is an important aspect of a successful presentation. Slides are visual aids meant for the speaker to elaborate on their information and enable their audience to understand the message thoroughly. You need to determine if these slides are easy to read, have detailed information, and have a proper layout and format for easier understanding.

The slides should have a good balance between text, graphics, and images. The slides can be considered effective if they contain text in bullet points as well as impactful graphics that reinforce the presenter’s message.

Ability to be confident and other strengths

It is also important to evaluate the speaker's confidence. The presenter should exude confidence, be natural, and be in control while presenting. You need to assess whether they were at ease while speaking to their audience, whether they appeared confrontational, whether they seemed anxious or distracted, and whether they were awkward or shy.

Ability to summarize and achieve intended outcomes

You need to ascertain the presenter's ability to summarize and conclude their presentation in a manner that achieves their intended outcomes. The conclusion should be inspirational and include a clear, achievable call to action. You must also assess whether the closing statement was well rounded and included all the main points. A proper closing should leave the audience with a sense of having achieved something.

Best Tips for You

There are tips that you should keep in mind when evaluating a presentation if you wish to have impactful feedback that will benefit the presenter.

Below are the three main tips that you should consider:

Emphasize the process

You need to focus on the process of preparation rather than the product itself. That means that you should evaluate and comment on the process taken, such as gathering information, analyzing the audience, etc. This is more impactful, and it will help the person identify the areas that need improvement so they can make it better next time.

Be specific

Your feedback should include specific directions to help the presenter improve, rather than just general opinions.

For example:

Instead of writing, "You were not audible or confident enough during your presentation," you should write, "At some points during the presentation, you were not audible and did not seem confident, which made it hard to hear and understand you. Pay close attention to your pace and audibility next time. If you are feeling underconfident, use gestures and pause deliberately instead of using filler words such as 'um,' 'ah,' and 'like.'"

End on a positive note

Always conclude your assessment on a positive note. The assessment is meant to motivate the presenter to develop their presentation abilities. Therefore, besides highlighting the flaws, you should also include positive feedback that encourages them.

Your job as an evaluator is to assist the presenter in improving their skills, and an effective way of doing this is by giving them constructive feedback. Your assessment should not only highlight the shortcomings but also be thoughtful and positive.

When you use an evaluation form, you can make precise notes about the areas where a presenter needs to improve and the ones where they did well. An oral presentation can be challenging and time-consuming; with a form, however, you can comprehensively explain what is expected of a presenter during and after their presentation. Notably, it is important to focus on the presentation's different aspects, including both the style of presenting and the content.

As an evaluator, you are responsible for objectively assessing the skills and content of the presenter, so your feedback should be detailed and effective. Ensure that you have evaluation criteria that make it easy to provide comments on all relevant aspects. You can use templates to create forms that meet all your evaluation requirements effortlessly.



Development and validation of the oral presentation evaluation scale (OPES) for nursing students

  • Yi-Chien Chiang 1 ,
  • Hsiang-Chun Lee 2 ,
  • Tsung-Lan Chu 3 ,
  • Chia-Ling Wu 2 &
  • Ya-Chu Hsiao 4  

BMC Medical Education volume 22, Article number: 318 (2022)


Background

Oral presentations are an important educational component for nursing students, and nursing educators need to provide students with an assessment of presentations as feedback for improving this skill. However, there are no reliable, validated tools available for objective evaluations of presentations. We aimed to develop and validate an oral presentation evaluation scale (OPES) that nursing students could use to self-rate their own performance while learning effective oral presentation skills, and that educators could potentially use in the future to assess student presentations.

Methods

The self-report OPES was developed using 28 items generated from a review of the literature about oral presentations and from qualitative face-to-face interviews with university oral presentation tutors and nursing students. Evidence for the internal structure of the 28-item scale was collected with exploratory and confirmatory factor analysis (EFA and CFA, respectively) and internal consistency. Relationships with the Personal Report of Communication Apprehension (PRCA) and Self-Perceived Communication Competence (SPCC) scales were examined to provide evidence of relationships with other variables.

Results

Nursing students' (n = 325) responses to the scale provided the data for the EFA, which resulted in three factors: accuracy of content, effective communication, and clarity of speech. These factors explained 64.75% of the total variance. Eight items were dropped from the original item pool. The Cronbach's α value was .94 for the total scale and ranged from .84 to .93 for the three factors. The internal structure evidence was examined with CFA using data from a second group of 325 students, and an additional five items were deleted. Fit indices of the model were acceptable, except for the adjusted goodness of fit, which was below the minimum criterion. The final 15-item OPES was significantly correlated with the students' scores for the Personal Report of Communication Apprehension scale (r = −.51, p < .001) and Self-Perceived Communication Competence Scale (r = .45, p < .001), providing excellent evidence of relationships with other self-report assessments of communication.

Conclusions

The OPES could be adopted as a self-assessment instrument for nursing students when learning oral presentation skills. Further studies are needed to determine if the OPES is a valid instrument for nursing educators’ objective evaluations of student presentations across nursing programs.


Competence in oral presentations is important for medical professionals to communicate an idea to others, including those in the nursing professions. Delivering concise oral presentations is a useful and necessary skill for nurses [ 1 , 2 ]. Strong oral presentation skills not only impact the quality of nurse-client communications and the effectiveness of teamwork among groups of healthcare professionals, but also promotion, leadership, and professional development [ 2 ]. Nurses are also responsible for delivering health-related knowledge to patients and the community. Therefore, one important part of the curriculum for nursing students is the delivery of oral presentations related to healthcare issues. A self-assessment instrument for oral presentations could provide students with insight into what skills need improvement.

Three components have been identified as important for improving communication. First, a presenter's self-esteem can influence the physio-psychological reaction towards the presentation; presenters with low self-esteem experience greater levels of anxiety during presentations [3]. Therefore, increasing a student's self-efficacy can increase confidence in their ability to communicate effectively, which can reduce anxiety [3, 4]. Second, Liao (2014) reported that improving speaking efficacy can improve oral communication, and that collaborative learning among students can improve speech efficacy and decrease speech anxiety [5]. A study by De Grez et al. provided students with a list of skills to practice, which allowed them to feel more comfortable when a formal presentation was required, increased their presentation skills, and improved communication by improving self-regulation [6]. Third, Carlson and Smith-Howell (1995) determined that the quality and accuracy of the information presented is also an important aspect of public speaking performances [7]. Therefore, all three of the above-mentioned components are important skills for effective communication during an oral presentation.

Instruments that provide an assessment of a public speaking performance are critical for helping students improve oral presentation skills [7]. One study, using a student-developed assessment form, found peer evaluations of student presentations were higher than those of university tutors [8]. The assessment criteria included content (40%), presentation (40%), and structure (20%); the maximum percentage in each domain was given for "excellence", which was relative to a minimum "threshold". Multiple "excellence" and "threshold" benchmarks were described for each domain. For example, benchmarks included the use of clear and appropriate language, enthusiasm, and keeping the audience interested. However, the percentage score did not provide any information about which specific benchmarks were met. Thus, these quantitative scores did not include feedback on specific criteria that could enhance future presentations.

At the other extreme is an assessment that is limited to one aspect of the presentation and is too detailed to evaluate the performance efficiently. An example of this is the 40-item tool developed by Tsang (2018) [ 6 ] to evaluate oral presentation skills, which measured several domains: voice (volume and speed), facial expressions, passion, and control of time. An assessment tool developed by De Grez et al. (2009) includes several domains: three subcategories for content (quality of introduction, structure, and conclusion), five subcategories of expression (eye-contact, vocal delivery, enthusiasm, interaction with audience, and body-language), and a general quality [ 9 ]. Many items overlap, making it hard to distinguish specific qualities. Other evaluation tools include criteria that are difficult to objectively measure, such as body language, eye-contact, and interactions with the audience [ 10 ]. Finally, most of the previous tools were developed without testing the reliability and validity of the instrument.

Nurses have the responsibility of providing not only medical care, but also medical information to other healthcare professionals, patients, and members of the community. Therefore, improving nursing students' speaking skills is an important part of the curriculum. A self-report instrument for measuring nursing students' subjective assessment of their presentation skills could help increase competence in oral communication. However, to date, there is no reliable and valid instrument for evaluating oral presentation performance in nursing education. Therefore, the aim of this study was to develop a self-assessment instrument for nursing students that could guide them in understanding their strengths and development areas in oral presentations. A scale demonstrated to be valid and reliable for nursing students could then be examined for use in objective evaluations of oral presentations by peers and nurse educators.

Study design

This study developed and validated an oral presentation evaluation scale (OPES) that could be employed as a self-assessment instrument for students when learning skills for effective oral presentations. The instrument was developed in two phases: Phase I (item generation and revision) and Phase II (scale development) [11]. Phase I aimed to generate items using a qualitative method and to collect content evidence for the OPES. Phase II focused on scale development, establishing internal structure evidence for the scale, including EFA, CFA, and internal consistency. In addition, Phase II collected evidence of the OPES's relationships with other variables. Because we hope to also use the instrument as an aid for nurse educators in objective evaluations of nursing students' oral presentations, both students and educators were involved in item generation and revision. Only nursing students participated in Phase II.

Approval was obtained from Chang Gung Medical Foundation institutional review board (ID: 201702148B0) prior to initiation of the study. Informed consent was obtained from all participants prior to data collection. All participants being interviewed for item generation in phase I provided signed informed consent indicating willingness to be audiotaped during the interview. All the study methods were carried out in accordance with relevant guidelines and regulations.

Phase I: item generation and item revision

Participants.

A sample of nurse educators (n = 8) and nursing students (n = 11) participated in the interviews for item generation. Nursing students give oral presentations to meet the curriculum requirement; therefore, the educators were university tutors experienced in coaching nursing students preparing to give an oral presentation. Nurse educators specializing in various areas of nursing, such as acute care, psychology, and community care, were recruited if they had at least 10 years' experience coaching university students. The mean age of the educators was 52.1 years (SD = 4.26), 75% were female, and the mean amount of teaching experience was 22.6 years (SD = 4.07). Students were included if they had given at least one oral presentation and were willing to share their experiences of oral presentation. The mean age of the students was 20.7 (SD = 1.90) and 81.8% were female; four (36.3%) were second-year students, three were third-year students, and four were in their fourth year.

An additional eight educators participated in the evaluation of content evidence of the OPES. All had over 10 years' experience in coaching students in giving an oral presentation that would be evaluated for a grade.

Item generation

Development of item domains involved deductive evaluations of the literature about oral presentations [2, 3, 6, 7, 8, 12, 13, 14]. Three domains were determined to be important components of an oral presentation: accuracy of content, effective communication, and clarity of speech. Inductive qualitative data from face-to-face semi-structured interviews with nurse educators and nursing student participants were used to identify domain items [11]. Details of interview participants are described in the section above. The interviews with nurse educators and students followed an interview guide (Table 1) and lasted approximately 30–50 min for educators and 20–30 min for students. Deduction from the literature and induction from the interview data were used to determine categories considered important for the objective evaluation of oral presentations.

Analysis of interview data. Audio recordings of the interviews were transcribed verbatim at the conclusion of each interview. Interview data were analyzed by the first, second, and corresponding author, all experts in qualitative studies. The first and second authors coded the interview data to identify items educators and student described as being important to the experience of an oral presentation [ 11 ]. The corresponding author grouped the coded items into constructs important for oral presentations. Meetings with the three researchers were conducted to discuss the findings; if there were differences in interpretation, an outside expert in qualitative studies was included in the discussions until consensus was reached among the three researchers.

Analysis of the interview data indicated that items involved in preparation, presentation, and post-presentation were important to the three domains of accuracy of content, effective communication, and clarity of speech. Items for accuracy of content involved preparation (being well-prepared before the presentation; preparing materials suitable for the target audience; practicing the presentation in advance) and post-presentation reflection, including discussing the content of the presentation with classmates and teachers. Items for effective communication involved the presentation itself: obtaining the attention of the audience; providing materials that are reliable and valuable; expressing confidence and enthusiasm; interacting with the audience; and responding to questions from the audience. Items for the third domain, clarity of speech, also involved the post-presentation period: a student's ability to reflect on the content and performance of their presentation and willingness to obtain feedback from peers and teachers.

Item revision: content evidence

Based on themes that emerged during the interviews, 28 items were generated. Content evidence of the 28 items of the OPES was established with a panel of eight experts who were educators that had not participated in the face-to-face interviews. The experts were provided with a description of the research purpose and a list of the proposed items, and were asked to rate each item on a 4-point Likert scale (1 = not representative, 2 = item needs major revision, 3 = representative but needs minor revision, 4 = representative). The item-level content validity index (I-CVI) was determined by the number of experts rating an item 3 or 4 divided by the total number of experts; the scale-level content validity index (S-CVI) was determined by the number of items rated 3 or 4 by all experts divided by the total number of items.

Based on the suggestions of the experts, six items of the OPES were reworded for clarity; for example, item 12 was revised from "The presentation is riveting" to "The presenter's performance is brilliant; it resonates with the audience and arouses their interest". Two pairs of items were combined because they duplicated each other: "demonstrates confidence" and "presents enthusiasm" were combined into item 22, "demonstrates confidence and enthusiasm properly", and "the presentation allows for proper timing and sequencing" and "the length of time of the presentation is well controlled" were combined into item 9, "The content of presentation follows the rules, allowing for the proper timing and sequence". Thus, a total of 26 items were included in the OPES at this phase. The I-CVI values ranged from .88 to 1.00 and the scale-level CVI/universal agreement was .75, indicating that the OPES was an acceptable instrument for measuring an oral presentation [11].
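To make the two indices concrete, here is a minimal Python sketch of the I-CVI and S-CVI/UA computations described above, using made-up ratings rather than the panel's actual data:

import numpy as np

# Hypothetical ratings: one row per item, one column per expert, 4-point scale.
ratings = np.array([
    [4, 3, 4, 4, 3, 4, 4, 3],   # rated 3 or 4 by all eight experts
    [4, 4, 2, 3, 4, 4, 3, 4],   # one expert rated this item 2
    [3, 4, 4, 4, 4, 3, 4, 4],
])

relevant = ratings >= 3                 # a rating of 3 or 4 counts as "representative"
i_cvi = relevant.mean(axis=1)           # per item: agreeing experts / total experts
s_cvi_ua = (i_cvi == 1.0).mean()        # share of items endorsed by every expert

print("I-CVI:", i_cvi)                  # [1.0, 0.875, 1.0]
print("S-CVI/UA:", round(s_cvi_ua, 2))  # 0.67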

Phase II: scale development

Phase II, scale development, aimed to establish the internal structure evidence for the OPES. The evidence of relationships to other variables was also evaluated in this phase. More specifically, the internal structure evidence for the OPES was evaluated with exploratory factor analysis (EFA) and confirmatory factor analysis (CFA), and the evidence of relationships to other variables was determined by examining the relationships between the OPES and the PRCA and SPCC [15].

A sample of nursing students was recruited purposively from a university in Taiwan. Students were included if they were: (a) full-time students; (b) had declared nursing as their major; and (c) were in their sophomore, junior, or senior year. First-year university students (freshmen) were excluded. A bulletin about the survey study was posted outside of classrooms attended by 707 students. The bulletin included a description of the inclusion criteria and instructions to appear at the classroom on a given day and time if students were interested in participating in the study. Students who appeared at the classroom on the scheduled day (N = 650) were given a packet containing a demographic questionnaire (age, gender, year in school), a consent form, the OPES instrument, and two scales for measuring aspects of communication, the Personal Report of Communication Apprehension (PRCA) and the Self-Perceived Communication Competence (SPCC); the documents were labeled with an identification number to anonymize the data. The 650 students were divided into two groups based on the demographic data using the SPSS random case selection procedure (Version 23.0; SPSS Inc., Chicago, IL, USA). The selection procedure was performed repeatedly until the homogeneity of the baseline characteristics was established between the two groups (p > .05). The mean age of the participants was 20.5 years (SD = 0.98) and 87.1% were female (n = 566). Participants comprised third-year (40.6%, n = 274), fourth-year (37.9%, n = 246), and second-year (21.5%, n = 93) students. The survey data for half the group (the calibration sample, n = 325) were used for EFA; the survey data from the other half (the validation sample, n = 325) were used for CFA. Scores from the PRCA and SPCC instruments were used for evaluating the evidence of relationships to other variables.
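The repeated random split can be reproduced outside SPSS. Below is a hypothetical Python sketch of the same idea; the column names ("age", "gender", "year") and the test choices (t test for age, chi-square for the categorical variables) are assumptions for illustration, not details reported by the authors:

import numpy as np
import pandas as pd
from scipy import stats

def balanced_halves(df: pd.DataFrame, max_tries: int = 1000, seed: int = 0):
    """Split df in half, re-drawing until the halves do not differ (all p > .05)."""
    rng = np.random.default_rng(seed)
    for _ in range(max_tries):
        group = rng.permutation(np.arange(len(df)) % 2)   # random 0/1 labels, half each
        a, b = df[group == 0], df[group == 1]
        _, p_age = stats.ttest_ind(a["age"], b["age"])                       # continuous
        _, p_sex, _, _ = stats.chi2_contingency(pd.crosstab(group, df["gender"]))
        _, p_year, _, _ = stats.chi2_contingency(pd.crosstab(group, df["year"]))
        if min(p_age, p_sex, p_year) > .05:
            return a, b        # calibration (EFA) and validation (CFA) samples
    raise RuntimeError("no homogeneous split found")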

The aims of this part of Phase II were to collect internal structure evidence for the scale, identifying the items that nursing students perceived as important during an oral presentation, and to determine the domains that fit the set of items. The 325 nursing students designated for EFA (described above) completed the data collection. We used EFA to evaluate the internal structure of the scale. The items were presented in random order and were not nested according to constructs. Internal consistency of the scale was determined by calculating Cronbach's alpha.

The next step involved determining whether the newly developed OPES was a reliable and valid self-report scale for subjective assessments of nursing students' previous oral presentations. Participants (the second group of 325 students) were asked, "How often do you incorporate each item into your oral presentations?" Responses were scored on a 5-point Likert scale from 1 = never to 5 = always; higher scores indicated a better performance. The latent structure of the scale was examined with CFA.

Finally, the evidence of relationships with other variables of the OPES was determined by examining the relationships between the OPES and the PRCA and SPCC, described below.

The 24-item PRCA scale

The PRCA scale is a self-report instrument for measuring communication apprehension, which is an individual's level of fear or anxiety associated with either real or anticipated communication with a person or persons [12]. The 24 scale items are comprised of statements concerning feelings about communicating with others. Four subscales are used for different situations: group discussions, interpersonal communications, meetings, and public speaking. Each item is scored on a 5-point Likert scale from 1 (strongly disagree) to 5 (strongly agree); scores range from 24 to 120, with higher scores indicating greater communication anxiety. The PRCA has been demonstrated to be a reliable and valid scale across a wide range of related studies [5, 13, 14, 16, 17], and its reported Cronbach's alpha is .90 [18]. We received permission from the owner of the copyright to translate the scale into Chinese. Translation of the scale into Chinese by a member of the research team who was fluent in English was followed by back-translation by a different bilingual member of the team to ensure semantic validity of the translated PRCA scale. The Cronbach's alpha value in the present study was .93.

The 12-item SPCC scale

The SPCC scale evaluates a person's self-perceived competence in a variety of communication contexts and with a variety of types of receivers. Each item is a situation which requires communication, such as "Present a talk to a group of strangers" or "Talk with a friend". Participants respond to each situation by ranking their level of competence from 0 (completely incompetent) to 100 (completely competent). The reported Cronbach's alpha for the scale is .85, and the SPCC has been used in similar studies [13, 19]. We received permission from the owner of the copyright to translate the scale into Chinese. Translation of the SPCC scale into Chinese by a member of the research team who was fluent in English was followed by back-translation by a different bilingual member of the team to ensure semantic validity of the translated scale. The Cronbach's alpha value in the present study was .941.

Statistical analysis

Data were analyzed using SPSS for Windows 23 (SPSS Inc., Chicago, IL, USA). Data from the 325 students designated for EFA was used to determine the internal structure evidence of the OPES. The Kaiser-Meyer-Olkin measure for sampling adequacy and Bartlett’s test of sphericity demonstrated factor analysis was appropriate [ 20 ]. Principal component analysis (PCA) was performed on the 26 items to extract the major contributing factors; varimax rotation determined relationships between the items and contributing factors. Factors with an eigenvalue > 1 were further inspected. A factor loading greater than .50 was regarded as significantly relevant [ 21 ].
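For readers who want to reproduce this workflow, here is a minimal sketch using the third-party Python package factor_analyzer (an assumption; the authors worked in SPSS). `items` stands for a hypothetical respondents-by-items DataFrame of Likert responses:

import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def run_efa(items: pd.DataFrame, n_factors: int = 3) -> pd.DataFrame:
    # Pre-checks: Bartlett's test of sphericity and KMO sampling adequacy
    chi2, p = calculate_bartlett_sphericity(items)
    _, kmo_total = calculate_kmo(items)
    print(f"Bartlett chi2 = {chi2:.1f} (p = {p:.4f}), overall KMO = {kmo_total:.2f}")

    # Principal-components-style extraction with varimax rotation
    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax", method="principal")
    fa.fit(items)
    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    # Keep items loading >= .50 on at least one factor; the rest are deletion candidates
    return loadings[(loadings.abs() >= .50).any(axis=1)]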

All item deletions were incorporated one by one, and the EFA model was respecified after each deletion, which reduced the number of items in accordance with a priori criteria. In the EFA phase, the internal consistency of each construct was examined using Cronbach’s alpha, with a value of .70 or higher considered acceptable [ 22 ].
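Cronbach's alpha itself is simple enough to compute from first principles. The following sketch (with `items` again a respondents-by-items array for one construct) shows the formula behind each internal consistency check:

import numpy as np

def cronbach_alpha(items) -> float:
    # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores)
    x = np.asarray(items, dtype=float)
    k = x.shape[1]                                   # number of items
    item_var_sum = x.var(axis=0, ddof=1).sum()       # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)            # variance of summed scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Example with simulated responses (not study data): random noise yields alpha
# near 0, whereas internally consistent items should approach .70 or higher.
rng = np.random.default_rng(0)
print(round(cronbach_alpha(rng.integers(1, 6, (325, 5))), 2))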

Data from the 325 students designated for CFA was used to validate the factor structure of the OPES. In this phase, items with a factor loading less than .50 were deleted [21]. The goodness of the model fit was assessed using the following: absolute fit indices, including the goodness of fit index (GFI), adjusted goodness of fit index (AGFI), standardized root mean squared residual (SRMR), and the root mean square error of approximation (RMSEA); relative fit indices, the normed and non-normed fit index (NFI and NNFI, respectively) and comparative fit index (CFI); and the parsimony NFI and CFI and the likelihood ratio (χ²/df) [23].
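As an illustration only, the same family of fit indices can be obtained in Python with the third-party semopy package. The model syntax, item names, and simulated data below are all assumptions for demonstration; they are not the authors' software or data:

import numpy as np
import pandas as pd
import semopy

# Simulated Likert responses stand in for the real validation sample,
# so the resulting fit will (correctly) be poor.
cols = ["i7", "i9", "i14", "i18", "i19", "i21", "i22", "i24"]
rng = np.random.default_rng(0)
data = pd.DataFrame(rng.integers(1, 6, (325, len(cols))), columns=cols)

desc = """
AccuracyOfContent =~ i7 + i9 + i14
EffectiveCommunication =~ i21 + i22 + i24
ClarityOfSpeech =~ i18 + i19
"""

model = semopy.Model(desc)
model.fit(data)
stats = semopy.calc_stats(model)   # includes chi2, GFI, AGFI, CFI, TLI (NNFI), RMSEA
print(stats.T)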

In addition to the validity testing, a research team, which included a statistician, determined the appropriateness of either deleting or retaining each item. The convergent validity (internal quality of the items and factor structures) was further verified using the standardized factor loadings, with values of .50 or higher considered acceptable, and the average variance extracted (AVE), with values of .5 or higher considered acceptable [21]. Composite reliability (CR) was assessed using the construct reliability from the CFA, with values of .7 or higher considered acceptable [24]. The AVE and correlation matrices among the latent constructs were used to establish discriminant validity of the instrument: the square root of the AVE of each construct was required to be larger than the correlation coefficients between that construct and the other constructs [24].
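Both statistics come straight from the standardized loadings. A short sketch (the loadings below are made-up values, not those in the study's Table 3):

import numpy as np

def ave_and_cr(loadings):
    # AVE = mean of squared standardized loadings.
    # CR  = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    # where each error variance is 1 - loading^2 in a standardized solution.
    lam = np.asarray(loadings, dtype=float)
    ave = np.mean(lam ** 2)
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + (1 - lam ** 2).sum())
    return ave, cr

ave, cr = ave_and_cr([.72, .68, .81, .75, .70])   # illustrative loadings
print(f"AVE = {ave:.3f}, CR = {cr:.3f}")          # discriminant check uses sqrt(AVE)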

The evidence of relationships with other variables was determined by examining the relationship of nursing students' scores (N = 650) on the newly developed OPES with scores for constructs of communication on the translated PRCA and SPCC scales. The hypotheses were that strong self-reported presentation competence would be associated with lower communication apprehension (PRCA) and greater self-perceived communication competence (SPCC).

Development of the OPES: internal structure evidence

EFA was performed sequentially six times, until there were no items with a factor loading < .50 or that were cross-loaded; six items were deleted (Table 2). The EFA resulted in 20 items with a three-factor solution, which represented 64.75% of the variance of the OPES. The Cronbach's alpha estimate for the total scale was .94, indicating the scale had sound internal reliability (Table 2). The three factors were labeled in accordance with the item content via a panel discussion and had Cronbach's alpha values of .93, .89, and .84 for factors 1, 2, and 3, respectively.

Factor 1, Accuracy of Content, was comprised of 11 items and explained 30.03% of the variance. Items in Accuracy of Content evaluated agreement between the topic (theme) and content of the presentation, use of presentation aids to highlight the key points of the presentation, and adherence to time limitations. These items included statements such as: “The content of the presentation matches the theme” (item 7), “Presentation aids, such as PowerPoint and posters, highlight key points of the report” (item 14), and “The organization of the presentation is structured to provide the necessary information, while also adhering to time limitations” (item 9). Factor 2, “Effective Communication”, was comprised of five items, which explained 21.72% of the total variance. Effective Communication evaluated the attitude and expression of the presenter. Statements included “Demonstrates confidence and an appropriate level of enthusiasm” (item 22), “Uses body language in a manner that increases the audience’s interest in learning” (item 21), and “Interacts with the audience using eye contact and a question and answer session” (item 24). Factor 3, “Clarity of Speech” was comprised of four items, which explained 13.00% of the total variance. Factor 3 evaluated the presenter’s pronunciation with statements such as “The words and phrases of the presenter are smooth and fluent” (item 19).

The factor structure of the 20 items retained from the EFA was examined with CFA. We sequentially removed items 1, 4, 20, 15, and 16, based on modification indices. The resultant 15-item scale had acceptable fit indices for the 3-factor model of the OPES: χ²/df = 2.851, RMSEA = .076, NNFI = .933, and CFI = .945. However, the AGFI, at .876, was below the acceptable criterion of .9. A panel discussion among the researchers determined that items 4, 15, and 16 were similar in meaning to item 14, and item 1 was similar in meaning to item 7. Therefore, the panel accepted the results of the modified CFA model of the OPES with 15 items and 3 factors.

As illustrated in Table  3 and Fig.  1 , all standardized factor loadings exceeded the threshold of .50, and the AVE for each construct ranged from .517 to .676, indicating acceptable convergent validity. In addition, the CR was greater than .70 for the three constructs (range = .862 to .901), providing further evidence for the reliability of the instrument [ 25 ]. As shown in Table  4 , all square roots of the AVE for each construct (values in the diagonal elements) were greater than the corresponding inter-construct correlations (values below the diagonal) [ 24 , 25 ]. These findings provide further support for the validity of the OPES.

Figure 1. The standardized estimates of the CFA model for the validation sample.

Development of the OPES: relationships with other variables

Evidence of relationships with other variables was examined with correlation coefficients for the total score and subscale scores of the OPES with the total score and subscale scores of the PRCA and SPCC (Table 5), using data from all nursing students who participated in the study and completed all three scales (N = 650). Correlation coefficients for the total score of the OPES with total scores for the PRCA and SPCC were −.51 and .45, respectively (both p < .001). Correlation coefficients for subscale scores of the OPES with the subscale scores of the PRCA and SPCC were all significant (p < .001), providing strong evidence for the scale's relationships with other self-report measures of communication.
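This step reduces to ordinary Pearson correlations between total scores. A runnable sketch with simulated totals (the effects are contrived to mimic the reported directions, not drawn from the actual data):

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
opes = rng.normal(60, 8, 650)                      # simulated OPES totals
prca = 120 - 0.6 * opes + rng.normal(0, 6, 650)    # apprehension: built to correlate negatively
spcc = 40 + 0.5 * opes + rng.normal(0, 7, 650)     # competence: built to correlate positively

print(stats.pearsonr(opes, prca))                  # negative r, small p
print(stats.pearsonr(opes, spcc))                  # positive r, small p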

Discussion

The 15-item OPES was found to be a reliable and valid instrument for nursing students' self-assessments of their performance during previous oral presentations. A strength of this study is that the initial items were developed using both a literature review and interviews with nurse educators, who were university tutors in oral presentation skills, as well as nursing students at different stages of the educational process. Another strength is the multiple methods used to establish the validity and reliability of the OPES, including internal structure evidence (both EFA and CFA) and relationships with other variables [15, 26].

Similar to other oral presentation instruments, content analysis of the OPES items generated from the interviews with educators and students indicated that accuracy of the content of a presentation and effective communication were important factors for a good performance [3, 4, 5, 6, 8]. Other studies have also included self-esteem as a factor that can influence the impact of an oral presentation [3]; in the OPES, the effective communication subscale includes the item "Demonstrates confidence and an appropriate level of enthusiasm", which reflects a quality related to self-esteem. The third domain, clarity of speech, is unique to our study.

Constructs that focus on a person's ability to deliver accurate content are important components for evaluations of classroom speaking because they have been shown to be fundamental elements of public speaking [7]. Accuracy of content as it applies to oral presentations by nurses is important not only for communicating information involving healthcare education for patients, but also for communicating with team members providing medical care in a clinical setting.

The two other factors identified in the OPES, effective communication and clarity of speech, are similar to constructs for delivery of a presentation, which include interacting with the audience through body language, eye contact, and question and answer sessions. These behaviors indicate the presenter is confident and enthusiastic, which engages and captures the attention of an audience. It seems logical that voice, pronunciation, and fluency of speech were not independent factors, because the presenter's voice qualities are all key to effectively delivering a presentation. Clear and correct pronunciation and an appropriate tone and volume assist audiences in more easily receiving and understanding the content.

Our 15-item OPES evaluated performance based on outcomes. The original scale was composed of 26 items derived from qualitative interviews with nursing students and university tutors in oral presentations. These items were the result of asking about important qualities at three timepoints: before, during, and after a presentation. However, most of the items that were deleted were those about the period before the presentation (items 1 to 6); two items (25 and 26) were about the period after the presentation. The final scale therefore did not reflect the qualitative interview data in which educators and students stressed the importance of preparing with practice and rehearsal and of peer and teacher evaluations. Other studies have suggested that preparation and self-reflection are important for a good presentation, including awareness of the audience receiving the presentation, meeting the needs of the audience, defining the purpose of the presentation, use of appropriate technology to augment information, and repeated practice to reduce anxiety [2, 5, 27]. However, these items were deleted in the scale validation stage, possibly because it is not possible to objectively evaluate how much time and effort the presenter has devoted to the oral presentation.

The deletion of item 20, "The clothing worn by the presenter is appropriate", was also not surprising. During the interviews, educators and students expressed different opinions about the importance of clothing for a presentation: many of the educators believed the presenter should be dressed formally, whereas students believed the presenter should be neatly dressed. These two perspectives might reflect generational differences. These results are a reminder that assessments should be based on a structured and objective scale, rather than on one's personal attitudes and stereotypes about what should be important in an oral presentation.

The OPES may be useful not only for educators but also for students. The OPES could be used as a checklist to help students determine how well their presentation matches the 15 items, which could draw attention to deficiencies in their speech before the presentation is given. Once the presentation has been given, the OPES could be used as a self-evaluation form to help them make modifications that improve the next presentation. Educators could use the OPES to evaluate a performance during tutoring sessions with students, which could help identify specific areas needing improvement prior to the oral presentation. Although the analysis of the scale was based on data from nursing students, additional assessments with other populations of healthcare students should be conducted to determine whether the OPES is applicable for evaluating oral presentations by students in general.

Limitations

This study had several limitations. Participants were selected by non-random sampling; therefore, additional studies with nursing students from other nursing schools would strengthen the validity and reliability of the scale. In addition, the OPES was developed using empirical data rather than a theoretical framework, such as one linking anxiety and public speaking. Therefore, the validity of the OPES for use in other types of student populations, or in cultures that differ significantly from our sample population, should be established in future studies. Finally, the OPES was examined in this study as a self-assessment instrument for nursing students, who rated themselves based on their perceived abilities in previous oral presentations, rather than through peer or nurse educator evaluations. Therefore, the applicability of the scale as an assessment instrument for educators providing an objective score of nursing students' real-life oral presentations needs to be validated in future studies.

Conclusions

This newly developed 15-item OPES is the first report of a valid self-assessment instrument for providing nursing students with feedback about whether the necessary targets for a successful oral presentation have been reached. It could therefore be adopted as a self-assessment instrument for nursing students when learning which oral presentation skills require strengthening. However, further studies are needed to determine whether the OPES is a valid instrument for use by student peers or nursing educators evaluating student presentations across nursing programs.

Availability of data and materials

The datasets and materials of this study are available to the corresponding author on request.

References

1. Hadfield-Law L. Presentation skills for nurses: how to prepare more effectively. Br J Nurs. 2001;10(18):1208–11.

2. Longo A, Tierney C. Presentation skills for the nurse educator. J Nurses Staff Dev. 2012;28(1):16–23.

3. Elfering A, Grebner S. Getting used to academic public speaking: global self-esteem predicts habituation in blood pressure response to repeated thesis presentations. Appl Psychophysiol Biofeedback. 2012;37(2):109–20.

4. Turner K, Roberts L, Heal C, Wright L. Oral presentation as a form of summative assessment in a master's level PGCE module: the student perspective. Assess Eval High Educ. 2013;38(6):662–73.

5. Liao H-A. Examining the role of collaborative learning in a public speaking course. Coll Teach. 2014;62(2):47–54.

6. Tsang A. Positive effects of a programme on oral presentation skills: high- and low-proficient learners' self-evaluations and perspectives. Assess Eval High Educ. 2018;43(5):760–71.

7. Carlson RE, Smith-Howell D. Classroom public speaking assessment: reliability and validity of selected evaluation instruments. Commun Educ. 1995;44:87–97.

8. Langan AM, Wheater CP, Shaw EM, Haines BJ, Cullen WR, Boyle JC, et al. Peer assessment of oral presentations: effects of student gender, university affiliation and participation in the development of assessment criteria. Assess Eval High Educ. 2005;30(1):21–34.

9. De Grez L, Valcke M, Roozen I. The impact of an innovative instructional intervention on the acquisition of oral presentation skills in higher education. Comput Educ. 2009;53(1):112–20.

10. Murillo-Zamorano LR, Montanero M. Oral presentations in higher education: a comparison of the impact of peer and teacher feedback. Assess Eval High Educ. 2018;43(1):138–50.

11. Polit DF, Beck CT. The content validity index: are you sure you know what's being reported? Critique and recommendations. Res Nurs Health. 2006;29(5):489–97.

12. McCroskey JC. Oral communication apprehension: a summary of recent theory and research. Hum Commun Res. 1977;4(1):78–96.

13. Dupagne M, Stacks DW, Giroux VM. Effects of video streaming technology on public speaking students' communication apprehension and competence. J Educ Technol Syst. 2007;35(4):479–90.

14. Kim JY. The effect of personality, situational factors, and communication apprehension on a blended communication course. Indian J Sci Technol. 2015;8(S1):528–34.

15. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119(2):166.e7–16.

16. Pearson JC, Child JT, DeGreeff BL, Semlak JL, Burnett A. The influence of biological sex, self-esteem, and communication apprehension on unwillingness to communicate. Atl J Commun. 2011;19(4):216–27.

17. Degner RK. Prevalence of communication apprehension at a community college. Int J Interdiscip Soc Sci. 2010;5(6):183–91.

18. McCroskey JC. An introduction to rhetorical communication. 4th ed. Englewood Cliffs, NJ: Prentice-Hall; 1982.

19. Hancock AB, Stone MD, Brundage SB, Zeigler MT. Public speaking attitudes: does curriculum make a difference? J Voice. 2010;24(3):302–7.

20. Nunnally JC, Bernstein IH. Psychometric theory. New York: McGraw-Hill; 1994.

21. Hair JF, Black B, Babin B, Anderson RE, Tatham RL. Multivariate data analysis. 6th ed. Upper Saddle River, NJ: Prentice-Hall; 2006.

22. DeVellis RF. Scale development: theory and applications. 2nd ed. Thousand Oaks, CA: SAGE; 2003.

23. Bentler PM. On the fit of models to covariances and methodology to the bulletin. Psychol Bull. 1992;112(3):400–4.

24. Fornell C, Larcker DF. Evaluating structural equation models with unobservable variables and measurement error. J Mark Res. 1981;18(1):39–50.

25. Hair JF, Black WC, Babin BJ, Anderson RE. Multivariate data analysis: a global perspective. 7th ed. Upper Saddle River, NJ: Pearson Prentice Hall; 2009.

26. Downing SM. Validity: on meaningful interpretation of assessment data. Med Educ. 2003;37(9):830–7.

27. Foulkes M. Presentation skills for nurses. Nurs Stand. 2015;29(25):52–8.


Acknowledgements

The authors thank all the participants for their kind cooperation and contribution to the study.

Funding

This study was supported by grants from the Ministry of Science and Technology Taiwan (MOST 107-2511-H-255-007), the Ministry of Education (PSR1090283), and the Chang Gung Medical Research Fund (CMRPF3K0021, BMRP704, BMRPA63).

Author information

Authors and Affiliations

Department of Nursing, Chang Gung University of Science and Technology, Division of Pediatric Hematology and Oncology, Linkou Chang Gung Memorial Hospital, Taoyuan City, Taiwan, Republic of China

Yi-Chien Chiang

Department of Nursing, Chang Gung University of Science and Technology, Taoyuan City, Taiwan, Republic of China

Hsiang-Chun Lee & Chia-Ling Wu

Administration Center of Quality Management Department, Chang Gung Medical Foundation, Taoyuan City, Taiwan, Republic of China

Tsung-Lan Chu

Department of Nursing, Chang Gung University of Science and Technology; Administration Center of Quality Management Department, Linkou Chang Gung Memorial Hospital, No.261, Wenhua 1st Rd., Guishan Dist, Taoyuan City, 333 03, Taiwan, Republic of China

Ya-Chu Hsiao


Contributions

All authors conceptualized and designed the study. Data were collected by Y-CH and H-CL. Data analysis was conducted by Y-CH and Y-CC. The first draft of the manuscript was written by Y-CH, Y-CC, and all authors contributed to subsequent revisions. All authors read and approved the final submission.

Corresponding author

Correspondence to Ya-Chu Hsiao .

Ethics declarations

Ethics approval and consent to participate

All study methods and materials were performed in accordance with the Declaration of Helsinki. The study protocol and procedures were approved by the Chang Gung Medical Foundation institutional review board (number: 201702148B0) for the protection of participants' confidentiality. All participants received oral and written explanations of the study and its procedures, and informed consent was obtained from all subjects.

Consent for publication

Not applicable.

Competing interests

No conflict of interest has been declared by the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Chiang, YC., Lee, HC., Chu, TL. et al. Development and validation of the oral presentation evaluation scale (OPES) for nursing students. BMC Med Educ 22 , 318 (2022). https://doi.org/10.1186/s12909-022-03376-w


Received : 25 February 2021

Accepted : 14 April 2022

Published : 26 April 2022

DOI : https://doi.org/10.1186/s12909-022-03376-w


Keywords

  • Nurse educators
  • Nursing students
  • Oral presentation
  • Scale development


clinical presentation evaluation form

IMAGES

  1. Presentation Evaluation Form Templates For 2023

    clinical presentation evaluation form

  2. FREE 37+ Presentation Evaluation Forms in PDF

    clinical presentation evaluation form

  3. presentation evaluation forms Templates

    clinical presentation evaluation form

  4. Clinical Evaluation Form

    clinical presentation evaluation form

  5. Seminar evaluation form

    clinical presentation evaluation form

  6. FREE 5+ Clinical Observation Forms in PDF

    clinical presentation evaluation form

VIDEO

  1. Clinical Evaluation Video 1

  2. Clinical Evaluation 3

  3. Clinical Practice Evaluation 1

  4. Clinical Evaluation 4

  5. Clinical Evaluation 1.2

  6. How to Evaluate a Presentation?

COMMENTS

  1. PDF Formal Presentation Evaluation Form

    Organized and easy to follow. 3. Presenter exhibited a good understanding of topic. 4. Presenter was well-prepared. 5. Presenter spoke clearly/effectively. 6. Time for presentation used effectively.

  2. PDF CQC Presentation Evaluation Form

    Question does not use PICO format. Loose connection between patient and question. Question will provide guidance for clinical care of patient. Question in PICO format including. Patient/population/problem. Intervention. Comparison group. Clinical outcome. Uses key elements of patient's history and plan to justify PICO question.

  3. PDF Focus on Clinical Presentation (00177519)

    PT INITIAL EVALUATIONS: FOCUS ON CLINICAL PRESENTATION. Prepared by Kay Hashagen, PT, MBA, RAC-CT October 2017. Physical therapists must document clinical presentation as one of the four initial evaluation components to support the appropriate evaluation complexity code. The following table shows which clinical presentation relates to each ...

  4. PDF Competency-Based Evaluation Form

    CWRU Clinical Case Presentation . Competency-Based Evaluation Form . Student Name: _____ Date of Case Presentation: _____ Please rate each item on the following scale: - = performance is below our standards, due to incomplete or erroneous information . 0 = performance is adequate and at the level we expect of advanced students ...

  5. PDF Clinical Evaluation Rubric

    KEY. U = Unsatisfactory. N/I = Needs Improvement. S = Satisfactory. E = Exceeds expectations. Unable to develop skills, meet clinical objectives, utilize resources for remediating knowledge deficits, or threatens the safety of the patient. Unable to synthesize theory with clinical practice.

  6. Clinical presentation, evaluation, and diagnosis of the ...

    INTRODUCTION. Acute pulmonary embolism (PE) is a common and sometimes fatal disease. The approach to the evaluation should be efficient while simultaneously avoiding the risks of unnecessary testing so that therapy can be promptly initiated and potential morbidity and mortality avoided [].The clinical manifestations, evaluation, and diagnosis of PE are discussed in this topic.

  7. PDF Oral Case Presentation Evaluation

    5 Physical Examination. Physical exam is precise and includes all pertinent positive and negative findings including: • Vital signs including temperature, blood pressure, pulse (orthostatic /P's when appropriate), respiratory rate, height, weight, MI, and head circumference (for pediatric patients) • General • Skin, hair, nails ...

  8. PDF Clinical Evaluation Form

    Clinical Evaluation Form. In this section, use the performance rating scale provided below to give the student feedback. Consistently misses essential information; poor organization, accuracy, and/or ineffective questioning techniques; does not consider non-verbal cues.

  9. PDF Patient Case Evaluation Form

    Patient Case Evaluation Form. Student name _____ Date _____ Case Number _____. Each criterion (e.g., appropriate length of presentation, patient selection) is scored from preceptor remarks: no remarks = 10 points; minor adjustments needed = 7-9 points; does not meet standards = 6 points or less.
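
    The banded scoring this form describes is easy to make mechanical. Below is a minimal sketch in Python; the band labels and point values come from the snippet above, while the criterion names, data, and function are invented purely for illustration:

        # Map each preceptor remark level to its allowed point band
        # (bands taken from the Patient Case Evaluation Form snippet).
        BANDS = {
            "no remarks": range(10, 11),               # exactly 10 points
            "minor adjustments needed": range(7, 10),  # 7-9 points
            "does not meet standards": range(0, 7),    # 6 points or less
        }

        def validated_points(remark: str, points: int) -> int:
            """Return the awarded points after checking they sit in the remark's band."""
            if points not in BANDS[remark]:
                raise ValueError(f"{points} is outside the band for {remark!r}")
            return points

        # Hypothetical example: two criteria scored by a preceptor.
        scored = [("patient selection", "no remarks", 10),
                  ("length of presentation", "minor adjustments needed", 8)]
        total = sum(validated_points(remark, pts) for _, remark, pts in scored)
        print(total)  # 18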

  10. The Patient Presentation Rating Tool for Oral Case Presentations

    The Patient Presentation Rating (PPR) tool can be used to evaluate oral case presentations, and was developed using focus groups of medical school educators from several medical disciplines including pediatrics, internal medicine, psychiatry, surgery, and neurology. Methods: The PPR includes 18 items, comprising six sections and an overall ...

  11. How to present patient cases

    Presenting patient cases is a key part of everyday clinical practice. A well delivered presentation has the potential to facilitate patient care and improve efficiency on ward rounds, as well as a means of teaching and assessing clinical competence.1 The purpose of a case presentation is to communicate your diagnostic reasoning to the listener, so that he or she has a clear picture of the ...

  12. PDF EVALUATION FORM ORAL PRESENTATION

    Excerpted rubric anchors: ancillary clinical data to work into topic presentation; read extensively, communicated findings, and actively taught throughout presentation; insufficient observation/data (2); demonstrates knowledge of pathophysiology of disease and treatment. Each item is graded Fail = 0, Low Pass = 1, Pass = 2, High Pass = 3, Honors = 4, or N/A. Presentation ...
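
    When a scale like this is tallied numerically, N/A items are usually excluded from the average rather than counted as zero. A small illustrative sketch assuming that convention (the scale values are from the form above; the grades and function are hypothetical):

        # Average the graded items on the Fail = 0 ... Honors = 4 scale,
        # skipping anything marked N/A (a common, but assumed, convention).
        SCALE = {"Fail": 0, "Low Pass": 1, "Pass": 2, "High Pass": 3, "Honors": 4}

        def mean_score(grades):
            scored = [SCALE[g] for g in grades if g != "N/A"]
            return sum(scored) / len(scored)

        print(mean_score(["Pass", "Honors", "N/A", "High Pass"]))  # 3.0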

  13. PDF TEMPLATE FOR CLINICAL CASE REVIEW AND PRESENTATION

    Clinical impression/findings. Medical problems and self-reported (verbalised) problems. Findings on examination. Holistic assessment (Te Whare Tapa Wha: spiritual, physical, psychosocial, and emotional issues the patient and/or whanau identified). Holistic approach to assessment and identification of palliative care needs. Resolution of issues identified.

  14. Forms, Templates & Examples

    Annual Program Evaluation Data Checklist. APE Documentation Templates (Sign-in Sheet & Agenda, Meeting Minutes, and Approval of Action Plan) APE Prep Instructions for Program Directors (Step-By-Step Instructions & APE Checklist) A Quick Method to Analyze Program Evaluations. APE Powerpoint Presentation Example.

  15. Clinical Evaluation Report Template

    The clinical evaluation affirms compliance with relevant safety and performance requirements (Regulation (EU) 2017/745, ANNEX I, clauses 1 and 8). Overall, the clinical safety, performance, and benefits demonstrate that the <medical device> aligns with current knowledge and technological standards. Conclusions: The clinical evaluation confirms ...

  16. PDF Department of Pharmacy Practice and Administration

    Presentation Content: presents a complete history and physical; discusses pertinent laboratory data; understands pharmacology, pharmacodynamics, adverse effects, etc., of relevant drugs; exhibits understanding of patient's hospital course; demonstrates individualization of dosing regimen; identifies monitoring parameters for relevant drugs; presents ...

  17. Clinical Forms

    Evaluation Forms. Mid-Rotation Evaluation Form (PDF) Third Year Preceptor Evaluation Form (PDF) Fourth Year Preceptor Evaluation Form (PDF) Case Report Grading Rubric (PDF) Research Project Report Grading Rubric (PDF) Excused Absence Forms. OMS 3 and 4 Request for an Unplanned Excused Absence (PDF) OMS 3 and 4 Request for a Planned Excused ...

  18. Evaluating Oral Case Presentations Using a Checklist

    Table 1: Percent Agreement Among Evaluators on Whether Oral Case Presentation Checklist Items Were Completed, The George Washington University School of Medicine and Health Sciences, 2010*. The mean total score assigned by student-evaluators (76.6%, SD 11.8%) was slightly higher than the mean score assigned by faculty (74.0%, SD 10.2%; P = .02).
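
    Percent agreement, as reported in the table this snippet cites, is straightforward to reproduce. A minimal sketch with invented data (only the metric itself comes from the excerpt):

        # Percent agreement between two evaluators marking the same checklist
        # items as completed (True) or not (False). The data here is fabricated
        # purely to illustrate the calculation.
        student = [True, True, False, True, False, True]
        faculty = [True, False, False, True, False, True]

        agreed = sum(s == f for s, f in zip(student, faculty))
        print(f"{100 * agreed / len(student):.1f}%")  # 83.3%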

  19. PDF Psychiatry Clinical Skills Evaluation Form (CSV v.2)

    Psychiatry Clinical Skills Evaluation Form (CSV v.2). Patient Type. Physician-Patient Relationship (overall), rated Unacceptable to Acceptable: develops rapport with patient ... Case Presentation (overall), rated Unacceptable to Acceptable: organized and accurate presentation of history.

  20. Developing a Beginner's Guide to Writing a Clinical Case Report: A

    Junior doctors were invited by email to look at the online presentation and complete an online evaluation form thereafter. The questions were adapted from the Evaluation Form for Teaching and Presentations provided by the Joint Royal Colleges of Physicians Training Board. Data was analysed both quantitatively and qualitatively. Results

  21. 12 Free Presentation Evaluation Forms (What to Include)

    Purpose of Presentation Evaluation Form. An evaluation form allows you to give a critical review and evaluation of a presentation. Different aspects of the presentation are judged as part of the evaluation; this includes the presenter's effectiveness and efficiency in imparting information, body language, enthusiasm, volume, modulation, ease of flow, clarity of speaking, and the presenter ...

  22. Development and validation of the oral presentation evaluation scale

    Oral presentations are an important educational component for nursing students and nursing educators need to provide students with an assessment of presentations as feedback for improving this skill. However, there are no reliable validated tools available for objective evaluations of presentations. We aimed to develop and validate an oral presentation evaluation scale (OPES) for nursing ...

  23. PDF Poster Presentation Evaluation Form

    POSTER PRESENTATION EVALUATION FORM. Please note: this form will be given to the presenter after the event to provide feedback. Presenter(s): Title: Session & Time: Please mark the score for each evaluation criterion below. When you are finished, combine the total points at the bottom for the overall score. Format.