Citation
Design, Development, and Validation of a Virtual Radiation Oncology Clinic for Error Training

Material Information

Title:
Design, Development, and Validation of a Virtual Radiation Oncology Clinic for Error Training
Creator:
Willoughby, Twyla R
Place of Publication:
[Gainesville, Fla.]
Florida
Publisher:
University of Florida
Publication Date:
2014
Language:
English
Physical Description:
1 online resource (244 p.)

Thesis/Dissertation Information

Degree:
Doctorate (Ph.D.)
Degree Grantor:
University of Florida
Degree Disciplines:
Biomedical Engineering
Committee Chair:
BOVA,FRANK J
Committee Co-Chair:
GILLAND,DAVID R
Committee Members:
MULLER,KEITH E
DANA,THOMAS M
DVORAK,TOMAS
MEEKS,SANFORD L
Graduation Date:
5/3/2014

Subjects

Subjects / Keywords:
Distance functions ( jstor )
Dosage ( jstor )
Error rates ( jstor )
Flow charting ( jstor )
Image files ( jstor )
Physicians ( jstor )
Radiation oncology ( jstor )
Radiotherapy ( jstor )
Simulations ( jstor )
Software ( jstor )
Biomedical Engineering -- Dissertations, Academic -- UF
radiation -- simulation -- virtual
Genre:
bibliography ( marcgt )
theses ( marcgt )
government publication (state, provincial, territorial, dependent) ( marcgt )
born-digital ( sobekcm )
Electronic Thesis or Dissertation
Biomedical Engineering thesis, Ph.D.

Notes

Abstract:
Virtual reality systems have been widely accepted in medical training and other high risk professions because they allow the user a full opportunity to work independently and to encounter errors in a safe and cost effective manner. A Virtual Radiation Oncology Clinic (VROC) was designed and developed to allow self-guided training for radiation oncology physicians and staff. The design involved identifying the key elements of the radiation oncology process from the physician's perspective. These elements include tumor volume identification, dose prescription, plan evaluation, and radiation dose verification. Feedback opportunities and specific metrics to use within the VROC were identified through a review of the literature along with a review of the overall workflow in radiation oncology. Recommended feedback includes contour evaluation and feedback about the quality of the overall treatment. The VROC includes a tool to assist in error analysis once a patient treatment is completed. Development included creating applications to generate feedback metrics, provide error analysis, and simulate the daily radiation treatment information. Once the VROC was designed and developed it was used to simulate a set of sample errors based on those reported in the literature. Tests of the VROC involved verifying the accuracy of the simulated daily treatment information compared to actual treatment information and analyzing the feedback metrics to be able to score the severity of the simulated error. Final face validation included evaluating the overall realism and utility of the VROC tool. Construct validation tests are proposed for the next implementation of the VROC system. ( en )
General Note:
In the series University of Florida Digital Collections.
General Note:
Includes vita.
Bibliography:
Includes bibliographical references.
Source of Description:
Description based on online resource; title from PDF title page.
Source of Description:
This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Thesis:
Thesis (Ph.D.)--University of Florida, 2014.
Local:
Adviser: BOVA,FRANK J.
Local:
Co-adviser: GILLAND,DAVID R.
Electronic Access:
RESTRICTED TO UF STUDENTS, STAFF, FACULTY, AND ON-CAMPUS USE UNTIL 2015-05-31
Statement of Responsibility:
by Twyla R Willoughby.

Record Information

Source Institution:
UFRGP
Rights Management:
Applicable rights reserved.
Embargo Date:
5/31/2015
Classification:
LD1780 2014 ( lcc )

Full Text

PAGE 1

1 DESIGN, DEVELOPMENT, AND VALIDATION OF A VIRTUAL RADIATION ONCOLOGY CLINIC FOR ERROR TRAINING By TWYLA R. WILLOUGHBY A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY UNIVERSITY OF FLORIDA 2014

PAGE 2

2 © 2014 Twyla R. Willoughby

PAGE 3

3 To my mom, who has been a source of encouragement and support

PAGE 4

4 ACKNOWLEDGMENTS

I would like to thank my advisor, Dr. Frank Bova, for being willing to take a risk on an unconventional graduate student and also on an unconventional idea. His ideas, time, and effort have made this possible. There were several times when I got stuck on an idea or a concept, and he helped to keep things moving forward. I look forward to opportunities in the future to continue this work and to collaborate on other possibilities. I also acknowledge Dr. Sanford Meeks, along with my colleagues at the UF Health Cancer Center at Orlando Health, who have been a continual source of encouragement and patience. They have provided me with the latitude to keep working on this project and have been a sounding board for different ideas. Without his support and the support of the physics department under his leadership, I would not have made it this far. I also thank the doctors I have had the privilege of working with over the past several years on this project: Dr. Patrick Kupelian, for helping me to get started down this path several years ago; Dr. Tomas Dvorak, who was willing to serve on the committee; and Dr. Patrick Kelly, for being willing to listen to some of my ideas and provide feedback. Without the physicians, there would be no concept for this project, and the entire work would be meaningless. I thank them for their willingness to listen to new ideas and provide guidance and direction on this project. I thank the other members of my graduate committee, Dr. Dana, Dr. Gilland, and Dr. Muller, who have been willing to listen to these ideas and offer insight and suggestions on how best to approach different aspects of this project.

PAGE 5

5 I especially thank my mom, who has supported me through this entire process. She has come to my aid many times to take care of things at my house and has also helped out with things so I could work late or spend weekends working on this project. She has been a source of encouragement through all of the difficulties and has sacrificed being able to do many things she would want to do because I could not take the time. Finally, I must acknowledge God, who has encouraged me that it is important to continue developing whatever skills or talents He has given. I am sure I would not have stuck with this process all this time had I not felt it was part of a bigger plan.

PAGE 6

6 TABLE OF CONTENTS

ACKNOWLEDGMENTS .......... 4
LIST OF TABLES .......... 9
LIST OF FIGURES .......... 11
LIST OF OBJECTS .......... 12
LIST OF ABBREVIATIONS .......... 13
ABSTRACT .......... 16

CHAPTER

1 INTRODUCTION .......... 18

2 VROC REQUIREMENT LISTING .......... 29
    Simulation Overview .......... 29
    Radiation Oncology Training Requirements .......... 34
    Instructional Design in Simulation Training .......... 36
    Summary .......... 43

3 RADIATION ONCOLOGY .......... 46
    Radiation Oncology Workflow .......... 46
        Consultation .......... 47
        Radiation Simulation: (Planning Preparation) .......... 48
        Treatment Planning .......... 49
        Treatment Preparation .......... 51
        Treatment & Verification .......... 52
        Radiation Oncologist Workload .......... 55
        Data and Applications Used in Radiation Oncology .......... 56
    Errors in Radiation Oncology .......... 57
    Metrics in Radiation Oncology .......... 61
        Contouring Metrics .......... 62
        Metrics for Dosimetry in Radiation Oncology .......... 64
        Metrics for Error Reporting .......... 69
    Summary .......... 70

4 DEVELOPMENT OF A VIRTUAL RADIATION ONCOLOGY CLINIC .......... 83
    Materials .......... 84
    Methods .......... 86

PAGE 7

7       Virtual Patient (Figure 4-1 A) .......... 87
        Scenario Request (Figure 4-1 B) .......... 92
        Contour & Prescription Comparison (Figure 4-1 C & D) .......... 93
        Plan Comparisons Metrics (Figure 4-1 E) .......... 95
        Virtual Treatment Machine (Figure 4-1 F) .......... 96
        Final Plan Report (Figure 4-1 G) .......... 102
        Final Error Exercise and Debrief (Figure 4-1 H) .......... 103
    Documentation .......... 104
    Summary .......... 105

5 ERROR SIMULATION TESTING .......... 115
    Background .......... 115
    Methods .......... 118
        Errors in Pre-Planning and Treatment Planning .......... 118
        Errors in Patient Treatments .......... 119
            Daily orthogonal kV images .......... 122
            Portal images .......... 124
            CBCT simulation .......... 125
    Results .......... 126
        Treatment Dosimetric Errors .......... 126
        Daily Orthogonal kV Images .......... 126
        Portal Images .......... 128
        CBCT Verification Images .......... 129
    Conclusions .......... 130

6 RADIATION METRICS AND RADIATION ERRORS .......... 141
    Methods .......... 141
        Metric Calculations .......... 144
        Error Scoring .......... 145
    Results .......... 146
        Contouring Metrics .......... 146
        Error Scoring .......... 147
        Metric Analysis .......... 149
            Head and neck data comparison .......... 149
            Multiple patient metric analyses .......... 153
            Combination of all error sets .......... 153
    Conclusions .......... 154

7 ROOT CAUSE ANALYSES AND MEDICAL EDUCATION .......... 163
    Background .......... 163
    Development of RCA Exercise .......... 168
    Example RCA .......... 173
        Sample Therapists' Reports .......... 174
        Scoring the Error .......... 175

PAGE 8

8       Causation Table .......... 177
        Propose a Quality Improvement .......... 178
    Summary .......... 178

8 ACCEPTANCE TEST AND INITIAL USE OF VROC .......... 181
    Background .......... 181
    Methods .......... 184
    Results .......... 185
        Liked Best .......... 187
        Liked Least .......... 187
        Other Recommendations .......... 188
    Summary .......... 188

9 CONCLUSIONS AND FUTURE WORK .......... 194
    Review of VROC System .......... 194
    Future Studies .......... 197
        Simulation Training for Residents .......... 197
        Metrics & Feedback .......... 199
        Root Cause Analysis .......... 201
    Final Comments .......... 203

APPENDIX

A DETAILED DESIGN TABLE .......... 204
B FILM ANALYSIS INFORMATION .......... 207
C METRIC CALCULATION INFORMATION .......... 211
D HEAD AND NECK METRIC CALCULATIONS .......... 216
E VARIETY CASE METRIC CALCULATIONS .......... 223
F INITIAL VROC CASES .......... 228
G RCA EXERCISE FORM .......... 231

LIST OF REFERENCES .......... 235
BIOGRAPHICAL SKETCH .......... 244

PAGE 9

9 LIST OF TABLES

1-1 Varian Medical Systems commercial software used for VROC simulation .......... 26
1-2 Software and tools developed to create the VROC simulation .......... 27
3-1 Error distribution from ROSIS system .......... 79
3-2 Summary of ROSIS results by category .......... 81
3-3 Dosimetric severity score .......... 82
3-4 Consequence severity score .......... 82
4-1 Varian software modules and their use within VROC .......... 111
4-2 Files needed to create a virtual RO patient .......... 112
4-3 Example results of contour comparison report .......... 113
4-4 Summary of developed VROC tools and applications .......... 114
5-1 Difference between actual and simulated images .......... 140
6-1 Data calculated .......... 159
6-2 Example of calculated metrics for a single head and neck case .......... 160
6-3 Average consequence severity vs. average combined metrics .......... 161
7-1 VROC sample treatment report .......... 180
7-2 Expanded VROC treatment report .......... 180
8-1 VROC acceptance test .......... 190
8-2 Summary of initial reviewers .......... 192
A-1 VROC workflow development chart in table format .......... 204
B-1 Image and registration offsets in kV/kV image testing .......... 208
B-2 Difference for each data point between actual and simulated films .......... 209
B-3 Alignment compared to known isocenter (actual film) .......... 209
B-4 Alignment compared to known isocenter (simulated films) .......... 209

PAGE 10

10 B-5 Alignment compared to known isocenter (actual CBCT) .......... 210
B-6 Alignment compared to known isocenter (simulated CBCT) .......... 210
C-1 Values used in the EUD and TCP for target calculations .......... 212
C-2 Values used in the EUD and NTCP for normal tissues .......... 213
C-3 Recommended TS and SI values for calculating DD .......... 215
D-1 Head and neck error scores .......... 219
D-2 Target values for head and neck cases .......... 220
D-3 EUD and NTCP for ipsilateral parotid head and neck cases .......... 221
D-4 EUD and NTCP for spinal cord head and neck cases .......... 222
E-1 Variety case disease sites and errors .......... 224
E-2 Error scores for variety cases .......... 225
E-3 Target metrics for variety cases .......... 226
E-4 Normal tissue metrics for variety cases .......... 227
F-1 Patients available in initial VROC .......... 228

PAGE 11

11 LIST OF FIGURES

1-1 Radiation oncology workflow versus time and workload .......... 28
3-1 Workflow of radiation oncology team member responsibilities .......... 73
3-2 Workflow of radiation oncology overview .......... 74
3-3 Workflow of radiation oncology data elements .......... 75
3-4 Workflow of radiation oncology common errors .......... 76
3-5 Workflow of radiation oncology feedback and metrics .......... 77
3-6 Sample dose volume histogram .......... 78
4-1 VROC development overview .......... 108
4-2 VROC components .......... 109
4-3 Illustration of how setup errors are simulated .......... 110
5-1 Phantom setup for virtual treatment machine verification .......... 134
5-2 Simulated orthogonal kV images without patient setup error .......... 135
5-3 Actual orthogonal kV images without patient setup error .......... 135
5-4 Simulated orthogonal kV images with a setup error .......... 136
5-5 Actual orthogonal kV images with a setup error .......... 136
5-6 Simulated portal film, gantry at 0 degrees, without patient setup error .......... 137
5-7 Actual portal film, gantry at 0 degrees, without patient setup error .......... 137
5-8 Simulated portal film, gantry at 45 degrees, with patient setup error .......... 138
5-9 Actual portal film, gantry at 45 degrees, with patient setup error .......... 138
5-10 Example of a simulated CBCT image registration .......... 139
6-1 Absolute change in conformity vs. average severity score .......... 162
8-1 VROC-specific workflow for a single trainee and single virtual patient .......... 193

PAGE 12

12 LIST OF OBJECTS

4-1 VROC Director User Manual .......... 88
4-2 VROC Scenario Request Form .......... 93
4-3 VROC Instructor User Manual .......... 93
4-4 VROC Trainee User Manual .......... 94
4-5 VROC Prescription Form .......... 94
4-6 RCA Exercise .......... 104
9-1 CTCAE v4.0 .......... 200

PAGE 13

13 LIST OF ABBREVIATIONS

3DRT    3D radiation therapy
5 Whys    Asking "why" five times in series to determine the root cause of the error
AAPM    The American Association of Physicists in Medicine
ABR    American Board of Radiology
ACGME    Accreditation Council for Graduate Medical Education
ACR    American College of Radiology
AHRQ    Agency for Healthcare Research and Quality
ARRO    Association of Residents in Radiation Oncology
ASTRO    American Society for Radiation Oncology (formerly the American Society for Therapeutic Radiology and Oncology)
ATP    Acceptance Test Procedure
CME    Continuing Medical Education: required educational hours for many medical licensures
CT    Computed Tomography: a methodology for 3D x-ray imaging
DD    Detrimental Dose: calculated by summing the weighted dose change per structure for dose errors
DICOM    Digital Imaging and Communications in Medicine: a standard for medical imaging files that allows them to be transferred from one system to another
Dosimetrist    A professional within the radiation oncology department who is specifically trained in the planning and calculation of radiation oncology treatment plans
DRR    Digitally Reconstructed Radiograph: an x-ray image that is reconstructed from a CT scan
DVH    Dose Volume Histogram
EMR    Electronic Medical Record: a computer system and database specifically for medical purposes that contains specific medical data

PAGE 14

14 EUD    Equivalent Uniform Dose
Gray (Gy)    Dose unit for radiation energy deposited
IGRT    Image-Guided Radiation Therapy: the use of daily localization imaging to guide the radiation treatment
IMRT    Intensity-Modulated Radiation Therapy
JCAHO    Joint Commission on Accreditation of Healthcare Organizations: currently uses the name The Joint Commission
Linac    Linear accelerator: a device that accelerates particles and can be used as a technique for creating therapeutic radiation
MLC    Multi-leaf Collimator: a device used to shape the radiation beam from the machine
MOC    Maintenance of Certification
MU    Monitor Unit: a unit of radiation treatment machine output that is related to the dose delivered
NCI    National Cancer Institute
NCPS    National Center for Patient Safety
NIH    National Institutes of Health
NPSF    National Patient Safety Foundation
NTCP    Normal Tissue Complication Probability
NTCP_tot    A combined NTCP calculated for the purposes of this study as NTCP_tot = 1 - prod(1 - NTCP_n) over all structures n, i.e., one minus the product of (1 - NTCP) for each structure
OAR    Organ at Risk: used in treatment planning to describe the non-target tissues
OBI    On-Board Imaging: the name Varian Medical Systems uses to describe the attached kV imaging system on their high-energy linear accelerator treatment machines
PQI    Practice Quality Improvements
PRO    Practical Radiation Oncology: a journal published by Elsevier specifically addressing clinical issues within radiation oncology

PAGE 15

15 R&V    Record and Verify system: a computer system used within radiation oncology to record the specific details of the radiation treatment; the system interfaces with the treatment machine to capture and verify machine settings
RCA    Root Cause Analysis
RO    Radiation Oncology
RO-ILS    Radiation Oncology Incident Learning System
TCP    Tumor Control Probability
TJC    The Joint Commission: the organization that accredits hospitals for Medicare reimbursement; previously called JCAHO
TPS    Treatment Planning System: a computer system used to develop radiation treatment plans on a patient-by-patient basis
UbD    Understanding by Design: a methodology for instructional design
UID    Unique Identifier value: a specific number used within the DICOM file to identify where the image was created
VR    Virtual Reality
VROC    Virtual Radiation Oncology Clinic: developed specifically for this project, this is a novel simulator that simulates the radiation oncology workflow for physician trainees or residents
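To make the NTCP_tot definition above concrete, the combined probability is one minus the product of (1 - NTCP) over all structures. The short MATLAB fragment below is an illustrative sketch only; the variable names are assumptions and it is not code from this work.

% Combine per-structure normal tissue complication probabilities into NTCP_tot,
% following the definition above: NTCP_tot = 1 - prod(1 - NTCP_n) over structures n.
ntcp_per_structure = [0.05 0.10 0.02];          % example NTCP values for three organs at risk
ntcp_tot = 1 - prod(1 - ntcp_per_structure);    % = 0.1621 for the example values
fprintf('Combined NTCP_tot = %.4f\n', ntcp_tot);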

PAGE 16

16 Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

DESIGN, DEVELOPMENT, AND VALIDATION OF A VIRTUAL RADIATION ONCOLOGY CLINIC FOR ERROR TRAINING

By Twyla R. Willoughby

May 2014

Chair: Frank Bova
Major: Biomedical Engineering

Virtual reality systems have been widely accepted in medical training and other high-risk professions because they allow the user a full opportunity to work independently and to encounter errors in a safe and cost-effective manner. A Virtual Radiation Oncology Clinic (VROC) was designed and developed to allow self-guided training for radiation oncology physicians and staff. The design involved identifying the key elements of the radiation oncology process from the physician's perspective. These elements include tumor volume identification, dose prescription, plan evaluation, and radiation dose verification. Feedback opportunities and specific metrics to use within the VROC were identified through a review of the literature along with a review of the overall workflow in radiation oncology. Recommended feedback includes contour evaluation and feedback about the quality of the overall treatment. The VROC includes a tool to assist in error analysis once a patient treatment is completed. Development included creating applications to generate feedback metrics, provide error analysis, and simulate the daily radiation treatment information. Once the VROC was designed and developed, it was used to simulate a set of sample errors based on those reported in the

PAGE 17

17 literature. Tests of the VROC involved verifying the accuracy of the simulated daily treatment information compared to actual treatment information and analyzing the feedback metrics to be able to score the severity of the simulated error. Final face validation included evaluating the overall realism and utility of the VROC tool. Construct validation tests are proposed for the next implementation of the VROC system.

PAGE 18

18 CHAPTER 1
INTRODUCTION

The field of radiation oncology has changed rapidly over the past several years due to advances in imaging and computer-controlled delivery technologies. While these technologies have created the ability to deliver more precisely placed high doses of radiation, they have introduced new avenues for errors to occur that are often more difficult to detect and could have severe consequences. It is known that these errors can occur, yet residents have limited exposure and therefore limited training opportunities to develop the skill sets necessary to detect, correct, and, most importantly, develop systems to prevent such errors. Upon graduation from his or her residency program, the clinician will have little if any experience in appropriate root cause analyses, while at the same time having to accept full responsibility for bringing such analysis to the practice he or she joins. The most effective way to learn about such errors is to provide the resident with real-life experiences, allowing him or her the opportunity to learn from his or her own mistakes. However, in most training programs mature and robust systems are in place to detect and correct any errors that occur before they can cause harm to a patient. Additionally, to enhance training and provide an environment where such errors can be prevented, the number of patients that a resident manages is limited. In order to provide physicians an opportunity to learn to manage a large number of patients and to be allowed to develop the skill sets needed to detect errors, to realize the ramifications of such errors, and to participate in root cause analysis, a simulation tool unique to the discipline of radiation oncology is being developed.

PAGE 19

19 Simulation is the process of replicating a procedure or technique within a computer or with the use of a model in order to provide hands-on practice. Simulation has been widely used in other high-risk fields such as aviation and surgery. Radiation oncology presents a unique environment for the introduction of a simulation that we are calling a Virtual Radiation Oncology Clinic (VROC). Figure 1-1 is a diagram of the radiation oncology process, with a brief description of the workflow. The consultation represents the initial physician encounter with the patient. The preparation stage is the acquisition of a set of medical images that are used to generate a computer model of the patient. Treatment planning is the process of performing dose calculations on the computer model in order to determine the directions from which to aim the radiation. Plan preparation includes the process of checking the plan and converting it into machine parameters for delivery on the treatment machine. Treatment delivery includes patient setup on the treatment machine as well as the delivery of the given dosage of radiation treatment. To illustrate the complexity of the process, an average timeframe from consultation to completion is illustrated at the top of Figure 1-1. This shows that it is not uncommon for patient treatment to occur over many weeks or even months. During this time, the physician must rely on others to carry out parts of the process, and at any given time he or she will be managing several patients who are at different stages of the treatment process. The average numbers of patients seen are indicated at the bottom of Figure 1-1. Because of the duration of the radiation process and the workloads, an example workload for a physician might include consultations on 3 patients, starting the planning process on a different set of 3 other patients, reviewing treatment plans for an

PAGE 20

20 additional 3 patients, and reviewing treatments for at least 15 additional patients who are receiving daily treatments (www.acr.org) (1-3). The modern-day radiotherapy clinic leverages many imaging technologies that allow the clinician to verify the treatment position through the acquisition of digital images. In addition to the initial digital images and verification images, most of the machine settings are recorded digitally in order to create a real-time record of the patient treatments. Because of the digital nature of the patient record, and the prolific use of computer models for patient data, radiotherapy presents an ideal opportunity to create a virtual clinic for use in training medical residents, specifically on the unique situations of errors that could occur when new technology is implemented, when instructions are not clear, or when procedures are not followed. Quality assurance within the radiation oncology environment is concerned with both the accuracy of the delivery technologies and creating a safe overall process for treatment. The newly reported guideline for radiation oncology quality assurance serves as a good reference for both historical practice and new guidelines (4). Historically, medical physics focused on improving the quality of treatment by improving the accuracy of calculations and measurements of the radiation delivered, with specific emphasis on machine quality assurance as specified by the publications of the American Association of Physicists in Medicine (AAPM) (5-7). Improvements in the models, the computer calculations describing radiation interactions, and programming techniques have significantly improved the accuracy of dose calculations. However, because of the complexity of the algorithms, they may also obscure errors or have a tendency to cause users to become overly confident about

PAGE 21

21 the results from the computer system, which could lead to errors. Historically, plans for radiation were simple enough that rules of thumb or simple hand calculations could be used to verify that the plan met the prescribed goals. With more sophisticated and complicated techniques, this type of check is not possible. As new technology has developed, quality assurance programs have evolved to meet the needs and to prevent errors from occurring (8-9). Because they have developed over time, and sometimes as a reaction to specific errors, it may not be obvious why different quality assurance procedures are in place. The new trainee lacks the experience to see how these quality assurance programs have developed over time, and he or she may not appreciate the types of errors that could occur if one of these programs is changed or the process breaks down. In some large departments, the resident may be so removed from the quality assurance process that he or she may not even know what quality assurance processes are in place. Recommendations for minimizing errors within radiation oncology often reference the success of error mitigation within the fields of aviation, surgery, and anesthesiology (10-12). These specialties have pioneered the use of simulations in training to supplement curricula on errors and to provide opportunities to practice how to handle situations and to gain first-hand experience of the ramifications of breakdowns in the quality assurance process that can lead to errors. While the radiation oncology process typically requires many days for completion, each aspect of the process can be partitioned into a series of separate procedures, any one of which could allow the introduction of an error. To date, a simulation of the entire radiation oncology workflow

PAGE 22

22 that can simulate errors in radiation oncology has not been used to train users on error mitigation, despite successes in other professions. The design, development, and validation of a Virtual Radiation Oncology Clinic (VROC) simulator aimed at physician residents would provide a much-needed tool within the educational process. This tool would allow residents to see the ramifications of errors as well as to practice their skill in evaluating treatment plans and verifying treatment delivery. As training sets are developed, this would be an opportunity to expose the resident to many different errors that have occurred through the years and to underscore why the quality assurance measures are in place. An added benefit of simulation training is the opportunity to provide direct feedback to the resident about his or her performance compared to expectations. Ideally, metrics that could be calculated, along with a scale based on the dose of radiation to different structures, could be used to score a specific plan, which could then be compared to a standard plan. It might also be possible to score the delivery of a plan with errors as compared to one without errors. Various metrics have been calculated for radiation treatment plans; however, there is no consensus on what, if any, of these metrics should be reported in the event of an error. Several metrics can possibly be used to help describe errors, and these may also be used to provide quantitative feedback to residents and other trainees about the quality of the care they give. The task of designing, developing, and validating the VROC required the following three developmental steps. The first step was to determine the elements required in the design of a virtual radiation oncology clinic. This included process mapping of radiation oncology, a review of errors in radiation and their frequency, and a

PAGE 23

23 review of metrics that could be used for feedback within the simulation. The second step was to develop a realistic VROC able to simulate errors and provide feedback. The third step was to verify that the system functions properly and to validate the feedback metrics and the overall simulation of the radiation oncology workflow. Chapter 2 presents background information on simulations in radiation oncology and in medical training. This includes a review of radiation oncology training and of medical error training. Also considered are successful instructional design elements related to the use of simulations in education. Recommendations from previous studies were used to develop a list of simulation requirements for the Virtual Radiation Oncology Clinic. Chapter 3 identifies the specific radiation oncology components and requirements for the design of the VROC. This includes mapping the radiation oncology workflow from the physician viewpoint. A review of the literature on radiation oncology errors was used to evaluate the type and frequency of errors that would be required within the VROC system. A review of the literature was also performed to recommend metrics that could be used to evaluate the quality of a plan or treatment compared to specific standards. Chapter 4 details the development of the VROC. The VROC system was developed by creating processes and developing computer code that work alongside commercially available software. The commercial clinical software provides a realistic user interface for the resident. Table 1-1 lists the commercial software that was used within the VROC. Developed computer code was written to calculate feedback metrics and to simulate treatment verification images and files to be loaded into the commercial

PAGE 24

24 software during a simulation. Additionally, a set of web-enabled worksheets was created to provide an avenue to request specific simulation scenarios and to create an exercise for the trainee in how to conduct a root cause analysis. A set of user documents was created to assist the resident, the instructors, and the person managing the VROC in its overall use. Table 1-2 summarizes the developed products that, alongside the commercial software, were used to create the VROC that will be described in detail in Chapter 4. Tests were done to verify the functionality of the developed VROC system. Chapter 5 reports data from a study that was done to test the accuracy of the simulated treatment verification information. The verification images that were created to simulate patient treatments were compared against images recreated using an imaging phantom. Chapter 6 includes all metrics simulated on a variety of different virtual patients with a variety of simulated errors using the VROC. A study was done to compare the metrics to physician scoring of the severity of the error in order to recommend specific values for feedback within the VROC. One of the primary functions of the VROC was to provide training regarding errors in radiation oncology. As such, it was necessary to create an exercise for the residents to complete once they encountered a simulated error within the VROC system. Chapter 7 describes the development and construction of the root cause analysis exercise that was incorporated within the VROC system. Most residents have heard lectures on the root cause analysis process. However, in a clinical environment, any errors that occur will be managed by the risk management team and the quality assurance team and will often exclude trainees. As a teaching tool, one of the main

PAGE 25

25 goals of the VROC system is to provide exercises on root causes and to force the trainee to think about added quality assurance measures or processes to prevent errors. Chapter 8 describes initial face validation of the VROC system. An acceptance test of the VROC system was first performed to show that all of the necessary aspects of the VROC were functional. Validation then included review of the VROC by a set of staff familiar with the radiation oncology workflow to indicate whether the VROC was realistic and to gather initial impressions and feedback for future studies. Areas that were marked as needing improvement were noted for future development. Chapter 9 includes a final summary of the work that was done to develop the VROC system. Results of initial testing are summarized. Future goals include further developing the automation of the VROC, which would provide greater utility within a training facility. Additional studies to validate the error scoring system as well as the overall VROC system would require the adoption of the VROC within a training facility. Other utilities for future work with the VROC include the use of the VROC for testing competencies or for credentialing. Additional expansions of the VROC may include utility for other users within radiation oncology, specifically the medical physicist, dosimetrist, and radiological technologist.
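As an illustration of the kind of processing the developed code performs when simulating treatment verification data (summarized in Table 1-2), the MATLAB sketch below shows one way a patient setup error could be injected into an existing DICOM image series by shifting the recorded image position and issuing new instance UIDs. This is a hedged sketch only, assuming standard DICOM headers and the Image Processing Toolbox; the function name and details are illustrative and are not taken from the dissertation's actual tools.

function simulate_setup_shift(in_dir, out_dir, shift_mm)
% Illustrative sketch (not the dissertation's Create_CBCT.m): copy an image
% series, translate the DICOM ImagePositionPatient of every slice by
% shift_mm = [x y z] in millimeters to mimic a patient setup error, and
% assign new SOP Instance UIDs so the shifted series imports as new data.
files = dir(fullfile(in_dir, '*.dcm'));
for k = 1:numel(files)
    src  = fullfile(in_dir, files(k).name);
    info = dicominfo(src);                                   % read header
    img  = dicomread(info);                                  % read pixel data
    info.ImagePositionPatient = info.ImagePositionPatient + shift_mm(:);
    info.SOPInstanceUID = dicomuid;                          % new unique identifier
    dicomwrite(img, fullfile(out_dir, files(k).name), info, 'CreateMode', 'Copy');
end
end

For example, simulate_setup_shift('cbct_in', 'cbct_out', [5 0 0]) would mimic a 5 mm lateral setup error; the shifted series could then be loaded into the virtual patient for the trainee to detect during image review.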

PAGE 26

26 Table 1-1. Varian Medical Systems commercial software used for VROC simulation

Varian Name | Module | Description
Electronic Medical Record (EMR) | Patient Manager | Review patient demographics, documents, test results, and radiation treatment summary
Electronic Medical Record (EMR) | Time Planner | Scheduling patient appointments and treatments; can be sorted by location or staff
Record and Verify System (R&V) | RT Chart | Record treatment goals and prescriptions; used to track delivered treatments
Record and Verify System (R&V) | Offline Review | Review of portal images and verification treatment images
Radiotherapy Treatment Planning System (TPS) | Eclipse | Contouring, treatment plans, prescriptions, and calculations

PAGE 27

27 Table 1-2. Software and tools developed to create the VROC simulation

Tool | Type | Description

User Manuals
Trainee user manual | Document | Manual for trainee or resident
Instructor user manual | Document | Manual for instructor
Director user manual | Document | Manual for director

Define Virtual Patient
Scenario request | Form | Request for a specific scenario
Virtual patient checklist | Document | Checklist to create a virtual patient

Virtual Treatment Machine
Create_MV.m | MATLAB | Generates simulated portal image
Create_kV.m | MATLAB | Generates simulated kV/kV images and registration file
Create_CBCT.m | MATLAB | Shifts registration and changes the DICOM header of an existing CBCT image set to be loaded into a virtual patient
Create_RT.m | MATLAB | Creates an RT_TREATMENT record representing verified beam delivery

Simulation Feedback
Prescription form | Form |
Contour_comparison.m | MATLAB | Compares two RT_STRUCTURE files
Metrics.m | MATLAB | Generates EUD, NTCP, and CI from DVH files
Final report | Document | A report explaining errors that occurred and summarizing all reported metrics
RCA reporting form | Form | A step-by-step form to work through a preliminary RCA on the reported case

Final VROC
Evaluation | Form | Survey of experience using VROC
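Table 1-2 lists Metrics.m as generating EUD, NTCP, and CI from DVH files. As a hedged illustration of the kind of calculation involved, and not the dissertation's Metrics.m itself, the sketch below computes a generalized equivalent uniform dose (gEUD) from a differential DVH using the standard formulation EUD = (sum_i v_i * D_i^a)^(1/a); the function and parameter names are assumptions.

function eud = generalized_eud(dose_bins_gy, volumes, a)
% Illustrative gEUD calculation from a differential DVH (not the
% dissertation's Metrics.m). dose_bins_gy are the dose bin centers in Gy,
% volumes the corresponding absolute or fractional volumes, and a the
% tissue-specific parameter: large positive a for serial organs, a near 1
% approaches the mean dose, and negative a emphasizes cold spots in targets.
v = volumes(:) ./ sum(volumes(:));              % normalize to fractional volume
eud = (sum(v .* dose_bins_gy(:).^a))^(1 / a);   % gEUD in Gy
end

With a = 1 the result reduces to the mean dose of the DVH, which provides a quick sanity check when testing the calculation.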

PAGE 28

28 Figure 1-1. Radiation oncology workflow versus time and workload

PAGE 29

29 CHAPTER 2
VROC REQUIREMENT LISTING

The first step of the development process was to identify specific requirements for simulations for training physician residents. This was done by reviewing the literature on simulations in radiation oncology, simulation in medicine, requirements for radiation oncology training, and instructional design. A final summary of these recommendations was used to advocate specific requirements.

Simulation Overview

The concept of simulation, in general, has been routinely used in radiotherapy. Early treatment techniques were planned using an x-ray or fluoroscopy machine to simulate the treatment machine. The patient would lie on a treatment couch similar to the one they would be on for treatment while the x-ray source was positioned around the patient, with the doctor guiding this placement (13). With the patient at this machine, the physician would decide on the beam entrance based either on a radiograph or on the geometry of where the beam was aimed at the patient. These machines were called simulators because they were used to simulate the radiation treatment machine. With the advent of computed tomography (CT) scanners as well as advanced computer graphics capabilities in the 1980s and 1990s, a CT scan of the region of interest could be used to model the patient, and the beam placement could be simulated within the computer system (14-16). During this transition time, this type of simulation was called a virtual simulation to differentiate between the process of determining treatment parameters directly on the patient and the same process performed on a

PAGE 30

30 computer simulation of the patient. Today, almost all radiotherapy departments utilize the CT scanner and a computerized model of the patient for this simulation. More recently, virtual reality (VR) simulation has also been adopted in radiation therapy training. A sophisticated 3D system is in place in several training facilities where radiation therapists are trained on how to operate the very expensive and heavy equipment. This system is commercially available through the company Vertual, Ltd. Video illustrating this virtual reality radiation machine can be viewed at their website: http://www.vertual.eu/ (17). Other uses of VR include realistic computer graphics of the treatment machine used to perform dry runs or practice for some complicated treatments where the table and radiation machine may come very close to one another, possibly even colliding in some cases. Hamza-Lup et al. (18) developed a web-based graphical VR system with realistic visual illustrations of the treatment machine, the patient, and the treatment couch that could be accessed remotely, with the specific intent of collision detection and treatment order optimization. While simulation has been used in these specific instances within radiation oncology, there are no reported studies or simulation tools specifically designed for the resident, nor are there any training tools specific to simulating errors in radiation oncology. Two of the prerequisites for the VROC are a specific focus on a tool for resident use and a requirement that errors must be able to be simulated. Simulation as a training tool has been widely used in many areas that are considered high risk, such as aviation and medicine. Within medicine specifically, simulators are popular training tools for all levels of education, from layperson education on cardiopulmonary resuscitation (CPR) and first aid to surgeon training on sophisticated robotic

PAGE 31

31 surgeries. With the high cost associated with medicine and medical education and the risk of practicing on human subjects, simulation as a means of training is both cost-effective and useful. Many of the simulators and virtual reality (VR) systems developed for medicine were designed for procedures that are both image-based and require a specific skill, such as navigating a scope through the airway or the digestive tract (19). Others are for training where quick decision-making skills are needed, such as in military medical care and in emergency medicine. Many have reported on the benefits of simulation in these environments to help the student become familiar with a certain piece of equipment or to learn a new skill, both of which seem to be more easily learned when the stress of a situation can be controlled (20). A meta-analysis on the efficacy of virtual reality in medical training was completed by Seymour et al. (21). The analysis looked at all reported studies using VR for medical education to determine which studies had adequate testing mechanisms and which proved the utility of VR systems to improve skills or to train more efficiently. One specific study included in the meta-analysis indicated that the skills acquired during a VR procedure for microsurgery transferred to actual animal surgeries. One of the specific endpoints of this study was to determine what parameters or tools could be used to evaluate the utility of the simulation or virtual training. The parameters that Seymour found useful for evaluating the utility of a VR system were: (a) the time required to learn a new skill, (b) the error rate in a procedure following the training, and (c) the time required to complete a specific task. This illustrates that, for the particular application, physicians were able to make decisions and/or complete tasks more quickly following time spent practicing with a simulator.

PAGE 32

32 Several professional organizations and conferences have joined together to investigate topics specific to medical simulation. Because of the stress involved and the need for quick decision making, emergency medicine departments have pioneered many of these studies and have published proceedings of their 2008 Academic Emergency Medicine Consensus Conference with special topics on the science of simulation in healthcare (22). A subcommittee from this conference looked specifically at the impact of simulation on improving safety in the overall healthcare system (23). One of the subcommittee's recommendations was to feed information from error reporting systems into simulation training and thereby improve patient safety, including the following suggestions:

1. Use malpractice claims, hospital-reported errors, and QA programs as a basis for simulation training.
2. Use electronic charts to cross-check and track errors to provide more data for training.
3. Pool data on malpractice throughout a region or area to gather more simulation teaching cases.
4. Use a high-fidelity mannequin simulator in the clinical environment when new equipment (such as pacemakers) is introduced, to determine possible errors and prevent them.

This list was generated by physicians from a variety of medical specialties but provides an excellent resource for opportunities to incorporate patient safety training into the VROC system. The errors that occur within radiation oncology can be collected from reporting systems, from published reports, or from pools of malpractice claims or mandatory reports to state radiation safety agencies. The entire VROC system is an electronic record that can be checked against the actual clinical system. For added utility, any new procedures or treatments could be first tested using the VROC to

PAGE 33

33 evaluate possible holes in the quality assurance process or to assist in the development of procedures associated with these new techniques. The remaining requirements for the VROC come from a literature review to determine the specific attributes that have proven useful when using simulations for training. Issenberg et al. (24) conducted a thorough review of simulators and medical education, specifically with the goal of determining which features are necessary in order to provide the most effective learning experience. While this meta-analysis found hundreds of articles published from 1969 to 2003, only 109 met the criteria of being empirical studies that used simulators and that contained an assessment that could be used. Issenberg found 10 key features of simulators that were most effective in teaching skills or concepts:

1. Feedback
2. Repetitive practice
3. Integrated into the overall curriculum
4. Practice with increasing levels of difficulty
5. Adaptable to multiple learning strategies
6. Clinical variations
7. Controlled environment
8. Individualized
9. Outcomes and benchmarks are clear
10. Validity and realism (context for understanding)

One of the critiques this study gave to the simulation community as a whole was that there has been a lack of scientific rigor in many educational studies testing the utility of simulation in education. For example, one of the earliest studies using a VR training system, performed by Mills et al. (25), noted that VR training did not show improved skills over standard learning but concluded that the study was flawed due to the lack of metrics used to evaluate the experience within the virtual environment. The study concluded that the design of the virtual training tool must take into account the way the study would

PAGE 34

34 be assessed. Additional guidance for simulation training studies can be gained by following recommendations from McGaghie (26), who evaluated all published studies on simulation in education to test the hypothesis that the time spent in simulation training correlated with learning. While his conclusion indicated that time was correlated with learning, he was critical of the fact that, out of 670 peer-reviewed journal articles, only 31 contained adequate assessment, statistics, or basic scientific design from which to draw conclusions. He cites six specific flaws that were found and that should be avoided in the proposal of a study involving a virtual reality training system:

- Lack of background information on current research, inter-specialty journal citations, and other work in the area of simulation in medicine.
- Most research has few subjects and gives no attention to statistical power.
- Lack of awareness of study design in education, behavioral science, and the clinical discipline.
- Lack of measurable properties of educational or clinical variables.
- Lack of statistical reporting consistency. McGaghie recommends that data should report central tendency (mean, median, and mode), dispersion, and effect size for dependent variables.
- Lack of description of educational interventions and their strength and integrity.

Radiation Oncology Training Requirements

As with physician residents in other programs, radiation oncology residents are given core competencies from the Accreditation Council for Graduate Medical Education (ACGME). The core competencies include lists of disease sites and tumor types. Each resident must participate in the consultation and treatment of a minimum number of cases. Specific details about the cases that are to be managed by the resident can be found in Part IV of the ACGME Guide for Radiation Oncology, which


states that each resident is required to manage approximately 150 patients each year, with a minimum of 450 patients over his or her entire four years of training (http://www.acgme.org/) (27). In order to count a specific case towards their log, the resident is required to participate in the simulation and treatment planning and in the initiation of treatment. However, due to the length of some courses of treatment, they are not required to follow all patients through to the end of treatment. The ACGME also provides guidelines regarding training in the area of systems-based practice and professional issues such as healthcare economics and quality improvements. Specific content of the educational curriculum is not described, and the ACGME guideline states that: "Residents must demonstrate an awareness of and responsiveness to the larger context and system of health care, as well as the ability to call effectively on other resources in the system to provide optimal health care. Residents are expected to: Participate in identifying system errors and implementing potential system solutions" (27). It may be possible to utilize a virtual radiation oncology clinic in order to provide added experience in the management of unique cases or in managing more patients than may be seen by the resident in his or her training. Also, as mentioned previously, the cases the resident participates in are skewed towards the initiation of treatment, and therefore a patient is not always followed through to the end of his or her treatment. The VROC may allow for the experience of a total continuation of care. Also, the VROC may provide an opportunity to show the resident a more realistic workload, because in many clinics it is common to see as many as 300 patients a year compared to the ACGME requirement of 150. A survey of residents from 2005-2008, conducted by the Association of Residents in Radiation Oncology (ARRO), summarized the hours spent in various


clinical situations. In most training situations the residents report spending approximately 50 hours per week in the clinic. They also report that they did not see enough cases of lymphoma, pediatrics, and sarcomas (28). There is no mention of how much time was spent on physics, biology, health care management, or medical errors. Also, there were no reported studies of specific error training in radiation oncology, nor in radiation physics, despite this being listed in the core competencies. This illustrates the lack of curriculum specific to error management within radiation oncology. While there are few specific curriculum guidelines, medical errors have received the attention of the media and therefore have been the focus of some new guidelines. The professional organization, the American Society for Radiation Oncology (ASTRO), responded with an initiative prompted by public and government focus on radiation therapy errors. The six-point agenda to improve the quality of radiation therapy includes, as one of its points, an increased educational focus on radiation therapy errors. The society dedicated several issues of its professional journal, Practical Radiation Oncology (PRO), to different aspects of radiation oncology errors and has also published new reporting guidelines for radiation errors. It has not as yet made any specific recommendations for error mitigation training, nor have any specific training tools been developed to provide hands-on training.

Instructional Design in Simulation Training

One area of criticism from the educational research using simulations was that many studies, and also many simulation tools, do not contain adequate assessment and have failed to take into account instructional design. Instructional design is the study and practice of the theoretical methods that are utilized in implementing a learning module,


including educational objectives and their assessments. Instructional design theories take into account studies in education as well as in sociology, psychology, and science. In order to properly create a tool for use in education, or to conduct an educational study using a simulator as a tool, it is important to explore various design theories to determine the most appropriate one for any particular project. Careful evaluation of instructional design theory will help to address the concerns of McGaghie (26) regarding the use of medical simulators for training. A publication by Battles is one of the first to discuss instructional design in medical education as a means to prevent medical errors (29). A key conclusion was that simulation training should be provided to residents in order to protect the patient from inexperienced residents. He recognized that this would require the implementation of an entirely new instructional program, different from the one that has been in place in the past several years, and all specialties would be required to develop virtual training solutions as well as a curriculum that relied more heavily on these simulations. Specific to instructional design for medical education, it is important to note that residents are adult learners, and there are well-known theories that address how adults learn. Some principles that are proposed by Knowles (30) for adult learning are:

Adults need to be involved in the planning and evaluation of their instruction.
Experience (including errors) provides the basis for learning.
Adults are more interested in learning subjects that have immediate relevance to their job or personal life.
Adult learning is problem centered rather than content oriented.


These principles apply to medical residencies, where the goal of the program is to practice and apply the knowledge taught in medical school. The goal is to learn to do the job the resident will be expected to perform as an attending physician (31). Another popular theorist on adult learning, K.P. Cross, created a similar list of important features of adult education and also points out that, unlike children, adults are self-directed and are capable of discovering things for themselves (32). Because adults are self-directed but learn best by experiential learning and by repetition, a simulation module is appropriate as a means of teaching. A prominent historical design methodology is a linear design process known by the acronym ADDIE, which describes the steps required to generate a curriculum for students. The steps are described below and are generic to all types of learning.

ANALYZE. Determine the audience and state the specific learning objectives. This includes investigating the learning environment and constraints and the timeline for the project.
DESIGN. Create a detailed list of the specific elements of the learning objective and a detailed prototype of the project, including storyboards and graphics.
DEVELOPMENT. The creation of learning materials based on the design.
IMPLEMENTATION. The learning plan is put into place along with procedures for implementation and distribution.
EVALUATION. This process is partly included in each of the other phases, where it is called formative evaluation. Formative evaluations are combined to generate a summative evaluation, including tests specifically related to the objectives. Feedback is given at this point.

Many researchers have adapted the ADDIE model for instructional design by adding more steps. For example, one adapted model is known as the Dick & Carey Model, described in their book (33). This model provides additional steps for assessing knowledge acquisition and for adapting the


design of the curriculum in order to meet instructional objectives. A competing design process to the linear approaches mentioned above is a non-linear approach. In this model, the revision of the learning material is made at any point within the process. One such model was developed by Kemp (the Kemp Model), which contains similar steps to the ADDIE and the Dick & Carey models but provides guidance to make modifications and analysis throughout the implementation of the curriculum to help to restate goals or objectives if necessary (34). This model incorporates opportunities to evaluate performance throughout the learning process so that revisions and re-planning can occur. While the previously mentioned methods of instructional design can be effective, there is a competing theory for instruction development that is called Understanding by Design (UbD). This is a backwards-engineering methodology where the entire structure of the curriculum is designed to ensure that the learner understands the material being taught (35). The focus is not just on recall and memory of the acquired information or knowledge, but on ensuring the student understands the concepts of a particular subject and is able to apply the information to different scenarios. This is what is asked of medical personnel as they process information that is presented to them in the classroom and apply it, along with their experience, to develop a care plan for the patient they are treating. For this project the goal is to help the trainee to understand root causes of radiation errors and how to develop a quality assurance program to prevent such errors from occurring in the future. This involves understanding what the error was and why it occurred.


There are three main steps that were used to develop a curriculum for understanding: 1) identify the desired results or concepts that are to be understood; 2) determine acceptable evidence of the understanding; 3) plan the learning experiences. In developing a training program, a great deal of time is spent on the second step, because it is only through testing, evaluating, or questioning the student that one can determine if they understood the concept. The selected endpoints to illustrate learning will typically include many of the different questions that test the student or resident's knowledge as well as their understanding of that knowledge. Once it is determined how to assess the learning of ideas, the curriculum can be developed around these assessments to ensure the objectives are being achieved. Simulators or virtual systems are specifically useful in trying to teach understanding because they provide an opportunity for trainees to apply the information that they have been presented. Virtual reality training systems can also be a way to assess whether the student understands what they have been taught, by presenting different scenarios to the student to see if they can apply the information they have already learned. Anderson et al. (36) discuss the utilization of simulators in medical education as a way to move beyond cognitive knowledge, and describe the utility of the simulator to provide opportunities to apply learning to new situations and to give a better understanding of the concepts being taught. Simulation as a tool for education can be useful in many different educational strategies, but specifically for medical residents, the design approach known as Understanding by Design seems to be most appropriate. There are three specific strategies in the UbD approach. These are described in the following pages.


Identify the desired results: While using this technique, all goals and objectives are stated as complete sentences, recognizing that the overall objective is for the student to understand a concept. These can include overarching themes as well as specific concepts, but the idea of understanding is not just that the student can recite a statement, but that the information can be applied to other situations. Once all the learning statements and assessments are written, then the curriculum can be developed. For this project, the overarching goal is for the resident to understand how the various components of radiation therapy fit together and how errors in patient setup or mistakes in radiation delivery will affect patient outcomes.

Determine the acceptable evidences of learning: The evidence of understanding of the components in radiotherapy and of how errors occur will be shown through specific assessments. To develop assessments of the understanding of these concepts, the UbD design philosophy lists six facets of assessment that need to be included (35). A learning assessment mechanism will need to be developed as part of the simulation tool. Specific to this project, the evidence of learning would include the ability to detect that an error occurred, and the ability to conduct a root cause analysis following an error. Once several simulations have been completed, another assessment can be made to assess the number of errors that go undetected compared to those that are detected. This number should decrease with experience. Specific metrics related to the treatment plan will be used as quantitative assessment of the expected effects of the error and will be presented as feedback to both the trainer and the trainee. One of the overall evidences of understanding will be for the resident to complete a root cause


analysis and identify root causes. Once the RCA form is filled in and a measurable quality assurance guideline is proposed, the supervising physician and the resident can discuss the root causes and compare them to procedures and policies within the department.

Plan the learning experience: While the most intensive aspect of the Understanding by Design methodology is the focus on the assessment tools used to assess learning, it also requires a learning plan to ensure that the goals are met. Specifically, for this project, each training scenario must be carefully planned in order to ensure that the different aspects of target volume delineation, treatment plan evaluation, and treatment verification are covered. In addition, information on how to monitor or detect an error must be designed with both the learning objectives and the learning assessment in mind. The tool that will be used for this study will be the virtual reality clinic, but a learning plan will need to be developed that outlines which cases are to be studied and why. An acronym that describes this stage of the design is WHERETO. Each letter describes the key things that are desired while planning the learning experience (35):

W: Where & why - give an overview of the project
H: Hook - get the student's attention
E: Explore & Experience - the hands-on component of learning
R: Reflect/Rethink - this is the review where the big-picture items are re-emphasized
E: Evaluate work - initial evaluation or self-assessment
T: Tailor and personalize - change things as required for the individual student
O: Organize for effectiveness


The overall learning plan will be implemented on each scenario with the VROC. For the initial development, the faculty member will be able to list the learning goals for each scenario, the specific planning exercises, and the errors that are supposed to be detected.

Summary

The requirements for the VROC system were taken from a literature review of simulations in medical education, residency requirements, medical error education, and instructional design. The ten specific features required for simulation training that were identified by Issenberg et al. (24) serve as the initial criteria, but were modified to create a radiation oncology specific list. The original list includes some items that are specific to conducting a scientific study and do not apply directly to the development of the training tool. However, the training tool must be made flexible enough and also robust enough to allow appropriate scientific studies to be done once it is developed. The following is the list of requirements for the VROC.

1. Realistic
2. Specific for physician residents
3. Controlled environment
4. Easily incorporated into the curriculum
5. Can simulate a variety of radiation techniques and errors
6. Learning objectives and goals are clear for each exercise
7. VROC can give immediate feedback on performance
8. End-of-simulation feedback and learning assessment

One of the primary features for developing the VROC is that it be realistic, or valid. The realism of the VROC is key to whether or not it would be incorporated within a curriculum. Realism, as reflected in the type of procedures performed, timing, and overall level of complexity, was considered in all aspects of the design and development process. Following the development, the system was evaluated based on its realism.


Another requirement was that the VROC be specifically designed for radiation oncology physician residents. One of the shortcomings in physician training is the lack of opportunity to be involved in error mitigation. The VROC simulation training system must be designed to fill this need by being able to simulate all types of errors. Additional requirements are that the system fit into the curriculum that is already in place for error training and include the opportunity for error mitigation analysis or root cause analysis. One of the other requirements for the VROC is that it is a controlled environment. Because the goal of the VROC is to incorporate it within an actual radiation oncology clinic, one of the requirements is that it remain separate from clinical software and hardware, so there is no risk to patient safety or of corruption of the clinical data from the VROC system. This isolation will also provide the flexibility to develop scenarios about various errors without any chance of harming patients. Another required feature of the VROC is that it be capable of giving feedback to the user regarding how they perform. The feedback should be given within a reasonable time period and should be related to how the person using the simulator is performing compared to expectations or compared to specific learning objectives. Also, another goal was to give the trainee opportunities to be involved in error mitigation. This would be feedback at the end of the simulation, where the user conducts an error mitigation study. The remaining requirements for the VROC relate to its flexibility to develop different training situations that each follow specific ideas of instructional design. One requirement is that the different simulations of patients and error scenarios each have specific learning


objectives, with specific learning assessments. As was described in the Understanding by Design section, the desired result, or the expectation of the person completing the simulation, should be identified. In addition to developing a specific curriculum for each of the training scenarios, the VROC must be flexible enough to accommodate individualization of cases based on the level of experience of the user, as well as being able to expand to allow multiple cases at the same time.


CHAPTER 3
RADIATION ONCOLOGY

Specific details about radiation oncology workflow, errors in radiation oncology, and metrics for radiation oncology were needed to further develop the requirements list from Chapter 2. The detailed workflow of the radiation oncology (RO) process from the physician perspective will serve as a template from which to model the VROC. To determine appropriate errors to simulate, a complete literature search of errors in radiation oncology was performed and related to the radiation oncology workflow. To determine appropriate timing and data for feedback to the resident on the quality of their work, a literature review of various metrics is summarized.

Radiation Oncology Workflow

As with all specialties, RO has a unique set of terminology. The different job titles, equipment names, and general terminology are described and defined in the brief description and workflow that follows. This description details specific tasks of the physician, each of which will ideally be simulated within the VROC. Along with the task description, the physician typically interacts with a number of other professionals and staff. The VROC will need to be able to simulate the duties of the other team members listed within the workflow. Detailed explanations of the job descriptions of different members within radiation oncology, as well as best practices for different functions, are available through the ASTRO report "Safety is No Accident" (4). The workflow as outlined below is taken from personal experience, and more information can be found in the report from ASTRO. Figure 3-1 is a simple workflow diagram of the RO process with a list of each team member involved in each part of the process. Figure 3-2 is a similar diagram but describes the specific task with a small image illustrating the task. These


two figures are described in detail below. For the purposes of this workflow description there is no distinction between the attending physician and the resident physician; residents perform the same tasks as attending physicians, just under supervision.

Consultation

Consultation is the initial encounter the physician has with the patient. The purpose of the consultation is to meet with the patient to determine if there is a role for radiation and to discuss the possible outcomes and side effects if they do have radiation treatment. Figure 3-1 indicates that the support or ancillary staff coordinate schedules and charting, while the nursing staff performs a medical assessment on the patient, including tests such as checking blood pressure, weight, and temperature. The nursing staff also helps the physician to answer questions and explain radiation side effects and treatment processes to the patient. A computer screen showing a digital medical chart is illustrated in Figure 3-2. The information the physician will review is typically available via a computer system and is designated as an electronic medical record (EMR). The EMR contains physician notes, diagnostic radiologic images and reports, pathology reports, medical assessments, and lab results (blood work). In most cases the physician can also see all other diagnoses for which the patient has had or is receiving treatment, such as cardiac care, blood sugar issues such as diabetes, and previous cancer treatments. For cancer care, the specific diagnosis and staging of the disease is an important indicator of whether radiation is recommended or not, and is also typically indicated within the EMR. In addition to determining if radiation is to be


given, this is the opportunity for the physician to explain cure rates and possible side effects of radiation therapy to the patient. Once it is determined that the patient should have radiation therapy, the physician will write orders so that the remainder of the appointments can be scheduled, which will begin the treatment process. In many cases orders or recommendations of the physician are also reported electronically within the EMR.

Radiation Simulation (Planning Preparation)

Radiation simulation is the process of identifying how to position the patient for radiation treatment. In most modern facilities, a radiation therapy technologist or a radiation therapist performs the simulation. The orders are typically found as a document within the EMR. At the time of the simulation or planning preparation, the patient is positioned on the couch of a computed tomography (CT) machine in the position the physician has requested for the radiation treatment. The therapists acquire a set of CT images of the area to be treated, and these images are used to create a 3D model of the patient within the computer system. The patient is free to leave after the CT scan is acquired, and the remainder of the planning procedure for the treatment will be done on the model of the patient rather than requiring the patient to be present while different machine settings are tried. Once the CT scan is available, the physician must determine the specific area that is to be treated. For this work a specialized computer system called a treatment planning system (TPS) is used because it contains tools for annotating on the CT images. These tools allow the physician to delineate the specific target area, modifying it if necessary, a process referred to as contouring. The TPS also includes computerized graphical models of the radiation treatment machine, or medical linear accelerator


(linac). This modeled linac allows the physician to geographically place a radiation beam aimed at a specific location, and to indicate how the beam should be shaped. In most modern machines a multi-leaf collimator (MLC) is used for beam shaping. This is a device inside the machine that is programmed to move small metal leaves to create the appropriate shape. Figure 3-2 shows an illustration of a CT scan with a contour in place. Also shown, in the last figure of the column under planning preparation, is an example of a geographical representation of a beam that is planned for treatment. This beam is displayed above the patient and indicates a radiation beam that would enter directly onto the patient from the front. For each area and type of tumor, there are relatively standard shapes and guidelines for how to design these blocks or openings (called a portal). These can be found in most radiotherapy textbooks (13, 37-38). In essence, the physician and the planning staff are simulating the radiation treatment within the computer system. In order to eliminate confusion with the term "simulation" as it is used elsewhere in radiation oncology, this portion of the workflow will be called planning preparation. For the remainder of this discussion and this project, "simulation" will refer to the overall simulation of the radiation oncology experience that is being built within the VROC, so as to differentiate this specific step in the process from simulation of the entire process.

Treatment Planning

The next step in the radiotherapy process is to identify and plan the entire course of treatment. In most clinical situations there is little to no differentiation between planning preparation and treatment planning. The physics team works alongside the


physician to develop the exact set of beams and beam settings to deliver the best treatment. As radiation is aimed at the patient, the photons or electrons deposit energy into the patient. The energy deposited can be quantified as dose using units of Gray (Gy). The treatment planning system includes both the geometrical and dosimetric models of the linac. The dosimetric model describes how much energy of radiation will be given as the beam passes through the patient. As different beams are aimed at the target, this energy will sum to give a greater dose at their intersection. The results of this process are calculations that indicate how much of the dose is deposited and where. This is called a dose distribution. The corresponding set of instructions, which includes beam orientation, MLC shape, and dose from each beam, is called the radiation treatment plan. Figure 3-2 includes an illustration of a dose distribution. The lines on this distribution are in percentages of maximum dose, so that the 100% level represents the area that will receive the most radiation, and corresponds to the most energy deposited. The other picture depicts another tool used by the physician to evaluate the plans and is a graphical representation of the dose given to different structures. This dose volume histogram (DVH) is used routinely to evaluate the treatment plan and is further described in the section on metrics. The role of the physician at this point in the process will depend on the diagnosis and the dose of radiation prescribed. An increasing number of patients in modern radiation therapy have customized plans with the beams shaped specifically to the tumor so as to minimize dose to other organs. This type of planning consists of two different approaches. One is 3-dimensional conformal therapy (3DRT), where the beams are shaped to conform to the target area. Another is intensity-


modulated radiation therapy (IMRT), where each beam is subdivided into smaller segments that are either open or blocked for different percentages of time to further shape the dose along the depth of the beam, in addition to the sides of the beam. This allows for much more flexibility in shaping the dose within the patient. IMRT plans are generated by use of computer algorithms that optimize various beam shapes and settings in order to meet specific dose goals and limits. In both 3DRT and IMRT planning the physician identifies the anatomy to be treated (37) by using computer contouring tools to create 3D objects representing anatomical organs and regions of interest. These contours can be assigned dose goals or limits and can be used to generate statistics in order to evaluate the quality of the treatment plan.

Treatment Preparation

Once a final treatment plan is developed it must be prepared for treatment delivery. Many of the treatment machines in use today operate with computer controls. Because the treatment machines have a number of small mechanical motors that drive the MLC as well as all of the other mechanical settings, linacs require a file with all of these settings in order to deliver treatment to the patient. These files are transferred to the treatment machine via a computer system called a record and verify system (R&V). This system is designed both to send the treatment plan information to the treatment machine and to read the settings on the treatment machine and record those settings for each treatment. This computer system must be integrated into the EMR and must also be able to receive the treatment plan information from the TPS. Both the data from the treatment plan in the TPS and the data transferred to the R&V system must be carefully checked to ensure that the treatment matches the physician's prescription.


The physics staff is responsible for maintaining the quality assurance program to ensure that the treatment plan parameters match the physician's prescribed radiation dose. However, the physician is ultimately responsible for ensuring that the plan is correctly delivered. In addition to finalizing the treatment plan, the other part of preparing the plan for treatment is to develop a set of reference radiographic images to be used for alignment. The physics team uses the TPS to create an image that looks like an x-ray image and represents the patient anatomy where the radiation beam will intersect during treatment. The orientation is referred to as the beam's eye view (BEV) because it is oriented as the image that would be seen when looking through the patient from the perspective of the treatment beam. The image itself is a computerized reconstruction that is computed by ray tracing through the 3D model of the patient from the CT images. This image is referred to as a digitally reconstructed radiograph (DRR). The DRRs that are uploaded into the R&V system can be used at the time of treatment to ensure the patient is in the correct location and to check that the MLCs on the machine are defined correctly and have transferred to the machine correctly. An example of a DRR is in the workflow diagram, Figure 3-2. In some cases the physician may want additional images to be taken. These will all be set up at this time and signed off on by a physicist.

Treatment & Verification

The radiation dose is delivered by a linac under the control of radiation therapists who are specifically trained in the operation of these specialized treatment machines. Typically the physician is not present at the time of treatment, and the therapist will position the patient on the treatment couch in the same position as in the CT scan and verify that the linac is programmed to the settings from the treatment plan. To verify that


the patient is being treated correctly, the therapists on the machine will acquire images using equipment attached to the treatment machine. These images are uploaded into the R&V system for the physician to review and compare to the reference images. Modern radiotherapy machines have several different radiographic imaging techniques that can be performed. The most common imaging type is called a portal image and is taken by using the actual treatment beam and beam shaping devices along with a digital imaging panel. The photons from the treatment machine pass through the patient to expose an imaging panel on the other side of the patient. Depending on the patient anatomy, dark spots on the image indicate more photons passing through regions of low density, and bright areas in the image indicate areas with fewer photons passing through due to interactions in higher density regions. This image will show the shape of the beam portal, indicating any blocking devices used, as well as anatomy to indicate that the beam is being delivered to the correct location within the patient. The contrast of this type of image is not as good as a diagnostic x-ray because the image is taken with high energy photons that interact predominantly through Compton scattering. Physicians, therapists, and physicists are all trained to understand the physics behind a portal image compared to a kV image, and also to interpret both types of images. These images are particularly useful because they give information about the patient location as well as the exact entrance and shape of the radiation beam, to be compared to the planned radiation beam. The routine use and benefit of portal images to detect both patient setup and beam shape errors are reported in the literature (8, 39-


42). One study reported the precision of alignment based on portal films to be on the order of about 3 mm (43). Portal images, either electronic or on film, are submitted to the physician for required review and approval. In addition to the portal image, many modern radiation delivery devices are equipped with additional imaging equipment. The most common add-on imaging equipment utilizes a kV x-ray tube and imaging panel that is similar to a standard diagnostic quality x-ray. By taking an orthogonal pair of x-rays, the 3D geometry of the patient can be determined. A 3D image set can be acquired by rotating the x-ray tube around the patient to generate a CT data set known as a cone beam CT image (CBCT) (44-46). When imaging other than portal imaging is used to evaluate the patient alignment, it is typically used daily and is called image guided radiation therapy (IGRT). Depending on the department's procedures and available technology, the physician will review these images either daily or weekly. Procedures involving very high doses of treatment or fewer treatment fractions require that the images be evaluated and approved prior to any treatment (39, 46-47). IGRT is rapidly being implemented, and most modern radiation oncology centers will use it on at least half of their patients. The other methods of treatment verification involve evaluating the patient and the patient's treatment record. Each week of the radiation treatment the patient will meet with the physician during an under-treatment visit to discuss any side effects and to evaluate how the treatment is progressing. Sometimes treatment doses will be modified, and sometimes the overall treatment plan will be changed from what was expected.


If the patient's anatomy changes, for example due to swelling in the radiation area, the physician may choose to repeat the simulation of the patient to generate a new plan. Additionally, each week someone from the physics team reviews the chart, the treatment machine settings, and the treatment plan to ensure that the prescribed treatment is being carried out correctly.

Radiation Oncologist Workload

One of the complications for radiation oncology is that the physician is required to manage a variety of patients at different stages of this overall workflow at any time. Mazur et al. (48) reported on physician workload and how the overall workload and stresses correlated to errors in radiation oncology. It may be possible to use the VROC to simulate this workload to test some of these ideas. Radiation oncology departments are accredited through the American College of Radiology (ACR), which publishes a report indicating that, overall, the average number of new patients seen each year by a radiation oncologist is 205 (156 for academic institutions) (www.acr.org) (1). This corresponds to an average of 3 new patients per week. Since the VROC system was being developed at UF Health Cancer Center at Orlando Health, statistics at this facility were compared to the ACR average and indicated an average of 4.7 new patients seen per week. This clinic is a fairly large practice with seven physicians and six radiation treatment machines. Advanced treatment (IMRT) and advanced imaging (IGRT) are available on 5 of the 6 machines. On average, about 125 patient treatments are given daily, which breaks down to about 5 new start patients per physician per week. On average about 65% of these treatments include IGRT, which would produce on average 14 image sets daily per physician, and a total of 71 image sets per week for review. These workflow


numbers were included in Chapter 1, Figure 1-1, to help illustrate the overall complexity of radiation oncology.

Data and Applications Used in Radiation Oncology

Within the workflow there are essentially three different computer systems that the physician utilizes. General category names for these products are: the electronic medical record (EMR), the treatment planning system (TPS), and the record and verify system (R&V). For example, all consult and physician reports are uploaded into the EMR, and the treatment plan and all CT data are typically transferred to the EMR. The R&V system is a specialized radiation oncology system to support all of the different digital data representing the treatment plan and verification images for the linac. From the physician viewpoint, the R&V is a radiation oncology specific EMR. In order to build the VROC system, it was important to identify the different electronic files used at each point in the process, because these files will be used in the VROC. In some cases, programs to create these files will be necessary as part of the VROC. Many of these files are in a specific format to allow data transfer from one application to another. DICOM is a set of standards published by the Association of Electrical and Medical Imaging Equipment Manufacturers (www.medical.nema.org) (49) to ensure different pieces of medical imaging equipment can send and receive compatible data. The Digital Imaging and Communications in Medicine Standard (DICOM) was first introduced in the 1980s as the availability of digital imaging became more prevalent. In 1994, digital data for radiotherapy became more common and there was a need to address additional formats specific to the radiotherapy community. Each DICOM file consists of header information containing all of the patient, physician, machine, and manufacturer information for the file. The remainder of the file contains the actual data of the image, made up of pixel coordinates (x, y, z) and gray levels that together make up the medical image (50-51).
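As a concrete illustration of this header-plus-data structure, the short sketch below uses the open-source pydicom library (an assumption for illustration; it is not part of the VROC software described here) to open a DICOM file and print a few header elements. The file name and the specific attributes shown are placeholders only.

```python
# Minimal sketch, assuming the pydicom and numpy packages are installed.
# "example.dcm" is a placeholder for any DICOM or DICOM-RT file.
import pydicom

ds = pydicom.dcmread("example.dcm")

# Header elements common to all DICOM objects
print("Modality:  ", ds.Modality)        # e.g. CT, RTPLAN, RTSTRUCT, RTDOSE
print("Patient ID:", ds.PatientID)
print("SOP Class: ", ds.SOPClassUID)

# For an image object (e.g. a single CT slice) the data portion can be
# accessed as a numpy array of gray levels.
if hasattr(ds, "PixelData"):
    print("Image size:", ds.pixel_array.shape)

# Radiotherapy objects carry machine settings instead of, or in addition to,
# pixel data; for example, an RTPLAN file lists each treatment beam.
if ds.Modality == "RTPLAN":
    for beam in ds.BeamSequence:
        print(beam.BeamNumber, beam.BeamName)
```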


Each slice of a CT scan is a separate DICOM file. Unlike radiographic imaging, radiotherapy also requires a description of each aspect of the linac system. As a result, an extension of the standard, DICOM-RT, was introduced to support data unique to radiation therapy. DICOM libraries describing each line of the DICOM file are available for most high-level programming languages and data formats (XML, MATLAB, C++, etc.). The files unique to radiation oncology that are important in managing patient treatment are:

RTPLAN: File containing all of the LINAC beam settings based on the treatment plan.
RTDOSE: File containing the 3D dosimetric information from the treatment plan.
RTIMAGE: The digital portal image (also contains some LINAC beam settings).
RTSTRUCT: File that contains information about the contours of the anatomy that were superimposed on the CT scan (in treatment planning).
RT_BEAMS_TREATMENT_RECORD: File created at the LINAC with all of the settings used to deliver the radiation.

Figure 3-3 is an illustration of the entire workflow with each of the different computer systems and data elements that are used at each part of the therapy process identified. The file format for the different types of files is also identified. Most radiotherapy-specific data and image data are in a file format that complies with the DICOM standard.

Errors in Radiation Oncology

With the workflow of the radiation oncology system in mind, a literature review of errors in radiation oncology was conducted to determine where in this process errors occur and with what frequency. One of the largest studies on radiation oncology errors


was one in which 555 errors were reported by Princess Margaret Hospital in Toronto from 1997 to 2002 (52). This study categorized errors by type of treatment and determined that there was a higher likelihood of error in patients receiving head and neck treatment. A study by Klein et al. (53) reported on errors at Washington University (St. Louis, MO). He categorized these errors by treatment technology and technique and concluded that errors increased for treatment techniques with mixed levels of complexity and with changes in procedures that occur when switching between different technologies (53). This issue of errors related to technology was also investigated by Dr. Fraas (54), specifically focusing on computer-controlled systems and on the increased risk if the treatment plan data are entered incorrectly from the start of treatment. The conclusion was that although computerized control systems make daily treatments less prone to errors, if the data are incorrect from the start there is little opportunity to catch this, and the error will continue for many treatments. Macklis et al. (55) also evaluated errors based on where they occur in the workflow and how that related to technology. Conclusions from the errors reported at the Cleveland Clinic between 1991-1995 indicated an increased risk at the point of data entry into the computer control systems. A voluntary European error reporting system provides another source of data about radiation errors. The Radiation Oncology Safety Information System (ROSIS) was set up by the European Society of Therapeutic Radiology and Oncology (ESTRO). Since this is a voluntary system, the error rates may not match those from specific organizations. The system is anonymous, so there is limited information about the specific clinics where the errors occurred, but typically more details about each error are included. With few other resources available that detail different errors specifically


enough to categorize them for the VROC system, the ROSIS system is useful in prescribing possible error scenarios for training simulations, as it represents sufficient data points to generate a frequency distribution as well as to categorize errors into various categories, specifically to make recommendations about errors to simulate within the VROC. In radiation oncology, errors can be broadly classified as either dosimetric errors or spatial errors. Dosimetric errors are those in which the quantity of radiation is different than expected, due to calculation errors or due to devices or beam modifiers being incorporated into the treatment plan but omitted during treatment. Spatial errors, which are reported with a higher frequency, include any error where the target area is missed due to errors in the alignment of the patient with the therapeutic radiation beam. This can happen for a number of reasons, including confusing or missing instructions for patient setup, misidentification of setup marks on the patient, wrong patient files selected, or confusion in interpreting alignment images. Spatial errors are the most common errors and are one of the reasons for the push for radiographic imaging techniques. However, the rapid introduction of advanced technologies can result in a natural tendency to become too reliant on technology and assume the computer-calculated setup deviation to be correct without sufficient scrutiny of the image registration. The ROSIS system data was used for this study because to date it is the largest available resource of reported errors. An added benefit of using this system is that detailed descriptions accompany many of the errors. While the other reports that were cited also contain many errors, they only summarize the categories and do not provide enough information in order to re-


categorize them specifically for the VROC system. Specifically for this project, the frequency of errors that occur at different parts of the radiation oncology workflow could serve as a useful guide for developing the VROC system. To summarize the types of errors that can occur and where in the process they typically occur, the errors in the ROSIS system were analyzed and categorized by the different main subcategories of radiation oncology. A total of 651 errors were evaluated that contained enough information to assess where they occurred within the overall process. Table 3-1 is the complete list of all errors, with numbers to indicate how many of the overall errors were of that same type. The types of errors with the greatest frequency, as well as the percentage of the 651 errors at each step in the radiation oncology workflow, are illustrated in Figure 3-4. This illustrates that the greatest number of errors occurred at the treatment delivery and verification step. To further understand what types of errors to model within the VROC system, the errors were further classified depending on whether or not the error resulted primarily in a change in the magnitude of the radiation dose or primarily in a change in the location of the delivered dose. In some cases, errors resulted in both dosimetric and spatial aspects. Also, some errors could not be classified as either a spatial or a dosimetric error. Both the spatial and dosimetric errors were classified by whether they occurred prior to treatment (in planning or preparation) or whether they occurred at the time of treatment. The different categories and percentages of errors are summarized in Table 3-2. This shows that approximately 70% of the reported errors resulted in the patient receiving wrong treatment. The other errors were either caught before the patient was treated or were related to issues that did not result in a patient


receiving incorrect treatment. Table 3-2 indicates that errors relating to dosimetric issues prevailed within the pre-planning and planning sections of the workflow, while patient setup and spatial errors dominated during the actual treatment. Overall, the dosimetric errors were approximately equal in number to spatial setup errors. Based on this information, errors that will be simulated within the VROC system will primarily represent setup variations and spatial errors at the time of treatment, and will primarily represent dosimetric or calculation errors during the pre-treatment portion of the treatment.

Metrics in Radiation Oncology

The radiation oncology workflow was also evaluated to determine where appropriate opportunities for feedback exist within the VROC. A literature review of several metrics reported for radiation oncology treatments is described. The goal is to identify metrics that can represent the overall quality of the treatment plan and that would change if either spatial or dosimetric errors occurred. Ideally, these metrics are easily calculated and would be easily presented to the physician using the simulator at the time of treatment plan selection and at the end of treatment. The metrics that are described in this section have been reported in the literature for an array of different applications, including plan standardization, protocols, and treatment plan optimization, as well as attempts to evaluate different planning techniques. There are no reported instances of using these metrics as a routine means of scoring trainees or for feedback on physician ability. As part of a real clinical workflow, some of these metrics are not routinely reported, and the plan is checked for accuracy either by having another physician peer review the contours, dose, and plan, or by other chart checks. Both the routine feedback options and the possible VROC options are listed in Figure 3-5, which highlights the RO workflow with different opportunities for feedback.


Contouring Metrics

In most situations, the treatment planning process begins with the physician describing the area to be treated by target delineation, or contouring the structures on the CT scan of the patient. While there are several different metrics that have been used in radiation oncology for treatment plan comparisons or plan quality comparisons, these metrics represent the ideal dose, assuming the radiation is delivered exactly as the physician has prescribed. If the physician prescribes or indicates the dose to be delivered to the incorrect location, this may not be noticed and reported as an error unless there is a method for checking each contour and prescription the physician writes. In most clinics, checks for accuracy in contouring are performed by a peer review process where a colleague reviews the imaging and evaluates the contours. Also, there exist standards of care describing what structures should be treated and included within the treatment area, depending on the histology of the disease. The standardization of target and normal structure delineation has been the focus of many of the radiation oncology professional organizations for a number of years, including studies in which several physicians delineate targets to determine the best match. Within the clinical setting, and to better compare plans from different institutions enrolling patients into national protocols, ASTRO, RTOG, and RSNA have published sets of structure atlases with detailed descriptions of how structures should be named. Borders and margins for contouring are included. These atlases of contours are available through the RTOG or ASTRO websites, www.ASTRO.org (56) and www.RTOG.org (57). Most facilities do not have an automated method for contour comparisons, and in cases where residents are asked to


do the contouring, the attending either supervises them, makes changes, and/or explains what they would do differently. Within the VROC system, it was important to provide tools to compare contours, but the focus of the system was to provide an overall radiotherapy tool from start to finish. Therefore it was not the purpose of this project to determine the ground truth for each type of dataset. The simulation tool was meant to provide training opportunities and to provide the resident with an opportunity to practice contour delineation. It will also provide feedback to the residents regarding their contour in comparison to those in the original plan. In this way, each clinic and each supervising physician would establish their own ground truth for reporting purposes. A summary of reported contour comparison metrics was conducted by Hanna et al. (58), who concluded that there was no consensus on reporting metrics for comparing contours. However, they state that for a metric to be useful it must contain both volumetric and geometric information. Several papers use a correlation coefficient that is related to the area of overlap of two contours, such as the Jaccard coefficient. This coefficient is a value between 0 and 1, where 0 represents no agreement between the two volumes and 1 represents complete agreement between the two volumes. This correlation coefficient is sometimes called the Conformity Index (CI), the Concordance Index, or the Comparison of Union (59). Others have reported a similar overlap metric, the Dice similarity coefficient (60), shown in Equation 3-1:

DSC = \frac{2\left|A \cap B\right|}{\left|A\right| + \left|B\right|}    (3-1)

where A and B are the two contoured volumes being compared.
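For illustration, the sketch below computes both the Jaccard coefficient and the Dice similarity coefficient of Equation 3-1 from two binary contour masks. The numpy arrays and mask sizes are hypothetical stand-ins for contours that would normally be rasterized from RTSTRUCT data, and the final line checks the direct relationship between the two indices.

```python
# Minimal sketch, assuming contours have been rasterized into boolean
# numpy masks on the same grid.
import numpy as np

def jaccard(a: np.ndarray, b: np.ndarray) -> float:
    """Jaccard coefficient |A intersect B| / |A union B|, between 0 and 1."""
    intersection = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return intersection / union if union else 1.0

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice similarity coefficient 2|A intersect B| / (|A| + |B|), Equation 3-1."""
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * intersection / total if total else 1.0

# Hypothetical 2D masks standing in for a reference and a resident contour
reference = np.zeros((100, 100), dtype=bool); reference[20:60, 20:60] = True
resident  = np.zeros((100, 100), dtype=bool); resident[25:65, 25:65] = True

j = jaccard(reference, resident)
d = dice(reference, resident)
# The two indices are related by D = 2J / (1 + J), so the last two values agree.
print(f"Jaccard = {j:.3f}, Dice = {d:.3f}, 2J/(1+J) = {2*j/(1+j):.3f}")
```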


The Dice coefficient appears in the literature more frequently; therefore it was selected as one of the metrics for the VROC. There is a direct mathematical relationship between these two values, so one can be calculated from the other if necessary. A single similarity metric could be identical for two different contours, one in which the error misses part of the target and one in which non-target tissue is unnecessarily included, even though the clinical consequence of these errors is not the same. Additional metrics related to the type of tissue either missed or unnecessarily included must be identified. Two additional metrics that have been used are the percentage of the target volume that is missed, and the volume of non-target tissues that are included as target. For the initial reporting of the VROC system, all of these metrics will be included in the VROC report.

Metrics for Dosimetry in Radiation Oncology

There are several different metrics that have been used in radiation oncology for treatment plan comparison or for research studies; however, none of these have been utilized for managing or evaluating errors, or for feedback to residents about the quality of their work. In most treatment situations, structures do not receive a uniform dose of radiation. The radiation varies through each structure, in some cases from no dose to doses above the prescribed dose. The typical method for evaluating the change in dose throughout the structures is to create a histogram of the volumetric subsections (voxels) contained within the structure, sorted by dose (61). This is called a dose volume histogram (DVH) and is useful for quickly determining the dose given to different percentage volumes of structures. This is particularly useful for evaluating the consequences of treatments where the volume irradiated to certain threshold doses is important.
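As a simple illustration of how a cumulative DVH can be constructed, the sketch below bins the voxel doses inside a structure mask. The dose grid, mask, and bin width are hypothetical placeholders (in a real workflow they would come from the RTDOSE and RTSTRUCT files) and serve only to show the calculation.

```python
# Minimal sketch of a cumulative dose volume histogram, assuming the dose
# grid and the structure mask are aligned numpy arrays of the same shape.
import numpy as np

def cumulative_dvh(dose: np.ndarray, mask: np.ndarray, bin_width: float = 0.1):
    """Return (dose_bins, percent_volume) where percent_volume[i] is the
    percentage of the structure receiving at least dose_bins[i] Gy."""
    structure_doses = dose[mask]                      # doses of voxels inside the structure
    bins = np.arange(0.0, structure_doses.max() + bin_width, bin_width)
    volume = np.array([(structure_doses >= d).mean() * 100.0 for d in bins])
    return bins, volume

# Hypothetical example: a 50x50x50 dose grid (Gy) and a cubic "organ" mask
rng = np.random.default_rng(0)
dose = rng.uniform(0.0, 60.0, size=(50, 50, 50))
mask = np.zeros_like(dose, dtype=bool); mask[10:30, 10:30, 10:30] = True

bins, volume = cumulative_dvh(dose, mask)
print("Volume receiving at least 40 Gy: %.1f%%" % volume[np.searchsorted(bins, 40.0)])
```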


Typical structures that are volume dependent are the lung, kidney, and liver. For other structures, the maximum dose any point receives may be the most important predictor of side effects, such as damage to nerve tissues or skin. The maximum dose, average dose, and doses to certain volumes can be obtained from the DVH. An example of a dose volume histogram for a couple of structures, for both a normal plan and for how those same DVHs could change if an error affects those structures, is shown in Figure 3-6. Each structure will have a different DVH for each plan, including any errors that are simulated. The DVH is very common in radiation therapy and is evaluated by looking at the dose on the horizontal axis to determine the percentage of volume receiving that dose, or vice versa. Many of the protocols from the RTOG have several different dose and volume constraints, and some facilities have very sophisticated forms or spreadsheets that are used to evaluate plans. It is important to note that the dose and volume percentages change depending on the prescribed dose, and for dose escalation procedures there may be several different sets of tables of guidelines for a single disease site. The issue with this type of reporting is that all of the parameters are specific to the particular target tissue and prescribed dose. For generating a feedback tool for use over a wide range of types of cases and users, it is ideal to have a set of metrics that can be calculated for all types of treatment techniques and for all disease sites, and that is not dependent upon specific target structures. In order to do this, a few metrics that can be scaled to a standard dose fraction are ideal, as well as a set of metrics that can represent the DVH as a single number, rather than a set of numbers specific to a particular treatment procedure. Niemierko et al. (62) defined the equivalent uniform dose


(EUD) as the dose that, if delivered uniformly to the structure, would yield the same level of complication as the actual non-uniform dose. EUD can be calculated from the DVH by using:

EUD = \left( \frac{1}{N} \sum_{i=1}^{N} d_i^{\,a} \right)^{1/a}    (3-2)

where d_i is the dose in voxel i, N is the number of voxels in the structure, and a is a parameter determined empirically by fitting dose responses of various tissues. Another method used in comparing treatment plans attempts to relate the dose given to the relative risk of injury. The relative sensitivity of each structure is included as a parameter fit to empirical data that relates information from the dose volume histogram or the EUD to observed complications or cure rates, known as the Normal Tissue Complication Probability (NTCP) or the Tumor Control Probability (TCP), respectively (63-66). The EUD can be related to the normal tissue complication probability (NTCP) and the TCP proposed by the Lyman-Kutcher model by the following equations (67-69):

NTCP = \frac{1}{1 + \left( TD_{50} / EUD \right)^{4\gamma_{50}}}    (3-3)

TCP = \frac{1}{1 + \left( TCD_{50} / EUD \right)^{4\gamma_{50}}}    (3-4)

where TD_{50} is a value that relates to the tolerance dose for a 50% complication rate for the specific structure of interest (TCD_{50} is the analogous dose giving a 50% probability of tumor control). The value \gamma_{50} represents the slope of the dose response curve and is modeled for each individual structure. Since the values of NTCP and TCP represent probabilities of side effects or of cure, they are conceptually easy to understand, and have potential for revealing increases in potential risk from a given error.
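A brief sketch of how Equations 3-2 through 3-4 can be evaluated for a single structure is shown below. The voxel doses and the tissue parameters (a, TD50, TCD50, and gamma50) are illustrative placeholders chosen only to demonstrate the calculation, not clinically validated values.

```python
# Minimal sketch of Equations 3-2 to 3-4, assuming voxel doses for one
# structure are available as a numpy array (e.g. dose[mask] from the DVH step).
import numpy as np

def eud(voxel_doses: np.ndarray, a: float) -> float:
    """Equivalent uniform dose, Equation 3-2."""
    return float(np.mean(voxel_doses ** a) ** (1.0 / a))

def ntcp(eud_value: float, td50: float, gamma50: float) -> float:
    """Normal tissue complication probability, Equation 3-3."""
    return 1.0 / (1.0 + (td50 / eud_value) ** (4.0 * gamma50))

def tcp(eud_value: float, tcd50: float, gamma50: float) -> float:
    """Tumor control probability, Equation 3-4."""
    return 1.0 / (1.0 + (tcd50 / eud_value) ** (4.0 * gamma50))

# Placeholder voxel doses and parameters, for illustration only
organ_doses  = np.random.default_rng(1).uniform(10.0, 30.0, size=5000)  # Gy
target_doses = np.random.default_rng(2).normal(70.0, 1.5, size=5000)    # Gy

organ_eud  = eud(organ_doses, a=8.0)      # large positive a: serial-type organ
target_eud = eud(target_doses, a=-10.0)   # negative a: target volume

print("Organ EUD  %.1f Gy, NTCP %.3f" % (organ_eud, ntcp(organ_eud, td50=45.0, gamma50=3.0)))
print("Target EUD %.1f Gy, TCP  %.3f" % (target_eud, tcp(target_eud, tcd50=60.0, gamma50=2.0)))
```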


They have not, to date, been explored for this purpose. Another advantage of a set of probabilities is that they can be combined using the mathematics of combined probability in order to represent the probability of complication-free survival (P+) (70) or a total complication value (NTCPtot). All metrics that are calculated for these planning studies attempt to assign realistic outcomes to different types of treatment based on various aspects of the treatment plan. Specific studies have been conducted using patient follow-up and outcomes, along with biological studies, to develop models based on specific tissue responses to radiation. Because different cell types respond differently, and measurable endpoints are different, studies typically focus on a specific disease with one measurable side effect. Examples of these studies include rectal bleeding during prostate treatment (71), hearing loss during acoustic neuroma treatment (72), laryngeal swelling in head & neck treatment (73), and pulmonary complications following breast treatment (74). While there remains controversy about the reliability of any specific numerical score to project specific patient outcomes, these studies have proven useful in providing relative information and a scale by which to compare competing treatment plans, and therefore their use in comparison to a standard plan when analyzing errors seems reasonable. All of the above metrics have been used in radiation oncology for treatment plan comparison or for research studies comparing specific endpoints; however, they are not commonly proposed for error reporting systems. One study reported that the use of the equivalent uniform dose (EUD) to the target volume and a few selected structures was

PAGE 68

useful in describing the consequences of certain errors (75). Another group examined the same metric as a means of reporting setup uncertainties, which are often much smaller in magnitude (a few mm) but occur frequently (76). While both reported utility in using EUD, these approaches have not been widely accepted. These metrics are all scores that have been used to rank the quality of treatment. However, there is no evidence that they can be tied to patient outcomes, which is why it is difficult to use them as an absolute score of the quality of a resident's work. Within education in general, and virtual training specifically, it is extremely important to provide quick and complete feedback to the resident about their progress. Ideally, there would be some way to score the trainee so that they can compare their performance to another individual or to themselves without relying solely on qualitative feedback from clinical instructors. The goal of this work and of establishing these metrics is to provide feedback that helps the trainee identify areas needing improvement. Even though these radiation oncology metrics may not be directly correlated to patient outcomes, they may still have utility as general feedback about a treatment. When describing treatment plans or deviations from treatments, there are often other parameters, related to target volume coverage or to the amount of non-target tissue that is treated, which may need to be reported as well. The physician will typically look at how well the prescribed dose matches the target volume that they have delineated. Various conformity metrics have been reported in the literature for this closeness of fit. Feuvret et al. (59) concluded that while there are numerous
methods of calculating a conformity index, there is no consistency in how these values are calculated. One of the more common metrics is the Jaccard index (the same index described for contour comparison). This metric is also not typically used for error reporting, but it is an indication of how an error affects the overall coverage of the tumor by the prescribed dose. Finally, the actual percentage of coverage used to calculate the Jaccard index may be one of the simplest values to compute, but it is also typically included in some way within the prescription and should likewise be reported in error summaries.

Metrics for Error Reporting

Unlike the dosimetric metrics, metrics specifically designed for reporting errors do not have any value for the standard plan and are typically created to describe the severity of the error in any number of ways. Depending on the overall goal of the study, some severity scores emphasize the likelihood of the error occurring or the ease with which it can be detected. While this information is useful for designing a quality assurance program or for failure mode analysis, it may not be meaningful to the resident when an error actually does occur. While these scores are interesting, they may not be helpful for resident feedback because they are not widely accepted and would be specific to the particular facility reporting the event. A newly published report of the joint ASTRO and AAPM task group on recommendations for error reporting proposes two different metric scales for reporting errors and near misses. Both scales are to be used for each event, and both range from 0 to 10. One score is based entirely on the change in dose and is scaled by the percentage of change from the prescribed dose to any structure. The other metric is based on medical events, whether permanent or temporary, and may increase
with early or late occurring side effects (77). Two tables of the severity scoring are reproduced as Table 3-3 and Table 3-4. The recommendations on how to score errors are included within the tables, and the report allows for scoring based on predictions of medical events for near-miss cases, which can be applied to errors that are simulated within the VROC.

Detrimental dose is a metric that was recently proposed at a national meeting of the AAPM by Carlone et al. (78). Detrimental dose (DD) is calculated using parameters related to the sensitivity of a structure and the dose deposited to that structure, and it also provides a means of scaling in the event that a systematic error causes several patients' treatments to be in error. This is similar to the dose equivalent that is calculated for radiation safety concerns in health physics. While detrimental dose has been proposed, it has not been tested or utilized extensively; it may prove to be a useful metric to include in error reporting software programs (78). The detrimental dose (DD) is calculated in units of Gy using Equation 3-5, where DE is the dose error to the target structure, D is the dose error to the normal tissue, VI/V is the fraction of the structure that is irradiated, TS is a weighting factor that relates to the sensitivity of the structure in question, and SI is the severity of the injury that resulted from the dose error. If the error resulted in many patients receiving a dose error, then the quantity is summed over all patients.

Summary

To further specify the requirements of a VROC, the overall radiation oncology process was outlined and a workflow diagram was created. The workflow, as it
relates to the other team members and their work, was created to determine all the aspects of radiation oncology that would have to be simulated for the physician resident to do all of the expected work of a physician managing the care of a patient without supervision. The specific data elements used at each step in the radiation oncology workflow are listed in Figure 3-3. The minimum computer systems required for simulating the work of the physician include an EMR, a TPS, and an R&V system. The errors that have occurred in radiation oncology were categorized by where they typically occur in the process: dosimetric errors typically occur at the time of treatment planning and spatial errors occur at the time of treatment, with approximately the same frequency. The opportunities to provide feedback to a resident using the VROC simulation include providing calculated metrics to compare contours as well as metrics to compare different treatment plans. Treatment planning metrics and error metrics can be calculated for any treatment error. The metrics recommended for initial calculation include NTCP for all normal tissues, EUD for both normal and target tissues, TCP for target tissues, percentage of target coverage, conformity index, detrimental dose, and the severity scores recommended for radiation oncology reporting systems. Opportunities for these metrics to be calculated include the time frame after the treatment plan is selected and the end of treatment. In order to simulate the overall radiation oncology workflow, the other requirement for the VROC is that there must be a way to transfer different patient records into the system at the correct times. For the purpose of the VROC, all patient data represents the virtual patient, and part of the development will be to create a process for transferring appropriate files in at the right time to make the radiation
oncology workflow happen in real time. This includes the CT scan for planning, the treatment plans, and all of the files representing daily treatments. Additional development will be needed for the files that represent patient treatments because, in some cases, an error will need to be introduced into the treatment images, and these will need to be modified accordingly.
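To make the recommended dose metrics concrete before turning to the implementation, the following is a minimal MATLAB sketch of how EUD, TCP, and NTCP (Equations 3-2 through 3-4) might be computed. The voxel doses and the structure parameters (a, TD50, TCD50, gamma50) are illustrative placeholders, not the values used in the VROC, where these quantities are calculated from exported DVH files rather than raw voxel doses.

```matlab
% Minimal sketch of the EUD/TCP/NTCP calculations (all numbers are illustrative only).
targetDose = [70 69 71 68 72 66 70];   % Gy, hypothetical voxel doses in the target
oarDose    = [25 40 55 10 60 35 20];   % Gy, hypothetical voxel doses in a normal tissue

% Equation 3-2: equivalent uniform dose, with a structure-specific parameter a
a_target = -10;                          % negative a emphasizes cold spots in targets
a_oar    = 4;                            % positive a emphasizes hot spots in normal tissue
EUD_t = mean(targetDose.^a_target)^(1/a_target);
EUD_n = mean(oarDose.^a_oar)^(1/a_oar);

% Equations 3-3 and 3-4: logistic dose-response models based on EUD
TCD50 = 60; gamma50_t = 2;               % assumed tumor parameters
TD50  = 65; gamma50_n = 3;               % assumed normal tissue parameters
TCP  = 1/(1 + (TCD50/EUD_t)^(4*gamma50_t));
NTCP = 1/(1 + (TD50 /EUD_n)^(4*gamma50_n));

fprintf('Target EUD = %.1f Gy, TCP = %.2f; OAR EUD = %.1f Gy, NTCP = %.2f\n', ...
        EUD_t, TCP, EUD_n, NTCP);
```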


Figure 3-1. Workflow of radiation oncology team member responsibilities across the five process steps (1. Consultation, 2. Planning Preparation, 3. Treatment Planning, 4. Treatment Prep (QA), 5. Treatment/Verification).

Figure 3-2. Workflow of radiation oncology overview (1. Consultation: review EMR; 2. Planning Preparation: review CT images and indicate target area (contour) and beam; 3. Treatment Planning: treatment plans are calculated to cover the target; 4. Treatment Prep (QA): prepare plan and reference images; 5. Treatment Verification: CBCT, portal, and kV images).

Figure 3-3. Workflow of radiation oncology data elements (EMR, TPS, and R&V data items and their DICOM or document formats at each of the five process steps).

Figure 3-4. Workflow of radiation oncology common errors.

Figure 3-5. Workflow of radiation oncology feedback and metrics.

Figure 3-6. Sample dose volume histogram. The target volume (PTV), sigmoid colon, and rectum are shown for the normal plan (solid) as well as for a plan that contains errors; dose in Gy is plotted against percent volume.

Table 3-1. Error distribution from the ROSIS system (error type: number of reports, frequency)

Simulation (category total 4.61%)
    CT range or slice thickness: 9 (1.38%)
    Simulation information: 5 (0.77%)
    Film or patient ID labeled wrong: 13 (2.00%)
    Wrong area: 3 (0.46%)

Prescribing (category total 7.22%)
    Rad/chemo not synchronized: 2 (0.31%)
    Rx: 40 (6.14%)
    Rx change: 4 (0.61%)
    Prior treatment not accounted for: 1 (0.15%)

Treatment Planning (category total 11.52%)
    Calculation: 34 (5.22%)
    Planning / wrong point: 1 (0.15%)
    Calculation (depth or distance): 20 (3.07%)
    Block calc/tray factor: 11 (1.69%)
    Plan change: 1 (0.15%)
    Treatment planning (general/wrong algorithm): 5 (0.77%)
    Gap calculation or overlap settings: 3 (0.46%)
    Wrong energy: 12 (1.84%)

Preparing Plan (category total 21.97%)
    Chart check not done / other process: 18 (2.76%)
    Charting (something written wrong or put into R&V wrong): 62 (9.52%)
    Charting field names/labels: 14 (2.15%)
    Charting (missing/unclear instructions): 30 (4.61%)
    Data transfer (DRR/from TPS): 17 (2.61%)
    Dose/fraction: 2 (0.31%)
    Printouts/paperwork: 3 (0.46%)

Table 3-1. Continued

Treatment Patient Setup (category total 44.55%)
    Bolus: 23 (3.53%)
    Machine settings (gantry/collimator/field size): 58 (8.91%)
    No. of treatments (BID, Rx change, treated twice, or missed): 15 (2.30%)
    Accessory/compensator/wedge: 9 (1.38%)
    Block: 50 (7.68%)
    Boost fields or wrong fields treated: 5 (0.77%)
    Computer issues: 10 (1.54%)
    Couch settings (either not noted or not set correctly): 11 (1.69%)
    Communication (change not communicated/new sim): 6 (0.92%)
    Immobilization: 11 (1.69%)
    Portal imaging: 3 (0.46%)
    Manually recorded wrong: 1 (0.15%)
    Setups (either console or patient): 30 (4.61%)
    Marks: 3 (0.46%)
    Shifts: 36 (5.53%)
    Patient treated when on break: 1 (0.15%)
    Wrong isocenter: 9 (1.38%)
    Wrong patient: 7 (1.08%)
    Wrong plan treated: 2 (0.31%)

Other (category total 7.83%)
    Process: 26 (3.99%)
    Brachy/HDR: 2 (0.31%)
    Unusual setup: 1 (0.15%)
    Wrong drug label (?): 1 (0.15%)
    Films not reviewed: 2 (0.31%)
    Infection: 1 (0.15%)
    Machine QA: 2 (0.31%)
    Mechanical failure/broken equipment: 10 (1.54%)
    Pacemaker in field: 2 (0.31%)
    Patient fall: 1 (0.15%)
    SRS: 1 (0.15%)
    Wrong machine: 1 (0.15%)
    Block cutter: 1 (0.15%)

Table 3-2. Summary of ROSIS results by category

                      Dosimetry    Spatial    Both
Before Treatment      21.49%       1.07%      9.98%
At Treatment          11.20%       20.43%     8.61%

Table 3-3. Dosimetric severity score

Score    Dosimetric scale
9/10     >100% absolute dose deviation from the total prescription for any structure
7/8      >25-100% absolute dose deviation from the total prescription for any structure
5/6      >10-25% absolute dose deviation from the total prescription for any structure
3/4      >5-10% absolute dose deviation from the total prescription for any structure
1/2      <5% absolute dose deviation from the total prescription for any structure
         Not applicable

Adapted from Ford EC, Fong de Los Santos L, Pawlicki T, et al. Consensus recommendations for incident learning database structures in radiation oncology. Med Phys 39: p. 7285, Appendix 1, Table 1.

Table 3-4. Consequence severity score

Score    Consequences (actual or predicted)
10       Premature death
8/9      Life-threatening intervention essential
8        Possible recurrence due to underdose
7        Permanent major disability (or Grade 3/4 permanent toxicity)
5/6      Permanent minor disability (Grade 1/2 permanent toxicity)
3/4      Temporary side effects requiring major treatment or hospitalization
2        Temporary side effects requiring intervention
1        Temporary side effects requiring no intervention
0        No harm
         Unknown

Adapted from Ford EC, Fong de Los Santos L, Pawlicki T, et al. Consensus recommendations for incident learning database structures in radiation oncology. Med Phys 39: p. 7285, Appendix 2, Table 1.
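As an illustration of how the dosimetric scale in Table 3-3 could be applied automatically within a feedback report, the sketch below maps an absolute percentage dose deviation to the corresponding score band. The function name and the decision to return the paired scores as text are assumptions made for this example.

```matlab
function score = dosimetricSeverity(pctDeviation)
% Map an absolute percent dose deviation (for any structure) to the
% Table 3-3 severity band of the ASTRO/AAPM incident learning report.
d = abs(pctDeviation);
if d > 100
    score = '9/10';
elseif d > 25
    score = '7/8';
elseif d > 10
    score = '5/6';
elseif d > 5
    score = '3/4';
else
    score = '1/2';
end
end
```

For example, a plan delivering 30% more dose than prescribed to a structure would fall in the 7/8 band.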


CHAPTER 4
DEVELOPMENT OF A VIRTUAL RADIATION ONCOLOGY CLINIC

Based on the requirements listed in Chapter 2 and the overall workflow from Chapter 3, a template for the VROC system was created. Figure 4-1 illustrates the RO workflow that was described in Chapter 3 as well as the added elements needed to simulate this workflow. The added elements include a detailed definition of a virtual patient, a virtual treatment machine to simulate the treatment records, and computer programs to provide feedback. The details of these additional components are described within this chapter. For reference, the added VROC elements are labeled with letters A-H and the typical radiation oncology workflow steps are numbered 1-5 in both the text and in Figure 4-1.

Within the simulation of the RO process itself, the physician resident acts as the physician who is managing the care of the virtual patient, and therefore there is no distinction between attending physician and resident as it relates to the virtual patient. To clarify the different roles, the term "trainee" is used to describe the specific role of the physician using the VROC, while "physician" describes the relationship that either the resident or the attending has to a virtual patient or to a real patient. The other user of the VROC is the simulation "director," whose role is to make sure the simulation moves forward and does not get stuck. The director has the added responsibility of assembling all of the files necessary for each simulated patient plan and treatment.

Materials

Based on the workflow described in Chapter 3, the minimum equipment required for the simulation of the clinical RO workflow includes the following computer systems: an EMR, a TPS, and an R&V system. An electronic medical record (EMR) system is needed to contain and organize patient information, a treatment planning system (TPS) is needed to define target volumes and to create and approve treatment plans, and an R&V system is needed for treatment verification and review. While it might be possible to simulate these types of systems, the best way to ensure realism for the end user is through the use of commercially available software. An in-kind loan among Varian Medical Systems (Palo Alto, CA), the University of Florida (Gainesville, FL), and UF Health Cancer Center at Orlando Health (previously named MD Anderson Cancer Center Orlando) was granted to provide clinical computer systems to be used to build the VROC system. The commercial software available through this agreement included one Eclipse treatment planning station and two Aria workstations. One advantage of the Varian system compared to other systems on the market is that all components of the system utilize a common database. This means that there is a single patient file shared between the EMR, TPS, and R&V system, eliminating the need for file transfer in and out of different components within the RO workflow and further facilitating automation for the simulation. Another advantage is that the different software products run on a Microsoft Windows platform, which allows other Microsoft or PC-based software to run on the same platform. Even though this system uses a common database, the actual product is a set of several applications, each with a different function within the RO workflow. Table 4-1
is a list of all of the Varian software applications that were available. The specific utility of each software module is described in the table, along with the specific step in the RO workflow for the VROC where the application would be used. Some applications are not needed for the VROC because they are designed to support elements of the radiation oncology or medical oncology process that do not need simulation, such as accounts receivable and chemotherapy scheduling and treatments. To summarize the Aria and Eclipse applications, the EMR consists of the applications Patient Manager, Time Planner, and Scheduling; the R&V is made up of the applications RT Chart, Chart QA, and Offline Review; and the TPS is the single application Eclipse. The components that a physician typically works with in a clinical setting include Patient Manager, Time Planner, RT Chart (rarely), Offline Review, and Eclipse, and these were the key components used for the VROC system. Both Eclipse and Aria also include administration modules that were needed in order to add patients, configure the system, and test the workflow. These tools are only used as part of setting up the system and creating patients and are not used within the VROC other than to delete or archive patient files at the end of a simulated treatment.

Another requirement for the development of the VROC was access to patient radiation oncology records. A protocol was approved through the Institutional Review Board (IRB) at UF Health Cancer Center Orlando at Orlando Health (under the name MD Anderson Cancer Center Orlando) to allow the use of patient records for educational purposes and for the development of a simulation training tool. All data had to have patient identification information removed, and the patient data was not to be shared other than for educational purposes.

The other materials needed included software for developing the metrics calculation code as well as DICOM editing and DICOM image management tools. A student license of MATLAB (MathWorks, Natick, Massachusetts, USA) was installed on one of the VROC workstations to allow code to be developed for use within the VROC. A free DICOM viewer (Santesoft DICOM Viewer 3D, Santesoft LTD, Athens, Greece) was used to easily view both the DICOM header and the images throughout the development process. Additional development for the VROC included creating documents and forms. User manuals for the various system users were written and made available on the VROC system desktop. In some of the applications it was necessary to have the user fill in a request form or a feedback survey. Online forms were created using Adobe FormsCentral. This allows forms to be shared via the internet, and a set of icons was created on the desktop of the VROC system to represent the different forms and surveys required as part of the VROC.

Methods

The specific methodology for developing the VROC system was to detail the desired simulation workflow. This workflow included identifying all elements of the RO workflow from Chapter 3 as well as interactions between the instructor and the trainee that were specific to the simulation. The general RO workflow of Chapter 3 had to be broken down into each specific task that was performed. For each task, both the data elements and the appropriate software from the materials list were identified. Areas where computer code or a process had to be developed are labeled as VROC; this included any tasks that are typically performed by other RO team members.

A detailed table was created to begin the development process. The table included specific options for software as well as the specific task to be performed. The entire table is included in Appendix A. As with many development projects, various tasks were developed simultaneously, and in some cases processes had to be developed by trial and error. The developed aspects of the VROC fall within eight specific categories that can be identified on the VROC workflow diagram in Figure 4-1. The typical RO workflow items are labeled numerically from 1-5. The eight items that were developed are listed below:

A. A virtual patient for RO purposes, and a methodology for creating the virtual patient.
B. A method for describing a specific simulation scenario, including the patient and the simulated error, within the VROC system.
C. A program to calculate contour metrics for contour comparison.
D. A program or process to check the prescription.
E. A program to generate metrics for plan comparison.
F. A virtual machine to generate the different treatment records and verification images representing daily treatments.
G. A program to calculate treatment plan metrics for final treatment review.
H. A final feedback and exercise on error mitigation.

Virtual Patient (Figure 4-1 A)

The virtual patient represents the collection of all of the data elements needed anywhere within the RO process to simulate a patient receiving radiation therapy. The virtual patient is created by copying and replicating actual patient images and hospital records for a patient who has received radiation therapy. All patient data must be de-identified such that the actual patient identity cannot be discovered. It will be part of
the simulation director's responsibilities to collect these items and organize them for use within the VROC. A detailed procedure and checklist for how to copy, anonymize, and upload these images into the VROC is included in the VROC Director User Manual available at the UF Digital Archive (Object 4-1). The detailed list of files from the actual real-world patient that are needed to create the virtual patient is included in Table 4-2. Steps 1-5 of the RO workflow are described below to explain how each of the files is used to simulate the patient within the VROC.

Object 4-1. VROC Director User Manual as PDF document (.pdf file 256 KB)

Step One of the process is the consultation. For the purposes of the VROC, the face-to-face interaction of the consultation itself is not currently simulated. The trainee will, however, have the opportunity to review, within the Patient Manager application, all documents and chart information that would typically be available at the time of this consultation. This should be adequate information to determine whether radiation is appropriate. The information contained in the VROC is taken from an actual real-world patient, with all patient-identifying information removed or replaced.

Step Two in the process is the treatment preparation. The information for the virtual patient is created by making a copy of the digital documents and images of an identified real-world patient treatment. The copy of these files is made anonymous and saved in the virtual patient folder. The files include all required diagnostic imaging as well as the treatment-planning-specific CT and MR images. The actual patient treatment plan, contoured structure files, and dose distributions are retained so that the simulation values can be compared to the expert's. This entire patient record is copied, labeled "EXPERT," and saved in the virtual patient file folder so
that the structures and the treatment plan can be compared later in the simulation during metric calculations and feedback. Another copy of the entire patient record is created by deleting the treatment plans and dose calculations and deleting the target volume structures. The remaining patient file on the Eclipse system includes the treatment planning CT, supporting medical images (MRI or PET), and all normal tissue contours. This copy is also archived within the virtual patient file folder so that, at any time, the old patient can be purged from the system and a fresh copy restored; how to create this patient is described in the Director User Manual.

Step Three of the process is the treatment planning. This process is not usually performed by the physician; however, the physician is skilled in evaluating plans. To simulate a set of treatment plans developed by the physics staff, a set of pre-calculated treatment plans is created when the virtual patient is created. The simulation director is responsible for creating the pre-calculated treatment plans based on the initial real-world patient treatment and on the types of errors the instructor would like to simulate. The actual patient treatment plan within the TPS is copied, and changes are made to create errors within the plans (at the discretion and direction of the instructor). Some of the easiest errors to simulate include changes to the prescription and changes to beam energies. A patient record or file within the Eclipse and Aria system can have many different treatment plans; the simulation training sets typically have between 2-4 different treatment plans. When creating the virtual patient, these plans are created and the entire patient record is archived into a folder named "MULTIPLE."
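As a minimal sketch of the de-identification step described in Step Two above, the director's scripting could look like the following in MATLAB (assuming the Image Processing Toolbox is available). The folder names and the replacement identifiers are hypothetical, and the actual VROC procedure is the one documented in the Director User Manual.

```matlab
% De-identify a folder of DICOM images and re-label them for the virtual patient.
srcDir = 'C:\RealPatient\CT';                 % hypothetical source folder
dstDir = 'C:\VirtualPatient\CT';              % hypothetical destination folder
files  = dir(fullfile(srcDir, '*.dcm'));

% Replacement demographics written into every anonymized file.
newAttrs = struct('PatientName', 'VROC^Virtual01', 'PatientID', 'VROC0001');

for k = 1:numel(files)
    inFile  = fullfile(srcDir, files(k).name);
    outFile = fullfile(dstDir, files(k).name);
    dicomanon(inFile, outFile, 'update', newAttrs);   % strip identifying tags
end
```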


Specifically, within the simulation, once the trainee has completed contours, the director can load the entire patient record from the MULTIPLE folder into the virtual patient record that the trainee was working on. This makes all of the treatment plans available for review at one time. Once the plans have been uploaded, the trainee reviews the different plans using the tools of the TPS and evaluates the plans based on how well they achieve the planning objectives. For treatment plan approval, the resident is expected to understand the different screens within the planning system and to know how to extract the numbers required to evaluate a plan. Once the trainee is satisfied that a clinically appropriate plan has been developed, they approve the plan within the TPS. Based on the approval of the plan, a set of metrics is generated comparing the approved plan to the plan that was used to treat the actual real-world patient.

Step Four of the process is plan preparation. Other than approving the treatment plan, the physician does not typically prepare the plan for treatment, so this step is the responsibility of the simulation director. Before the treatment plan is ready for treatment, it must also be approved within RT Chart. These approvals are required in part because the system is a clinically available product, and these checks are mandated to force a check of all the treatment beams prior to patient treatment. In order for the approved plan to be ready for treatment, each treatment beam in the plan has to have an assigned reference image. The reference images are DRRs that are calculated within the TPS by ray tracing through the treatment planning CT scan. The DRRs are reconstructed for each beam angle and represent the ideal patient position for treatment. When the reference images are created, they are
assigned a unique DICOM identification number that indicates the specific software used to create the file. Other details included in the DICOM header include all of the details about the beam and patient orientation. The DICOM treatment plan (RT_PLAN) file contains all of the machine settings for each treatment beam and also includes the specific identification number of the reference image assigned to that beam. All of the unique identification numbers (UIDs) are created when the files are created and approved. When virtual patient treatment occurs, the treatment verification images and treatment files have to reference the correct plan UID and image UID in order to match the files to the correct patient, plan, and reference image. Because of these details of the DICOM header and the UIDs, all files are created in real time as part of the simulation process and cannot be prepared ahead of time.

Step Five in the process is the treatment delivery and the radiographic verification images that are taken at the time of treatment. Daily treatments are represented in the R&V system by treatment records of machine settings, which are used to chart the number of daily treatments given and the accumulated dose. Verification or localization imaging is checked using the R&V software. The files representing daily treatments are uploaded on a day-to-day basis so that the simulation runs in real time. The collection of software that was written to create the daily treatment records and the verification images is described in the upcoming section on the virtual treatment machine. When the virtual patient is created, template images representing portal images and kV x-ray images for a perfect patient setup are created. The Eclipse software allows the user to change the settings when creating the DRR to vary the contrast by
changing the relative weighting of different ranges of CT values for the simulated radiograph. It is possible to create DRRs that look more like conventional MV portal images. This process is only done when the virtual patient is created. The template images can be further degraded or shifted to represent errors as part of the daily treatment process, which is described in the section on the virtual treatment machine. In addition to the template portal and kV images, if the instructor would like to use CBCT images for a virtual patient, a dataset of the actual patient's CBCT images has to be stored in the virtual patient file.

Scenario Request (Figure 4-1 B)

The VROC system was created specifically for educational purposes. In order to achieve specific educational objectives, the scenario request form was created to allow the instructor to request a specific patient and specific errors to play out as part of a simulation scenario. Each virtual patient can have several different scenarios that vary in level of complexity. The VROC was intentionally made flexible to accommodate all types of errors that can occur, both dosimetric and spatial; in addition, the virtual patient can be set up to contain more than one error. For each scenario it is important that there be specific learning objectives and expected responses from the user when they encounter an error in the VROC simulation. These objectives can be used at the end of the simulation to ensure that the trainee learned the desired information or response. The Scenario Request Form is used in order to build the virtual patient. The current form is a web-available form maintained at Adobe FormsCentral, but it has also been uploaded to the UF Digital Library and can be found as the VROC Scenario Request Form (Object 4-2). The information that the simulation director needs in order to
build the virtual patient includes the actual patient name and/or medical record number, the specific image sets that are needed, and the specific documents that should be replicated.

Object 4-2. VROC Scenario Request Form as PDF document (.pdf 53 KB)

The instructor should indicate any specific treatment planning errors that they want to show to the trainee and why they have chosen these errors. If there is a specific response they expect from the trainee, they must indicate that as well. The same is needed for treatment delivery errors. The table of errors from Chapter 3 is included in the Instructor User Manual so instructors may review the types of errors available for simulation. Also, information on how to write a learning objective is included in the VROC Instructor User Manual (Object 4-3) located at the UF Digital Archive. The other information needed from the instructor includes any supplemental reading for the trainee and any specific assessments of learning that the instructor would expect of the trainee.

Object 4-3. VROC Instructor User Manual as PDF document (.pdf 175 KB)

Contour & Prescription Comparison (Figure 4-1 C & D)

After the trainee has reviewed the documentation and diagnosis of the patient, they must review the patient CT images and identify the area to target with radiation. The trainee should be able to independently log into the TPS, open the patient file, and delineate the target volumes on the CT dataset. This is a typical expectation for actual real-world residents working with patients, so it is no different for the VROC. Specific instructions on how to do this are included in the VROC Trainee User Manual (Object 4-4). It is quite common in a busy center for the normal tissue contours to be defined by the dosimetrist or the physicist, so the initial patient file for the virtual patient
within the TPS for the VROC system will have all of the normal tissues pre-defined, but it will not have contours for any of the target structures. When the trainee is done contouring all of the target structures, they save the treatment plan.

Object 4-4. VROC Resident User Manual as PDF document (.pdf 2 MB)

At this time the trainee will also need to specify all of the dose constraints for the plan. The prescription (Rx) associated with a treatment plan within many EMR and R&V systems contains only space for a single target structure name with one total dose and the number of treatments. This is inadequate to define the entire treatment plan, so many facilities use a spreadsheet or simple notes to describe the planning intent for the variety of targets and organs at risk. Another web form was created for the trainee to fill in all requested doses and dose limits to normal tissues. This form simply has a list of targets that the trainee can fill in, along with the corresponding dose and fractionation. For the organs at risk, the trainee must list a dose limit to a specific volume of tissue or a maximum dose. There is also space to record up to eight organs at risk and their corresponding dose limits. A copy of the Rx Request Form has been uploaded to the UF Digital Archive (Object 4-5). The prescription values that were used to treat the actual patient can be compared to those entered on the prescription form within this version of the VROC, and the results can be emailed to the trainee as a table.

Object 4-5. VROC Rx Request Form as PDF document (.pdf 89 KB)

Contour comparison metrics are calculated by comparing the DICOM_RT structure file saved in the VROC system, corresponding to the trainee contours, to the DICOM_RT structure file that was saved in the EXPERT folder when the virtual patient was created. This DICOM file contains contour coordinates per CT slice for each
contoured object. The coordinates can be read into MATLAB code and converted to a 3D polygon. Mathematical subroutines within MATLAB are used to calculate the intersection of the different 3D objects. This, along with the volumes of the two structures, is used to calculate the Dice similarity coefficient defined in Equation 3-1. The percentage of the expert target volume missed by the trainee is also listed, and the volume in cubic centimeters of normal tissue that the trainee included within their target volume is calculated. A table of these values can be exported from MATLAB and emailed to the trainee and the instructor to compare the contours. An example of these calculations for a lung target case is shown in Table 4-3.

Plan Comparison Metrics (Figure 4-1 E)

The next area to be developed included feedback metrics comparing different treatment plans. If the trainee chooses a plan with an error, the simulation is stopped and metrics are calculated comparing the plan with errors to the standard plan. The cumulative DVH corresponding to the standard and the error plan can easily be exported from the Eclipse workstation, either during the virtual patient preparation or when the metrics are calculated. The DVH file contains percentage dose and percentage volume columns for each contoured structure, and all of the comparison metrics are calculated from this file. The equations and details about how these values are calculated are given in Chapter 3. Code that was published in the literature (69) was adapted to give feedback for the plan comparison. This code calculates EUD and NTCP for normal tissues and EUD and TCP for target structures. Additional required input for the metrics calculations includes the total number of fractions and the prescribed total dose. The equations for EUD, TCP, and NTCP require specific input values that are different for each structure and relate to
the radiation response of different tissue types. More details about the specifics of the metric calculations can be found in Appendix B. From the DVH, the MATLAB code is also used to extract the percentage of the target volume within the prescribed dose. Also of interest was a measure of the conformity of the prescribed isodose line to the target volume. The most common metric for dose conformity reported in the literature is the Jaccard conformity index (59):

\mathrm{CI} = \frac{A \cap B}{A \cup B}    (4-1)

where A is the target volume, B is the total tissue volume within the prescription isodose line, and A ∩ B is the volume of the target covered by the prescription isodose line. Both of these values can be extracted from the DVH. The calculation of all of these metrics for both the error plan and the plan without errors presents a significant amount of data that could be confusing to the trainee. In order to evaluate all of the metrics and how they change with the severity of an error, tests were designed to recommend a final report for the VROC system; this is the subject of Chapter 6.

Virtual Treatment Machine (Figure 4-1 F)

When the resident has selected an acceptable plan for treatment, the patient proceeds with virtual treatment. The intent of the VROC is that it runs in real time, so each day represents a single treatment. The virtual treatment machine is a collection of four different computer programs. One creates the file that represents the treatment and daily dose; the other three correspond to the three different imaging types available: portal images, orthogonal kV images, and CBCT images. The information
from the instructor's scenario request form is used to select the specific imaging for each virtual patient. Each day the files are created and uploaded into the R&V system for all virtual patients undergoing treatment. Details about the four different programs are described below.

Treatment record: The RT_TREAT file is a DICOM record generated by the treatment machine. This DICOM file includes the patient treatment plan settings (desired beam settings) and also records the actual beam settings at the time of treatment. At the time of treatment an RT_PLAN file is sent to the treatment machine and used to program the treatment machine settings. The machine interfaces with the R&V computer to compare the RT_PLAN settings to the actual machine settings, and both actual and planned settings are recorded in the RT_TREAT file. To represent a patient treatment error, the parameters in the DICOM file associated with a specific machine setting can be changed using MATLAB. For example, to represent a patient setup in the wrong location, the table parameters can be changed; to represent a change in the amount of time the machine was on, the MU settings can be changed. In the clinical situation, if the treating therapist does not use the settings from the plan file (within a narrow range), they must acknowledge this deviation and override the parameters from the plan, and a line in the treatment record documents this machine override with a time stamp and the user who performed it. For patient treatments, the physician does not scrutinize this RT_TREAT file and only uses it as a means of determining whether treatment has or has not been given. Therefore, for VROC purposes, the RT_TREAT file is primarily used so that the dose accumulation within the VROC will be accurate.
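A minimal sketch of the kind of parameter change described above is shown below; the DICOM sequence path used to reach the table position, the template file name, and the omitted write-back step are assumptions made for illustration and are not the actual VROC code.

```matlab
% Read a template RT treatment record and perturb the couch position to
% simulate a lateral setup error (the field path below is an assumed layout
% of how dicominfo exposes the record's sequences; DICOM distances are in mm).
info = dicominfo('RT_treat_template.dcm');     % hypothetical template record
lateralError_mm = 15;

cp = info.BeamSequence.Item_1.ControlPointSequence.Item_1;
cp.TableTopLateralPosition = cp.TableTopLateralPosition + lateralError_mm;
info.BeamSequence.Item_1.ControlPointSequence.Item_1 = cp;

% Each simulated fraction also needs its own identity and time stamp.
info.SOPInstanceUID = dicomuid;                % fresh unique identifier
info.TreatmentDate  = datestr(now, 'yyyymmdd');
info.TreatmentTime  = datestr(now, 'HHMMSS');
% (Writing the modified record back to a DICOM file is omitted in this sketch.)
```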


Portal images: Portal images were described in the workflow discussion of Chapter 3. These images are of poorer quality than traditional kV x-rays. Also, because they are taken to represent the actual beam shape for treatment, they often cover a very small cross section of the patient and show limited anatomy. A template MV portal image is created as part of creating the virtual patient, and this template MV image is then modified to create the daily verification portal image for the VROC. The specific modifications to the image include changing the DICOM header as well as changing the image itself. The DICOM header information has to be changed to reflect the correct date and time stamp to represent a daily treatment, and the DICOM UID in the portal image must match the approved treatment plan or the file will not load into the system. Other information in the header includes the gantry angle and the magnification of the image; these are not changed, but they are needed in order to modify the image to represent a setup error. The image can be changed to simulate actual patient setup errors. This is done by shifting the entire image by the amount of the simulated setup error prior to uploading the image into the R&V system. This process is illustrated in Figure 4-3. The first panel shows an example x-ray; the center of the image represents the treatment location. Before shifting the image, a band of black pixels is added around the image and the center coordinate is shifted by the amount of the setup error. The center pixel is moved, and then an image with the same number of pixels as the original, but centered on the new location, is written to file. This same process applies to all image types in order to simulate a setup error.
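A minimal MATLAB sketch of the pad-and-shift operation in Figure 4-3 is shown below; the template file name, the pixel shift, and the sign convention are illustrative, and the conversion of the setup error from millimeters to pixels (using the magnification recorded in the header) is not shown.

```matlab
% Shift a template verification image by a simulated setup error (Figure 4-3).
img = dicomread('portal_template.dcm');        % hypothetical template image
shiftPix = [12, -8];                           % assumed [row, col] shift in pixels

pad = max(abs(shiftPix));                      % B: band of black pixels
padded = padarray(img, [pad pad], 0, 'both');

% C/D: move the centre by the error, then read out the original image size.
rows = (1:size(img,1)) + pad + shiftPix(1);
cols = (1:size(img,2)) + pad + shiftPix(2);
shifted = padded(rows, cols);
```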


Another complication for the portal images is that they are taken at a specific gantry angle that may not correspond to the patient Cartesian coordinates used to describe patient setup errors. A matrix transformation was used to convert the patient coordinates to the image coordinates. This has been reported in the literature for comparisons of films taken at different gantry and table angles (79). The transformation used to translate the patient coordinate system to the film coordinate system is given in Equation 4-2, where (u,v,w) are image coordinates, (x,y,z) are patient coordinates, θ represents the gantry angle, and φ represents the table angle. Specific tests were done to verify the accuracy of the error simulation; these are reported in Chapter 5.

Daily kV image verification: Another method of treatment verification requires films to be taken daily using a kilovoltage (kV) x-ray source. In this case it is customary to take these kV images as a right lateral (RT) and an anterior (AP) projection (assuming a supine, head-first position). Template kV AP and RT lateral images of the ideal patient position are created when the virtual patient is created. Unlike the single-image application of the portal images, the orthogonal image pair allows 3D patient registration within the linac software. This means that at the time of patient treatment, when the images are acquired, the machine settings are captured. The 3D registration process compares these newly acquired x-ray images to the reference images that were stored in the treatment plan. When the new images indicate that the patient needs to be moved, the treatment couch can be shifted remotely, and the new couch positions, along
with the magnitude of the shifts, are recorded in a registration file. The registration file also records the specific DICOM Unique Identifier (UID) of both the new x-ray image and the reference image in the plan. To simulate a setup deviation in the kV images, the template images are loaded into MATLAB. Code is used to pad the images with black pixels and then to shift the images based on the patient shifts; code is also used to create a registration file. Depending on the specific situation, the registration file can indicate a correction to the setup error that was introduced in the verification images. Before all of the files are saved, the newly simulated kV images and the registration file have to reference the UID of the correct treatment plan and of each of the reference images; this is all handled within the subroutines. The final product consists of two kV images and a registration file, and all three must be loaded into the R&V system to represent the image registration process. The testing of this code and the accuracy of the different setup error simulations is the subject of Chapter 5.

Treatment verification CT images: The final type of image verification used clinically is a 3D dataset of axial computed tomography slices of the patient. These images are acquired using a cone beam CT (CBCT) technique that uses a single x-ray tube and a flat panel imager. The 3D dataset is reconstructed during acquisition and can be used to align the patient based on internal anatomy. The CBCT can also be used to detect internal anatomical changes and to determine whether a plan needs to be modified. The CBCT dataset consists of a set of 64 CT slices encompassing approximately 15 cm in length, centered on the center of the treatment machine rotation (isocenter). The goal is to position the target area of the patient at the isocenter. Due to
the imaging equipment, there are couch limitations that do not allow the patient to be set far to the left or the right, in order to make sure the equipment will clear as it rotates around the patient to generate the CBCT. When treating areas far to one side or the other, the images are taken with the patient centered at the machine, and after the images are taken the patient is shifted to the treatment location. The CBCT dataset is a 3D dataset, so a shift in the patient coordinates corresponds to the same direction in the CBCT coordinate system. A shift in the superior-inferior direction means that the entire CBCT dataset will load into the review software at a different location relative to the treatment planning dataset; with the current VROC system, this can only be simulated by changing the Z coordinate within the registration file. Shifts in either the lateral or the AP direction can be simulated by physically shifting each of the images in the same way as described in Figure 4-3. Similar to the kV image alignment, a registration file is created at the time the CBCT image is acquired at the actual treatment machine. The registration file contains each of the DICOM UIDs for both the reference CT scan and the CBCT scan in order to match the two image sets together. Because of the complexity of this file, the workflow to generate the CBCT images requires the director to load the trainee-approved treatment plan onto the actual patient treatment machine to create the registration file. This is only done once for each treatment plan; based on trial and error, this was the easiest way to create the record with the correct image UIDs.
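The UID bookkeeping required here, and carried out in the last steps listed below, can be sketched as follows; the folder name, the use of dicomwrite in copy mode, and the idea of keeping an old-to-new UID map for updating the registration file are assumptions made for this illustration rather than the actual VROC code.

```matlab
% Give each shifted CBCT slice a new SOP Instance UID and remember the mapping
% so the registration file can be pointed at the new images.
cbctDir = 'C:\VirtualPatient\CBCT';                 % hypothetical folder
files   = dir(fullfile(cbctDir, '*.dcm'));
uidMap  = containers.Map('KeyType', 'char', 'ValueType', 'char');

for k = 1:numel(files)
    f      = fullfile(cbctDir, files(k).name);
    info   = dicominfo(f);
    newUID = dicomuid;                              % fresh SOP Instance UID
    uidMap(info.SOPInstanceUID) = newUID;
    info.SOPInstanceUID = newUID;
    img = dicomread(f);
    dicomwrite(img, f, info, 'CreateMode', 'copy'); % rewrite slice with new header
end
% The registration file's referenced-image UIDs are then replaced using uidMap.
```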


The specific steps required to create the simulated CBCT images are listed below (and are detailed in the Director User Manual):

1. The simulation director exports the approved RT plan and CT images.
2. A CBCT is delivered using these files in order to generate a template registration file from the 4D console.
3. The new registration file is transferred into MATLAB.
4. The actual patient CBCT is transferred into MATLAB.
5. A MATLAB program shifts the images and changes the header information on the real patient CBCT to match the virtual patient's approved plan.
6. The new CBCT images are saved.
7. The AET of the CBCT images is changed to match a Varian machine.
8. The DICOM tag information for each CBCT image in the registration file is changed to match the new UID of each DICOM image.
9. The AET of the registration file is changed to the Varian machine.

As with the kV and portal images, the verification of the code for the CBCT images and the accuracy of the setup error simulation is the subject of Chapter 5.

Final Plan Report (Figure 4-1 G)

The final plan report is generated based on the total treatment simulation and depends on the number of treatments given that contain errors compared to the number of correct treatments. The simulation director can create a composite treatment plan based on the different treatments that were given to the virtual patient. For the initial VROC development, the simulation is stopped early when errors occur and go undetected. The trainee has two opportunities to correct the error, but if they miss the error after these two attempts, the simulation is stopped and the system assumes that the remaining treatments include the error. The reason the simulation is
stopped early is to provide feedback and correction close to the time the error occurs. Because some treatments can run for up to six weeks, there is no benefit in allowing the error to continue, which could have the effect of reinforcing the error to the trainee. If the trainee detects the error, the error is corrected and the simulation continues. At the end of the simulation, or when the simulation is stopped, the same metrics described for the plan comparison are calculated. Ideally, the metrics will help the trainee understand the overall consequences of the error. A complete analysis of the different feedback metrics and how they change with errors is the subject of Chapter 6.

Final Error Exercise and Debrief (Figure 4-1 H)

It is customary within simulation training to provide an opportunity to answer questions and to give feedback to the trainee at the end of the simulation. This is an opportunity for the instructor and the director to determine whether the learning objectives were met. For the VROC, an error mitigation tool was developed to facilitate questions for the debrief session between the trainee and the instructor. The error mitigation exercise was created to follow specific methods for conducting a root cause analysis (RCA). The exercise itself is a form that forces the trainee to stop and think about the error that occurred and why it occurred. The specific details about the theory of RCA and how this tool was developed are the subject of Chapter 7, and a copy of the RCA Form is stored at the UF Digital Archive (Object 4-6). At the end of each VROC experience, both the instructor and the trainee are asked to fill in a survey about the specific simulation to help provide opportunities for improvements in future
development and to modify scenarios that are confusing or that do not achieve their desired objectives.

Object 4-6. RCA Analysis Form as PDF document (.pdf 89 KB)

Documentation

Because there are three different users in the VROC system (the trainee, the instructor, and the director), a set of three different user manuals with the steps involved in conducting the simulation was developed as part of the development phase. These include Object 4-1, the Director User Manual; Object 4-3, the Instructor User Manual; and Object 4-4, the Trainee User Manual.

In the current configuration of the VROC system, the simulation director has many responsibilities. The director's primary role is to create virtual patient cases based on a set of criteria from the instructor. In addition to the real-world patient files, some additional data need to be created, including multiple treatment plans and templates for the images; a checklist to assist in this process can be copied from the User Manual. The other role of the simulation director is to calculate metrics at the appropriate times and to run the code that creates the daily treatment files and daily imaging files. In future development, many of these items may be automated.

The primary roles of the instructor are to identify clinical cases for the simulator and to debrief the trainee at the end of the simulation. The patient selection requires the instructor to identify the disease sites and errors they would like to simulate as well as the learning objectives for each scenario. A single test patient can be used for many different scenarios, as desired by the instructor. The other primary role of the instructor is to facilitate a debrief at the end of the simulation. It is at the instructor's discretion how they wish to handle the debrief, but the RCA tool gives
at least some indication of how well the trainee understands the different aspects of quality assurance within the radiotherapy process. The User Manual for the instructor includes a sample VROC Virtual Patient Scenario Request and a sample RCA exercise, as well as a list of errors that have occurred in radiation therapy and a description of the different reported metrics.

The trainee user manual is meant to serve as a guide for navigating the software used in the VROC. Because most of the software that the trainee interacts with is FDA cleared and widely used in radiotherapy, the User Manual is meant only to guide the user in the aspects of the system that may be unique to the VROC. Some of the basic features of the software are identified in case the user is unfamiliar with the applications. Instructions are included in the manual so that the trainee can accomplish the tasks related to the VROC, such as contouring the target volume and selecting a treatment plan. The User Manual contains a detailed explanation of the different metrics reported at the time of contour feedback and of the overall metrics, including a list of references for additional reading. Another chapter includes background information on different techniques used to complete an RCA and the importance of error mitigation.

Summary

In order to develop the overall system for a simulation in radiation oncology, a detailed list of each task was created. This task list was used to begin development, and Appendix A includes the detailed list that was used as a developmental checklist. The items that were created or developed for the VROC are summarized in Table 4-4. Some items are documents, some are forms, some are computer code, and some are processes. Collectively, all are needed in order to create the simulation.

Figure 4-1 was used as part of the development process to help describe the required elements of the VROC and how those elements relate to the overall RO workflow. Once developed, another way to describe the developed items is through a schematic of the VROC system. Figure 4-2 illustrates the physical layout of the computers and the applications installed on each; for ease, these will be referred to as the Eclipse workstation and the Aria workstation. The virtual patient is the compilation of all of the data that collectively represents the patient and the patient treatments. This data is stored on the server and can be accessed from either of the VROC workstations. The Aria workstation is used to review patient information and treatment verification images. The Eclipse system is used to create virtual patients and is used by the trainee for contouring and plan review. For convenience, and due to computing capabilities, the MATLAB code was installed on the Eclipse computer; the developed programs live within the MATLAB application and run directly from the Eclipse workstation desktop. The feedback metrics are calculated from the MATLAB code and can be emailed to the trainee or instructor directly from the program or through the use of other email programs. All other products developed for the VROC are web based and can be accessed from any computer. The feedback reports for the preliminary VROC are sent as email to the instructor and the trainee. The VROC Scenario Request Form, RCA Form, and Prescription Form are all Internet-accessible web forms available online.
All documents that are needed for use on the VROC system are installed on both workstations as well as on the UF digital storage system.
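As an illustration of how the feedback tables mentioned above could be emailed directly from MATLAB, a minimal sketch is shown below; the SMTP server, addresses, and attachment name are placeholders and assume that outbound mail is permitted from the workstation.

```matlab
% One-time mail preferences (hypothetical server and sender address).
setpref('Internet', 'SMTP_Server', 'smtp.example.org');
setpref('Internet', 'E_mail', 'vroc.director@example.org');

% Send the contour comparison table to the trainee and the instructor.
recipients = {'trainee@example.org', 'instructor@example.org'};
sendmail(recipients, 'VROC contour comparison results', ...
    'The contour comparison metrics for your virtual patient are attached.', ...
    {'contour_metrics.csv'});
```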


Figure 4-1. VROC development overview. Steps 1-5 are part of the radiation oncology workflow. Specific aspects of the VROC that were added to create the simulation are letters A-H.


Figure 4-2. VROC components.


Figure 4-3. Illustration of how setup errors are simulated. A) Initial image. B) Add a band of black pixels around the image. C) Shift the center by the amount of the simulated error. D) Read out the same image size centered on the new image center (orange).
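A minimal MATLAB sketch of the shift-and-crop approach in Figure 4-3 is given below. The image matrix, the pixel offsets, and the use of padarray from the Image Processing Toolbox are assumptions for illustration; this is not the actual Create_kV.m or Create_MV.m code.

% Minimal sketch of the Figure 4-3 approach (illustrative, not the VROC code).
% img is a 2-D image matrix (e.g., from dicomread); dRow and dCol are the
% simulated setup error expressed in pixels along the image rows and columns.
function shifted = simulateSetupShift(img, dRow, dCol)
    [rows, cols] = size(img);
    pad = max(abs([dRow dCol])) + 1;               % B: band of black (zero) pixels
    padded = padarray(img, [pad pad], 0);          % requires Image Processing Toolbox
    r0 = pad + 1 + dRow;                           % C: move the read-out center
    c0 = pad + 1 + dCol;
    shifted = padded(r0:r0+rows-1, c0:c0+cols-1);  % D: same image size, new center
end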


Table 4-1. Varian software modules and their use within the VROC
Module | Workflow step(s) | Description | VROC activity
Capture | All | Billing and report writing | Not needed
Chart QA | 5 | Weekly physicist check | Not needed
Document Approval | 1, 2 & 5 | Approve consult, pathology, and lab reports (also duplicated in Patient Manager) | Not needed
Archive | 1 | To manage large databases; duplicated in Administration | Not needed
Patient Manager | 1, 2 & 5 | Demographics, documents, radiation treatment summary | Documents for review; patient under treatment review
Report Manager | 5 | Reports of data within the database can be generated | Statistics about the performance of the resident
Time Planner | All | Scheduling patient appointments and treatments; can be viewed by location or staff | Manage resident schedule and work list
RT Chart | 3-5 | Treatment goals, prescriptions, track delivered treatments | Evaluate treatment
Image Browser | 1 & 2 | Review all images (diagnostic and treatment planning images can also be done in the TPS) | Not needed
Offline Review | 5 | For reviewing portal images and treatment images | Reviewing treatment images
Portal Dosimetry | 4 & 5 | For physics QA of IMRT patients | Not needed
Eclipse | 2 & 3 | Contouring, treatment plans, prescriptions, calculations | Physician contouring & plan comparisons


Table 4-2. Files needed to create a virtual RO patient
Type of files | Data required
Real patient documents | Pathology; consultations (from other doctors); procedure notes; diagnostic radiology reports
Real patient DICOM | Treatment planning CT; other images (PET and/or MRI); RT_PLAN from clinical case; RT_STRUCTURE from clinical case; copy of CBCT images from clinical case
Patient archives from Eclipse | Patient archive with only normal tissue contours, no plans; patient archive with multiple treatment plans and expert contours; XPERT patient archive including expert RT_PLAN and expert RT_STRUCTURES
Template files (created in TPS) | DRRs representing portal images for all treatment beams; DRRs representing AP and lateral kV images; RT_TREATMENT from manually recorded treatments; DVH files for expert plan


Table 4-3. Example results of a contour comparison report (lung training case)
Name of structure | GTV | CTV | PTV
Dice similarity | 0.05 | 0.14 | 0.19
Expert volume (cc) | 3.48 | 37.58 | 90.42
Trainee volume (cc) | 2.93 | 5.65 | 12.31
% of target missed | 95.76 | 92.13 | 89.28
Normal tissue included (cc) | 2.76 | 2.69 | 2.62
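For reference, the quantities in Table 4-3 can be computed from binary structure masks as in the short MATLAB sketch below. The variables expertMask, traineeMask, and voxelVolCc (masks resampled to a common voxel grid, and the voxel volume in cc) are assumed inputs; this is an illustration of the calculations rather than the Contour_comparison.m implementation itself.

% Illustrative calculation of the Table 4-3 quantities from binary masks.
overlapVox = nnz(expertMask & traineeMask);
dice       = 2*overlapVox / (nnz(expertMask) + nnz(traineeMask));    % Dice similarity
pctMissed  = 100 * nnz(expertMask & ~traineeMask) / nnz(expertMask); % % of target missed
extraCc    = nnz(traineeMask & ~expertMask) * voxelVolCc;            % normal tissue included (cc)
expertCc   = nnz(expertMask)  * voxelVolCc;                          % expert volume (cc)
traineeCc  = nnz(traineeMask) * voxelVolCc;                          % trainee volume (cc)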


Table 4-4. Summary of developed VROC tools and applications
Tool | Software | User | Description
Trainee user manual | Document | Trainee | Manual for trainee
Instructor user manual | Document | Instructor | Manual for instructor
Director user manual | Document | Director | Manual for director
Scenario request | Form | Instructor | Request for a specific scenario
Virtual patient checklist | Document | Director | Checklist to create virtual patient
Prescription form | Form | Trainee |
Contour_comparison.m | MATLAB | Director | Compares two RT_STRUCTURE files
Metrics.m | MATLAB | Director | Generates EUD, NTCP, CI from DVH files
Create_MV.m | MATLAB | Director | Generates simulated portal image
Create_kV.m | MATLAB | Director | Generates simulated kV/kV images and registration file
Create_CBCT.m | MATLAB | Director | Shifts registration & changes DICOM header of an existing CBCT image set to be loaded into a virtual patient
Create_RT.m | MATLAB | Director | Creates RT_TREATMENT record representing verified beam delivery
Final report | Document | Director | A report explaining errors that occurred and summarizing all reported metrics
RCA reporting form | Form | Trainee | A step-by-step form to work through a preliminary RCA on the reported case
Final VROC eval | Form | Trainee | Survey of experience using VROC
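As an illustration of the kind of calculation Metrics.m performs, the sketch below evaluates a generalized EUD and a logistic NTCP from an exported differential DVH. The Niemierko-style formulas, variable names, and parameters shown here are assumptions for illustration; the model parameters actually used in the VROC are those given in Appendix C.

% Illustrative gEUD and NTCP from a differential DVH (not the actual Metrics.m).
% doseGy and volFrac are bin doses and bin volumes from an exported DVH;
% a, TD50, gamma50 are structure-specific model parameters (see Appendix C).
function [eud, ntcp] = dvhMetrics(doseGy, volFrac, a, TD50, gamma50)
    volFrac = volFrac / sum(volFrac);            % normalize to fractional volume
    eud  = sum(volFrac .* doseGy.^a) ^ (1/a);    % generalized EUD
    % (for targets a is negative, so zero-dose bins should be excluded first)
    ntcp = 1 / (1 + (TD50/eud)^(4*gamma50));     % logistic (Niemierko-style) NTCP
end

% Assuming statistically independent endpoints, individual structure NTCPs can be
% combined into a single total value:
% ntcpTot = 1 - prod(1 - ntcpPerStructure);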


CHAPTER 5
ERROR SIMULATION TESTING

Once the simulation tool was built it was necessary to test the different developed components of the system. The VROC system was built specifically for training about errors that occur within the radiation oncology process. Errors can occur at many points in the process that affect treatment delivery. Within the clinical setting, there are several processes in place to catch errors that occur. Different team members are responsible for carrying out different aspects of these quality assurance measures. The physician is ultimately responsible for ensuring that the patient treatment is carried out correctly and that the quality assurance measures are being performed. The specific opportunities the physician has to verify and check that the plan is being carried out correctly are at the time of prescription entry, initial target delineation, planning approval, and daily treatment image review. The VROC system will incorporate errors and allow appropriate opportunity for the physician to catch these errors. Contour errors are handled at the time of contour feedback and in the current VROC are not propagated through to treatment. Errors that arise from prescription errors or from treatment plan errors will be handled within the treatment planning system within the VROC. Errors that occur at the time of treatment will be shown to the trainee by means of the simulated treatment images. Validation of the error simulation included testing of the algorithms used to produce the verification images.

Background

In the VROC system, a virtual patient consists of a set of DICOM image records that can be loaded into the Varian treatment planning and R&V systems as described in


Chapters 3 and 4. Errors can occur at a number of places within the overall radiation oncology workflow. A summary of errors that have occurred in radiation oncology was included in Chapter 3. The goal is to use the VROC specifically for error training, and therefore errors will occur at a much higher frequency than they would in a real clinic, both for training purposes and to stress the importance of a good quality assurance program. In general, radiation errors can be classified as either dosimetric or spatial. Dosimetric errors are those in which the overall error results primarily in a change in the dose delivered. Spatial errors are those that are primarily a result of the dose being delivered to the wrong location. It has already been pointed out in the literature review that dosimetric errors occur most frequently in the preparation and planning steps of the radiation process and spatial errors occur most often at the time of treatment. Dosimetric errors can occur as a result of errors in the prescription or due to incorrect use of the treatment planning system. Within the real radiation clinic, these types of errors are prevented or detected by the peer review process as well as by a detailed chart review process. Peer review involves a peer physician evaluating the contours, the prescribed doses, and the overall goals for treatment to see that they meet general standards of care. The report on safety in radiation oncology (4) gives specific details on how to conduct a peer review. There are also standards on what target structures should be drawn, and general guidelines for prescriptions can be found through the National Comprehensive Cancer Network (NCCN) (www.nccn.org) (80). The chart check process is completed by a physicist, who reviews the chart to make sure the correct machine settings and dose algorithm within the TPS have been used, and that the planned dose matches the prescribed dose and has transferred into


the R&V correctly. The physician is also required to sign off on the treatment plan prior to initiation of treatment. It is the responsibility of the doctor to verify that the plan matches the prescription and the overall intent of treatment. The physician should also be aware of the obvious details of the plan, specifically how long each beam will remain on and the general direction of each beam. They should also be able to evaluate the plan relative to normal tissue doses to make sure the plan is safe. This would include being able to check details within the R&V system. Errors that can occur at the time of treatment relate to delivering the prescribed plan to the correct location. The highest frequency of delivered treatment errors is patient setup errors. This specific type of error represents roughly 20% of all errors included in the ROSIS data. A number of other setup errors go unreported because, even though an error occurs, if a portion of the target volume is still treated and there is no risk of side effects, it may go unreported. Quality assurance for treatment delivery includes the therapists checking one another and verifying that the patient is set up as described in the prescription and treatment plan. The second method of verifying the patient setup is through the use of imaging. Depending on the imaging protocol, images will be taken either daily or weekly. The images that are taken are compared to reference images that relate back to the correct treatment setup. Radiation therapists carefully check the patient setup by inspecting the daily images. This is the first level of quality assurance for patient treatments. Also, it is expected that any images that are taken are approved by the physician on the day on which they were taken. In the event that a verification image or a treatment file indicates that an error has occurred, it is the responsibility of the physician to


flag the error and to make recommendations for corrective measures. In an actual clinical situation, if the physician fails to catch an error, or they fail to review the film daily, there is a strong likelihood that the error could occur again.

Methods

Several different error types were simulated using the VROC system to test the overall VROC process and to test that errors realistically matched actual errors that could occur within a clinical setting. The primary focus was to validate the developed software that was created for the VROC system. The components that utilized commercial software were tested to make sure that a particular error could be created, but validation testing was not necessary. The workflow from Chapter 3 was re-created with a list of all of the opportunities for different feedback in the simulation process. Figure 3-1 was used for describing the error simulations.

Errors in Pre-Planning and Treatment Planning

Within the VROC, contouring errors can occur, but for training purposes these errors are not carried through to treatment. Instead, feedback is given to the trainee about the overall agreement of their contour with the expert contour. The instructor can specify how they wish to proceed if the contour does not meet a specified similarity coefficient for agreement, such as requiring the trainee to repeat the contour process. Because the contouring tools themselves were commercially available products, there was no need to validate the tools for performing contouring. The validation of the metrics that were calculated relative to the contours is included in the metrics testing in Chapter 6. Because the VROC system specifically uses commercial software for the developed treatment plans, any type of prescription dose error that can occur in a real


clinic can be duplicated within the VROC. In order to simulate a prescription error, the original treatment plan is copied and the prescription dose in the plan is changed. The plan immediately updates with the new information. Several plans with different dose prescriptions were created as part of the testing of the metrics calculations in Chapter 6. The added prescription form allows the VROC to simulate a prescription review by comparing the trainee prescription to the expert prescription. No calculations are done for this review; it is a simple side-by-side comparison for each structure. Because the VROC system uses a clinical treatment planning station, any clinical treatment planning error can be simulated. It is not typically the physician's responsibility to check all of the technical details of plan development, but the physician should check those items that relate to the overall dose distribution by evaluating each CT slice as well as the composite and fractional doses on the DVH. For example, the use of the wrong energy should be evident in the treatment plan because the organs at risk may receive too much dose. To simulate this error, the beam energy is simply changed in the treatment plan. Another error could be that the planner failed to account for a specific organ at risk (OAR); a new plan can be generated that overdoses that particular structure. Several different planning errors were simulated as part of the metrics calculations for Chapter 6. Specific validation of the accuracy of the dose algorithms within the Varian Eclipse system is part of the initial acceptance test process that was completed with Varian when the system was delivered.

Errors in Patient Treatments

Errors in treatment that relate to a dose error can be simulated within the VROC by changing the radiation treatment record. This file is created using code as part of the virtual treatment machine. Within the RO workflow, the only aspects of the daily


treatment record that the physician typically evaluates are the number of daily treatments and the accumulated dose. A measure of the time the machine is on that is related to the dose given is called a monitor unit (MU). The number of treatment records uploaded determines the number of treatments; a separate treatment record is created for each beam for each treatment. To simulate an extra treatment, a duplicate set of the files can be created and uploaded into the system, and it will appear as though all fields were treated twice. To change the dose from any field, the MU value can be changed, and the dose that is recorded for each beam will change by the same ratio. Validation was specific to the daily dose tracking. Sets of delivered fields that included under dose from too few MU given, as well as from treatment beams not being treated, were tested. Overdose situations where extra fields were treated or the MU was too high were also tested. The set of different dose errors was repeated on three separate patients to make sure that the code to generate the radiation treatment record was correct and that the files could be imported into the RT Chart software and represent the different dose errors. One of the main quality assurance tools in the daily treatment of patients is the use of daily or weekly imaging. Within the VROC, errors that occur at the daily treatment are simulated within the verification images by using code that was developed as part of the VROC development. To test the accuracy of the VROC treatment error simulation, a set of tests was developed to test each of the three different image verification methods of the VROC system. Ground truth for the images and for the error was created using a radiographic phantom. The VROC process was used to


create a virtual patient by using the CT images of the radiographic phantom as though they represented an actual patient. Prior to scanning the phantom in the CT scanner, a set of metal fiducials was applied on both the anterior and lateral surfaces of the phantom. These were used to remove ambiguity when comparing the image location in the VROC software. For each of the three different verification imaging modalities, the phantom was taken to the actual treatment machine (Varian 23ix with OBI) to acquire treatment verification images of the phantom in the correct treatment position as well as in a number of other positions representing setup errors. Figure 5-1 is a photograph of the radiographic phantom on the treatment machine couch. The imaging panels and the kV x-ray source are deployed. The silver reflective markers on the phantom were used as some of the fiducials for the image registration. The same phantom CT data were anonymized and used to create virtual patients within the VROC system. Virtual treatment images were created for the virtual patients using the virtual treatment machine code described in Chapter 4. For validation testing, the images and records created by the treatment machine were compared to those simulated in the VROC. All images, both real and simulated, were imported into the Offline Review (Varian Medical Systems) software that is incorporated into the VROC. The coordinates of the fiducials in each of the images or image pairs were recorded. The differences in these


coordinates were compared for all modalities to verify that the simulation code for the images accurately simulates setup errors. The Offline Review software in the VROC handles each of the three imaging modalities differently under normal clinical circumstances. For these reasons, the specifics for each of the three imaging validation tests are handled separately.

Daily orthogonal kV images

The treatment plan for the phantom was transferred to the treatment machine. The phantom was placed on the treatment couch and aligned to the setup marks that were used for the CT scan, which should represent the ideal treatment position. An initial set of orthogonal images was acquired to verify that the phantom was positioned correctly. The treatment machine software was used to compare the x-ray images to the reference images. If the software indicated that the patient should be shifted, this shift of the couch was performed. Four different registration scenarios were identified, and all four possible alignments were checked for three different setup errors in the lateral, longitudinal, and vertical directions (x, y, z) of (3, 2, 1) cm, (2, 1, 3) cm, and (5.4, 4.4, 3.0) cm. The four different registration scenarios are described below; the sketch after this list illustrates how each scenario reduces to arithmetic on the setup offsets.
1. Initially align patient correctly; register images correctly; do not shift patient; Final = patient aligned correctly.
2. Initially align patient correctly; register images incorrectly; shift patient; Final = patient aligned incorrectly.
3. Initially align patient incorrectly; register images correctly; shift patient; Final = patient aligned correctly.
4. Initially align patient incorrectly; register images incorrectly; do not shift patient; Final = patient aligned incorrectly.
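The residual setup error left by each scenario reduces to simple arithmetic on the offsets, as in the hedged MATLAB sketch below. The (3, 2, 1) cm offset is taken from the errors listed above, and the scenario encoding is illustrative only, not part of the VROC code.

% Illustrative only: residual error = initial offset minus the couch shift applied.
initErr  = [3 2 1];                    % true initial offset (cm), one of the tested errors
scenario = { [0 0 0], [0 0 0];         % 1: aligned, registered correctly, no shift
             [0 0 0], initErr;         % 2: aligned, registered incorrectly, shifted anyway
             initErr, initErr;         % 3: misaligned, correct registration shifts it back
             initErr, [0 0 0] };       % 4: misaligned, incorrect registration, no shift
for k = 1:size(scenario, 1)
    residual = scenario{k,1} - scenario{k,2};
    fprintf('Scenario %d final setup error: [%g %g %g] cm\n', k, residual);
end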


For each set of setup errors and each registration situation, the orthogonal images and the registration files were captured at the treatment console. In cases where the registration required the treatment couch to be moved, that translation was physically performed just as it would be in an actual patient scenario. In total there were 12 separate sets of image pairs taken of the radiographic phantom. Once all films were acquired they were imported into Offline Review. The coordinates of the fiducials and anatomical landmarks within the image were recorded for each of the images, for both the initial phantom position and the registered phantom position according to the four registration scenarios above. The known offsets that were introduced were recorded to verify that the coordinates of the fiducials changed by the correct amount. A virtual patient was created by copying the CT scan of the phantom. The VROC and MATLAB code, along with the template images, were used to virtually recreate each of the different setup errors and alignment scenarios. A set of 12 separate image pairs was created for the virtual patient. The simulated kV alignment films were loaded into the corresponding virtual patient in Offline Review. The coordinates corresponding to the fiducials and internal landmarks were recorded for each of the simulated films. The differences between the coordinates of the fiducials in the real vs. the simulated films were calculated to verify the accuracy of the simulated patient location within the images. The averages and standard deviations of these points were calculated. Also, as part of the registration, both the initial and final registered isocenter


coordinates were recorded to verify the shift that is being reported and illustrated within Offline Review.

Portal images

The same radiographic phantom was used to create an actual and a virtual patient for testing the portal images. A plan was generated representing beams with a variety of different gantry angles. The phantom was again positioned on the treatment couch and aligned to the initial setup marks on the outside of the phantom. Portal images were taken at each of five different gantry angles. The phantom was shifted at the treatment machine to represent a patient setup error and all five portal images were repeated. This was repeated a third time and all portal images repeated again. Five gantry positions for each of the three different setup variations were imaged, for a total of 15 films acquired on the phantom, which were loaded into Offline Review. The coordinates of the fiducials in each of the images were recorded. The initial CT images were anonymized to create another virtual patient in the VROC system. The same treatment plan was created on the virtual patient as for the phantom. Also, as was described regarding the virtual patient, the template DRR images were created to represent the different portal images. Using the VROC virtual treatment machine system, the set of 15 different portal films was created to simulate the portal images that were taken with the phantom. These 15 different images were loaded into Offline Review and the coordinates of the fiducials were recorded. The differences between the real and the simulated fiducial locations in the images were calculated to determine averages and standard deviations between simulated portal images and real portal images.


CBCT simulation

A treatment planning dataset was created to make a new patient. The patient file was exported to the treatment machine and the following five different setups and registrations were created.
1. Phantom aligned to correct location, imaged at correct location, registered to correct location (no shift and final alignment is correct).
2. Phantom aligned to correct location, imaged at correct location, shifted to an incorrect location (treatment location is incorrect).
3. Phantom aligned to wrong location, filmed at wrong location, registered and aligned back to correct location (shift and final alignment is correct).
4. Repeat of #2 (images are correct, shifted off from isocenter in a different direction).
5. Repeat of #3 (images are taken at an incorrect location, shifted back to the correct treatment location).
The five different alignment scenarios were performed on the radiographic phantom at the treatment machine. All sets of the registration images and the registration files were imported into the Offline Review system for verification. The coordinates of the various fiducials were measured at both the acquired location and the registered location. A virtual patient was created to represent the same radiographic phantom that was used for the CBCT testing. This virtual patient was given a different name and a new plan was developed for this patient in the system. The initial CBCT dataset of the phantom was used as the template from which to derive all of the simulated scenarios. The same anatomical locations were identified and the coordinates recorded from the simulated CBCT datasets. The averages and standard deviations of the differences in the coordinates were used to compare the actual and the simulated registration process.
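For all three modalities the comparison reduces to differencing the recorded coordinates and summarizing the result, as in the minimal MATLAB sketch below. The variables realPx and simPx are assumed N-by-k matrices of the per-axis coordinates available for the given modality (two axes per planar film, three for CBCT), and the pixel size is an example value; this is not the analysis code used for Appendix B.

% Minimal sketch: compare fiducial coordinates between real and simulated images.
pixelMm = 0.26;                         % example pixel size (kV film value reported below)
d       = (simPx - realPx) * pixelMm;   % per-axis differences in mm
dist    = sqrt(sum(d.^2, 2));           % distance to agreement per point
perAxisMean = mean(d);                  % per-axis means
perAxisStd  = std(d);                   % per-axis standard deviations
fprintf('Overall: %.2f +/- %.2f mm\n', mean(dist), std(dist));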


Results

Treatment Dosimetric Errors

The daily treatment records were created for a test patient to include multiple fields treated and both under dose and overdose based on the number of MU given per each of the treatment beams. These files were imported into the VROC to verify that the dose recording based on the number and dose of the treatment fields matched the expected values. For the situations tested, the dose tracking matched the expected values. The simulation of a person overriding and typing in their credentials was not simulated in this initial development. This is a matter of identifying the correct field labels within the DICOM header and making the appropriate changes to them. Additional modifications could be made for future work depending on the level of sophistication desired within the VROC. For the purpose of dose tracking, only the total fractions and the total dose were of interest at this stage in the development.

Daily Orthogonal kV Images

In orthogonal kV imaging, one image represents a beam entering from the anterior of the patient, or an AP beam. The other represents a beam entering from the patient's right lateral. The right lateral beam indicates the patient offset in the superior/inferior direction (Y axis of the film) and the anterior/posterior direction (X axis of the film). The AP film indicates the left/right direction (X axis) and the superior/inferior direction (Y axis). Figure 5-2 is an example of one of the simulated test pairs of films. The images on the left side are the lateral images, and the images on the right side are the AP images. The main window for both images is a fusion of the registration image (ideal or expected patient location) and the


simulated image, and the images at the bottom are the registration image (left) and the simulated film (right). For each of the films, coordinates for five unique points were measured. Three of the points corresponded to the metal fiducials and two were anatomical points, representing a point on one of the vertebral bodies and a point in the left lung. Also, the coordinates of the isocenter at the time the film was acquired and the isocenter after the registration were recorded. An initial check of the VROC system was that all of the simulated films could be loaded into Offline Review and displayed correctly. Figure 5-2 is the simulated film setup that should mimic the real phantom setup in Figure 5-3. What can be noticed is that the image quality, or window and level, between images is not exactly the same, and the edges of the wire fiducials in the simulated images are blurry due to the reconstruction, which is based on the CT slice thickness. When a patient is not set up correctly at the treatment machine and the therapists shift the patient after initial imaging, this is indicated by different colored crosshairs in the image in the Offline Review software. An orange crosshair indicates the location the patient was in when the film was acquired and the green crosshair indicates where the patient was after image registration. This is illustrated in Figure 5-4 for the simulated setup error and in Figure 5-5 for the real setup error. Because the simulated films had to be manually shifted and cropped, and the registration file had to be modified to indicate a correction to the setup error, it was important to check the coordinates of the images as well as the coordinates of the two different isocenters.


To determine the overall accuracy of the simulated films compared to the real films, the differences in the fiducial and isocenter points were taken. A total of 24 image pairs were available for evaluation, 12 real datasets and 12 simulated datasets. For each film, a total of 60 different data points were compared, along with the 12 acquired isocenter points and 12 actual isocenter points. The coordinates of the points were read out in pixel values and converted to mm. The pixel size of all films was 0.26 mm. The overall difference between real and simulated fiducial coordinates was 3.65 ± 2.25 mm. The overall difference between real and simulated acquired isocenters was 1.94 ± 2.80 mm, and for the registered isocenter it was 0.89 ± 0.81 mm. These values were calculated for each of the orthogonal directions and are listed in Table 5-1. More details of the analysis of each individual fiducial and the different isocenter coordinates on the simulated and the real images are reported in Appendix B. Table B-1 provides the specific details and amount of each of the setup errors that were simulated. Table B-2 indicates the differences per each of the different fiducials in the analysis. Tables B-3 and B-4 list the actual isocenter coordinates for both the simulated and the real images.

Portal Images

Figures 5-6 and 5-7 illustrate the simulated (5-6) and actual (5-7) portal images for a phantom that is set up without errors for the beam with gantry angle 0 degrees. Figures 5-8 and 5-9 are the simulated and actual portal images for the beam with gantry angle of 45 degrees with the phantom offset in all three orthogonal directions. Coordinates of several points within both the simulated and the actual portal images were recorded using the Offline Review application. These coordinates are relative to the center of each film and therefore can be used to determine the offset between the


simulation and the actual images. For each image, between five and nine unique points, either defined by a metal fiducial on the phantom surface or by anatomical landmarks within the phantom, were used. A total of 15 different films with 5 different gantry angles were used to evaluate the code to simulate films. There was a total of 110 unique points that were compared between actual film and simulated film. The overall distance to agreement was calculated from the shift coordinates, and the average and standard deviation for all points was 3.14 ± 2.21 mm (included in Table 5-1). One particular simulated film for the gantry angle of 140 degrees showed the highest offset between the simulated and acquired images (due to difficulty in visualizing some of the fiducials in the real portal image). By eliminating these data points, the average and standard deviation between the sets of images was 2.94 ± 2.05 mm. Unlike the kV images, each film represents a different patient orientation, so there was no attempt to divide the data by patient orientation.

CBCT Verification Images

The CBCT data analysis within the Offline Review software was similar to the orthogonal image set in that the default review mode displays the CBCT dataset over top of the treatment planning CT, representing the final position of the patient at the end of the registration process. An example of the CBCT image registration process is illustrated in Figure 5-10. For each CT, a set of eight different points made up of fiducials and anatomical landmarks was used. For one dataset the shift caused two of the fiducials to not be included in the actual CT scan, so for that pair of images only six points were evaluated. The coordinates were based on the coordinate system of the treatment planning CT dataset, which had a pixel size of 0.85 mm and a slice thickness of


3.0 mm. The average and standard deviation of the difference in actual and simulated images was 0.46 ± 1.06 mm lateral, 0.25 ± 1.27 mm A/P, and 0.33 ± 2.07 mm S/I (included in Table 5-1). The averages and standard deviations of the isocenter between the known shift and the actual reported shift are included in Appendix B.

Conclusions

Both dosimetric and spatial errors can be simulated within the VROC system. Dosimetric errors typically occur at the time of treatment planning and will be available as treatment plan options for the trainee to review before the virtual patient starts treatment. For errors involving dosimetric variance at the time of treatment, the file representing the treatment can be modified to reflect different errors in dose or beam settings prior to importing into the VROC system. Spatial errors that occur at the treatment machine can be detected by the use of patient verification images. The virtual radiation machine is a collection of MATLAB code that was created to generate these verification files and to simulate the changes in these images if a spatial error occurs. Validation was done to show that errors in the simulated images were the same as ground truth values of actual setup errors on a radiographic phantom. The results show that for all three treatment imaging modalities, the code in the VROC virtual treatment machine generated images that could be successfully uploaded into the VROC to represent daily treatments. Careful attention to the DICOM header within the VROC code allows the patient files to load and reference the correct set of images within the Offline Review software. These tests also indicate that the shifts of the images to represent patient setup errors could be made in the correct direction and made to represent the overall magnitude of a setup error.
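A hedged MATLAB sketch of that kind of header editing is shown below. The file names are hypothetical, and for a multi-slice series the series-level and frame-of-reference UIDs would need to be handled consistently across all files rather than per slice as shown; this is not the actual Create_CBCT.m code.

% Illustrative DICOM re-tagging: point an existing image at a virtual patient
% so the R&V system accepts it as a new acquisition for that patient.
info = dicominfo('phantom_slice.dcm');          % hypothetical source file
img  = dicomread(info);
info.PatientName.FamilyName = 'VROC01';         % virtual patient identity
info.PatientID              = 'VROC01';
info.SOPInstanceUID         = dicomuid;         % new instance UID so it is not a duplicate
dicomwrite(img, 'VROC01_slice.dcm', info, 'CreateMode', 'Copy');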


Overall, the CT images showed better agreement in the location of the simulated error between simulated images and real images. In comparing the kV images and the portal images, the kV images have the advantage of being able to identify all three directions at one time. The agreement between the actual and the simulated films for all points was worse in the superior and inferior direction than in the other directions. The DRR that is used for the simulated film has a decreased resolution in this direction because the CT scan that was used to create the DRR was taken with a 3 mm slice thickness. When the DRR is created, the average pixel value between slices is used to create the image. It has been documented in different localization studies that the resolution in the superior/inferior direction is dependent upon the CT slice thickness. By using CT images with a smaller slice width, this should improve. Localization studies using CT slices as small as 1.5 mm or less are often needed when treating very small areas, such as brain treatments, in order to improve the resolution in this plane (81). Also of note is that the fiducial averages and standard deviations are greater than those for the isocenter or those for the CT (anatomical points). This is related to an issue of divergence in planar imaging. Divergence, or penumbra, occurs because the radiation beam fans out as it exits the x-ray machine, passes through the patient, and interacts on the imaging panel (82). The overall result is a magnification that increases with increasing distance from the center of the image. This magnification issue also increases with increasing distance from the imaging panel. Many of the fiducials that were used to compare the real films to the simulated films were on the surface of the phantom and were farther from the image panel than the actual treatment volume. As


the fiducials are moved away from the center of the beam, the relative distance between them will become magnified. The simulated film shifts do not account for divergence. It is important to note that the reference image that is used to calculate the patient setup error also does not account for this change in distortion. Therefore, when overlaying the images, the simulated images show a better agreement with the reference images than do the real images. The divergence issue is not as much of a problem for objects that are farther from the x-ray source or for objects close to the center of the beam. This is why the coordinates for the isocenter locations show good agreement between actual and simulated films. This overall agreement between the simulated and the actual films of about 3.0 mm for marks on the patient surface may not be adequate for high-precision treatment techniques, but is well within the normal margins for standard radiotherapy treatment. Studies have been done indicating that the prostate can move anywhere between 5 mm and up to 10 mm during the time the patient is on the treatment table (83-84). It is not uncommon to see changes in patient anatomy due to tumor response to radiation. In head and neck cancer, tumor changes of up to 10 mm have been documented (85). These two examples indicate that a deviation on the order of 10 mm may be within clinical margins for the target and would not be counted as an error. For this simulation testing, the errors that were chosen were based on reported setup errors where the alignment was to the wrong vertebral body or to patient external marks. In these cases the setup error was on the order of 35 mm.
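As a rough illustration of the magnification effect (assuming a nominal 100 cm source-to-axis distance and 150 cm source-to-detector distance, typical of Varian on-board imaging rather than values taken from these measurements), a surface point 15 cm from isocenter toward the source projects with magnification 150/85 ≈ 1.76 instead of the 1.50 that applies at isocenter. A 10 mm offset of that point therefore measures roughly 11.7 mm when rescaled to the isocenter plane, an apparent extra shift of almost 2 mm that is consistent with the few-millimeter disagreement seen for the surface fiducials.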


The purpose of these studies was to validate the virtual treatment machine software of the VROC. Treatment records can be created and uploaded to the VROC on a daily basis. These records can indicate a change in dose per fraction or total dose. The different types of images and their corresponding registration files can all be created and successfully uploaded into the VROC. If any of the tags within the DICOM header are not correct, the files will not load into the Offline Review software. Setup errors on the order of 35 mm were tested within all systems. The simulated films were within 4 mm overall agreement of the actual films based on coordinates of fiducial points, and within 2 mm overall for points in the center of the patient and near the isocenter.


Figure 5-1. Phantom setup for virtual treatment machine verification.


Figure 5-2. Simulated orthogonal kV images without patient setup error.
Figure 5-3. Actual orthogonal kV images without patient setup error.


Figure 5-4. Simulated orthogonal kV images with a setup error.
Figure 5-5. Actual orthogonal kV images with a setup error.


Figure 5-6. Simulated portal film with gantry at 0 degrees without patient setup error.
Figure 5-7. Actual portal film with gantry at 0 degrees without patient setup error.


Figure 5-8. Simulated portal film with gantry at 45 degrees with patient setup error.
Figure 5-9. Actual portal film with gantry at 45 degrees with patient setup error.


Figure 5-10. Example of a simulated CBCT image registration.


Table 5-1. Difference between actual and simulated images
Difference (actual vs. simulated) | Overall avg. (mm) | AP (mm) | LAT (mm) | S/I (mm)
Portal image fiducials | 3.14 ± 2.21 | — | — | —
kV fiducials | 3.65 ± 2.25 | 0.79 ± 1.67 | 1.25 ± 2.19 | 0.21 ± 3.20
kV alignment isocenter | 1.94 ± 2.80 | 0.16 ± 0.36 | 0.09 ± 0.75 | 0.12 ± 0.31
kV registered isocenter | 0.89 ± 0.81 | 0.05 ± 0.10 | 0.15 ± 0.56 | 0.16 ± 0.34
CBCT fiducials | 2.49 ± 2.03 | 0.25 ± 1.27 | 0.46 ± 1.06 | 0.33 ± 2.07
CBCT alignment isocenter | 0.78 ± 0.15 | 0.17 ± 2.15 | 0.85 ± 1.60 | 1.21 ± 2.71
CBCT registered isocenter | 0.78 ± 0.15 | 1.21 ± 1.16 | 1.37 ± 1.15 | 1.21 ± 1.66


CHAPTER 6
RADIATION METRICS AND RADIATION ERRORS

Another major development for the VROC system was opportunities to provide feedback to the trainee. A study was performed to simulate several different errors within the VROC and to calculate all of the proposed metrics. The goals were to test the algorithms for calculating metrics and to finalize recommendations for feedback for reporting errors within the VROC system. Several different metrics were discussed during the review of the radiation oncology workflow in Chapter 3. To determine which of the metrics should be used for feedback, they were compared to two different recommended severity scores for error reporting in order to make a final recommendation.

Methods

The ideal set of metrics for the VROC would be easily calculated for any type of error and would scale appropriately with the severity of the error. Also, the metrics must be usable over a variety of types of errors, target sites, and prescription doses. Within radiation therapy, errors that occur in treatments with curative intent will in general have a greater overall risk of creating an adverse side effect than the same error occurring on a patient treated to a low dose for pain palliation. For example, a setup error of 35 mm that occurs several times on a curative head and neck treatment with very narrow margins and a sharp dose gradient between high and low doses could potentially deliver a very high dose to a critical structure such as the cord or brainstem, causing a permanent radiation injury. The same error on a uniform dose distribution for pain management of a cord compression may decrease the overall pain control, but is unlikely to result in an injury.


There are two different types of reports that are generated in the VROC system for feedback. The first report occurs after the initial contouring is completed and is feedback specific to contouring errors. A study was done to test the code that was developed for the contour comparisons. The contour comparison code compares two separate DICOM structure files, one for the expert and one for the trainee. The developed code was tested for accuracy against volumetric calculations taken directly from within the TPS. The second type of feedback is the planning metrics that can be generated either at the time of treatment plan approval or at the completion of the patient treatment. Both treatment planning and treatment errors can be simulated within the TPS in order to generate a composite treatment plan. For example, if the error was a dosimetric error in which the prescription was entered incorrectly, the prescription of the treatment plan is simply changed to reflect this error. If the error is a treatment error in which the patient is set up incorrectly at the time of treatment, the location of the treatment beams in the treatment planning system is shifted by the amount of the setup error, keeping all of the other beam settings exactly the same. The TPS can recalculate the dose from this shifted dose distribution. A composite of different shifted dose distributions and/or planning errors can also be created by adding the plans together and scaling the dose distributions by the number of times the different errors occurred; for example, a setup error present in 2 of 35 fractions can be represented by weighting the shifted plan by 2/35 and the correct plan by 33/35. A composite dose volume histogram (DVH) can be created from this composite plan. To investigate the details of how the metrics change based on specific errors, a set of ten errors was simulated onto three head and neck cancer datasets. Within all three datasets the same normal tissues were contoured and the prescribed doses were


the same. This study was done in order to investigate the details of the normal tissue metrics and to determine which, if any, best described the overall effect of the errors on the patient's risk of injury. Head and neck treatments were chosen because they are frequently treated, so there are a number of different cases, and because of the anatomy they remain one of the more complicated treatment sites in most radiation oncology clinics; more complex treatments have been associated with either a higher incidence of errors or more severe consequences from errors (53, 86-87). The three patients had been treated with IMRT to the primary target and involved neck nodes to 70 Gy, while simultaneously treating the secondary lymphatic nodes to 63 Gy and the contralateral lymphatic nodes to 57 Gy. For the low neck region, a single anterior field was used to treat the lower neck to 50 Gy, using a midline block as needed to spare the cord. The upper and lower fields were matched by using a single isocenter and asymmetric jaws. A total of ten different errors was simulated. Four of them were geometric setup errors occurring in between 2 and 5 different treatments out of the 35 total treatments. Four of the errors were dosimetric and occurred primarily at the time of treatment planning. The final two were a combination of both dosimetric and spatial errors. The detailed descriptions of the ten different errors are included in Appendix D. A second study was conducted to evaluate changes in metrics with different anatomy and prescription doses over a range of different treatment types. Because the anatomy and the prescribed doses could change from one case to another, the recommended metrics should be applicable over all cases that would occur within the VROC. Ten unique datasets from different disease sites were selected and three different


errors were simulated onto each. The errors that were selected were based on the likelihood of occurrence, but were modified slightly to create metric sets that would include all levels of severity. For each patient, the metrics for the errors and the standard plans were calculated. The details of the specific errors and disease sites used for this study are included in Appendix E.

Metric Calculations

The metrics were calculated based on the DVH from the composite treatment plan. The details about the metrics can be found in Chapter 3 and details about the developed code are included in Chapter 4. A complete list of all of the metrics that were calculated is included in Table 6-1. The metrics were calculated for the error plans as well as for the standard plans. The difference and percentage change between the standard plan and the error plan for each metric was also calculated. All of the probability metrics (NTCP and TCP) that were calculated were combined mathematically in order to calculate a probability of complication-free survival, P+, which was calculated from the combined NTCPtot and the TCP. The combined NTCPtot represents the percentage likelihood of any complication from any structure. The model parameters that were used to calculate EUD, NTCP, and TCP are included in Appendix C with other details about the metric calculations. Detrimental dose (DD), as described in Chapter 3, is a number that is related specifically to errors in radiation oncology. It is calculated by weighting the dose differences to different structures caused by the error by the relative sensitivity of each structure. For the purposes of this study, the dose differences to each structure were taken from the change in the EUD for each structure. The weighting factors that were


used for each structure were taken from the publication and are reprinted in Appendix C.

Error Scoring

One of the goals of the metric calculation analysis was to be able to correlate the metric scores to the overall severity of the error. There is currently no consensus on severity scoring, nor is there a national radiation error reporting system. Newly published recommendations for a standardized national error reporting system include two different error scoring systems. These were described in Chapter 3 (Table 3-3 and Table 3-4) (77). One is based on dose severity, determined by the percentage change in dose to any structure. The other is a consequence severity, based on expert opinion and on the grade of different side effects from the error. Ideally, a set of metrics that could automatically predict these scores would be useful for feedback from the VROC system. For the dose severity score, the percentage EUD change between the error plan and the as-treated plan was used. The recommendations for this dose severity scoring are based on the maximum percentage change in the dose to any structure. There are no specific recommendations in the report on how these should be normalized. For simplification and consistency, the percentage EUD changes were normalized to the prescribed dose. Both increases and decreases in target dose are considered, and only increases in doses to normal tissues are considered. The maximum of all of these percentage changes in EUD was then used to determine the dosimetric severity score. For the consequence severity metric, all sixty of the dose errors (thirty head and neck errors and thirty from a variety of disease sites) that were calculated were evaluated by


experts in radiation oncology (two physicists and one physician), who gave a score using Table 3-4 along with clinical experience to predict the consequences. To score the severity of the errors, the experts had access to the different EUD values, the change in coverage of the target structures, the changes in both EUD and NTCP to normal tissues, and a complete description of the error. A cross section of the patient anatomy and treatment plan was also included to help describe the region of the patient that was treated. In some cases the expert could make an estimate of the severity based completely on their knowledge of the anatomy involved and not based on the specific data presented.

Results

Contouring Metrics

The contour metrics feedback calculations were compared to volume calculations from within Eclipse along with hand calculations for the similarity metrics. A test case with several different geometrically shaped structures (cube, sphere, single-slice square) was created. The volumes of the structures calculated in Eclipse were compared to those calculated using the VROC contour comparison code. The agreement was within 5% for all structures that were > 1 cc. The small structure volume was calculated as 0.7 cc in Eclipse and 0.9 cc in VROC. For the similarity calculation, the target structures were copied and changed to create overlap, under-lap, and misalignment of the objects. Volume calculations within Eclipse were used to get the information needed to perform hand calculations of the similarity metrics. Three different scenarios (overlap, under-lap, and misalignment) were evaluated in this way.
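As an illustrative hand calculation of the kind used here (the numbers are hypothetical, not the actual test structures): if a 1000 cc expert cube and an equally sized trainee copy are offset so that 500 cc overlap, the Dice similarity is 2(500)/(1000 + 1000) = 0.50, the percentage of the target missed is 50%, and the normal tissue included is 500 cc.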


Error Scoring

The primary goal of investigating the metrics was to determine if there would be a way to automate both the dose severity and consequence severity scores by using the other metrics that were calculated. The dose metric from Table 3-3 (Ford et al. (77)) scales with dose, but is not linear. Since this metric is directly related to the change in dose to any structure, the values from within the table were plotted to determine an equation to be used for automatically calculating the dose severity metric based on the percentage change in EUD. A logarithmic fit to the published table yielded Equation 6-1, which was included in the metric feedback calculations so that the dose metric could be obtained automatically once all of the %EUD calculations are made. Three different experts in radiation oncology evaluated the different error plans to determine the consequence severity score of each plan. The average and standard deviation of all of the scores were calculated and used for comparison against the various other metrics. The averages and standard deviations of these scores are included in Appendix D (head and neck) and Appendix E (variety site cases). Initial observations of both datasets indicate that for errors that would fall between 2 and 7 there is a greater variability in expert score than for errors that would have no effect (score = 0) and those that would definitely cause harm or death (score = 10). Over all sixty errors reported, the mean standard deviation was 1.38. Upon further investigation, one problem with this scoring system is the non-linearity that is introduced because of the recommendation to score a possible recurrence due to


an under dose as an 8 (77). When these cases were taken out of the analysis, the average standard deviation in the scores improved to 1.16. The under dose errors were then evaluated separately. In total there were 35 different cases in which the target %EUD decreased with the simulated error. Of these, only 17 were rated by any one of the three raters as an 8. Of the 17, about half showed complete agreement between the three evaluators. For those in complete agreement, the minimum absolute %EUD change reported was 12.6%. For those in which there was not agreement, the average under dose was 4.4%. Of those cases where the EUD decreased but the error was not scored as an 8, the change in target EUD was < 5%. Also associated with an under dose is the percentage of the target that is covered by the prescription line. The average change in percentage coverage for all of the 17 under dose cases was 53% (14% minimum). For those that were not in agreement, the average was 38%. Of those cases where the EUD decreased but the error was not scored as an 8, the average change in coverage was 4%. For automation of the consequence severity score, a cutoff rule was selected such that any target dose decrease of 5% along with a percentage coverage loss of 10% would be reported as an 8 in the initial VROC feedback. The averages and maximum values of the consequence severity scores from the three raters were compared to the dose severity score calculated by Equation 6-1. This was to determine if the dose severity score could be used as an approximation of the consequence severity score. For all 60 cases, least squares regression was used to relate the consequence severity scores to the dose severity metric. The best correlation between the dose severity metric and any of the


consequence severity metrics was found for the average inter-rater score (R² = 0.675). Analysis without the 17 under dose cases improved the correlation (R² = 0.818), compared to R² = 0.790 for an individual rater. This indicates that the dose severity score can serve as a guide for the consequence severity score.

Metric Analysis

Head and neck data comparison

The metrics from the head and neck cases were first investigated to determine how best to report the different metrics and to observe general trends in the data. A sample of the metrics for the standard plan (without errors) and two plans with errors is shown in Table 6-2 for illustration. The dose severity score and the consequence severity score that were given to these plans are included in the table for comparison. These samples were chosen because the errors had severity scores > 0. A complete list of all of the metrics for all thirty of the head and neck cases is included with a discussion in Appendix D. The plan listed as Error 1 represents a spatial error in which the patient setup was approximately 3 cm different from the desired location for 2 of the 35 treatments. Error 2 in Table 6-2 represents a spatial error in which the patient was aligned to some other external marks that were on the patient instead of the ideal location. For this example, the patient was set to the wrong isocenter for 5 of 35 treatments. Two of the first observations that were made with the data were related to the mean target dose and to the TCP that were initially calculated. The mean target dose was, for most purposes, redundant with the target EUD and was therefore eliminated from reporting. It was, however, useful for checking that the EUD calculations were being performed correctly during initial calculations. The second observation was


related to the tumor control probability (TCP). This value should be related to the likelihood of tumor control and should approach 100% for a better control rate. The initial calculations of these values yielded unexpectedly low numbers for the standard plans, even though they were clinically acceptable plans that would presumably have good tumor response. For the three clinical head and neck plans, initial TCP values were 65.2%, 58.7%, and 61.3%. Another issue with using this metric was that it increases (or improves) for all increases in dose; therefore, for any over dose, the TCP will appear to improve. For these reasons, TCP was not recommended for final reporting. Because the TCP was not included, the P+ values that were proposed (combined probability) were also not used for further analysis. For target volumes, the EUD to the target appears to be a good indicator of the overall dose to the target structure. This metric and the percentage of target volume covered by the prescription line are indicators of the target receiving adequate dose. For the examples in Table 6-2, Error 2 shows that only 68% of the target volume received the prescribed dose when this error occurred. This could be useful in describing a lack of tumor control. To also check that the changes in EUD were not overly sensitive to specific patient anatomy, the same ten errors were duplicated on a total of three different head and neck patients. The average standard deviation for all target EUD differences was 1.3 Gy out of the prescribed 70 Gy, or < 2%. The worst EUD correlation between the three patients was with one particular error where one field was treated for extra fractions (#9 in the Appendix D tables). The standard deviation between the 3 patients was 10.1 Gy, indicating that for this particular type of error the difference in EUD could


vary greatly from one patient to the next. In general, the EUD differences were not sensitive to specific patient anatomy for the same type of target and the same type of error. The conformity index (J) is related to the target volume and to the overall tissue volume receiving the prescribed dose. A low conformity could indicate either a miss of the target, an over dose, or an under dose. In the examples in Table 6-2, the change in the overall conformity was worse for the error that occurred for five treatments than for the one that only occurred twice. Initial observations indicate that the conformity index appears to change for all different types of errors, including dosimetric errors as well as spatial errors. In order to consider possible harm done to the patient due to the error, the NTCP and EUD for each of the normal structures were evaluated. Normal structures included in the head and neck cases were: brainstem, spinal cord, parotid glands, larynx, oral cavity, and eyes. Upon closer examination, the only normal structures for which the NTCP was greater than 1% for either the normal plan or the error plan were the ipsilateral parotid and the spinal cord. All of the data for the head and neck study are included in Appendix D. General observations will be described here. Relative to the parotid gland doses, only one of the three patients had high values of NTCP. This patient had an NTCP value of 48.8% for the standard plan. The other two patients' NTCP values were < 5%, indicating that the NTCP values are sensitive to the specific patient anatomy and treatment plan. Another observation was that the change in NTCP would be greater if the NTCP is already high. This is the case for the patient illustrated in Table 6-2. The NTCP is already very high, indicating the patient

The NTCP is already very high, indicating that the patient will likely suffer side effects to the parotid gland. Any change to the plan due to an error will likely increase either the severity or the likelihood of those side effects, more so than for a patient whose initial plan has an NTCP close to 0%. These observations indicate that side effects from errors cannot be generalized over different patients for the same tumor type and error.

A similar observation was made for the NTCP values of the spinal cord. These values were minimal for all situations except for one significant error in which the spinal cord would have received a potential dose increase above the threshold of 45 Gy. Again, the initial observations indicated that for only one of the three patients did the NTCP change enough to indicate a significant risk of injury.

Initial observations indicated that it may be useful to combine the NTCP values in order to eliminate reporting the structures with NTCP values of less than 1%. To create a simplified set of reported values, a combined NTCP_tot could indicate whether the trainee should be concerned with looking into patient side effects. There is an added benefit in combining the NTCPs when comparing errors that occur on different patients with different normal structures.

Initial observations of the changes in EUD to normal tissues were made to help determine the overall magnitude of the error. The dose severity metric is based on the maximum percentage change in dose. For this purpose, the EUD was normalized to the prescribed dose. For example, the right eye dose for Error 1 (Table 6-2) changed from 1.64 Gy to 2.37 Gy. This could be described either as a 45% increase relative to the standard plan or as a change of 1% of the overall dose (0.7 Gy out of the 70 Gy prescribed dose). For comparing different plans and errors, all EUD differences were normalized to the prescribed dose to determine how significant a dose change had occurred.
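The combination rule behind NTCP_tot is not spelled out here, but the combined risk values in Table 6-2 are consistent with treating the individual complication probabilities as independent events. The sketch below shows that interpretation; it is a plausible reconstruction rather than the exact VROC implementation.

```python
def combined_ntcp(ntcp_values):
    """Combine per-structure NTCPs (fractions, 0-1) assuming independence:
    the probability that at least one complication occurs."""
    p_no_complication = 1.0
    for p in ntcp_values:
        p_no_complication *= (1.0 - p)
    return 1.0 - p_no_complication

# Baseline column of Table 6-2: ipsilateral parotid 52.72%, oral cavity 0.57%,
# spinal cord 0.06% (all other structures ~0%)
print(round(100 * combined_ntcp([0.5272, 0.0057, 0.0006]), 2))  # ~53.0, matching the 53.02% combined risk
```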

The final metric proposed in Chapter 3 was the Detrimental Dose (DD). The DD was calculated as a possible means of scoring or ranking the severity of an error, and it appears to scale with the severity of the overall error. While the magnitude of the DD increases with an expected increase in error severity, it is unclear how it relates to the actual perceived severity or to physical dose.

Multiple Patient Metric Analyses

Unlike the head and neck study, the patients in this study were each treated to different prescribed doses and included different normal structures. Because the number and types of structures also differed between patients, only the combined metrics and normalized metrics could be compared. These included the percentage EUD changes, the change in conformity (J), NTCP_tot, and DD, along with the severity scores given to each of the different treatment plans. A complete list of these values for the thirty different treatment plans is included in Appendix E.

Combination of All Error Sets

All of the data from both studies were combined to determine whether any of the calculated metrics could be used within the VROC to recommend a consequence severity score that would correlate with expert scores. As already indicated, the consequence severity associated with under dose (a score of 8) can automatically be assigned based on a threshold of a 5% change in target EUD and a change in target coverage of at least 10%. The remaining analysis was to determine which of the metrics would best approximate the consequence severity score.

Because the VROC will include patients with different anatomy who are planned to different doses, the analysis focused on metrics that could be normalized and on combined metrics that eliminate patient-specific anatomical calculations. Linear regression analysis was performed on all data that could possibly be related to the severity. This included the target EUD (Gy), the percentage change in target EUD, the percentage target coverage, J, NTCP_tot, and DD. In addition to each of these values, the differences between these values and the standard-plan values were also evaluated. A total of 43 of the original 60 error plans were evaluated (the under dose plans were not included). Based on linear regression between the average consequence severity score and each of the different reported metrics, the metric with the highest correlation to the average severity score was the change in J, with a correlation of R² = 0.467.

The datasets were then sorted by the average severity metric to determine trends in the metrics. Table 6-3 lists the average of each metric grouped by severity. Some of the severity scores were combined because of the relatively few errors with those values. By grouping the values based on natural cutoffs, the change in conformity fit the consequence severity score with a correlation of R² = 0.92. The graph of the change in conformity index versus severity score is included in Figure 6-1.
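The grouping-and-regression step described above can be reproduced with a short script. The sketch below fits the grouped mean conformity changes from Table 6-3 against the corresponding severity groups and reports the slope, intercept, and R²; the helper function is illustrative and is not the analysis code used in this study.

```python
import numpy as np

def fit_severity_vs_delta_j(avg_severity, avg_delta_j):
    """Least-squares line and R^2 for |change in conformity| vs. severity."""
    severity = np.asarray(avg_severity, dtype=float)
    delta_j = np.asarray(avg_delta_j, dtype=float)
    slope, intercept = np.polyfit(severity, delta_j, 1)
    predicted = slope * severity + intercept
    ss_res = np.sum((delta_j - predicted) ** 2)
    ss_tot = np.sum((delta_j - delta_j.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Grouped averages taken from Table 6-3 (severity group, mean |delta J|)
severity_groups = [0, 1, 2, 5, 6, 8]
mean_delta_j = [0.01, 0.02, 0.08, 0.28, 0.26, 0.43]
slope, intercept, r2 = fit_severity_vs_delta_j(severity_groups, mean_delta_j)
print(f"|dJ| = {slope:.3f}*severity {intercept:+.3f}  (R^2 = {r2:.2f})")  # consistent with Figure 6-1
```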

Conclusions

In the training of radiation oncology residents there is very little opportunity for the resident to learn from their own mistakes or those of others. One of the key requirements for the VROC was that it allow errors to occur in order to provide opportunities to investigate how errors affect patient treatment. Additionally, there are very few quantitative metrics that can be used either to score the severity of an error that occurs or to evaluate a trainee's ability to manage a patient. By evaluating a set of possible metrics associated with radiation therapy, we have identified a simplified set of metrics to use as feedback for the VROC system.

Contour metrics can be easily calculated by comparing two separate structure set files. The Dice similarity metric and the calculated volumes and percentage values were tested for accuracy, which indicates that the code was functional. Further testing could be done to evaluate the overall usefulness of these metrics as they relate to errors; a minimal sketch of the Dice calculation is given below.

The remaining study focused on the metrics associated with the composite treatment plan. The metrics were compared to a scoring system that is being implemented for a national radiation error reporting system. The scoring system includes a dose severity score and a consequence severity score. One result of the study, which used three evaluators to score different treatment errors, indicated that the scoring system may be prone to inconsistencies in how different experts score different errors. As part of an additional study, it may be useful to better describe the values associated with the different scores in order to improve consistency. Listing specific temporary and permanent side effects, or describing the specific interventions necessary to treat the radiation-induced side effects, could help provide standardization. A possible follow-up study of this scoring system would be to have more experts score the errors and, in addition, provide feedback about the specific toxicity that caused them to rate an error with a particular score. Based on this information, the scoring system could be changed to better describe the consequences of a radiation error.
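As referenced above, the Dice similarity metric used for the contour-comparison feedback can be computed directly from two binary structure masks. The sketch below assumes the expert and trainee structures have already been rasterized onto a common grid; the mask construction shown is a toy example, not the VROC structure-set parsing code.

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity: 2|A intersect B| / (|A| + |B|) for two boolean volumes."""
    mask_a = np.asarray(mask_a, dtype=bool)
    mask_b = np.asarray(mask_b, dtype=bool)
    intersection = np.logical_and(mask_a, mask_b).sum()
    total = mask_a.sum() + mask_b.sum()
    return 2.0 * intersection / total if total > 0 else 1.0

# Toy example: expert contour vs. trainee contour on a small grid
expert = np.zeros((10, 10), dtype=bool);  expert[2:8, 2:8] = True
trainee = np.zeros((10, 10), dtype=bool); trainee[3:9, 3:9] = True
print(round(dice_coefficient(expert, trainee), 3))  # overlap of two offset squares
```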

With respect to the target volume metrics, two conclusions were made. First, the calculation of the TCP is not useful in comparing errors, since it appears to improve with errors. Another reason not to use TCP is that for relatively low doses, such as palliative care, the TCP is not meaningful. Second, the changes in EUD with different errors on a common disease site indicated that the change in EUD is not sensitive to patient-specific anatomical changes and serves as a good indicator of the overall change in plan dose.

A non-linearity in the consequence severity score was noted for target volume under dose. In the VROC, an under dose in the target will be identified by a decrease in the target EUD. A consequence severity score of 8 will be assigned for an error if the target dose decreases by 5% and the target coverage changes by 10%. Overall, more testing must be done to determine specific cutoff values, but this would require additional validation of the expert scores.

The remaining analysis was done to try to describe the relative risk to normal tissues as a result of the error. Historically, errors were reported primarily based only on the change in dose to the target volume. The State of Florida has changed the required reporting guidelines (Code 64E-5.101) (88). The physician needs tools to help determine whether unintended permanent functional damage would occur as a result of the error. The NTCP_tot is theoretically useful for determining whether there is an increased risk to the patient as a result of an error. Changes in NTCP_tot could indicate that the physician should investigate further to determine which structure is most at risk.

There were two specific issues with the correlation of NTCP_tot to the consequence severity score. The first was an issue with the scoring system itself, which was already described. The second was that for some plans, such as palliative cases, not enough structures were contoured around the target volume to represent the error. For consistency, all of the tissue surrounding a target volume, including unspecified soft tissue and bone, would need to be defined consistently from one patient to the next so that every patient has structures near the target volume from which to calculate NTCP. To improve the correlation, this study should be repeated with all normal structures contoured on all datasets and all structures included in the NTCP calculations.

A problem similar to that described for the NTCP calculation was observed for the detrimental dose. The DD could possibly be a good indicator of the overall severity of the error. Theoretically, weighting the different structures by their sensitivity as well as by the dose change would appear to correlate with the actual perceived severity of the error. The data analysis from this study did not show this correlation, most likely because of the variety of different normal tissues used in the calculation. In order for this metric to be correlated to a health physics concept, the overall burden on the patient must be considered; all of the irradiated volume must be accounted for in one of the structures in order for the calculation to scale appropriately over a variety of cases.

Lastly, the metric that was surprisingly well correlated to the consequence severity score was the conformity index (J). This is a calculation of how well the prescription volume matches the target volume. The metric is sensitive to both dosimetric errors and spatial errors.

Conformity also changes for both under dose and over dose situations. Additionally, the metric itself is normalized between 0 and 1, making it easier to compare across diseases and prescribed doses. The average change in J was correlated to the average consequence severity score in order to provide an equation that could be used within the VROC feedback code. Using this equation, a consequence severity metric can be calculated in the VROC. It is important to note that the error bars on this metric are very large and the correlation was not strong; therefore this severity score is only a recommendation and is for educational purposes only.

Examples of the recommended VROC feedback are shown in Chapter 7, which includes a complete example of an error and how the feedback metrics can be used for an error study. The output from the VROC feedback report includes the prescribed dose, the change in EUD to the target volume, the dose and consequence severity metrics, the change in conformity, and the change in NTCP_tot. A second table reports all target and normal tissue EUD and NTCP values. The severity scores given in this report are recommendations; the trainee determines their own recommended severity scores when completing the root cause analysis.
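The equation referred to above comes from the linear fit between the average change in conformity and the average consequence severity score (Figure 6-1). The sketch below inverts that fit to suggest a severity score from an observed conformity change; the clamping and rounding behavior are assumptions added for illustration, and, as noted above, the resulting score is an educational recommendation only.

```python
def recommended_severity(delta_conformity, slope=0.053, intercept=-0.016):
    """Suggest a consequence severity score from the absolute change in the
    conformity index, by inverting the Figure 6-1 fit |dJ| = slope*S + intercept.
    The result is clamped to the 0-8 scoring range."""
    severity = (abs(delta_conformity) - intercept) / slope
    return max(0, min(8, round(severity)))

print(recommended_severity(0.30))  # a conformity change of 0.30 maps to a recommended score of 6
```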

Table 6-1. Data calculated for all cases

PTV values:
  (1) Volume (cc)
  (2) EUD
  (3) TCP
  (4) Mean dose
  (5) Coverage by Rx line
Normal structure (n):
  (6) EUD (n)
  (7) NTCP (n)
  (8) Normal cc > Rx
Combined:
  J (from lines 1, 5, and 8)
  DD (from lines 2 and 6)

Table 6-2. Example of calculated metrics for a single head and neck case

Metric                            | Baseline | Error 1 | Error 2
Prescribed dose (Gy)              | 70       | 70      | 70
# Fractions                       | 35       | 35      | 35
# Error fractions                 | 0        | 2       | 5
Dose severity metric              | -        | 1       | 5
Consequence severity metric       | -        | 0       | 6
EUD PTV1                          | 75.75    | 75.17   | 71.10
% Coverage by Rx                  | 97.09    | 96.09   | 68.20
CI (J)                            | 0.74     | 0.75    | 0.61
EUD brainstem (Gy)                | 22.24    | 22.16   | 22.33
NTCP brainstem (%)                | 0.00     | 0.00    | 0.00
EUD ipsilateral parotid (Gy)      | 46.31    | 47.07   | 47.91
NTCP ipsilateral parotid (%)      | 52.72    | 59.13   | 65.74
EUD contralateral parotid (Gy)    | 19.73    | 20.97   | 24.25
NTCP contralateral parotid (%)    | 0.00     | 0.00    | 0.00
EUD rt eye (Gy)                   | 1.64     | 2.37    | 7.69
NTCP rt eye (%)                   | 0.00     | 0.00    | 0.00
EUD lt eye (Gy)                   | 1.12     | 1.53    | 5.35
NTCP lt eye (%)                   | 0.00     | 0.00    | 0.00
EUD larynx (Gy)                   | 20.61    | 21.67   | 23.78
NTCP larynx (%)                   | 0.00     | 0.00    | 0.00
EUD oral cavity (Gy)              | 57.97    | 58.62   | 57.98
NTCP oral cavity (%)              | 0.57     | 0.69    | 0.58
EUD spinal cord (Gy)              | 34.66    | 34.57   | 34.80
NTCP spinal cord (%)              | 0.06     | 0.06    | 0.07
Combined risk (%)                 | 53.02    | 59.43   | 65.96
DD (Gy*)                          | 0.00     | 27.22   | 113.80

Table 6-3. Average consequence severity vs. average combined metrics

Average severity | Avg. NTCP_tot Dif | (SD)  | Avg. CI Dif | (SD) | n (points)
0                | 0.01              | 0.02  | 0.01        | 0.03 | 10
1                | 0.00              | 3.04  | 0.02        | 3.04 | 9
2                | 0.09              | 3.66  | 0.08        | 3.66 | 7
5                | 3.76              | 3.61  | 0.28        | 0.20 | 5
6                | 28.64             | 29.75 | 0.26        | 0.22 | 7
8                | 24.81             | 37.78 | 0.43        | 0.29 | 5
R²               | 0.71              |       | 0.92        |      |

Figure 6-1. Absolute change in conformity vs. average severity score. [Scatter plot of absolute change in conformity (y-axis) against average consequence severity score (x-axis), with linear fit y = 0.053x - 0.016, R² = 0.973.]

CHAPTER 7
ROOT CAUSE ANALYSES AND MEDICAL EDUCATION

Background

One of the benefits of using a virtual reality system for training is that errors and other complications can be introduced into a simulation that cannot be allowed to occur in the normal clinical process. In addition to providing opportunities to teach trainees how to identify errors, it also provides an opportunity to introduce discussion of cause and prevention. Root cause analysis is often used to understand the system features that permitted an error to occur. Error mitigation and investigation is required curriculum for medical residents and medical students; however, there are no details on how this curriculum should be incorporated into training, and there is no standard for testing on error reporting or on error mitigation.

To Err is Human (89) is one of the most cited references regarding medical errors. Two major recommendations from this book are the need for consistent definitions of medical errors and the establishment of national reporting systems for medical errors. Currently, The Joint Commission (TJC) defines a drastic error, termed a sentinel event, as an unexpected occurrence involving death or serious physical or psychological injury, or the risk thereof (90). These events are required to be reported to TJC. The State of Florida uses the term medical event rather than sentinel event (88), but it is a similar reporting requirement for serious injury or death. The recommendations are not as clear for situations in which side effects can be managed or for errors that were caught prior to harming the patient. The recommendation to record all errors, including near misses, is based on error mitigation studies indicating that investigation of all errors provides information to better structure quality assurance programs. This is the reason there has been increased interest in the radiation oncology community in creating a national incident reporting system (56, 77, 91).

The benefit of widespread use of an incident learning system is the opportunity to evaluate the quality program at one's own institution.

One of the most widely used methods for investigating the cause and effect of specific errors is the Root Cause Analysis (RCA). An RCA is conducted for any of the errors that are reported to The Joint Commission or to the State of Florida. The goal of an RCA is to evaluate the error, without blame, to determine methods to prevent the error in the future. There are several different tools available to help in conducting an RCA. For implementation into the VROC system, only the most common RCA methods were evaluated. Three common methods of conducting a root cause analysis are: 1) ask why five times, 2) the causal tree, and 3) the decision table (92). The entire process typically requires a designated committee to interview each individual involved to determine all factors of a particular event. The results of the RCA highlight the main reasons the error occurred and identify specific measurable improvements that can be implemented to resolve the issue.

Limited studies are available that evaluate the utility of allowing residents to be involved in the RCA. A study by Voss et al. (93) piloted a new curriculum for medical residents that specifically addressed how to handle and respond to errors and address safety concerns. This curriculum involved several didactic lectures as well as a project in which the residents were required to conduct a root cause analysis and develop a quality initiative project. They reported successfully implementing 15 out of 25 unique quality improvement measures into routine clinical use. These quality improvements ranged from the addition of phone lines to improve communication between doctors and nurses in urgent care to the removal of possibly fatal gases from a gastronomy order form.

Additional insight from this study was that the curriculum positively changed the residents' opinions about how they behaved and responded to adverse events after this exercise. The authors noted that the hands-on experiences were worthwhile to most of the residents and that the training was successful in shifting their thinking toward a systems view of error. Other conclusions and observations from the residents' positive review of their experience with the RCA exercise included:

1. Having real experiences with an error helped the resident appreciate the importance of safety.
2. The resident needed to be able to see a measurable change (in quality and safety).
3. Trying to implement too many projects meant that not all of them could be carried through.
4. It is important to involve all parties in the analysis of an error.

Their ongoing recommendation also includes better assessment of learning objectives within the curriculum.

Huffman Dracht et al. conducted a survey of all residency programs in emergency medicine and several other clinical specialties (94). The goal was to determine whether emergency room residents are better trained in how to handle a medical error compared to residents in other departments. The research found that only about 31% of non-emergency-room residencies reported knowing whether it was mandatory to report a medical error to the state and in what situations it was required to write a letter to the patient or to apologize to the patient. There were no formal curriculum guidelines requiring a certain number of lectures on the topic of errors; however, most residency directors reported at least 3-4 lectures and the required presence of residents at morbidity and mortality conferences to discuss deaths from errors within the hospital.

While this is not specific to radiation oncology, it is reasonable to assume that radiation oncology residents are no better prepared to handle errors than residents in other areas of medicine, with the exception that, as part of their required radiation physics training, they review state licensing and regulations for radiation errors.

In another study, Dror et al. (95) evaluated a new approach to training on error mitigation. This study suggests that focusing on error recovery rather than error reduction helped to alert residents to errors that could occur. The summary indicates that the advantage of focusing on error recovery was improved skills and knowledge necessary to detect and mitigate errors. This focus on error recovery at the point of an error also minimized the consequences of errors; stressing the correction of errors as soon as they occur can prevent further escalation of the error. Also noted was the importance of quick feedback during initial training. One other note was that practice cases should focus on exaggerated errors in order to emphasize the error within the training scenario. Another technique described by Dror et al. was to start the trainee by detecting the errors of others, working up to more complex situations and to situations in which there are many distractions.

A similar study by Vincent et al. (96) evaluated training in safety in medicine and came to the conclusion that a better method of engaging teams in training about errors was to require teams to identify and diagnose errors.

This study was conducted by having students watch a procedure in which a patient is given the wrong medication and using this to discuss the process of identifying and detecting errors. The residents were then encouraged to find errors that others had made and to discuss these errors with the end goal of prevention. This exercise could be extended to radiation oncology residents by using the VROC simulation to illustrate a number of different errors to new residents. This would allow many more opportunities to see different cases while not putting actual patients at risk (97). This may also be an opportunity for continuing education or for recertification, where the root cause analysis and error mitigation could be conducted within the constraints of a training session.

As has already been described, there are no specific error-training curriculum recommendations for radiation oncology. There are, however, new specific error reporting guidelines that should be incorporated into the error training already in practice. Published in Medical Physics in December of 2012 (77) are recommendations for the data requirements for designing a reporting system. Included in the recommendations are the two different error scoring systems used to evaluate the metrics in Chapter 6. Also included, for replication and to provide consistency in error reporting, is a hierarchical chart of root causes in radiation oncology. While Ford et al. do not make recommendations on how to arrive at these root causes, this common set of root causes can help to provide insight into quality assurance programs and should also be taught for consistency.

Summarizing the studies cited above, there are several key features that should be included in a new error-mitigation curriculum for medical residents in radiation oncology. These are summarized below; the guidelines provide educational goals in developing a root cause analysis tool that complements the VROC system.

- Hands-on training is best (93)
- Use recommended RCA methods (92)
- Allow residents control over measurable quality improvement methods (93)
- Include didactic information on how and when to report errors (94)
- Focus on error recovery (95)
- Focus on error detection, in their own work and in the work of others (97)
- Apply causation tables from the ASTRO recommendations (77)

Development of RCA Exercise

A root cause analysis tool for residents using the VROC system consists of two parts: a didactic curriculum and a clinical exercise in conducting an RCA. While some recommended reading and references are included within the Resident Manual for the VROC, the didactic curriculum for the radiation oncology resident is the responsibility of the residency director, and in situations where RCA is already covered it may not be necessary to have a complete course on RCA techniques. Some of the best tools that have been found for training are referenced below to provide enough background information for the resident to complete the exercise. The two articles included in the VROC Trainee Manual are the radiation oncology reporting guidelines (77) and the description of the three most common RCA techniques (92).

The best resources for someone trying to investigate how to perform a root cause analysis are found through two organizations that are available over the Internet. These are the National Patient Safety Foundation (NPSF) (98) and the National Center for Patient Safety (NCPS) (99). The NPSF was founded in 1997 and has as its primary mission providing a curriculum in safety education, including continuing medical education (CME) courses that can be scheduled and taken online. They also offer a list of courses that one can attend.

A board certification organization has recently been created in conjunction with the NCPS. The Certification Board for Professionals in Patient Safety has created a certification process for professionals interested in demonstrating proficiency in patient safety initiatives. This further helps to standardize training and the error review process. This board also provides information on error analysis and review and, together with the NPSF, helps to promote good practices in patient safety.

The other organization that has a strong web presence and many helpful tools for patient safety review is the National Center for Patient Safety (NCPS). This patient safety center was established in 1999 to assist in error reduction within the VA organizations. The website www.patientsafety.va.gov (99) has several tools to assist in the RCA process, including a flip chart that can be used to guide the investigator through a root cause analysis. This flip chart can be ordered through the website listed above.

One important aspect of the RCA is to require the trainee to review current state and federal reporting requirements. Within the State of Florida, the recommendations for all medical aspects of radiation and the required reporting for adverse effects can be found in Florida Statute 64E-5.101 (88). Copies of the specific sections related to linear-accelerator-based therapeutic radiation use are included in the VROC Resident Manual for reference.

The second aspect of the root cause analysis curriculum for the VROC is a tool that provides hands-on training in the RCA process along with each virtual patient that encounters an error. The developed RCA tool is a combination of two different RCA techniques.

Unlike a real RCA situation, within the VROC simulation the trainee works independently and does not have a team to assist in the error investigation. This limits the use of some of the more arduous RCA processes involving complicated flow charts and diagrams, which are cumbersome and time consuming and may be more confusing than helpful within the VROC setting. For these reasons, the VROC root cause analysis tool is based on the 5 Whys RCA technique along with the published causation table from the ASTRO/AAPM error reporting recommendations (77). The 5 Whys technique for error analysis has its origin in the Toyota Automotive Company (100), where it was used to perform error analysis to improve manufacturing. The main reason this technique was chosen for the VROC root cause analysis tool is that it is simple to describe and can be easily implemented. Another advantage is that it does not require additional resources. Also, this technique can be conducted in the absence of a management team, and the resident can be prompted through the steps with a simple set of questions. To use this technique, the person performing the error analysis begins by asking the question "Why?"; the question is repeated five times in order to identify the root cause by investigating the cause-and-effect relationships along the process. If the final outcome is not an actual root cause, the process can be repeated. Issues with this particular technique are that it sometimes does not lead to the actual root cause, or it may identify an aspect of the situation that does not follow cause-and-effect relationships.
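To make the prompting concrete, the sketch below shows one way the 5 Whys sequence could be driven as a simple question-and-answer loop. The function, prompts, and console interaction are illustrative assumptions; in the actual VROC the trainee records the chain on the RCA form rather than running a script.

```python
def five_whys(initial_problem, max_depth=5):
    """Walk a trainee through the 5 Whys chain and return the why/cause pairs."""
    chain = []
    current = initial_problem
    for step in range(1, max_depth + 1):
        cause = input(f"Why #{step}: Why did this happen? -> '{current}': ")
        if not cause.strip():
            break  # trainee has no further answer; stop early
        chain.append((f"Why was: {current}", cause.strip()))
        current = cause.strip()
    # The last cause is the candidate root cause to check against the causation table
    return chain

# Example (answers typed by the trainee at the prompts):
# five_whys("Patient was treated 3 cm off the intended location for 2 treatments")
```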

To assist in determining whether the trainee has identified a root cause, a published causation table is also used in the VROC RCA tool (77). The causation table is another technique for conducting an RCA. It is typically up to each individual institution to determine its own set of cause-and-effect tables based on its specific workflow. This can take many hours to complete for the complicated radiation oncology workflow. While this may be a useful exercise for developing a quality assurance program, it is too time consuming for implementation within the VROC. The newly published set of common causes of radiation oncology errors groups these causes into six categories: organizational, technical, human behavior, patient related, external, and procedural. By implementing a rule-based set of tables, it is relatively easy to guide the trainee through the causation table to identify a specific cause that could have led to an error within the VROC virtual patient treatment.

One of the most important exercises within the RCA is to propose a quality improvement plan. Once a residency program is completed, most certifying agencies, such as the American Board of Radiology (ABR) (101), require the board-certified physician to maintain certification through continuing education and required documentation of participation in process-specific requirements related to quality improvement. Most certified physicians participating in MOC (Maintenance of Certification) will need to develop a quality initiative and carry it out, either by enrolling in national protocols or by conducting a chart review to establish metrics of quality and proposing improvements. The ABR lists several references for the user to read more about quality metrics and how to develop a storyboard for a patient quality improvement project (PQI). The general principles of the storyboard process, and of developing patient quality programs, are used in proposing a solution to the error that occurred within the VROC system.

One commonly accepted method for quality improvement is the plan-do-study-act cycle. Resources on how to perform these quality improvement procedures can be found through the Agency for Healthcare Research and Quality (AHRQ) (102). The steps in the process involve planning the improvement strategy, developing the strategy, monitoring the strategy, and reassessing the strategy. For this exercise there will not be time to conduct the monitoring and reassessment, but the trainee should take the time to plan a strategy, including the specific quality assurance plan to be implemented and the personnel required to implement the particular quality improvement. For development, the trainee is required to list specific objectives for their initiative, how they will implement the initiative, and what measures will be evaluated to determine whether the strategy is working. Along with this quality metric, the resident should think through barriers to implementation and budgetary constraints, and give a reasonable time frame for implementation as well as evaluation of the strategy.

Within the VROC system, once an error occurs that goes undetected for more than two treatments, the simulation is stopped. For all errors, the metrics and feedback data described in Chapter 6 are presented to the trainee by means of a summary report. The report includes the dose severity score and a recommended consequence severity score. For any error with a severity score of 1 or higher, the RCA form must be completed. A critical aspect of the exercise is to force the trainee to evaluate the cause and effects of errors within a radiation oncology department. To begin the RCA process, the trainee can access the VROC RCA Form through an icon on the desktop of each of the VROC workstations.

This form is also available through the UF Digital Archive (Object 4-6). A printout of the RCA form is also included in Appendix G.

In addition to the error scores and metrics, in some instances the error occurs at the treatment machine when the physician is not present. In this situation, it is important to get information from those involved in the error to determine how and why the error occurred. When an RCA is conducted in a real clinical situation, those involved in the error are interviewed to determine what occurred. To simulate a similar experience, the current VROC system includes written statements from the therapists describing what happened when the error occurred. These written statements are either fictional accounts of a particular scenario, written by experienced radiation oncology team members, or accounts taken from actual reported errors. Future versions of this RCA tool may include the use of videotaped reports from the therapists or even interactive virtual reality interviews with scripted avatars. Another option is the use of actors playing the role of the therapists during the root cause analysis to provide realism in the interview process. The priority of this work is to help trainees follow the recommendations of the groups who have made reporting guidelines and to focus their attention on quality improvement measures; therefore the logistics of the interview and investigation are left for further development in the future.

Example RCA

The best way to illustrate the RCA exercise is to walk through a particular error situation using the RCA form. The example below is taken from one of the head and neck patient errors that was simulated for the metric calculations of Chapter 6. Below is a description of the error that occurred within the VROC, including fictional reports from the two therapists describing the error.

The simulated error occurred for a patient who was receiving IMRT radiotherapy with daily IGRT for oropharyngeal cancer. The patient was aligned using a cone beam CT for daily imaging. The auto-alignment feature of the software caused the image registration to be performed to the wrong vertebral body in the neck. This was not caught while the patient was on the table, and the doctor did not review the images. The same thing happened two days in a row.

Sample Therapist Reports

Two different therapists were interviewed in the process of trying to determine what happened for this particular error. Below are fictional accounts from the two therapists who were involved.

Therapist 1: I was running the console and my partner was setting up the patient. Both of us went into the room, asked the patient their name, and checked that his chart was pulled up in Aria. We both put the patient in the mask, but I realized I had not programmed the machine for the CBCT. I went out of the room to select the CBCT and waited for my partner to come out of the room. After she came out of the room I started the CBCT. Once the CBCT was ready to review, I selected a region of interest and clicked on the auto button. The shifts seemed OK (~2 cm), so I applied the shifts, called for a Time Out to verify the patient, and started treatment. As far as I remember, the same thing happened the next day. Nothing seemed unusual about the setup or about the alignment, and I didn't notice anything out of the ordinary with the shifts. After the second treatment the doctor called saying that the images were bad and rejected the images. The doctor also paged the physicist. He asked us to write down what was done on these two days of treatment and informed us that the patient was treated off by about 2-3 cm both days.

Therapist 2: I went into the room to set up Mr. XX and waited for my partner to mode up the machine. I then checked that the final position matched what we had marked from the first day of treatment. I came out of the room and my partner started the CBCT. Another patient interrupted me to discuss their schedule, so I did not check the image fusion. After my partner applied the shifts, she called for the Time Out and I stopped to double check that this was the right patient.

The next day she was again running the console and I set up the patient. We both went into the room to put the mask on and verify the patient. After coming out of the room, a nurse had a question about another patient who was supposed to see the doctor after treatment. Again, at the Time Out I stopped to verify that we had the correct patient on the table, and we started the treatment. I assumed the doctor had reviewed the images before we treated the next day. Also, I was busy both days, so I did not look at the registration myself. My partner said that she reviewed the image registration and it looked good. When the doctor called to tell us he was rejecting the films, we looked and saw what he was talking about.

Scoring the Error

To complete the VROC RCA exercise, the trainee refers to the feedback reports received from the VROC system. The two feedback reports for this example are shown in Table 7-1 and Table 7-2. The RCA form requires the trainee to record the percentage change in dose to the target volume and to the two normal structures with the greatest dose change. These values are taken directly from Table 7-1 and Table 7-2. The trainee is also asked to determine whether the error is a recordable event or a medical event as defined by Florida state statute 64E-5.101. The trainee is then asked to evaluate the consequence severity score. A copy of the recommended reporting guidelines is included in the RCA form, and recommended values for this specific case, based on the metric calculations, are included in Table 7-1. It is up to the trainee to verify that these are appropriate values. For this example, the dose severity score is 1 and the consequence severity score is 1.

Review of the Error

The next steps in the RCA process have the trainee describe the error. This includes describing why they think the error occurred and how the error was detected. It may be necessary for the trainee to locate and review any hospital or clinic guidelines with which they are not familiar. Many times the root cause may involve someone not following the policies and procedures, or it may involve outdated or inaccessible guidelines.

Other common root causes are inadequate training or documentation about how equipment works. The trainee will need to investigate these prior to completing the root cause analysis. For the specific error in this example, some of the guidelines that may need to be reviewed include:

- Time out policy prior to imaging or treatment
- Timeliness of image review and approval
- Online image review policies (at the treatment machine)
- Daily use of the image registration software (is the registration OK?)
- Policies on distractions in the console area
- Procedures for IGRT when re-imaging is required

Once the resident has had a chance to review the policies, procedures, or guidelines that are in question, they can begin the root cause analysis. The next lines on the RCA form relate to the 5 Whys technique. The RCA form has room for the trainee to repeat this process twice, but if they are not satisfied with the direction they end up going, they are welcome to continue repeating it until they identify a root cause that can be found in the causation table. The example below illustrates how this would be done for the geometric error in this example.

Error: Patient was treated 3 cm off from the intended location for 2 treatments.

Why #1: Why was the patient treated 3 cm off from the intended location?
Cause: Because the patient was shifted 1 vertebral body off from the intended position.
Why #2: Why was the patient shifted 1 vertebral body off?
Cause: Because the registration was to the wrong vertebral body.
Why #3: Why was the registration to the wrong vertebral body?
Cause: Use of auto-alignment without evaluation.

Why #4: Why was auto-alignment used without evaluation?
Cause: The time out and image review procedures were not followed.
Why #5: Why were the procedures not followed?
Cause: The therapist was unfamiliar with the procedure.

In this particular example, the root cause indicates that the problem was related to the procedures for image guidance. The reason the procedure was not followed could be that the therapist was unfamiliar with the procedure, that the procedure was never implemented, or even a willful decision not to follow the procedure. Even though there may be multiple procedure issues, the one that specifically led to the event in question was that the images were not reviewed prior to initiating the patient's daily treatment. There may be multiple methods for making sure the physician has reviewed the images prior to treatment. Any of these could have prevented this particular issue, as could a required re-imaging process after a shift of more than a few millimeters. While the trainee may come up with many different problems or causes, for this exercise they need to identify only one.

Causation Table

Once the trainee has completed the 5 Whys exercise, they select the corresponding root cause from the published table of radiotherapy causes. Most of the errors will fall within the categories of operational, technical, or human factors. The operational category includes anything regarding procedures, routine training, and staffing policies. The technical area includes any equipment failure and issues related to machine service or maintenance. Human issues for this purpose relate to human resource issues such as staffing needs or budgetary concerns.

For this specific example, the trainee should select Operational Management (which includes all policy and procedural issues). The dropdown list for Operational Management includes a "Policies, procedures and regulations" tab. After selecting this, the trainee can identify the specific issue with the policies that relates to the root cause, in this case "Policy not followed." The entire table for this causation form is taken directly from Ford et al. (77).

Propose a Quality Improvement

The trainee can now propose patient quality indicators that are specific to the error in question. The proposal requires a list of committee members, an implementation timeline, measurable endpoints, and roadblocks to implementation. For this example, the quality measures that may need to be implemented include a review of all of the policies and procedures to make sure they are up to date and a process for making sure all team members are familiar with all policies. Details should include a timeline to complete the review of the policies (such as 6-8 weeks) and the list of committee members, including team members from physics, therapy, and nursing. The measurable endpoint would be the new updated policies and procedures, along with a proposed method to document that everyone in the department is familiar with the procedures. The plan may also need to include a way to encourage new employees to review the policies and to check that everyone in the department continues to follow them. Possibly a database or a required annual sign-off could provide documentation that all staff are familiar with the procedures.

Summary

The root cause analysis tool, as part of the VROC radiation oncology training system, provides an opportunity to describe an error that has occurred to a virtual patient under the care of the trainee.

This is an excellent opportunity to go through a mock root cause analysis, to re-evaluate policies and procedures, and to identify areas where quality assurance systems need to be implemented to prevent errors. As part of the process, the user is asked to review Florida state reporting guidelines to determine whether the error must be reported. Several specific questions about why the error occurred are used to guide the user to the root cause, and a set of causation tables based on the recommended reporting guidelines is included to help the resident choose a common root cause within radiation oncology. One important aspect of the training exercise is to provide a few leading questions to help the trainee think about quality improvement projects, timing, and implementation strategies. All of these questions and answers can be emailed to the faculty physician to facilitate discussion during the final debriefing between the resident and the instructor.

Table 7-1. VROC sample treatment report

Metric                  | Value
Rx dose (Gy)            | 70
Error: dose metric      | 1
Error: consequence metric | 1
% Change EUD target     | 0.82
% Change OAR max        | 1.78
Structure for max OAR   | Ipsilateral parotid
NTCP_tot change         | 6.42
Conformity change (J)   | 0.02
Detrimental Dose        | 27.22

Table 7-2. Expanded VROC treatment report

Target       | Expert EUD (Gy) | Delivered EUD (Gy) | Expert % Cov | Delivered % Cov
PTV1         | 75.75           | 75.17              | 97.09        | 96.09
PTV2         | 49.63           | 53.29              | 94.53        | 93.92

Normal structure | Expert EUD (Gy) | Delivered EUD (Gy) | Expert NTCP | Delivered NTCP
Brainstem        | 46.31           | 47.07              | 0.00        | 0.00
Right parotid    | 46.31           | 47.07              | 52.73       | 59.13
Left parotid     | 19.73           | 20.97              | 0.00        | 0.00
Right eye        | 1.64            | 2.37               | 0.00        | 0.00
Left eye         | 1.12            | 1.53               | 0.00        | 0.00
Larynx           | 20.61           | 21.67              | 0.00        | 0.00
Oral cavity      | 57.97           | 58.62              | 0.58        | 0.69
Cord             | 34.66           | 34.57              | 0.06        | 0.06

CHAPTER 8
ACCEPTANCE TEST AND INITIAL USE OF VROC

Background

In software design, as well as in simulation, there are various methods of both verification and validation. Verification is the process in which the tool that is created is tested to determine whether it performs as described; it is essentially debugging and testing that the code is functioning. The validation process is done to prove that the application performs a specific test or that it tests a specific construct. There are several different types of validation that can be utilized for various learning tools. These are face validity, content validity, criterion (or concrete) validity, construct validity, convergent validity, and discriminant validity (103).

Face validity asks whether the system appears to adequately model the real world. It is typically evaluated by surveying a group of experts in a subspecialty who have had an opportunity to spend some time with the system, to determine whether they feel the system matches the real world. This type of validity does not test what the system actually measures.

Content validity is concerned with evaluating the overall data and simulations within a system. Typical content validations will survey a group of experts in a subspecialty to see whether they evaluate the system as adequately representing the range of experience that is expected within a simulation (104-106). There are two reported metrics used in calculating content validity from evaluations by a set of experts. These can be calculated from a multi-level (or Likert scale) score in which the users are expected to rate various items within the virtual system based on how essential each item is to the system matching the real world or meeting the educational goals (107, 108).

The Content Validity Ratio (or Content Validity Index, CVI) is calculated from the number of raters who evaluated an item as essential (n_e) relative to the total number of raters (N):

    CVR = (n_e - N/2) / (N/2)                                        (8-1)

It has been noted that this simple calculation does not discriminate between true agreement and the possibility of chance agreement among the raters. Cohen's kappa was therefore also suggested as a means of calculating inter-rater agreement, because in it the probability of agreement due to chance (Pr(e)) is subtracted from the observed agreement (Pr(a)):

    kappa = (Pr(a) - Pr(e)) / (1 - Pr(e))                            (8-2)

The values of both the CVR and the kappa lie between -1 and +1, where a higher positive score corresponds to a higher level of agreement. Using a one-tailed t-test with 0.05 for significance, a minimum CVR value can be calculated for different numbers of raters. For example, for N = 5 the CVR must be at least 0.99, and for N = 10 the CVR must be at least 0.69, in order to demonstrate content validity. For the kappa value, any positive value indicates that the agreement is better than that of chance alone.

Content validity can also be addressed by adequately designing the system after searching the literature to determine appropriate cases. For this project, content validity can be established by design, since the system was built on commercially available software.
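A short script makes Equations 8-1 and 8-2 concrete. The sketch below computes the CVR for a single item from the number of "essential" ratings and a two-rater Cohen's kappa from matched ratings; the example rating lists are illustrative and are not the survey data from this study.

```python
from collections import Counter

def content_validity_ratio(essential_votes, total_raters):
    """Equation 8-1: CVR = (n_e - N/2) / (N/2)."""
    return (essential_votes - total_raters / 2.0) / (total_raters / 2.0)

def cohens_kappa(ratings_a, ratings_b):
    """Equation 8-2: kappa = (Pr(a) - Pr(e)) / (1 - Pr(e)) for two raters."""
    n = len(ratings_a)
    pr_a = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(ratings_a) | set(ratings_b)
    pr_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (pr_a - pr_e) / (1.0 - pr_e)

# Illustrative only: 4 of 5 raters call an item "essential"
print(round(content_validity_ratio(essential_votes=4, total_raters=5), 2))   # 0.60
# Illustrative only: two raters scoring six items as essential (E) or not (N)
print(round(cohens_kappa(list("EENNEE"), list("EENNEN")), 2))
```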

The cases for the virtual patients are taken from actual patients who have received treatment, and the errors are taken from errors reported in the literature. Construct validation would further test that the VROC metrics can be correlated to learning and that the overall system can be used to teach novice users about the effect errors have on the overall patient treatment. A future study would involve multiple users treating several patients within the VROC system. Based on level of experience, and assuming the same cases are used, the overall severity scores for the more experienced users should be near zero and the number of errors they detect should be higher.

The endpoint that is most often studied for differences between two groups using simulation training is the time it takes to perform specific tasks, or how quickly a task is completed without an error (105, 109). The particular endpoint of time to complete any specific procedure in radiation oncology may not be a good indicator of differences between novices and experts. As was indicated in Figure 1-1, the radiation oncology process takes several weeks to complete, and some tasks may routinely take days. While the doctor may have a few days to complete a specific task, it may be done all at one time. What may be a better indicator of the difference between an experienced physician and a novice in radiation oncology is the number of patients that they can successfully manage. For the initial VROC implementation for training, one proposed endpoint would be the number of patients that the trainee can manage without missing errors in the plans or treatments.

Convergent and discriminant validity have more to do with the overall theory behind the test that is used to measure validity. They are used to compare the simulation with reality or with a gold standard, to determine whether the simulation is in fact correlated with what it should be correlated with and not correlated with those ideas with which it should not be correlated. These ideas are typically more useful in the physical sciences, where a study can have a theoretical probability of occurring and the experiment can be validated against the theory.

Methods

Once the VROC was developed, along with user manuals and feedback tools, the system had to be validated. The initial use of the system included an acceptance test to check the functionality of the VROC system. The goal of the acceptance test was to demonstrate that all aspects of the VROC system function as intended. The acceptance test document was created from the development table for the VROC system. In some cases, certain aspects of the system did not work as initially planned; these were noted so that changes could be made in future revisions of the system. For the purposes of the acceptance test, a single user familiar with radiation oncology but unfamiliar with the VROC completed all of the steps of the RO process through the VROC system.

A set of initial virtual patients was created to provide examples to the instructor and to the trainee. A virtual patient request form (described in Chapter 4) was completed for each of these patients. This form, in addition to providing an opportunity for the instructor to write learning objectives, is used to populate the virtual patient database. The entire list of initial virtual patients is included in Appendix F.

Each patient case has two different scenarios that can be selected for training: one with a dosimetric error and one with a spatial error. The instructor could choose to use both errors. Additional errors can be created for any of these datasets by completing the virtual patient request form with the additional information. In that case, each specific scenario should be described to ensure that the simulation is modeled correctly and meets specific learning objectives.

After the acceptance test was completed, a validation test was performed to evaluate the overall realism of the VROC. Five different users were asked to evaluate the VROC by reviewing two different virtual patients through the entire RO process. Each of the primary functions of the simulated RO workflow was evaluated with respect to realism and to the utility of the feedback given. A workflow diagram specific to the trainee within the VROC system was used to help those completing the validation test navigate through the VROC system; Figure 8-1 is the diagram indicating how a trainee would manage a virtual patient through the VROC. The five users were medical physicists with a variety of experience in radiation oncology. After viewing the two cases within the VROC system, they were asked to fill in a survey of their overall impressions of the VROC. They were also asked to make comments about the system, including recommendations for improvements.

Results

Table 8-1 is the completed acceptance test procedure (ATP). The items are marked as either pass or fail, as indicated in the table. The items that were marked as FAIL include populating the physician schedule once a patient is assigned and a detailed log file to record time spent using the VROC.

The current VROC system uses commercially available software that can track user log-in times. Reports for evaluating log-in time are standard and can be run, but for the initial testing and development this was not included as a priority and was left for future work. Also, the scheduling of items to appear on the physician schedule can normally be done within the commercial product, and therefore testing of this application was not completed; it was intentionally left for future testing, when multiple patients will be scheduled for multiple users.

For validation, the five users were asked to rate each of the VROC items based on realism. A summary of the rankings of the different users is included in Table 8-2. The item that received the highest marks for realism was the contouring exercise; the realism comes from the use of commercially available contouring tools. The item with the least realism was the consultation. Comments indicate that this was not realistic because it only included documents to review. This was a known issue within the VROC, because it was designed specifically for the treatment planning and treatment review portions of the clinical process. Future development could focus on the consultation portion of the RO process, but this would require the development of face-to-face virtual reality simulation.

The users also rated the usefulness of different feedback. Feedback from the VROC can be given after contouring, at the time of treatment plan approval, and at the conclusion of the simulation. The feedback that received the highest marks for utility was the report after the initial contour comparison. The metric feedback at the time of treatment plan approval was felt to be the least useful compared to that given at the end of treatment.

Based on the CVR validity statistic, with only five evaluators all five would have to agree in order to prove validity. With so few raters, none of the items were in complete agreement. An additional study with more users would provide additional validity of the overall content of the system. It would also require a variety of raters, including multiple physicians as well as users with all levels of experience.

This initial validation study was also done to gain insight for future developments. Comment sections on the survey included opportunities to report on specific aspects of the system and to list specific recommendations for improvements. Summaries of the comments are listed below (similar comments were omitted).

Liked Best

- Integration within a clinical treatment planning and R&V system.
- The realistic nature of the errors used in the training (actual errors that are experienced in the clinic).
- The RCA tool at the end of the simulation of an error.
- It provides practice for aspects of radiation therapy for which there is currently little or no means of simulating and evaluating one's performance, such as contouring and image review.
- The feedback given is useful to see where you did well and where you still need more improvement.

Liked Least

- The consultation phase is least realistic, because all other phases of the RT process are actually performed in a virtual setting already and the VROC is a very realistic substitute.

- I did not feel the image analysis was as strong as the other components. The tools are there for image review, but the simulated portal images were not as realistic feeling (as, say, the contour comparison component) within the VROC.
- I don't think that there needs to be a director for the VROC to help oversee the simulation from A to Z or to initiate the steps and forms for the trainee/user. I believe the overhead that comes with that may hamper the use of the VROC or limit the number of participants for this potentially very useful and much needed tool.

Other Recommendations

- The addition of other imaging modalities for image review would be interesting (CBCT, kV/kV).
- It would be useful to simulate within the VROC an actual documented error, or to use the VROC to document actual errors in the clinic. The VROC can potentially be used as an error database for any radiation oncology department, one that new residents are required to explore as part of their training curriculum in order to study the clinical significance of these errors. The VROC could also contain all the documents generated for an actual error, including state correspondence (if the state was involved), follow-up actions, etc.
- I think it would be helpful if the image review feedback included the difference between the correct shifts and the shifts the participant made or accepted, so they can go back and look at images they did not align properly or approved when they shouldn't have.
- Also, I think the VROC would be helpful for others, such as physics residents, but more features would be helpful for them, such as comparisons of normal structure contours.

Summary

Initial testing and validation of the VROC system was conducted by completing an acceptance test to show that the different features of the VROC worked as intended. The acceptance test was based on the development chart that was used to develop the VROC. Currently there is no log file that records the time the trainee spends performing different tasks. This is a simple matter of adding different users to the system and running a set of reports in the administration portion of the software, and it was left for future studies.


system and running a set of reports in the administration portion of the software, and this was left for future studies. Because different tasks within the RO process often take several days to complete, the overall time spent using the VROC system may not be a useful indicator of training. No tools were added to the VROC for scheduling and managing multiple virtual patients at once. This is an area that will need to be developed in order to study the response of trainees when managing multiple patients. Because the system includes a full patient management and task scheduling system there is no reason to believe that this will not work, but the details of how to populate the schedules have to be worked out and were not part of the initial development or validation.

One other area of the acceptance test and validation that failed to perform as initially designed was the overall automation of different aspects of the system. All components are functional in the current VROC; however, the feedback reports, metric calculations, and daily treatment simulations will need to be automated in future development in order to scale the VROC system for simultaneous patient simulation.

Based on the five users who initially evaluated the VROC, the most realistic aspect of the VROC was the contouring and pre-planning step, and the least realistic aspect was the consultation. This does not present a problem because this version of the VROC was not intended to fully simulate the consultation. Comments from the initial evaluation of the VROC indicate that one aspect perceived as being useful, or liked the best, was the RCA exercise. This is encouraging for future work, and the RCA exercise could possibly be presented as a stand-alone exercise.


Table 8-1. VROC acceptance test

Label from Fig. 3-2 | Test | Pass/Fail | Comments
A/B | Instructor can select a clinical patient and send the following items to the virtual clinic: images to VROC | P |
A/B | Virtual patient is available in VROC | P |
A/B | Instructor can search through the database and see what data is available | P |
A/B | Instructor can review the types of errors available for a virtual patient | P |
A/B | Instructor searches through the system, selects a virtual patient, and assigns the patient to a resident | P |
1 | Patient is available on the trainee schedule in Time Planner | F | schedules not developed
1 | Log file is started | F | log file scripts not written yet
1 | Consultation report is available in Patient Manager | P |
1 | - | P |
2 | CT scans and normal tissue contours for patient are available in TPS | P |
2 | Add GTV/CTV/PTV | P |
2 | Doses entered into prescription form | P |
2 | Save and export patient | P | ** all needs automation **


Table 8-1. Continued

Label from Fig. 3-2 | Test | Pass/Fail | Comments
C/D | Trainee and instructor receive notification about results of contour comparisons | P | ** needs better automation **
3 | In TPS: multiple plans should be available to review | P | ** needs automation **
3 | Review plans and choose one with an error | P |
E | Feedback about error plan is sent to resident and director | P | needs improvement and automation
3 | Go back into tx planning and unapprove plan | P |
F | Since treatment proceeds in real time, have to wait about 2 days to check tx films/fields | P | * no need to demo
5/F | Log into Offline Review; there should be 2 sets of images to be reviewed | P | ** virtual treatment files need automation **
5/F | Mark 2nd set of images | P |
5/F | Next day log in; tx should be corrected | P | ** no need to ATP
5/F | Patient continued through a course of treatment (10 treatments) | P |
G | Final report is sent to the resident and resident director | P | ** needs automation **
H | Can complete the RCA based on final metrics | P |


Table 8-2. Summary of initial reviewers

Would you recommend the VROC for training? | Yes | Yes | Yes | Yes | Yes

Realism of the VROC (1 = Not Realistic, 5 = Very Realistic) | Individual ratings | Avg
Overall system | 4 5 5 4 5 | 5
Consultation | 4 3 4 4 | 4
Initial image review | 4 5 3 4 5 | 4
Contouring | 5 5 5 4 5 | 5
RX input | 4 5 5 4 5 | 5
Plan evaluation | 5 5 5 4 4 | 5
Treatment evaluation | 5 5 4 4 4 | 4
Specific error situation | 4 5 5 4 4 | 4

Feedback: How helpful was the feedback you received? (1 = Not Helpful, 5 = Very Helpful) | Individual ratings | Avg
Contour comparison | 5 5 5 4 5 | 5
Initial Tx plan (if applicable) | 4 4 4 4 | 4
Final treatment evaluation | 4 5 5 4 4 | 4
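As context for the agreement requirement noted above, Lawshe's content validity ratio (the usual definition of the CVR) is CVR = (n_e - N/2)/(N/2), where n_e of the N raters endorse an item. The short sketch below is only an illustrative reader aid, not part of the original analysis; it shows why a five-member panel must be unanimous to reach the critical value commonly cited for five raters (0.99).

    def content_validity_ratio(n_agree: int, n_raters: int) -> float:
        """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e raters endorse the item."""
        return (n_agree - n_raters / 2) / (n_raters / 2)

    # With N = 5 raters, only unanimous endorsement (5 of 5) reaches the
    # commonly used critical value of 0.99; 4 of 5 falls short.
    for n_agree in (3, 4, 5):
        print(n_agree, content_validity_ratio(n_agree, 5))   # 0.2, 0.6, 1.0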


Figure 8-1. VROC-specific workflow for a single trainee and single virtual patient.


CHAPTER 9
CONCLUSIONS AND FUTURE WORK

The purpose of this project was to design, develop, and validate a Virtual Radiation Oncology Clinic (VROC). The first step included defining the requirements for the system. This included a detailed workflow of radiation oncology as well as summaries of errors that have occurred in radiation oncology. Feedback for training included proposing a set of metrics that could be used. The second step included development of the VROC by collecting hardware and developing software to simulate different aspects of the RO workflow. The third step included initial testing of the developed VROC code as well as validating the overall system for realism. The requirements that were initially defined are listed below, along with a brief description of how each requirement was met.

Review of VROC System

There were eight different requirements for the VROC that were listed in Chapter 2. A description of each requirement, along with the work done to date to fulfill it, is included.

1. Realistic
2. Specific for physician residents
3. Controlled environment
4. Easily incorporated into the curriculum
5. Simulates a variety of radiation techniques and errors
6. Learning objectives and goals are clear for each exercise
7. VROC can give immediate feedback on performance
8. End-of-simulation feedback and learning assessment

Realistic: Realism was created by replicating each of the different aspects of the radiation oncology workflow using commercially available software from Varian Medical Systems. Tests were performed to evaluate the realism of the simulated treatment verification images


compared to real verification images. The agreement between simulated and real images was within clinical setup deviations of about 3 mm. Several users evaluated the system and scored its different aspects based on realism. Overall, most aspects were rated as Very Realistic.

Specific for physician: The overall workflow for the VROC was designed based on workflow diagrams that were focused on the physician roles. The VROC was designed to simulate all of the work of the other team members and allow the physician trainee to carry out their normal responsibilities. Future work will be to generate interest from the radiation oncology residency community in order to identify early adopters to further test the VROC and its utility for radiation oncologists. Additional work on the VROC is required to scale the overall capacity so that each trainee can manage multiple patients at the same time, to realistically simulate the demands on the physician in a busy clinic.

Controlled environment: The entire VROC is an isolated database with only virtual patient data included. It does not interface directly with any clinical treatment or R&V systems. The VROC could be made to interface automatically with clinical software; if that were done, it would be recommended to keep a separate database exclusively for the VROC. Any virtual patients that are created are stripped of all identifying information, and all DICOM information is changed so that there is no risk of patient data being incorrectly transferred from the VROC to the clinical systems.

Easily incorporated into curriculum: Because each trainee can use the workflow without direct supervision from the instructor, the VROC would be easy to


incorporate into the already existing curriculum of a radiation oncology facility. Also, a set of user manuals for all users has been created to further facilitate the use of the VROC in a training facility. A desired future study is to implement the VROC in a residency program for a trial period.

Simulates a variety of techniques and errors: Tests were performed using the VROC to simulate a variety of different errors. A summary of errors was presented in Chapter 3. A selection of patient setup errors was recreated in the VROC to prove that treatment verification images could be simulated correctly; this was shown in Chapter 5. Also, a variety of different errors were simulated in order to calculate the overall metrics for a plan implemented with errors, to illustrate how the different feedback would change when errors are introduced. This was included in Chapter 6.

Learning objectives and goals are clear for each case: It is the responsibility of the resident director to specify learning objectives as they relate to the curriculum. While it is hoped that future studies using the VROC will be conducted, the goal of developing the tool was to ensure that it was flexible enough to accommodate any learning objectives specified by the instructor. For this reason, a simulation request form was created to involve the instructor in the development of the specific scenarios for training. A user manual was created along with the VROC to give the instructor guidance on how to develop these scenarios.

Feedback on performance: The different opportunities for feedback within the natural radiation oncology workflow were identified in Chapter 3. Contour and planning feedback metrics were investigated to propose a set of metrics that could be related to outcomes. A study was done on several simulated errors to evaluate the metrics and


determine a final suggested report for feedback comparing treatment plans or treatments with errors. The summary of this work was included in Chapter 6.

End-of-simulation opportunity to test learning objectives: Overall simulation feedback includes a debriefing to discuss the overall exercise, its realism, and what the trainee observed and learned. This overall feedback includes an exercise in Root Cause Analysis to provide training to the user about common root causes in radiation oncology. An online web form was developed to walk the resident through this exercise; it can be emailed directly to the resident director for review. To assess the utility of the overall simulation experience, a survey form has been created for those who either receive a demonstration of the VROC or use the VROC, in order to give feedback on overall impressions as well as the utility of the system.

Future Studies

The future for simulation is promising. Specifically for radiation oncology, an added benefit of simulation training could be cost savings, by adding opportunities to practice patient treatment planning and patient image review without added time on the expensive treatment machines or added time from supervising physicians or the resident director. Also, the VROC does not affect patients receiving treatment and is a completely separate database to ensure patient safety, while providing opportunities to illustrate the ramifications of errors in radiation oncology.

Simulation Training for Residents

Simulation has been widely used in the routine procedures of medical training but has yet to be embraced at the level of subspecialties. This could be due in part to the


cost of starting up a simulation program as well as the difficulty, in some practices, of providing a realistic simulation. Because most of the technical procedures that the radiation oncologist performs are managed through the use of digital images and through computer programs, building a simulator specific to the radiation oncologist was accomplished by creating an isolated EMR, TPS, and R&V system. Two significant contributions of this work were the software to simulate errors in the patient setup, through the use of simulated portal images, kV/kV images, and CBCT images, and the feedback metrics related to errors. Future development for the VROC includes automation of the code that generates these images as well as automation of the feedback reports. With full automation, the system would require almost no support from outside users and could be left in a training facility to allow novice users the opportunity to manage more patients than they currently manage.

One future study that should be pursued is the implementation of the VROC in a training program. Ideally, each user would manage many different virtual patients at one time. The same patient cases could be used for both novice and expert users, and the differences in the feedback metric reports for the different users would indicate whether there is a difference between novice and expert users. In order to implement a study of this nature, additional funding and support are needed. A preliminary report was sent to Varian Medical Systems regarding the status of the in-kind loan used to build the VROC, but a final presentation will determine future opportunities for additional resources to conduct a full study in a training facility. Additional sources of funding may also be available through the radiation oncology community or through the medical simulation community. Manuscripts on the initial validation of the VROC are being


submitted to both communities to gain insight into future options for resident training with the VROC.

Metrics & Feedback

Another result that came from this work relates to the feedback metrics and to the error scoring. The feedback metrics that were evaluated for the VROC include a number of plan-specific parameters as well as scores related to the severity of the error based on error reporting recommendations. Ideally, feedback would be directly related to the severity of side effects from the error. The severity scoring system used in this study was taken from recommendations for a radiation oncology specific national error reporting system (77). This work found that, with only three raters, the error scoring system appears to have a level of subjectivity that was not expected. Further investigation into this scoring system did not find any published articles in which this particular error scoring system has been implemented. Personal communication (110) confirmed that this is the error reporting system that will be implemented within the Radiation Oncology Incident Learning System (RO-ILS) being developed by ASTRO (56).

Terezakis et al. (111) conducted a recent multi-institutional study in anticipation of the national reporting system. She combined near-miss and error data from two large academic radiation oncology centers to conclude that there is utility in a national incident learning system, because many of the errors and near misses were related to human errors. One criticism of this work by Terezakis is that, while she makes recommendations for the errors that should be reported to the new RO-ILS based on error severity, she did not score the errors from


these two institutions based on the newly recommended scoring system. For that published work, the validated French Nuclear Safety Authority (ASN) adverse event scoring system was used. The ASN system uses a five-point error scoring system that duplicates the side effect severity grades found in the Common Terminology Criteria for Adverse Events (published by the National Institutes of Health (NIH) and the National Cancer Institute (NCI)). The entire listing of adverse events has been uploaded to the UF Digital Archive for reference, and can also be downloaded from: CTCAE v4.0 (Object 9-1). The primary difference between the ASN system and the ten-point scoring system is the separation between temporary and permanent side effects. Traditional scoring systems only evaluate the severity of the error regardless of whether it is permanent or temporary. The recommendations in Ford et al. (77) specify different scores for a Grade 1 permanent side effect and a Grade 1 temporary side effect, whereas within the ASN system both errors would be given the same score.

Object 9-1. CTCAE v4.0 as PDF document (.pdf, 512 KB)

A study to improve the correlation of the VROC metrics to expected side effects is necessary in order to improve the utility of the VROC feedback reports. A proposed future study includes extending the rating of the severity metrics of the sixty error plans to several more experts in radiation oncology. In addition to the scoring system that was used for the initial VROC study, the CTCAE grades for adverse effects and the ASN scoring system will additionally be used. This may prove useful in providing metric feedback from the VROC that is correlated with adverse events from errors. Additionally,


through this study it may be possible to validate the ten-point error scoring system or to make recommendations to ASTRO on the details needed to improve the consistency of the scoring system.

One other issue with the feedback reports was related to the metric calculations for some of the palliative disease sites. As was pointed out in the metric analysis, some of the cases did not have an adequate number of structures defined near the targeted area. In those cases, instead of indicating the severity of the error, the lack of data could inappropriately appear as a low risk of consequences within both the NTCP and the DD calculations. For the cases in question, contours should be added to represent all of the organs in the vicinity of the tumor volume. This could improve the correlation of the NTCP values. The DD dose calculations should also be normalized to the number of structures so that the dose calculation is not biased by the number of structures.

Root Cause Analysis

Another future study involves the RCA exercise. Several of the initial evaluators ranked this as a useful training tool. While it was built specifically for the VROC system, it may have utility for the radiation oncology community as a whole. There are many small clinics that may not have the benefit of a risk analysis team and could benefit from a simple methodology for evaluating a clinical error. Also, while national reporting systems are useful for the community at large, it will be years before the data collected would generate recommendations on how to prevent specific errors or near misses, and there is little information on whether the RO-ILS will provide tools for conducting a root cause analysis.


Because the VROC RCA exercise is a web-based form, it would be easy to conduct a survey of several different users and institutions to gain feedback on the overall perceived utility of the RCA exercise. Specific feedback about the utility of the "Five Whys" questions and the recommendations from Ford et al. (77) could be used to make modifications to this exercise before publishing it electronically for open use.

Another possible future study is based upon one of the comments from the validation testing. The VROC may be especially useful for predicting severity scores for near misses. Terezakis et al. (111) indicated that most incidents included near misses where the errors were caught before they occurred, and she noted difficulty in predicting the severity of the adverse events in these situations. If the VROC system were used for error reporting, the error would be simulated as though it had been carried out on the patient, and the feedback metrics could be used to predict the severity score. Using the VROC as an incident reporting system eliminates redundancy with the incident system and also provides real, clinical examples. Also, using the VROC as the incident reporting system would provide additional resources and buy-in from both the clinical teams and the residency training programs, where time and manpower are an issue.

Within routine clinical work, the errors and near misses for the department are reviewed on a routine basis as part of the departmental quality assurance meetings. The errors and near misses could be reviewed directly from within the VROC, where the RCA form could be used to help conduct the root cause analysis. This could also be used for a resident conference to provide quality assurance summaries based on real clinical cases the resident is already involved in. To further


automate this system, a group of standardized patients could possibly be used (112, 113) in much the same way as is done for health physics. While this may not be as accurate for patient-specific anatomy, it would provide automation and time savings because the patient data would not have to be uploaded into the VROC systems.

Final Comments

A virtual radiation oncology clinic (VROC) has been created using commercially available software along with developed tools to simulate the radiation treatment machine and to generate feedback. Initial testing of the VROC shows that the different aspects of the system are functional and that errors can be accurately simulated. Feedback can be calculated to compare contours between an expert and a novice. Error report feedback that predicts a possible severity of the error can be calculated for any error that occurs in the system. An RCA exercise was created to supplement learning about error mitigation using the VROC. Initial validation indicates that the system is realistic. Several studies are proposed for future use of the VROC and the RCA exercise.


APPENDIX A
DETAILED DESIGN TABLE

Table A-1. VROC workflow development chart in table format

Step | Workflow | Workflow task | Software | Description of features/functions
A/B | List of virtual patients | Create database | VROC | Searchable database (by name, resident, diagnosis, error)
| | Generate virtual patient | VROC | Instructions for importing/exporting patient from clinical environment
| | Add virtual patient to database | VROC | DICOM DAEMON to import patients into database
| | Multiple treatment plans | VROC/TPS | Treatment plan and scripts to modify treatment plans
| | Metrics for the Tx plans | VROC/TPS | Export plan from TPS to MATLAB; develop software in MATLAB to calculate metrics
| | Portal films for Tx review | VROC/TPS | Scripts to create DRRs within TPS and store in database for Tx review (VROC/TPS and stored DRRs)
A | Assign patient | CT and patient demographics to virtual clinic | Patient Manager | Method to add demographics to a patient in Aria
| | Schedule patient for consult | Time Planner | Method (instructions) to assign a schedule
| | Log file for resident time in VROC | VROC/Report Manager | Log file to track the time the resident is actively using the system
1 | Consultation report | Mark consult as reviewed | Patient Manager | Tracked review status back to the database
1 | | Determine if radiation simulation is assigned to resident | VROC/Time Planner | Script or .bat program to assign to Time Planner
2 | Planning prep | CT scan available | VROC/TPS/RT Chart | DICOM software to send the CT data to TPS
| | Patient scheduled for planning | VROC/Aria/Time Planner | Batch file or script to add patient to resident schedule
| | Contours are loaded | VROC/TPS | Batch file or DICOM to upload the RT_STRUCTURE file


Table A-1. Continued

Step | Workflow | Workflow task | Software | Description of features/functions
2 | Review CT | Open patient in TPS | Eclipse | Instructions on how to open a patient file and structure file in TPS
2 | Add contours | Add target structures | Eclipse | Instructions on how to add GTV and PTV in TPS
2 | Add planning intent | Prescription intent | RT Chart | Instructions to resident on how to add an intent or Rx
| | Add a beam and Rx | Eclipse | Instructions on how to add a beam and Rx
| | Save and export | VROC/Eclipse | Instructions to export patient for contour review
C/D | Review of contours | Contours to compare | VROC | Bat file to recognize new file in database
| | Contours are compared | VROC/MATLAB | Software to compare contours
| | Results of contour comparisons | VROC/MATLAB | Differences between contours sent via email to resident and director
| | Patient purged from Eclipse | VROC/Eclipse | A bat file to clear out all planning and contouring
3 | | Import other 4 treatment plans | VROC/Eclipse | Import other plans to review over the top of the other files
3/A | Planning options | Upload 4 plans for review | VROC/Eclipse | Instructions on how to compare plans
3 | Chooses a plan | Resident reviews and approves plan | Eclipse | Instructions on how to approve a plan
| | Patient exported | Eclipse/VROC | Instructions on how to export patient for plan comparisons
| | Calculate metrics for various plans | VROC | Program to calculate metrics (NTCP, TCP, EUD, etc.)
| | Metric tracking along with plan approvals | VROC | Metrics are stored within the database


Table A-1. Continued

Step | Workflow | Workflow task | Software | Description of features/functions
E | Attending review | Report of metrics of plans | VROC | Metrics are emailed to resident and director
| | Clean TPS | Eclipse/VROC | Batch file or script to clear old patient files and prepare Eclipse for treatment
4/F | Start treatment | Approved plan setup for treatment | VROC/Eclipse/RT Chart | Plan setup in RT Chart for treatment
| | Create treatment records | VROC/RT Chart | Software to create the DICOM treatment record
| | Create RT images | VROC/Eclipse | Creates DICOM verification images in TPS and stores them until needed
| | Send tx records and images | VROC/RT Chart | DICOM DAEMON and batch files to send the Tx record and images to RT Chart
| | Films review | Offline Review | Instructions for resident: how to check films in Offline Review
| | Films marked as reviewed (or with changes) | Offline Review |
4/F | Resident reviews Tx films | Evaluate film status | VROC | Program that will recognize notes on films
| | Update treatment based on film status | VROC (see 5a) | Program to change treatment based on notes on films
| | Overall metric of treatment | Eclipse/VROC | Program to update metrics calculated to relate to errors in treatment
G/H | Review of treatment | Root cause analysis | VROC | Web-based program for a root cause analysis
| | Final report for resident | VROC | Program that will compile all of the metrics, log files, and the root cause analysis into a final report
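The contour-comparison rows above were implemented as MATLAB software within the VROC and are not reproduced in this document. Purely as an illustrative sketch (written in Python, with hypothetical mask arrays), the following shows the kind of overlap statistics, such as the Jaccard and Dice indices, that a comparison between an expert structure and a trainee structure can report.

    import numpy as np

    def overlap_metrics(expert_mask: np.ndarray, trainee_mask: np.ndarray) -> dict:
        """Compare two binary contour masks (same voxel grid) with common overlap indices."""
        expert = expert_mask.astype(bool)
        trainee = trainee_mask.astype(bool)
        intersection = np.logical_and(expert, trainee).sum()
        union = np.logical_or(expert, trainee).sum()
        total = expert.sum() + trainee.sum()
        return {
            "jaccard": intersection / union if union else 1.0,   # intersection over union
            "dice": 2 * intersection / total if total else 1.0,
            "expert_volume_vox": int(expert.sum()),
            "trainee_volume_vox": int(trainee.sum()),
        }

    # Hypothetical example: two 3-D masks on a common voxel grid, where the
    # trainee's contour misses the inferior-most slices.
    rng = np.random.default_rng(0)
    expert = rng.random((40, 40, 20)) > 0.7
    trainee = expert.copy()
    trainee[..., :2] = False
    print(overlap_metrics(expert, trainee))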


APPENDIX B
FILM ANALYSIS INFORMATION

Included within Appendix B are the alignment details. Table B-1 includes all of the shifts that were tested for the kV/kV alignment process. The initial image position is where the phantom was placed on the treatment couch, and the registration position is where the patient was at the end of the localization process. This indicates whether the patients were shifted back to isocenter or remained off when the patient completed treatment. The shift values were used to compare the final isocenter location, and the differences for each were used to determine how accurately the registered real images aligned with the expected positions.

Tables B-2 through B-4 include the details of the kV image registration. Table B-2 gives the detailed shifts for each fiducial, Table B-3 gives the isocenter locations for the real films, and Table B-4 gives the isocenter locations for the simulated films. The detailed isocenter locations for the CBCT data for the real and simulated datasets are included in Table B-5 and Table B-6.
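As a minimal illustration of the type of comparison summarized in the tables that follow, the sketch below (Python, with made-up offsets rather than the measured data) computes per-axis residuals between registered isocenter positions and the known isocenter over repeated image sets and reports the mean and spread per axis.

    import numpy as np

    # Hypothetical registered-isocenter positions (mm) from repeated image sets,
    # expressed relative to the known isocenter (so the ideal residual is 0, 0, 0).
    residuals_mm = np.array([
        [ 0.4, -0.2,  0.5],   # lateral, AP, S/I for image set 1
        [-0.1,  0.3, -0.4],
        [ 0.2, -0.5,  0.6],
        [ 0.3,  0.1, -0.2],
    ])

    mean_residual = residuals_mm.mean(axis=0)
    sd_residual = residuals_mm.std(axis=0, ddof=1)
    for axis, m, s in zip(("Lat", "AP", "S/I"), mean_residual, sd_residual):
        print(f"{axis}: {m:+.2f} mm (SD {s:.2f} mm)")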


Table B-1. Image and registration offsets in kV/kV image testing

Planned offset of points to center of image, X = Lateral, Y = AP, Z = S/I (mm); each cell lists the initial offset / registration offset.

Condition | Set 1 | Set 2 | Set 3
Initial image position (ON); Registration position (ON) | 0, 0, 0 / 0, 0, 0 | 0, 0, 0 / 0, 0, 0 | 0, 0, 0 / 0, 0, 0
Initial image position (ON); Registration position (OFF) | 0, 0, 0 / 20, 10, 30 | 0, 0, 0 / 30, 20, 10 | 0, 0, 0 / 5.4, 4.4, 32
Initial image position (OFF); Registration position (ON) | 20, 10, 30 / 0, 0, 0 | 30, 20, 10 / 0, 0, 0 | 5.4, 4.4, 32 / 0, 0, 0
Initial image position (OFF); Registration position (ON) | 20, 10, 30 / 20, 10, 30 | 30, 20, 10 / 30, 20, 10 | 5.4, 4.4, 32 / 5.4, 4.4, 32


Table B-2. Difference for each data point between actual and simulated films

Difference (actual vs. simulated) | AP (film 1), mm | S/I (film 1), mm | Lat (film 2), mm | S/I (film 2), mm
Fiducial 1 | 1.64 ± 2.21 | 0.50 ± 3.45 | 0.90 ± 0.91 | 0.05 ± 2.70
Fiducial 2 | 2.39 ± 2.46 | 0.77 ± 3.09 | 1.40 ± 2.88 | 0.33 ± 4.59
Fiducial 3 | 1.89 ± 2.57 | 0.35 ± 3.04 | 0.75 ± 0.90 | 1.02 ± 4.52
Anatomical Pt 1 | 0.14 ± 0.81 | 1.59 ± 3.12 | 0.69 ± 1.36 | 0.65 ± 1.13
Anatomical Pt 2 | 0.45 ± 1.66 | 0.07 ± 1.91 | 0.23 ± 1.57 | 1.25 ± 3.16

Table B-3. Alignment compared to known isocenter (actual films)

Actual image | AP (film 1), mm | S/I (film 1), mm | Lat (film 2), mm | S/I (film 2), mm
Acquisition isocenter | 0.33 ± 0.23 | 0.62 ± 0.47 | 0.01 ± 0.53 | 0.43 ± 0.27
Registration isocenter | 0.33 ± 0.23 | 0.73 ± 0.30 | 0.09 ± 0.44 | 0.45 ± 0.25

Table B-4. Alignment compared to known isocenter (simulated films)

Simulated image | AP (film 1), mm | S/I (film 1), mm | Lat (film 2), mm | S/I (film 2), mm
Acquisition isocenter | 0.16 ± 0.45 | 0.43 ± 0.39 | 0.04 ± 0.34 | 0.38 ± 0.34
Registration isocenter | 0.29 ± 0.32 | 0.52 ± 0.33 | 0.06 ± 0.15 | 0.35 ± 0.32


Table B-5. Alignment compared to known isocenter (actual CBCT)

Actual CT scans | X = Lat (mm) | Y = A/P (mm) | Z = S/I (mm)
Acquisition isocenter | 0.22 ± 0.82 | 0.21 ± 1.48 | 0.79 ± 1.93
Registration isocenter | 0.29 ± 0.35 | 0.21 ± 0.31 | 0.18 ± 0.59

Table B-6. Alignment compared to known isocenter (simulated CBCT)

Simulated CT scans | X = Lat (mm) | Y = A/P (mm) | Z = S/I (mm)
Acquisition isocenter | 1.08 ± 1.04 | 0.38 ± 0.95 | 0.42 ± 0.79
Registration isocenter | 1.08 ± 0.93 | 1.41 ± 1.05 | 1.39 ± 1.98


APPENDIX C
METRIC CALCULATION INFORMATION

Table C-1 and Table C-2 include the values that were used in the calculations for EUD, NTCP, and TCP. The response of tumor tissue is different than that of normal tissue, so the values are divided into two parts: Table C-1 lists the target tissue values and Table C-2 lists the normal tissue values. The values are taken from sources including Gay et al. (69), Niemierko et al. (62), and Emami (114). The parameter a is used in the EUD calculation and relates to the overall tissue response; it is found empirically and is less than zero for target tissues and greater than zero for normal tissues. The parameter γ50 represents the slope of the dose-response curve and is modeled for each individual structure. TD50 relates to the tolerance dose for a 50% complication rate for the specific structure of interest. The value α/β characterizes whether a tissue is late responding (low values) or early responding (high values); it is used in the initial EUD calculation to scale the dose to the equivalent dose given at 1.8 Gy per fraction.
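For reference, these parameters enter the standard EUD-based NTCP and TCP formulation of Niemierko as implemented in the program of Gay and Niemierko (69); the expressions below restate that published formulation, with the 1.8 Gy reference fraction size noted above, purely as a reader aid.

\[
\mathrm{EUD} = \left( \sum_i v_i \, \mathrm{EQD}_i^{\,a} \right)^{1/a},
\qquad
\mathrm{EQD}_i = D_i \, \frac{\alpha/\beta + d_i}{\alpha/\beta + 1.8~\mathrm{Gy}},
\]
\[
\mathrm{NTCP} = \frac{1}{1 + \left( \mathrm{TD}_{50} / \mathrm{EUD} \right)^{4\gamma_{50}}},
\qquad
\mathrm{TCP} = \frac{1}{1 + \left( \mathrm{TCD}_{50} / \mathrm{EUD} \right)^{4\gamma_{50}}},
\]

where \(v_i\) is the fractional volume receiving total dose \(D_i\) delivered in fractions of size \(d_i\), \(a\) is the tissue-specific EUD parameter, and \(\mathrm{TD}_{50}\) (written \(\mathrm{TCD}_{50}\) for targets) is the dose giving a 50% complication, or tumor control, probability.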


Table C-1. Values used in the EUD and TCP calculations for targets

Target | a | γ50 | TD50 (Gy) | α/β (Gy)
PTV (general) | -12 | 2 | 60 | 10
PTV Lung | -10 | 2 | 55 | 10
PTV Prostate | -7 | 2 | 71 | 3
PTV Breast | -7.2 | 2 | 60 | 3
PTV Brain | -10 | 2 | 60 | 10
PTV Palliative | -13 | 2 | 45 | 2.1
PTV Pancreas/abdomen | -12 | 2 | 60 | 10
PTV Bone palliative | -13 | 2 | 45 | 3
PTV H/N Primary | -10 | 2 | 70 | 10
PTV H/N Secondary | -12 | 2 | 60 | 10
PTV H/N Nodes | -10 | 3 | 55 | 10
PTV H/N | -10 | 3 | 55 | 10

Adapted from Gay HA, Niemierko A. A free program for calculating EUD-based NTCP and TCP in external beam radiotherapy. Phys Med 2007;23:115-125 (Table 1, p. 118), and Emami B. Tolerance of normal tissue to therapeutic radiation. Reports of Radiotherapy and Oncology 2013;1(1), Tables 2-3, pp. 39-41.


Table C-2. Values used in the EUD and NTCP calculations for normal tissues

Normal tissue | a | γ50 | TD50 (Gy) | α/β (Gy)
Bladder | 6 | 4 | 80 | 4
Body | 6 | 4 | 55 | 6
Bowel | 6 | 4 | 65 | 8
Brainstem | 7 | 3 | 65 | 2.1
Breast (non-target) | 0.8 | 2 | 60 | 3
Chiasm | 25 | 3 | 65 | 3
Cord | 30 | 4 | 70 | 3
Cord | 30 | 4 | 45 | 3
Esophagus | 19 | 4 | 68 | 3
Eye | 15 | 2 | 65 | 3
Femur | 10 | 4 | 65 | 0.8
Heart | 3 | 3 | 50 | 2
Kidney | 1.3 | 3 | 28 | 3
Larynx | 5 | 4 | 80 | 3.5
Lens | 3 | 1 | 18 | 2
Liver | 3 | 3 | 40 | 1.5
Lung | 1.2 | 4 | 24.5 | 4
Nerves | 25 | 3 | 65 | 3
Oral cavity | 5 | 4 | 80 | 4.4
Parotid | 1 | 4 | 46 | 3
Penile bulb | 25 | 4 | 50 | 3
Rectum | 6 | 4 | 80 | 3.9
Skin | 6 | 4 | 78 | 2
Stomach | 6 | 4 | 65 | 8
Trachea | 5 | 4 | 65 | 4.4
Vein | 3 | 4 | 50 | 2

Adapted from Gay HA, Niemierko A. A free program for calculating EUD-based NTCP and TCP in external beam radiotherapy. Phys Med 2007;23:115-125 (Table 2, p. 118), and Emami B. Tolerance of normal tissue to therapeutic radiation. Reports of Radiotherapy and Oncology 2013;1(1), Tables 2-3, pp. 39-41.


Detrimental Dose: The calculation of the detrimental dose (DD) requires the following data:

DE = dose error to the target
D = delivered dose to tissues
VI = tissue volume irradiated
V = volume of tissue
TS = tissue sensitivity
SI = patient severity index

These are based on the proposal of Carlone et al. (78). Because the EUD was used for the calculations within the VROC, and this value gives the dose that would have the same effect if the entire organ were irradiated, there was no need for the volumetric values within the detrimental dose calculations. Also, in order to prevent plans with very high treatment doses (in which the surrounding normal tissues already receive high doses) from being scaled too highly, the change in the dose to the normal tissues was used rather than the actual delivered dose to tissues. In this way the DD that is calculated gives an indication of the severity of the error. Recommended values for TS and SI are shown in Table C-3; they are taken from the poster presentation on DD and are similar to health physics weighting values for different tissues and to other scaling values for severity.


Table C-3. Recommended TS and SI values for calculating DD

Tissue or effect | Value
Tissue sensitivity (TS)
  CNS | 10
  Cardiopulmonary | 8
  Abdominal organs | 5
  Muscle/bone | 3
  Skin/cartilage | 1
Effect on patient (SI)
  Death | 10
  Failure to cure | 9
  Paralysis and loss of function | 7
  Hospitalization | 5
  Cosmetic | 2
  None | 0
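The exact way these quantities are combined follows Carlone et al. (78) as adapted above and is defined in the VROC code rather than restated in this appendix. Purely as an illustrative sketch of the adaptation described above (EUD-based dose changes in place of delivered dose and irradiated volume), a score of the following general form could be computed; the summation over tissues, the function name, and the example values below are assumptions for illustration only and may differ from the actual VROC implementation.

    # Illustrative sketch only: one possible reading of the adapted detrimental-dose
    # score (change in normal-tissue EUD weighted by tissue sensitivity TS and the
    # patient severity index SI). All names and numbers here are hypothetical.
    def detrimental_dose_score(tissue_changes, severity_index):
        """Sum of |delta EUD| x TS over normal tissues, scaled by the severity index SI."""
        return severity_index * sum(abs(delta_eud) * ts for delta_eud, ts in tissue_changes)

    # (delta EUD in Gy between the erroneous and intended treatment; TS from Table C-3)
    example_tissues = [
        (2.5, 10),   # CNS-type structure, 2.5 Gy EUD increase
        (1.0, 5),    # abdominal organ, 1.0 Gy EUD increase
    ]
    print(detrimental_dose_score(example_tissues, severity_index=5))  # hospitalization-level SI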


APPENDIX D
HEAD AND NECK METRIC CALCULATIONS

Details for the head and neck data analysis are included in Appendix D. The details of the different plans that were created are described below, followed by a summary of the different data that were calculated. Table D-1 is a summary of the averages and standard deviations of the consequence severity metrics that were given to the different head and neck plans by the experts; it also includes the dose severity metrics. This table illustrates the large standard deviations that were noticed in the scoring metrics. Table D-2 lists all of the target metrics for the head and neck cases for all errors. Table D-3 lists the detailed parotid gland values for all cases. Table D-4 lists all of the different dose metrics for the spinal cord. The errors that were simulated are described below.

1. Spatial error: The patient was aligned to their initial simulation marks without the proper shift to the planned treatment isocenter (about 3 cm in each direction). The shift was not noticed until one week later when portal films were re-taken (5 treatments).

2. Spatial error: The same error as described in #1, but it is caught and corrected after a single treatment.

3. Spatial error: The patient is aligned with CBCT in a daily imaging regimen, but the auto-alignment caused a fusion to the wrong vertebral body in the neck that was not caught by evaluation; this occurs for 2 of the 35 treatments.

4. Spatial error: The same error as #3 above, but shifted in the opposite direction for 2 fractions.

5. Dosimetric error: A system interruption due to network connectivity means data was not recorded; the therapists inadvertently treat one beam a second time in the confusion.

6. Dosimetric error: Due to confusion, the therapists change the gantry angle for 2 beams for 5 treatments.

7. Dosimetric error: The physician writes the prescription for 1.8 Gy for 35 fractions when they intended 2.0 Gy for 35 fractions. The plan is created to match the prescription, and the discrepancy is not detected. The patient is underdosed by 7 Gy.


8. Dosimetric error: The planning system accidentally calculated the plan for 70 Gy in 30 fractions instead of 35 fractions. The prescription in the record and verify system is correct for 70 Gy in 35 fractions, and no one checking the chart realizes the plan is for only 30 fractions. By treating to a total of 35 fractions, the patient would receive an overdose of 11.66 Gy.

9. Spatial/dosimetric error: In this scenario the physician contours the high-dose volume incorrectly. The high neck nodes and low neck nodes are not changed and the dose to them is the same.

10. Spatial/geometric error: The single anterior supraclavicular field was prescribed 25 fractions while the remaining fields were prescribed 35 fractions. For this simulation, the supraclavicular field was not stopped after 25 fractions and was allowed to continue for 35 fractions.

In the case of Patient 1, the parotid received a much higher dose than it did for the other 2 patients, for both the clinical plan and the error plans. As is typical with radiation plans, the closer a normal structure is to a target, the sharper the dose gradient towards that structure, making the plan much more sensitive to errors in that direction. In these situations, where there is a high dose gradient, small changes in patient setup will introduce much larger changes in the OAR dose than will be observed for plans where there is not a gradient, such as is the case with Patient 2 and Patient 3. For the ipsilateral parotid (the parotid on the same side of the neck as the tumor volume), the NTCP value was calculated as 48.8% based on the approved treatment plan, which would have been an acceptable risk for this particular situation. However, any dose or geometric error that shifted dose into this already compromised parotid gland had a dramatic effect on the NTCP: the NTCP changed by more than 5% risk in at least 5 of the scenarios, the most dramatic of which was error #8, in which the parotid has an NTCP value of 97%. For the spinal cord, the NTCP values were minimal for all situations except error scenario #10, in which the spinal cord would have received a potential dose increase


above the threshold of 45 Gy, and the NTCP value jumped to 30.6% (Patient 1). Patients 2 and 3 also showed an increase in the risk to the cord from error #10; however, it was not as dramatic because, even though the error scenario is described as the same type, each plan is slightly different, and therefore the consequences of the supraclavicular field being treated 10 extra fractions are not exactly the same.
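As a concrete check of the parotid behavior described above, the short sketch below reproduces the Patient 1 parotid NTCP values tabulated in Table D-3 from the tabulated EUDs and the parotid parameters of Table C-2 (TD50 = 46 Gy, γ50 = 4), using the Niemierko logistic form given in Appendix C; it is a reader aid rather than the original MATLAB code.

    def ntcp(eud_gy: float, td50_gy: float, gamma50: float) -> float:
        """Niemierko logistic NTCP: 1 / (1 + (TD50 / EUD)^(4*gamma50))."""
        return 1.0 / (1.0 + (td50_gy / eud_gy) ** (4.0 * gamma50))

    # Parotid parameters from Table C-2; EUD values from Table D-3 (Patient 1).
    TD50, GAMMA50 = 46.0, 4.0
    for label, eud in [("standard plan", 46.31), ("error #8 (overdose)", 57.09)]:
        print(f"{label}: NTCP = {100 * ntcp(eud, TD50, GAMMA50):.1f}%")
    # standard plan: NTCP = 52.7%   error #8 (overdose): NTCP = 96.9%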


Table D-1. Head and neck error scores

Error # | Patient 1 dose score | Patient 1 severity score (avg ± SD) | Patient 2 dose score | Patient 2 severity score (avg ± SD) | Patient 3 dose score | Patient 3 severity score (avg ± SD)
1 | 5 | 6 ± 3 | 3 | 5 ± 5 | 5 | 5 ± 5
2 | 1 | 2 ± 3 | 1 | 2 ± 3 | 1 | 2 ± 3
3 | 1 | 0 ± 1 | 1 | 0 ± 0 | 1 | 0 ± 0
4 | 1 | 0 ± 1 | 1 | 0 ± 0 | 1 | 0 ± 0
5 | 1 | 0 ± 1 | 1 | 0 ± 0 | 1 | 0 ± 0
6 | 5 | 0 ± 1 | 1 | 0 ± 0 | 1 | 0 ± 0
7 | 5 | 8 ± 0 | 5 | 8 ± 0 | 3 | 8 ± 0
8 | 5 | 6 ± 2 | 5 | 5 ± 4 | 5 | 5 ± 4
9 | 7 | 8 ± 0 | 7 | 8 ± 0 | 5 | 8 ± 0
10 | 7 | 6 ± 1 | 5 | 5 ± 3 | 5 | 6 ± 2


Table D-2. Target values for head and neck cases

Case # | Patient 1 EUD (Gy) | Patient 2 EUD (Gy) | Patient 3 EUD (Gy) | Patient 1 % Cov | Patient 2 % Cov | Patient 3 % Cov | Patient 1 J | Patient 2 J | Patient 3 J
Standard | 75.75 | 73.16 | 74.18 | 97.09 | 88.45 | 95.48 | 0.74 | 0.80 | 0.86
1 | 71.10 | 68.44 | 69.48 | 68.20 | 16.91 | 33.53 | 0.61 | 0.17 | 0.33
2 | 74.52 | 72.29 | 73.30 | 96.05 | 80.63 | 92.04 | 0.76 | 0.77 | 0.85
3 | 75.17 | 73.12 | 74.01 | 96.09 | 89.16 | 95.25 | 0.75 | 0.82 | 0.88
4 | 74.73 | 72.75 | 73.81 | 94.90 | 83.47 | 92.75 | 0.77 | 0.77 | 0.85
5 | 75.94 | 73.52 | 74.67 | 97.35 | 91.75 | 96.71 | 0.73 | 0.81 | 0.86
6 | 74.80 | 73.52 | 72.78 | 96.42 | 91.78 | 82.96 | 0.75 | 0.81 | 0.77
7 | 66.99 | 64.65 | 65.62 | 0.04 | 0.15 | 0.00 | 0.00 | 0.00 | 0.00
8 | 90.92 | 87.65 | 89.01 | 99.90 | 99.98 | 100.00 | 0.21 | 0.36 | 0.44
9 | 66.52 | 44.40 | 59.83 | 65.53 | 37.36 | 52.81 | 0.34 | 0.38 | 0.35
10 | 75.34 | 73.28 | 74.33 | 96.67 | 89.20 | 95.48 | 0.27 | 0.39 | 0.86


Table D-3. EUD and NTCP for ipsilateral parotid, head and neck cases

Case # | Patient 1 EUD | Patient 1 NTCP | Patient 1 severity | Patient 2 EUD | Patient 2 NTCP | Patient 2 severity | Patient 3 EUD | Patient 3 NTCP | Patient 3 severity
Standard | 46.31 | 52.72 | | 19.93 | 0.00 | | 21.60 | 0.00 |
1 | 47.91 | 65.74 | 6 ± 3 | 24.66 | 0.01 | 5 ± 5 | 26.21 | 0.01 | 6 ± 1
2 | 46.58 | 54.97 | 2 ± 3 | 20.84 | 0.00 | 2 ± 3 | 22.50 | 0.00 | 7 ± 2
3 | 47.07 | 59.13 | 0 ± 1 | 20.60 | 0.00 | 0 ± 0 | 22.47 | 0.00 | 8 ± 0
4 | 45.80 | 48.26 | 0 ± 1 | 19.53 | 0.00 | 0 ± 0 | 21.04 | 0.00 | 7 ± 1
5 | 46.52 | 54.44 | 0 ± 1 | 19.94 | 0.00 | 0 ± 0 | 21.79 | 0.00 | 6 ± 4
6 | 45.67 | 47.16 | 0 ± 1 | 20.23 | 0.00 | 0 ± 0 | 20.91 | 0.00 | 1 ± 1
7 | 40.27 | 10.63 | 8 ± 0 | 17.44 | 0.00 | 8 ± 0 | 18.98 | 0.00 | 10 ± 0
8 | 57.09 | 96.94 | 6 ± 2 | 24.22 | 0.00 | 5 ± 4 | 26.20 | 0.01 | 4 ± 3
9 | 19.73 | 0.00 | 8 ± 0 | 23.05 | 0.00 | 8 ± 0 | 16.76 | 0.00 | 8 ± 0
10 | 46.04 | 50.35 | 6 ± 1 | 20.03 | 0.00 | 5 ± 3 | 21.66 | 0.01 | 7 ± 0


Table D-4. EUD and NTCP for spinal cord, head and neck cases

Case # | Patient 1 EUD | Patient 1 NTCP | Patient 1 severity | Patient 2 EUD | Patient 2 NTCP | Patient 2 severity | Patient 3 EUD | Patient 3 NTCP | Patient 3 severity
Standard | 34.66 | 0.06 | | 33.06 | 0.03 | | 34.72 | 0.06 |
1 | 34.80 | 0.07 | 6 ± 3 | 32.70 | 0.02 | 5 ± 5 | 33.40 | 0.03 | 6 ± 1
2 | 34.60 | 0.06 | 2 ± 3 | 32.79 | 0.03 | 2 ± 3 | 34.27 | 0.05 | 7 ± 2
3 | 34.57 | 0.06 | 0 ± 1 | 32.54 | 0.02 | 0 ± 0 | 34.42 | 0.06 | 8 ± 0
4 | 35.71 | 0.10 | 0 ± 1 | 32.78 | 0.03 | 0 ± 0 | 34.09 | 0.05 | 7 ± 1
5 | 34.78 | 0.07 | 0 ± 1 | 33.16 | 0.03 | 0 ± 0 | 35.00 | 0.07 | 6 ± 4
6 | 36.28 | 0.13 | 0 ± 1 | 32.97 | 0.03 | 0 ± 0 | 35.08 | 0.08 | 1 ± 1
7 | 30.31 | 0.01 | 8 ± 0 | 28.88 | 0.00 | 8 ± 0 | 30.47 | 0.01 | 10 ± 0
8 | 42.36 | 1.51 | 6 ± 2 | 40.31 | 0.69 | 5 ± 4 | 42.61 | 1.66 | 4 ± 3
9 | 34.66 | 0.06 | 8 ± 0 | 33.06 | 0.03 | 8 ± 0 | 34.85 | 0.07 | 8 ± 0
10 | 52.25 | 30.58 | 6 ± 1 | 46.44 | 6.26 | 5 ± 3 | 50.33 | 19.48 | 7 ± 0


APPENDIX E
VARIETY CASE METRIC CALCULATIONS

The variety-patient error metrics were taken from ten different patients, each with a different disease site. The prescribed doses for each plan were different, as were the normal structures of interest. For each of the ten patients, three different errors were introduced into the patient case and the changes in the metrics were calculated. Table E-1 lists the errors that were created along with the ten specified disease sites and cases. The experts scored all of the plans; the average and standard deviation of the consequence severity metric are included along with the dose severity metric in Table E-2. The target tissue doses are summarized in Table E-3. Only the combined metrics for the normal tissue doses were reported; they are listed in Table E-4.


Table E-1. Variety case disease sites and errors

Disease | Error 1 | Error 2 | Error 3
1. Left breast | Setup (1 of 30) | MLC omitted, 2 fields (5 of 30) | MLC omitted, 1 field (25 of 30)
2. Lung | Setup (2 of 30) | MLC omitted, 1 field (5 of 30) | Wrong energy, 6x vs 18x (30 of 30)
3. Palliative spine | Dose Rx (12 of 12) | Setup 2.5 cm (4 of 12) | Setup (12 of 12)
4. Whole brain | Norm block (15 of 15) | Setup 3 cm + 3 cm (4 of 15) | Rx error (3 of 15)
5. Partial brain | Setup (1 of 33) | Setup marks on mask (3 of 33) | Setup marks on mask (5 of 33)
6. Larynx | Setup (1 of 35) | Wedge omitted, 1 field (3 of 35) | Wedges inverted (35 of 35)
7. Head/Neck | Setup (5 of 35) | Field treated 10 extra times | IMRT MLC omitted (3, stopped early)
8. Abdomen | Setup (1 of 28) | Setup (2 of 28) | Rx wrong (28 of 28)
9. Prostate | Wrong field (1 of 39) | Fields switched (5 of 39) | Setup (15 of 39)
10. Rectum | 1 extra field (1 of 28) | Wrong fields (3 of 28) | Fx error, 18 instead of 28


Table E-2. Error scores for variety cases

Patient # | Error 1 dose | Error 1 severity (avg ± SD) | Error 2 dose | Error 2 severity (avg ± SD) | Error 3 dose | Error 3 severity (avg ± SD)
1 | 1 | 0 ± 0 | 6 | 4 ± 3 | 8 | 6 ± 1
2 | 1 | 6 ± 4 | 4 | 1 ± 1 | 8 | 7 ± 2
3 | 8 | 6 ± 2 | 8 | 8 ± 0 | 9 | 8 ± 0
4 | 5 | 7 ± 1 | 5 | 6 ± 2 | 8 | 7 ± 1
5 | 1 | 0 ± 0 | 1 | 6 ± 4 | 6 | 6 ± 4
6 | 1 | 3 ± 4 | 1 | 3 ± 3 | 1 | 1 ± 1
7 | 5 | 4 ± 4 | 7 | 7 ± 0 | 10 | 10 ± 0
8 | 1 | 0 ± 1 | 1 | 0 ± 1 | 5 | 4 ± 3
9 | 1 | 0 ± 1 | 1 | 0 ± 1 | 8 | 8 ± 0
10 | 1 | 0 ± 1 | 5 | 2 ± 3 | 7 | 7 ± 0


Table E-3. Target metrics for variety cases

Case # | Plan 1 % EUD | Plan 2 % EUD | Plan 3 % EUD | Base % Cov | Plan 1 % Cov | Plan 2 % Cov | Plan 3 % Cov | Base J | Plan 1 J | Plan 2 J | Plan 3 J
1 | 0.07 | 16.69 | 27.44 | 100.00 | 100.00 | 100.00 | 100.00 | 0.66 | 0.67 | 0.59 | 0.57
2 | 2.00 | 6.41 | 41.50 | 90.05 | 75.22 | 99.85 | 100.00 | 0.84 | 0.72 | 0.75 | 0.55
3 | 40.24 | 40.43 | 108.08 | 100.00 | 100.00 | 37.06 | 44.04 | 0.09 | 0.03 | 0.36 | 0.01
4 | 19.69 | 22.83 | 51.81 | 98.30 | 100.00 | 86.13 | 100.00 | 0.97 | 0.98 | 0.85 | 0.47
5 | 0.03 | 1.58 | 3.07 | 96.90 | 96.90 | 72.05 | 71.06 | 0.73 | 0.73 | 0.62 | 0.62
6 | 4.10 | 3.30 | 0.56 | 100.00 | 60.68 | 100.00 | 100.00 | 0.05 | 0.05 | 0.04 | 0.05
7 | 4.47 | 0.27 | 75.42 | 96.32 | 72.71 | 96.67 | 0.00 | 0.65 | 0.67 | 0.53 | 0.00
8 | 0.95 | 2.19 | 13.26 | 87.60 | 75.88 | 64.60 | 99.96 | 0.70 | 0.65 | 0.65 | 0.57
9 | 0.16 | 1.03 | 43.56 | 99.64 | 99.38 | 99.75 | 14.15 | 0.65 | 0.66 | 0.65 | 0.13
10 | 0.80 | 0.37 | 19.03 | 95.30 | 99.99 | 99.67 | 100.00 | 0.94 | 0.89 | 0.51 | 0.20


Table E-4. Normal tissue metrics for variety cases

Case # | NTCP tot Baseline | NTCP tot Error 1 | NTCP tot Error 2 | NTCP tot Error 3 | DD Error 1 | DD Error 2 | DD Error 3
1 | 0.00 | 0.00 | 0.03 | 1.23 | 0.18 | 68.17 | 160.52
2 | 0.57 | 0.09 | 1.94 | 84.42 | 5.56 | 80.45 | 389.34
3 | 0.00 | 0.00 | 0.00 | 0.00 | 41.30 | 14.68 | 81.79
4 | 23.95 | 5.47 | 62.87 | 98.64 | 8.56 | 205.67 | 538.36
5 | 7.29 | 7.36 | 11.91 | 19.50 | 4.11 | 61.61 | 227.50
6 | 7.40 | 4.02 | 11.80 | 0.03 | 2.87 | 12.07 | 1.64
7 | 49.09 | 65.47 | 65.70 | 100.00 | 109.06 | 5.96 | 6012.94
8 | 91.14 | 90.57 | 90.03 | 99.90 | 1.05 | 1.18 | 250.04
9 | 1.06 | 0.98 | 1.36 | 0.24 | 5.08 | 9.78 | 68.65
10 | 0.21 | 0.24 | 0.25 | 0.54 | 6.56 | 34.34 | 203.55


APPENDIX F
INITIAL VROC CASES

Table F-1. Patients available in the initial VROC

Case title | Plan details and planning errors | Treatment details and treatment errors
Prostate IMRT with CBCT alignment | 3 plans total: one as treated, one with gantry angle changes and not optimized, and one with a relaxed constraint on the rectum. | A major treatment error occurred in which another prostate patient's treatment fields were delivered on this patient. This results in a "wrong patient" error that would need to be reported.
Brain IMRT | Treatment error with wrong daily dose (1.8 x 33). Also a treatment plan with beams going through the eyes and no chiasm dose constraint. | Routine setup errors (assuming no imaging one day and treating to the "wrong" marks). Also another error where there is a computer glitch and part of the treatment is given twice because it was not recorded and the therapists did not realize 2 fields were treated twice.
Head and neck IMRT | Plan 1: re-optimized with only the PTV and no normal tissue constraints. Plan 2: dose grid is cropped so the DVHs are not correct. Plan 3: expected plan = Plan 4 (not for review, only treated with no MLC). | The major treatment error in this scenario is that the MLC files are somehow deleted upon plan approval and are not detected. Treatment is given for 2 treatments before it is realized.
Lung IMRT | Planning errors are generic types of errors or mistakes. | The specific treatment error is that the patient is set up and then somehow shifted to the "wrong site" laterally, so that the right lung is treated. This is a reportable error, so an RCA form needs to be filled in.
Intro case, basic plan review | Planning errors (3 plans?): relatively obvious mistakes or MLC left out. | Treatment errors are setup errors only, or misinterpretation of portal films due to imaging issues.


Table F-1. Continued

Case title | Plan details and planning errors | Treatment details and treatment errors
Esophagus alignment | Change in constraints, and wrong changes in coverage. | Setup error to the wrong vertebral body. Treatment error with MLC removed due to crash/computer issues.
Brain IMRT setup errors | The planning errors include extra margin and also a plan in which the MLC is deleted. | The specific treatment error is a setup error on the first day such that the patient is at least 2-3 cm off from the CAX. Additional "minor" setup errors will also be introduced after this first error is detected. If the error is not detected, it will continue until the resident realizes there is a misalignment.
Left breast | Only a single treatment plan is available for treatment this time. | Treatment delivery will be based on clinical issues with this particular patient setup, where the common isocenter was set 2 cm superior on a couple of occasions. Confusion in interpreting portal films.
Lung for CAP | The initial plan will be used, as well as one that does not constrain the brachial plexus and the cord, to determine if the resident checks the doses carefully. Also, a treatment plan in which the fields are open and conformal (not IMRT), to see if this is significantly different from the other plans. | On fractions 1-2 the alignment to CBCT is misaligned superiorly by one vertebral body. The error was not detected and the patient is treated incorrectly for both of these 2 fractions. (If the resident detects the error on fraction 1, it is corrected.)


Table F-1. Continued

Case title | Plan details and planning errors | Treatment details and treatment errors
Abdomen CAP | 3 treatment plans will be evaluated: IMRT with objectives being met; Arc + IMRT with objectives; IMRT with stomach and kidney objectives not met. | Alignment error on fraction 2. If it is detected, it will be corrected for fraction 3; if not, the patient will continue to be treated incorrectly until it is detected by the resident.
RT breast CAP | Only 2 plans will be available for this case to simplify the selection. One plan will be standard and the other will have mixed energies and will be hotter. | Initial films are good (films prior to treatment). After 1 week of treatment, new films show a setup error of about 2-3 cm (could have been going on for all 5 fractions). Also, a treatment error occurs in which the MLC file for some reason was not loaded.


APPENDIX G
RCA EXERCISE FORM

(The RCA exercise form is reproduced as full-page images in the original document.)








LIST OF REFERENCES

1. American College of Radiology [Internet]. Reston: American College of Radiology; c2013. Radiation Oncology Practice Accreditation Program Requirements; [cited 2014 Jan 2]. Available from: http://www.acr.org/~/media/ACR/Document/Accreditation/RO/Requirements.pdf

2. Cotter GW, Dobelbower RR, Jr. Radiation oncology practice accreditation: the American College of Radiation Oncology, Practice Accreditation Program, guidelines and standards. Crit Rev Oncol Hematol 2005;55:93-102.

3. Potters L, Gaspar LE, Kavanagh B et al. American Society for Therapeutic Radiology and Oncology (ASTRO) and American College of Radiology (ACR) practice guidelines for image-guided radiation therapy (IGRT). Int J Radiat Oncol Biol Phys;76:319-325.

4. Blumber AL, Burns RA, Cagle SW et al. Safety Is No Accident: A Framework for Quality Radiation Oncology Care. 1st ed: ASTRO; 2012.

5. Kutcher GJ, Coia L, Gillin M et al. Comprehensive QA for radiation oncology: report of AAPM Radiation Therapy Committee Task Group 40. Med Phys 1994;21:581-618.

6. Purdy JA, Biggs PJ, Bowers C et al. Medical accelerator safety considerations: report of AAPM Radiation Therapy Committee Task Group No. 35. Med Phys 1993;20:1261-1275.

7. Nath R, Biggs PJ, Bova FJ et al. AAPM code of practice for radiotherapy accelerators: report of AAPM Radiation Therapy Task Group No. 45. Med Phys 1994;21:1093-1121.

8. Purdy JA, Klein EE, Low DA. Quality Assurance and Safety of New Technologies for Radiation Oncology. Semin Radiat Oncol 1995;5:156-165.

9. Hartford AC, Palisca MG, Eichler TJ et al. American Society for Therapeutic Radiology and Oncology (ASTRO) and American College of Radiology (ACR) Practice Guidelines for Intensity Modulated Radiation Therapy (IMRT). Int J Radiat Oncol Biol Phys 2009;73:9-14.

10. Fraass BA. Errors in radiotherapy: motivation for development of new radiotherapy quality assurance paradigms. Int J Radiat Oncol Biol Phys 2008;71:S162-165.

11. Cunningham J, Coffey M, Knoos T et al. Radiation Oncology Safety Information System (ROSIS) - profiles of participants and the first 1074 incident reports. Radiother Oncol;97:601-607.


12. Hendee WR. Safety and accountability in healthcare from past to present. Int J Radiat Oncol Biol Phys 2008;71:S157-161.

13. Bentel GC, Nelson CE, Noell KT. Treatment planning & dose calculation in radiation oncology. 4th ed. New York: Pergamon Press; 1989.

14. Nishidai T, Nagata Y, Takahashi M et al. CT simulator: a new 3-D planning and simulating system for radiotherapy: Part 1. Description of system. Int J Radiat Oncol Biol Phys 1990;18:499-504.

15. Nagata Y, Nishidai T, Abe M et al. CT simulator: a new 3-D planning and simulating system for radiotherapy: Part 2. Clinical application. Int J Radiat Oncol Biol Phys 1990;18:505-513.

16. Aird EG, Conway J. CT simulation for radiotherapy treatment planning. Br J Radiol 2002;75:937-949.

17. Vertual [Internet]. Hull: Logistics Institute, University of Hull; c2014. [cited 2014 Feb 2]. Available from: http://www.vertual.eu/products/vert

18. Hamza-Lup FG, Sopin I, Zeidan O. Comprehensive 3D visual simulation for radiation therapy planning. Stud Health Technol Inform 2007;125:164-166.

19. Ilic D, Moix T, McCullough N et al. Real time haptic interface for VR colonoscopy simulation. Stud Health Technol Inform 2005;111:208-212.

20. Grantcharov TP, Bardram L, Jensen PM et al. [Virtual reality computer simulation as a tool for training and evaluating skills in laparoscopic surgery]. Ugeskr Laeger 2001;163:3651-3653.

21. Seymour NE. VR to OR: a review of the evidence that virtual reality simulation improves operating room performance. World J Surg 2008;32:182-188.

22. Gordon JA, Vozenilek JA. 2008 Academic Emergency Medicine Consensus Conference. Acad Emerg Med 2008.

23. Kaji AH, Bair A, Okuda Y et al. Defining systems expertise: effective simulation at the organizational level - implications for patient safety, disaster surge capacity, and facilitating the systems interface. Acad Emerg Med 2008;15:1098-1103.

24. Issenberg SB, McGaghie WC, Petrusa ER et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10-28.

25. Mills S, deAraujo MMT. Learning through virtual reality: a preliminary investigation. Interacting with Computers 1999;11:453-462.


26. McGaghie WC, Issenberg SB, Petrusa ER et al. Effect of practice on standardised learning outcomes in simulation-based medical education. Med Educ 2006;40:792-797.

27. Accreditation Council for Graduate Medical Education [Internet]. Chicago: The Accreditation Council for Graduate Medical Education; c2000-2014. Radiation Oncology; [cited 2014 Jan 2]. Available from: http://www.acgme.org/acgmeweb/tabid/2149/ProgramsandInstitutionalAccreditation/HospitalBasedSpecialties/RadiationOncology.aspx

28. Gondi V, Bernard JR, Jr., Jabbari S et al. Results of the 2005-2008 Association of Residents in Radiation Oncology Survey of Chief Residents in the United States: Clinical Training and Resident Working Conditions. Int J Radiat Oncol Biol Phys.

29. Battles JB. Improving patient safety by instructional systems design. Qual Saf Health Care 2006;15 Suppl 1:i25-29.

30. Knowles MS. Andragogy in action. 1st ed. San Francisco: Jossey-Bass; 1984.

31. Knowles MS. The adult learner: a neglected species. 4th ed. Houston: Gulf; 1990.

32. Cross KP. Adults as learners. 1st ed. San Francisco: Jossey-Bass; 1981.

33. Dick W, Carey L, Carey JO. The systematic design of instruction. 6th ed. Boston: Pearson/Allyn and Bacon; 2005.

34. Hanley M. Discovering Instructional Design 11: The Kemp Model. E-Learning Curve Blog; 2010.

35. Wiggins GP, McTighe J. Understanding by design. Expanded 2nd ed. Alexandria, VA: Association for Supervision and Curriculum Development; 2005.

36. Anderson JM, Aylor ME, Leonard DT. Instructional design dogma: creating planned learning experiences in simulation. J Crit Care 2008;23:595-602.

37. Perez CA. Principles and practice of radiation oncology. 4th ed. Philadelphia: Lippincott Williams & Wilkins; 2004.

38. Wang CC. Clinical radiation oncology: indications, techniques, and results. 2nd ed. New York: Wiley-Liss; 2000.

39. Bijhold J, Gilhuijs KG, van Herk M. Automatic verification of radiation field shape using digital portal images. Med Phys 1992;19:1007-1014.


40. McCullough EC, McCollough KP. Improving agreement between radiation-delineated field edges on simulation and portal films: the edge tolerance test tool. Med Phys 1993;20:375-376.

41. Yeung D, Palta J, Fontanesi J et al. Systematic analysis of errors in target localization and treatment delivery in stereotactic radiosurgery (SRS). Int J Radiat Oncol Biol Phys 1994;28:493-498.

42. Herman MG, Balter JM, Jaffray DA et al. Clinical use of electronic portal imaging: report of AAPM Radiation Therapy Committee Task Group 58. Med Phys 2001;28:712-737.

43. Phillips BL, Jiroutek MR, Tracton G et al. Thresholds for human detection of patient setup errors in digitally reconstructed portal images of prostate fields. Int J Radiat Oncol Biol Phys 2002;54:270-277.

44. Jaffray DA, Siewerdsen JH, Wong JW et al. Flat-panel cone-beam computed tomography for image-guided radiation therapy. Int J Radiat Oncol Biol Phys 2002;53:1337-1349.

45. Jaffray DA. Emergent technologies for 3-dimensional image-guided radiation delivery. Semin Radiat Oncol 2005;15:208-216.

46. Letourneau D, Wong JW, Oldham M et al. Cone-beam CT guided radiation therapy: technical implementation. Radiother Oncol 2005;75:279-286.

47. Hatherly KE, Smylie JC, Rodger A et al. A double exposed portal image comparison between electronic portal imaging hard copies and port films in radiation therapy treatment setup confirmation to determine its clinical application in a radiotherapy center. Int J Radiat Oncol Biol Phys 2001;49:191-198.

48. Mazur LM, Mosaly PR, Jackson M et al. Quantitative assessment of workload and stressors in clinical radiation oncology. Int J Radiat Oncol Biol Phys;83:e571-576.

49. Digital Imaging and Communications in Medicine [Internet]. Rosslyn; [cited 2014 Feb 5]. Available from: http://medical.nema.org

50. Spezi E, Lewis DG, Smith CW. A DICOM-RT based toolbox for the evaluation and verification of radiotherapy plans. Phys Med Biol 2002;47:4223-4232.

51. Germond JF, Haefliger JM. [Electronic dataflow management in radiotherapy: routine use of the DICOM-RT protocol]. Cancer Radiother 2001;5 Suppl 1:172s-180s.


52. Huang G, Medlam G, Lee J et al. Error in the delivery of radiation therapy: results of a quality assurance review. Int J Radiat Oncol Biol Phys 2005;61:1590-1595.

53. Klein EE, Drzymala RE, Purdy JA et al. Errors in radiation oncology: a study in pathways and dosimetric impact. J Appl Clin Med Phys 2005;6:81-94.

54. Fraass BA, Lash KL, Matrone GM et al. The impact of treatment complexity and computer-control delivery technology on treatment delivery errors. Int J Radiat Oncol Biol Phys 1998;42:651-659.

55. Macklis RM. Hidden perils of automation and its effect on error reduction. Ambul Outreach 1999:31-34.

56. ASTRO Targeting Cancer Care [Internet]. Fairfax: American Society for Radiation Oncology; [cited 2014 Feb 5]. Available from: http://www.astro.org/

57. Radiation Therapy Oncology Group [Internet]. Philadelphia: Radiation Therapy Oncology Group; c2014 [cited 2014 Feb 5]. Available from: http://www.rtog.org

58. Hanna GG, Hounsell AR, O'Sullivan JM. Geometrical analysis of radiotherapy target volume delineation: a systematic review of reported comparison methods. Clin Oncol (R Coll Radiol);22:515-525.

59. Feuvret L, Noel G, Mazeron JJ et al. Conformity index: a review. Int J Radiat Oncol Biol Phys 2006;64:333-342.

60. Allozi R, Li XA, White J et al. Tools for consensus analysis of experts' contours for radiotherapy structure definitions. Radiother Oncol;97:572-578.

61. Lyman JT, Wolbarst AB. Optimization of radiation therapy, IV: A dose-volume histogram reduction algorithm. Int J Radiat Oncol Biol Phys 1989;17:433-436.

62. Niemierko A. Radiobiological models of tissue response to radiation in treatment planning systems. Tumori 1998;84:140-143.

63. Lyman JT. Complication probability as assessed from dose-volume histograms. Radiat Res Suppl 1985;8:S13-19.

64. Okunieff P, Morgan D, Niemierko A et al. Radiation dose-response of human tumors. Int J Radiat Oncol Biol Phys 1995;32:1227-1237.

65. Brahme A, Agren AK. Optimal dose distribution for eradication of heterogeneous tumours. Acta Oncol 1987;26:377-385.


66. Webb S, Nahum AE. A model for calculating tumour control probability in radiotherapy including the effects of inhomogeneous distributions of dose and clonogenic cell density. Phys Med Biol 1993;38:653-666.
67. Luxton G, Keall PJ, King CR. A new formula for normal tissue complication probability (NTCP) as a function of equivalent uniform dose (EUD). Phys Med Biol 2008;53:23-36.
68. Seppenwoolde Y, Lebesque JV, de Jaeger K et al. Comparing different NTCP models that predict the incidence of radiation pneumonitis. Normal tissue complication probability. Int J Radiat Oncol Biol Phys 2003;55:724-735.
69. Gay HA, Niemierko A. A free program for calculating EUD-based NTCP and TCP in external beam radiotherapy. Phys Med 2007;23:115-125.
70. Lind BK, Mavroidis P, Hyodynmaa S et al. Optimization of the dose level for a given treatment plan to maximize the complication-free tumor cure. Acta Oncol 1999;38:787-798.
71. Cambria R, Jereczek-Fossa BA, Cattani F et al. Evaluation of late rectal toxicity after conformal radiotherapy for prostate cancer: a comparison between dose-volume constraints and NTCP use. Strahlenther Onkol 2009;185:384-389.
72. Meeks SL, Buatti JM, Foote KD et al. Calculation of cranial nerve complication probability for acoustic neuroma radiosurgery. Int J Radiat Oncol Biol Phys 2000;47:597-602.
73. Rancati T, Fiorino C, Sanguineti G. NTCP modeling of subacute/late laryngeal edema scored by fiberoptic examination. Int J Radiat Oncol Biol Phys 2009;75:915-923.
74. Rancati T, Wennberg B, Lind P et al. Early clinical and radiological pulmonary complications following breast cancer radiation therapy: NTCP fit with four different models. Radiother Oncol 2007;82:308-316.
75. Dunscombe PB, Iftody S, Ploquin N et al. The Equivalent Uniform Dose as a severity metric for radiation treatment incidents. Radiother Oncol 2007;84:64-66.
76. Song W, Dunscombe P. EUD-based margin selection in the presence of set-up uncertainties. Med Phys 2004;31:849-859.
77. Ford EC, Fong de Los Santos L, Pawlicki T et al. Consensus recommendations for incident learning database structures in radiation oncology. Med Phys;39:7272-7290.


78. Carlone M, Macpherson M. Detrimental Dose: A Proposed Metric to Score Incidents in Radiation Therapy. Med Phys 2009;36:1.
79. Siddon RL. Solution to treatment planning problems using coordinate transformations. Med Phys 1981;8:766-774.
80. National Comprehensive Cancer Network [Internet]. Fort Washington: National Comprehensive Cancer Network; [cited 2014 Feb 5]. Available from: http://www.nccn.org/default.aspx
81. Murphy MJ. The importance of computed tomography slice thickness in radiographic patient positioning for radiosurgery. Med Phys 1999;26:171-175.
82. Curry TS III, Dowdey JE, Murry RC Jr. Christensen's Physics of Diagnostic Radiology. 4th ed. Malvern, Pennsylvania: Lea & Febiger; 1990.
83. Langen KM, Willoughby TR, Meeks SL et al. Observations on real-time prostate gland motion using electromagnetic tracking. Int J Radiat Oncol Biol Phys 2008;71:1084-1090.
84. Kupelian PA, Lee C, Langen KM et al. Evaluation of image guidance strategies in the treatment of localized prostate cancer. Int J Radiat Oncol Biol Phys 2008;70:1151-1157.
85. Zeidan OA, Langen KM, Meeks SL et al. Evaluation of image guidance protocols in the treatment of head and neck cancers. Int J Radiat Oncol Biol Phys 2007;67:670-677.
86. Huq MS, Fraass BA, Dunscombe PB et al. A method for evaluating quality assurance needs in radiation therapy. Int J Radiat Oncol Biol Phys 2008;71:S170-173.
87. Morganti AG, Deodato F, Zizzari S et al. Complexity index (COMIX) and not type of treatment predicts undetected errors in radiotherapy planning and delivery. Radiother Oncol 2008;89:320-329.
88. Florida Administrative Register & Florida Administrative Code [Internet]. Tallahassee: State of Florida Department of State; c2014. Rule 64E 65.101 [cited 2014 Feb 2]. Available from: http://flrules.org/gateway/ruleNo.asp?d=2064E 2015.2101
89. Kohn LT, Corrigan J, Donaldson MS. To Err Is Human: Building a Safer Health System. Washington, D.C.: National Academy Press; 2000.


90. The Joint Commission [Internet]. Oakbrook Terrace: The Joint Commission; c2012-2014. p. Sentinel Event Information; [cited 2014 Feb 2]. Available from: http://www.jointcommission.org/tompics/hai_sentinel_event.aspx
91. Pawlicki T, Dunscombe PB, Mundt AJ et al. Quality and Safety in Radiotherapy. Boca Raton: Taylor & Francis; 2011.
92. Williams PM. Techniques for root cause analysis. Proc (Bayl Univ Med Cent) 2001;14:154-157.
93. Voss JD, May NB, Schorling JB et al. Changing conversations: teaching safety and quality in residency training. Acad Med 2008;83:1080-1087.
94. Huffman Dracht HB, McDonnel WM, Guenther E. Resident Education in Medical Errors. The Open Emergency Medicine Journal 2010;3:36-43.
95. Dror I. A novel approach to minimize error in the medical domain: cognitive neuroscientific insights into training. Med Teach;33:34-38.
96. Vincent CA. Analysis of clinical incidents: a window on the system not a search for root causes. Qual Saf Health Care 2004;13:242-243.
97. Vincent C. Understanding and responding to adverse events. N Engl J Med 2003;348:1051-1056.
98. National Patient Safety Foundation [Internet]. Boston: National Patient Safety Foundation; [cited 2014 Feb 5]. Available from: http://www.npsf.org
99. U.S. Department of Veterans Affairs [Internet]. Washington, DC: U.S. Department of Veterans Affairs; c2014. p. VA National Center for Patient Safety; [cited 2014 Feb 2]. Available from: http://www.patientsafety.va.gov/professionals/onthejob/rca.asp
100. Serrat O. The Five Whys Technique. Manila: Asian Development Bank; February 2009. p. 4.
101. American Board of Radiology. Tucson: American Board of Radiology; c2014. p. MOC: Maintenance of Certification [cited 2014 Feb 2]. Available from: http://www.theabr.or/moc landing
102. Agency for Healthcare Research and Quality. Rockville: Agency for Healthcare Research and Quality; c2008-2014. p. Plan-Do-Study-Act (PDSA) Cycle [cited 2014 Feb 2015]. Available from: http://www.innovations.ahrq.gov/content.aspx?id=2398


103. Gould DA, Kessel DO, Healey AE et al. Simulators in catheter-based interventional radiology: training or computer games? Clin Radiol 2006;61:556-561.
104. Grone J, Lauscher JC, Buhr HJ et al. Face, content and construct validity of a new realistic trainer for conventional techniques in digestive surgery. Langenbecks Arch Surg;395:581-588.
105. Kenney PA, Wszolek MF, Gould JJ et al. Face, content, and construct validity of dV-trainer, a novel virtual reality simulator for robotic surgery. Urology 2009;73:1288-1292.
106. Lendvay TS, Casale P, Sweet R et al. VR robotic surgery: randomized blinded study of the dV-Trainer robotic simulator. Stud Health Technol Inform 2008;132:242-244.
107. Rungtusanatham M. Let's Not Overlook Content Validity. Decision Line 1998;July:10-13.
108. Wynd CA, Schmidt B, Schaefer MA. Two quantitative approaches for estimating content validity. West J Nurs Res 2003;25:508-518.
109. Gallagher AG, Renkin J, Buyl H et al. Development and construct validation of performance metrics for multivessel coronary interventions on the VIST virtual reality simulator at PCR2005. EuroIntervention 2006;2:101-106.
110. Ford E. eford@uw.edu. Re: question about reference. 2014 Feb 6.
111. Terezakis SA, Harris KM, Ford E et al. An evaluation of departmental radiation oncology incident reports: anticipating a national reporting system. Int J Radiat Oncol Biol Phys;85:919-923.
112. Park S, Lee JK, Lee C. Development of a Korean adult male computational phantom for internal dosimetry calculation. Radiat Prot Dosimetry 2006;121:257-264.
113. Bednarz B, Hancox C, Xu XG. Calculated organ doses from selected prostate treatment plans using Monte Carlo simulations and an anatomically realistic computational phantom. Phys Med Biol 2009;54:5271-5286.
114. Emami B. Tolerance of normal tissue to therapeutic radiation. Reports of Radiotherapy and Oncology 2013;1(1):35-48.


BIOGRAPHICAL SKETCH
Twyla Willoughby attended Northwest Nazarene University, graduating with a Bachelor of Science degree in Engineering Physics in 1991. Upon graduation she went to the University of Texas Graduate School of Biomedical Sciences at the M.D. Anderson Cancer Center, where she obtained a Master of Science degree in Medical Physics in 1995. She then began her clinical career as a medical physicist at The Cleveland Clinic, gaining clinical experience and obtaining board certification from The American Board of Radiology in 1998. Throughout her clinical career she has participated in a number of clinical research projects, worked alongside various vendors during the development of new technologies, and commissioned several new technologies into clinical practice. She completed her Doctor of Philosophy in Biomedical Engineering in May of 2014.

