Research Article

Objective Assessment of Surgical Skills in a 2-Day Visceral Anastomosis Techniques Course Held at the Annual Congress of the German Surgical Society

Saleem Elhabash and Berthold Gerdes*
Department of Visceral, Thoracic and Endocrine Surgery, Johannes-Wesling-Klinikum Minden University Hospital, University of Bochum, Germany

*Corresponding author: Berthold Gerdes, Department of Visceral, Thoracic and Endocrine Surgery, Johannes-Wesling-Klinikum Minden University Hospital, University of Bochum, Germany

Published: 20 Aug, 2018
Cite this article as: Elhabash S, Gerdes B. Objective Assessment of Surgical Skills in a 2-Day Visceral Anastomosis Techniques Course Held at the Annual Congress of the German Surgical Society. Clin Surg. 2018; 3: 2081.


Introduction: Simulation skills laboratories are increasingly being marketed by various institutions and are widely accepted in Europe and the USA. Furthermore, residency training programs in the USA have incorporated such laboratories into their curricula since being mandated to do so by the American College of Surgeons (ACS) in 2008 and have used them in recent years to evaluate the competency of their surgical residents.
In contrast to North America, the literature to date shows little, if any, evidence about simulation skills laboratories and curricula in Europe, despite their existence in private institutes and a few residency training programs. We still rely mainly on traditional training methods, and surgical simulators are still predominantly aimed at attracting attention at surgical equipment exhibits.
One of the well-known simulation skills laboratories in Germany has been held annually since 2005 at the annual congress of the German Surgical Society (DGCH). This skills laboratory is subdivided into different courses over 4 days, with different modules in a station setting that include common trunk surgical skills, laparoscopic visceral techniques, and conventional visceral anastomosis techniques, as well as courses in vascular and orthopedic surgery.
A valid and reliable objective assessment tool was developed in Canada and is currently widely used by residency training programs in the USA and Canada to evaluate the efficacy of technical skill development outside the operating room in a bench setting. This tool is called the Objective Structured Assessment of Technical Skill (OSATS).
The aim of our study is to demonstrate the improvement of surgical skills through participation in a selected training module, using a validated international assessment tool (OSATS).
Materials and Methods: The visceral anastomosis course, which includes five training modules on animal models and takes place over two days, was selected for evaluation. Performance of the participants in one module (end-to-end bowel anastomosis) was measured by qualified surgeons using a task-specific checklist at the beginning and at the end of the course, with an instructor-to-participant ratio of 1:10. The improvement in OSATS scores pre- to post-course was assessed using a paired t-test. Participants were asked to perform a baseline bowel anastomosis independent of the course, and their scores were analyzed as a possible correlation factor with final OSATS scores. Demographic data of the participants as well as subjective evaluation forms were collected.
Results: A total of 38 surgical residents completed the 2-day visceral anastomosis course. The mean age was 34 ± 6 years; fifty-eight percent were male. Most of the participants were in their 4th and 5th years of residency training, and 66% reported performing ≤ 10 bowel anastomoses since the beginning of their surgical training. Eight participants (21%) were able to perform an end-to-end bowel anastomosis independent of the course and scored a mean of 15 ± 3 on OSATS. OSATS scores improved significantly after completing the course (p=0.000018), with a mean of 15.7 ± 3.5 vs. 18.8 ± 2.4 at the beginning and end of the course, respectively. In the regression analysis, factors such as the ability to perform the procedure before the course, the number of bowel anastomoses performed so far in training, and the current level of surgical training did not predict the improvement of participants' OSATS scores pre- to post-curriculum (p=0.6, 0.5, and 0.07, respectively). Furthermore, 95% of the participants reported subjective improvement in their skills, and all participants answered positively when asked whether simulation laboratories should be included in their residency curricula.
Conclusion: Our results show a significant improvement in the surgical skills of residents, regardless of their training level, after participating in the simulation course as measured by OSATS. We highly recommend the integration of simulation laboratories into the curricula of our national residency training programs.
Keywords: Simulation; OSATS; Validation of surgical assessment tools; Surgical training; Simulation curricula


Simulation: definitions and background
Simulation has been defined as an artificial creation of a set of conditions in order to experience something that is possible in real life. A simulator is defined as a device that enables the operator to reproduce or represent under test conditions phenomena likely to occur in actual performance [1].
We can find many examples in the literature where simulation was used unsystematically as an educational tool in medicine. For example, in 18th-century Paris, the Grégoires, father and son, developed an obstetrical mannequin made of a human pelvis and a dead baby, which enabled obstetricians to teach delivery techniques and resulted in a reduction of maternal and infant mortality rates [2]. Of note, simulation in the medical field was not systematically pursued until pioneering efforts took place over the last three decades, learning from simulation in aviation. Simulators in the medical field can be divided into high-fidelity and low-fidelity simulators; they can also be divided into organic and inorganic simulators.
Methodology in surgical skills assessment
Assessment is crucial in providing feedback to trainees. Dr. R. Reznick, whose group has published extensively on teaching and evaluating surgical skills, stated that a good assessment tool must be feasible, reliable, and valid [3]. Validity refers to the extent to which a test measures what it is designed to measure. The most commonly examined type of validity in assessment methodology is construct validity, which is the extent to which we are measuring the trait we intend to measure.
There are several methods currently used to assess technical skills [3]. Direct observation with criteria has proven to have strong validity, with a direct relationship between reliability and the objectivity of the criteria defined in a given test. The most commonly used observational tool in this category is the Objective Structured Assessment of Technical Skills (OSATS). This tool assesses subjects using an operation-specific checklist or global rating scales. OSATS was created by the same group, learning from the Objective Structured Clinical Examination (OSCE), a successful tool designed to examine trainees performing several clinical tasks in a time-limited station setting [4].
Why do we need simulation in surgical training?
Restrictions in duty hours, costs associated with training junior residents in the OR, emerging technologies, and increased awareness of patient safety are believed to be the major factors driving the recent emphasis on surgical training outside the operating room [5-8]. Surgical skills have traditionally been taught through an apprenticeship model, and subsequently through the rotating residency model transferred from Europe by William Halsted [9]. The deficiencies of the current system of residency training are increasingly criticized, and the "learning by doing" approach, based on the random opportunity of patient flow, is recognized to produce significant variability in educational experience [10]. Furthermore, the assessment of surgical technique has been predominantly subjective, without reliable correlation between dexterity and surgical outcomes [11,12].
As proposed in the educational model of Ericsson, expert performance can be developed through intentional and continuous practice [13]. Simulation aims to represent reality at a level close to what the trainee would face in a real-patient setting [14]. Moreover, simulation enables replication of a single task in a controlled setting, thus developing essential basic motor skills before the trainee encounters the complex issues faced while performing or assisting in the operating room [15].
Evidence of simulation
The introduction of simulation technologies served to fill the gap in the current training model. The acceptance of simulation-based training began when the University of Toronto group introduced skills training at bench stations in the late 1980s and subsequently refined teaching methods that incorporated feedback and performance assessment with validated ratings [16-18]. The American College of Surgeons has already identified the potential for simulation techniques to influence patient safety by allowing learning in a risk-free environment, refreshing techniques for surgeons, correcting case-mix inequalities during training, and testing new procedures or devices in a simulated environment [19]. Many academic medical centers and university hospitals have developed skills laboratories to accommodate learners across a range of surgical specialties, allowing them to practice their skills [20,21].
It is important to evaluate the curricula used in simulation laboratories, even if these have been adopted from existing resources [22]. Furthermore, transferability to the real-patient setting, and thus better clinical outcomes, should be evaluated in order to prove that skills acquired through simulator training transfer positively to clinical practice and translate into better patient outcomes. Numerous studies document improvement of performance during actual operations following laparoscopic curricula in the simulation skills laboratory. Of note, the literature to date shows stronger evidence for minimally invasive surgery than for traditional open procedures regarding the transferability of skills learned in a simulation skills laboratory to the operating room.
The aim of our study is to evaluate the improvement in performance, as well as the retention of surgical skills, of candidates in a surgical skills simulation laboratory with a predefined curriculum that includes a conventional visceral anastomosis course.

Material and Methods

Structure of the simulation skills laboratory
One of the well-known simulation skills laboratories in Germany has been held annually since 2005 at the annual congress of the German Surgical Society (DGCH). This skills laboratory uses inanimate, animate, and laparoscopic simulators and is subdivided into different courses over 4 days, with different modules in a station setting:
1. Common trunk surgical skills (9 hrs): introduction to suturing materials, skin suturing, tracheotomy, intubation, and insertion of thoracic drains and central venous catheters.
2. Laparoscopic visceral techniques (13 hrs over 2 days): basic techniques, laparoscopic suturing and knot tying, fundoplication, bowel anastomosis, and laparoscopic colon anastomosis using a stapler or hand-sewn technique.
3. Conventional visceral anastomosis techniques (12 hrs over 2 days) (Figure 1).
4. Vascular surgery (10 hrs over 2 days): basic vascular techniques, aortic prosthesis, vascular anastomosis, interventional vascular techniques, venous patch, composite bypass, and cuff anastomosis.
5. Orthopedic surgery (10 hrs): osteosynthesis.
Every module was held in a bench setting and was introduced with a didactic session and a video presentation of the technique performed by a qualified expert surgeon. Trainees could participate in one or more of the five modules.
Simulated modules in the conventional visceral anastomosis course:
1. Bowel anastomosis: this module took place over one and a half hours and was divided into two exercises on harvested porcine small intestine:
a) End-to-end anastomosis of the small intestine using a single layer continuous suturing technique,
b) End-to-end anastomosis using an interrupted suturing technique,
2. Gastroenterostomy and pancreas anastomosis,
3. Billroth II resection with foot-point anastomosis,
4. Biliodigestive anastomosis and Roux-en-Y anastomosis,
5. Rectal anastomosis.
Assessment methodology
A task-specific checklist score designed for bowel anastomosis, adapted from the Objective Structured Assessment of Technical Skills (OSATS) developed by Reznick and colleagues (Appendix I), was used to score the participants. The checklist consisted of 22 items; each item was scored with one point when performed correctly, for a maximum score of 22. Four qualified consultants in visceral surgery had a 30-min training session on the scoring checklist before the beginning of the course.
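The all-or-nothing scoring scheme described above is simple enough to sketch in code. The snippet below is a hypothetical illustration only: the actual 22 item texts come from the OSATS checklist in Appendix I and are not reproduced here, so the example results are placeholders.

```python
# Sketch of the 22-item binary OSATS checklist scoring used in this study.
# Each item earns one point when performed correctly; maximum total is 22.
# Item contents are NOT reproduced here (see Appendix I); results are placeholders.

N_ITEMS = 22

def osats_total(item_results):
    """Sum a participant's binary checklist results into a total score (0-22)."""
    if len(item_results) != N_ITEMS:
        raise ValueError(f"expected {N_ITEMS} item results, got {len(item_results)}")
    return sum(1 for done_correctly in item_results if done_correctly)

# Example: a participant who performed 18 of the 22 items correctly
results = [True] * 18 + [False] * 4
print(osats_total(results))  # -> 18
```

The total test score reported later in the paper is exactly this sum of per-item points.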
Design of the study
The study took place in the surgical skills training laboratory held at the DGCH congress from the 26th until the 29th of April 2016. One module had to be selected for evaluation, since some of the modules took place simultaneously. The 2-day conventional visceral anastomosis module, which accommodates up to 40 participants divided into 4 groups at four stations with an instructor-to-participant ratio of 1:10, was selected. Of the 7 predefined modules in the course, the end-to-end bowel anastomosis module using porcine small intestine was selected for testing. The testing experts were blinded to the level of residency training of the participants. The participants were informed about the intention to test the improvement of their surgical skills and signed an informed consent before the beginning of the course. To standardize administration, all participants received a scripted orientation to the curriculum of the course. To motivate the participants, we announced two prizes for two randomly selected participants at the end of the course: an iPad and reimbursement of the course fees. Participants were surveyed as to whether they had prior experience performing a bowel anastomosis before the beginning of the course. Those who answered positively were asked to perform an end-to-end bowel anastomosis (practice 0), and their OSATS scores were analyzed as a possible correlation factor with the improvement in OSATS scores at the end of the curriculum. As with every other module, the module selected for testing began with a live video presentation on a projector, in which an expert, with the help of an assistant, performed an end-to-end bowel anastomosis using a continuous single-layer suturing technique (Figure 2). At the end of the presentation, the participants were asked to perform an end-to-end bowel anastomosis and were scored by the experts, who were randomly assigned to the participants at the beginning of the test (practice-I).
The participants worked in pairs, taking turns practicing the procedure (Figure 3). A set of different suture materials and a set of instruments were available for each participant. The participants were evaluated on choosing the correct instruments and sutures and were responsible for directing the assistant. We did not focus on the time needed to complete the test and did not include it in evaluating the participants.
The participants continued through the different modules predefined in the course curriculum. Following a short description and a video presentation of each module, the participants were given the opportunity to practice the procedures repeatedly, had the chance to ask questions, and received immediate feedback from the instructors, with special attention to the different items used in the checklist. Of note, to avoid potential bias, the participants were assigned during the rest of the modules to instructors other than those assigned to the evaluation process. At the end of the curriculum, we asked the participants to perform the same end-to-end bowel anastomosis done at the beginning of the course; they were scored again using OSATS by the same 4 experts, who were again randomly assigned to the participant groups (practice-II). Demographic data of the participants, such as age, sex, level of residency training, and the number of bowel anastomoses already performed by the resident on real patients, were collected. A feedback survey was collected from the participants regarding their subjective evaluation of the effect of the course curriculum on their skills and whether they think simulation skills laboratory curricula should be included as an integral part of residency training.
Data analysis
Data of the participants as well as the OSATS scores were imported into SPSS (Version 25; SPSS Inc., Chicago, IL). The total test score represents the sum of a participant's checklist scores. A paired t-test was used to assess the improvement of the participants' OSATS scores. All data are presented as mean ± standard deviation.
Correlations were computed with Pearson's correlation. Univariate analysis using the chi-square test was used where appropriate. Interrater reliability was calculated using the intraclass correlation coefficient. Internal consistency, a measure of the reliability of the examination, was calculated using Cronbach's coefficient alpha. One-way analysis of variance (ANOVA) was used to assess possible predictors of the improvement in OSATS scores. P values less than 0.05 were considered statistically significant.
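For illustration, the core of this analysis pipeline can be sketched in Python with SciPy. The study itself used SPSS, and the numbers below are synthetic, randomly generated scores, not the study data; the sketch only shows the shape of the main tests (paired t-test on pre/post totals, Cronbach's alpha over the 22 items, and a Spearman rank correlation).

```python
# Illustrative re-implementation of the statistical tests named above.
# All data here are SYNTHETIC stand-ins, not the study's actual scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic pre/post checklist totals for 38 participants
pre = rng.normal(15.7, 3.5, 38)
post = pre + rng.normal(3.1, 2.0, 38)

# Paired t-test on the pre- to post-course improvement
t_stat, p_value = stats.ttest_rel(post, pre)

# Cronbach's alpha for a synthetic participants x items binary matrix
items = (rng.random((38, 22)) < 0.7).astype(float)
k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)
total_var = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Spearman rank correlation: training level vs. reported anastomoses
level = rng.integers(1, 7, 38)
n_anastomoses = level * 3 + rng.integers(0, 5, 38)
rho, p_rho = stats.spearmanr(level, n_anastomoses)

print(f"paired t: t={t_stat:.2f}, p={p_value:.2g}; "
      f"alpha={alpha:.2f}; rho={rho:.2f}")
```

With real data, `pre`/`post` would be the practice-I and practice-II totals and `items` the per-item checklist matrix; the intraclass correlation for interrater reliability would require the raters' duplicate scorings, which are not modeled here.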

Figure 1
Structure of the visceral anastomosis course with a 1:10 instructor-to-participant ratio. Every module was demonstrated using a video projector.

Figure 2
A photo of two participants taking turns in practicing an end-to-end bowel anastomosis on small bowel harvested from pigs.

Figure 3
Participants were scored by randomly assigned tutors using OSATS at the beginning and at the end of the course.

Figure 4
Positive correlation between the training level of participants and the number of in-training bowel anastomoses reported.

Figure 5
Improvement of the checklist scores after participation in the conventional visceral anastomosis course.


Results

A total of 38 participants completed the 2-day visceral anastomosis course. The mean age of the participants was 34 ± 6 years. Fifty-eight percent were male; 35 participants were surgical residents in training at the time of the course. Most of the participants were in their 4th and 5th years of residency training.
66% of the participants reported performing ≤ 10 bowel anastomoses since the beginning of their surgical training. A Spearman's rank-order correlation was run to determine the relationship between the training level of the participants and the number of in-training bowel anastomoses they reported. There was a strong, positive, statistically significant correlation between them (Rs=0.509, p=0.001) (Figure 4).
Before the course began, the participants were surveyed as to whether they could already perform an end-to-end bowel anastomosis. Eight participants (21%) answered positively, were able to perform an end-to-end bowel anastomosis (practice 0), and scored a mean of 15 ± 3 on OSATS. Furthermore, our results did not show a significant correlation between the OSATS scores obtained by those 8 participants in practice 0 and their training level or the number of in-training performed bowel anastomoses (p=0.7 and 0.8, respectively). Of note, within this group of 8 participants, those in their 3rd year of training scored higher on OSATS than the others (Tables 1 and 2).
Construct validity, i.e., the ability to distinguish among training levels, was assessed by analyzing participants' performance in practice-I with a one-way analysis of variance with training level as the independent variable; the result approached significance (p=0.07). Furthermore, participants' performance in practice-I did not correlate significantly with training level when the latter was defined as an expertise level based on the number of in-training performed bowel anastomoses (p=0.5) (Table 3).
Checklist scores of all 38 participants improved significantly after completing the course (p=0.000018), with a mean of 15.7 ± 3.5 vs. 18.8 ± 2.4 at the beginning and end of the course, respectively (Figure 5).
The statistical results did not change when the pre- and post-curriculum checklist scores were compared for just the 35 participants under residency training, with a mean of 15.8 ± 3.5 vs. 19 ± 2.2 at the beginning and end of the course, respectively (p=0.000036).
The standardized internal consistency reliability coefficient (Cronbach's alpha) for scores generated from all 22 items in the OSATS rating tool was r=0.77. Interrater reliability was 0.70 for the overall checklist. On analysis of the pre- and post-curriculum performance of the participants on the 22 checklist items, 6 items showed statistically significant improvement: loading the needle in the needle driver one half to two thirds from the tip; entering the bowel with the needle at a right angle for at least 80% of bites; using forceps on the seromuscular layer only for the majority of the time; the amount of tissue damage produced with the forceps; producing square knots; and cutting the sutures to the appropriate length. In the regression analysis, factors such as the ability to perform the procedure before the course, the number of bowel anastomoses performed so far in training, and the current level of surgical training did not predict the improvement of participants' OSATS scores pre- to post-curriculum (p=0.6, 0.5, and 0.07, respectively). When set as a benchmark, the group of 8 participants who could perform the task before the beginning of the curriculum scored almost the same on OSATS at the end of the course as those who could not (mean 19.2 vs. 18.8, respectively). Moreover, the pre- to post-curriculum checklist scores did not change significantly for those 8 participants, with a mean of 17.7 ± 2 vs. 19.2 ± 1.75 at the beginning and end of the course, respectively (p=0.07). Furthermore, 95% of the participants reported subjective improvement in their skills, and all participants answered positively when asked whether simulation laboratories should be included in their residency curricula.

Table 1
OSATS-Small Bowel Anastomosis.


Discussion

Traditional surgical training
Dr. William S. Halsted, chief of surgery at Johns Hopkins Hospital (1892-1922) and considered one of the most influential and innovative surgeons in American medical history, borrowed heavily from the German system of training, which emphasized the integration of basic sciences with practical teaching by full-time teachers, and established a residency training concept (the Halstedian concept) that spread through the USA and formed the basis of the current surgical training system in the modern era of surgery. This contribution to the training of surgeons has been referred to as Dr. Halsted's greatest legacy [9].
Dr. Halsted introduced an apprenticeship model which placed a heavy emphasis on learning the science of surgery and related disciplines while simultaneously immersing trainees in a supervised clinical setting with increasing levels of responsibility [23]. This model, with its classic doctrine of "see one, do one, teach one" and subjective assessment by a mentor, has been the hallmark of residency training in Europe and North America.
With the evolving changes in our current health care system, this Halstedian model has been increasingly criticized for being time dependent, resulting in prolonged surgical training in order to gain a sufficient level of operative exposure [4,24]. Moreover, the operating room no longer serves as the ideal atmosphere for surgical training of junior trainees, and ethical issues about teaching on live patients have been raised [25].
The incidence of complications increased following the introduction of laparoscopic surgery during the 1990s; the implications that surgical training could have for patient safety thus became apparent to the surgical community. Costs of adverse events were also an issue: complications related to surgery can triple the length of stay and increase costs by more than 600% [26]. This increased the awareness that teaching new skills should take place in a risk-free environment, which fueled interest in pursuing simulation-based training [27,28].
The growing pressure on operating rooms, restricted duty hours (48 h/week in Europe vs. 80 h/week in North America), and the increasing complexity of operations have widened the gap in resident-patient exposure and limited the time spent teaching residents. Residency training has come to depend on sheer volume of exposure rather than specifically designed curricula, making reliance on this approach to teaching technical skills questionable and increasing interest in simulation skills laboratories, which aim to train residents in a risk-free environment that allows trainees to progress in the face of errors and learn their consequences [8,29,30]. Furthermore, minimally invasive procedures have made it more difficult to acquire adequate experience performing traditional open operations. For example, open common bile duct explorations have become increasingly rare, which has led to a dramatic rise in the incidence of technical complications [31,32].
Skill acquisition and deliberate practice
Reznick and colleagues [10] have suggested that the earlier stages of teaching technical skills should take place outside the operating room and that proficiency-based surgical training, rather than years served, should become the standard. This proposal is based on Fitts and Posner's three-stage theory of motor skill acquisition, widely accepted in the surgical literature. This theory explains how the learner passes through three stages in developing a new motor skill until the final, autonomous stage is achieved, in which the trainee no longer needs to think about how to execute the particular task and can concentrate on other aspects of the operation [33]. The association between operative volume and clinical outcomes supports the hypothesis that practice is an important determinant of outcome [34]. Nevertheless, operative volume is not the only determinant of skill level among surgeons, since performance has been shown to vary between surgeons working in high-volume centers [10].
Proficiency increases with deliberate practice, which requires a defined task and involves distributed practice along with coaching and immediate feedback on performance [13]. The limited opportunities for deliberate practice in the current training model, along with patient safety issues, have led to increased interest in simulation skills laboratories with formal curricula to teach surgical skills [10,35].
Responding to the evolving challenges in surgical training, the Residency Review Committee (RRC) for graduate medical education in the USA has required since 2008 that all residency training programs provide surgical residents with access to a surgical skills laboratory. Moreover, these facilities must address acquisition and maintenance of skills with a competency-based method of evaluation [36]. Furthermore, the ACS and APDS developed a national skills curriculum in 2007 that is web-based, free of charge, and uses proven methods for training, with an emphasis on distributed, deliberate, and structured practice using performance-based end points [37]. The ACS and the Society of American Gastrointestinal and Endoscopic Surgeons launched the Fundamentals of Laparoscopic Surgery (FLS), which represents the first validated simulation module to be standardized and is now required for surgeons seeking board certification in general surgery in the USA [36,38]. In a recent survey distributed to all residency programs in the USA, 99% of the responders (81 programs) had a skills or simulation laboratory [36].
Simulation based learning: Evidence
To date, numerous studies document better performance of trainees, using various assessment tools, after participation in simulation skills curricula with high- or low-fidelity models. Assessment of laparoscopic skills dominates the literature to date, owing to the growing interest in performing surgical procedures via a minimally invasive approach. This is also attributed to the fact that laparoscopic skills are easier to evaluate in a simulated setting than open surgical skills [4]. However, there is still a lack of studies showing improved performance of open surgeries in the OR following curricula in laboratories providing training in open surgical skills.
Griswold et al. [39] proposed a system in which simulation outcomes are measured in the literature (T1, T2, T3, T-value). At the T1 research level, simulation outcomes are measured in a laboratory setting. At the T2 level, transfer of skills acquired from simulation training is measured by clinical performance outcomes. T3-level studies assess patient safety. Finally, T-value studies measure the cost-saving benefits of simulation training. This study focuses primarily on the T1 and T2 levels to demonstrate the value of simulation.
T1 studies: Simulation outcomes: In a study by Olson et al. [40], a structured simulation-based curriculum including bowel anastomosis, skin closure, and laparotomy opening and closure, using OSATS for assessment, was shown to be effective for first-year surgical residents. The inter-rater reliability of OSATS scores was moderate, with a correlation coefficient of 0.67. The study agreed with Reznick's view that simulation laboratories should be the place to train beginners before real experience with live patients in the operating room.
Chipman and Schmitz [22] designed a simulation curriculum to teach basic skills, such as suturing and excision of skin lesions, to first-year surgical residents and used OSATS as an evaluation tool. Construct validity for the OSATS tool was measured by comparing the performance of first-year residents to higher-level residents on the same tasks. This study demonstrated a statistically significant improvement in basic surgical skills in first-year residents as measured by OSATS, which was comparable to the performance of their more senior colleagues by the end of the course.
T2 studies: Clinical performance outcomes: Anastakis et al. [25] compared the performance of surgical residents on six surgical tasks using cadaver models. Residents were divided into three groups prior to the assessment: one group received training on bench models, the second group on cadavers, and the third group received no training other than learning from a prepared text. This study demonstrated better, and mutually equivalent, performance for the cadaver and bench-model groups compared to the text-learning group. The study concluded that simulated training on bench models can reinforce resident learning and may be transferable to the operating room, as demonstrated by better performance on cadaver models.
Scott et al. [41] demonstrated significantly better performance of laparoscopic cholecystectomy in the operating room for a group of residents who received daily training for 10 days on a laparoscopic video trainer, in comparison to a control group which received no additional training.
Is simulation cost-effective? Berg et al. [42] described a method of developing a cost-effective surgical skills laboratory curriculum to train surgical residents in open and laparoscopic surgical skills. In this study, bench models, box trainers, and animate models were used, and the costs were estimated to be as low as $982/year/resident. Simulation could be cost-effective when compared to the cost of training in the OR, estimated in one study to be as high as $47,970 per graduating resident [6]. A study by Babineau and colleagues [43] documented an 8- to 44-minute increase in operative time for resident training cases and emphasized the tremendous opportunity costs in faculty time.
Evaluation of surgical skills: OSATS
Three general categories have been identified as a framework for assessing surgical quality [44]:
1. Cognitive/clinical skills,
2. Technical skills, and
3. Social/interactive skills.
Traditionally, trainees are assessed by examining the logbook and through supervisor feedback. This methodology depends mainly on faculty recollection of a resident's surgical performance in previous rotations and can be influenced by many factors, such as the personal character of the resident or of the assessing faculty, performance on the surgical floor, or the resident's personal interactions in previous rotations. It has been shown to lack reliability and validity, since performing a number of procedures does not ensure that the procedures have been done well. Additionally, it provides little information about the areas of technical skill that require special attention [3,45,46]. Holmboe and Hawkins [47] argued that successful completion of a certification examination is not an adequate measure of the overall clinical competence of physicians-in-training.
Many investigators have worked to create standardized methods of evaluating residents' skills outside the operating room. Observational assessment tools remain the instruments of choice for assessing surgical skills, and OSATS is the most commonly used tool in this category [4,16,38].
The Objective Structured Assessment of Technical Skills (OSATS) has proven to be an instrument of high validity and reliability for measuring improvement in performance in simulation skill curricula [16,17,45,48]. OSATS was developed by Martin and colleagues [16] for general surgery residents: trainees perform a number of standardized surgical tasks on simulation models under direct observation and are scored using two methods. The first is a task-specific checklist consisting of several specific technical skills required for the examined task. The second is a global rating form, which includes five to eight surgical behaviors, such as flow of operation, knowledge of instruments, and respect for tissues [10].
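The two-part scoring scheme described above can be illustrated with a short computation. The item names, behaviors, and 1-5 rating scale below are illustrative assumptions for the sketch, not the published instrument of Martin et al.:

```python
# Illustrative sketch of OSATS-style scoring (hypothetical items and
# scales, not the validated instrument of Martin and colleagues).

def checklist_score(items: dict) -> float:
    """Task-specific checklist: fraction of required steps performed."""
    return sum(items.values()) / len(items)

def global_rating_score(ratings: dict) -> float:
    """Global rating form: mean of 1-5 ratings across surgical behaviors."""
    return sum(ratings.values()) / len(ratings)

# Hypothetical assessment of one trainee on a bowel-anastomosis task
checklist = {
    "correct suture selection": True,
    "appropriate bite spacing": True,
    "tension-free anastomosis": False,
    "patency check": True,
}
global_ratings = {
    "respect for tissue": 4,
    "flow of operation": 3,
    "knowledge of instruments": 5,
}

print(f"Checklist: {checklist_score(checklist):.2f}")            # 0.75
print(f"Global rating: {global_rating_score(global_ratings):.2f}")  # 4.00
```

Keeping the two scores separate, as OSATS does, lets examiners distinguish whether a trainee omitted required steps or performed them with poor technique.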
OSATS can be instrumental not only in assessing learners but also in evaluating a specific curriculum. Some programs have used it to assess residents on an annual basis and compare them with their peers, helping to identify deficiencies in a resident's performance and thereby prompting corrective action early in the training years [49]. Other institutions have gone further, using surgical simulators in the selection of candidates. For instance, Irish training programs have integrated simulator-based surgical skills assessment to screen applicants for higher surgical specialties [4], and on the day of the interview, applicants for the vascular fellowship at Stanford University are assessed while performing a renal artery angioplasty and stent insertion on a simulator [35].
Similar to pilots, who must be assessed on a regular basis, regular competence-based assessment of surgeons has been suggested, with potential use of simulators in the credentialing process [49]. However, there is still a lack of studies demonstrating the implementation of simulators in surgical skills assessment for credentialing [4].

Discussion of Results

The assessment of surgical skills in the OR, especially for open surgery, has been challenging owing to the variability of patients and of the degree to which a resident can be allowed to operate alone. It has been shown that more information about trainees' performance can be gathered when they act as the primary surgeon [49]. Simulation skill laboratories provide the opportunity to break an operation down into key steps and therefore to assess each step more precisely. In our study, the trainees were allowed to act as the primary surgeon, which included unguided selection of instruments and sutures and directing the first assistant. A significant improvement in trainees' skills after participating in various simulation modules of open surgical techniques was demonstrated: using the OSATS procedure-specific checklist, participants were shown to be more knowledgeable and technically proficient in hand-sewn bowel anastomosis.
The results of this study are consistent with those of other investigators who have evaluated the effectiveness of training surgical residents on open surgical simulation models [22,40,50]. Of note, the time to complete tasks was not used in the assessment of participants, as time has been found in previous studies to be a poor surrogate for ability [51].
Reliability of an assessment tool is a measure of the consistency and replicability of an exam administered to the same subject on two occasions, provided that no intervening changes have taken place. Values close to 1.0 indicate higher reliability. Inter-rater reliability represents the level of agreement between examiners for each participant [52].
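As an illustration of how such an agreement index is computed, Cohen's kappa corrects the observed agreement between two raters for agreement expected by chance. The binary checklist scores below are hypothetical, and the study does not state which specific index it used; this is only a minimal sketch of the general principle:

```python
# Minimal sketch: Cohen's kappa for two raters scoring binary checklist
# items (1 = step performed correctly, 0 = not). Scores are hypothetical.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # Observed proportion of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies
    p_a1 = sum(rater_a) / n
    p_b1 = sum(rater_b) / n
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (observed - expected) / (1 - expected)

rater_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
rater_b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0]
print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")  # kappa = 0.78
```

Values above roughly 0.70, as reported in this study, are conventionally taken to indicate acceptable agreement between examiners.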
In this study, the reliability indices and inter-rater reliability were greater than 0.70 for the overall checklist. Our study thus showed acceptable inter-rater reliability, comparable to other studies that used OSATS to evaluate the improvement of surgical residents [40,49].
Unlike in other studies, construct validity, measured as the correlation of surgical skills with the residents' level of training, only approached significance in our study (p=0.07). This can be attributed to the fact that the participants in our surgical skills laboratory were residents or board-certified surgeons from different educational institutions, with different levels of experience and operative exposure, which complicates comparison with studies that evaluated residents trained in a single educational program [49]. The inability to show significant construct validity is consistent with the findings of Faulkner et al. [48]; whether this reflects a failing of the OSATS or of the raters' rankings is not clear.
The results of this study show significant variability in the quality of resident training across surgical education programs in Germany. Our survey showed that 66% of the participants reported performing ≤10 bowel anastomoses since the beginning of their surgical training, although the majority were in their 4th or 5th year of residency. Additionally, the scores of all participants in practice-I did not correlate with the reported number of in-training bowel anastomoses. Moreover, only 8 participants felt confident performing a bowel anastomosis before the beginning of the course, and their checklist scores did not correlate with their training level.
The degree of improvement from pre- to post-curriculum was not predicted by training level or by the reported number of in-training bowel anastomoses. Although statistical significance was reached in only six checklist items, the improvement in the overall checklist was significant.
Of note, three participants in this study were specialists or senior surgeons. Nevertheless, the significance of the pre- to post-curriculum improvement did not change when these 3 participants were excluded from the data analysis, ruling out this potential source of bias.
Finally, participants who completed the curriculum in our skills laboratory felt more confident and knowledgeable about the procedures and the technical skills required. All participants agreed with the recommendation to integrate simulation laboratories into surgical education.

Key Points and Conclusion

Surgical simulators are increasingly becoming invaluable instruments for training and technical skills assessment, rather than merely attracting attention at surgical equipment exhibits [44]. They allow residents to progress at their own pace in a low-stress, controlled, and safe environment while providing the opportunity for immediate feedback. Simulation laboratories are places where residents can repeat key steps of procedures and thereby avoid potential harm to patients [42].
As reported in the bulletin of the Accreditation Council for Graduate Medical Education: “Simulation enhances both safety and predictability; and it will be part of the new system of graduate medical education. Every patient deserves a competent physician every time. Every resident deserves competent teachers and an excellent learning environment; Simulation serves both of these core principles” [53].
Reviewing the literature to date, little information is available about the role of surgical skill simulators in surgical education in Germany beyond their sporadic use in private institutes. Training in Germany remains Halstedian, conducted mostly through mentoring. Furthermore, assessment of residents is predominantly subjective, based mainly on logbooks and yearly reports, and, mainly because of financial constraints, few changes have been made to cope with the evolving challenges of residency training. Our surgical community needs to recognize these challenges and begin investigating whether our current training model is producing technically excellent surgeons.
In conclusion, this study demonstrates the successful implementation of a surgical skills laboratory with a predefined curriculum, held on a yearly basis since 2005. The results are consistent with other published studies demonstrating the effectiveness of simulation skills laboratories. With growing pressure on our operating rooms, we believe that simulation training is an excellent alternative model to fill the deficiencies in our current training model and an effective way to examine the technical skills of our residents.

Limitations of the Study

Our study is limited by the small number of participants, by the assessment of only a single task from the various tasks predefined in our curriculum, and by the fact that the participating surgical residents came from different training programs and had variable previously acquired skills. Nonetheless, we believe that we have clearly demonstrated the value of a simulation-based curriculum for teaching visceral anastomosis techniques.


  1. Al-Elq AH. Simulation-based medical teaching and learning. J Family Community Med. 2010;17(1):35-40.
  2. Cooper JB, Taqueti VR. A brief history of the development of mannequin simulators for clinical education and training. Postgrad Med J. 2008;84(997):563-70.
  3. Reznick RK. Teaching and testing technical skills. Am Surg. 1993;165(3):358-61.
  4. Shaharan S, Neary P. Evaluation of surgical training in the era of simulation. World J Gastrointest Endosc. 2014;6(9):436-47.
  5. Kohn LT, Corrigan JM, Donaldson MS. To Err is Human: Building a Safer Health System. Washington (DC): National Academies Press (US); 2000.
  6. Bridges M, Diamond DL. The financial impact of teaching surgical residents in the operating room. Am J Surg. 1999;177(1):28-32.
  7. European Union: Organisation of working time.
  8. Pickersgill T. The European working time directive for doctors in training. BMJ. 2001;323(7324):1266.
  9. Kerr B, O'Leary JP. The training of the surgeon: Dr. Halsted's greatest legacy. Am Surg. 1999;65(11):1101-2.
  10. Reznick RK, MacRae H. Teaching surgical skills-changes in the wind. N Engl J Med. 2006;355(25):2664-9.
  11. Darzi A, Smith S, Taffinder N. Assessing operative skill. Needs to become more objective. BMJ. 1999;318(7188):887-8.
  12. Paisley MAM, Baldwin PJ, Paterson-Brown S. Validity of surgical simulation for the assessment of operative skill. BJS. 2001;88(11):1525-1532.
  13. Ericsson KA, Nandagopal K, Roring RW. Toward a science of exceptional achievement: attaining superior performance through deliberate practice. Ann N Y Acad Sci. 2009;1172:199-217.
  14. Woodhouse J. Strategies for Healthcare Education: How to Teach in the 21st Century. Abingdon: Radcliffe; 2007.
  15. Sturm L, Maddern G, Windsor J, Cregan P, Hewitt P, Cosman P. Surgical simulation for training: skills transfer to the operating room. ASERNIP-S. Adelaide, South Australia. 2007.
  16. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273-8.
  17. Lossing AG, Hatswell EM, Gilas T, Reznick RK, Smith LC. A technical-skills course for 1st-year residents in general surgery: a descriptive study. Can J Surg. 1992;35(5):536-40.
  18. Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative “Bench station” examination. Am J Surg. 1997;173(3):226-30.
  19. Dawson SL. A critical approach to medical simulation. Bull Am Coll Surg. 2002;87(11):12-8.
  20. Sachdeva AK, Bell RH, Britt LD, Tarpley JL, Blair PG, Tarpley MJ. National efforts to reform residency education in surgery. Acad Med. 2007;82(12):1200-10.
  21. Dent JA. Current trends and future implications in the developing role of clinical skills centres. Med Teach. 2001;23(5):483-9.
  22. Chipman JG, Schmitz CC. Using objective structured assessment of technical skills to evaluate a basic skills simulation curriculum for first-year surgical residents. J Am Coll Surg. 2009;209(3):364-70.
  23. Tsuda S, Scott D, Doyle J, Jones DB. Surgical skills training and simulation. Curr Probl Surg. 2009;46(4):271-370.
  24. Velmahos GC, Toutouzas KG, Sillin LF, Chan L, Clark RE, Theodorou D, et al. Cognitive task analysis for teaching technical skills in an inanimate surgical skills laboratory. Am J Surg. 2004;187(1):114-9.
  25. Anastakis DJ, Regehr G, Reznick RK, Cusimano M, Murnaghan J, Brown M, et al. Assessment of technical skills transfer from the bench training model to the human model. Am J Surg. 1999;177(2):167-70.
  26. Dimick JB, Chen SL, Taheri PA, Henderson WG, Khuri SF, Campbell DA. Hospital costs associated with surgical complications. A report from the private-sector National Surgical Quality Improvement Program. J Am Coll Surg. 2004;199(4):531-7.
  27. Dent TL. Training, credentialing, and evaluation in laparoscopic surgery. Surg Clin North Am. 1992;72(5):1003-11.
  28. Moore MJ, Bennett CL. The learning curve for laparoscopic cholecystectomy. The Southern Surgeons Club. Am J Surg. 1995;170(1):55-9.
  29. Park A, Witzke D, Donnelly M. Ongoing deficits in resident training for minimally invasive surgery. J Gastrointest Surg. 2002;6(3):501-9.
  30. Rattner DW, Apelgren KN, Eubanks WS. The need for training opportunities in advanced laparoscopic surgery. Surg Endosc. 2001;15(10):1066-70.
  31. Schulman CI, Levi J, Sleeman D, Dunkin B, Irvin G, Levi D, et al. Are we training our residents to perform open gall bladder and common bile duct operations? J Surg Res. 2007;142(2):246-9.
  32. Livingston EH, Rege RV. Technical complications are rising as common duct exploration is becoming rare. J Am Coll Surg. 2005;201(3):426-33.
  33. Fitts PM, Posner MI. Human performance. 1967.
  34. Halm EA, Lee C, Chassi MR. Is volume related to outcome in health care? A systematic review and methodologic critique of the literature. Ann Intern Med. 2002;137(6):511-20.
  35. Bath J, Lawrence P. Why we need open simulation to train surgeons in an era of work-hour restrictions. Vascular. 2011;19(4):175-7.
  36. Varban OA, Ardestani A, Peyre S, Smink DS. Assessing the effectiveness of surgical skills laboratories: a national survey. Simul Healthc. 2013;8(2):91-7.
  37. Mittal MK, Dumon KR, Edelson PK, Acero NM, Hashimoto D, Danzer E, et al. Successful implementation of the American College of Surgeons/Association of Program Directors in Surgery surgical skills curriculum via a 4-week consecutive simulation rotation. Simul Healthc. 2012;7(3):147-54.
  38. Fried GM, Feldman LS, Vassiliou MC, Fraser SA, Stanbridge D, Ghitulescu G, et al. Proving the value of simulation in laparoscopic surgery. Ann Surg. 2004;240(3):518-25.
  39. Griswold S, Ponnuru S, Nishisaki A, Szyld D, Davenport M, Deutsch ES, et al. The emerging role of simulation education to achieve patient safety. Translating deliberate practice and debriefing to save lives. Pediatr Clin North Am. 2012;59(6):1329-40.
  40. Olson TP, Becker YT, McDonald R, Gould J. A simulation-based curriculum can be used to teach open intestinal anastomosis. J Surg Res. 2012;172(1):53-8.
  41. Scott DJ, Bergen PC, Rege RV, Laycock R, Tesfay ST, Valentine RJ, et al. Laparoscopic training on bench models: better and more cost effective than operating room experience? J Am Coll Surg. 2000;191(3):272-83.
  42. Berg DA, Milner RE, Fisher CA, Goldberg AJ, Dempsey DT, Grewal H. A cost-effective approach to establishing a surgical skills laboratory. Surgery. 2007;142(5):712-21.
  43. Babineau TJ, Becker J, Gibbons G, Sentovich S, Hess D, Robertson S, et al. The "cost" of operative training for surgical residents. Arch Surg. 2004;139(4):366-9.
  44. Aucar JA, Groch NR, Troxel SA, Eubanks SW. A Review of Surgical Simulation with Attention to Validation Methodology. Surg Laparosc Endosc Percutan Tech. 2005;15(2):82-9.
  45. Ault G, Reznick R, MacRae H, Leadbetter W, DaRosa D, Joehl R, et al. Exporting a technical skills evaluation technology to other sites. Am J Surg. 2001;182(3):254-6.
  46. Wanzel KR, Ward M, Reznick RK. Teaching the surgical craft: From selection to certification. Curr Probl Surg. 2002;39(6):573-659.
  47. Holmboe ES, Hawkins RE. Methods for Evaluating the Clinical Competence of Residents in Internal Medicine: a Review. Ann Intern Med. 1998;129(1):42-8.
  48. Faulkner H, Regehr G, Martin J, Reznick R. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med. 1996;71(12):1363-5.
  49. Goff B, Mandel L, Lentz G, Vanblaricom A, Oelschlager AM, Lee D, et al. Assessment of resident surgical skills: is testing feasible? Am J Obstet Gynecol. 2005;192(4):1331-8.
  50. Jensen AR, Wright AS, McIntyre LK, Levy AE, Foy HM, Anastakis DJ, et al. Laboratory-based instruction for skin closure and bowel anastomosis for surgical residents. Arch Surg. 2008;143(9):852-8.
  51. Lentz GM, Mandel LS, Lee D, Gardella C, Melville J, Goff BA. Testing surgical skills of obstetric and gynecologic residents in a bench laboratory setting. Validity and reliability. Am J Obstet Gynecol. 2001;184(7):1462-8.
  52. Pandey VA, Wolfe JHN, Lindahl AK, Rauwerda JA, Bergqvist D. Validity of an exam assessment in surgical skill: EBSQ-VASC pilot study. Eur J Vasc Endovasc Surg. 2004;27(4):341-8.
  53. Leach DC. Simulation: it’s about respect. ACGME Bulletin. 2005;2-3.