Volume VI, Number 2 | August 2022

The Utility of Arthrobox Training in Arthroscopic Skills Development of Orthopedic Trainees

1. Timothy Beals DO – Sports Medicine Oregon
2. Patrick J. Fernicola MD – The Hughston Clinic, Columbus, GA 31909
3. Fred Flandry MD – The Hughston Clinic, Columbus, GA 31909

Purpose
To prospectively evaluate the effect of a low-cost, commercial triangulation system on resident trainee arthroscopic skills with a cadaveric model.

Methods
We randomized 12 orthopedic resident trainees (postgraduate years 1-4) to either a simulator training group (n=6) or a group with no simulation (n=6).  All subjects were pretested using a standardized arthroscopic rating system on cadaveric knees.  Training group subjects were given unlimited access to the triangulation simulator and instructed to perform training tasks weekly.  Control subjects received no access to simulators.  Twelve months later, all subjects completed a posttest on cadaveric knees.  A blinded orthopedic surgeon evaluated the arthroscopic videos using the Arthroscopic Surgical Skill Evaluation Tool (ASSET) score.

Results
Training group knee ASSET scores increased from 15.33 ± 5.95 to 25.75 ± 6.24 (p=0.498).  In the control group, knee ASSET scores increased from 15.0 ± 7.50 to 24.1 ± 8.16 (p=0.496).  There were no significant differences in mean posttest ASSET scores between the training group and the control group, nor were there significant differences in mean improvement in ASSET scores between the groups.  There was a significant correlation between the number of arthroscopic knee procedures performed during the study period and improvement in ASSET scores for the control group (p=0.046), but not a statistically significant correlation in the training group (p=0.876).

Conclusions
Access to a low-cost, commercial triangulation simulation system did not result in improved arthroscopic skills in orthopedic trainees, but may reduce the effects of differing caseloads.

Level of Evidence: Level I, randomized controlled trial

Keywords: arthroscopy, training, simulation, knee arthroscopy

Introduction 
Arthroscopic surgeries represent a large portion of orthopedic procedures and involve a unique skill set that is acquired over many repetitions. Traditional training is almost fully dependent on these repetitions occurring in the operating theater, which may be associated with high cost, longer operating times, and patient safety concerns1–3. In an effort to mitigate any negative effects of resident work-hour restrictions, and to allow for skill acquisition across varying trainee caseloads and experiences, there has been significant interest in developing arthroscopic simulation in orthopedic training.

There exists a large variety of arthroscopic trainers, ranging from low-fidelity models to highly sophisticated virtual reality systems. These simulators provide trainees with a safe setting in which to practice and acquire skills unconstrained by normal scheduled hours.  Despite enthusiasm for simulation development, no model or training system is widely agreed upon. Research on anatomic simulation models has shown transferability to the operative theater4–7, but these models may be cost prohibitive for general implementation8,9. Low-cost, portable trainers may increase trainee access to simulation training without decreasing the educational benefit10,11. The ArthroBox (Arthrex, Naples, FL) is a simple triangulation training system that has been evaluated in medical students12,13, but has not been evaluated in orthopedic trainees. The purpose of this study was to prospectively evaluate the effect of this at-home triangulation training system on orthopedic residents’ arthroscopic skills in a cadaveric model. We hypothesized that training would result in improved Arthroscopic Surgical Skill Evaluation Tool (ASSET) scores.

Methods
Institutional review board approval was obtained prior to the study.  Participants were enrolled from a single orthopedic surgery residency in July 2018.  The inclusion criterion was active orthopedic surgery residents between postgraduate year (PGY) 1 and 4.  Participation was voluntary and not related to any evaluation or rotation.  Twelve residents agreed to participate.  Residents were categorized as either senior (PGY 3 and 4) or junior (PGY 1 and 2).  The residents’ year of training and number of knee arthroscopy procedures logged into the Accreditation Council for Graduate Medical Education (ACGME) case-log system prior to study participation were recorded.  The study was a single-blinded, prospective, randomized controlled trial.  Subjects were randomized into either a simulator training group (n=6) or a control group with no access to the simulator (n=6).  No power analysis was conducted, as the sample was limited to the population of a single orthopedic surgery program.

Prior to the pretest, all subjects viewed a standardized intraoperative recording of a diagnostic knee arthroscopy and were given a standardized form with the task list for diagnostic arthroscopy.  The task list was also fixed to the arthroscopic monitor during the pretest procedure.  A trained observer provided assistance to each subject by placing varus and/or valgus stress on the knee.  Input from the observer was limited to the instructions provided on the task list.  Video of each subject performing cadaveric diagnostic knee arthroscopy was recorded through the arthroscope with no external views or audio recording.  Recording began with visualization of the patellofemoral joint and continued until the task list had been completed or 25 minutes had elapsed.  Recorded videos were assigned a random identification number for subsequent review and rating.

The ArthroBox Arthroscopic Triangulation Training System (Arthrex) was then provided to each training group subject along with a training protocol of 4 tasks.  Subjects were instructed to complete the training tasks on a weekly basis.  The control group received no simulator orientation or training access.

A training period of 12 months was used to minimize differences in residency rotations and in the amount of arthroscopy completed by study participants.  After the 12-month period had concluded, participants underwent posttesting on cadaveric knee specimens.  Participants were again shown the standardized diagnostic arthroscopy video and provided with the task list.  The posttest cadaveric knee arthroscopy was completed under a protocol and conditions identical to the pretest.

Pretest and posttest arthroscopic videos were each evaluated twice by a blinded orthopedic surgeon (T.R.B.) using the Arthroscopic Surgical Skill Evaluation Tool (ASSET).  The primary outcome was the posttest ASSET score.  The ASSET is a video-based assessment that uses intraoperative video to score 8 domains of arthroscopic ability.  It is a validated and frequently used method for evaluating performance of diagnostic arthroscopy14,15.

Statistical analysis was performed with Excel (Microsoft, Redmond, WA).  Independent-samples t-tests were used to compare means between groups.  Paired-samples t-tests were used to compare means within groups over time.  Spearman correlation was used to evaluate the relationship of knee arthroscopic procedures performed before, during, and upon completion of the study period to the ASSET pretest score, posttest score, and change in score.  Intra-rater reliability was measured using the intraclass correlation coefficient (ICC). Statistical significance was defined as p<0.05.
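
To make the comparisons above concrete, the following minimal sketch shows how the same tests could be run in Python with SciPy rather than Excel. The score arrays and case counts are hypothetical placeholders for illustration only; they are not the study's per-subject data.

```python
# Minimal sketch of the statistical tests described above (SciPy),
# assuming two groups of 6 subjects with pretest/posttest ASSET scores.
# All numbers below are hypothetical placeholders, not study data.
import numpy as np
from scipy import stats

train_pre  = np.array([10, 14, 18, 22, 12, 16])   # hypothetical training-group pretest scores
train_post = np.array([20, 25, 28, 33, 22, 26])   # hypothetical training-group posttest scores
ctrl_pre   = np.array([8, 15, 20, 25, 10, 12])    # hypothetical control-group pretest scores
ctrl_post  = np.array([18, 24, 30, 34, 20, 19])   # hypothetical control-group posttest scores

# Independent-samples t-test: between-group comparison of posttest scores
t_between, p_between = stats.ttest_ind(train_post, ctrl_post)

# Paired-samples t-test: within-group change from pretest to posttest
t_within, p_within = stats.ttest_rel(train_pre, train_post)

# Spearman correlation: cases performed during the study vs. change in score
cases_during = np.array([12, 30, 25, 60, 18, 40])  # hypothetical case counts
rho, p_rho = stats.spearmanr(cases_during, ctrl_post - ctrl_pre)

print(p_between, p_within, rho, p_rho)
```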

Results
Average pretest and posttest ASSET scores, as well as the number of knee arthroscopy cases logged before, during, and at the conclusion of the study, are displayed in Table 1.  The mean ASSET scores in the training group (Figure 1) increased from 15.33 ± 5.95 to 25.75 ± 6.24 (p=0.50).  The mean ASSET scores in the control group (Figure 2) increased from 15.00 ± 7.50 to 24.08 ± 8.16 (p=0.50).  No significant difference was found between the training and control groups in pretest score (p=0.92), posttest score (p=0.70), or change in score (p=0.74).

A sub-analysis was performed to correlate the knee arthroscopy case volume before, during, and at the conclusion of the study period with the pretest score, posttest score, and change in ASSET score (Table 2).  A statistically significant correlation was noted between knee arthroscopy procedures performed during the study and the change in ASSET score within the control group (r=0.82, p=0.046).  This was not observed in the training group (r=0.08, p=0.876).

Pre-study knee arthroscopy case volumes were 21.67 ± 27.18 and 89.17 ± 33.38 for junior and senior residents, respectively (p=0.63). Post-study knee arthroscopy case volumes were significantly different at 45.33 ± 24.53 and 133.67 ± 59.41 for junior and senior residents, respectively (p=0.007). Knee arthroscopy case volumes during the study period were 23.67 ± 3.68 and 44.50 ± 40.90 for junior and senior residents, respectively (p=0.24) (Table 3).

The intra-rater reliability of the ASSET scoring was excellent with an ICC=0.98.
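
As an illustration of how an intra-rater ICC of this kind can be computed, the sketch below uses the pingouin package and treats the blinded surgeon's two scoring passes of each video as the "raters"; the data frame values are hypothetical placeholders rather than the study's ratings.

```python
# Illustrative intra-rater ICC computation (pingouin); not the study's data.
# Each video was scored twice by the same blinded rater, so the two scoring
# passes are entered as "raters" and the videos as "targets".
import pandas as pd
import pingouin as pg

scores = pd.DataFrame({
    "video": ["v1", "v2", "v3", "v4"] * 2,           # hypothetical video IDs
    "scoring_pass": ["first"] * 4 + ["second"] * 4,  # first vs. second pass
    "asset": [14, 22, 18, 30, 15, 21, 19, 29],       # hypothetical ASSET totals
})

icc = pg.intraclass_corr(data=scores, targets="video",
                         raters="scoring_pass", ratings="asset")
print(icc[["Type", "ICC", "CI95%"]])  # table lists ICC1-ICC3k; model choice depends on design
```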

An a priori power analysis was not feasible due to a lack of established means and standard deviations of the ASSET score in orthopedic trainees. A post-hoc power analysis using data from our population indicated that 81 residents per study arm would be required to achieve 80% power to detect a 3-point difference in the change in ASSET scores between groups with an alpha error of less than 0.05.
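
For reference, a sample-size calculation of this kind can be sketched with statsmodels as below. The pooled standard deviation of the change in ASSET scores is an assumed placeholder (it is not reported here), so the output only approximates the 81 residents per arm cited above.

```python
# Sketch of a sample-size calculation for an independent-samples t-test
# (statsmodels). The SD of change scores is an assumed placeholder value.
from statsmodels.stats.power import TTestIndPower

diff = 3.0        # minimal detectable between-group difference in change of ASSET score
sd_change = 6.7   # assumed pooled SD of the change scores (placeholder)
effect_size = diff / sd_change  # Cohen's d

n_per_arm = TTestIndPower().solve_power(effect_size=effect_size,
                                        alpha=0.05, power=0.80, ratio=1.0)
print(round(n_per_arm))  # residents required per study arm
```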

Discussion
The primary outcome of this small pilot study demonstrated no identifiable difference in posttest ASSET scores between residents given the arthroscopic simulator and the control group, nor was there a statistically significant difference in the change in ASSET scores between groups. A statistically significant correlation existed between knee arthroscopy procedures performed during the study period and change in ASSET scores for the control group that was not present in the simulation group. This may indicate that access to the simulator diminished the dose effects of in vivo knee arthroscopic training. 

There is limited evidence in the literature of the transfer validity of non-anatomic simulator training to anatomic models. The preponderance of reports in the field have focused on construct validity16. Studies of transfer validity have frequently used high-fidelity virtual reality simulations in orthopedic trainees and have shown improved performance in shoulder arthroscopy6,17,18. Martin et al. similarly showed improvement in ankle arthroscopy in orthopedic trainees using a low-fidelity anatomic sawbones simulator19. Ledermann et al. further demonstrated improved performance in arthroscopic partial meniscectomy, with high transference to the operating room, when a low-fidelity anatomic sawbones trainer was used20.

Investigations into non-anatomic triangulation simulation have primarily involved novice participants12,13, with Redondo et al. demonstrating significantly increased ASSET scores in knee arthroscopy when ArthroBox training was utilized in 14 medical students without prior arthroscopy experience13.  The authors noted a likelihood of a ceiling effect in the training model, which may contribute to the finding of no significant difference in our study with orthopedic trainees.

There are several limitations to this investigation.  First, our sample size and power were limited by the number of residents in the program, with only 6 members per testing group, far fewer than the 81 per arm indicated by our post-hoc power analysis.  This allows for a significant beta error.  Therefore, this study may best serve as a pilot investigation for larger, multi-center investigations.  Significant variability in the amount of knee arthroscopy performed during the study period was seen amongst the subjects.  A full academic year was used for the study period to minimize these effects, but they could not be fully controlled for.  Further, time spent on the trainer may have been insufficient to elicit significant improvements.  Trainees were instructed to complete the four training tasks on a weekly basis; however, informal discussions with subjects after the conclusion of the study revealed variable and diminished adherence to the training regimen depending on other residency demands. This may be an obstacle to the practical effectiveness of the training system in a general trainee population, as educational interventions require both a benefit and “buy-in” from the residents themselves.

Conclusion 
Access to a low-cost, commercial triangulation simulation system did not result in improved arthroscopic skills in orthopedic trainees, but may reduce the effects of differing caseloads.

References

  1. Bridges M, Diamond DL. The financial impact of teaching surgical residents in the operating room. Am J Surg. 1999;177(1):28-32. 
  2. Farnworth LR, Lemay DE, Wooldridge T, et al. A Comparison of Operative Times in Arthroscopic ACL Reconstruction Between Orthopaedic Faculty and Residents: The Financial Impact of Orthopaedic Surgical Training in the Operating Room. Iowa Orthop J. 2001;21:31. 
  3. Baldwin P, Dodd M, Wrate R. Junior doctors making mistakes. Lancet. 1998;351(9105):804. 
  4. Howells NR, Gill HS, Carr AJ, Price AJ, Rees JL. Transferring simulated arthroscopic skills to the operating theatre: A randomized blinded study. J Bone Joint Surg Br. 2008;90(4):494-499. 
  5. Martin KD, Belmont PJ, Schoenfeld AJ, Todd M, Cameron KL, Owens BD. Arthroscopic basic task performance in shoulder simulator model correlates with similar task performance in cadavers. J Bone Jt Surg – Ser A. 2011;93(21):e127(1). 
  6. Waterman BR, Martin KD, Cameron KL, Owens BD, Belmont PJ. Simulation training improves surgical proficiency and safety during diagnostic shoulder arthroscopy performed by residents. Orthopedics. 2016;39(3):e479-e485. 
  7. Jentzsch T, Rahm S, Seifert B, Farei-Campagna J, Werner CML, Bouaicha S. Correlation Between Arthroscopy Simulator and Video Game Performance: A Cross-Sectional Study of 30 Volunteers Comparing 2- and 3-Dimensional Video Games. Arthrosc J Arthrosc Relat Surg. 2016;32(7):1328-1334. 
  8. Zendejas B, Wang AT, Brydges R, Hamstra SJ, Cook DA. Cost: The missing outcome in simulation-based medical education research: A systematic review. Surgery. 2013;153(2):160-176.
  9. Karam MD, Pedowitz RA, Natividad H, Murray J, Marsh JL. Current and future use of surgical skills training laboratories in orthopaedic resident education: A national survey. J Bone Jt Surg – Ser A. 2013;95(1). 
  10. Sandberg RP, Sherman NC, Latt LD, Hardy JC. Cigar Box Arthroscopy: A Randomized Controlled Trial Validates Nonanatomic Simulation Training of Novice Arthroscopy Skills. Arthrosc J Arthrosc Relat Surg. 2017;33(11):2015-2023.e3. 
  11. Colaco HB, Hughes K, Pearse E, Arnander M, Tennent D. Construct Validity, Assessment of the Learning Curve, and Experience of Using a Low-Cost Arthroscopic Surgical Simulator. J Surg Educ. 2017;74(1):47-54.
  12. Frank RM, Rego G, Grimaldi F, et al. Does Arthroscopic Simulation Training Improve Triangulation and Probing Skills? A Randomized Controlled Trial. J Surg Educ. 2019;76(4):1131-1138. 
  13. Redondo ML, Christian DR, Gowd AK, et al. The Effect of Triangulation Simulator Training on Arthroscopy Skills: A Prospective Randomized Controlled Trial. Arthrosc Sport Med Rehabil. 2020;2(2):e59-e70. 
  14. Koehler RJ, Amsdell S, Arendt EA, et al. The Arthroscopic Surgical Skill Evaluation Tool (ASSET). Am J Sports Med. 2013;41(6):1229-1237. 
  15. Koehler RJ, Nicandri GT. Using the arthroscopic surgery skill evaluation tool as a pass-fail examination. J Bone Joint Surg Am. 2013;95(23):e1871-6. 
  16. Rashed S, Ahrens PM, Maruthainar N, Garlick N, Saeed MZ. The Role of Arthroscopic Simulation in Teaching Surgical Skills: A Systematic Review of the Literature. JBJS Rev. 2018;6(9):e8. 
  17. Rebolledo BJ, Hammann-Scala J, Leali A, Ranawat AS. Arthroscopy Skills Development With a Surgical Simulator: A Comparative Study in Orthopaedic Surgery Residents. Am J Sports Med. 2015;43(6):1526-1529. 
  18. Dunn JC, Belmont PJ, Lanzi J, et al. Arthroscopic Shoulder Surgical Simulation Training Curriculum: Transfer Reliability and Maintenance of Skill Over Time. J Surg Educ. 2015;72(6):1118-1123. 
  19. Martin KD, Patterson D, Phisitkul P, Cameron KL, Femino J, Amendola A. Ankle Arthroscopy Simulation Improves Basic Skills, Anatomic Recognition, and Proficiency During Diagnostic Examination of Residents in Training. Foot Ankle Int. 2015;36(7):827-835. 
  20. Ledermann G, Rodrigo A, Besa P, Irarrázaval S. Orthopaedic Residents’ Transfer of Knee Arthroscopic Abilities from the Simulator to the Operating Room. J Am Acad Orthop Surg. 2020;28(5):194-199. 
Required Disclosures and Declaration

Copyright Information: No Copyright Information Added

IRB Approval Information: Yes

Disclosure Information: Arthrex, Inc. provided material support in the form of cadaveric specimens and ArthroBox training units.
