Original Investigation

Evaluation of Otolaryngology Residency Program Websites

Peter F. Svider, MD1; Amar Gupta, MD1; Andrew P. Johnson, MD1; Giancarlo Zuliani, MD1,2; Mahdi A. Shkoukani, MD1,2; Jean Anderson Eloy, MD3,4,5; Adam J. Folbe, MD1
Author Affiliations
1Department of Otolaryngology–Head and Neck Surgery, Wayne State University School of Medicine, Detroit, Michigan
2Section of Otolaryngology, Department of Surgery, John D. Dingell VA Medical Center, Detroit, Michigan
3Department of Otolaryngology–Head and Neck Surgery, Rutgers New Jersey Medical School, Newark
4Center for Skull Base and Pituitary Surgery, Neurological Institute of New Jersey, Rutgers New Jersey Medical School, Newark
5Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark
JAMA Otolaryngol Head Neck Surg. 2014;140(10):956-960. doi:10.1001/jamaoto.2014.1714.

Importance  Prior to applying or interviewing, most prospective applicants turn to the Internet when evaluating residency programs, making maintenance of a comprehensive website critical. While certain “intangibles” such as reputation may not be communicated effectively online, residency websites are invaluable for conveying other aspects of a program. Prior analyses have reported that certain criteria such as research experience and didactics are important considerations for applicants.

Objective  To evaluate the comprehensiveness of otolaryngology residency websites.

Design and Participants  Review of otolaryngology residency program websites. Websites of 99 civilian residency programs were searched for the presence of 23 criteria.

Main Outcomes and Measures  Presence of 23 criteria covering the application process, incentives, instruction, research, clinical training, and other information.

Results  Only 5 programs contained at least three-quarters of the criteria analyzed; on average, programs reported less than 50% of the information sought. Among the 99 residency program websites, descriptions of the following criteria were noted: comprehensive faculty listing (88%), didactics (80%), contact e-mail (77%), current residents (74%), description of facilities (70%), intern schedule (70%), research requirements (69%), otolaryngology rotation schedule (64%), other courses (61%), ERAS (Electronic Residency Application Service) link (55%), year-to-year responsibility progression (47%), call schedule (40%), active/past research projects (37%), area information (34%), message from the program director (33%) or chair (23%), selection criteria (30%), salary (directly on site) (23%), surgical statistics (18%), parking (9%), and meal allowance (7%). The mean (SD) percentage of factors present in the “clinical training” category was 55% (23%), significantly higher than the mean (SD) percentage of factors in the “incentives” category (19% [11%]; P = .01). The proportion of overall criteria present on websites did not differ when programs were organized by region (range, 42%-49%). Sites for “large” programs (≥3 residents per year) were more comprehensive than those for “small” programs (49% vs 42%; P = .04).

Conclusions and Relevance  While further survey of prospective applicants would be invaluable in determining which factors are of greatest interest, many residency websites appear to be inadequately comprehensive. Despite the relative comprehensiveness of criteria relevant to clinical training when compared with other aspects of websites such as incentives, several crucial aspects of training are still not addressed in many sites.

Prior to applying or interviewing, most prospective applicants turn to the Internet when evaluating residency programs, making maintenance of a comprehensive site critical. Although there have been no surveys specific to otolaryngology applicants, myriad studies questioning applicants for other residencies have confirmed this trend, while others have identified numerous areas for improvement.1-7 Furthermore, prior analyses have indicated that certain criteria such as comprehensiveness of subspecialty experience, research experience, and didactics may be important considerations for otolaryngology applicants.8,9

Because securing an otolaryngology residency position may be a competitive process,10 applicants oftentimes apply to an excessive number of programs, particularly if it is difficult to evaluate differences among programs. Addressing shortcomings in program websites may have significant implications for the residency recruitment process because it may allow applicants to better ascertain which programs may be a better “fit” and consistent with their career aspirations. Our objective was to evaluate the comprehensiveness of otolaryngology residency websites, with the goal of identifying areas for improvement.

This project evaluated publicly available websites containing information about programs written for prospective applicants to access freely online. Because this project did not encompass human subjects research, it did not require institutional review board approval per the standing policy of Rutgers New Jersey Medical School.

The American Medical Association FREIDA (Fellowship and Residency Electronic Interactive Database) was accessed for a listing of otolaryngology residency programs in September 2013. Each included website was searched by the authors (P.F.S., A.G., and A.P.J.) for the presence or absence of the criteria listed in the Box. Of note, information was considered to be addressed on a program site only if it was directly available; links to information not found on the otolaryngology departmental or residency website were not counted.

Box. Criteria Examined on the Websites of Otolaryngology Residency Programs
Application Process
  • Link to ERAS (Electronic Residency Application Service)

  • Contact e-mail

  • Selection criteria

Incentives
  • Salary (directly on website)a

  • Benefits (directly on website)a

  • Parking (directly on website)a

  • Meal allowance (directly on website)a

  • Information about surrounding area

Instruction
  • Specific extra courses (eg, temporal bone course, anatomy course, plastics course)

  • Description of didactics

Research
  • Research requirements

  • Active/past research projects in department

Clinical Training
  • Comprehensive faculty listings (including name, training, subspecialty)

  • Current resident listings

  • Facility descriptions

  • Intern year schedule (postgraduate year 1)

  • Rotation schedule (postgraduate years 2-5)

  • Surgical case/responsibility progression

  • Call schedule/requirements

  • Career placement (past residents)

  • Surgical statistics (either general overall numbers or specific cases)

Other
  • Message from the chairperson

  • Message from the program director

a Included only if the information was directly on the otolaryngology residency program/department website rather than on a general link leading to the graduate medical education office or another institution-specific website.

Programs were organized by geographic location according to US Census Bureau–designated regions (Midwest, Northeast, South, and West). In addition, programs were organized by the size of their residencies for further analysis: programs accepting 2 or fewer residents per year were considered “small” programs, while programs accepting at least 3 residents in any year were considered “large” programs. The number of the 23 examined factors addressed by individual websites was then compared across these groupings. In addition, the US News & World Report top-ranked hospitals for ear, nose, and throat were cross-referenced with the primary clinical training sites of residency programs, and the comprehensiveness of the websites of “ranked” programs (those in the top 50) was compared with that of nonranked programs.11
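
To make this tabulation concrete, the sketch below shows one way the presence/absence data could be organized and summarized by program size and region. It is an illustrative reconstruction only: the program names, resident counts, and criteria flags are hypothetical, only a subset of the 23 criteria is shown, and the study itself relied on manual review of each site rather than this code.

```python
# Hypothetical sketch: tabulating website criteria and grouping programs.
# All program names, flags, and counts below are illustrative, not study data.
import pandas as pd

CRITERIA = ["eras_link", "contact_email", "selection_criteria",
            "salary", "faculty_listing", "didactics"]  # subset of the 23 items in the Box

records = [
    {"program": "Program A", "residents_per_year": 4, "region": "Midwest",
     "eras_link": True, "contact_email": True, "selection_criteria": False,
     "salary": False, "faculty_listing": True, "didactics": True},
    {"program": "Program B", "residents_per_year": 2, "region": "South",
     "eras_link": False, "contact_email": True, "selection_criteria": False,
     "salary": False, "faculty_listing": True, "didactics": False},
]

df = pd.DataFrame(records)
df["items_present"] = df[CRITERIA].sum(axis=1)            # criteria found on the site
df["pct_present"] = df["items_present"] / len(CRITERIA)   # website comprehensiveness
df["size"] = df["residents_per_year"].apply(lambda n: "large" if n >= 3 else "small")

# Mean comprehensiveness by program size and by census region
print(df.groupby("size")["pct_present"].agg(["mean", "std"]))
print(df.groupby("region")["pct_present"].agg(["mean", "std"]))
```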

χ2 tests were used for comparisons of categorical variables, and Mann-Whitney and Kruskal-Wallis tests were used for comparisons of continuous variables, with the threshold for significance set at P < .05. SPSS version 20 software (IBM Corporation) was used for statistical analysis.
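
For readers who prefer open-source tools, a minimal sketch of the same comparisons using Python’s SciPy is shown below. The per-website item counts and the contingency table are invented for illustration; the study’s analysis was performed in SPSS, not with this code.

```python
# Hypothetical sketch of the statistical comparisons described above, using SciPy
# rather than SPSS. All counts are illustrative placeholders, not study data.
from scipy import stats

# Number of criteria (of 23) present on each website, split by program size
large_counts = [12, 14, 9, 11, 13, 10, 15, 8]
small_counts = [9, 7, 11, 8, 10, 6, 12]

# Mann-Whitney U test comparing "large" vs "small" programs
u_stat, p_size = stats.mannwhitneyu(large_counts, small_counts, alternative="two-sided")

# Kruskal-Wallis test across the four census regions
midwest, northeast, south, west = [10, 12, 9], [11, 8, 13], [9, 10, 11], [12, 7, 10]
h_stat, p_region = stats.kruskal(midwest, northeast, south, west)

# Chi-square test for a categorical comparison, eg, presence of a single
# criterion (rows) by program size (columns)
table = [[30, 25],   # criterion present: large, small
         [20, 24]]   # criterion absent:  large, small
chi2, p_cat, dof, expected = stats.chi2_contingency(table)

print(f"Mann-Whitney P = {p_size:.3f}; Kruskal-Wallis P = {p_region:.3f}; chi-square P = {p_cat:.3f}")
```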

Of 106 programs, 100 were civilian US programs, and 99 had available websites that were evaluated in this analysis. Individual program websites contained a mean (SD) of 10.6 (3.5) of the 23 factors sought (46%). Only 5 programs contained at least three-quarters of the criteria analyzed.

Most sites had comprehensive faculty listings, descriptions of didactics, and a contact e-mail address for interested applicants, while fewer than one-quarter of programs listed incentives (other than information about the surrounding area), contained a message from the chairperson, or reported surgical statistics (Table 1).

Table 1.  Presence of Criteria Sought on Otolaryngology Program Websites

When programs were organized by geographic region, no differences in the amount of information available on each website were noted (Table 2) (Kruskal-Wallis, P > .05). When programs were evaluated by size, “large” programs reported a greater mean (SD) number of the criteria sought (11.2 [3.2] items) than “small” programs (9.7 [3.8] items) (P = .04) (Table 2). When programs whose primary clinical training sites were ranked by US News & World Report11 were compared with unranked programs, there was no difference in website comprehensiveness (46.1% [11.9%] vs 46.0% [17.1%] of items) (P > .05).

Table 2.  Comprehensiveness of Websites Organized by Region and Program Sizea

To our knowledge, this is the first analysis examining the comprehensiveness of otolaryngology residency program websites. However, a major limitation is the subjectivity in deciding which criteria to include. Our hope was to encompass a variety of domains, including (but not limited to) issues of great clinical relevance and those related to quality of life (Box). Because the authorship of this article ranges from junior residents to senior faculty, we believe we have touched on a broad range of concerns that may be pertinent to applicants. Nonetheless, there is variability in the degree to which each of the factors studied plays a role in an applicant’s investigation of a program, and there may be issues other than those identified that are sought by applicants researching programs. Consequently, a further survey-based study would be complementary in discerning the degree to which our criteria are relevant.

Although there have been no previous studies to our knowledge examining the criteria otolaryngology residency applicants seek online, prior investigations focusing on the matching process offer guidance. The otolaryngology residency applicant survey in the article by Sharp et al9 revealed that comprehensiveness of subspecialties, resident satisfaction, location, and reputation were considered to be the most important criteria when ranking programs. The present analysis touches on several of these factors (Box). For example, most websites contained comprehensive faculty listings (Table 1), allowing an applicant the opportunity to judge the subspecialties represented among a program’s faculty.

“Resident satisfaction” is certainly a subjective factor that may be interpreted in a variety of ways and is not likely to be articulated effectively through a website. However, several issues we examined may be perceived to fall within the domain of resident satisfaction: the majority of program websites did not address incentives (Table 1), call schedules, or general case numbers. In terms of “location,” the quality of a website has no bearing on this factor; however, only approximately one-third of programs contained sections describing the surrounding area, and this represents another potential target for improvement. No differences in online comprehensiveness were noted when programs were organized by geographic region (Table 2).

Residency and departmental websites often represent an applicant’s initial exposure to a program. Analyses examining multiple specialties3,5-7 have noted that nearly all applicants examine residency websites for information regarding a program. Furthermore, most applicants note that websites influence application decisions5,7 and that an easily navigable site may be an important factor in deciding where to apply.3 One survey of emergency medicine applicants noted that residency curriculum, medical facilities, faculty information, and resident information were the most important materials sought on sites, while the aesthetic quality of sites was considered least important.5 Similar to our present analysis, an investigation of general surgery residency websites found numerous areas for improvement in providing relevant information.6

One set of criteria that few programs directly addressed involved benefits. Specifically, fewer than one-quarter of programs included salary, benefits (such as health insurance), parking, or meal allowances on their sites (Table 1). Furthermore, fewer than half of programs offered information about their location or delineated the frequency of call responsibilities. Although these are significant issues that affect quality of life, many applicants may feel uncomfortable asking programs about them before or during interviews. Applicants may believe that such questions could be perceived poorly by programs because several of these issues do not relate directly to clinical training. This potential concern emphasizes the importance of programs addressing these factors on their websites.

Numerous criteria critical to the clinical training experience were included in the majority of websites analyzed (Table 1). Although we did not weigh the relative importance of the various criteria explored, factors falling under “clinical training” are likely considered among the most important by applicants. The mean (SD) percentage of the 9 factors encompassed under “clinical training” was 55% (23%) (Table 1). This was significantly higher than the mean (SD) of the 5 factors considered under the “incentives” category (19% [11%]) (P = .01) (Table 1), which are arguably far less important.

Despite the relative comprehensiveness of criteria relevant to clinical training, several crucial aspects of training are not addressed in many sites. The progression of surgical and clinical responsibility from year to year was detailed in only 47% of websites. Perhaps surprisingly, only 28% of websites commented on the career paths of past trainees and current senior residents. Both of these pieces of information are arguably important in representing the quality of training and the clinical direction one takes after completing a particular program. Detailed information regarding practice setting and fellowship placement may be helpful in directing an applicant toward a program that he or she believes to be consistent with his or her career aspirations.

While no differences in comprehensiveness were noted when organizing by geographic region (Kruskal-Wallis test, P > .05) (Table 2), “large” residency programs contained more of the criteria sought than “small” programs (11.2 items vs 9.7 items; P = .04). Although this difference was statistically significant, it is unlikely that a difference of 1 to 2 items is meaningful. However, this discrepancy may suggest that larger programs may have more resources to direct toward online development. Nonetheless, both cohorts contained less than half of the factors analyzed (Box) (Table 2), emphasizing that most programs have considerable room for improvement in their online sites.

Aside from the aforementioned need for a survey-based analysis to complement our findings, another limitation is our inability to comment on other factors such as the aesthetic qualities and ease of use of these websites. We were interested in conducting a succinct analysis of objective issues, looking for the presence or absence of 23 factors (Box) in an effort to use a reproducible methodology, and we believed that examining stylistic aspects of websites would introduce a greater degree of subjectivity inconsistent with our objectives. Another limitation inherent to this topic is the inability of websites to articulate certain “intangibles,” such as program reputation and resident satisfaction. Despite these drawbacks, our hope is that this analysis serves as an important step toward improving online sites, allowing prospective applicants the opportunity to make more informed decisions regarding which programs would potentially represent a good fit.

Our findings have considerable implications for the residency recruitment process and suggest several areas for improvement, as most applicants turn to the Internet when researching programs. Although further survey of prospective applicants would be invaluable in determining which factors are of greatest interest when examining residency program websites, many online sites appear to be inadequately comprehensive. Several important considerations, such as comprehensive faculty listings, descriptions of rotating facilities, rotation schedules, and descriptions of didactics, are included on most sites. However, the majority of programs do not address crucial aspects of training such as progression of responsibilities and career paths of trainees, as well as quality-of-life measures including call responsibilities, salaries, and other benefits. Only 5 programs contained at least three-quarters of the criteria analyzed; on average, programs reported less than 50% of the information sought.

Submitted for Publication: May 10, 2014; final revision received June 24, 2014; accepted June 30, 2014.

Corresponding Author: Peter F. Svider, MD, Department of Otolaryngology–Head and Neck Surgery, Wayne State University School of Medicine, 4201 St Antoine, 5E-UHC, Detroit, MI 48201 (psvider@gmail.com).

Published Online: September 4, 2014. doi:10.1001/jamaoto.2014.1714.

Author Contributions: Drs Svider and Gupta had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: All authors.

Acquisition, analysis, or interpretation of data: Svider, Gupta.

Drafting of the manuscript: Svider, Gupta, Johnson, Shkoukani, Eloy.

Critical revision of the manuscript for important intellectual content: Svider, Gupta, Johnson, Zuliani, Eloy, Folbe.

Statistical analysis: Svider, Gupta, Johnson, Eloy.

Study supervision: Zuliani, Shkoukani, Eloy, Folbe.

Conflict of Interest Disclosures: None reported.

Previous Presentation: This study was presented at the annual Combined Otolaryngology Spring Meeting; May 16, 2014, Las Vegas, Nevada.

References

1. Mulcahey MK, Gosselin MM, Fadale PD. Evaluation of the content and accessibility of web sites for accredited orthopaedic sports medicine fellowships. J Bone Joint Surg Am. 2013;95(12):e85.
2. Chu LF, Young CA, Zamora AK, et al. Self-reported information needs of anesthesia residency applicants and analysis of applicant-related web sites resources at 131 United States training programs. Anesth Analg. 2011;112(2):430-439.
3. Mahler SA, Wagner MJ, Church A, Sokolosky M, Cline DM. Importance of residency program web sites to emergency medicine applicants. J Emerg Med. 2009;36(1):83-88.
4. Kumar A, Sigal Y, Wilson E. Web sites and pediatric residency training programs in the United States. Clin Pediatr (Phila). 2008;47(1):21-24.
5. Gaeta TJ, Birkhahn RH, Lamont D, Banga N, Bove JJ. Aspects of residency programs’ web sites important to student applicants. Acad Emerg Med. 2005;12(1):89-92.
6. Reilly EF, Leibrandt TJ, Zonno AJ, Simpson MC, Morris JB. General surgery residency program websites: usefulness and usability for resident applicants. Curr Surg. 2004;61(2):236-240.
7. Embi PJ, Desai S, Cooney TG. Use and utility of Web-based residency program information: a survey of residency applicants. J Med Internet Res. 2003;5(3):e22.
8. Puscas L, Sharp SR, Schwab B, Lee WT. Qualities of residency applicants: comparison of otolaryngology program criteria with applicant expectations. Arch Otolaryngol Head Neck Surg. 2012;138(1):10-14.
9. Sharp S, Puscas L, Schwab B, Lee WT. Comparison of applicant criteria and program expectations for choosing residency programs in the otolaryngology match. Otolaryngol Head Neck Surg. 2011;144(2):174-179.
10. National Resident Matching Program, Association of American Medical Colleges. Charting Outcomes in the Match: Characteristics of Applicants Who Matched to Their Preferred Specialty in the 2011 Main Residency Match. 4th ed. Washington, DC: National Resident Matching Program; 2011.
11. US News & World Report LP. Best hospitals 2014-15: overview and honor roll. http://health.usnews.com/health-news/best-hospitals/articles/2014/07/15/best-hospitals-2014-15-overview-and-honor-roll. Accessed August 10, 2014.
