
Pivot Now! Lessons Learned from Moving Public Relations Campaigns Classes Online During the Pandemic in Spring 2020

Editorial Record: Original draft submitted October 6, 2020. Manuscript accepted for publication March 9, 2021. First published online December 2021.


Melanie Formentin, Ph.D.
Email: mformentinphd@gmail.com

Giselle A. Auger, Ph.D.
Associate Professor & Chair
Department of Communication
Rhode Island College
Providence, RI
Email: giselleauger@yahoo.com


This exploratory study examined how public relations professors adapted their PR campaigns courses to facilitate online learning in Spring 2020. Emphasis was placed on exploring the distinct or consistent challenges related to modifying coursework, managing student groups, and maintaining client relationships. The study also examined positive outcomes of moving online. Faculty teaching PR campaigns (N = 63) participated in a closed- and open-ended question survey exploring their experiences teaching the course. Results suggest that faculty felt compelled to change class components and experienced challenges related to individual student engagement (particularly in groups) and modifying specific components of students’ campaigns projects, but had fewer problems managing client relationships. Student access to technology and resources was the biggest barrier to success in campaigns courses. While faculty are embracing lessons learned through the quick shift online, the ability to successfully deliver PR campaigns courses online hinges on bridging digital divides.

Keywords: survey, COVID-19, online teaching, public relations campaigns

The best organizations are those that can adapt to the changing needs of their stakeholders. The same holds true when considering public relations education. However, few were prepared for the global challenges the emergent COVID-19 pandemic would create in early 2020.

In late January 2020, the World Health Organization (W.H.O.) declared a global health emergency as thousands of COVID-19 cases began to spread through Asia (Taylor, 2021). By mid-February cases began to rise across Europe, particularly in Italy; by February 29, the first death in the United States was reported. However, the United States was widely criticized for its response to the growing pandemic (Lewis, 2021). In addition to downplaying the severity of the virus, the Trump Administration leaned on the U.S. Centers for Disease Control and Prevention (CDC) to develop tests it was ill-equipped to produce and distribute. Even as testing availability expanded, the U.S. dealt with poor tracing and isolation procedures, inconsistent quarantine and mask-wearing policies, and a decentralized response that placed the power for handling the crisis in the hands of state and local officials. By March 15, the CDC recommended that there be no gatherings of more than 50 people in the U.S. (Taylor, 2021), but by then state governors and local officials had already started exploring regional guidelines for slowing the virus's spread. On March 19, California became the first state to issue stay-at-home orders, followed by dozens of additional states in the coming weeks (Wu et al., 2020).

As the crisis unfolded, educational institutions implemented contingency plans while waiting for guidance from federal and state officials. More than 1,300 colleges and universities across the U.S. shut down, canceling classes and moving instruction online (Smalley, 2021), often with less than two weeks’ notice. This created significant interruptions for students related to campus housing and dining, access to technology, resources for travel, and financial aid. In the midst of this upheaval, educators had to adapt in-progress courses for remote delivery. The move to online learning raised concerns about the quality of these courses, the uncertainty of the evolving situation, and the ability for students and faculty to manage the stress associated with the pandemic. For example, Gen Z adults (ages 18-23) were reported as experiencing significantly more stress than other age groups (American Psychological Association, 2020).

While faculty and students generally experienced the same challenges related to the quickly evolving pandemic, anecdotal evidence suggested that faculty teaching public relations campaigns (or similar capstone) courses seemed to experience different challenges than their academic counterparts. For example, specialized challenges appeared to emerge in these classes as students generally engage in collaborative, client-based or service-learning work. Both faculty and students had to be nimble while making decisions about continuing client relationships, identifying whether it was safe to conduct research, confirming whether students had access to the technology and programs needed to complete class assignments, and more. Based on this anecdotal evidence, the purpose of this study was to explore how professors adapted their courses, to identify unique or consistent challenges to that adaptation, and to identify potentially positive curricula changes that emerged from the experience.

Literature Review

Public relations campaigns courses provide distinct experiential learning opportunities designed to prepare students for internships and jobs. These classes are commonly taught in PR programs and often emphasize team-based and service- or client-driven learning opportunities. Because of the approaches normally used to teach campaigns courses, they are often taught face-to-face. As such, the shift to online learning in spring 2020 meant faculty had to convert their classes to remote delivery in far less time than is typically needed to develop quality online courses. The history and values of PR campaigns courses are evaluated before pedagogical approaches to group work and online learning are explored for context.

Public Relations Campaigns Course

The PR campaigns course has a long-standing history as part of excellent PR education. Even prior to Grunig and Hunt's (1984) seminal Managing Public Relations, which defined PR as a management function, scholars discussed the importance of the campaigns course. For example, Rings (1983) discussed a theory-based, team-centered course at Boston University that promoted PR as a management function rather than a mere technical function. More recently, Auger and Cho (2016) found that 56% of nearly 250 PR programs included the course, and 22% provided a similar practicum course.

The function of PR campaigns courses is often to prepare students for industry. Scholars have noted that "because it is considered the capstone course of PR education, the campaigns class has a multi-faceted obligation to its students" (Benigni et al., 2004, p. 259). Benefits of the campaigns course, which traditionally uses a team-based approach to building a communication campaign for real clients, include experiential learning outcomes such as managing group dynamics; professional decorum and presentation; establishing goals, objectives, and strategies based on research; and determining appropriate tactics and evaluative criteria. Students find value in this learning format, particularly by placing classroom material in context and providing depth of understanding to the concepts of audiences and tactics (Aldoory & Wrigley, 2000). However, challenges related to group-based dynamics suggest that students do not always find the experience of working with clients helpful to "learning about compromise, tolerance, or problem solving" (p. 56).

Outside of PR-specific courses, scholars have examined the concept of the campaigns course from the integrated marketing communication (Moody, 2012), health communication (Neuberger, 2017), and strategic communication perspectives (Anderson, 2018); notably, most of these retain the key characteristics of PR campaigns courses. This includes the use of real clients for whom students identify, research, and analyze a real issue or situation. To do this, they conduct secondary and primary research; create, outline, or execute a plan; then evaluate or indicate evaluative measures for that plan. While some programs use case studies rather than clients, using clients provides students with real-world experience not found through case analysis; arguably, “… students are not properly prepared unless they are thrust into a situation filled with problems and opportunities” (Benigni et al., 2004, p. 262). For example, results from a health communication campaigns class showed that students successfully translated classroom learning to practical application: “Many students could not even identify or define a health campaign at the start of the term. Yet, they end the semester with valuable knowledge and experience” (Neuberger, 2017, p. 147). 

Further, client perspectives show the benefits of campaigns classes. Rogers and Andrews (2016) found that nonprofit partners often lacked PR background, arguing that the need to educate community partners about PR expectations directly addresses the definition of best PR practices, creating opportunities for mutual benefit (Public Relations Society of America, 2020). This creates a multi-faceted approach to strengthening student experiences while highlighting the need to privilege client perspectives. Further, Kinnick (1999) highlighted that a key benefit to client organizations came in the form of saved expenditure and staff time. Still others have indicated the value of the campaign plan itself to the organization (Benigni et al., 2004) and the value of opportunities to reflect on and analyze their own programs (Aldoory & Wrigley, 2000). Moreover, while clients are infrequently part of the grading process, studies show that clients are generally satisfied with their partnerships, with many returning as clients for subsequent semesters or offering internships and other opportunities to students (Benigni et al., 2004).

Fostering Experiential Groupwork 

The benefits of teaching PR campaigns courses include creating an environment to practice professional, team-based strategies. In the mid-90s, Blumenfeld et al. (1996) described the power and shortcomings of peer learning, arguing that "results can be positive when close attention is paid to norms, tasks, the mix of participants and their skills, and methods to ensure accountability" (p. 40). Group norms require collaboration and the ability to discuss and compromise on issues; but cooperation is not guaranteed, and evidence suggests that "students often do not behave prosocially" (p. 38). This includes issues related to contributing to the workload and decision making, as well as issues related to interpersonal relationship behaviors. To counter such dynamics, Blumenfeld et al. (1996) recommend developing meaningful tasks, teaching the art of giving and seeking help, and creating opportunities for accountability. Here, collaboration is a key component of groupwork in the classroom, offering opportunities to build communal knowledge by sharing resources, skills, and insights. However, technological supports must be in place to facilitate this type of learning.

Although the large body of literature exploring the values of collaboration is not explored in depth here, it is worth noting how collaborative learning shapes and informs PR campaigns courses. Kayes et al. (2005) recognized the growing prevalence of teamwork in education and professional work environments and argued that negative team-based experiences can be overcome “when a team intentionally focuses on learning” (p. 331). This involves clearly identifying group characteristics related to purpose, membership, roles, context, process, and action taking (p. 330). Laal and Ghodsi (2012) also illustrated the social, psychological, and academic benefits of collaborative learning. In PR, scholars have examined the influence of groupwork and collaboration in the context of student-run agencies, highlighting the benefits of experiential learning and gaining professional skills (Bush, 2009; Bush et al., 2016; Bush & Miller, 2011). Here, the ability to work in teams is a key skill required in industry, and students in agency-style groups believed they gained numerous professional skills, including soft skills related to people, organizations, and communication (Bush et al., 2016). 

The Art of Distance Learning

Understanding the challenges faculty faced during the switch to online learning in Spring 2020 means appreciating the significant effort that normally goes into preparing online courses. Faculty must consider student engagement and course design strategies when adjusting courses for online delivery. And, as Moore (2014) discussed:

An understanding of effective instruction in online PR courses is necessary as the rising amount of non-resident “distance” students, the economic downturn, and university focus on decreasing costs, increasing revenues, and improving student access have led to an increase in online undergraduate courses offered online. (p. 283)

To begin, existing research highlights the need to consider different opportunities for student engagement. Student engagement occurs at cognitive, emotional, and behavioral levels (Jones, 2008) and is defined as "the student's psychological investment in and effort directed toward learning, understanding, or mastering the knowledge, skills, or crafts that academic work is intended to promote" (Newmann et al., 1992, as cited in Bolliger & Halupa, 2018, p. 3). Further, research suggests that students perceive online learning positively when there are high levels of engagement and low levels of transactional distance (Bolliger & Halupa, 2018). And while synchronous teaching strategies can lower perceptions of transactional distance, specific learner-to-learner, learner-to-instructor, and learner-to-content strategies can be used to increase engagement (Martin & Bolliger, 2018). Such strategies can include the use of icebreakers and collaborative work (learner-to-learner); regular communication and clear assignment instructions (learner-to-instructor); and structured discussion and "real-world projects" (Martin & Bolliger, 2018).

Despite these positive findings, research suggests that the time needed to design and teach online courses is often a barrier to converting face-to-face courses online (Keengwe & Kidd, 2010). Faculty must re-develop courses to bridge instructional design and course organization needs. They must set curriculum, create diverse content delivery and activities, build scaffolded learning opportunities and timelines for group work, and establish netiquette rules (Anderson, 2001). Moreover, scholars recognize that online teaching requires adopting new practices that may be difficult for some faculty members to embrace (Keengwe & Kidd, 2010). Significant responsibility is placed on faculty to learn about new modalities, balance pedagogy and technology, adjust teaching styles, increase communication with students, and recognize benefits and challenges of online learning (Keengwe & Kidd, 2010). 

Distance Learning in Public Relations. Prior to the pandemic there was already a need to respond to the “changing PR teaching environment” (Moore, 2014, p. 283); the pandemic seemingly forced this change on educators across disciplines. Understanding the benefits and challenges of online learning as studied in PR, and recognizing best practices in the context of online learning in general, provides insight into the situation faculty faced in Spring 2020. Moreover, it is worth considering the distinct challenges of converting experiential, group-based courses (such as PR campaigns) to online formats.

Moore (2014) provided one of the first studies examining the success of online courses in PR programs. Specifically, she found that student-student communication significantly impacted student success in courses more than student-instructor communication and interaction. As campaigns classes rely on groupwork, this suggests an important aspect of online learning that must be considered when developing such courses. Similarly, Smallwood and Brunner (2017) found that teams working collaboratively on scaffolded, realistic projects experienced better engagement and group success. Increased “interactions and engagement with course material, other students, instructors, and technology” (Smallwood & Brunner, 2017, p. 453) led to more positive student perceptions and outcomes. However, although students are often considered digital natives, that did not ensure comfort with using technology for classwork and class-based communication (Smallwood & Brunner, 2017). Such findings suggest opportunities to successfully convert campaigns classes online, but also highlight the need to consider barriers to student success. 

The onset of the pandemic stressed the boundaries of typical student and faculty experiences. While many instructors experienced this shift to online teaching in a two-week period, many students were adapting to online learning for the first time. However, PR education seems to naturally employ best engagement practices as students are often required to collaborate in groups, follow specific strategies to complete work (such as using the planning process to develop campaign plans), and produce projects that emphasize practical outcomes. In the context of the pandemic—and without the usual time needed to meet best online teaching practices—converting campaigns courses arguably provided a distinct challenge for PR faculty.

Research Questions

PR campaigns courses often emphasize experiential learning opportunities designed to build and support client-focused relationships while students work in teams, mimicking professional experiences. Because quality online courses generally require significant preparation, and because converting group-based experiential learning to an online format poses its own difficulties, this study aimed to explore the distinct challenges that faculty teaching campaigns-style courses may have faced during the switch to online learning. Based on the reviewed literature, this led to the following research questions:

RQ1: What consistent challenges did professors teaching PR campaigns courses face converting their classes to an online format?

RQ2: What consistent student group challenges were identified by PR professors because of the switch to online learning?

RQ3: How were relationships with PR campaigns clients impacted by the unexpected changes brought on by the pandemic?

RQ4: What positive course-related changes emerged from the experience of switching PR campaigns courses online?


Method

To answer the research questions and explore faculty experiences teaching PR campaigns courses at the beginning of the COVID-19 pandemic, a 45-question survey was distributed using Qualtrics. The survey was available from June 16 to July 1, 2020, to ensure that faculty would have their spring 2020 semester experiences top-of-mind.

Participant Recruitment

Participants were recruited using convenience and snowball sampling, which primarily occurred via social media channels for PR groups of major conferences, including the AEJMC, ICA, and NCA PR divisions. Participants were encouraged to share the survey with colleagues who also taught campaigns-style courses in spring. Of 74 survey responses, N = 63 usable responses were retained for analysis. Two participants did not meet the screening requirement of having taught a PR campaigns course in Spring 2020, and n = 9 responses were removed because the survey was abandoned at launch. Of the retained responses, n = 14 were partially completed, with completion rates ranging from 24% (n = 1) to 69% (n = 3); 10 participants (15.87%) completed between 49-69% of the survey.

Survey Design

The researchers launched this study because of their experiences teaching campaigns courses. As such, this exploratory survey was designed based on a mix of personal experience and knowledge about best practices in PR pedagogy. To understand faculty experiences, closed- and open-ended questions were designed to understand previous and current experiences teaching campaigns classes, adjustments made because of the pandemic, and outcomes of those adjustments. Participants were also asked about the number and types of classes taught and basic demographic information.

First, faculty were asked about adjustments made to their courses due to the pandemic. This included questions related to previous experience teaching online or hybrid courses, time available to convert classes online, strategies used to determine best formats for the class, and which platforms and tools were used to deliver the course online.

Next, respondents were asked to reflect on the nature and quality of their client relationships and how those may have changed because of the pandemic. This included understanding whether relationships changed and how, and what client-based factors may have influenced changes to the partnerships.

Because the switch to online learning was quite sudden, questions explored what course component adjustments faculty may have made to facilitate learning. This included exploring whether changes were made (a) to their classes in general, and (b) to the client-based projects, specifically.

As campaigns courses often emphasize groupwork, the need for flexibility, and a guide-on-the-side approach to teaching, faculty may have faced specific challenges related to converting their classes online. As such, questions explored challenges faced in the course, content-related issues, and group-related issues. Course-related challenges focused on understanding whether faculty experienced issues such as delivering course material, engaging with students, and communicating with clients. Content-related issues focused on specific challenges related to the campaigns projects such as conducting research, creating tactics, and delivering presentations. Finally, the switch to online courses may have influenced group dynamics, so items were designed to explore issues such as group-based communication, collaboration, conflict resolution, and social loafing.

Despite the emphasis on challenges, faculty may have had positive outcomes and identified strategies they would continue using in the future. As such, items were designed to explore positive changes related to course delivery styles, use of technology, and opportunities to connect with clients.

Participant Demographics and Experiences

Participants ranged from 31 to 73 years old (M = 47.85, SD = 10.96), and were primarily female (n = 35, 55.6%) and white (n = 42, 66.7%). Most participants were assistant or associate professors (n = 31, 49.2%), and taught at public universities (n = 34, 54%) with 30,000 students or fewer (n = 37, 75.5%). Additionally, participants primarily identified themselves as teaching at institutions with a balanced emphasis on research and teaching (n = 27, 42.9%). Table 1 provides a full picture of participant demographics.

Table 1

Participant Demographics

Prefer not to Identify          n = 3, 4.8%
Race
  Asian or Pacific Islander     n = 3, 4.8%
  Black or African American     n = 0, 0.0%
  Hispanic or Latino            n = 2, 3.2%
  Prefer not to Identify        n = 3, 4.6%
Academic Rank
  Lecturer or Instructor        n = 9, 14.3%
  Assistant Professor           n = 21, 33.3%
  Associate Professor           n = 10, 15.9%
  Full Professor                n = 7, 11.1%
  Prefer not to Identify        n = 1, 1.6%
Institution Type
  Private                       n = 16, 25.4%
Number of Students Enrolled
  0-5,000                       n = 9, 14.3%
Institution Emphasis
  Research                      n = 8, 12.7%

Faculty were also asked how many courses they taught, particularly during Spring 2020. Respondents taught between 1-4 sections of campaigns (M = 1.17, SD = 0.53) and between 1-7 total courses (M = 2.76, SD = 1.21) during the semester. Most participants (n = 55, 87.3%) taught one campaigns section in spring, and generally taught 3 (n = 25, 39.7%) or fewer classes overall. Additionally, per semester, participants taught between 1-5 courses (M = 2.75, SD = .92) on average, with most teaching 3 courses (n = 28, 44.5%). Almost all participants had previously taught campaigns prior to the pandemic (n = 56, 88.9%), but had varying prior experience teaching classes in different formats. More than half of participants had previously taught fully online classes (n = 34, 54%) and hybrid classes (n = 32, 50.8%); fewer participants had previously taught flipped classes (n = 25, 39.7%).


Results

This survey was designed to understand the specific challenges instructors faced while teaching campaigns courses during a pandemic. As this study is exploratory, we begin by outlining general findings, then answer the guiding research questions.

Course Adjustments

Because faculty were required to make changes on short notice, there was interest in understanding how much lead time instructors had to make the shift online. Table 2 shows how much advance notice faculty had to prepare for online delivery, how much time they had to prepare, and how much time students had to prepare. Although the number of days advance notice they received from their universities varied, most participants (n = 39, 61.9%) and their students (n = 41, 65.1%) had 7 or more days to prepare for online delivery. This may be attributed to the timing of university closings, which often coincided with spring breaks.

As course adjustments had to be made quickly, it seemed valuable to understand how faculty sought advice about online course formats. Participants somewhat agreed that they sought advice from colleagues (M = 5.21, SD = 1.72) and their department (M = 4.78, SD = 1.76) and were less likely to work with on-campus faculty development groups (M = 3.54, SD = 2.18) or get feedback from students (M = 3.29, SD = 2.18). Ultimately, faculty used a blend of asynchronous and synchronous delivery (n = 45, 71.4%) for their campaigns courses.

Table 2

Time to Convert

            Advance notice from university   Time faculty had to prepare   Time students had to prepare
0-2 days    n = 4, 6.3%                      n = 5, 7.9%                   n = 2, 3.2%
3-4 days    n = 15, 23.8%                    n = 9, 14.3%                  n = 13, 20.6%
5-6 days    n = 9, 14.3%                     n = 10, 15.9%                 n = 7, 11.1%
7-8 days    n = 17, 27%                      n = 17, 27%                   n = 19, 30.2%
9+ days     n = 18, 28.6%                    n = 22, 34.9%                 n = 22, 34.9%

Correlation analysis was used to explore the relationship between course-preparation experiences. Notably, a moderate, significant negative relationship showed that faculty who had no choice about which format to use were less likely to survey students (p < .001, r = -.446) or seek advice from colleagues (p = .007, r = -.336) about course formats. This suggests that when faculty had a choice regarding how to deliver their courses, they were more likely to seek feedback and advice; when they did not have course delivery options, they simply moved forward as best they could under the circumstances.

Open-ended results showed that instructors relied on multiple platforms and tools to deliver course content. Most participants used a combination of their university learning management system (e.g., Blackboard, Canvas, Moodle, Sakai), third-party video-conferencing platforms (e.g., Zoom, Google Classroom, GoToMeeting, WebEx), and additional tools (e.g., GroupMe, Google Drive, Kaltura, Keynote, Panopto, Slack, social media groups, VoiceThread) to deliver course content, manage group work, and communicate with clients.

RQ1: Specific Challenges Faced by Professors

As the pandemic created the need to adapt quickly, it was important to understand what challenges professors faced when converting their courses online. The survey explored two areas of potential challenges: (1) the need to change specific class components, and (2) challenges faced delivering courses.

Changing Class Components. Course components were split into two areas of interest: final project components and specific course content that may have been adjusted to accommodate student learning. Table 3 highlights which course components and project components were changed, and which were most challenging to adjust. Results suggest that in addition to changing student presentations (n = 42, 66.7%), instructors primarily changed assignments (n = 34, 54%), project components (n = 30, 47.6%), and lectures (n = 25, 39.7%). These were also considered the most challenging course components to adjust. Additionally, there was interest in understanding which project components were modified. Results suggest that practice and client presentations (each n = 34, 54%) and tactics (n = 29, 46%) were the most frequently changed project components. These were also considered the most challenging project components to adjust.

Notably, the timing of the pandemic—approximately midway through the semester—meant that many students had collected data for their clients or were able to collect at least some data remotely. Additionally, as planning and evaluation can be based on secondary research, this may account for there being fewer issues with these phases.

Table 3

Adjusted Course Components

                              Removed or Changed    Most Challenging to Adjust or Adapt
Course Components
  Assignments                 n = 34, 54%           n = 22, 34.9%
  Lectures                    n = 25, 39.7%         n = 26, 41.3%
  Exams/Quizzes               n = 17, 27%           n = 9, 14.3%
  Project Components          n = 30, 47.6%         n = 32, 50.8%
  Student Presentations       n = 42, 66.7%         n = 37, 58.7%
  Other                       n = 8, 12.7%          n = 6, 9.5%
Project Components
  Data Collection             n = 15, 23.8%         n = 16, 25.4%
  Planning Phase              n = 5, 7.9%           n = 7, 11.1%
  Producing Tactics           n = 29, 46%           n = 25, 39.7%
  Evaluation Strategy         n = 15, 23.8%         n = 14, 22.2%
  Practice Presentations      n = 34, 54%           n = 24, 38.1%
  Client Presentations        n = 34, 54%           n = 25, 39.7%
  Other                       n = 4, 6.3%           n = 4, 6.3%

*Project Component data is replicated in the second study described in the method.

Course Challenges. Items were developed to understand specific challenges instructors faced in their courses. Results suggest that instructors had fewer challenges when it came to finding time to meet student groups (M = 2.94, SD = 2.08) and had the most difficulty engaging individual students (M = 5.05, SD = 1.99). In general, however, faculty neither agreed nor disagreed that they experienced challenges. Even so, the relatively large standard deviations on these items are noteworthy, as they suggest that instructors had widely varying experiences in their classes. To that end, reliability analysis (α = .852) led to the development of an 8-item course challenges scale (M = 3.95, SD = 1.39). Table 4 highlights challenges faced in the course.

Table 4

Challenges in the Course

Delivering course material: .54, M = 3.85, SD = 1.89
Conveying project expectations: .55, M = 3.98, SD = 1.99
Engaging individual students: .55, M = 5.05, SD = 1.99
Engaging specific student groups: .54, M = 4.35, SD = 2.09
Finding time to meet student groups: .54, M = 2.94, SD = 2.08
Communicating with client: .53, M = 3.30, SD = 1.74
Student professionalism: .54, M = 3.91, SD = 1.96
Student communication with clients: .53, M = 3.92, SD = 1.82

*This table is replicated in the second study described in the method.

Next, items were designed to explore specific content-related issues instructors faced, particularly regarding final client projects. Results suggest mixed perceptions existed regarding the client projects, as most respondents neither agreed nor disagreed with the presented scenarios. Aligning with additional results, instructors had the most issues preparing for presentations (M = 4.87, SD = 1.60). But again, relatively high standard deviations suggest varying experiences across the sample. To build on these findings, reliability analysis (α = .825) led to the development of a 7-item content issues scale (M = 3.98, SD = 1.34). Table 5 shows content-related issues instructors faced in their courses.

Table 5

Content-Related Issues

Conducting research: .53, M = 3.40, SD = 2.15
Establishing goals and objectives: .53, M = 3.04, SD = 1.91
Creating tactics: .53, M = 4.21, SD = 2.13
Developing evaluative criteria: .53, M = 3.87, SD = 1.96
Designing planbooks: .53, M = 4.04, SD = 1.83
Designing presentations: .53, M = 4.47, SD = 1.69
Preparing for presentations: .53, M = 4.87, SD = 1.59

Correlation analysis suggests a relationship between course challenges and content issues. Specifically, instructors who faced course challenges were significantly more likely to experience content issues (p < .001, r = .748). 

Additionally, an exploratory one-way ANOVA tested whether the amount of preparation time faculty had affected course challenges (p = .292) or content issues (p = .321); neither relationship was significant.
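A one-way ANOVA of this kind partitions variance in scale scores between and within preparation-time groups and compares the two with an F ratio. A bare-bones sketch of the F statistic (hypothetical groups; the study would have computed this in a statistics package, with the p-value derived from the F distribution):

```python
def one_way_f(groups):
    """F statistic for a one-way ANOVA over lists of scores."""
    n = sum(len(g) for g in groups)            # total observations
    k = len(groups)                            # number of groups
    grand = sum(sum(g) for g in groups) / n    # grand mean
    # between-group and within-group sums of squares
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Identical group means produce F = 0 (no between-group variance)
print(one_way_f([[3, 4, 5], [3, 4, 5]]))  # → 0.0
```

The non-significant p-values above correspond to F ratios too small to distinguish the preparation-time groups from chance variation.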

Of course, challenges extended beyond the basic execution of the course and client projects. Although open-ended responses generally confirmed the quantitative results, they also highlighted multifaceted challenges participants faced converting their courses online. For example, multiple participants reported a sense of dejection among students whose work was no longer usable. Many reported that students and clients alike were “disappointed that they were not able to get F2F feedback from the clients,” while others acknowledged that students experienced significant outside stressors that affected their ability to complete course components as originally intended. Technical issues, such as lacking the WiFi, hardware, or software needed to complete specific tasks, were routinely reported and led to course and content changes. Ultimately, student engagement emerged as the most prominent issue for faculty, who lamented missed opportunities to read body language, walk between and talk with groups, and create connections with clients. Mental health and stress-related issues were routinely acknowledged as impacting student engagement. The primary solutions were to cancel team presentations and to modify assignment and scheduling expectations.

Despite these challenges, some faculty (n = 6) reported having few issues converting their classes online. One participant suggested “their skills from all of the PR training made this pretty easy… it was frustrating, but we got through it.” Another suggested the switch to online was easy, but the challenges were primarily student-centered (such as health and technology needs). The few faculty who reported no challenges emphasized that classes continued to meet online, data collection had already been completed, lectures were recorded, and students were notified of the online conversion in advance. One participant even reported that they “took their client presentation events online” and tapped into a national audience, attracting 115-170+ practitioners.

RQ2: Specific Student Group Challenges

As PR campaigns courses often require significant groupwork, items were designed to explore the degree to which instructors experienced group-based issues. Results suggest that instructors somewhat agreed that group-based issues existed, particularly in regard to collaboration (M = 4.87, SD = 1.93), group-based communication (M = 4.87, SD = 1.79), problem-solving (M = 4.89, SD = 1.72), and social loafing (M = 5.02, SD = 1.79). To strengthen analysis of results, reliability analysis (α = .943) led to the development of a 9-item group dynamics scale (M = 4.43, SD = 1.55). Table 6 highlights group-based issues faced by instructors.

Table 6

Group-Based Issues

Item	n	M	SD
Group-based communication	52	4.87	1.79
General problem-solving	53	4.89	1.72
In-group collaboration	53	4.87	1.93
Compromising on campaign direction	52	4.25	2.03
Developing a cohesive strategy	53	4.38	1.83
Conflict resolution	53	4.19	1.97
Understanding project direction and goals	53	3.66	1.92
Agreeing on project direction and goals	53	3.72	1.88
Rise in “social loafing”	52	5.02	1.79

Correlation analysis suggests relationships among course challenges, content issues, and group dynamics. Instructors who faced student group issues were significantly more likely to experience course challenges (r = .762, p < .001) and content issues (r = .658, p < .001).

Qualitative results suggest that group-related issues often stemmed from a lack of engagement and outside influences related to the pandemic. As group work shifted online, the dynamic of faculty supporting groups individually changed as “the instructor at the table was not able to happen in the same way.” Accountability among group members was a noted issue, as was the ability to “keep students on track.” Students also faced issues regarding lack of resources and technology at home or new, unpredictable scheduling conflicts that prevented them from routinely meeting with their teams and faculty members. Participants also noted that the shift online meant “underperforming teams” and individual students could “hide,” or that it was harder to “check in with teams and make sure they were working together well and finishing project elements.” Overall, engagement was the key indicator of group successes or challenges.

RQ3: Adjusting Client Relationships

As many PR campaigns classes focus on experiential learning, part of the challenge of moving online involved managing and adjusting client-related work. Participants in the study served between 0 and 10 clients (M = 2.02, SD = 1.73) in Spring 2020; the most common arrangements were one client (n = 27, 42.9%) or three clients (n = 9, 14.3%). Generally, participants (n = 58) found that client relationships remained relatively stable (see Table 7). However, participants only somewhat agreed that client(s) maintained the same level of engagement with their classes (M = 4.79, SD = 1.99) and neither agreed nor disagreed that client(s) had to adjust their involvement with the class because of the pandemic (M = 4.08, SD = 2.11).

Table 7

Client Relationships

Item	M	SD
I was able to continue my client relationships.	5.61	1.57
I communicated with my client(s) about the switch to online.	5.98	1.62
My client(s) was interested in continuing their relationship with my class.	5.81	1.53
My client(s) maintained the same level of engagement with the class.	4.79	1.99
My client(s) had to back out of their partnership with my class.	1.83	1.50
My client(s) had to adjust their involvement with my class because their business was impacted by the pandemic.	4.08	2.11

Overall, participants kept lines of communication open with their clients. Participants generally agreed or strongly agreed that they were able to continue their client relationships (n = 40, 63.5%), and n = 44 participants (71.5%) agreed or strongly agreed that they communicated with their clients about the move online. Although clients were interested in continuing their class relationships (M = 5.81, SD = 1.53), the degree of engagement they maintained with those classes was less consistent. For example, responses from 34% (n = 22) of participants ranged from strongly disagree to neither agree nor disagree on whether clients maintained the same level of engagement. Even so, clients rarely backed out of their partnerships entirely: only n = 3 (4.8%) participants agreed or strongly agreed that clients had to back out of their classes. Rather, clients seemingly adjusted their involvement because of the pandemic; n = 31 (49.2%) participants at least somewhat agreed that their clients adjusted their involvement. This suggests that while changes were made to client relationships, clients still wanted to continue their partnerships.

To further explore the impact of client relationship experiences, reliability analysis (α = .817) led to the development of a 5-item client relationships scale (M = 5.68, SD = 1.24). Initial reliability analysis suggested the need to reverse-code items exploring whether clients had to back out of their partnerships and whether clients had to adjust their class involvement because they were impacted by the pandemic. The latter item was removed to strengthen Cronbach’s α from .774 to .817. Building on RQ1, weak but significant negative relationships existed between client relationships and both general course challenges and content issues. Specifically, the more clients remained involved in the project, the fewer course challenges (r = -.307, p = .024) and content issues (r = -.317, p = .002) instructors faced. Although these relationships are relatively weak, they still suggest that degrees of client involvement may have informed issues faced by participants.
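Reverse-coding, as used above, flips negatively worded items (e.g., “my client(s) had to back out…”) so that all items on a scale point in the same direction before α is computed. Assuming a 7-point agree/disagree scale, which matches the anchors reported throughout, the transform is simply:

```python
def reverse_code(score, scale_max=7):
    """Flip a Likert score so the endpoints trade places
    (1<->7, 2<->6, 3<->5; the midpoint 4 is unchanged)."""
    return (scale_max + 1) - score

print([reverse_code(s) for s in [1, 2, 4, 6, 7]])  # → [7, 6, 4, 2, 1]
```

After reverse-coding, high scores on every item consistently indicate stronger client relationships, which is what allows the items to be summed into one scale.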

Open-ended results support this finding. In addition to considering the tools used to communicate with clients, participants also reflected on what they communicated and the nature of communication. Qualitative results suggested that the primary point of concern involved updating clients on project-based changes. For example, circumstances required one participant “to shift to a social media campaign since social distancing would not allow for face-to-face tactics.” Another participant found their campaign no longer plausible because target audiences could not be reached in person, so the partners “mutually ended” the client-agency relationship. Generally, however, most participants reported that they simply informed clients of minor changes, such as the need to move presentations online. 

Overall, most client relationships continued, but few appeared to continue without adjustments. As expected, clients faced their own challenges as they modified their business practices and needs; they had to close their businesses, were laid off or furloughed, or in general “were struggling with the reality of COVID-19 and running their organizations.” In some cases, clients did not have video-conferencing tools or other software, became geographically dispersed from colleagues, or were balancing personal issues such as childcare. Some clients were impacted by shifting workloads and making their own rapid changes. This sometimes resulted in lags in responsiveness, but also led clients to “sometimes [give] us the autonomy to make decisions without review or collaboration.” The most noted change in client engagement, however, involved the final presentation. Multiple participants reported either canceling presentations altogether, providing recorded presentations, or switching presentations online. In many cases, this removed an opportunity for client evaluation and feedback, but ultimately did not significantly impact the overall client relationship.

RQ4: Positive Curricular Changes

Although there may be a tendency to focus on the negatives of the pandemic, this study sought to explore potentially positive outcomes. Specifically, by being forced to move classes online, many instructors may have discovered strategies or tools that could be adopted in future iterations of campaigns classes. 

Overall, participants neither agreed nor disagreed about moving presentations online (M = 4.30, SD = 1.95) or changing the overall course format. However, participants somewhat agreed that they would consider having more online course meetings (M = 5.26, SD = 1.54), would deliver various course content online (see Table 8), and would teach students how to conduct group work remotely (M = 5.60, SD = 1.13). Additionally, they agreed that they would connect with teams online (M = 5.86, SD = 1.32). This suggests that while there are still barriers to putting campaigns courses online, the pandemic revealed opportunities to better facilitate course delivery online. Table 8 highlights these changes.

Table 8

Positive Outcomes for Future Use

Item	n	M	SD
Have more online course meetings	50	5.26	1.54
Teach students how to conduct group work remotely	50	5.60	1.13
Deliver lecture-based material online	50	5.44	1.36
Deliver project expectation instructions online	50	5.38	1.32
Flip my class	49	4.35	1.69
Teach my class as a hybrid	49	4.82	2.02
Connect with teams online	49	5.86	1.32
Connect with client online	49	5.65	1.35
Conduct client research fully online	49	4.27	1.71
Move general student presentations online	50	4.40	1.92
Move client presentations online	50	4.30	1.95

To further explore the impact of positive changes, reliability analysis (α = .792) led to the development of an 11-item positive outcomes scale (M = 5.02, SD = 0.90). The relationship between client relationships and positive outcomes was explored, but regression analysis suggests that quality of client relationships did not influence beliefs about positive outcomes related to the switch to online learning, F(1, 47) = 3.185, p = .081, R² = .044. This builds on previous results suggesting that while the quality of client relationships may have influenced specific course-related issues, the overall positive benefits of moving online were more strongly related to the course itself, its participants, and accessibility to technologies. This is supported in the open-ended findings.
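The regression reported here is a simple (one-predictor) OLS model, in which R² is the share of variance in the positive outcomes scale explained by the client relationships scale. A sketch of the core computation, using hypothetical scores rather than the study's data:

```python
def simple_ols(x, y):
    """Ordinary least squares with one predictor.
    Returns (slope, intercept, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    # R^2 = 1 - residual sum of squares / total sum of squares
    ss_tot = sum((b - my) ** 2 for b in y)
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    return slope, intercept, 1 - ss_res / ss_tot

# A perfectly linear relationship recovers slope 2, intercept 1, R² = 1
slope, intercept, r2 = simple_ols([1, 2, 3, 4], [3, 5, 7, 9])
print(slope, intercept, r2)  # → 2.0 1.0 1.0
```

By contrast, the R² = .044 reported above means client relationship quality explained under 5% of the variance in positive-outcome beliefs, consistent with the non-significant F test.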

First, participants reported additional benefits to teaching online such as increased access to course materials that students could “access anytime, anywhere.” Some student groups adapted well to online learning, getting better about time- and group-management and participating in more one-on-one meetings with faculty. One participant started a private Facebook group that provided opportunities to ask questions and host Facebook live sessions. In general, participants experiencing positive outcomes felt going online provided “real-world skill when it comes to online calls” and remote working. Additionally, the lack of commute time provided more time to complete tasks such as course preparation and grading. 

Because of these outcomes, at least some participants identified strategies they might change based on their experience teaching campaigns online. Faculty saw more opportunities to organize and articulate campaign components, offer flipped-class solutions that give students more time to work in class, and grant students the autonomy to work on their own time. With content online, participants saw opportunities to provide more resources and clearer descriptions of course expectations that students can review on their own time. Multiple participants also planned to spend more time guiding students rather than delivering content; a popular solution was implementing hybrid or flipped models. One participant suggested, “I enjoyed the flipped model with lectures online and using class time – either in person or via video conferencing – to be a guide on the side.” Finally, numerous participants planned to reduce in-class time. Having seen that students worked well independently, multiple participants felt moving components online provides an opportunity to practice remote work strategies and move past “antiquated needs to ALWAYS meet in person.” Solutions included giving “groups more opportunities to work independently outside of class with clear expectations for checking in.”

Despite these optimistic approaches to the benefits of and potential strategies for teaching online, multiple participants simply saw no positive outcomes from the switch online. When prompted to reflect on benefits experienced when moving online, n = 8 participants indicated there were no “notable benefits” and that they simply “prefer not to teach the Campaigns class online.” Faculty in this position recognized strategies they can use to create engagement and teach campaigns online, but still felt this was not something they wanted to embrace unless forced to. Ultimately, this suggests that while tools and resources may be available to make positive course changes, there are still outside factors that influence the degree to which faculty want to adopt these changes moving forward.


Discussion

This exploratory study aimed to examine how PR professors adapted to teaching campaigns courses online at the beginning of the COVID-19 pandemic in Spring 2020. Campaigns courses are routinely taught in public relations programs (Auger & Cho, 2016), and often include opportunities for students to produce full or proposed campaigns for real clients (Aldoory & Wrigley, 2000). Results of this study demonstrated that the emphasis on group work and experiential learning, coupled with the rapid switch to online course delivery, created distinct challenges for faculty teaching the PR campaigns course compared to those teaching other types of courses. Faculty teaching PR campaigns experienced consistent challenges related to the switch to online learning, but these challenges were not directly related to student groups or client relationship issues as much as they reflected the inability to ensure that students had the technological resources needed to successfully complete their courses. Arguably, student- and client-related issues stemmed from a lack of, or inconsistent access to, resources among all parties involved. And while the sudden switch to online learning revealed potential opportunities to evolve the structure of future campaigns classes, the perceived potential success of those efforts rests on faculty's ability to guarantee equal access to the tools and resources needed to complete large-scale client projects.

Access to Resources and the Technological Gap

Overall, analysis of results indicates that the degrees of success experienced converting campaigns courses online were a direct result of whether students had access to resources and technology. For example, when evaluating the final campaign components, client presentations and campaign tactics were most frequently changed or eliminated altogether. A deeper analysis of the data suggests that changes were not made because of an inability to deliver content online but because circumstances required such adaptations. For example, students who had not collected data prior to the shift online were often left with limited options for reaching target audiences. This made it more difficult to produce campaign tactics and materials based on primary data.

Further, qualitative results suggest the most reported challenge to course delivery involved reaching students who had inconsistent internet access or lacked the hardware and software necessary to produce campaign components. Digital access has been widely reported as a significant predictor of student success at all educational levels, particularly during the pandemic (Sparks, 2020). Such divides appear to have impacted campaigns instructors: although instructors had less difficulty meeting with groups, they had significant difficulty engaging individuals. Arguably, individual students without consistent internet access, or those relying on mobile devices such as phones and tablets, were simply unable to participate fully, increasing their sense of disengagement. This even extended to clients, as those who lacked resources were most likely to reduce their involvement in student projects. And the more that faculty experienced diminishing engagement from students and clients, the more they faced course challenges and content issues.

The notion that these issues are resource-driven can, in part, be attributed to the findings that faculty seemed undeterred by issues related to short turnaround time, the ability to convert course content, and the ability to modify project expectations. Moreover, faculty expressed few problems with delivering materials, conveying course expectations, meeting with students, and maintaining client relationships. So, despite the prevailing notion that converting courses online requires significant time and effort (Keengwe & Kidd, 2010), and even though most faculty had between three and eight days to convert their courses, they seemed to be nimble in their approaches. Often seeking advice from colleagues, they used readily available and familiar tools and relied on multiple, diverse platforms to deliver content.

Ultimately, the time faculty had to prepare their courses did not influence the challenges they faced. However, no amount of technological know-how or confidence as an instructor can overcome the issues that emerge when students do not have consistent access to the tools being employed to meet best practices for online teaching. This suggests that emergent issues had little to do with instructor ability and more to do with access to resources and technology.

Moving Forward

In response to the pandemic, Brownlee (2020) identified three opportunities to “[ensure] your institution is best positioned to support its students in the COVID-era of higher education” (para. 2). These involved bridging gaps related to the digital divide, experiential learning, and campus community. Arguably, PR campaigns courses are uniquely positioned to bridge these specific gaps, and findings suggest that campaigns instructors see this opportunity. Despite the outlined challenges, technology and pedagogical evolution appear to be at the heart of perceived opportunities to learn from and adapt to pandemic-driven teaching experiences. Many instructors felt that being forced online highlighted opportunities to streamline campaigns courses by reducing the number of in-person meetings and delivering lecture-based content in a flipped modality. As the industry was already shifting toward virtual workspaces prior to the pandemic, faculty acknowledged the shift online provided an opportunity to give students real-world virtual teamwork experiences.

Additionally, results suggest opportunities to strengthen learner-to-learner, learner-to-instructor, and learner-to-content opportunities. As faculty appeared to have an easier time communicating with groups, but had more difficulty engaging individual students, the use of groups may provide an opportunity for faculty to reach individual students. Learner-to-learner engagement can be enhanced through guided icebreaker or team-building activities (Martin & Bolliger, 2018), which could lead to increased trust and engagement among group members. This could create an additional line of communication with individual students who may be more difficult to reach. Next, PR faculty can increase learner-to-instructor and learner-to-content opportunities by maintaining a consistent mix of synchronous and asynchronous communication with students about course and assignment expectations. Combining synchronous weekly meetings with regular announcements can foster a collaborative environment while simultaneously providing students the autonomy to work asynchronously. This approach also mimics the PR agency experience, wherein practitioners often work in teams but use meetings to track progress on projects and identify which tasks will be completed independently. 

Moreover, the shift to online learning provides opportunities to strengthen student-client relationships. As a means of engagement, faculty may seek on-campus clients that resonate with students, helping build a sense of personal interest in the campaign projects. Additionally, moving the traditional client discovery meeting online can ensure that the usual interactions between students and clients are maintained. By meeting virtually, it also may be possible for clients to meet more frequently with classes, such as during key points of the project. If that type of increased interaction is unmanageable for the client, it could be supplemented by inviting other professionals to attend important student presentations as an opportunity to gain outside feedback on the work being produced.   

Finally, by connecting with teams and clients online, faculty appear to have opportunities to give students the autonomy to work independently while acting as guides-on-the-side, reclaiming time needed for course preparation and grading. For some instructors, the pandemic led to a paradigm shift that questions the need for in-person engagement when so much of what happens in practice is virtual. Still, not everyone was as optimistic about the future of online campaigns courses. While some acknowledged they could teach this way but simply preferred not to, others felt strongly that campaigns courses require more consistent engagement between students and instructors. Essentially, outside influencing factors, such as limited access to resources and technology, seemed like hurdles too big to overcome.

In short, faculty who had more positive experiences seemingly had the fewest issues with student access to resources and technology. And, while there is truth to the argument that practicing PR professionals should be able to conduct business remotely and in quickly morphing situations, one must consider that students taking these classes are not yet professionals, nor do higher education institutions readily provide the tools necessary to complete work remotely, as might be the case in professional settings.

This suggests that, as we consider how campaigns classes will evolve, we must also consider how to create equitable circumstances through which our students can learn. The sudden shift online means that higher education institutions have had to reimagine their technology infrastructures, and faculty and students alike have had to contend with digital divides and changing perceptions about the quality of online learning opportunities (Govindarajan & Srivastava, 2020). Without the technology-based tools necessary to complete projects of the scale generally expected in campaigns courses, it may remain difficult to encourage buy-in from the students and faculty participating in these courses as they continue to face stressors that limit the potential of these courses.

Limitations and Directions for Future Research

A primary limitation of this study is the sample size and its representativeness. A mix of convenience and snowball sampling was used to recruit participants, yielding a non-representative and potentially homogenous group of participants. Although there was balanced representation regarding gender, academic rank, and institutional factors, participants lacked racial diversity. Future studies should consider a more robust sampling strategy that includes the experiences of faculty at more diverse institutions. 

Further, future studies should consider whether students’ socioeconomic factors and shared mental health experiences influence the experiences faculty have converting their campaigns courses online. Early evidence suggests that parent education may be a stronger indicator of potential student success, as those with higher levels of education were more likely to remain employed, have access to home computers and internet, and have access to schools with stronger levels of student support (Sparks, 2020). More troubling, however, is the emergent mental health impacts on both students and faculty. Ongoing research suggests that the pandemic arrived during “a mental health crisis that had been unraveling on college campuses for years” (Lumpkin, 2021, para. 4). And while students experienced decreased well-being related to stress, anxiety, depression, and suicidal ideation (Anderson, 2020), so too did faculty. Within the first year of the pandemic, reports continued to emerge regarding faculty burnout (McMurtrie, 2020) and chronic stress (Flaherty, 2020). In a widely shared research brief by The Chronicle of Higher Education (2020), more than a third of respondents indicated they had considered changing careers or retiring. Undoubtedly, the effects of these experiences are likely to continue playing a role in the perceived success of delivering classes—particularly intensive capstone courses such as PR campaigns—in online or multi-modality formats.

Other limitations of the study include its exploratory nature and timing, although both can be considered starting points for future research examining faculty experiences teaching mixed-modality campaigns courses. Although survey items were evaluated in the context of best practices, the changing nature of the pandemic meant that many factors were not captured quantitatively. Additionally, this study was specifically designed to capture faculty experiences at the start of the pandemic. As many universities have switched to hybrid and HyFlex teaching models, and as the pandemic continues to extend beyond initial expectations, there exist opportunities to understand whether faculty experiences have evolved. 

Finally, future studies may examine specific technology access issues, the degree to which faculty become more comfortable teaching online, and whether students become more accustomed to working remotely. They may also explore whether opportunities to strengthen classes were implemented and the results of those changes. For example, the pandemic circumstances may have led to opportunities to close the technology gaps experienced during the initial stages of the pandemic. In short, like the practitioners they are training, PR faculty are nimble and will continue to pivot to meet student needs and produce quality campaigns experiences.


References

Aldoory, L., & Wrigley, B. (2000). Exploring the use of real clients in the PR campaigns course. Journalism and Mass Communication Educator, 54(4), 47–58. https://doi.org/10.1177/107769589905400405

American Psychological Association (2020). Stress in America: A national mental health crisis. American Psychological Association. https://www.apa.org/news/press/releases/stress/2020/report-october

Anderson, G. (2020, Sept. 11). Mental health needs rise with pandemic. Inside Higher Ed. https://www.insidehighered.com/news/2020/09/11/students-great-need-mental-health-support-during-pandemic

Anderson, M. L. (2018). Utilizing elements of The Apprentice in the strategic communication campaigns course. Communication Teacher, 32(3), 141–147. https://doi.org/10.1080/17404622.2017.1372593

Anderson, T. (2001). The hidden curriculum in distance education: An updated view. Change, 33(6), 28-35. https://doi.org/10.1080/00091380109601824

Auger, G. A., & Cho, M. (2016). A comparative analysis of public relations curricula: Does it matter where you go to school and is academia meeting the needs of the practice? Journalism & Mass Communication Educator, 71(1), 59–68. https://doi.org/10.1177/1077695814551830

Benigni, V., Cheng, I-H, & Cameron, G. T. (2004). The role of clients in the public relations capstone course. Journalism and Mass Communication Educator, 59(3), 259–277. https://doi.org/10.1177/107769580405900305

Blumenfeld, P. C., Marx, R. W., Soloway, E., & Krajcik, J. (1996). Learning with peers: From small group cooperation to collaborative communities. Educational Researcher, 25(8), 37-40. https://doi.org/10.3102/0013189X025008037

Bolliger, D. U., & Halupa, C. (2018). Online student perceptions of engagement, transactional distance, and outcomes. Distance Education, 39(3), 299-316. https://doi.org/10.1080/01587919.2018.1476845

Brownlee, M. I. (2020, July 24). Online student engagement: Bridging gaps in the midst of COVID-19. National Association of Student Personnel Administrators. https://www.naspa.org/blog/online-student-engagement-bridging-gaps-in-the-midst-of-covid-19

Bush, L. (2009). Student public relations agencies: A qualitative study of the pedagogical benefits, risks, and a framework for success. Journalism and Mass Communication Educator, 64(1), 27-38. https://doi.org/10.1177/107769580906400103

Bush, L., & Miller, B. M. (2011). U.S. student-run agencies: Organization, attributes and adviser perceptions of student learning outcomes. Public Relations Review, 37(5), 485-491. https://doi.org/10.1016/j.pubrev.2011.09.019

Bush, L., Haygood, D., & Vincent, H. (2016). Student-run communication agencies: Providing students with real-world experiences that impact their careers. Journalism and Mass Communication Educator, 72(4), 410-424. https://doi.org/10.1177/1077695816673254

Chronicle of Higher Education (2020). On the Verge of Burnout: Covid-19’s impact on faculty well-being and career plans. The Chronicle of Higher Education. https://connect.chronicle.com/rs/931-EKA-218/images/Covid%26FacultyCareerPaths_Fidelity_ResearchBrief_v3%20%281%29.pdf

Flaherty, C. (2020, Nov. 19). Faculty pandemic stress is now chronic. Inside Higher Ed. https://www.insidehighered.com/news/2020/11/19/faculty-pandemic-stress-now-chronic

Govindarajan, V., & Srivastava, A. (2020, March 31). What the shift to virtual learning could mean for the future of higher education. Harvard Business Review. https://hbr.org/2020/03/what-the-shift-to-virtual-learning-could-mean-for-the-future-of-higher-ed

Grunig, J. E., & Hunt, T. (1984). Managing public relations. Holt, Rinehart and Winston.

Jones, R. D. (2008). Strengthening student engagement [White paper]. International Center for Leadership in Education. https://www.literacytakesflight.com/uploads/7/8/9/3/7893595/strengthen_student_engagement_white_paper.pdf

Kayes, A. B., Kayes, D. C., & Kolb, D. A. (2005). Experiential learning in teams. Simulation & Gaming, 36(3), 330-354. https://doi.org/10.1177/1046878105279012

Keengwe, J., & Kidd, T. T. (2010). Towards best practices in online learning and teaching in higher education. Journal of Online Learning and Teaching, 6(2), 533-541. https://jolt.merlot.org/vol6no2/keengwe_0610.htm

Kinnick, K. N. (1999). The communications campaigns course as a model for incorporating service learning into the curriculum. In D. Droge & B. O. Murphy (Eds.), Voices of a strong democracy: Concepts and models for service-learning in communication studies (pp. 155–163).

Laal, M., & Ghodsi, S. M. (2012). Benefits of collaborative learning. Procedia Social and Behavioral Sciences, 31, 489-490. https://doi.org/10.1016/j.sbspro.2011.12.091

Lewis, T. (2021, March 11). How the U.S. pandemic response went wrong—and what went right—during a year of COVID. Scientific American. https://www.scientificamerican.com/article/how-the-u-s-pandemic-response-went-wrong-and-what-went-right-during-a-year-of-covid/

Lumpkin, L. (2021, March 30). A mental health crisis was spreading on college campuses. The pandemic has made it worse. The Washington Post. https://www.washingtonpost.com/education/2021/03/30/college-students-mental-health-pandemic/

Martin, F., & Bolliger, D. U. (2018). Engagement matters: Student perceptions on the importance of engagement strategies in the online learning environment. Online Learning, 22(1), 205-222. https://doi.org/10.24059/olj.v22i1.1092

McMurtrie, B. (2020, Nov. 5). The pandemic is dragging on. Professors are burning out. The Chronicle of Higher Education. https://www.chronicle.com/article/the-pandemic-is-dragging-on-professors-are-burning-out

Moody, R. F. (2012). Integrating public relations with advertising: An exercise for students in the college public relations campaigns course. Communication Teacher, 26(4), 203–206. https://doi.org/10.1080/17404622.2012.668201

Moore, J. (2014). Effects of online interaction and instructor presence on students’ satisfaction and success with online undergraduate public relations courses. Journalism & Mass Communication Educator, 69(3), 271-288. https://doi.org/10.1177/1077695814536398

Neuberger, L. (2017). Teaching health campaigns by doing health campaigns. Communication Teacher, 31(3), 143–148. https://doi.org/10.1080/17404622.2017.1314520

Public Relations Society of America (2020). About Public Relations. https://www.prsa.org/about/all-about-pr

Rings, R. L. (1983). Team-taught campaigns course concentrated on communications management. Journalism Educator, 38(1), 38–41. https://doi.org/10.1177/107769588303800113

Rogers, C., & Andrews, V. (2016). Nonprofits’ expectations in PR service-learning partnerships. Journalism & Mass Communication Educator, 71(1), 95–106. https://doi.org/10.1177/1077695815584226

Smalley, A. (2021, March 21). Higher education responses to Coronavirus (COVID-19). National Conference of State Legislatures. https://www.ncsl.org/research/education/higher-education-responses-to-coronavirus-covid-19.aspx

Smallwood, A., & Brunner, B. (2017). Engaged learning through online collaborative public relations projects across universities. Journalism & Mass Communication Educator, 72(4), 442-460. https://doi.org/10.1177/1077695816686440

Sparks, S. D. (2020, July 14). In pandemic, digital access and parents’ education made the biggest difference in schools’ response. Education Week. https://blogs.edweek.org/edweek/inside-school-research/2020/07/in_pandemic_parents_made_the_b.html

Taylor, D. B. (2021, March 17). A timeline of the Coronavirus pandemic. The New York Times. https://www.nytimes.com/article/coronavirus-timeline.html

Wu, J., Smith, S., Khurana, M., Siemaszko, C., & DeJesus-Banos, B. (2020, April 29). Stay-at-home orders across the country. NBC News. https://www.nbcnews.com/health/health-news/here-are-stay-home-orders-across-country-n1168736

© Copyright 2021 AEJMC Public Relations Division

To cite this article: Formentin, M., & Auger, G. A. (2021). Pivot now! Lessons learned from moving public relations campaigns classes online during the pandemic in Spring 2020. Journal of Public Relations Education, 7(3), 7-44. https://aejmc.us/jpre/?p=2709

Undergraduate Public Relations in the United States: The 2017 Commission on Public Relations Education Report


Marcia DiStaso, Ph.D., APR
Associate Professor
Public Relations Department Chair
University of Florida


As history books document, the field of public relations dates back to the early 20th century. Since then, society and public relations have evolved. This evolution has led to multiple definitions of public relations over the years, and, in fact, the term continues to evolve today. Currently, the Public Relations Society of America (PRSA) defines public relations as “A strategic communication process that builds mutually beneficial relationships between organizations and their publics” (PRSA, n.d., para. 3). In October 2019, the International Public Relations Association (IPRA) announced its new definition of public relations as “A decision-making management practice tasked with building relationships and interests between organisations and their publics based on the delivery of information through trusted and ethical communication methods” (IPRA, 2019, para. 2).

As the public relations profession has evolved, so has education. Edward Bernays is credited with writing the first public relations textbook and teaching the first class in 1923 (Broom & Sha, 2013). Fifty years later, in 1973, the Commission on Public Relations Education (CPRE) was founded. Since then, this group has combined insight from academics and practitioners to provide recommendations on public relations education around the globe. These recommendations have impacted both graduate and undergraduate education, as many academic programs have aligned their course offerings with CPRE recommendations. Plus, CPRE recommendations serve as the foundation for the Public Relations Student Society of America’s chapter standards (PRSSA, 2019) and for the Certification in Education for Public Relations (CPRE, 2006).

Following the recommendations from the 1999 CPRE report, “A Port of Entry,” academic public relations programs commonly included courses in the following topics:

  • Introduction to public relations
  • Public relations research, measurement and evaluation
  • Public relations writing and production
  • Supervised work experience in public relations (internship)

In 2006, the CPRE recommended that public relations programs include these four core courses plus a fifth: a public relations course in law and ethics, planning and management, case studies, or campaigns.

The purpose of this article is to present the combined findings from the CPRE omnibus survey, which are spread across the 17 chapters of the report Fast Forward: Foundations + Future State. Educators + Practitioners. Many of the chapters include results from educators and practitioners outside of the United States for a global perspective. This article, however, is delimited to the results for U.S. respondents to highlight the current state of undergraduate public relations education in the United States.


This research built on past CPRE reports on undergraduate education, mainly A Port of Entry: Public Relations Education for the 21st Century (1999) and The Professional Bond (2006). Similar to those reports, an extensive omnibus survey was conducted. Where appropriate, the questionnaire remained the same; however, given the vast changes in the public relations field over the last decade, few specifics were retained.

Survey Distribution

While past CPRE surveys were distributed to a stratified random sample of members in public relations associations, that approach was not preferred in 2016 due to typically low response rates and the difficulty of obtaining membership lists. Therefore, the 2016 omnibus survey was distributed by email to CPRE members. The individual representatives for these associations invited their members and colleagues to participate in the survey. These members represented the following organizations:

  • Arthur W. Page Center
  • Arthur W. Page Society
  • Association for Education in Journalism and Mass Communication (AEJMC) Public Relations Division
  • Canadian Public Relations Society
  • European Public Relations Education and Research Association
  • Global Alliance for Public Relations
  • Institute for Public Relations (IPR)
  • International Communication Association (ICA) Public Relations Division
  • National Black Public Relations Society
  • National Communication Association (NCA) Public Relations Division
  • Plank Center for Leadership in Public Relations
  • PR Council
  • Public Relations Society of America (PRSA) Educators Academy
  • Public Relations Society of America (PRSA) Educational Affairs Committee
  • PRSA Foundation
  • Public Relations Society of America (PRSA)
  • The Corporate Board/Society of New Communications Research (SNCR)
  • Universal Accreditation Board (UAB)

The survey was open for participation from October 10 to December 19, 2016. Given that the survey distribution was through CPRE member associations, using their own recruitment process, it is not possible to calculate the number of people who actually received the survey.

Overall, a total of 1,601 questionnaires were started. Respondents who indicated they were not in public relations (or a related field) were removed (n = 48), along with anyone who spent fewer than 10 minutes on the survey (n = 738); the drop-out rate was high given that the survey took an average of 25 minutes to complete. The focus of this article is on undergraduate public relations education in the United States, so all respondents from other countries were removed (n = 124).

The questionnaire began with a filter question that asked respondents to identify as an educator, as a practitioner, or as someone not in public relations (or a related field). Based on responses to this question, participants were filtered to either an educator or a practitioner survey. If they were not in public relations, they were thanked for their time, and the survey concluded. The questionnaire contained eight sections. The final sample included in this article was 690, composed of 231 educators and 459 practitioners.



The demographic information for this study is included in Table 1. Overall, 33% of respondents were educators (n = 231), and 67% were practitioners (n = 459). The percentage of female practitioners in this study matched the approximate percentage in the profession (74%, n = 291). The age distribution was skewed slightly younger in the practitioner sample than the educator sample; however, that is also consistent with both populations. The educator sample was predominantly white (94%, n = 156), and the practitioner sample was 77% white (n = 354), consistent with the lack of diversity in the field. Most educators had a Ph.D. (72%, n = 134), and most practitioners had a bachelor’s degree (54%, n = 209). Only 38% of educators (n = 92) and 28% of practitioners (n = 111) had their Accreditation in Public Relations, and 1% of practitioners were Accredited Business Communicators (n = 4). The practitioners were from a variety of organizational settings and sizes. The educator sample included 70% tenured or tenure-track faculty (n = 121).

The practitioner sample had some academic experience, with 18% of the practitioners having taught as an adjunct (n = 71) and 58% having guest lectured in a public relations course (n = 223). On the job, 52% of practitioner respondents directly supervised entry-level practitioners (n = 203), while 61% had supervised an intern in the last five years (n = 240).

Knowledge, Skills and Abilities

The KSAs (knowledge, skills, and abilities) from the 2006 survey were updated to better align with current public relations education and practice. As a result, only a few KSAs were assessed in both 2006 and 2016, resulting in minimal comparisons (see Table 2). Writing was one skill that was measured in both years. In 2016, the mean scores for desired writing skills increased for both educators (0.19 increase) and practitioners (0.41 increase). The mean scores for delivered or found writing skills also increased (0.77 increase for educators and 0.02 increase for practitioners). Research and analytics was another item measured in both surveys. Educators and practitioners had a decrease in mean scores for research and analytics as a desired skill (0.03 decrease each), while educators believed that the delivery of these skills increased (0.86 increase), and practitioners felt the amount the skill was found had decreased (0.32 decrease).

In 2016, educators indicated a high desirability for 15 KSAs, while practitioners identified 11 as highly desirable (mean ratings of a 4.0 or higher). On the other hand, educators indicated only three KSAs as frequently delivered, and practitioners did not believe any KSAs were frequently found.

The top three knowledge topics desired by educators were: ethics (M = 4.44, SD = 0.95), business acumen (M = 4.09, SD = 0.92), and cultural perspective (M = 4.02, SD = 0.89). The top three desired knowledge topics by practitioners were: ethics (M = 4.57, SD = 0.78), diversity and inclusion (M = 3.95, SD = 1.06), and social issues (M = 3.67, SD = 1.00).

The top three skills desired by educators were: writing (M = 4.90, SD = 0.37), communication (M = 4.78, SD = 0.50), and social media management (M = 4.52, SD = 0.64). The top three desired skills by practitioners were the same: writing (M = 4.88, SD = 0.41), communication (M = 4.76, SD = 0.57), and social media management (M = 4.33, SD = 0.82).

The top three abilities desired by educators were: problem solving (M = 4.55, SD = 0.65), critical thinking (M = 4.53, SD = 0.75), and creative thinking (M = 4.52, SD = 0.71). The top three abilities desired by practitioners were: creative thinking (M = 4.57, SD = 0.70), problem solving (M = 4.52, SD = 0.77), and critical thinking (M = 4.44, SD = 0.82).

Overall, educators and practitioners disagreed about the desirability of 40% of the KSAs (12 out of 30). Significant differences in desired KSAs between educators and practitioners included business acumen, crisis management, cultural perspective, ethics, internal communication, PR history, PR laws and regulations, public speaking, social media management, website development, problem solving, and strategic planning. In each of these, the educators in the survey rated the KSA as more desired than the practitioners did, except for ethics, where the practitioners indicated a higher level of desire than the educators.

The top three knowledge topics educators believed their programs delivered were: ethics (M = 4.11, SD = 0.95), PR theory (M = 3.77, SD = 1.03), and social issues (M = 3.43, SD = 1.06). The top three knowledge topics found by practitioners were: ethics (M = 3.37, SD = 0.96), diversity and inclusion (M = 3.30, SD = 1.02), and social issues (M = 3.20, SD = 0.96).

The top three skills educators believed their programs delivered were: communication (M = 4.44, SD = 0.78), writing (M = 4.32, SD = 0.83), and research and analytics (M = 3.83, SD = 1.04). The top three skills found by practitioners were: social media management (M = 3.84, SD = 0.91), communication (M = 3.31, SD = 0.88), and writing (M = 3.08, SD = 0.94).

The top three abilities educators believed their programs delivered were: critical thinking (M = 3.91, SD = 0.97), strategic planning (M = 3.90, SD = 1.04), and problem solving (M = 3.85, SD = 0.96). The top three abilities found by practitioners were: creative thinking (M = 3.38, SD = 0.94), problem solving (M = 2.75, SD = 0.89), and critical thinking (M = 2.65, SD = 0.89).

Educators and practitioners also disagreed about whether recent graduates possess 43% of the KSAs (13 out of 30). There were significant differences between KSAs delivered by educators and found by practitioners for business acumen, crisis management, cultural perspective, diversity and inclusion, management, social issues, audio/video development, graphic design, media relations, social media management, speechwriting, website development, and strategic planning. In each of these, educators rated the KSA as delivered more frequently than the practitioners indicated finding it.

Hiring Characteristics/Experience

Practitioners were given a list of “possible hiring characteristics” of recent college graduates and were asked to consider what they look for in entry-level new hires (see Table 3).

Practitioners rated the top five desired characteristics/experiences they look for when hiring (all are desired more than found):

  1. Writing performance (M = 4.88, SD = 0.40); 1.98 gap in what is found
  2. Internship or work experience (M = 4.67, SD = 0.71); 0.84 gap in what is found
  3. Public relations coursework (M = 4.47, SD = 0.83); 0.50 gap in what is found
  4. Strong references (M = 4.22, SD = 0.92); 0.86 gap in what is found
  5. Up-to-date with current professional trends and issues (M = 4.10, SD = 0.92); 1.30 gap in what is found

Practitioners’ scores resulted in this list of five least desired characteristics/experiences:

  1. Certificate in public relations (M = 2.38, SD = 1.18)
  2. Study abroad experience (M = 2.39, SD = 1.12)
  3. Certifications (e.g., Hootsuite, Google Analytics, coding) (M = 2.88, SD = 1.19)
  4. Caliber of university attended (M = 3.02, SD = 1.07)
  5. Bi- or multi-lingual (M = 3.17, SD = 1.22)

Results showed five most commonly found characteristics/experiences in new hires:

  1. Active on social media (M = 4.40, SD = 0.76)
  2. Public relations coursework (M = 3.97, SD = 0.82)
  3. Internship or work experience (M = 3.83, SD = 0.86)
  4. Campus involvement (M = 3.48, SD = 0.82)
  5. Liberal arts coursework (M = 3.46, SD = 1.01)

According to the practitioners who participated in the survey, there were five least found characteristics/experiences:

  1. Certificate in public relations (M = 1.64, SD = 0.86)
  2. Certifications (e.g., Hootsuite, Google Analytics, coding) (M = 1.91, SD = 0.89)
  3. Bi- or multi-lingual (M = 2.00, SD = 0.84)
  4. Study abroad experience (M = 2.33, SD = 0.92)
  5. Participation in an on-campus student PR agency (M = 2.46, SD = 0.98)

Public Relations Curriculum

This study sought to identify the implementation of the 2006 CPRE five-course recommendation and determine any needed changes to this standard. Overall, 90% of academic respondents (n = 178) and 95% of practitioner respondents (n = 395) were in favor of retaining the five-course standard. As Table 4 shows, the 2016 study found that practitioner respondents favored programs requiring all five courses.

Importantly, 99% of academic respondents said they have an Introduction to Public Relations or principles class (n = 198), 93% said this course is required (n = 185), and 87% said the course offered is public relations specific (n = 173). Most academics also indicated that a research methods course is taught (97.0%, n = 196) and required (89.9%, n = 178), but many indicated that it is not a public relations specific course in their program (47.0%, n = 93). Writing was also a course that most respondents said is included (97.0%, n = 195), required (93.4%, n = 184), and public relations specific (82.7%, n = 163). Campaigns and case studies courses are also taught (92.5%, n = 186), required (80.1%, n = 157), and public relations specific (82.2%, n = 162). An internship course was offered at the universities of 91% of respondents (n = 183), but only 45% said it was required (n = 89); 58% said the internship course is public relations specific (n = 113).

Curriculum Topics

In addition to the five-course standard, many public relations programs offer courses on additional topics and/or include topics within existing courses. Over the years, the list of possible curriculum topics has changed, resulting in two new topics in the 2006 study and 32 new topics in the 2016 study (see Table 5). Unfortunately, comparisons between the years are complicated by the change from the 7-point scale used in 1998 and 2006 to the 5-point scale used in this study; therefore, only the 2016 findings for the individual outcomes are discussed. For the 2016 mean responses, the curriculum topics rated 4.00 or higher are highlighted, indicating an essential topic. Educators indicated a high importance for 15 curriculum topics, while practitioners identified 13 (mean ratings of 4.0 or higher). Eleven highly essential curriculum topics were the same for educators and practitioners.

When it came to the most important curriculum topics, educators most often selected: (1) measurement and evaluation (M = 4.60, SD = 0.75); (2) social media (M = 4.58, SD = 0.80); (3) campaign management (M = 4.54, SD = 0.76); (4) strategic communications (M = 4.52, SD = 0.80); and (5) audience segmentation (M = 4.26, SD = 0.97). Practitioners believed the top five curriculum topics to be: (1) content creation (M = 4.52, SD = 0.69); (2) strategic communications (M = 4.48, SD = 0.78); (3) social media (M = 4.47, SD = 0.77); (4) measurement and evaluation (M = 4.41, SD = 0.79); and (5) publicity/media relations (M = 4.40, SD = 0.79).

Most of the items in Table 5 did not have significant differences between the educator and practitioner rankings for the essentialness of each topic. However, educators believed audience segmentation, campaign management, CSR, crisis management, fundraising, issues management, measurement and evaluation, and political communication were all more essential than practitioners did. The practitioners felt that business-to-consumer PR and content creation were more essential than educators thought. 

Online Education

Overall, 53% of educators who participated in this survey indicated that their program offers online public relations courses (n = 102). Six percent of the educators said their program had a completely online undergraduate degree (n = 11). Both educators and practitioners indicated they felt an online degree was not equal to a face-to-face degree (M = 2.27 and M = 2.35) (see Table 6). Furthermore, both educators and practitioners believed job applicants should disclose if all or part of a degree was taken online. 


Of the educators who participated in this study and knew how their program handled internships, 42% said they required an internship (n = 80), 51% had programs that allowed elective credits for an internship (n = 97), and 6% just encouraged internships (n = 12) (see Table 7). Most programs had an internship coordinator (82.1%, n = 156) and 69% of respondents said that coordinator was a faculty member (n = 121).

Only 35% of educators said their program had a training program to prepare students for internships (n = 66), and the most common assessment of internships was a performance review by the supervisor (63.6%, n = 147). Plus, to complete an internship for credit, 45% said their program required a prerequisite course (n = 103), 46% said minimum credit hours were required (n = 107), and 36% said a minimum GPA was required (n = 83). Many required all three. Overall, 32% of practitioners said their interns were not paid (n = 124). The average pay reported for those who were paid was $13.54 an hour.

The Department of Labor’s federal guidelines on internships, based on the Fair Labor Standards Act (FLSA), provide important guidance on internships; however, 36% of educators (n = 66) and 29% of practitioners (n = 111) were not familiar with the guidelines. Overall, of those who were familiar with the guidelines and knew how internships were handled in their area, only 67% of educators (n = 62) and 93% of practitioners (n = 244) said these guidelines are always followed.

There were significant differences between educator and practitioner views about interns having a valuable experience (see Table 8). Educators felt more positive about the experience; however, practitioners indicated higher agreement that interns were given meaningful work and that they receive clear and routine instructions.

Membership in Student Associations

Both educators and practitioners found high value in student involvement in associations such as Public Relations Student Society of America and International Association of Business Communicators (see Table 9). They each identified networking as the number one reason for participating in student associations.

Faculty Qualifications

As Table 10 shows, educators and practitioners ranked staying up-to-date on technology as the top faculty qualification (M = 4.51, SD = 0.69 and M = 4.63, SD = 0.65). Educators preferred more than 5 years of professional PR experience (M = 4.15, SD = 1.03), while practitioners ranked more than 10 years of professional PR experience as more important (M = 4.61, SD = 0.69). Similarly, educators rated presenting at academic conferences (M = 3.77, SD = 1.04) as more important than professional conferences (M = 3.47, SD = 0.99), whereas practitioners found the opposite to be more important.


Taking a good look at undergraduate public relations education on a periodic basis is an extremely valuable, though daunting, task. The value that academics and practitioners can derive from the CPRE reports lies in the consistencies, gaps, and opportunities they highlight.

Consistencies and Gaps

The secret to the success of undergraduate education is collaboration between educators and practitioners. Together they can provide the foundation for a cohesive focus on knowledge, skills, and abilities to prepare undergraduate students for their future careers. While both educators and practitioners identified ethics as the top knowledge topic, there were inconsistencies in the other top knowledge areas. Educators identified business acumen and cultural perspective, topics that give students a well-rounded business grounding. Practitioners, on the other hand, identified diversity and inclusion and social issues as core knowledge areas likely to help graduates assimilate into the current work environment. Importantly, practitioners identified ethics, diversity and inclusion, and social issues as their top found areas, but none were found at what would be considered a high level; this indicates more work needs to be done to prepare students in all three knowledge areas.

When assessing the desired skills, practitioners and educators were aligned. Writing is still the most valued skill. In fact, the desire for writing skills has increased since 2006, but the good news is that writing ability has also slightly increased. The other skills both groups identified were communication and social media management. Fortunately, all three were the highest ranked skills found, but none were frequently found, so there is still a need for continued and increased focus. Unfortunately, there was a gap between educators’ perceptions of delivering writing and communication skills and practitioners’ reports of finding those skills.

Both groups included strategic communications, social media, and measurement and evaluation as top curriculum topics, but the practitioners identified content creation as their most important addition to the curriculum.

Practitioners and educators identified creative thinking, problem solving, and critical thinking as the top desired and found abilities (though in slightly different order for the two groups). Analytical thinking was not rated as highly by either, and there was a large gap, with educators reporting higher levels of delivery of abilities than practitioners reported finding.


While the overwhelming majority of educators and practitioners in this study were in favor of retaining the CPRE five-course standard, some programs do not have these five courses specific to public relations. This is a missed opportunity; for example, 17% of educator respondents said their writing course is not a public relations writing course. Given how important writing continues to be, having a public relations writing course along with multiple other grammar and writing courses would be ideal. This is especially true considering this research found that writing remains the core entry-level skill and hiring characteristic.

In 2018, the CPRE published the global data from the 2016 omnibus survey reported in Fast forward: Foundations + Future state. Educators + Practitioners. In this report, the Commission recommended adding ethics as a sixth course to the standard. By recommending ethics as a required course, programs will be able to improve their focus on ethics and better meet the needs of this dynamic field.

As the profession becomes more integrated and employers continue to advertise entry-level positions seeking a bachelor’s degree in a “relevant field,” seeing public relations coursework as the third most desired hiring characteristic is telling. The core competencies students learn in public relations programs are valuable and sought after. This should lead academic programs to question the value of combining advertising and public relations. Consistently, this research found support for core public relations competencies.

It is concerning to see the percentage of paid internships remains low, yet internship or work experience is highly regarded. There has been a strong call to action from academics and practitioners across the United States to pay student interns. Additionally, internships should be supervised and considered a learning opportunity for the student.

In addition to the content shared in this article, the full 2017 CPRE report Fast forward: Foundations + Future state. Educators + Practitioners contains 17 chapters with global recommendations.  


Broom, G. M., & Sha, B. L. (2013). Cutlip and Center’s effective public relations (11th ed.). Upper Saddle River, NJ: Prentice-Hall.

CPRE. (1999). Port of entry. Commission on Public Relations Education. Retrieved from http://www.commissionpred.org/commission-reports/a-port-of-entry/

CPRE. (2006). The professional bond. Commission on Public Relations Education. Retrieved from http://www.commissionpred.org/commission-reports/the-professional-bond/

CPRE. (2018). Fast forward: Foundations + Future state. Educators + Practitioners. Commission on Public Relations Education. Retrieved from http://www.commissionpred.org/wp-content/uploads/2018/04/report6-full.pdf

IPRA. (2019, Oct. 10). The International Public Relations Association wraps its values around a new definition of public relations. Retrieved from https://www.ipra.org/news/press-room/the-international-public-relations-association-wraps-its-values-around-a-new-definition-of-public-relations/

PRSA. (n.d.). About PRSA. Retrieved from https://www.prsa.org/about/all-about-pr

PRSSA. (2019). PRSSA chapter handbook 2019-2020. Retrieved from https://prssa.prsa.org/wp-content/uploads/2019/08/PRSSA-Chapter-Handbook.pdf

JPRE Special Issue, Vol. 5, Issue 3, Fall 2019


Different Formats, Equal Outcomes? Comparing In-Person and Online Education in Public Relations

Editorial Record: Original draft submitted to JPRE July 20, 2017. Revision submitted April 27, 2018. Revision submitted September 28, 2018. Manuscript accepted for publication November 9, 2018. First published online August 17, 2019.


Brooke Weberling McKeever, University of South Carolina


A survey of students enrolled in undergraduate online and in-person courses in public relations (N = 452) allowed for comparisons between the two course formats in terms of student motivations, satisfaction, grades, and characteristics that predict success in online environments. Findings revealed that student satisfaction and grades were equivalent between the online and in-person courses. Additionally, the Test of Online Learning Success (TOOLS) was applied to public relations education, and findings indicated some differences but more similarities among students in online and in-person classes regarding the following five factors: academic skills, computer skills, independent learning, dependent learning, and need for online learning (Kerr, Rynearson, & Kerr, 2006). Implications are discussed related to students, faculty, staff, and administrators.

Keywords: Online education, distance learning, public relations, survey, Test of Online Learning Success (TOOLS)

Different Formats, Equal Outcomes? Comparing In-Person and Online Education in Public Relations

A recent article by Seth Godin (2017) offered provocative ideas and stirred debate about higher education. The article’s headline, “No laptops in the lecture hall,” with the sub-head, “How about this instead: No lecture hall,” summarizes the author’s thoughts about traditional college lectures being antiquated and inefficient. Instead of large in-person lectures, he suggests short (8-minute) video lectures delivered online, followed by interactive conversations. He does not seem to argue entirely for online education, but perhaps for a hybrid experience, which some universities are already offering. Godin’s article is just one example of the differing opinions that exist about various formats of education as our society has grown increasingly dependent upon technology for many forms of communication, entertainment, and education.

In recent years, a number of researchers have studied online teaching and learning, sharing perspectives of both faculty and students from various disciplines (e.g., Donavant, 2009; Kerr, Rynearson, & Kerr, 2006; Kim, Liu, & Bonk, 2005; Kleinman, 2005; Poniatowski, 2012; Song, Singleton, Hill, & Koh, 2004; Wallace, 2003). While an image of successful online education is taking shape, more research is needed to continue to understand this educational practice, particularly in the fields of journalism and mass communication (JMC), including public relations (Castañeda, 2011; Moore, 2014). Because these fields rely on media and technology that change rapidly and because these majors are popular at colleges and universities across the United States, studying communication education in an online setting could benefit students, faculty, staff, and administrators in terms of determining if, when, and how to offer such courses, as well as who might be best to teach and take such classes.

As students’ needs change and as universities across the country experience budget cuts and issues with traditional classroom settings, online education is growing at an incredible rate (Kauffman, 2015). Online course options have been proliferating since the 1990s. In 2001, 56% of all two- and four-year degree-granting institutions offered distance education courses with more than 2.8 million students enrolled in these courses across the United States (Waits & Lewis, 2002). By 2010, student demand for online courses and programs increased dramatically with more than 6.5 million students enrolled in at least one online course (Allen & Seaman, 2011). More recent data indicate even more growth in online course offerings with 31.6% of all students taking at least one online course; more than half of those taking online courses (16.7%) are taking a combination of online and traditional courses (Seaman, Allen, & Seaman, 2018). Indeed, while online course options are on the rise, most U.S. colleges and universities still offer traditional classes to accompany their online course offerings. When students have a choice, do they prefer traditional or online public relations courses? And which type of PR course leads to better learning outcomes, grades, or satisfaction among students? These are some of the questions this research seeks to answer. Additionally, what motivates public relations students to take one type of course over the other? And what characteristics are helpful for succeeding in online public relations courses?

A measure of online student characteristics known as TOOLS—Test of Online Learning Success—has been constructed and validated to document the student characteristics that are important for achieving success in online courses (Kerr et al., 2006). However, it is not known whether TOOLS works across academic disciplines, nor whether these characteristics differ much from those that make students successful in traditional classroom settings (Kauffman, 2015). Simply put, more research is needed into what makes online teaching and learning successful across disciplines, including in JMC and public relations.

The purpose of this paper is to explore these issues related to online education in public relations (specifically) by comparing four undergraduate courses offered at the same university and taught by the same professor in the same semester in two consecutive years. By comparing student grades and satisfaction, exploring responses to a survey about motivations for taking one course over the other, and assessing the student characteristics measured by TOOLS, this research aims to contribute to our understanding of online education, particularly in JMC and public relations. 


As digital natives continue to enter higher education programs, demand for online courses has grown. What once may have been perceived as a phenomenon associated with for-profit institutions is now a focus for traditional public universities. According to recent data, public universities teach the largest portion of online students (67.8%). However, these online enrollments are highly concentrated; only 5% of educational institutions have almost half of all distance education students (Seaman et al., 2018). Interestingly, students are not necessarily taking online courses because of distance. More than half (52.8%) of all students who took at least one distance course also took courses on campus, and more than half (56.1%) of those who took only online courses actually reside in the same state as the institution in which they were enrolled; a very small proportion (less than 1%) of online students are international. Despite the fact that the majority of students taking online courses are also taking more traditional, in-person courses at the same time, the number of students taking courses on a campus dropped by more than one million (or 6.4%) between 2012 and 2016 (Seaman et al., 2018). To remain competitive and economically viable in this environment, universities have become more proactive about encouraging faculty to develop online courses and degree programs, and JMC programs have responded accordingly (Allen & Seaman, 2010; Castañeda, 2011; Palfrey & Glasser, 2008).

Online Education in Journalism, Mass Communication, and Public Relations

According to Sutherland (2003), JMC programs first implemented “web features” into courses between 1985 and 1990. Referring to diffusion of innovations theory, Sutherland (2003) dubbed these programs “innovators,” and those that started offering such courses in 1993-94 were “early adopters.” Between 1995 and 1999, 101 additional programs began implementing web features into courses, meaning the early and late majority of JMC programs—and even the “laggards” of those surveyed at the time—were offering courses with online components by 2000 (Sutherland, 2003). In a more recent survey of accredited JMC programs (Castañeda, 2011), online courses, defined as those that deliver 80% or more of content online with no face-to-face meetings, were more common than hybrid courses, in which 30-79% of content is provided online with some in-person meetings. Web-facilitated courses, in which 1-29% of content is delivered online, were the most common among accredited JMC programs at the time (Castañeda, 2011). This study did not offer more specific information about public relations courses, in particular.
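Castañeda’s (2011) thresholds amount to a simple classification by the share of course content delivered online. As an illustration only (the function name, and rendering the cutoffs as code, are ours, not Castañeda’s), the taxonomy could be sketched as:

```python
def classify_course_format(pct_online: float) -> str:
    """Classify a course by the percentage of content delivered online,
    using the category thresholds reported by Castañeda (2011)."""
    if pct_online >= 80:
        return "online"           # 80%+ online, no face-to-face meetings
    if pct_online >= 30:
        return "hybrid"           # 30-79% online, some in-person meetings
    if pct_online >= 1:
        return "web-facilitated"  # 1-29% of content delivered online
    return "traditional"          # no online content

print(classify_course_format(85))  # online
print(classify_course_format(50))  # hybrid
print(classify_course_format(10))  # web-facilitated
```

A course with no online content at all falls outside Castañeda’s three categories and is labeled traditional here by assumption.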

Some say that research in online learning falls into four camps: there is no significant difference between online and traditional learning; online learning is more effective than face-to-face learning; online learning is less effective than face-to-face learning; and results of comparisons are too mixed, or courses too different, to draw effective conclusions (Castañeda, 2011). Indeed, various studies have reported different results in terms of student grades and satisfaction. At least two studies have compared online and traditional management classes and found no significant differences between the two classes in terms of grades (Daymont & Blau, 2008; Friday, Friday-Stroud, Green, & Hill, 2006). Moore and Jones (2015) studied introductory journalism writing courses and found that students in a hybrid course were more satisfied than students in an online course; the authors found mixed results in terms of grammatical skills among students by the end of the courses. However, multiple instructors taught these courses across four semesters, so results could be attributed to a number of factors other than, or in addition to, the course format.

One study that compared different formats of online learning for a mass communication course concluded that online discussion boards may help students who have difficulty expressing themselves in a traditional classroom (Rosenkrans, 2001). In particular, asynchronous discussion environments (where participants can log into a site anytime and participate at their convenience) were preferred by students (Rosenkrans, 2001). Similarly, research on diversity and online education reported that some students prefer asynchronous discussion boards because they provide a permanent transcript and time to reflect and formulate responses to thought-provoking questions (Shlossberg & Cunningham, 2016).

While numerous public relations scholars have turned their attention toward faculty and student use of technology in the classroom in recent years (e.g., Curtin & Witherspoon, 2000; Fraustino, Briones, & Janoske, 2015; Kinsky, Freberg, Kim, Kushin, & Ward, 2016; Tatone, Gallicano, & Tefertiller, 2017), there seems to be limited research into online education in public relations. One study (Kruger-Ross & Waters, 2013) applied the situational theory of publics to online learning and found that awareness (or problem recognition), involvement, and constraint recognition were related to students’ (N = 182) information seeking and processing in online PR courses. Another study examined instructor and student interaction across multiple online public relations courses taught by various faculty at one university over the course of two years (Moore, 2014). The author found that student-student interaction and self-discipline seemed to be the biggest predictors of success and satisfaction with online classes. Moore (2014) also noted that research regarding online PR courses had been “particularly deficient” and suggested that future research compare online PR classes with traditional formats (p. 272).

This research seeks to fill that gap by comparing four undergraduate public relations courses taught by the same professor in the same semester in consecutive years (fall 2016 and fall 2017) at the same university. With all other elements being equal, this study aimed to explore how course format (online vs. in-person) might affect student satisfaction and grades, as well as student motivations for taking one type of course over the other. With these goals in mind, this study asked the following preliminary research questions:  

RQ1: What motivates students to take an online versus in-person course in public relations?

RQ2: What is the relationship between course format (online vs. in-person) and student satisfaction?

RQ3: What is the relationship between course format (online vs. in-person) and student grades?

Student Characteristics for Online Learning Success

Although there seems to be limited research into online education in JMC and public relations, there is significant research about online education in general, including the characteristics that might make students successful in online learning environments. Kauffman (2015) summarized the literature related to student characteristics and skills needed to be successful in online courses in various disciplines. According to existing research, these characteristics and skills include the following: higher emotional intelligence, including self-awareness of needs and adequate management of feelings (Berenson, Boyles, & Weaver, 2008); and self-discipline, time management, organization, planning, and self-evaluation skills (Muilenburg & Berge, 2005; Ruey, 2010; Waschull, 2005; Yukselturk & Bulut, 2007). Additional research has shown success in online courses among students who have visual and read-write learning styles (Eom, Wen, & Ashill, 2006), as well as those who are self-motivated and self-directed, demonstrating an internal locus of control, with above-average communication and technological skills (Dabbagh, 2007).

These findings make sense when one considers the nature of online learning. As Kauffman (2015) noted: “More responsibility is placed on the learner, especially in asynchronous courses. The student is responsible for reviewing course material, taking exams at scheduled intervals, etc., which requires adequate self-regulation skills” (p. 7). Other research confirms the need for students to feel involved, have a sense of self-efficacy, and to be self-directed in their learning in online, flipped, or blended courses (Chyr, Shen, Chiang, Lin, & Tsai, 2017). Perhaps not surprisingly, some research has shown that students report feeling alienated or isolated when taking classes in online environments (McInnerney & Roberts, 2004; Tsai, 2013). However, encouraging online academic help-seeking (OAHS), which refers to students requesting assistance from peers or others through the internet, seems to help (Cheng & Tsai, 2011; Chyr et al., 2017). So, how do students know if they possess the skills and characteristics necessary to succeed in online courses, and how can faculty and administrators help students determine this fit before enrolling in online courses that may not be a good fit for their learning styles?

One measure—the Test of Online Learning Success (TOOLS)—has been created and validated to empirically assess such characteristics in students (Kerr et al., 2006). It measures subscales identifying students’ individual behavioral strengths and weaknesses regarding online learning. TOOLS was created over several years through three studies that assessed myriad characteristics that have been observed to be predictive of student success in online settings. While these studies were extensive, all of the research was conducted in Texas with education and social science courses. As scholars have noted (Kauffman, 2015; Kerr et al., 2006), more research is needed to explore whether TOOLS translates across diverse geographic regions, university types, and academic disciplines.

Much of the TOOLS research thus far seems to focus on non-traditional students. One study (Donavant, 2009) applied TOOLS to examine professional development training for police officers and found no statistically significant differences based on their TOOLS scores between those who completed and those who did not complete the training. Another study assessed discussion board post patterns among 14 students in a Teaching English as a Second Language (TESOL) course and found few differences in terms of quality or frequency of posts among students based on their TOOLS scores (Kim & Bateman, 2010).  To date, it does not seem that TOOLS has been applied to students in JMC courses, including public relations, and there does not seem to be much research comparing online and in-person courses, which Moore (2014) suggested is needed.

The Test of Online Learning Success includes 45 items measuring five characteristics that correlate with student success in terms of course grades and also with other characteristics that predict positive online learning experiences. The five areas assessed by TOOLS include academic skills, computer skills, independent learning, dependent learning, and need for online learning (Kerr et al., 2006). More specifically, the “academic skills” assessed by TOOLS refer to proficiency in reading and writing, which are arguably important for any type of learning. These academic skills have been reported to be the best predictor of student performance as measured by grades in online courses (Kerr et al., 2006). The test also measures independent learning and dependent learning, which are essentially the inverse of each other. The independent learning subscale yielded the most consistent results across previous studies, and it consists of items that assess a student’s ability to manage time, balance multiple tasks, and set goals. It also measures traits and behaviors regarding self-discipline, self-motivation, and personal responsibility. The items that measure dependent learning, on the other hand, describe students who need reminders from faculty regarding assignment due dates and other characteristics that would seem to make most learning (in-person or online) more difficult. According to previous research, students with high independent learning scores received significantly higher course grades than those with low independent learning scores (Kerr et al., 2006).

Computer skills, also known as computer literacy, is the fourth dimension assessed by TOOLS. Of course, one can imagine why such skills might be important to facilitate online learning or any learning in most modern classroom settings; however, previous research has shown that there was no difference in course grades when comparing students with high versus low levels of computer skills. As scholars note, as long as students have some degree of computer skills and as long as instructors and/or institutions provide some degree of technical support, even students with poor computer skills can perform well in online courses (Kerr et al., 2006).

Finally, TOOLS measures need for online learning, which consists of items related to students’ schedules, geographic distance from campus, and other elements that may make online learning more advantageous for some students. It makes sense that this subscale would predict a student’s need or desire for taking online classes, but what about the other subscales measured by TOOLS? Do these characteristics differ across students taking an online versus in-person course in public relations? Because there is limited research on TOOLS, and it does not seem to have been applied to students in the fields of JMC or public relations, this study proposes a hypothesis related to one of the student characteristics measured by TOOLS and then asks a research question related to the remaining characteristics.

H1: Students in online courses will show higher levels of need for online learning, one of the characteristics measured by TOOLS, than students in in-person courses.

RQ4: What is the relationship between course format and the other student characteristics measured by TOOLS (i.e., academic skills, computer skills, independent learning, and dependent learning)?

The Current Study

The current study compares four undergraduate Public Relations Principles courses (two online and two in-person) taught by the same professor in consecutive fall semesters at the same university, a large, public institution in the United States. All four classes were large. In the first year, 120 students enrolled in the in-person course and 135 students enrolled in the online course. In the second year, the school lowered the enrollment caps for both courses; the in-person class had 106 students and the online class had 113. The instructor aimed to keep the two course formats similar in most respects, other than the modes of course content delivery. For example, the in-person classes met twice a week for an hour and 15 minutes per class period, while most weeks students in the online courses watched two “modules.” The modules consisted of the same (or similar) PowerPoint slides used in the in-person class, with a voiceover the instructor recorded using Adobe Presenter to narrate the content. In conformance with recommendations provided by the university, the modules were typically no more than 10 to 15 minutes in length, so it can be argued that the in-person students received more content. However, the modules made for the online classes contained the same basic information regarding course concepts, definitions, and examples that students needed to understand the material and complete course assignments. The online courses utilized asynchronous discussion boards, audio and video components, and various types of student-instructor content interactions recommended by previous online education research (Fabry, 2009; Moore, 2014; Rosenkrans, 2001).

All four courses used Blackboard as the content management system, although the online classes relied on Blackboard to a greater extent. Both class formats also relied upon four exams as the main method of assessment (80% of the final grade). The other 20% of the final course grade consisted of discussion board assignments in the online classes and of class attendance, participation, and in-class assignments in the in-person classes. Students in both course formats had approximately 10 discussion board or in-class assignments throughout the semester. The online classes were asynchronous, meaning discussion boards were open to students 24 hours a day, 7 days a week; however, there were strict due dates for discussion board assignments as well as for exams. Course syllabi and other information are available upon request.
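As a concrete illustration of the grading scheme described above (80% exams, 20% discussion-board or in-class work), a final grade calculation might look like the following. The scores and the function name are hypothetical; only the weights come from the course description:

```python
def final_grade(exam_scores, other_scores, exam_weight=0.80):
    """Weighted final grade as described for both course formats:
    four exams count for 80% of the grade, and the remaining
    assignments (discussion boards or in-class work) for 20%."""
    exam_avg = sum(exam_scores) / len(exam_scores)
    other_avg = sum(other_scores) / len(other_scores)
    return exam_weight * exam_avg + (1 - exam_weight) * other_avg

# e.g., four exam scores and ten assignment scores on a 0-100 scale
grade = final_grade([88, 92, 79, 85], [100] * 8 + [90, 95])
```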


This study employed an online survey of students in the four undergraduate Public Relations Principles courses described at the end of the literature review. Online education researchers encourage the use of surveys to uncover students’ responses regarding effectiveness of such courses (Graham & Scarborough, 1999; Picciano, 1998; Rosenkrans, 2001). The questionnaire included the 45 items that measure the five subscales assessed by TOOLS—academic skills, computer skills, independent learning, dependent learning, and need for online learning—along with items designed to measure student motivations for taking a particular course, as well as satisfaction with the course. Demographic items also assessed students’ gender, age, race/ethnicity, year in school, major, and employment status. There was also one open-ended question that asked students for additional comments about the course. Final grades were pulled from both courses to help answer the research question related to course grades.

The online survey was developed using Qualtrics software and was distributed via email to students in the two courses in late November 2016 and then to the next set of two courses around the same time the next year (late November 2017). Participation was not required, but students were offered extra credit for completing the survey. The survey was anonymous, and all research procedures were approved by the university’s institutional review board. The body of the email and the consent form students read before participating in the study noted that taking the survey was voluntary and anonymous. Students’ emails and names were not collected, and students were notified they would not be personally identified in any published results. Teaching assistants tallied the extra credit points for the students. In terms of the rationale for the survey, the instructor explained that the purpose of the study was to explore motivations for taking the course, as well as “how the class has been for you.” The instructor also relayed the following to students: “There are also several items in the survey that will help educators determine what characteristics are important for success in online and traditional course offerings.”

Survey Measures

After identifying which section of the course students were in (online or in-person), students were asked the following two questions about their motivations: “Why are you taking this course?” and “Why did you choose this section of this course?” Response options are discussed in the findings section related to RQ1.

Student satisfaction was measured with three items adapted from previous research (Kim et al., 2005; Song et al., 2004). Students were asked to indicate the extent to which they agreed or disagreed with the following statements: “Generally or overall, I am satisfied with this course”; “I would recommend this course to a friend”; and “Compared to other similar courses, I feel like I learned a lot in this course.” Response options were on 5-point Likert-type scales ranging from strongly disagree (1) to strongly agree (5). After collecting responses, these items were assessed for reliability and demonstrated internal consistency (Cronbach’s α = .88). Thus, these items were summed and averaged to create a composite measure of satisfaction (M = 4.32, SD = .67).
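The reliability-and-composite procedure described above can be sketched in a few lines. The response matrix below is hypothetical; only the three-item, 5-point structure comes from the study:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) array --
    the internal-consistency check applied to the satisfaction scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
    return k / (k - 1) * (1 - item_vars / total_var)

# hypothetical responses to the three 5-point satisfaction items
resp = np.array([[5, 5, 4], [4, 4, 4], [3, 4, 3], [5, 4, 5], [2, 3, 2]])
alpha = cronbach_alpha(resp)
composite = resp.mean(axis=1)  # summed-and-averaged composite per student
```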

The 45 items making up the five subscales assessed by TOOLS are listed in Appendix A.  Participants were asked to rate their agreement with all of the statements using a Likert-type response format ranging from 1 (strongly disagree) to 5 (strongly agree). After collecting survey responses, all of the TOOLS subscales were assessed for reliability and demonstrated internal consistency. Specifically, reliability measures were as follows: academic skills (α = .74, M = 3.94, SD = .41); computer skills (α = .96, M = 4.71, SD = .49); independent learning (α = .84, M = 4.16, SD = .48); dependent learning (α = .81, M = 3.61, SD = .80); and need for online learning (α = .85, M = 2.67, SD = .94). The items in each TOOLS subscale were summed and averaged to create composite measures for further analysis.

Data Analysis

Data analysis was performed using SPSS (Version 24.0) and jamovi (jamovi.org, n.d.). The results of the analyses are described next, beginning with demographic information about the survey respondents.


As mentioned, 226 students enrolled in the two in-person classes over the two fall semesters, and 248 students enrolled in the two online classes during the same semesters; it should be noted that five students dropped or withdrew from the online course in the first year. During the second year, one student moved from the online class to the in-person class because she was visually impaired and decided the in-person class would be better for her. The survey was completed by 452 students across the two course formats (in-person n = 207; online n = 245), for a response rate of 95%.

The demographics of the class formats were similar in some ways. The average age of students in all four classes was about 20, and the majority of students in both class formats were white (about 83% in-person; almost 85% online). Both class formats were majority female, with more male students in the in-person sections (22%), compared with the online sections (11%). The in-person classes had more public relations majors (49%) compared with the online classes (22%), while the online classes had more marketing/business and “other” majors (51%) than the in-person classes (31%). In terms of employment, the online classes had more students employed either part-time or full-time (65%) when compared with the in-person classes (50%). Other demographic information about the students in both course formats can be found in Table 1.

Table 1. Sample characteristics by course format (In = In-person; On = Online)

[Table values could not be recovered from the source. Row categories included race/ethnicity (American Indian, Asian/Pacific Islander, Black/African American, Prefer Not to Answer), major (Public Relations, Other Communication), year in school (First Year through Fifth Year +), and employment status (Not employed).]

Note. Percentages are based on n = 207 for the in-person class and n = 245 for the online class, with the exception of age, which had 206 responses for that particular item for the in-person class.

The first research question (RQ1) asked: “What motivates students to take an online versus in-person course in public relations?” The majority of students in both course formats took the course because it was required for their major or minor, and for some it fulfilled an elective requirement. However, the in-person classes were filled with primarily PR majors (65%, n = 134) followed by some PR minors (29%, n = 60), while the online classes were filled with primarily PR minors (47%, n = 116) followed by some PR majors (35%, n = 86). There were more students taking the course to fulfill an elective requirement in the online classes (17%, n = 41) than in the in-person classes (5%, n = 10). One student in the in-person format and three students in the online format selected “the topic sounded interesting” as the main reason for taking the course.

Beyond major, minor, and elective requirements, the two major motivations for taking a particular format of the course came down to scheduling and preference. More specifically, the majority of students in the in-person format (58%, n = 121) selected “the course days and times fit my schedule,” while the majority of students in the online format (76%, n = 187) selected “the online option best fit my busy schedule.” The second most selected option for students in the in-person format was “I prefer traditional courses to online courses” (32%, n = 67), while the second most selected option for students in the online format was “I prefer online courses to traditional courses” (11%, n = 27). Approximately 3% of students (n = 7) in the in-person courses selected, “I have never taken an online course and did not wish to take one,” while two students in the online courses selected, “I am not on campus this semester so the online course was my only option.” Another 4% of students (n = 8) in the in-person courses selected “other” and specified things like, “I have ADHD and focus better in an in-person class”; “I am a PR major and decided my PR foundation courses should be taken in person”; “I took the online version and it was far more difficult so I decided to retake it and take the in-person class instead”; and “I am a transfer and wasn’t sure how I felt about doing an online course my first semester.” Similarly, about 3% of students (n = 7) in the online courses selected “other” and wrote in reasons such as: “I needed the class due to medical instructions” and “this was the only section available when I registered.”

The second research question (RQ2) asked: “What is the relationship between course format (online vs. in-person) and student satisfaction?” Results from an independent samples t-test indicated no statistically significant difference in satisfaction between students in the in-person classes (M = 4.35, SD = .67) and the online classes (M = 4.30, SD = .68), t = .839, p = .402. Because of the marginal difference observed between the two course formats, follow-up statistical equivalence tests were conducted using the two one-sided tests (TOST) procedure (see Lakens, 2017, for a detailed overview) in order to discern whether the difference was small enough to denote equivalence. Findings from the TOST analysis revealed that students in the two course formats exhibited statistical equivalence on the measure of satisfaction, as denoted by statistically significant upper and lower equivalence test t-values (TOST upper: t = 6.14, p < .001; TOST lower: t = −4.46, p < .001).
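For readers unfamiliar with the TOST procedure, a rough sketch of the analysis follows. The satisfaction scores are simulated from the reported means and standard deviations, and the ±0.3 equivalence bound is an arbitrary illustration (the article does not report its bounds), so these statistics will not match the reported values:

```python
import numpy as np
from scipy import stats

def tost_equivalence(a, b, bound):
    """Two one-sided tests (TOST; Lakens, 2017) for equivalence of two
    independent means within +/- `bound` raw units. Both one-sided
    tests must be significant to conclude equivalence."""
    na, nb = len(a), len(b)
    diff = np.mean(a) - np.mean(b)
    # pooled standard error (Student's t, equal variances assumed)
    sp = np.sqrt(((na - 1) * np.var(a, ddof=1) +
                  (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2))
    se = sp * np.sqrt(1 / na + 1 / nb)
    df = na + nb - 2
    t_upper = (diff - bound) / se          # H0: diff >= +bound
    t_lower = (diff + bound) / se          # H0: diff <= -bound
    p_upper = stats.t.cdf(t_upper, df)
    p_lower = stats.t.sf(t_lower, df)
    return (t_upper, p_upper), (t_lower, p_lower)

rng = np.random.default_rng(1)
in_person = rng.normal(4.35, 0.67, 207)    # simulated satisfaction scores
online = rng.normal(4.30, 0.68, 245)

t_stat, p_val = stats.ttest_ind(in_person, online)   # standard t-test
(tu, pu), (tl, pl) = tost_equivalence(in_person, online, bound=0.3)
```

The standard t-test asks whether the two means differ; TOST flips the logic and asks whether the difference is provably smaller than the chosen bound, which is why both results can be reported together.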

An additional open-ended question asked students in both course formats: “Is there anything else you would like the instructor to know about your experience in this course?” Students wrote in varying responses, some of which commented on the nature of the online or in-person courses and noted some differences between the two formats, especially in terms of student satisfaction and learning preferences. Some of these comments shed light on possible best practices for online teaching and learning in public relations. For example, students in the online courses wrote in responses such as, “I really enjoyed the video modules. I love being able to pause them and take notes . . . . You can’t do that in an in-person course!” and “I liked the discussion questions and thought they cultivated critical thinking.” Another wrote, “The discussion boards are just tough enough that you cannot fake them, but relatively easy and stress free.” Students in the online courses seemed to appreciate the organization of the course, consistency of weekly due dates, reminders that were sent, and “very good email communication!” Others in the online courses appreciated being able to work at their own pace: “I liked the format of the course a lot because I was able to learn from it, but at my own time and pace.” Another student said: “This was my first online class I have ever taken and I really enjoyed it! I feel like I learned a lot more with the online class because I could go at my own pace, while also completing many other classes and a part-time job.”

Some of the comments from the online courses seemed to stem from students having fairly low expectations for online learning. For instance, one wrote: “I love the way online lectures are set up with the powerpoint [sic] and audio. You made an online class much more interesting that I thought it would be!” Others commented, “I appreciated the effort put into each online video module as well as the appropriate amount of reminders for remaining online workload,” and “This was one of the better online courses I have taken in the past four years!” Others expressed frustration with previous online course elements: “One thing I dislike about past online courses is that the professor will use a software associated with the textbook that can make the class extremely costly (up to $200) on top of tuition. I like that this class was blackboard based [sic] and a lot more affordable.”

Students in the in-person classes enjoyed the guest speakers that were brought in to speak about various types of public relations jobs. One student in the in-person section wrote: “I have really enjoyed this course and appreciate the frequent guest speakers. I much prefer being in class.” Others commented on the variety of examples, videos, discussions, and other teaching techniques used in the in-person classes. For example, several students enjoyed the “openness to class discussion,” and others commented on qualities such as enthusiasm, care or concern for students, or said things like, “thank you for bringing so much energy and life to this class.” Another student commented: “Having a class that meets face-to-face is nice because it allows us to have conversation, have guest speakers, and other modes of learning that may not be available via an online platform.”

As with any class or set of student evaluations, there were also criticisms and suggestions about how to improve the course. Students in both course formats commented about the exams being too difficult, and one student wrote, “I think more group work would be helpful.” Another student asked for a few more assignments “for students to boost their grade.” Some of these comments were not specific to either format and could be applied to both course formats. At least two students who took the online course seemed to think the in-person course would have been better. One wrote: “I felt that the supplemental videos were not as extensive as the notes people receive in the traditional classroom setting.” Another commented: “I think I would have done better in the in-person section of this class because it was hard to break up the online work throughout the week and I found myself doing it all in one day which wasn’t beneficial.” The attendance policy was mentioned as a complaint among students in the in-person course.

The third research question (RQ3) asked: “What is the relationship between course format (online vs. in-person) and course grades?” Results from an independent samples t-test indicated no statistically significant difference in course grades between students in the in-person classes (M = 87.02, SD = 6.80) and the online classes (M = 88.08, SD = 9.05), t = −1.75, p = .081. Because of the marginal differences observed, follow-up statistical equivalence tests were conducted using TOST procedures (Lakens, 2017) to determine whether the difference was small enough to denote equivalence. Findings from the TOST analysis revealed that students in both course formats exhibited statistical equivalence in terms of their grades, as denoted by statistically significant upper and lower TOST equivalence test t-values (TOST Upper = −9.29, p < .001; TOST Lower = 5.78, p < .001).

The fourth research question and one hypothesis were related to the student characteristics measured by TOOLS. The hypothesis predicted that students in the online format would show higher levels of need for online learning than students in the in-person format. Results from an independent samples t-test indicated that there were statistically significant differences in terms of need for online learning between students in the in-person format (M = 2.25, SD = .92) and students in the online format (M = 3.02, SD = .80), t = −9.38, p < .001. Students in the online course reported higher levels of need for online learning, and the difference between the two groups was significant. Thus, H1 was supported.

This study’s last research question (RQ4) asked: “What is the relationship between course format and the other student characteristics measured by TOOLS (i.e., academic skills, computer skills, independent learning, and dependent learning)?” According to independent samples t-tests, there were no statistically significant differences between students in the in-person format and the online format in terms of the remaining characteristics measured by TOOLS, including academic skills, computer skills, independent learning, and dependent learning (see Table 2). Because of the marginal differences observed between students in the two course formats for all four of these subscales, follow-up statistical equivalence tests were conducted using TOST procedures. Similar to the findings related to student satisfaction and course grades, results from the TOST analyses revealed students in the two course formats exhibited statistical equivalence in terms of academic skills, computer skills, independent learning, and dependent learning (see Table 2).

Table 2. Means, standard deviations, and results from t-tests and TOST procedures for academic skills, computer skills, independent learning, and dependent learning

Variable                          M     SD    Test         Welch’s t   df    p
Academic Skills: In-person        3.95  .41   t-test       0.895       437   .371
Academic Skills: Online           3.92  .41   TOST Upper   6.19        437   < .001
                                              TOST Lower   −4.40       437   < .001
Computer Skills: In-person        4.68  .46   t-test       −0.974      445   .331
Computer Skills: Online           4.73  .50   TOST Upper   4.34        445   < .001
                                              TOST Lower   −6.28       445   < .001
Independent Learning: In-person   4.11  .46   t-test       −1.698      445   .090
Independent Learning: Online      4.19  .48   TOST Upper   3.61        445   < .001
                                              TOST Lower   −7.01       445   < .001
Dependent Learning: In-person     3.55  .77   t-test       −1.123      445   .262
Dependent Learning: Online        3.64  .82   TOST Upper   4.19        445   < .001
                                              TOST Lower   −6.43       445   < .001

Note. Levene’s tests were significant (p < .05), indicating a violation of the assumption of homogeneity of variance underlying Student’s t. The Welch test is an approximate test for the equality of means without the homogeneous variance assumption.
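For readers unfamiliar with the notation, the Welch statistic and its approximate degrees of freedom referenced in the note above take the standard textbook form (this formula is general statistical background, not notation taken from the article):

```latex
t = \frac{\bar{x}_1 - \bar{x}_2}{\sqrt{\dfrac{s_1^2}{n_1} + \dfrac{s_2^2}{n_2}}},
\qquad
df \approx \frac{\left(\dfrac{s_1^2}{n_1} + \dfrac{s_2^2}{n_2}\right)^2}
{\dfrac{(s_1^2/n_1)^2}{n_1 - 1} + \dfrac{(s_2^2/n_2)^2}{n_2 - 1}}
```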

Finally, although it was not the focus of a research question, the study also examined the TOOLS characteristics and student satisfaction related to students’ year in school, and there were some interesting findings. In terms of satisfaction with the course, there was a significant negative correlation between year in school and satisfaction (r = -.114, p < .05). There was also a significant negative correlation between year in school and independent learning (r = -.108, p < .05). There was a significant positive correlation between year in school and need for online learning (r = .157, p = .001). The relationships between year in school and the three remaining variables measured by TOOLS (computer skills, academic skills, and dependent learning) were not significant. Implications related to these findings as well as to the findings specific to the research questions and hypothesis are discussed below.
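As a rough illustration of the correlation analysis above, the following sketch computes a Pearson r and an approximate two-sided p-value from raw scores. The data here are fabricated toy values, not the study’s data, and a large-sample normal approximation stands in for the t distribution.

```python
import math
from statistics import NormalDist

def pearson_r_test(x, y):
    """Pearson correlation with an approximate two-sided p-value.
    The p-value uses a normal approximation to the t distribution."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    r = sxy / math.sqrt(sxx * syy)
    # t statistic for H0: rho = 0, with n - 2 degrees of freedom
    t = r * math.sqrt((n - 2) / (1 - r ** 2))
    p_two_sided = 2 * (1 - NormalDist().cdf(abs(t)))
    return r, t, p_two_sided

# Toy data: year in school (1-4) paired with a 1-5 satisfaction rating.
years = [1, 1, 2, 2, 3, 3, 4, 4]
satisfaction = [5, 4, 5, 4, 4, 3, 4, 3]
r, t, p = pearson_r_test(years, satisfaction)  # r is negative for this toy data
```

A negative r here, as in the study’s year-in-school and satisfaction finding, simply means the two variables move in opposite directions.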


This study may be one of the first to assess online undergraduate education in public relations, specifically examining student motivations for taking online versus in-person courses and comparing the two formats in terms of student satisfaction and grades. This research also appears to be the first to explore the TOOLS characteristics, previously studied in online courses in other fields, among students in JMC and specifically in public relations education. The findings may be useful to public relations and JMC faculty, administrators, advisers, students, parents, and anyone else teaching, taking, or debating the merits of online courses.

According to the results of this study, student motivations revolved primarily around scheduling needs, and it is not surprising that many of those in the online courses cited busy schedules as the main reason for choosing that format. Preference for a particular format was the other major factor in decision making, with some students preferring in-person classes and others preferring online options. Several students also noted that they had never taken an online course, and there may be some fear of the unknown, or perhaps some stigma about online courses not being quite as good as in-person courses. Such fears should dissipate as online courses continue to proliferate, and faculty teaching online can help assuage them by letting students know early on what to expect. Pointing to research like this study, which shows equivalence in grades and satisfaction, might help reduce stigma or change perceptions that online courses are less rigorous than in-person courses.

Those who advise students could also provide glimpses of online courses for students who have concerns, and university websites could help mitigate this fear in several ways. For instance, sample discussion board assignments or online exam questions could be posted on a school or university web page. Universities and staff might also share the Test of Online Learning Success (TOOLS) online or in person, as a few schools appear to be doing already (see, e.g., University of Arkansas, 2019, for a modified version of TOOLS). Meanwhile, continuing to create and deliver quality online courses may help dissolve the stigma that online courses are inferior to in-person courses. As more four-year public and private universities offer online courses taught by professors who invest substantial time in creating content and managing those courses, students’ experiences should improve, and online courses may become less associated with for-profit institutions or with negative perceptions formed when online courses were less prevalent than they are now.

Student satisfaction and course grades were statistically equivalent between the two course formats in this study. Of course, this may be because the two courses were taught by the same instructor during the same semester at the same university, and attempts were made to keep the courses similar in terms of content and assessment. Still, this may suggest a key best practice for online teaching: faculty should treat online courses much as they would in-person courses in terms of time, energy, communication, and so on. In fact, in some ways, online courses require more communication because students do not get the face-to-face time of an in-person course. Online courses can also take extensive time to develop on the front end, but the return on investment in terms of student learning and satisfaction may be worth the effort. Moreover, that there was equivalence across four courses (not just two) taught over two years with a fairly substantial student population (N = 452) suggests some reliability, although more research would be helpful moving forward.

The open-ended responses related to particular components of the online course may suggest additional best practices. The online modules, which consisted of PowerPoint slides with recorded voiceover and some embedded videos, seemed to be appreciated by students. The discussion board assignments also seemed to be valued for helping students learn and interact on a deeper level with course content and with their classmates. These two elements may be helpful for faculty developing online public relations courses in the future, where online examples abound and the changing nature of the industry can lead to interesting online discussions. Because of the fast-paced nature of news and examples in PR and related industries, videos should be embedded in PowerPoints only if they are relatively timeless or if the instructor plans to update the modules frequently. Otherwise, videos can be added separately, or links can be distributed via email or Blackboard to supplement the pre-recorded lecture modules.

Creating course groups via social media or using a class hashtag in conjunction with whatever online learning system universities prefer or require is also especially relevant for students in public relations and related subject areas where social media is a focus. Such tactics are already being used by faculty in public relations for in-person and online classes (Fraustino et al., 2015; Janoske, Byrd, & Madden, 2019), but these tactics may be particularly helpful for encouraging the type of online academic help-seeking that has been found to make students feel more in control of their learning experience (Cheng & Tsai, 2011; Chyr et al., 2017). For example, while some students may be reluctant to go to the official course page if they have a question while studying late at night, they may be more likely to turn to social media to ask questions of their peers. Online courses might encourage this type of interaction among students and between students and instructors more so than in-person courses, but, of course, more research is needed.

Students also seemed to like the asynchronous method of course delivery and discussion board assignments, as other scholars have found (Rosenkrans, 2001). However, as Kauffman (2015) noted, large courses—such as those included in this study—may make it difficult for instructors to manage and students to participate in highly engaging discussions. While this issue affected student satisfaction in previous studies, it did not seem to happen in this case. However, that may be because of some of the fairly low expectations that seem to exist related to online courses, some of which seem to stem from negative experiences or perhaps false perceptions of online education. Again, findings such as those reported in this research, showing equivalence in student grades and student satisfaction, should help change those perceptions over time.

Many students in the online courses also noted appreciating things like weekly deadlines, the organization and structure of the course, frequent email communication, and reminders about due dates. In terms of structure, the instructor worked closely with someone at the university’s Center for Teaching Excellence to determine the best way to set up an online course. While this may not be available at all universities, more institutions are offering such support, and similar information is also available online or in books and journal articles about online education. Strategies vary and there are many ways to be successful, but in this case, the online course was designed very much like the in-person course with weekly readings, modules (like in-person lectures), and discussion board posts (like in-class discussions), and students seemed to respond well to these elements.

Additionally, while some may argue that students should not need constant reminders of due dates, the students in the online courses typically received at least one email per week from the instructor. These emails outlined the coming week’s content, attempted to make connections to previous weeks or explain why the topics were important, and often reminded students of upcoming deadlines for discussion board posts. Many students noted that they appreciated these emails, although some said they still had a hard time keeping up with deadlines because it is easy to forget them without in-person class meetings as reminders. Students also commented on instructor qualities such as enthusiasm, even in the online course, and such qualities may be conveyed via email. One student in an online course wrote: “You can tell she cares about her students through her emails and voice when speaking on lectures.” Indeed, including elements such as your voice or face through video might make online classes seem more personal and similar to in-person courses. This may be particularly important to students in public relations, where relationships are a central focus and may be the reason some students are majoring, minoring, or simply interested in PR.

It may be that students in PR have different expectations in terms of online teaching; this would be an interesting topic for future research. New technologies such as VoiceThread (2019) make it possible for students to comment and contribute to discussions via voice, video, or text as well. These types of educational innovations mimic some of the creating and commenting that many students are already doing in various online environments, which may make an online course seem more relevant and/or enjoyable to them, perhaps particularly for students in public relations or JMC. More research is needed, however, on these various strategies, tactics, and technologies for teaching online in these areas.

Many students in the in-person courses commented on the guest lectures, but it should be noted that the online courses also included recorded Skype calls with two former students who were working in the public relations industry. These online guest lectures were also appreciated by the students in the online courses, as noted in comments such as the following: “I enjoyed the interview with the previous student who was working for a big pr agency in new york [sic].” Finally, students in both courses noted their appreciation of study guides, which were provided via Blackboard in both course formats. While study guides are nothing new, it might be worth mentioning that these traditional types of teaching and learning tools are still valued in online environments.

Beyond offering some best practices, this research confirms the reliability of the TOOLS subscales used in previous studies (Kerr et al., 2006). Interestingly, there were no differences between students in the two course formats in terms of academic skills, computer skills, independent learning, and dependent learning. This is somewhat surprising, considering common perceptions about online education and mixed results in previous research about its effectiveness, but perhaps less so in this case, where direct comparisons were made. General academic skills, for instance, would presumably matter in any kind of learning environment, so equivalence between the two formats on that subscale makes sense.

In terms of computer skills, it could be that as “digital natives” enter higher education and take online courses, there is not much variance in computer skills. It may become a moot subscale, assuming students have had some experience with computers, which is typical of most traditional students these days. Of course, computer skills could still be important to assess among non-traditional students who may not have had as much experience with computers for various reasons. Independent learning is probably important for students in any class and does not apply only to online classes, so perhaps it is not surprising that there was equivalence among students in both course formats in terms of independent and dependent learning. It is still interesting, though, considering that this is one of the first studies to do direct comparisons between the two course formats, and considering some presumptions about independent learners perhaps being more inclined to take online courses while dependent learners might be more likely to choose traditional or in-person courses. Again, as online education options proliferate and become more sophisticated, it may be that these differences do not matter, or perhaps other factors are more important for determining student success.

It is not surprising that the need for online learning was higher among students in the online course. Coupled with students’ stated motivations for taking online courses, including jobs and other responsibilities, this may mean that need and/or preference are the biggest factors to consider when debating online education. However, as noted earlier in this study, at least five students dropped out of the online course during the semester, and it could be that some did so because they did not enjoy the format or could not manage the self-direction it requires. Another student tried the online course but switched to the in-person course because the former was too difficult. Time management and self-motivation, which are specifically mentioned in some items of the independent learning subscale of TOOLS, may be the most important factors for success in online courses. They also likely influence students’ perceptions of online courses while enrolled.

In the future, if researchers are trying to pare down TOOLS or combine elements of TOOLS with other student characteristics, independent learning and need for online learning may be the most important items to measure. Academic skills are likely important for all students; computer skills may be a moot subscale among today’s students; and dependent learning may be redundant if a study already considers independent learning. Interestingly, independent learning and need for online learning were the only TOOLS characteristics to have statistically significant correlations with year in school. The relationship between year in school and need for online learning was positive, meaning the further along students were, the more they desired or required online courses. This may be because students closer to graduating (in their third or fourth year) are more likely to be working or interning while taking courses. The relationship between year in school and independent learning was negative, meaning the further along students were, the less likely they were to perceive themselves as self-motivated and skilled at time management. The reason for this relationship is not known; however, it may be connected to the factors just mentioned related to the need for online learning. Perhaps students earlier in their college careers have fewer obligations (jobs, internships, extracurricular activities), which could be why they also feel more independent in their ability to learn and achieve at their own pace. Future research could further explore the TOOLS characteristics in other online public relations courses or other types of courses related to JMC. Other characteristics or variables may also be of interest for understanding success in online education in PR and/or JMC.


This research helps shed light on what it might take to teach or take an undergraduate online course in public relations. In addition to some of the best practices and student characteristics already discussed, research into online education can help us determine who might benefit most from such opportunities. Online courses might be helpful for students who may be more open online or are less likely to speak up in the classroom, or for students who feel discriminated against in a traditional classroom for any reason. Online education might also be best for nuanced topics, for which time, reflection, and many diverse perspectives would aid in understanding, as some scholars have suggested (Shlossberg & Cunningham, 2016). Asynchronous online discussions allow for this time and reflection, and not meeting face-to-face may make it easier for some students to express diverse perspectives. All of this must be managed by an engaged instructor, though, and more research is needed to validate some of the findings from this study, which is limited in size, scope, and subject matter.

Indeed, while it seems public relations education—and JMC education in general—might benefit from online course offerings, more research is needed. More courses could be developed on various topics by faculty at diverse institutions, and similar comparison studies could help provide a more complete picture. Additionally, assessing different types of assignments would help us understand more about what works and does not work in online courses related to these fields. This study relied on undergraduate student responses to a survey (in addition to final course grade comparisons), which could be considered limitations of this research. Qualitative methods, including interviews and focus groups, and experimental research could provide additional insights into online education in public relations and other areas of JMC. Longitudinal research to track whether students retain more over time in PR or JMC courses taught online or in person would add another layer of insight to understanding online education in these fields.

Many graduate programs are now offered completely online, and this is another area that is ripe for future research. Because of the focus and reliance on new technologies, digital and social media, and relationships in PR and JMC, which present unique opportunities and challenges when teaching in online environments, more research may be particularly useful in these areas to further hone best practices and offer faculty specific resources and tips for teaching online successfully. Additional research on hybrid courses is of interest as well. Courses that deliver much of the content online yet meet periodically to engage in discussions and/or hear from guests may provide the best of both types of education for students.

While it seems too early to go along with Godin’s (2017) suggestions of doing away with all lecture halls, there is no doubt that higher education has changed and continues to change at a rapid pace. It would be wise to continue to explore the many possibilities provided by technology and innovation to ascertain what might be best for student learning in public relations, in particular, and in higher education as a whole. This study adds to existing research in these areas, but more research is needed and encouraged for the sake of students, faculty, staff, and other decision makers in higher education. 


Allen, I. E., & Seaman, J. (2010). Learning on demand: Online education in the United States. Wellesley, MA: The Sloan Consortium. Retrieved from https://eric.ed.gov/?id=ED529931

Allen, I. E., & Seaman, J. (2011). Going the distance: Online education in the United States, 2011. Wellesley, MA: The Sloan Consortium. Retrieved from http://sloanconsortium.org/publications/survey/going_distance_2011

Berenson, R., Boyles, G., & Weaver, A. (2008). Emotional intelligence as a predictor for success in online learning. International Review of Research in Open & Distance Learning, 9(2), 1-16. Retrieved from https://www.learntechlib.org/p/77656/

Castañeda, L. (2011). Disruption and innovation: Online learning and degrees at accredited journalism schools and programs. Journalism & Mass Communication Educator, 66(4), 361-373. doi: 10.1177/107769581106600405

Cheng, K. H., & Tsai, C. C. (2011). An investigation of Taiwan university students’ perceptions of online academic help-seeking, and their web-based learning self-efficacy. Internet and Higher Education, 14(3), 150-157. doi: 10.1016/j.iheduc.2011.04.002

Chyr, W. L., Shen, P. D., Chiang, Y. C., Lin, J. B., & Tsai, C. W. (2017). Exploring the effects of online academic help-seeking and flipped learning on improving students’ learning. Journal of Educational Technology & Society, 20(3), 11-23. Retrieved from https://www.jstor.org/stable/26196116?seq=1#page_scan_tab_contents

Curtin, P., & Witherspoon, E. (2000). Computer skills integration in public relations curricula. Journalism & Mass Communication Educator, 54, 23-34. doi: 10.1177/107769589905400103

Dabbagh, N. (2007). The online learner: Characteristics and pedagogical implications. Contemporary Issues in Technology and Teacher Education, 7(3), 217-226. Retrieved from http://www.citejournal.org/vol7/iss3/general/article1.cfm 

Daymont, T., & Blau, G. (2008). Student performance in online and traditional sections of an undergraduate management course. Journal of Behavioral & Applied Management, 9(3), 275-294. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?

Donavant, B. W. (2009). The new, modern practice of adult education: Online instruction in a continuing professional education setting. Adult Education Quarterly, 59(3), 227-245. doi: 10.1177/0741713609331546

Eom, S., Wen, H., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: an empirical investigation. Decision Sciences Journal of Innovative Education, 4(2), 215-235. doi: 10.1111/j.1540-4609.2006.00114.x

Fabry, D. L. (2009). Designing online and on-ground courses to ensure comparability and consistency in meeting learning outcomes. Quarterly Review of Distance Education, 10(3), 253-261. Retrieved from http://www.infoagepub.com/index.php?id=89&i=43 

Fraustino, J. D., Briones, R., & Janoske, M. (2015). Can every class be a Twitter chat?: Cross-institutional collaboration and experiential learning in the social media classroom. Journal of Public Relations Education, 1(1), 1-18. doi: 10.1080/17404622.2016.1219040

Friday, E., Friday-Stroud, S. S., Green, A. L. & Hill, A. Y. (2006). A multi-semester comparison of student performance between multiple traditional and online sections of two management courses. Journal of Behavioral and Applied Management, 8(1), 66-81. Retrieved from https://pdfs.semanticscholar.org/bc9b/e03d1c7320942506fb6745e66d7cf67fe0be.pdf

Godin, S. (2017, Nov. 25). No laptops in the lecture hall. Medium. Retrieved from https://medium.com/@thisissethsblog/no-laptops-in-the-lecture-hall-1847b6d3315

Graham, M., & Scarborough, H. (1999). Computer mediated communication and collaborative learning in an undergraduate distance education environment. Australian Journal of Educational Technology, 15(1), 20-46. Retrieved from https://ajet.org.au/index.php/AJET/article/view/1845

Jamovi. (n.d.). Retrieved from https://www.jamovi.org/

Janoske, M., Byrd, R., & Madden, S. (2019). One-liners and catchy hashtags: Building a graduate student community through Twitter chats. Journal of Public Relations Education, 5(1), 70-100. doi: 10.4018/ijdet.2014070106

Kauffman, H. (2015). A review of predictive factors of student success in and satisfaction with online learning. Research in Learning Technology, 23, 1-13. doi: 10.3402/rlt.v23.26507

Kerr, M. S., Rynearson, K., & Kerr, M. C. (2006). Student characteristics for online learning success. Internet and Higher Education, 9(2), 91-105. doi: 10.1016/j.iheduc.2006.03.002

Kim, H. K., & Bateman, B. (2010). Student participation patterns in online discussion: Incorporating constructivist discussion into online courses. International Journal on E-learning, 9(1), 79-98. Retrieved from https://www.learntechlib.org/p/28165/

Kim, K. J., Liu, S., & Bonk, C. J. (2005). Online MBA students’ perceptions of online learning: Benefits, challenges, and suggestions. The Internet and Higher Education, 8(4), 335-344. doi: 10.1016/j.iheduc.2005.09.005

Kinsky, E. S., Freberg, K., Kim, C., Kushin, M., & Ward, W. (2016). Hootsuite University: Equipping academics and future PR professionals for social media success. Journal of Public Relations Education, 2(1), 1-18. Retrieved from https://aejmc.us/jpre/2016/02/15/hootsuite-university-equipping-academics-and-future-pr-professionals-for-social-media-success/

Kleinman, S. (2005). Strategies for encouraging active learning, interaction, and academic integrity in online courses. Communication Teacher, 19(1), 13-18. doi: 10.1080/1740462042000339212

Kruger-Ross, M. J., & Waters, R. D. (2013). Predicting online learning success: Applying the situational theory of publics to the virtual classroom. Computers & Education, 61, 176-184. doi: 10.1016/j.compedu.2012.09.015

Lakens, D. (2017). Equivalence tests: A practical primer for t-tests, correlations, and meta-analyses. Social Psychological and Personality Science, 8(4), 355–362. doi: 10.1177/1948550617697177

McInnerney, J. M., & Roberts, T. S. (2004). Online learning: Social interaction and the creation of a sense of community. Educational Technology & Society, 7(3), 73-81. Retrieved from https://www.jstor.org/stable/jeductechsoci.7.3.73

Moore, J., & Jones, K. (2015). The journalism writing course: Evaluation of hybrid versus online grammar instruction. Journalism & Mass Communication Educator, 70(1), 6-25. doi: 10.1177/1077695814551831

Moore, J. (2014). Effects of online interaction and instructor presence on students’ satisfaction and success with online undergraduate public relations courses. Journalism & Mass Communication Educator, 69(3), 271-288. doi: 10.1177/1077695814536398

Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor analytic study. Distance Education, 26(1), 29-48. doi: 10.1080/01587910500081269

Palfrey, J., & Gasser, U. (2008). Born digital: Understanding the first generation of digital natives. New York, NY: Basic Books.

Picciano, A. G. (1998). Developing an asynchronous course model at a large, urban university. Journal of Asynchronous Learning Networks, 2(1). Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=

Poniatowski, K. (2012). Getting students ready to write: An experiment in online teaching and learning. Journalism & Mass Communication Educator, 67(2), 120-133. doi: 10.1177/1077695812440943

Rosenkrans, G. (2001). Design considerations for an effective online environment. Journalism & Mass Communication Educator, 56(1), 43-61. doi: 10.1177/107769580105600105

Ruey, S. (2010). A case study of constructivist instructional strategies for adult online learning. British Journal of Educational Technology, 41(5), 706-720. doi: 10.1111/j.1467-8535.2009.00965.x

Seaman, J. E., Allen, I. E., & Seaman, J. (2018). Grade increase: Tracking distance education in the United States. Babson Park, MA: Babson Survey Research Group. Retrieved from https://eric.ed.gov/?id=ED580852

Shlossberg, P., & Cunningham, C. (2016). Diversity, instructional research, and online education. Communication Education, 65(2), 229-232. doi: 10.1080/03634523.2015.1098713

Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education, 7(1), 59-70. doi: 10.1016/j.iheduc.2003.11.003

Sutherland, P. J. (2003). Diffusion of courses with World Wide Web features: Perceptions of journalism and mass communication program administrators. Journalism & Mass Communication Educator, 57(4), 384-395. Retrieved from https://eric.ed.gov/?id=ED456469

Tatone, J., Gallicano, T. D., & Tefertiller, A. (2017). I love tweeting in class, but . . .: A qualitative study of student perceptions of the impact of Twitter in large lecture classes. Journal of Public Relations Education, 3(1), 1-13. Retrieved from https://aejmc.us/jpre/2017/05/24/i-love-tweeting-in-class-but-a-qualitative-study-of-student-perceptions-of-the-impact-of-twitter-in-large-lecture-classes/

Tsai, C. W. (2013). An effective online teaching method: The combination of collaborative learning with initiation and self-regulation learning with feedback. Behaviour & Information Technology, 32(7), 712-723. doi: 10.1080/0144929X.2012.667441

University of Arkansas. (2019). U of A online: Online course readiness quiz. Retrieved from https://online.uark.edu/students/readiness-quiz.php

VoiceThread. (2019). Retrieved from https://voicethread.com/

Waits, T., & Lewis, L. (2002). Distance education at degree-granting postsecondary institutions: 2000–2001. Education Statistics Quarterly, 5(3). Retrieved from http://nces.ed.gov/programs/quarterly/vol_5/5_3/4_4.asp

Wallace, R. M. (2003). Online learning in higher education: A review of research on interactions among teachers and students. Education, Communication & Information, 3(2), 241-280. doi: 10.1080/14636310303143

Waschull, S. B. (2005). Predicting success in online psychology courses: Self-discipline and motivation. Teaching of Psychology, 32(3), 190-192. doi: 10.1207/s15328023top3203_11

Yukselturk, E., & Bulut, S. (2007). Predictors for student success in an online course. Journal of Educational Technology & Society, 10(2), 71-83. Retrieved from https://www.jstor.org/stable/jeductechsoci.10.2.71

Appendix A: The Test of Online Learning Success (TOOLS)

The following items were measured on 5-point scales with the following response options: Strongly Disagree (1), Disagree, Neither Agree nor Disagree, Agree, and Strongly Agree (5).

Note: Items that were reverse scored are indicated below. In addition to looking at the five subscales individually, Kerr et al. (2006) stated that the “total online learning success (OLS) is calculated by summing across all 45 items. Higher scores reflect higher skills. Thus, lower scores on dependent learning denote more dependence (less independence)” (p. 97).

Computer Skills

I am capable of learning new technologies.

I am capable of sending and receiving e-mail.

I am capable of attaching files to an e-mail message.

I am a competent Internet browser.

I am capable of using standard word processing software.

I am capable of managing files on a computer.

I can download new software when necessary.

I can install new software when necessary.

I can copy and paste text using a computer.

I am capable of using discussion boards online.

I am capable of using chat rooms online.

Independent Learning

I am capable of prioritizing my responsibilities.

I am a good time manager.

I am a procrastinator. (reverse scored)

I am capable of making time for my coursework.

I am able to balance many tasks at one time.

I am goal-oriented.

I am self-disciplined when it comes to my studies.

I am self-motivated.

I take responsibility for my learning.

I am capable of critical thinking.

Dependent Learning

I often leave tasks unfinished. (reverse scored)

I require help to understand written instructions. (reverse scored)

I wait until the last minute to work on assignments. (reverse scored)

I have trouble comprehending what I read. (reverse scored)

I need faculty to remind me of assignment due dates. (reverse scored)

I need incentives/rewards to motivate me to complete a task. (reverse scored)

Need for Online Learning

Because of my personal schedule, I need online courses.

It is difficult for me to get to campus to attend classes.

I need online courses because of my geographical distance from universities.

I need online courses because of my work schedule.

I need the freedom of completing coursework at the time and place of my choosing.

Academic Skills

I can learn by working independently.

I am self-directed in my learning.

I am capable of solving problems alone.

I need face-to-face interaction to learn. (reverse scored)

I need faculty feedback on my completed assignments. (reverse scored)

I am a good reader.

I need classroom discussion to learn.

I am capable of asking for help when I have a problem.

I am comfortable learning new skills.

I read carefully.

I am a good writer.

I am capable of following written instructions.

I am capable of conveying my ideas in writing.
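The scoring rule quoted in the note at the top of this appendix (total online learning success is the sum across all 45 items, with reverse-scored items recoded so that higher totals always reflect higher skills) can be sketched as follows. This is an illustrative implementation, not code from Kerr et al. (2006); the reverse-scored item positions (item 14; items 22-27; items 36-37) are inferred from the "(reverse scored)" annotations above, counting items in order of appearance.

```python
# Illustrative scoring sketch for the TOOLS instrument (Kerr et al., 2006).
# Function and variable names are the author's own, not from the scale.

REVERSE_ITEMS = frozenset({14, 22, 23, 24, 25, 26, 27, 36, 37})

def score_tools(responses, reverse_items=REVERSE_ITEMS):
    """Total online learning success (OLS): sum of all 45 items.

    Each response is a 1-5 Likert value (Strongly Disagree to Strongly
    Agree). Reverse-scored items are recoded as 6 - x before summing.
    Item positions follow the order the items appear in this appendix
    (1 = "I am capable of learning new technologies", ...,
    45 = "I am capable of conveying my ideas in writing").
    """
    if len(responses) != 45 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("Expected 45 responses, each from 1 to 5")
    return sum(6 - r if i in reverse_items else r
               for i, r in enumerate(responses, start=1))

# A respondent answering "Neither Agree nor Disagree" (3) throughout
# lands on the scale midpoint: 3 * 45 = 135.
print(score_tools([3] * 45))  # -> 135
```

Because reverse-scored items are recoded before summing, an all-"Strongly Agree" respondent does not get the maximum score; agreeing with items such as "I am a procrastinator" lowers the total, consistent with the note that lower dependent-learning scores denote more dependence.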

Author Note: The author wishes to acknowledge and thank the reviewers and editorial team at JPRE. Additionally, the author is grateful for a Provost’s grant from the University of South Carolina that aided in the development of the online course that is the focus of this study. The Center for Teaching Excellence at University of South Carolina also provided guidance in the development of the online course. Finally, the author wishes to thank Robert McKeever for providing information about jamovi.org, which was used for data analysis, as well as the many students who participated in this research.

Correspondence concerning this article should be directed to Brooke W. McKeever at brookew@sc.edu.

To cite this article: McKeever, B. W. (2019). Different formats, equal outcomes? Comparing in-person and online education in public relations. Journal of Public Relations Education, 5(2). Retrieved from https://aejmc.us/jpre/2019/08/17/different-formats-equal-outcomes-comparing-in-person-and-online-education-in-public-relations/