
Teaching Social Media Analytics in PR Classes: Focusing on the Python Program

Editorial Record: Submitted June 4, 2022. Revised October 21, 2022. Revised January 8, 2023. Accepted January 26, 2023. Published May 2023.

Authors

Kim, Seon-Woo
Ph.D. Candidate
Manship School of Mass Communication
Louisiana State University
USA
Email: kr.seonwoo@gmail.com

Chon, Myoung-Gi, Ph.D.
Associate Professor
School of Communication and Journalism
Auburn University
USA
Email: mzc0113@auburn.edu

Abstract
This teaching brief introduces what to teach, and how to teach, social media analytics to PR educators at universities. It suggests a semester-long curriculum for a stand-alone research methods class for both graduate and undergraduate students. First, we discuss why students can learn analytics better through programming languages than through industrial platforms. We then compare three data collection methods (crawling, API, and download) and discuss their pros and cons. Finally, the brief presents (1) data collection through an API, (2) text mining, and (3) network analysis, with shared Python code on GitHub and step-by-step tutorials for PR educators who are unfamiliar with programming languages. This brief is expected to help bridge the gap between the growing demand for programming-based analytics in PR practice and its coverage in PR education.

Keywords: Social media analytics, Pedagogy, Python, API, Text mining, Network analysis

Social media has become integral to digital public relations (Ewing et al., 2018). PR companies perceive social media analytics (SMA) as a useful tool to identify target publics, understand the current environment around an organization, measure PR campaign outcomes, build relationships with stakeholders and influencers, and more (Kim, 2021). Responding to the growing demand for social media analytics in the PR industry, analytics curricula in PR programs need to be developed to educate PR students (Commission on Public Relations Education, 2018).

However, public relations educators face challenges in learning and teaching social media analytics. Most PR instructors have not had an opportunity during their academic careers to learn a programming language for analytics, such as Python or R. Moreover, teaching analytics requires understanding new methodologies and data types, such as natural language processing, network theory, and deep learning. Given this background, current analytics classes in PR programs mostly focus on conceptual knowledge and the use of commercial tools.

For example, students have learned how to use proprietary platforms, such as Brandwatch and Sprinklr, and to interpret the results those platforms produce. It is also common for instructors to ask students to earn certificates in Google Analytics and Hootsuite as evidence of their analytics competence (Ewing et al., 2018). Despite these efforts, PR professionals recommend that PR graduates have programming knowledge for automating PR work and tailoring PR services to clients (Szalacsi, 2019; Trafalgar Strategy, 2022). Heavy reliance on industrial analytics platforms limits students' SMA competency to those platforms' modalities, preventing them from developing advanced analytical abilities.

To fill the gap, this teaching brief aims to provide a pedagogical foundation for utilizing Python as an SMA tool. In particular, it presents SMA class material covering data collection, text mining, and network analysis. We provide Python code that PR educators can use in classrooms to teach the Python programming language, the most frequently used language in data science (TIOBE, 2022; Woodie, 2021). The code is kept as simple as possible for an introductory PR analytics class, and we provide step-by-step instructions to help readers understand and follow each function. This teaching brief is expected to encourage programming-based SMA instruction in public relations classes.

Teaching Objectives

Table 1 summarizes the learning objectives this teaching brief delivers and the required Python packages. The brief consists of three parts: data collection, text mining, and network analysis. First, students are expected to learn how to collect tweet data through an API. Because APIs tend to offer free tiers and work similarly across social media platforms (e.g., Facebook, Instagram), practicing tweet collection via an API equips students with data collection skills for various SNSs at no cost. In addition, we introduce ways to find content published by influencers and popular tweets. Lastly, students learn to save the collected data in a spreadsheet format (i.e., xlsx).

Table 1: The Overview of Learning Objectives

Python download: https://github.com/formulated/PR_education_Python

Data collection
Learning outcomes:
– Learn how to apply for Twitter Academic Research Access
– Apply a Twitter access code in Python
– Create a search query, including keywords and dates
– Collect tweets through the shared Python code
– Sort tweets by the number of likes, retweets, and followers
– Save the collected data in Excel format on your local computer
Python packages: pandas, twarc, os, requests, time

Text mining
Learning outcomes:
– Load the collected Twitter data
– Clean the text data
– Create a word cloud
– Calculate word frequencies and visualize text mining results
Python packages: pandas, nltk.corpus, re

Network analysis
Learning outcomes:
– Create data for network analysis from the collected Twitter data (i.e., mention and retweet relations)
– Generate a network object
– Generate a network graph
– Export network data for visualization in Gephi
– Calculate various centrality scores
Python packages: pandas, networkx

Next, students can apply text mining and network analysis to their own data collected through the API. In the text mining section, students learn to load and clean data, create a word cloud, and calculate and visualize word frequencies. The network analysis section introduces a basic conceptual understanding of network data, how to construct network data, how to calculate centrality scores, and how to prepare visualizations in Gephi, a popular network visualization tool used in academia and industry.

Teaching Preparation

To use the shared Python code for teaching, educators need to have some basic Python skills. Also, educators and students must install some Python applications and packages. Python code for this teaching brief is available on GitHub (https://github.com/formulated/PR_education_Python). We assume that readers have already installed Python 3 (https://www.python.org/downloads/) and Jupyter notebook (https://jupyter.org/install). Python is a programming language, and Jupyter is a web-based interactive computational environment for Python programming. If not, we recommend installing Anaconda (Anaconda, n.d.), which includes Python 3 and Jupyter Notebook. After installation of Python 3 and Jupyter, launch Jupyter and open the shared code in a new notebook. 

Then, readers need to install the required Python packages, such as pandas and networkx, for each Lesson (see Table 1). 

Running the shared Python code without the required packages installed will produce the error message "ModuleNotFoundError: No module named 'XXX'." In Python, "module" is roughly synonymous with "package."

To install a Python package on Mac, open Terminal application and type “pip3 install [package name]”. For example, below is the command for installing pandas (see Figure 1). 

Figure 1

On Windows, open Anaconda Prompt program and type “pip3 install [package name]” as below (see Figure 2). 

Figure 2

This short brief cannot cover every single line of Python code. Instead, we focus on which parts should be edited to run the code properly and to serve different learning objectives in classrooms. For example, some classes may focus more on organizational PR, while others focus on political PR; in that case, the code requires different search keywords depending on the subject.

PR educators should have some basic knowledge in Python (e.g., installation, running code, basic built-in functions) prior to giving students a demonstration of the shared Python coding and modifying the codes for class projects and activities. To develop this basic proficiency, we recommend Python books for beginners (e.g., Codeone Publishing, 2022; Matthes, 2019) and freely-available online resources from YouTube, such as Learn Python in 1 Hour (Programming with Mosh, 2020, September 16). Also, online Python bootcamp courses, such as DataCamp (https://app.datacamp.com), are valuable resources for PR educators and students as they provide interactive web environments of Python for beginners. PR educators may connect the online bootcamp course to a part of their SMA course curriculum as assignments or pre-class activities. 

Lesson 1. Data collection from Twitter through API

There is nothing to analyze without data. Data collection is the first step in extracting valuable insights from analytics, before the data is organized into useful information. Thus, among the many required skills, data collection is the foundation of SMA (Kent et al., 2011), and a growing number of PR jobs require data collection skills for the web and social media (Meganck et al., 2020). Before digitized public relations, PR practitioners had to scan and gather information about the environment around an organization manually, for example through news monitoring and clippings. Today's digital society, however, creates a massive amount of user-generated content about organizations on the web and social media, which makes manual collection nearly impossible.

There are three main ways to collect web data: crawling, API, and downloading from industrial platforms. Table 2 compares the data collection methods. Web crawling, or scraping, refers to a mechanical collection of web data (e.g., text, image, sound, and video). A web crawler automatically extracts data from a website based on programming. Technically, it is possible to crawl data using freely-available packages in Python for most web pages, such as social media, news media, and web communities. These packages can be implemented for news clipping and issue/crisis monitoring as a daily PR practice.

Table 2: Comparison of Data Collection Methods

                     Crawling        API                  Download
Level of difficulty  Difficult       Moderate-difficult   Easy-moderate
Price                Free            Free or paid         Pricey
Legality             Risky           Safe                 Safe
Algorithm            Transparent     Transparent          Black box
Flexibility          High            High                 Low-high
Data accessibility   Partial         Full or partial      Full or partial
Variables            Limited-some    Some                 Many, but black box

Writing a crawler requires advanced knowledge of programming and of web structures such as HTML, HTTP, and CSS. A crawler also has to be updated whenever a website changes its layout or structure. In addition, social media companies present limited, personalized feeds and content to each account based on their algorithms and other variables (e.g., follower network, search history, location). Thus, a web crawler often cannot access the fully archived data because it can only collect data visible on the website, which may raise content representativeness issues. A crawler also cannot obtain the invisible metadata and variables that a social media company provides to its API and to industrial platforms, such as user profiles (e.g., when an account was created) and post metadata (e.g., the name of the app the user posted from). If necessary, such variables must be constructed from the crawled data. Crawling may also raise legal issues if you do not obtain an agreement from the social media company before collecting the data.

Another way to collect data from social media is to use an API (application programming interface). Many software companies provide APIs to let third-party services and programmers use their services conveniently. For example, Apple and Google use weather APIs to provide weather services to customers without collecting weather data themselves. Major social media platforms (e.g., Twitter, Meta) also provide APIs for users to collect data within the companies' policies and authentication. Thus, it is relatively easier and safer to collect social media data via an API than by crawling, because doing so does not violate a website's policies and terms of service.

Free API tiers usually offer a basic level of data access, like a trial version, limiting the number of requests you can make within a day and the historical range of the data. Paid API services offer more, or fully archived, data access and functions. Major social media companies have opened their premium APIs for research and education purposes. For example, Twitter allows researchers to access the full tweet archive through Twitter's Academic Research Access (Twitter Developer Platform, n.d.). Once an application is submitted and accepted, Twitter provides an access code; with this access, ten million tweets can currently be collected per month. Meta also runs CrowdTangle, where PR educators can access Facebook, Instagram, and Reddit data. APIs provide variables such as message type (e.g., retweet, original), engagement counts (e.g., likes, shares, comments), and geographic location.

Lastly, industrial platforms, such as Brandwatch (https://www.brandwatch.com/) and Sprinklr (https://www.sprinklr.com/), allow paid subscribers to download social media data from their platforms. Their click-based user interfaces require no programming. However, these platforms are pricey because their business model is B2B with governments, companies, and universities; because of the high prices, few universities subscribe to them for teaching and research purposes. If a department already subscribes to such a service, it is a good resource for PR analytics teaching. As with APIs, there are no legal issues in collecting and using data within the companies' policies and authentication, and many industrial platforms provide full historical archive access. Industrial platforms also provide a rich amount of metadata, such as users' gender, sentiment, and users' profession or organization (e.g., journalists, politicians). However, it is not clearly known how the data resellers construct these variables (i.e., they are a black box). Although some companies explain their variable construction, researchers typically cannot replicate the variables due to limited information.

Given the pros and cons of the three data collection methods above, this teaching brief introduces how to collect Twitter data using the Twitter Academic Research Access API. Because most major companies maintain Twitter accounts and their content is publicly available, researchers and PR practitioners often choose Twitter for real-time issue monitoring and reputation management (e.g., Chon & Kim, 2022; Rust et al., 2021). In addition, data collection with an API and Python is similar across social platforms: once educators and students understand the code for Twitter data collection, it can be adjusted to get data from other platforms' APIs.

Tutorial. Data collection

This tutorial shows how to collect tweets using the Twitter API and Python. Twitter allows researchers to access the full tweet archive through Twitter Academic Research Access (Twitter Developer Platform, n.d.). After an application including research interests and affiliation is accepted, Twitter gives users an access code to collect ten million tweets per month.

To run the Python code from GitHub (Kim, 2022) that the author has created, you need to change the OAuth 2.0 Bearer Token (i.e., the credential key, or password, for Twitter) and the query parameters (e.g., search keyword, date). The Bearer Token is issued once Twitter Academic API permission is granted. In the code line below, insert your own Bearer Token; its format is a long string of letters and numbers (see Figure 3).

Figure 3

In the query parameters, query indicates the search keywords. Hashtags (i.e., #) and mentions (i.e., @) can be used in a search query (e.g., @PR, #PR). tweet.fields indicates which variables are collected; the code includes user numeric IDs (i.e., author_id), timestamps (i.e., created_at), and public metrics (i.e., retweet, reply, like, and quote counts). The data period is set in start_time and end_time. When the code is run, tweets are collected and can be saved in Excel format (see Figure 4).

Figure 4
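The parameters described above can be sketched as follows. This is a minimal illustration of Twitter's v2 full-archive search endpoint; the placeholder token, the example keyword, and the dates are assumptions for demonstration, and the actual request is shown as a comment because it requires a valid Academic Research token.

```python
# Query parameters for Twitter's v2 full-archive search endpoint
# (https://api.twitter.com/2/tweets/search/all).  The keyword and dates
# below are illustrative placeholders -- replace them with your own.
BEARER_TOKEN = "YOUR_BEARER_TOKEN"  # issued with Academic Research Access

params = {
    "query": "#PR",                                         # search keyword
    "tweet.fields": "author_id,created_at,public_metrics",  # variables to collect
    "start_time": "2022-01-01T00:00:00Z",                   # data period start
    "end_time": "2022-01-31T00:00:00Z",                     # data period end
    "max_results": 100,                                     # tweets per request
}
headers = {"Authorization": f"Bearer {BEARER_TOKEN}"}

# With the requests package installed, the call would look like:
# import requests
# response = requests.get("https://api.twitter.com/2/tweets/search/all",
#                         headers=headers, params=params)
# tweets = response.json()["data"]
```

In class, students only need to edit the query, start_time, and end_time values to run their own searches.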

We use this data for basic text mining and network analysis. A code cell exports the data below as an Excel file (see Figure 5).

Figure 5


Next, because the data includes variables such as the numbers of likes and retweets, we can identify which tweets received the most engagement. The number of tweets posted per user account also indicates who the active, and potentially stimulated, publics are on social media, and sorting users by follower count yields a list of influencers around a topic.

The code below filters the top 10 most-retweeted tweets (see Figure 6). To get the most-liked tweets instead, change the variable name in the sort_values parameter (e.g., from 'retweet_count' to 'like_count'). Additional code filters the users who wrote the most tweets about the issue and the users with the highest numbers of followers. Depending on their PR campaigns and activities, practitioners can edit the code to yield other valuable information. For example, combining these metrics with the time variable (i.e., created_at) can reveal the best times or weekdays to post, and practitioners can summarize weekly engagement with publics by summing or averaging likes, shares, or replies.

Figure 6
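The filtering step can be sketched with pandas. The toy DataFrame below stands in for the collected tweets; the column names follow the public-metrics fields used in this tutorial.

```python
import pandas as pd

# A small illustrative dataset standing in for the collected tweets.
df = pd.DataFrame({
    "author_id": [1, 2, 3, 4],
    "text": ["tweet a", "tweet b", "tweet c", "tweet d"],
    "retweet_count": [5, 120, 37, 64],
    "like_count": [10, 80, 95, 20],
})

# Top tweets by retweets; swap 'retweet_count' for 'like_count'
# to rank by likes instead.
top_retweeted = df.sort_values("retweet_count", ascending=False).head(2)
print(top_retweeted["text"].tolist())  # ['tweet b', 'tweet d']
```

On real data, head(10) would return the top 10 instead of the top 2 shown here.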

Lesson 2. Text mining 

Text mining (i.e., computational text analysis, natural language processing) is one of the most promising areas in public relations for listening to publics and stakeholders. Digitized communication environments continue to create an unlimited number of digital texts. Knowledge discovery from text data is recommended to increase an organization’s performance and efficiency beyond data retrieval. Excellence theory posits that listening to publics is more important than disseminating information (Grunig & Grunig, 2009). When PR practitioners instill publics and stakeholders’ voices into an organization, it can make effective strategic communication, which contributes to organizational success (Kim & Rhee, 2011). 

There are also many ways text mining can assist public relations practice, such as topic discovery and opinion mining. For example, topic analysis provides insights into the main topics, issues, and trends around an organization based on descriptive analyses (e.g., word frequency, co-occurrence) and algorithms (e.g., topic modeling). Opinion mining, or sentiment analysis, can be used to investigate the reputation of an organization, a brand, an issue, or a crisis (Liu, 2011).

Text mining covers the collection, preprocessing, analysis, and summarization of text data based on mathematical algorithms. Analyzing a large amount of unstructured text requires different statistical methods and tools (Grimmer et al., 2021). For example, because texts are unstructured, unlike traditional structured data (e.g., data in Excel), data cleaning is necessary to transform them into a structured format. Conventional statistical tools, such as SPSS and SAS, provide limited text mining functions, as they were originally designed to analyze structured data. Hence, programming skills in Python and R are preferred for text mining.

Tutorial. Text mining

In the shared Python code, text mining includes (a) loading the tweet data, (b) cleaning the text data (e.g., lowercase transformation, stopword removal), (c) creating a word cloud, and (d) calculating and visualizing word frequencies. This code can also analyze other text data from social media and other web pages if the data structure is the same (i.e., data with the same column names); otherwise, the column names in the code should be edited. The first task for text mining is to load the data (see Figure 7). For this example, the code imports the Excel file collected through the Twitter API. Pandas is one of the best Python packages for loading, preprocessing, and analyzing data; it is imported under the abbreviated name pd with the statement "import pandas as pd" in the first code cell.

Figure 7

The next step is text cleaning, or preprocessing. Though any data needs some level of cleaning before analysis, text data requires more preprocessing effort due to the complexity of human language. User-generated content tends to include noisy elements such as emojis, URLs, and stopwords, and removing elements irrelevant to the analysis improves computational efficiency and validity (Hickman et al., 2020; Welbers et al., 2017). Stopwords are functional words with no substantial meaning, such as articles (e.g., the, a, an), conjunctions (e.g., and, but), and prepositions (e.g., of, in) (see Figure 8).

Figure 8

Also, because computers are case-sensitive (e.g., a computer cannot recognize "Computer" and "computer" as having the same meaning, as a human would), text data are often converted to lowercase before analysis. Beyond these simple steps, there are further text cleaning methods, such as stemming/lemmatization, dimensionality reduction, bag-of-words, and Word2vec; which to use depends on the algorithm and the purpose of the analysis. The shared code removes URLs, emoticons, special characters (e.g., !, @), and stopwords.
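The cleaning steps just described can be sketched with the standard library alone. The shared code uses nltk's stopword list; the short stopword set below is a stand-in assumption to keep the example self-contained.

```python
import re

# A minimal stand-in for nltk's stopword list (assumption for illustration).
STOPWORDS = {"the", "a", "an", "and", "but", "of", "in", "is", "rt"}

def clean_tweet(text):
    text = text.lower()                          # lowercase transformation
    text = re.sub(r"https?://\S+", "", text)     # remove URLs
    text = re.sub(r"[^a-z0-9#@\s]", "", text)    # remove special characters/emojis
    tokens = [w for w in text.split() if w not in STOPWORDS]
    return tokens

print(clean_tweet("RT The new PR campaign is live! https://t.co/abc"))
# ['new', 'pr', 'campaign', 'live']
```

Cleaned token lists like this feed directly into the word cloud and word-frequency steps that follow.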

Next, a word cloud is created to visualize the content. The word cloud is one of the most frequently used visualizations in text mining; like descriptive statistics (e.g., mean, SD), it is often presented as a preliminary analysis in published PR papers (e.g., Plessis, 2018; Macnamara, 2016). The font size of each word is proportional to its frequency. The word cloud generated in the example shows that rt, new, year, happy, and prsaroadsafety are prominent in the text data (see Figure 9).

Figure 9

The next code cell calculates word frequencies and sorts the result in descending order. Word frequencies generate insightful information, such as the daily or weekly issues around an organization (see Figure 10). A PR practitioner may also evaluate a campaign's performance by tracking the frequency of a relevant hashtag over time.

Figure 10
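A word-frequency count like the one above can be sketched with the standard library's Counter; the token list below is toy data standing in for the cleaned tweets.

```python
from collections import Counter

# Toy tokens standing in for the cleaned, tokenized tweet text.
tokens = ["happy", "new", "year", "happy", "prsaroadsafety", "new", "happy"]
freq = Counter(tokens)

# Sort in descending order by frequency, as in the tutorial.
print(freq.most_common(2))  # [('happy', 3), ('new', 2)]
```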

The last code cell visualizes the word frequencies. If the index is changed (e.g., from the current 0:20 to 0:50), the number of words shown in the graph changes accordingly (see Figure 11).

Figure 11

Lesson 3. Network analysis

Network analysis is gaining popularity in public relations (Yang & Saffer, 2019). Network analysis deals with "structure and position" (Borgatti et al., 2013, p. 10). A network actor can be an individual, a group, an organization, or a set of organizations. For example, companies have different types of relations (Borgatti et al., 2013), such as similarities (e.g., type of business), business relations (e.g., joint ventures, alliances), interactions (e.g., trade), and flows (e.g., technology transfer). Network analysis has been applied to various PR topics such as organization-public/stakeholder relations, employee communication, crisis communication, and CSR (Yang & Saffer, 2019).

Centrality, a classical structural property of a network, is one of the most commonly used concepts in network analysis and visualization (Freeman, 1978). PR studies have used centrality to investigate key publics/stakeholders (Hellsten et al., 2019; Himelboim & Golan, 2019), issues management (Sommerfeldt & Yang, 2017), agenda-setting (Guo, 2012), content diffusion networks (Himelboim & Golan, 2019), and CSR performance (Jiang & Park, 2022).

Network analysis can also be combined with text mining to examine how words co-occur in text. Specifically, PR practitioners can illustrate an organization's brand image and salient issues by looking at words that co-occur with the organization's name (Gilpin, 2010). In addition, PR practitioners can identify the community network (e.g., friends, followers) around influencers and target it to draw attention to a PR campaign, which, in turn, may motivate the influencers to share the content (Zhang et al., 2016). Another application is to identify potential publics who show advocacy activity and positive sentiment toward a related issue but not yet toward a client's issue; organizations can then target them to foster supportive postings on social media.

Tutorial. Network analysis

Loading the data works the same way as in the text mining section. Because network analysis is based on relations, the data must contain relational information, which can be expressed in many different ways. You may construct a relationship variable between organizations and/or publics from data outside social media, such as joint ventures, alliances, and NGO coalitions, or infer relationships from social media data. Follower-following ties are one example: if User A follows User B, you may use that information for network analysis (e.g., User A → User B). Likewise, if User A mentions or retweets one of User B's tweets, you may set a tie from User A to User B. The tie direction could be reversed depending on your perspective; for example, some argue the relation should be User B → User A when User A retweets User B's tweet, because User B's information flows to User A. The example code shows how to build mention relations: "from" indicates the user who mentions an account, and "to" is the account mentioned by "from." For retweet relationships, replace the regular expression in the first line of the code below with r"RT @([A-Za-z]+[A-Za-z0-9-_]+)"; the data then indicates that users in the from column retweeted posts by users in the to column (see Figure 12).

Figure 12
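The retweet-relation extraction can be sketched as follows. The regular expression is the one given above; the author handles and tweet texts are toy data.

```python
import re
import pandas as pd

# Build a "from -> to" edge list from tweet text using the retweet pattern
# from the tutorial.  The handles below are illustrative toy data.
RT_PATTERN = r"RT @([A-Za-z]+[A-Za-z0-9-_]+)"

tweets = pd.DataFrame({
    "author": ["userA", "userB", "userC"],
    "text": ["RT @userB great campaign", "hello world", "RT @userB so true"],
})

edges = []
for _, row in tweets.iterrows():
    for target in re.findall(RT_PATTERN, row["text"]):
        edges.append({"from": row["author"], "to": target})

edge_df = pd.DataFrame(edges)
print(edge_df)  # two edges: userA -> userB and userC -> userB
```

Swapping RT_PATTERN for a mention pattern yields mention relations instead of retweet relations.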

The following screenshot shows two code cells: the network object generator (i.e., G) and its visualization in Python (see Figure 13). When there are more than a few nodes (i.e., actors) and edges (i.e., relations), Python network graphs are not visually attractive; researchers therefore often use other visualization tools such as Gephi (e.g., Raupp, 2019; Yang et al., 2017). The software is free to use on Windows and Mac (download and learn more at https://gephi.org).

Figure 13

The code in Figure 14 transforms the network data into an Excel file for Gephi. To import the spreadsheet into Gephi, click File → Import spreadsheet → open the Excel file (Gephi_df.xlsx) → import as an "Edges table" in the general Excel options → Finish.

Figure 14
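The aggregation behind that export can be sketched with pandas. The edge list is toy data, and the example writes CSV to avoid an extra Excel dependency (Gephi's spreadsheet importer also reads CSV), whereas the shared code writes .xlsx.

```python
import pandas as pd

# Toy edge list standing in for the mention/retweet relations built earlier.
edges = pd.DataFrame({
    "from": ["userA", "userC", "userA"],
    "to":   ["userB", "userB", "userB"],
})

# Aggregate duplicated edges into (Source, Target, n) rows for Gephi,
# where n counts how often the source mentions the target.
gephi_df = (edges.groupby(["from", "to"]).size()
                 .reset_index(name="n")
                 .rename(columns={"from": "Source", "to": "Target"}))
gephi_df.to_csv("Gephi_df.csv", index=False)
print(gephi_df)
```

Gephi recognizes the Source and Target column names automatically when importing an edges table.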

Here, n indicates the number of relations (i.e., how many times a source mentions a target on Twitter). Compared to a Python graph, Gephi generates visually attractive and easy-to-understand network graphics (see Figure 15).

Figure 15

Centrality is one of the most frequently used metrics in network analysis, and it comes in many types, such as in-degree/out-degree centrality, betweenness centrality, and eigenvector centrality. In the shared code, the NetworkX Python package provides the centrality calculations; see NetworkX (n.d.) for more network algorithms and parameters. For example, when "degree_centrality" in the code below is replaced with "betweenness_centrality," it generates betweenness centrality scores for each node (see Figure 16).

Figure 16
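The centrality step can be sketched on a tiny toy network; the edges below are hypothetical mention relations, not data from the tutorial.

```python
import networkx as nx

# A tiny directed mention network: A mentions B and C.
G = nx.DiGraph()
G.add_edges_from([("A", "B"), ("A", "C")])

# Degree centrality: (in-degree + out-degree) / (n - 1).
deg = nx.degree_centrality(G)
print(deg)  # A scores highest because it is tied to both other nodes

# Other measures can be swapped in the same way, e.g.:
bet = nx.betweenness_centrality(G)
```

On real Twitter data, sorting these scores identifies the most central publics or influencers in the network.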

Suggested Curriculum

If an introductory SMA course can be offered within a 16-week semester, it can be designed as in Table 3. To achieve the learning objectives of this teaching brief, it is critical for students to type and edit the shared code rather than just read it; the suggested curriculum therefore focuses on hands-on experience with Python for PR SMA. The first weeks introduce the course and then cover PR in the digital era, social media and its application in PR, and SMA case studies. After this conceptual grounding, two weeks are devoted to each practical programming section: Python basics, data collection, text mining, and network analysis. The remaining weeks are used for final projects and presentations. The curriculum can be adjusted to serve unique class demands, considering students' abilities and prerequisite courses.

Table 3: Example of PR SMA Course Curriculum

Week   Topic                                   Contents
1-2    Introduction                            – Introduction to the course and Python
                                               – PR in the digital era
                                               – Social media and its application in PR
                                               – Understanding social media analytics
3-4    Python programming                      – Installation and setup of Python and Jupyter Notebook
                                               – Installing Python packages
                                               – Reading and writing data (e.g., XLSX, CSV)
                                               – Data types (e.g., list, dictionary, tuple, JSON)
                                               – Pandas data structures
                                               – Data cleaning (e.g., data selection, merge, recode)
                                               – Basic functions (e.g., define, for, if-else, while)
5-6    Data collection                         – See learning outcomes in Table 1 for programming contents
                                               – Three ways of data collection (crawling, API, and industrial platforms)
                                               – Introduction to data collection with API
                                               – Data collection assignment
7-8    Text mining                             – See learning outcomes in Table 1 for programming contents
                                               – Conceptual understanding of text mining
                                               – Text mining assignment
9-10   Network analysis                        – See learning outcomes in Table 1 for programming contents
                                               – Conceptual understanding of network analysis
                                               – Network analysis assignment
11-13  Applications of social media analytics  – SMA case studies
                                               – Social media metrics and evaluations
                                               – Social media campaigns based on SMA
14-15  Final project                           – Final project introduction
                                               – Group work days
16     Student presentation                    – Final project presentation

What if educators cannot offer a separate class focusing on social media analytics and PR? We suggest a short module within a PR research class. PR research classes generally must cover many topics, such as qualitative and quantitative research, but research methods in the digital age should also teach how to use social media to solve PR problems. PR educators may present multiple research methods spanning qualitative skills (e.g., focus group interviews), quantitative skills (e.g., surveys), and social media analytics with Python (e.g., text mining and network analysis). Students can then analyze unstructured data by choosing between text mining and network analysis.

Assessment of Student Learning

Simply put, in a semester-long class students can be assessed via three assignments (each worth 15% of the final grade), a final group project (worth 45%), and the remaining 10% of points (e.g., attendance).

For the data collection assignment, students submit a Python code file edited to collect tweets via their own search queries; if it runs without error, they receive full credit. Instructors may give extra credit when students collect data from other social media or through web crawling. The text mining assignment asks students to submit Python code that creates a word cloud and a word frequency visualization from the data gathered in the data collection assignment. In addition, students are required to submit a document analyzing the text mining results, as editing a few lines of code is too easy a task for 15% of the grade. Students who conduct additional analyses, such as sentiment analysis or topic modeling, can earn extra credit. Likewise, the network analysis assignment requires edited network analysis code and a report, with extra credit for students who present a network visualization in Gephi beyond the suggested code.

Lastly, the final project is group work in teams of three. Students select a large organization (e.g., an S&P 500 company) so that they can collect a sufficiently large social media dataset. They are asked to conduct (1) traditional formative research, (2) data collection, (3) text mining, and (4) network analysis, and to develop (5) a social media campaign plan. Table 4 presents an example of the final project rubric.

Table 4: Final Project Rubric

Traditional formative research (20%)
– Organizational history & mission
– Industry background & trend
– Identification of stakeholders, publics, and society
– Traditional news media analysis
– SWOT analysis

Data collection (20%)
– Social media data collection (e.g., tweets, Facebook posts)
– Identification of popular social texts
– Identification of key individuals (e.g., influencers)

Text mining (20%)
– Main topics about the company, brand, or products
– Sentiment analysis
– Text mining visualization (e.g., word cloud)

Network analysis (20%)
– Identification and network positions of key publics and stakeholders
– Network visualization with Gephi

Social media campaign planning (20%)
– Discussion of current PR-related problems identified through formative research and social media analytics
– Three social media assets/tactics with target audiences
– Expected outcomes and impact on stakeholders, publics, and society, plus a measurement plan for campaign success

This project gives students a chance to apply the skills and knowledge from the suggested SMA class in practice. Through the final project, they can recognize the necessity of SMA alongside traditional PR formative research (e.g., media coverage analysis). The final project can also be adjusted if students in the class have not taken a PR strategy or campaign class.
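The network analysis component of the assignments and final project centers on identifying key actors' network positions. A standard-library sketch of degree centrality (Freeman, 1978) over an invented mention network illustrates the core computation; the normalization (degree divided by n − 1) matches what NetworkX's degree_centrality applies to undirected graphs, and the edge list is hypothetical:

```python
from collections import Counter

# Hypothetical retweet/mention edges: (source user, target user).
edges = [
    ("user_a", "brand"), ("user_b", "brand"), ("user_c", "brand"),
    ("user_b", "influencer"), ("user_c", "influencer"), ("user_d", "user_c"),
]

def degree_centrality(edge_list):
    """Fraction of other nodes each account is connected to (degree / (n - 1),
    treating edges as undirected)."""
    degree = Counter()
    nodes = set()
    for src, dst in edge_list:
        degree[src] += 1
        degree[dst] += 1
        nodes.update((src, dst))
    n = len(nodes)
    return {node: degree[node] / (n - 1) for node in nodes}

centrality = degree_centrality(edges)
# Rank accounts by centrality; highly connected accounts surface as key actors.
key_accounts = sorted(centrality, key=centrality.get, reverse=True)
print(key_accounts[:3])
```

In the classroom version, students would run NetworkX's built-in algorithms on real collected data and visualize the result in Gephi, but the ranking logic students interpret is the same.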

REFERENCES

Anaconda. (n.d.). Anaconda. https://www.anaconda.com/products/individual

Borgatti, S., Everett, M., & Johnson, J. (2013). Analyzing social networks. SAGE Publications. 

Chon, M.-G., & Kim, S. (2022). Dealing with the COVID-19 crisis: Theoretical application of social media analytics in government crisis management. Public Relations Review, 48(3), 102201. https://doi.org/10.1016/j.pubrev.2022.102201 

Codeone Publishing. (2022). Python programming for beginners: The #1 Python programming crash course to learn Python coding well and fast.

Commission on Public Relations Education. (2018). Fast forward: The 2017 report on undergraduate public relations education. http://www.commissionpred.org/wp-content/uploads/2018/04/report6-full.pdf

du Plessis, C. (2018). Social media crisis communication: Enhancing a discourse of renewal through dialogic content. Public Relations Review, 44(5), 829-838. https://doi.org/10.1016/j.pubrev.2018.10.003 

Ewing, M., Kim, C. M., Kinsky, E. S., Moore, S., & Freberg, K. (2018). Teaching digital and social media analytics: Exploring best practices and future implications for public relations pedagogy. Journal of Public Relations Education, 4(2), 51-86. https://journalofpreducation.com/2018/08/17/teaching-digital-and-social-media-analytics-exploring-best-practices-and-future-implications-for-public-relations-pedagogy/

Freeman, L. (1978). Centrality in social networks conceptual clarification. Social Networks, 1(3), 215-239. https://doi.org/10.1016/0378-8733(78)90021-7

Gilpin, D. (2010). Organizational image construction in a fragmented online media environment. Journal of Public Relations Research, 22(3), 265-287. https://doi.org/10.1080/10627261003614393 

Grimmer, J., Roberts, M. E., & Stewart, B. M. (2021). Machine learning for social science: An agnostic approach. Annual Review of Political Science, 24(1), 395-419. https://doi.org/10.1146/annurev-polisci-053119-015921 

Grunig, J. E., & Grunig, L. A. (2009). The excellence theory. In C. H. Botan & V. Hazleton (Eds.), Public relations theory II (pp. 21-62). Routledge.

Guo, L. (2012). The application of social network analysis in agenda setting research: A methodological exploration. Journal of Broadcasting & Electronic Media, 56(4), 616-631. https://doi.org/10.1080/08838151.2012.732148 

Hellsten, I., Jacobs, S., & Wonneberger, A. (2019). Active and passive stakeholders in issue arenas: A communication network approach to the bird flu debate on Twitter. Public Relations Review, 45(1), 35-48. https://doi.org/10.1016/j.pubrev.2018.12.009 

Hickman, L., Thapa, S., Tay, L., Cao, M., & Srinivasan, P. (2020). Text preprocessing for text mining in organizational research: Review and recommendations. Organizational Research Methods, 25(1), 114-146. https://doi.org/10.1177/1094428120971683 

Himelboim, I., & Golan, G. J. (2019). A social networks approach to viral advertising: The role of primary, contextual, and low influencers. Social Media + Society, 5(3). https://doi.org/10.1177/2056305119847516 

Jiang, Y., & Park, H. (2022). Mapping networks in corporate social responsibility communication on social media: A new approach to exploring the influence of communication tactics on public responses. Public Relations Review, 48(1), 102143. https://doi.org/10.1016/j.pubrev.2021.102143 

Kent, M. L., Carr, B. J., Husted, R. A., & Pop, R. A. (2011). Learning web analytics: A tool for strategic communication. Public Relations Review, 37(5), 536-543. https://doi.org/10.1016/j.pubrev.2011.09.011 

Kim, C. M. (2021). Social media campaigns. Strategies for public relations and marketing (2nd ed.). Routledge. 

Kim, J.-N., & Rhee, Y. (2011). Strategic thinking about employee communication behavior (ECB) in public relations: Testing the models of megaphoning and scouting effects in Korea. Journal of Public Relations Research, 23(3), 243-268. https://doi.org/10.1080/1062726X.2011.582204

Kim, S.-W. (2022, March 21). [Twitter API] AcademicTrack. https://github.com/formulated/PR_education_Python/blob/main/Crawling_Twitter%20Academic%20Track/%5BTwitter%20API%5D%20AcademicTrack.ipynb

Liu, B. (2011). Web data mining. Exploring hyperlinks, contents, and usage data (2nd ed.). Springer. https://doi.org/10.1007/978-3-642-19460-3

Macnamara, J. (2016). Organizational listening: Addressing a major gap in public relations theory and practice. Journal of Public Relations Research, 28(3-4), 146-169. https://doi.org/10.1080/1062726X.2016.1228064 

Matthes, E. (2019). Python crash course. A hands-on, project-based introduction to programming (2nd ed.). No Starch Press. 

Meganck, S., Smith, J., & Guidry, J. P. D. (2020). The skills required for entry-level public relations: An analysis of skills required in 1,000 PR job ads. Public Relations Review, 46(5), 101973. https://doi.org/10.1016/j.pubrev.2020.101973 

NetworkX. (n.d.). Algorithms. https://networkx.org/documentation/stable/reference/algorithms/index.html 

Programming with Mosh. (2020, September 16). Python for beginners. Learn Python in 1 hour. https://www.youtube.com/watch?v=kqtD5dpn9C8&t=24s

Raupp, J. (2019). Crisis communication in the rhetorical arena. Public Relations Review, 45(4), 101768. https://doi.org/10.1016/j.pubrev.2019.04.002 

Rust, R. T., Rand, W., Huang, M.-H., Stephen, A. T., Brooks, G., & Chabuk, T. (2021). Real-time brand reputation tracking using social media. Journal of Marketing, 85(4), 21-43. https://doi.org/10.1177/0022242921995173 

Sommerfeldt, E. J., & Yang, A. (2017). Relationship networks as strategic issues management: An issue-stage framework of social movement organization network strategies. Public Relations Review, 43(4), 829-839. https://doi.org/10.1016/j.pubrev.2017.06.012 

Szalacsi, B. (2019). AI and data science understanding is now a critical path for public relations and communications professionals. https://medium.com/infonation-monthly/ai-and-data-science-understanding-is-now-a-critical-path-for-public-relations-and-communications-1617731a99b0

TIOBE. (2022). TIOBE Index for March 2022. Retrieved March 21, 2022, from https://www.tiobe.com/tiobe-index/

Trafalgar Strategy. (2022). PR & Python: Why all PRs can benefit from coding experience. Trafalgar Strategy. https://www.trafalgar-strategy.co.uk/pr-python-why-all-prs-can-benefit-from-coding-experience/

Twitter Developer Platform. (n.d.). Twitter academic research access. https://developer.twitter.com/en/products/twitter-api/academic-research 

Welbers, K., Van Atteveldt, W., & Benoit, K. (2017). Text analysis in R. Communication Methods and Measures, 11(4), 245-265. https://doi.org/10.1080/19312458.2017.1387238 

Woodie, A. (2021). What’s driving Python’s massive popularity? Retrieved March 21, 2022, from https://www.datanami.com/2021/10/20/whats-driving-pythons-massive-popularity/

Yang, A., & Saffer, A. J. (2019). Embracing a network perspective in the network society: The dawn of a new paradigm in strategic public relations. Public Relations Review, 45(4), 101843. https://doi.org/10.1016/j.pubrev.2019.101843 

Yang, A., Wang, R., & Wang, J. (2017). Green public diplomacy and global governance: The evolution of the U.S–China climate collaboration network, 2008–2014. Public Relations Review, 43(5), 1048-1061. https://doi.org/10.1016/j.pubrev.2017.08.001 

Zhang, K., Bhattacharyya, S., & Ram, S. (2016). Large-scale network analysis for online social brand advertising. MIS Quarterly, 40(4), 849-868. https://www.jstor.org/stable/26629679 

© Copyright 2023 AEJMC Public Relations Division

To cite this article: Kim, S., & Chon, M. (2023). Teaching social media analytics in public relations classes: Focusing on the Python program. Journal of Public Relations Education, 9(1), 117-146. https://journalofpreducation.com/?p=3663

“You don’t have to become a data scientist”: Practitioner Recommendations for Cultivating PR Student Data Competency

Editorial Record: Submitted August 1, 2022. Accepted October 4, 2022. Published May 2023.

Authors

Julie O’Neil, Ph.D.
Associate Dean for Graduate Studies and Administration, Bob Schieffer College of Communication
Strategic Communication
Texas Christian University
Texas, USA
Email: j.oneil@tcu.edu

Emily S. Kinsky, Ph.D.
Professor of Media Communication
Department of Communication
West Texas A&M University
Texas, USA
Email: ekinsky@wtamu.edu

Michele E. Ewing, APR, Fellow PRSA
Professor
School of Media and Journalism
Kent State University
Ohio, USA
Email: meewing@kent.edu

Maria Russell, APR, Fellow PRSA
Professor Emerita, Public Relations
Newhouse School
Syracuse University
USA
Email: mprussel@syr.edu

Abstract
The growing need for data competency among entry-level PR practitioners underscores why it is imperative that PR educators evaluate how they are teaching data and data analytics to students. Researchers interviewed 28 high-level PR practitioners with significant data and analytics experience to examine how educators can best prepare students to curate, analyze, and discern actionable insight from data. Practitioners said students must understand PR fundamentals and basic research and statistics concepts, and must be able to succinctly and persuasively tell a story using data visualization. Participants also discussed the importance of soft skills, including a willingness to learn, adaptability, and critical thinking. Implications and teaching suggestions for educators are provided.

Keywords: data, analytics, competency, pedagogy, public relations

The communication industry is transforming into a data-driven field (Fitzpatrick & Weissman, 2021; Weiner, 2021). People around the world consume and share information as they play, work, learn, engage, and advocate in digital spaces. Public relations practitioners must accordingly upgrade their abilities and expand their efforts to use technology to work in the digital world. As part of this digital revolution, Artificial Intelligence (AI) and Big Data are becoming integrated into contemporary public relations practice (Wiencierz & Röttger, 2019; Wiesenberg et al., 2017). Sommerfeldt and Yang (2018) opined: “The question is no longer if, but how to best use digital communication technologies to build relationships with publics” (p. 60).

Despite the vast opportunities afforded by data and technology, many public relations practitioners are behind on the learning curve (Virmani & Gregory, 2021). According to the 2020-2021 North American Communication Monitor (Meng et al., 2021), 40% of PR practitioners lack data competency; 29% are under-skilled, while 11% are critically under-skilled.

Educators know the importance of embedding data and technology competency into public relations curriculum. Five of the 12 professional values and competencies promoted by the Accrediting Council on Education in Journalism and Mass Communication (ACEJMC) relate to digital analytics (Ewing et al., 2018). In the most recent Commission on Public Relations Education (CPRE) report (2018), educators and practitioners indicated “research and analytics” was the fourth-most desirable skill—out of 13—for entry-level PR practitioners.

The growing need for data confidence and proficiency among entry-level practitioners underscores why it is imperative that public relations educators evaluate how they are teaching data and data analytics to students. Researchers interviewed 28 high-level PR practitioners with significant data and analytics experience to examine how educators can best prepare students to curate, analyze, and discern actionable insight from data.

Review of Literature

How PR Practitioners are Using Data and Technology

According to a McKinsey report, companies’ adoption of digital technologies “sped up by three to seven years in a span of months” in 2020 (Galvin et al., 2021, para. 3). The pandemic accelerated companies’ adoption of digital technologies, and according to McKinsey, the future belongs to organizations that fully embrace digital technology, skills, and leadership (Galvin et al., 2021). Public relations practitioners are responding and leaning into this digital transformation as their usage of digital approaches and technologies increases (Wright & Hinson, 2017). Data infuses the entire PR process, and communication professionals can examine data from social platforms, email, websites, mobile apps, internal platforms, business data streams, and more to inform strategic and tactical decisions. Communicators can examine and analyze data for environmental scanning, issues management (Kent & Saffer, 2014; Triantafillidou & Yannas, 2014), crisis communication, combatting disinformation and misinformation (Weiner, 2021), audience identification and segmentation (Stansberry, 2016), influencer and journalistic outreach (Galloway & Swiatek, 2018; Wiencierz & Röttger, 2019), and campaign evaluation (Weiner, 2021).

The Arthur W. Page Society developed a communication approach called “Comm Tech,” which is designed to help chief communication officers (CCOs) apply data and analytics to create campaigns that are hyper-targeted and optimized to drive business outcomes (CommTech Quickstart Guide, 2020). According to Page members Samson and O’Leary (2020), CCOs must help their communication teams evolve from a proactive to predictive function, transform how they understand and engage stakeholders, and improve their digital skills and agility among team members so they can respond to complex problems and opportunities using real-time data.

A commonly referred-to term is Big Data, which is “advanced technology that allows large volumes of data to drive more fully integrated decision-making” (Weiner & Kochhar, 2016, p. 4). Big Data is often defined by four V’s: volume, velocity, variety, and value, and consists of many small structured and unstructured data streams, including PR data derived from news coverage, internal communication, and social media (Weiner & Kochhar, 2016). PR practitioners can collaborate with other organizational units to examine Big Data to make decisions regarding product or service demand, competition, and community trends (Weiner, 2021, p. 24). Communicators are also starting to use AI to enhance their capabilities (Virmani & Gregory, 2021). Defined as the “ability of machines to perform tasks that typically require human-like understanding” (Knowledge@Wharton, 2018, para. 1), AI is being used for tasks such as responding to consumer questions, monitoring social media, conducting journalistic and influencer outreach (Galloway & Swiatek, 2018), and engaging employees (O’Neil et al., 2021).

Pedagogical Approaches to Teaching Data and Analytics

Educators and practitioners alike agree upon the importance of including data and analytics in the public relations curriculum. When asked about the future of PR education, Duhé (2016) said educators should focus on three pillars: fast-forward thinking, interdisciplinary learning, and analytical reasoning. The latter relates to students’ ability to curate, analyze, and effectively describe disparate forms of data. In the 2018 CPRE report, educators and practitioners rated the skill of working with research and analytics a 4.16 in importance (on a scale from 1 to 5), yet scored entry-level practitioners only a 3.11 on actually having that skill. Relatedly, educators and practitioners rated critical thinking a 4.45 in importance, yet scored entry-level practitioners a 3.07 on having those skills. In addition to the importance of data skills emphasized by CPRE, five of the ACEJMC (2022) professional values and competencies relate to research, data, and technology. Recommended competencies include presenting information; thinking critically, creatively, and independently; conducting research and evaluation; applying basic numerical and statistical concepts; and applying tools and technologies.

In addition to the CPRE (2018) report, Krishna et al.’s (2020) survey of public relations practitioners and Brunner et al.’s (2018) analysis of PR job announcements both indicated the importance of research and measurement skills for entry-level practitioners. Based upon a content analysis of university websites and job advertisements, Auger and Cho (2016) concluded that PR curricula were overall aligned with the needs of practice, except for social media and technology. O’Neil and Pham (2020) analyzed 101 full-time communication and research job positions posted on Glassdoor in late 2019. The advertisements most commonly required the following knowledge and skills: SEO (search engine optimization), SEM (search engine marketing), OTT (over-the-top), traffic metrics, A/B testing, data analytics, data visualization, presentation, and teamwork.

Other recent pedagogical work has examined how public relations educators are teaching data and analytics, which students have indicated they desire (Meng et al., 2019; Waymer et al., 2018). Ewing et al. (2018) researched how PR faculty are teaching social media analytics by analyzing course syllabi and conducting a Twitter chat with 56 educators and practitioners. Participants (mostly educators) suggested students know how to measure social media results, understand the context of social media, engage in social media listening, and conduct digital storytelling. The researchers’ analysis of syllabi revealed very few included learning outcomes related to analytics in general or required certifications with an analytic underpinning. Fang et al. (2019) also examined digital media content in 4,800 courses offered in 99 advertising and public relations programs. Approximately one in four universities offer digital media courses, and those courses overall emphasize skills more than concepts. 

Luttrell et al. (2021) investigated how social media, digital media, and analytics courses have been incorporated into the public relations curriculum in programs accredited by ACEJMC and/or the Certificate in Education for Public Relations (CEPR). Only 32% of 94 programs require either an undergraduate or graduate course in social media, digital media, or analytics; 16% of programs offer these courses as electives. McCollough et al. (2021) examined 154 syllabi to see how programs are teaching new media. Their study indicated 21% of courses offered content related to analytics and interpretation; only a few mentioned “social listening, data insights, or return on investment” (p. 41). Importantly, these two studies indicate only one of three accredited programs—or one out of five when considering syllabi—are teaching data and analytics. 

Feedback from Practitioners About Data Skills and Knowledge Needed

Research has also focused on feedback from practitioners on how to best prepare students for the public relations field. According to communication executives in the United States and China, PR education is not adequately preparing students for emerging media and technology (Xie et al., 2018). The executives named digital and social media as one of the six primary skills needed to succeed and said students should be trained to be “digital thinkers” (Xie et al., 2018, p. 10). “Critical thinking, continuous learning, emotional intelligence, and curiosity” (Xie et al., 2018, p. 301) were ranked as the most important soft skills for entry-level practitioners.

Communication practitioners have repeatedly said students do not need to be trained to be data scientists (Neill & Schauster, 2015; Wiesenberg et al., 2017). Yet, students must embrace numbers, math, business, and statistics (Neill & Schauster, 2015; Wiencierz & Röttger, 2019; Xie et al., 2018). Other suggestions include teaching students how to conduct data analysis, evaluate a campaign’s impact (Freberg & Kim, 2017), engage in social media listening (Neill & Schauster, 2015), and manage a measurement budget (Xie et al., 2018).

Lee and Meng (2021) interviewed South Korean executives for their perceptions of data competency needed among communication practitioners. According to these practitioners, having the right mindset is more important than having the skills to work with data and tools. Lee and Meng (2021) posited that data competency can be fostered by building cognitive analytics, data management, technology literacy, sensemaking skills for data transformation, and crisis management digital skills.

Fourteen managers from public relations agencies described what analytics-related knowledge and skills are needed for entry-level practitioners (Adams & Lee, 2021). They said educators should focus less on the tools and more on content. The agency practitioners recommended critical thinking, general measurement approaches, communicating data insight, social media listening tools, influencer marketing, message resonance, and data storytelling.

In summary, this review of literature has indicated the growing need for data and analytics competency among entry-level PR practitioners. Educators are seeking to enhance how they teach data and analytics, but research suggests there is room for improvement. Scholars have noted the need for more feedback from industry professionals about teaching data competency (Ewing et al., 2018; Fang et al., 2019; Luttrell et al., 2021). This study builds upon Adams and Lee’s (2021) research by expanding the sample from agency employees to communicators working in a wide range of industries. Moreover, the focus of this project is on data, in general, and is not limited to analytics. The study seeks to answer the following questions:

RQ1: What knowledge and skills do students need related to data and public relations?  

RQ2: What basic software/tools are organizations using to analyze data and digital analytics and which of these tools should students learn?  

RQ3: What can educators do to improve student readiness in these areas? 

Method

Researchers recruited 28 public relations professionals with data and analytics experience using purposive and snowball sampling. Researchers recruited from their professional networks, many of whom are members of either the Institute for Public Relations Measurement Commission or the International Association for Measurement and Evaluation of Communication (AMEC) and have decades of experience in public relations, research, and analytics. As indicated by Table 1, most participants work for either corporations or agencies, but some work at nonprofit organizations and consultancies; industries represented included air transportation, communication/information, consumer packaged goods, education, entertainment/sports, finance/insurance, government, and healthcare. More than 50% had more than 20 years of experience.

Researchers conducted the interviews via Zoom between November 2021 and January 2022. Interviews, lasting approximately 60 minutes, were recorded and transcribed verbatim for analysis. Participant names were removed from transcripts to protect identities and were replaced with numbers (see Table 1). These numbers appear with responses in the results section. In some examples, a participant’s role is mentioned to provide context. 

Researchers analyzed the interviews using the three processes of data reduction, data display, and conclusion drawing and verification (Miles & Huberman, 1994). Researchers analyzed transcripts line-by-line to generate categories and created broad categories based upon the conceptual framework and variables under investigation. Researchers worked together to identify the major patterns and themes suggested by the coding categories. Next, researchers reread the transcripts to code the material according to the emerging categories and to identify the frequency of responses and representative quotes and stories. 

Results

RQ1: Knowledge and Skills Students Need Related to Data and Digital Analytics 

Several patterns emerged from the interviews related to the knowledge and skills public relations students need related to data. Before students can analyze data, participants said students must have an understanding of PR fundamentals and basic research and statistics concepts. From a hard skills perspective, students must explain data accurately and clearly through solid storytelling and data visualization. Finally, participants discussed the importance of soft skills, including a willingness to learn, adaptability, and critical thinking. Participants said they could teach employees about tools; however, it was challenging to teach soft skills. 

Knowledge Needed: Understanding PR Fundamentals and Business Functions

In order to conduct effective data analysis for an organization, participants pointed to the foundational need for students to understand fundamentals first, especially how public relations connects to other business functions. According to one communication manager, it is important for students to grasp “the rationale behind public relations,” which means core PR classes “are really important for this [digital analytics] role, getting that domain expertise in the communications and PR area” (2). Another participant agreed that knowledge of PR skills, such as writing, reporting, and pitching, is essential for data storytelling. 

Having knowledge of the organization beyond the PR department is crucial. Students need to know enough to communicate with others outside their area. Interview participants encouraged students to learn business basics so they would be able to guide communication efforts that would help meet organizational goals. One CEO explained, “if you can’t make it relevant to a business leader because you don’t know very much about business, you’ve got a problem” (17). He said students should learn “all of the contextual pieces” of the organization, from finance to human resources—not to become an expert in every area but to “learn enough” to understand the context—“You don’t have to become a data scientist, but you do have to understand what the fundamentals are so that when you sit down and actually do some of this work or even pose some of these questions, you will have a background” that allows you to proceed effectively (17). A vice president for social and content marketing emphasized the importance of understanding the bigger picture; PR is “one driver, but how do we fit in with the rest of the channels and that consumer experience?” (13). A communication consultancy CEO also recommended students learn every aspect of the organization they work for:

For students to be successful and to deliver value to their organization in the future, I think it’s very important to think broadly to understand how does value happen in an organization. Go out with the sales reps on the road and work in different parts of the organization and learn how people view the customer, the processes internally, the data that results from both of those, and of course, the management structure and layers and ways of getting things done.‬ (21)

Connecting to organizational strategy/objectives. Many of the participants’ responses focused on goals, objectives, and what to measure, which means students need to understand the purposes behind data analysis. One participant said students need to know “how communications data can work in a business—why it’s important, why it’s something that we need to be doing” (1). Several participants pointed to the problem of opening an analytics tool without understanding the “why” first. One participant offered the example of someone going into Google Analytics and looking at site visitors and referral sources but not first considering “Why do we care about that?” (14). One CEO said students need to understand that “it’s the questions that come first and then the analytics, and then the analytics tell you whether or not you’re measuring the stuff you need to be measuring” (17). An EVP of analytics agreed, “We really try to first make sure everybody starts with business goals, communications objectives, and audience alignment, and that’s something that is still very confusing to a lot of clients, and even a lot of our junior staff still has a hard time” (7). She encouraged: 

[M]aking sure a goal is a quantifiable goal, so it has a who, what, by when, by how much, whatever, in my opinion, if they get used to doing that, it almost becomes obvious, “Well, do I know enough about my audience to know that this is the right goal? Do I know enough about the culture or the landscape to know if this is something I can do?” If I do, great. Then what are my benchmarks, so I know if I’ve achieved that goal? And it forces that quantified goal to become a way to make sure analytics is part of planning, a part of optimizing, and a part of then the measurement at the end. (7)

Strategy. If faculty have used the ROSTIR (Research, Objectives, Strategy, Tactics, Implementation, Reporting) model in introductory classes, students have learned that objectives must be in place before strategies are developed and that strategy should be defined before tactics are considered (Luttrell & Capizzo, 2022); students also need to grasp how these steps connect to digital analytics. A CMO said:

Remind students that strategy is timeless…. It’s a very natural tendency on the part of students and practitioners to get caught up in the tactics. But say, “Okay, how are we tying this back to the brand here? . . . How is this tied to the overall approach? How is this supporting this larger goal?” (23)

One participant pointed to how vital it is for students to understand strategy before ever using an analytics tool. “A lot of the analytics tools are dependent on you understanding what a strategy is and understanding how you can take your goals and turn them into key performance indicators, your KPIs, and then how you can build reports from that” (14). Students must comprehend strategy to be able to select the appropriate analytics.

What to Measure. An analytics manager with 15 years of experience said students need to learn to measure outcomes rather than just outputs. She explained outcomes are “really hard to measure,” but it is ideal if students understand the importance of business outcomes (1). Her advice connects to both the second and third iterations of the Barcelona Principles. According to Barcelona Principle No. 2, “Measurement and evaluation should identify outputs, outcomes, and potential impact” (AMEC, 2020). Barcelona Principle No. 3 says, “Outcomes and impact should be identified for stakeholders, society, and the organization” (AMEC, 2020).

Knowledge Needed: Research and Statistics

A communication manager with more than 16 years of experience said, in addition to a “domain expertise about media,” public relations students need an interest “in numbers and understanding of just the basic analytics principles and what it means to explore data” (2). To work in PR now necessitates “an understanding of statistics of some sort” (22). A participant who heads the analytics team for a large agency said, “this is no longer nice to have. You don’t have to be a data person, but you do need to have a base understanding of how to read a chart” (7). Another agency executive pointed to the need for students to know how to write a survey, and an agency founder said all communicators need to complete at least one statistics class that allows students to practice with “a wider range of datasets” (19).

A director of data science said students should not run away from statistics. “Statistics is not math; it literally is not math. You don’t have to do any calculations in statistics. You have to understand how to apply something and when to press the right buttons; there’s no math” (20). A founder of a communication analytics-focused company with more than 25 years of experience agreed students need to move beyond fear of statistics if they want to work in professional communication:

A lot of people go into PR or comms or even marketing because at some level they say, ‘Wow, I really did not like math in college or high school, and this looks like something that is math-free.’ That would be a huge mistake to believe that today. Nothing is math-free, numbers-free, technology-free. If you had a real problem with STEM, science, technology, math in school, you definitely should not go into marketing and communications in the future. (17)

Participants suggested students learn about database systems, spreadsheets, Boolean syntax, data literacy, and dashboards. In fact, one source said, “Get really good Boolean operating codes, then that’s your bread and butter” (16). In addition to Boolean syntax, another source suggested learning the programming language SQL: “A foundational skill for analytics is SQL and being able to query, investigate, and understand large datasets” (26). While one source said seeing R and Python on a resume would catch her attention, other participants argued there is no need for students to learn R and Python because companies can hire a data scientist; instead, PR employees need to be able to work with data scientists and to discern the insight that has relevance for business outcomes and PR programming. A participant with 30 years of experience said, “They don’t need to be data scientists. They need to have an understanding of it… ask questions. . . . be good probers of the data” (18). Students must recognize “what’s an important number and what’s not” (22) and “be curious about where things came from” (24). More than any particular tool or ability, participants said students need to be comfortable with data: “how to structure it, how to blend it, how to analyze it, and how to communicate about it” (19).
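The Boolean- and SQL-style querying participants describe can be sketched in a few lines with Python’s built-in sqlite3 module. This is a minimal illustration only: the mentions table, its columns, and the sample rows are hypothetical, not drawn from any participant’s actual system.

```python
import sqlite3

# Build a tiny in-memory dataset of hypothetical media mentions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mentions (outlet TEXT, text TEXT)")
conn.executemany(
    "INSERT INTO mentions VALUES (?, ?)",
    [
        ("Blog", "Our new campaign launch drew praise"),
        ("News", "Recall announcement raises safety concerns"),
        ("News", "Campaign wins industry award"),
    ],
)

# Boolean-style query: mentions containing 'campaign' AND NOT 'recall'.
# (SQLite's LIKE is case-insensitive for ASCII text by default.)
rows = conn.execute(
    "SELECT outlet, text FROM mentions "
    "WHERE text LIKE '%campaign%' AND text NOT LIKE '%recall%'"
).fetchall()
for outlet, text in rows:
    print(outlet, "-", text)
```

The point is less the syntax than the habit participants describe: translating a communication question (“coverage of the campaign, excluding the recall story”) into an explicit, inspectable query.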

Hard Skills Needed: Data Visualization and Storytelling

Participants repeatedly said public relations students do not need the same expertise as a data scientist. They need to be able to take complex information and convert it “into simple-to-understand information” (20). Participants spoke of “data-driven storytelling” (6) and simply “being able to explain” (7), which includes presentation skills to “tell your story” (2). One source indicated data visualization is a growth area within their organization, and they will “be hiring big on next year” (7).

Data visualization tools were frequently mentioned by participants, including Tableau and Alteryx; however, one participant warned that tools that create an automatic visual for users might be dangerous: “I’m not a huge fan of data analysis using visualization tools purely because I think it is ripe for the potential of misrepresenting the data” (19). She recommended teaching students basic visualization within communication classes, including the importance of labeling information correctly and providing data sources. Other participants mentioned the frequent need to create their own graphs and other visualization pieces at work, despite the existence of automated tools, so a basic knowledge of good design is helpful.
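The labeling practices this participant recommends, correct labels plus a visible data source, can be sketched with matplotlib rather than an automated tool. The engagement figures below are invented purely for illustration.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Hypothetical monthly engagement counts, invented for illustration.
months = ["Jan", "Feb", "Mar", "Apr"]
engagements = [1200, 1450, 980, 1600]

fig, ax = plt.subplots()
ax.bar(months, engagements)
# Label the chart so it cannot be misread out of context.
ax.set_title("Monthly Social Media Engagements (hypothetical data)")
ax.set_xlabel("Month")
ax.set_ylabel("Engagements")
# Put the data source directly on the figure, as the participant advises.
fig.text(0.01, 0.01, "Source: hypothetical platform export", fontsize=8)
fig.savefig("engagements.png")
```

Building the chart by hand forces the student to make the labeling and sourcing decisions that an auto-generated visual would otherwise hide.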

Soft Skill Needed: Willingness to Learn

While demonstrating curiosity and a commitment to lifelong learning is essential in public relations, participants pointed out “genuine curiosity” (8) is critical when it comes to mining and analyzing data and determining insights for communication strategy. Ten of the 28 participants emphasized the importance of curiosity. For example, a corporate communication professional said, “A digital analytics practitioner must have curiosity and strong communication skills” because that interest “will keep them asking why, keep them digging, which will uncover a deeper understanding in their analyses” (26). Another participant said, “I try to hire people who are curious” and those with “an aptitude for understanding the bigger story and the strategy” (6).

The participants advised educators to help students and young professionals understand the value of recognizing there’s always going to be more to learn, showing a willingness to learn, and being comfortable with asking questions. A communication executive at a not-for-profit healthcare organization said, “Be willing to say, ‘I’m not an expert at it, but I want to increase my level of understanding,’ because that’s just what it’s going to take for them to be successful” (24).

An executive at a communication consultancy (27) said people with “inquisitive minds” and “a point of view” are more successful working with data and digital analytics. Another executive working for a company specializing in artificial intelligence (14) discussed the value of “being open to trying something” and “digging into the numbers” to discern patterns and insights. According to a participant who directs analytics at a large agency, “Being a person who always wants to know more, wants to understand more, wants to learn more” will lead to both personal and professional success (7).

Soft Skill Needed: Embracing Change and Unexpectedness 

Participants discussed how evolving digital platforms and tools create challenges with data access and analysis, which can be frustrating and time-consuming. Students need to learn to deal with these challenges and be open to using different approaches to capture and analyze data. In the words of one seasoned practitioner: “Just encourage [students] to get creative and to try things and to not get upset when things get broken” (23). A corporate communication executive explained: “The number-one quality we look for in candidates is adaptability” because “analytics is a science and, as such, it is always on a journey of discovery” (26).

Soft Skill Needed: Creative and Critical Thinking Skills

Overwhelmingly, the research findings demonstrated the value of creative and critical thinking skills to effectively work with data and digital analytics. Participants described digital analytics as an art and science and how public relations students and professionals need to be both creative and analytical when accessing and reviewing data. A corporate communication manager (2) emphasized the importance of “being comfortable with ambiguity” and “pushing back” to dig deeper into the data to determine relevant insights. Another participant (21) explained: “There’s a creative leap in interpreting data and its application” and students must not accept “what the data may appear to say at face value.” 

To help students develop critical thinking skills, several participants discussed the value of educators encouraging students to ask thoughtful questions. For example, educators can present a problem, share some data, and direct students to probe in a way that leads to insights connected to business and communication goals. This approach for teaching insight creation is practiced in the workplace. An executive for a global agency (7) explained they conduct training sessions to teach employees how to connect the data back to the communication problem and how to use data to lead to actionable insights. 

RQ2: Software and Tools Used to Analyze Data 

When asked about software and tools used for data analysis, participants described almost 80 software tools and programs, including those they use either in house or in collaboration with external partners. Eight tools were mentioned by five or more participants: Google Analytics, Tableau, Excel, Adobe Analytics, Talkwalker, Brandwatch, Salesforce, and Sprinklr (see Table 2). Google Analytics was mentioned the most. Related to recent tool trends, one participant indicated “the tool conversation, the PR AdTech, MarTech, data tech stack conversation is one where we’re spending an awful lot of time” (3).

Participants explained the excitement and challenge of this explosion of tools. While practitioners may now choose from a wide range of tools, no single program is capable of accomplishing the myriad tasks needed, which means data must be coordinated from multiple sources, and practitioners frequently combine tools or create their own tools to meet their needs. 

When asked which of these tools they recommend for students to learn, 53 different tools/programs were named, and of these, only three were mentioned by five or more participants: Google Analytics, Excel, and Tableau (see Table 3). Participants repeatedly emphasized that educators should not worry about teaching the latest data analytics tool because tools change, and employers can teach the tools. Instead, participants suggested educators help students become more comfortable with the meaning of numbers and research in general.

Although many of the interview participants encouraged professors to be “platform agnostic,” focusing on concepts more than specific platforms, two tools were repeatedly mentioned as critical: Google Analytics and Excel. Google Analytics was referred to as “table stakes” or “low-hanging fruit” (1), “a must” (23), and “a good place to start” (15). One participant said the platform training “gives you a framework for not only thinking about digital analytics, but a framework for thinking about how users move around the web and interact with digital channels” (9). Another explained, “if you understand the terms per Google, you’ll understand about 80% of everything else that you might look at . . . because that’s the terminology that just about every other platform uses, so I would say that’s the starting point” (11). Similarly, sources said to “start with Excel” (19): “Microsoft Excel is a good way to understand and learn how to organize data, how to use formulas to manipulate data within Excel. You can create charts and graphs and pie charts and all of those different types of things, so I would definitely look for competency at a bare minimum of Excel” (15). Specifically, sources recommended students learn how to run pivot tables, make charts, and pull graphs out of Excel to put into PowerPoint. In addition to emphasizing Google Analytics and Excel, a few sources suggested exposing students to as many tools as possible because “you don’t necessarily know what that company or agency is using” (19).
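For educators who also work in Python, the pivot-table skill participants emphasize has a rough analogue in pandas, which can reinforce the same concept outside Excel. The campaign data below is hypothetical, invented only to show the mechanics.

```python
import pandas as pd

# Hypothetical campaign data, as it might be exported from a platform.
df = pd.DataFrame({
    "channel": ["Twitter", "Twitter", "Instagram", "Instagram"],
    "month": ["Jan", "Feb", "Jan", "Feb"],
    "engagements": [500, 700, 300, 900],
})

# The pandas equivalent of an Excel pivot table: channels as rows,
# months as columns, summed engagements as the values.
pivot = pd.pivot_table(df, values="engagements", index="channel",
                       columns="month", aggfunc="sum")
print(pivot)
```

Whether built in Excel or pandas, the underlying competency is the same: reshaping raw rows into a summary a client can read at a glance.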

RQ3: How Educators Can Improve Student Readiness

Participants shared suggestions to help educators prepare students for data and analytics competency. To conquer students’ fear of analytics, some practitioners recommended educators embed data and analytics in multiple courses, with one participant (19) explaining: “You have to socialize them to it and maybe spoon feed in little baby steps, but all along from the beginning.”

Some participants said educators should teach students to dig into the context. For example, if students are analyzing social media conversations on Brandwatch, they should also analyze media coverage and competitor information to understand the nuances of micro changes in those conversations. Respondents recommended that PR educators partner with other academic units on campus, such as business or data science, or with industry professionals or agencies, to team-teach data competency to students.

Participants suggested educators use real clients and datasets to deepen learning, something also recommended in the interviews conducted by Adams and Lee (2021). One manager at a global agency (4) said educators should incorporate open-ended assignments that encourage students to ask questions, inspire motivation, and figure out solutions on their own. Respondents also provided a number of assignment suggestions, including:

  • Use AMEC research award entries to write case studies. Students could interview the professionals who submitted an entry to discern best practices and write the study (18).
  • Have students assume the role of a junior executive in a communication agency, and in a 48-hour timeframe, create a client report with insights and infographics (5).
  • Encourage students to participate and learn in online conversations about PR data and analytics on platforms such as Reddit, Slack, and LinkedIn (28).
  • Have students develop weekly reports to examine different sources of data to consider societal factors that may be driving change (18).
  • Give students a large data file on the first day of class. Teach them how to clean the data and how to gain insights in steps across the semester (20). 
  • Require students to attend a dissertation defense presentation from another department to gain practice taking complex ideas and data from outside their field and communicating key takeaways in a way that is understandable to a lay person. They could summarize the highlights in an executive summary or pitch the newsworthy findings in a news release (20).
  • Develop a data integrity assignment that requires students to write and explain their data source, including any possible biases and/or limitations (18).

  • Analyze social conversations on Brandwatch and connect the analysis to what’s happening in the news and from a Google search. Connect the analysis to both theory and conceptual frameworks when looking for insight and making recommendations (5).
  • Examine where social media fits within the consumer journey for a business and how it impacts outcomes relative to other channels (13).
  • Use a client or university website to understand how to improve campaigns and readership using data from Google Analytics (24).
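The data-cleaning assignment above (give students a large file on day one and clean it in steps across the semester) could be staged along the lines of the following pandas sketch. The “messy” dataset and its column names are invented for illustration; they are not from the participant’s course.

```python
import pandas as pd

# A hypothetical messy export, like one handed out on day one of class.
raw = pd.DataFrame({
    "Outlet ": ["News Site", "news site", None, "Blog"],
    "Reach": ["1,200", "900", "450", "not reported"],
})

# Step 1: tidy the column names (strip stray spaces, lowercase).
raw.columns = [c.strip().lower() for c in raw.columns]
# Step 2: normalize text values and drop rows missing an outlet.
raw["outlet"] = raw["outlet"].str.title()
raw = raw.dropna(subset=["outlet"])
# Step 3: coerce reach to numbers; unparseable entries become NaN,
# which surfaces a data-integrity question for class discussion.
raw["reach"] = pd.to_numeric(raw["reach"].str.replace(",", ""),
                             errors="coerce")
print(raw)
```

Each step maps to a week-sized lesson: naming conventions, missing data, then type coercion and what to do with values that will not parse.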

One participant encouraged educators not to feel pressured to teach students everything about data analytics: “I think there’s a naive belief that a university can train everything. It can’t, it absolutely can’t and it shouldn’t” (20). He also shared an encouraging message for graduating seniors:

[T]he company is going to invest money and time into training you, but they have a base level of knowledge that they want you to have. And I think there’s this little fear that I should know how to do everything when I walk in the door, and that’s crap, you’re never going to know everything when you walk in the door. We’re going to teach you the things that we think you don’t know, and you should ask questions along the way. (20)

The resulting focus should be for students to learn as much as they can in and out of school, to be ready to continue learning throughout their careers as tools change, and to ask questions as confusion arises.

Discussion

In this study, seasoned communication professionals from a wide range of industries shared recommendations on how public relations educators can best prepare students to succeed in our increasingly digitized world. According to participants, students need a range of knowledge and hard and soft skills to work effectively with data and analytics. Most importantly, students must understand PR fundamentals, including how PR connects to other organizational functions and goals (Adams & Lee, 2021; Brunner et al., 2018; Ewing et al., 2018; Krishna et al., 2020). Practitioners explained that knowing business basics and knowing one’s own industry are critical for asking the right questions, considering the nuances and context, and discerning actionable insight. Understanding how data aligns with or drives organizational objectives overshadows knowledge of any one digital tool or metric. While practitioners explained students do not need to be data scientists (Neill & Schauster, 2015; Wiesenberg et al., 2017) nor know a programming language, they must have a strong grounding in research and statistics (Brunner et al., 2018; Krishna et al., 2020). Students must understand statistics and research in order to know how to examine frequency distributions, correlations, regression analysis, A/B testing, and more when examining data. Qualitative research skills are also needed for examining digital conversations and discerning meaning in data. Finally, students must also know how to succinctly and compellingly tell a story using data visualization for a wide range of audiences. Students must learn how to filter unnecessary data points to construct a simple story.

Much of the feedback from practitioners relates to soft skills, which employers often weigh more heavily than hard skills when making hiring decisions (Lee & Meng, 2021; Xie et al., 2018). The soft skills mentioned by participants included a willingness to learn, adaptability, and critical thinking, all of which align with the cognitive analytics and sensemaking skills recommended for data competency by Lee and Meng (2021) and Xie et al.’s (2018) research. PR educators, mentors, and internship supervisors can all help to cultivate these necessary soft skills. Study practitioners suggested assignments that could foster critical thinking and adaptability, such as requiring students to wade through data dumps, think about data biases when cleaning and sorting data, and figure out how the data provides solutions to specific problems.

Given constantly changing technology, a plethora of programs, and the high price tag of many tools, it is daunting to decide which digital tools to teach to PR students. However, participants explained data competency relates more to the approach than the tool. Encouragingly, the tool most widely recommended by participants was Google Analytics, one that provides free training and certification. Excel was another basic and cost-effective tool recommended frequently and vehemently by practitioners. According to participants, students must know how to create and analyze a pivot table and create graphs using Excel; therefore, educators may want to require Excel certification. For faculty who want to learn new tools or software, the key is to start and keep it simple. Educators can tap into resources, like Matt Kushin’s Social Media Syllabus blog and Karen Freberg’s Social Media Professors Facebook Community Group.

While this study builds upon other research touting the necessity for PR students to learn to work with data, the question remains whether educators should create a stand-alone course and/or integrate data analytics into existing courses. Given increasingly tight resources and crowded curriculum requirements, a separate course might not be possible; therefore, educators should consider spoon-feeding data and analytics training across the curriculum, including introductory public relations, campaigns, research, and social media courses. Educators could introduce data and common terminology and metrics in introductory classes and later require students to use and analyze data in more advanced courses (Kent et al., 2011). Educators should continue to foster connections with industry professionals to serve as guest speakers, mentors, and project partners and to use real data and clients (Adams & Lee, 2021; Meng et al., 2019). Finally, students must take some responsibility for their own learning about how to work with data. Students can invest in their own learning by earning certifications, reading blogs and posts related to data analytics, attending brown bags and webinars, and completing internships.

While this study sheds much-needed insight into how to teach data and analytics, the findings are limited to a sample of 28 communication professionals. Future researchers might implement a survey with a larger sample of communicators to ask about data competency and tools needed. Future research could also compare the efficacy of various pedagogical approaches used by educators to teach data and analytics. Another possibility is to examine and describe data and social media labs housed in communication academic programs.

In conclusion, this research has indicated that while educators have many new tools and ways to teach data competency to public relations students, the basics have not changed. To succeed, students need foundational knowledge in PR concepts and models, strategy, business acumen, and research; skills in analyzing data and connecting to strategy and storytelling; and soft skills in critical thinking, adaptability, and a desire to learn. Educators should focus less on the tools and more on the knowledge outcomes and skills identified in this study. By investing small amounts of time in professional development and focusing on the basics (e.g., Google Analytics and Excel), educators can cultivate data competency among themselves and their students.

REFERENCES

ACEJMC. (2022). Principles of accreditation. http://www.acejmc.org/policies-process/principles/

Adams, M., & Lee, N. M. (2021). Analytics in PR education: Desired skills for digital communicators. Journal of Public Relations Education, 7(2), 44-76. https://aejmc.us/jpre/2021/08/31/analytics-in-pr-education-desired-skills-for-digital-communicators/

AMEC. (2020). Barcelona Principles 3.0. https://amecorg.com/2020/07/barcelona-principles-3-0/

Auger, G. A., & Cho, M. (2016). A comparative analysis of public relations curricula: Does it matter where you go to school, and is academia meeting the needs of the practice? Journalism & Mass Communication Educator, 71(1), 50-68. https://doi.org/10.1177/1077695814551830

Brunner, B. R., Zarkin, K., & Yates, B. L. (2018). What do employers want? What should faculty teach? A content analysis of entry-level employment ads in public relations. Journal of Public Relations Education, 4(2), 21-50. https://aejmc.us/jpre/2018/08/17/what-do-employers-want-what-should-faculty-teach-a-content-analysis-of-entry-level-employment-ads-in-public-relations/

CommTech Quickstart Guide. (2020). Arthur W. Page Society. https://knowledge.page.org/report/commtech-quickstart-guide/

CPRE. (2018). Fast forward: Foundations + Future state. Educators + Practitioners. The Commission on Public Relations Education. http://www.commissionpred.org/wp-content/uploads/2018/04/report6-full.pdf

Duhé, S. (2016, July 27). The three pillars of PR education in the future. Institute for Public Relations. https://instituteforpr.org/view-future-public-relations-education/

Ewing, M., Kim, C. M., Kinsky, E. S., Moore, S., & Freberg, K. (2018). Teaching digital and social media analytics: Exploring best practices and future implications for public relations pedagogy. Journal of Public Relations Education, 4(2), 51-86. https://aejmc.us/jpre/2018/08/17/teaching-digital-and-social-media-analytics-exploring-best-practices-and-future-implications-for-public-relations-pedagogy/

Fang, F., Wei, W., & Huang, H. (2019). Keeping up with fast-paced industry changes—Digital media education in U.S. advertising and PR programs. Journal of Advertising Education, 23(2), 80-99. https://doi.org/10.1177/1098048219877765

Fitzpatrick, K. R., & Weissman, P. L. (2021). Public relations in the age of data: Corporate perspectives on social media analytics (SMA). Journal of Communication Management, 25(4), 401-416. https://doi.org/10.1108/JCOM-09-2020-0092

Freberg, K., & Kim, C. M. (2018). Social media education: Industry leader recommendations for curriculum and faculty competencies. Journalism & Mass Communication Educator, 73(4), 379-391. https://doi.org/10.1177/1077695817725414

Galloway, C., & Swiatek, L. (2018). Public relations and artificial intelligence: It’s not (just) about robots. Public Relations Review, 44(5), 734–740. https://doi.org/10.1016/j.pubrev.2018.10.008

Galvin, J., LaBerge, L., & Williams, E. (2021, May 26). The new digital edge: Rethinking strategy for the postpandemic era. McKinsey Digital. https://www.mckinsey.com/business-functions/mckinsey-digital/our-insights/the-new-digital-edge-rethinking-strategy-for-the-postpandemic-era

Kent, M. L., Carr, B. J., Husted, R. A., & Pop, R. A. (2011). Learning web analytics: A tool for strategic communication. Public Relations Review, 37, 536-543. https://doi.org/10.1016/j.pubrev.2011.09.011

Kent, M. L., & Saffer, A. J. (2014). A Delphi study of the future of new technology research in public relations. Public Relations Review, 40(3), 568-576. https://doi.org/10.1016/j.pubrev.2014.02.008

Knowledge@Wharton. (2018). Vishal Sikka: Why AI needs a broader, more realistic approach. http://knowledge.wharton.upenn.edu/article/ai-needs-broader-realistic-approach/

Krishna, A., Wright, D. K., & Kotcher, R. L. (2020). Curriculum rebuilding in public relations: Understanding what early career, mid-career, and senior PR/communications professionals expect from PR graduates. Journal of Public Relations Education, 6(1), 33-57. https://aejmc.us/jpre/2020/01/21/curriculum-rebuilding-in-public-relations-understanding-what-early-career-mid-career-and-senior-pr-communications-professionals-expect-from-pr-graduates/

Lee, J. J., & Meng, J. (2021). Digital competencies in communication management: A conceptual framework of readiness for Industry 4.0 for communication professionals in the workplace. Journal of Communication Management, 25(4), 417-436. https://doi.org/10.1108/JCOM-10-2020-0116

Luttrell, R. M., & Capizzo, L. W. (2022). Public relations campaigns: An integrated approach (2nd ed.). Sage.

Luttrell, R., Wallace, A. A., McCollough, C., & Lee, J. (2021). Public relations curriculum: A systematic examination of curricular offerings in social media, digital media, and analytics in accredited programs. Journal of Public Relations Education, 7(2), 1-43. https://aejmc.us/jpre/2021/09/10/public-relations-curriculum-a-systematic-examination-of-curricular-offerings-in-social-media-digital-media-and-analytics-in-accredited-programs

McCollough, C. J., Wallace, A. A., & Luttrell, R. M. (2021). Connecting pedagogy to industry: Social and digital media in public relations courses. Teaching Journalism & Mass Communication, 11(1), 36-48. https://aejmc.us/spig/volume-11-number-1-2021/

Meng, J., Jin, Y., Lee, Y.-I., & Kim, S. (2019). Can Google Analytics certification cultivate PR students’ competency in digital analytics? A longitudinal pedagogical research. Journalism & Mass Communication Educator, 74(4), 388-406. https://doi.org/10.1177/1077695818816916

Meng, J., Reber, B. H., Berger, B. K., Gower, K. K., & Zerfass, A. (2021). North American Communication Monitor 2020-2021. The impact of COVID-19 pandemic, ethical challenges, gender issues, cyber security, and competence gaps in strategic communication. The Plank Center for Leadership in Public Relations. http://plankcenter.ua.edu/the-2020-2021-north-american-communication-monitor-identifies-trends-and-challenges-in-a-year-of-continuous-crisis/

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Sage.

Neill, M. S., & Schauster, E. (2015). Gaps in advertising and public relations education: Perspectives of agency leaders. Journal of Advertising Education, 19(2), 5–17. https://doi.org/10.1177/109804821501900203

O’Neil, J., Ewing, M., Smith, S., & Williams, S. (2021). Measuring and evaluating internal communication. In R. L. Men & A. T. Verčič (Eds.), Current trends and issues in internal communication (pp. 201-222). Springer Nature.

O’Neil, J., & Pham, T. (2020). Research, measurement and evaluation job advertisements: Responsibilities, requirements and gendered language. Proceedings of the 23rd International Public Relations Research Conference, 368-378. https://a5522a3d-553d-4677-86c2-c4e5275200c1.filesusr.com/ugd/27a53c_9f86ad7c56cb4302a0988c394c42e542.pdf

Samson, D., & O’Leary, J. (2020). CommTech: The path to a modern communications function. PageTurner Blog. https://page.org/blog/commtech-the-path-to-a-modern-communications-function

Sommerfeldt, E. J., & Yang, A. (2018). Notes on a dialogue: Twenty years of digital dialogic communication research in public relations. Journal of Public Relations Research, 30(3), 59-64. https://doi.org/10.1080/1062726X.2018.1498248

Stansberry, K. (2016). Taming the social media data deluge: Using social media research methods in the public relations classroom. In H. S. Noor Al-Deen (Ed.), Social media in the classroom (pp. 75-92). Peter Lang.

Triantafillidou, A., & Yannas, P. (2014). How public relations agencies in Greece respond to digital trends. Public Relations Review, 40(5), 815-817. https://doi.org/10.1016/j.pubrev.2014.09.004

Virmani, S., & Gregory, A. (2021, November). The AI and big data readiness report. Chartered Institute of Public Relations. https://www.slideshare.net/CIPRPaul/cipr-ai-and-big-data-readiness-report

Waymer, D., Brown, K. A., Baker, K., & Fears, L. (2018). Socialization and pre-career development of public relations professionals via the undergraduate curriculum. Communication Teacher, 32(2), 117-130. https://doi.org/10.1080/17404622.2017.1372590

Weiner, M. (2021). PR Technology, Data and Insights: Igniting a positive return on your communications investment. Kogan Page Limited.

Weiner, M., & Kochhar, S. (2016). Irreversible: The public relations big data revolution. Institute for Public Relations. https://instituteforpr.org/wp-content/uploads/IPR_PR-Big-Data-Revolution_3-29.pdf

Wiencierz, C., & Röttger, U. (2019). Big data in public relations: A conceptual framework. Public Relations Journal, 12(3), 1-15. https://prjournal.instituteforpr.org/wp-content/uploads/Wiencierz-Roettger_Big-Data-in-Public-Relations-A-Conceptual-

Wiesenberg, M., Zerfass, A., & Moreno, A. (2017). Big data and automation in strategic communication. International Journal of Strategic Communication, 11(2), 95-114. https://doi.org/10.1080/1553118X.2017.1285770

Wright, D. K., & Hinson, M. D. (2017). Tracking how social and other digital media are being used in public relations practice: A twelve-year study. Public Relations Journal, 11(1), 1-31. https://prjournal.instituteforpr.org/wp-content/uploads/PRJ-2017-Wright-Hinson-2-1.pdf

Xie, Q., Schauster, E., & Neill, M. S. (2018). Expectations for advertising and public relations education from agency executives: A comparative study between China and the United States. Journal of Current Issues & Research in Advertising, 39(3), 289-307. https://doi.org/10.1080/10641734.2018.1490358

Table 1: Interview Participant Information

Participant | Current Job Title | Industry | Years of Experience
1 | Manager, Analytics & Insight | Air Transportation | 15
2 | Communications Manager, Measurement & Insight | Information/Telecommunications | 16
3 | Managing Director, Analytics-Based Strategy | Global Health Innovation Agency | 25+
4 | Associate Manager, Digital Analytics | Public Relations Agency | 5
5 | Data Consultant | Data Consultancy | 20+
6 | Director of Communication Intelligence | Information/Telecommunications | 25+
7 | EVP, Head of US Analytics | Public Relations Agency | 12
8 | Assistant Athletic Director for Digital Strategy and Analytics | Education | 8
9 | Digital Communication Agency | Digital Communication Agency | 10
10 | Founder & Chair; CEO; Chair | Data Science & Communication Agency | 20+
11 | Chief Visionary Officer and Founder | Digital Communication Agency | 25+
12 | Partner / Senior Vice President, Social Media | Advertising & Public Relations Agency | 19+
13 | VP, Social and Content Marketing Lead | Finance and Insurance | 15
14 | Chief Growth Officer | Marketing AI Agency | 28
15 | Audience Development Director | Communications Agency | 20+
16 | Head, Media Analysis | Government | 13+
17 | Founder and CEO | Communications Agency | 25
18 | CEO | Communication Industry Association | 30
19 | Founder & Chief Strategy Officer | Communication Agency (Oil & Gas focused) | 30
20 | Director of Data Science | Sports & Entertainment Consultancy | 7
21 | CEO | Communications Consultancy | 30
22 | Founder | Public Relations Consultancy | 42
23 | Chief Marketing Officer | Arts & Entertainment | 15
24 | EVP and Chief Marketing & Communications Officer | Health Care | 30+
25 | Chief Marketing and Communications Officer | Finance and Insurance | 22
26 | Senior Vice President and Chief Communications Officer | Consumer Packaged Goods | 31
27 | Director | PR & Strategic Communications Agency | 10
28 | Founder and CEO | Communications Agency | 22

Table 2: Software and Tools Most Frequently Used to Analyze Data and Data Analytics

Program/Tool | Frequency | Participant Quote about its Use
Google Analytics | 16 | “measuring engagement, share of voice, reach, landing, just all of that”
Tableau | 14 | data visualization; “Google Analytics overwhelmingly is where we get a lot of our data, but we’re using Tableau to present it.”
Excel | 12 | “99.9% of your job in analytics is using Excel” to manipulate data, figure out what’s important and to generate reports for clients
Adobe Analytics | 7 | “very similar to Google Analytics, but that’s a paid tool”
Talkwalker | 6 | social listening tool; “we have Talkwalker, which we’re huge, huge, huge fans of”
Brandwatch | 5 | “I really like Brandwatch from a listening perspective”
Sprinklr | 5 | “We also use Sprinklr for our social media monitoring, as well as our social media listening, as well as social media publishing.”
Salesforce | 5 | “Salesforce is a CMS system. And so that allows us to analyze things like our electronic newsletters, the open rates, the read rates, as well as social media data.”
Cision | 4 | “We use Cision, which is our media monitoring tool. That’s the tool that we distribute most of our news content through. And what I mean by that is reaching out to reporters and distributing our press releases. It’s our media monitoring and our distribution.”
Meltwater | 4 | “On the social front, we’re able to look at things like engagement rate through some platforms that we use, including Meltwater”

Table 3: Software and Tools that Students Should Learn

Program/Tool | Frequency | Participant Quote about its Use
Google Analytics | 15 | “Google Analytics and any of those social media analytics I think that are more of the low hanging fruit. That’s the table stakes, in my opinion” (and related to certification: “Google web analytics certified. Cool. That’s a marker. When we see a student who’s taken the effort, even outside of the program to go and do that, great.”)
Excel | 9 | “you need to be a fricking Excel power user. There’s no getting around that”; “a base Excel knowledge, I think is critical”
Tableau | 5 | “Ultimately, you’re looking for any trends or patterns that you can see. So really being able to visualize the data in some way, I think Tableau is great for that”
Brandwatch | 4 | “We use Brandwatch a lot for social media. And so, familiarizing yourself with those tools, I think, is very important”
Cision | 4 | “We use Cision a lot”
Sprinklr | 4 | “So for instance, Sprinklr, all the social media listening tools, basically just get the Twitter Firehose and then you drill down by keyword type of thing.”

© Copyright 2023 AEJMC Public Relations Division

To cite this article: O’Neil, J., Kinsky, E., Ewing, M., and Russell, M. (2023). “You don’t have to become a data scientist”: Practitioner Recommendations for Cultivating PR Student Data Competency. Journal of Public Relations Education, 9(1), 62-81. https://journalofpreducation.com/?p=3616

Reflexive transformative approach to student-centred learning: Insights from the frontlines of Australian higher education teaching during COVID-19

Editorial Record: Original draft submitted September 29, 2020. Manuscript accepted for publication March 9, 2021. First published online December 2021.

Authors

Kate Delmo, Ph.D.
Faculty of Arts and Social Sciences
University of Technology Sydney
Ultimo, NSW, Australia
Email: kate.delmo@uts.edu.au

Natalie Krikowa, Ph.D.
Faculty of Arts and Social Sciences
University of Technology Sydney
Ultimo, NSW, Australia
Email: natalie.krikowa@uts.edu.au

Abstract

COVID-19 has impacted the education sector in a host of ways (including financial, operational and pedagogical), many of which are unprecedented. This article adopts a case study approach to describe the impact that COVID-19 has had on a specific university teaching and learning experience by examining how teachers at one university responded to the sudden shift to online learning. This article discusses findings from two practitioners working in Public Relations and Communication disciplines in an Australian university, focusing on three key areas of impact: technology, class and content design, and student and staff care. It analyses how three approaches to higher education pedagogy (student-centred learning, active learning classrooms, and teacher reflexivity) have been adapted/adopted in this process as described in our “Structure, strategy, and sensibility: Pillars of transformative teaching practice framework.” Finally, this article demonstrates that although there were obvious and disruptive challenges faced by teaching staff in shifting to online learning, these challenges were met with equally unique opportunities for personal growth, professional development and learning and teaching innovation.

Keywords: technology, pedagogy, transformative teaching, active-learning classrooms, student-centred learning, reflexivity

Introduction 

COVID-19 has impacted the higher education sector in a host of ways including financial, operational and pedagogical. Many of these impacts are unprecedented and have created significant challenges for academic and professional staff, and students alike. But whilst there were clearly significant challenges, COVID-19 provided opportunities for teachers in higher education to become more reflexive in their approach to subject and class design, and provided space for personal growth, professional development and pedagogical innovation. This article addresses the impact that COVID-19 has had on our university teaching and learning experience by using a case study approach to examine how we, as teachers in public relations and communication fields, responded to the sudden shift to online learning. By reflecting on the previous teaching session from February to June (Autumn session) 2020 in Australia, we were able to identify three key areas impacted by the shift from on campus to online teaching: technology, class and content design, and care for both our students and subject teaching teams. We discuss these areas in the “Structure, strategy, and sensibility: Pillars of transformative teaching practice framework” in the following sections. By reflecting on our key approaches to teaching and learning in higher education (student-centred learning, active learning classrooms, and reflexivity in teaching practice), we are able to share insights gained from this experience and suggest recommendations for future online learning.

We are transdisciplinary academics working in the field of Communication at an Australian university. Dr Kate Delmo teaches both undergraduate and post-graduate subjects across public relations, strategic communication, organisational communication and crisis communication. Dr Natalie Krikowa teaches undergraduate subjects in digital and social media that focus on user experience, social marketing, and rapid prototype development. We met weekly during the teaching session to discuss our experiences and reflect on our teaching practices. Our shared teaching philosophy is that effective learning comes from collaboration between teachers and students and that as teaching practitioners we should remain reflexive in order to improve and transform the shared learning experience. This philosophy is supported by our university’s approach to teaching and learning, a flipped learning model, which emphasises student-centred learning (SLA). In this model, teachers act as facilitators and encourage students to take responsibility for their own learning while providing the framework and opportunities to develop their learning skills.

During the COVID-19 pandemic, however, there were many challenges and difficulties in maintaining the student-centred approach. Many aspects of our teaching and learning strategies had to change, but it was imperative that the student-centred learning approach remained. We relied heavily on regular feedback loops with students and teaching team staff to determine what was working and what was not. There was a weekly requirement to problem-solve, and the student experience ultimately drove the reflexive transformation process from Teaching Week 1. As teachers we needed to be agile, adaptive and organic. As a result, changes became instantaneous. Pre-COVID-19, reflexivity was considered going the extra mile. The Early Feedback Survey (conducted in Teaching Weeks 3 to 4 of the session) and the Student Feedback Survey (conducted at the end of the session) were two key occasions when most teachers would reflect on their teaching practice and consider improvements. Often teachers had been teaching the same subjects for years, and changes were therefore minimal as the subjects were typically in good shape and working well for the face-to-face environment. During COVID-19, the informal, anecdotal feedback provided by students to teachers in between the university’s main survey periods was critical to the reflexive process that ultimately led to a collaborative, student-centred learning approach during the pandemic. We received this feedback informally as verbal responses to questions posed in classes, or as personal correspondence through emails and messages. Reflexivity became a survival tool — the pandemic required frequent and urgent responses to solve problems that arose in the areas of technology use, class and content design, and care given to students and our respective teaching staff in the subjects that we handle. In this article, we discuss three key approaches to teaching and learning in higher education that continue to drive our teaching practice.
Here we examine how these approaches were activated/adapted during COVID-19 in the subsequent move to online teaching and learning. 

Background Context
When the World Health Organisation declared COVID-19 a global pandemic on the 12th of March 2020, our university in New South Wales, Australia, alongside other organisations and civic institutions followed the lockdown protocols issued by the government. This date was a few days before the first week of the Autumn teaching session commenced (in mid-February). Three days later, our university management issued a directive for the entire university to pause teaching for one week to shift student and learning activities online to align with the wider COVID-19 protocols issued by the state and federal governments. 

During the paused teaching week (referred to as pause week from here), academic and professional staff worked in concert to recreate learning activities for students through the online learning management systems (LMS) that the university prescribed. Although our university had initiated a move towards embedding online learning with face-to-face, on-campus activities in 2014, it had taken relatively small steps in fully embracing hybrid (i.e. a mix of online and on-campus) teaching modes to foster a strong student-centred learning environment. When the pandemic lockdown period commenced, the entire university was compelled to reconfigure teaching and learning from a mostly on-campus learning model to a fully online approach. 

Due to time constraints, the main purpose of the pause week was for academics to find an approach to substitute for existing on-campus timetabled learning activities. In our faculty, most subjects follow a one-hour lecture and two-hour tutorial mode of delivery. Initial discussions amongst academics centred on how students could access one-hour lectures and complete two-hour tutorial activities online. Academics did not have ample resources to innovate current teaching initiatives towards a hybrid and/or flipped classroom experience that encourages an integrated and embedded approach to content provision and student engagement. Instead, the priority was to devise ways to deliver one-hour lectures and two-hour tutorial sessions online, either synchronously or asynchronously. The intended effect was to follow the set timetabling schedules and for class activities to be delivered online as if they were facilitated on-campus. 

Students were provided specific instructions as to whether lectures were pre-recorded or delivered live via online video conferencing software such as Zoom or Microsoft Teams (MS Teams). Synchronous delivery of tutorial activities consisted of students simultaneously working on assigned tasks uploaded to the prescribed LMS with academic supervision. Asynchronous activities asked students to complete their weekly tasks independently, usually with extended time provisions. Academics also had to identify ways of providing formative feedback on students’ weekly outputs online. 

During the pause week, our university provided institution-wide support for academics to have last-minute changes to subject outlines approved by faculty administration and to quickly learn the appropriate technology for online teaching before classes resumed a few days later. Academics made amendments to assessments and weekly tasks to fit the new parameters set for COVID-19 teaching. University-wide sessions were offered to staff to introduce skills such as recording lectures and uploading them online, embedding low-stakes quizzes in recorded lectures, using wikis on MS Teams for student collaboration, integrating apps such as Padlet or Jamboard to archive responses to weekly tasks, and using online polls as discussion starters in tutorials, among others. The aim of the sessions was for academics to identify which tools were simple, functional and fun to use in their respective classes to encourage student participation. 

The immediate shift in teaching and learning surfaced pedagogical challenges and opportunities that academics are continuously discovering at our university. On the one hand, the pause week illuminated issues such as: a) identifying which technology was appropriate, functional, and available both to staff and students, b) staff members’ literacy in the use of LMS, and c) determining dual formats of learning for our onshore and offshore students. Our university had a large cohort of students who were impacted by the overseas travel bans in March. These students remained overseas for the duration of the Autumn teaching session. This entailed a customised teaching and learning approach in relation to the following issues: bandwidth and interconnectivity concerns, time zone differences, and restricted access to certain websites and social media platforms that were used in weekly activities. For example, Twitter, Instagram and YouTube are key sites used in subjects offered by our faculty. During the pause week, academics had to immediately create alternative learning toolkits or weekly tutorial packages solely to be accessed by overseas students. The learning toolkits consisted of written instructional materials that provided a step-by-step guide for students to follow in navigating the technological requirements on a weekly basis. This was on top of the challenge of simultaneously delivering weekly subject matter to our onshore students.  

Teachers learned and relearned how to maximise the university’s LMS and other online software, which led to an opportunity for us to recognise technology, literacy, adaptability, and reflexivity as integral to effective and efficient teaching and learning during, and perhaps even after, the pandemic. The directive for online teaching under COVID-19 protocols paved the way for academics to overcome some of the reluctance in embracing the possibilities of innovating pedagogies around a purposeful use of an appropriate mix of technology in the classroom. It is significant to gather insights from us academics — the essential frontliners in the education sector — on our lived experiences in teaching under the global lockdown period. In particular, we describe in this paper our key learnings on the role of reflexivity as a transformative teaching and learning practice in creating a student-centred, active learning environment during the initial weeks of teaching during the pandemic. 

Conceptual Framework of Teaching Philosophy 
Our shared teaching philosophy is that learning is a collaboration between teachers and students and that as teaching practitioners we should remain reflexive in order to improve the shared learning experience. This paper discusses three key approaches to teaching and learning in higher education that drive our teaching practice: student-centred learning, active learning classrooms, and reflexivity in teaching practice. It examines how these approaches were activated during COVID-19 and the subsequent move to online teaching and learning. 

A student-centred learning approach (SLA) encourages students to take more responsibility for their learning and is a process that relies heavily on teachers’ professional confidence to surrender traditional teaching responsibilities (McCabe & O’Connor, 2014). SLA is ubiquitous throughout pedagogy literature (see Akerlind, 2008; Gibbs & Coffey, 2004; Kember, 1997; Samuelowicz & Bain, 2001; Trigwell et al., 1994) and appears in many university and higher education strategic documents. Many studies cite Rogers as the origin of student-centred learning, and in particular Rogers and Freiberg’s Freedom to Learn (1994). In this seminal text, the authors criticise the expert-driven, transmission model of university teaching and suggest adapting their “client-centred” approach to counselling to the education arena (Tangney, 2014, p. 266). Research has endorsed the incentives of a collaborative student-centred community (Gilis et al., 2008; Hardie, 2007; Maclellan, 2008), “although it is inherent that deep methodology can be an anathema for some” (McCabe & O’Connor, 2014, p. 354).

As mentioned earlier, our university has undertaken a formal institution-wide learner-focused approach to teaching and learning since 2014. This flipped learning model (as described above) emphasises student-centred learning (SLA), where ownership of learning is shared between the teachers and students. In this model teachers act as facilitators and encourage students to take responsibility for their own learning while providing the framework and opportunities to develop their learning skills. This facilitation role has been discussed in many studies over the past two decades (Blumberg, 2009; McCombs & Miller, 2007; United Nations Educational, Scientific and Cultural Organization [UNESCO], 2002; Weimer, 2002) all of which emphasise the transformative potential for our understanding of teaching and learning practice.

During the COVID-19 pandemic, however, there were many challenges and difficulties in maintaining the student-centred approach which are inherent to adopting SLA in general. These include having limited preparation, competing timetables, resistance from other staff, student reluctance and teachers’ lack of confidence (McCabe & O’Connor, 2014, p. 351). The only preparation time we were afforded in shifting our classes online was our pause week. We had less than six days to completely redesign our subjects for online delivery, select online platforms to deliver our classes and learn them (then teach them to our teaching teams made up of mostly casual/sessional academics). We then had to redesign assessment tasks and weekly content often on a week-to-week basis. Integral to the ability to adopt SLA is a realistic time frame for effective implementation (Felder & Brent, 1996; Lea et al., 2003) and six days is certainly less than ideal.

Pedagogical methods such as student-centred learning are highly context-dependent (Harju & Åkerblom, 2017) and students are not a homogeneous group (Attard et al., 2010). There is no “one-size-fits-all” model. This is even more apt when it comes to students learning online. What is consistent across many contexts is the humanist approach to SLA. Tangney (2014) highlights consistent ideas about SLA environments that emerge from the humanist literature, including:

  • students should have a choice in what they do and how they do it (and subsequent responsibilities of that choice);
  • an underlying faith that students have the potential to make appropriate choices (to them) and maximise their potential; and
  • students are learning in an environment with little power differential, and where unconditional positive regard and attendance to feelings is central, among others.

This humanist approach to student-centred learning is essential in our university’s model of teaching and learning as it foregrounds the student in the learning process and emphasises the role of the teacher in providing the environment in which the students can best learn. During the pandemic, we were encouraged to move away from traditional lecture-style modes of teaching delivery to an active learning model that highlights peer learning and collaboration as key approaches to effective class design.

Prior to COVID-19, most of our face-to-face on-campus classes were conducted in collaborative classrooms where active learning was emphasised. These physical spaces are intended to promote peer interaction, engagement and collaboration. Collaborative or active learning classrooms (ALCs) are designed to facilitate collaborative learning activities, minimise the barrier between teacher and student, and improve teaching practices (Baepler & Walker, 2014; Carpenter, 2013; Metzger, 2015). ALCs can be regarded as rich environments for collaborative, problem-based learning involving dynamic, interdisciplinary and generative learning activities with the goal of achieving higher-order thinking and constructing complex knowledge (Grabinger & Dunlap, 1995). Although active learning pedagogies such as peer learning, team-based learning, cooperative learning, or blended learning (flipped classroom) can certainly apply in traditional classrooms with fixed seating (Deslauriers et al., 2011; Lyon & Lagowski, 2008; Mazur, 2009), better spaces for these pedagogies are ALCs designed specifically for student interaction and engagement (Chiu & Cheng, 2017, pp. 269-270). 

When COVID-19 happened, the question for us was: how do we translate “active learning” to the online classroom? We had less than a week not only to interrogate this question and what it meant pedagogically, but to rapidly devise a new approach to teaching and learning for the online environment that (as best as possible) mirrored the active learning classroom with which both students and teachers were already familiar. 

The above required many teachers to adopt a more reflective and reflexive approach to teaching practice. On the one hand, reflectivity, which is essential to both student and teacher learning, is “the use of personal values, experiences, and habits to make meaning” (Wilhelm, 2013, p. 57). Most teachers will undertake some form of reflection throughout their teaching session to identify areas for improvement. Reflective teachers operate in a mode referred to as “knowledge-in-action”, whereby they reflect upon the specific content knowledge and teaching practices established through their past experience (Brookfield, 1995; Zeichner & Liston, 2013). This reflective practice can be seen before a new teaching session begins, as teachers prepare their subjects for delivery. Some teachers use formal student feedback surveys to determine what worked and what did not, from the student perspective. 

Reflexivity, on the other hand, is an ongoing internal dialogue that leads to action for transformative practices in the classroom (Archer, 2012). Jeffrey D. Wilhelm (2013), a thought-leader in this field, suggests that reflexivity requires that we “suspend […] our own assumptions in order to understand what someone else brings to [our] understanding, learning, and practice, whether this someone else is a historical figure, a student, or a colleague” (p. 57). Taking a more “epistemic reflexivity” approach encourages internal dialogue on personal epistemology to facilitate meaningful and sustainable change in our teaching (Feucht et al., 2017, p. 234). Time and space are required in order to be reflexive. Under the unprecedented COVID-19 conditions, however, it was challenging to maintain a reflexive process due to the scope and immediacy of the changes that academics had to make. 

Discussion of Reflexive Transformation
The sudden shift to online learning due to COVID-19 brought with it many challenges, but also opened up many opportunities to improve the learning experience for students and teachers alike. By reflecting on the Autumn 2020 teaching session, we were able to elucidate three key areas impacted by the shift from on campus to online teaching: technology, class and content design, and care for both our students and subject teaching teams. By reflecting on our key approaches to teaching and learning (student-centred learning, active learning classrooms, and reflexivity in teaching practice), we were able to develop a transformative teaching practice framework from the insights gained from this experience. This framework, referred to as the “Structure, strategy, and sensibility: Pillars of transformative teaching practice” framework (illustrated below), is a model that demonstrates how these philosophies and practices intersect. The model underpins the discussion in the following sections. 

Figure 1: Structure, strategy, and sensibility: Pillars of transformative teaching practice framework (2020)

Structure Pillar: Technology
The initial shift to online learning that occurred in the pause week emphasised subject and assessment redesign and the quick adoption of online platforms including Zoom and MS Teams (in addition to our university’s LMS). Directives coming from university management and administration were centred around what teachers or academics needed to do to ensure their subjects could run in an online mode (e.g., checking assessment tasks were individual tasks where possible and writing new tasks if required). For most teachers this also meant re/familiarising themselves with the technologies. The university promptly provided technology workshops, however these focused on the practical how-to’s and not necessarily on how best to use the platforms for pedagogical purposes. Teachers were given many technological options to explore, but due to time constraints were forced to make quick decisions.

What was missing for most during these crucial paused teaching days was input from the students on how they felt about online learning. Jenkins (2014) argues that students are mostly left to navigate a complex and often confusing array of programmes and services on their own. In Xulu-Gama et al.’s (2018) study, students commented that the main concerns experienced when adapting to the university experience included access to technology (in particular Wi-Fi), confidence in the use of the university online learning management system, and computer literacy skills. For our continuing students, the sudden move to online learning had already raised similar concerns, and yet we were also dealing with a large cohort of commencing students, who now had to orient themselves to online learning on top of university learning more generally.

It was important during the pause week to identify the technological capability of our students. Some students lived in urban environments, with good access to broadband internet, however some students lived in more rural areas with patchy access, and others were joining from overseas. Many of our overseas students were impacted by travel bans or were being quarantined in hotels during the first few weeks of the session, and many struggled to gain access to our technology platforms and participate in our classes. 

In one subject with a high overseas student contingent, we sent out a survey in the pause week to all students in the subject to determine their current technology capabilities and preferences. The survey enquired about their levels of comfort in using particular technologies and platforms (video conferencing, LMS, etc.) as well as their access to a reliable computer, internet connection and video/camera/audio technology. Without these necessary technological elements, students would struggle to participate in online classes. Students were also asked if they had any accessibility requirements that would require specific modification to class materials or delivery, or if they had external circumstances that might impact their ability to study online remotely, such as health conditions, carer responsibilities, or frontline/essential worker considerations. The information obtained through the survey allowed the subject to be tailored to meet the needs of the students undertaking the subject as best as possible. This student-centred approach remained throughout the teaching session.

The main challenge was discovering how to create an active learning environment in the online classroom. Our current LMS was not suited to full online course delivery and lacked appropriate interactive and collaborative functions. As a result, many staff were encouraged to adopt Zoom for live tutorials and lectures and MS Teams for asynchronous class activities. Most staff and students were new to these platforms and lacked the digital literacy required to use them effectively. Many teachers needed to be taught how to use the platforms first, before then utilising them in their teaching. Zoom was relatively easy to adopt: all students needed was a Zoom link, and they could join at the required time. MS Teams, however, was intended to be used as a collaborative working platform that required both staff and students to be active and contribute content. The platform was not necessarily designed for the kinds of activities that teachers were hoping to use it for, but it provided a space for classes to share and collaborate in ways similar to those seen in active learning classrooms. 

MS Teams allowed us to create weekly channels for all that week’s content (including peer learning activities), files and resources and facilitate discussions among small and large groups. The video chat tool allowed the teacher to host a large group meeting, and then have smaller groups go into separate chats with one another to complete the activities before then coming back into a main group meeting for debriefing and discussion. The Wiki widget/tool was used over a three-week period to build understanding of key concepts by having students contribute one concept a week in groups of three. This cooperative learning activity remained an archived resource for the rest of the session that students could refer to when completing their assessments. 

Students were surveyed again at the completion of the subject to better understand their experience of online learning and to hopefully gain insight into further improvements and refinement to be made for the next session which was also going to remain online due to COVID-19. The survey was completed by 54 students and 60% of respondents said that they found the use of MS Teams useful for collaborating with peers. When asked if they would prefer to use MS Teams and Zoom in future, 82% said they would use MS Teams again and 48% said they would use Zoom. 

When maintaining reflexivity in teaching practice, the easiest place to start is often the learning environment itself. Engaging the students in the reflexive process and gaining their insights through regular feedback loops meant that changes to the learning environment could be swifter and often more innovative. Students clearly appreciated being involved in the construction of their learning environment, and by the end of the session they were able to articulate the benefits and shortfalls of particular technology platforms. 

Strategy Pillar: Class and Content Design
Teaching during the COVID-19 pandemic saw the emergence of reflexivity as a critical tool in adapting to the changing classroom experience. The immediate shift to online teaching placed further emphasis on the importance of a reflexive teaching approach anchored in a student-centric learning experience (Tangney, 2014). Although subject descriptions were revised to reflect the online teaching environment, we observed that our teaching strategy and tactics changed frequently based on constant feedback from students and our teaching staff. 

In terms of class content and design, we were encouraged to pre-record our lectures for students to access prior to their tutorials. This proved especially helpful to our students who were still overseas due to the travel bans. One of the early decisions made by individual faculty members during the pause week was to determine which types of tutorial learning activities could best be delivered synchronously, asynchronously, or through a mix of both. Synchronous activities consisted of class activities that students worked on simultaneously with academic supervision and completed during the prescribed tutorial hours. Teachers used Zoom or the video call function of MS Teams as the main online conferencing tools, allowing students to collaborate with each other in small groups. Asynchronous activities included online group work that students completed independently, usually outside of tutorial hours. Canvas and Blackboard were the primary learning management systems used by the University. Both served as key archival and student engagement portals that helped us design and deliver subject content. 

Teaching in a fully online environment resulted in consistent, ongoing reflexivity in terms of re-designing subject content and delivery. As the teaching session progressed, we learned that the decisions we made regarding class and content design during the pause week did not work for the succeeding weeks as initially planned. Prior to the pandemic, adjusting teaching and learning strategy and tactics as the session unfolded occurred regularly. During the pandemic, the need to update strategy and tactics arose more frequently, mostly on a weekly basis. Knowledge-in-action (Brookfield, 1995; Zeichner & Liston, 2013) was unfolding more rapidly and organically. Most of the changes were based either on observation or on informal feedback gathered from our students and teaching staff. 

In one subject, where classes ran Tuesday through Friday, the first Tuesday morning class was used as a “trial” tutorial. Activities or strategies would be tested in that class, and anything that could be tweaked to improve those activities or strategies would be quickly rolled out across the other classes. These “tweaks” were communicated informally, using a shared MS Teams site for the teaching team. Teachers were then encouraged to provide feedback on how the activities and strategies were received in their own classes. This constant feedback loop meant that changes could be made for the following classes and could be tailored to suit the conditions of each class. 

There was less reluctance to be flexible and resilient in developing weekly workshop activities compared to the hesitations we had about changing teaching plans mid-way through the teaching session pre-COVID-19. More importantly, the reflexive approach that emerged was highly motivated by exploring ways to keep our students interested and engaged in their first experience of mandatory remote study. For example, in one of our undergraduate public relations subjects, by Week 4 (two weeks after the pause week), we learned that students felt less pressured working on certain tasks asynchronously because they saved time in “getting group discussions going” (Anonymous, Student Feedback Survey comment, June 2020). Through informal feedback gathered before a Zoom class concluded, some students remarked that working on some tasks asynchronously helped them minimise broadband costs because there was no live streaming content. There were some, conversely, who found asynchronous activities “overwhelming” (Anonymous, Student Feedback Survey comment, June 2020). As one of our overseas postgraduate students explained, “It is hard to be left alone working on the Canvas exercises with no one to ask if you are on the right track or not” (Z. Zhou, personal communication, May 10, 2020).

Guided by these insights, we decided to intersperse a few more asynchronous activities with the initially planned synchronous ones. For the asynchronous tasks, this entailed providing more instructional and contextual detail to make the tasks more structured and coherent. We developed numerous last-minute user guides to help students participate in workshops, such as guides to activating mobile/software apps (e.g., Jamboard, Padlet) applicable to PR campaign brainstorming sessions. Pre-COVID-19, when most teaching was done face-to-face, such details about apps were explained verbally in class, avoiding the need to prepare written instructional documents beforehand. 

As one first-year student explained:
Having a chance to work on some tasks individually and outside of the tutorial times in certain weeks made me focus on the content more. Sometimes, online group activities that need to be finished within class hours can be rushed, people are just typing away without really discussing things. I quite liked it that you [Kate Delmo] still gave us feedback in time for the following week’s tutorial. It made all the work worthwhile! (Anonymous, Student Feedback Survey, June 2020)

The need to simplify weekly activities was another ongoing priority during the initial phase of teaching under the lockdown period. We noticed that student engagement was more focused and structured when students in Zoom breakout rooms were working on fewer activities. Pre-COVID-19, a two-hour tutorial session usually allowed students to work on a cluster of three small-group activities. After the pandemic occurred, we followed the same format, thinking that the platform of delivery would not affect the quality of student engagement. However, by Week 3, students felt rushed in finishing all the tasks. This observation led us to change both the content and the number of assigned activities, moving to one major activity/case study but adding more discussion questions. 

The timing of publishing course materials online via the university’s LMS also changed mid-way into the teaching session. Our postgraduate public relations students who were overseas due to the travel bans offered feedback that they had difficulty accessing the materials online in real time and, most importantly, that some of the URLs of websites we used were restricted from their location. In response, we developed separate learning materials for our onshore and offshore students to ensure that both cohorts were given equal opportunities to learn the content. We searched for alternative URLs that were accessible from China in order to give our students there an opportunity to work on the tasks remotely. Eventually, our university gave us a summary list of websites that overseas students could and could not use. We also made the online modules available to all students at least three days earlier. 

Finally, establishing a sustainable system for providing feedback on students’ weekly online outputs was also part of the overall strategy in designing course content during the early weeks of teaching during the pandemic. We maximised the use of Google Docs, MS Teams worksheets, and Blackboard wikis, among others, so we were able to provide general feedback on students’ group activities. The shared-screen functionality of Zoom and MS Teams proved useful when students presented highlights of their group discussions to the wider class. 

One student remarked: 
It is helpful to see that the tutor [teacher] already wrote comments on some of our answers to the discussion questions. This helped us further explain what we wanted to say to the rest of the class. (Anonymous, Student Feedback Survey, June 2020)

Online teaching during the pandemic made us more aware of the student learning experience. There was more room for flexibility, in both macro and micro strategies, in designing and delivering subject content that was meaningful to students. By continuing to place students at the centre of the learning design process, we also ensured that their perspectives, feelings and circumstances were taken into consideration.

Sensibility Pillar: Care and Empathy to Students and Staff 
The final area within which we focused our reflexive practice was the care of those we were ultimately responsible for — our students and fellow teaching team staff. Being a reflexive teaching practitioner meant seeking out the perspectives of others, including students and fellow teachers. If we did not consider and understand the unique circumstances that our students and teaching team were now experiencing, it did not matter what technologies we utilised or how we designed our classes; it would all be for naught. 

What we found through our weekly virtual face-to-face classes was that students ultimately wanted someone to care about them and empathise with what they were going through. Many of our undergraduate students were losing their jobs and having to move home. We saw significantly higher levels of referrals to our university’s counselling and accessibility services for stress, anxiety and depression. As the Black Dog Institute (an Australian mental health charity) notes, those who are unemployed or in a casualised workforce are at increased risk of mental health deterioration during times of economic instability such as pandemics. It states that “high job insecurity is associated with stress, financial strain, poorer health and increased rates of depression and anxiety” (Black Dog Institute, 2020, p. 2). It was no surprise to those of us teaching on the frontlines of this pandemic that our students were suffering. 

For university students, intensified levels of psychological distress and subsequent negative academic consequences were widespread pre-COVID (American College Health Association, 2019, as cited in Grubic et al., 2020). It was clear that these mental health concerns were exacerbated by COVID-19 and were unsurprisingly having a detrimental impact on students’ ability to complete their educational responsibilities. In a survey by YoungMinds (a UK-based youth mental health charity), 83% of young respondents reported that the COVID-19 pandemic had exacerbated pre-existing mental health conditions, with 32% saying it made their mental health much worse (YoungMinds, 2020, p. 3). 

As Grubic, Badovinac & Johri (2020) point out:
By increasing academic stressors in a population with heightened pre-existing stress levels and a potentially reduced ability to rely on typical coping strategies – such as family who themselves may be experiencing heightened distress – the COVID-19 pandemic has placed an unprecedented mental health burden on students, which urgently requires further examination and immediate intervention. (p. 517)

For the subjects we taught, we held weekly catch-ups with our students as a means to check how they were coping with the challenges of life in lockdown. We also reminded our individual teaching team staff (tutors) that it did not matter if the weekly activities did not go as planned. What was important was for us to be patient and understanding with our students, so they felt their concerns were heard and addressed. Oftentimes this meant starting Zoom classes with a “check-in” where students were invited to share their worries or, conversely, their small victories. We invited our pets to class for show and tell and discussed our favourite TikToks of the week. For the first 15–20 minutes, human connection was prioritised. Then, once the students felt grounded and secure, we could begin exploring the content and activities. 

As greater emphasis was placed on listening to students’ situations, it also became apparent that clearer guidelines needed to be put in place to provide structure to those communications. Teaching staff were seeing a huge influx of emails and MS Teams messages requesting more one-on-one assistance or raising students’ personal issues. The immediacy of digital communication meant that many students assumed their teacher would respond immediately to their query. Mid-way through the session, many teachers had to re-establish boundaries and set clear expectations on the extent of support given to students between classes via emails and messages. This required more clearly defined consultation hours centred around staff availability. This became even more important when communicating with overseas students, who required greater support but were in different time zones from ours. Students were slow to respond to these expectations and frequently required reminding. They were, however, grateful to receive the added support and care. 

Overall, students were kind in their formal and informal feedback, acknowledging the extra work required of their teachers in shifting their classes online. On the whole, students understood that it was a difficult situation for everyone and appreciated the efforts by the university to keep their classes running while keeping them and their communities safe. Some students even emailed personal notes of thanks to their teachers in recognition of their work and care.

One third-year student remarked:
You definitely put the students first in every way and I really appreciate that – couldn’t have asked for a better tutor for this subject. It’s been such a difficult time for everyone, with COVID-19 taking such a hit on Universities, and I commend the seamlessness of the move to online learning in DPA. Hats off to you and the rest of the team for putting so much time and energy into adjusting the course so flawlessly. (M. Sacks, personal communication, June 12, 2020)

Another student remarked:
In short, I am impressed with the transition online and how classes are run in these unusual times. You are an outstanding educator, with clear direction, expectations and assistance that goes way over the extra mile. Your teaching style is thoroughly enjoyable from your positive attitude and clear care for us students. Moreover I would like to commend you, and the team. (M. Billingham-Yuen, personal communication, May 28, 2020)  

Student care was a key focus in the reflexive process, but equally important was concern and care for the staff in our teaching teams. Reflexivity meant touching base consistently and openly with the teaching team. While pre-COVID teaching conditions required less focus on staff wellbeing, there was still an emphasis on collaboration for consistency in teaching delivery. The shift online and the consequent adoption of new technologies required subject coordinators to provide education and support to their teaching teams to ensure digital literacy across these technologies and platforms. During the pause week, there was also added work on the part of the coordinator to provide last-minute, crash-course training on technology use for our part-time casual academics. For some coordinators this meant ongoing, closer mentoring of casual academic teaching staff to improve their confidence and competency in running online classes. For those staff, this extra training, on top of their own personal COVID-19 situations, also increased their stress and anxiety levels. Many were now also having to do more training and preparation for online classes, all of which was extra unpaid work. During COVID-19, teaching staff were provided with detailed weekly tutorial guides outlining objectives, teaching tactics and desired learning outcomes, but due to the agile approach to class design improvement, these were often given only days before classes were scheduled. 

Previously, staff feedback on their overall experience of teaching in the subject was gathered sporadically during the session and at its end. During COVID-19, regular weekly Zoom meetings were used to provide necessary briefings and roadmaps about what lay ahead and to gather feedback on staff teaching experiences. These meetings were also valuable opportunities to check in on the teaching team’s mental health and wellbeing. It was important to check in on how people were feeling on a regular basis and to ensure we communicated about our own wellbeing. 

One teaching staff member who teaches in one of our public relations major subjects said:
With all these abrupt changes, thank goodness for these weekly briefing sessions prior to class time. As industry practitioners who are part-time teaching, we have been trained mostly to share content and experience with students. But these changes in online teaching is something else, it is a crash course to teaching methods for me. Thank you for not getting tired in guiding us in this journey. (E. Barclay, personal communication, June 10, 2020)

Discussing our shared and unique experiences helped build stronger collegial relationships and human connection. It was crucial that we maintained human connections, as connection is one of the most protective factors contributing to emotional wellbeing. We emphasised the importance of everyone in the team taking the time to look after their own wellbeing and reaching out if they needed support. We found ourselves more in touch with our part-time colleagues during COVID-19 teaching because we knew that being expected to comply fully and immediately with the university’s directives on online teaching was challenging given their part-time employment status. 

What COVID-19 brought home was the importance of establishing and maintaining positive relationships with our students and fellow teaching staff. It forced us to be more empathetic and responsive to others’ needs. It encouraged us to listen, rather than speak, and to provide safe spaces for our students and staff to share their concerns and worries. While it may have placed a heavier burden on those coordinating the subjects, the efforts were not in vain. Our classrooms became transformative spaces and ultimately opportunities opened up for personal growth, professional development and learning and teaching innovation.

Conclusion, Limitations and Future Research 

In conclusion, we surmise that in transformative teaching practice during COVID-19, the three pillars of structure (technology), strategy (class and content design) and sensibility (student and staff pastoral care) intersect to create active-learning classrooms (Archer, 2012), a student-centred learning experience (Gilis et al., 2008; Hardie, 2007; Maclellan, 2008), and reflexivity in teaching practice (Wilhelm, 2013). Despite issues in technological literacy, bandwidth and interconnectivity, and the overall pedagogical changes brought by an immediate switch to a fully online teaching platform, we found that the pace and rhythm of teaching and learning during the initial phase of the COVID-19 lockdown were highly guided by feedback gathered from the frontlines – our daily and/or weekly engagement with our students and our teaching staff.

The importance of a reflective and reflexive approach to teaching became more instrumental compared to how these approaches guided our teaching pedagogy prior to the pandemic. In hindsight, we were not certain whether the changes we introduced at the beginning of the Autumn 2020 teaching session would work for us, our students, and our teaching teams. The humanist approach to teaching (Tangney, 2014) ultimately emerged as the lynchpin of our teaching and learning practice. Every week during the Autumn session, we found ourselves asking two simple but highly critical questions. First, what worked and did not work last week? Then, based on the answers to the first question, what adjustments do we have to make for our students to learn the next topic in a structured and engaging way next week?

This type of reflective thought, purely guided by principles of student-centred learning and unfolding on a weekly basis, was not as prominent in our teaching and learning methods prior to COVID-19. It is aligned with how scholars in teaching and learning pedagogy describe knowledge-in-action (Brookfield, 1995; Zeichner & Liston, 2013). However, during COVID-19, we were not only using student feedback to change our teaching for the following teaching session; we were making changes for the following week, every week. In addition, the process was reflexive for us: we were constantly self-checking our classroom methods. This is similar to Archer’s (2012) explanation that a reflexive person engages in ongoing, internal dialogue that leads to action. It also embodied epistemic reflexivity among teachers (Feucht et al., 2017) in a high-pressure, unprecedented situation (COVID-19) that resulted in meaningful changes in our teaching. 

This case study looked at one university within a specific regional and environmental context. As such, its findings are limited to those universities within similar contexts. We understand that faculty in different regions and countries will have had different experiences depending on a number of factors. Future research will broaden the reflexive transformative approach to student-centred learning by examining it in other university contexts (both nationally and globally) and outside of COVID-19-like environmental conditions. What would be of interest is how this “Pillars of transformative teaching practice framework” could be applied in other public relations and broader communication subjects, programs and degrees. Future applications of this framework could provide valuable insights into how it can be adopted effectively in other higher education settings. Similarly, identifying and comparing other COVID-19 responses from other disciplines and universities could further expand our understanding of how students were impacted by the pandemic and the subsequent remodelled approach to teaching and learning. 

Within the context of our university’s response, the immediate shift to online subject delivery required a change in teaching and learning outlook for both students and staff. We learned that these changes were not simply a matter of “putting things online,” as we initially did during the pause week in the first days of COVID-19 teaching. There are pedagogical aspects to consider at a macro level, such as corresponding changes in staff and student expectations, overall learning pace in the online space, the extent and depth of engagement for both staff and students, managing feedback, assessing student progress online, staff availability in addressing student concerns, and drawing the line in managing communication channels with students, among others. It helps if students understand these realities so they can equally manage their learning expectations. 

Teaching during the initial phase of the pandemic brought key learnings that will introduce more changes to our active-learning classrooms. To date, we are gradually learning that fully online delivery of classes is not, and should not be viewed as, a direct substitute for face-to-face, on-campus classes. The pace of, and expectations in, learning are different on the two platforms. Beyond COVID-19, we envision that a hybrid teaching approach combining online and on-campus learning experiences will increasingly become a core pedagogical model. A hybrid model introduces innovation, but it should be anchored in the principle of co-creation between students and staff in universities. The teaching and learning ecosystem in higher education will continue to change in light of teachers’ lived experiences during the global lockdown period. 

Postscript

It is important, as COVID-19 continues to impact our lives, workplaces and educational experiences, that teachers maintain a reflexive, transformative approach to student learning. In Australia, city and state-wide lockdowns have once again moved learning online in 2021, and with an uncertain future, online and hybrid learning will remain to some degree. Both teachers and students are feeling the effects of online fatigue, and many students are expressing emotional and mental distress. As a result, teachers are reporting that student welfare is their number one priority in their approach to teaching in 2021. Whilst many of the approaches developed during the first response to COVID-19 teaching and learning in 2020 can be, and have been, adopted again, sustaining a reflexive approach to learning means that teachers can respond to new challenges quickly and remain agile. Adopting a transformative teaching framework enables teachers to reflect on the structures of their teaching and learning (technologies, tools and platforms used), devise and revise strategies around pedagogy, class design and content delivery, and embrace a student-centred learning approach where empathy, care and humanity are at the core of teaching practice in these uncertain times.

References

Akerlind, G. (2008). A phenomenographic approach to developing academics’ understanding of the nature of teaching and learning. Teaching in Higher Education, 13(6), 633–644. https://doi.org/10.1080/13562510802452350 

Archer, M. (2012). The reflexive imperative in late modernity. Cambridge University Press.

Attard, A., Di Lorio, E., Geven, K., & Santa, R. (2010). Student-centred learning: Toolkit for students, staff and higher education institutions. European Students Union. https://www.esu-online.org/wp-content/uploads/2017/10/SCL_toolkit_ESU_EI.compressed.pdf

Baepler, P., & Walker, J. D. (2014). Active learning classrooms and educational alliances: Changing relationships to improve learning. New Directions for Teaching and Learning, 2014(137), 27–40. https://doi.org/10.1002/tl.20083

Black Dog Institute (2020). Mental health ramifications of COVID-19: The Australian context. https://www.blackdoginstitute.org.au/wp-content/uploads/2020/04/20200319_covid19-evidence-and-reccomendations.pdf 

Blumberg, P. (2009). Developing learner-centred teaching, a practical guide for faculty. Jossey-Bass.

Brookfield, S. (1995). Becoming a critically reflective teacher. Jossey-Bass.

Carpenter, R. G. (Ed.). (2013). Cases on higher education spaces: Innovation, collaboration, and technology. IGI Global. https://doi.org/10.4018/978-1-4666-2673-7

Chiu, P. H. P., & Cheng, S. H. (2017). Effects of active learning classrooms on student learning: A two-year empirical investigation on student perceptions and academic performance. Higher Education Research & Development, 36(2), 269–279. https://doi.org/10.1080/07294360.2016.1196475

Deslauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332(6031), 862–864. https://doi.org/10.1126/science.1201783

Felder, R. M., & Brent, R. (1996). Navigating the bumpy road to student-centred instruction. College Teaching, 44(2), 43–47. https://doi.org/10.1080/87567555.1996.9933425      

Feucht, F. C., Lunn Brownlee, J., & Schraw, G. (2017). Moving beyond reflection: Reflexivity and epistemic cognition in teaching and teacher education. Educational Psychologist, 52(4), 234–241. https://doi.org/10.1080/00461520.2017.1350180     

Gibbs, G., & Coffey, M. (2004). The impact of training of university teachers on their teaching skills, their approach to teaching and the approach to learning of their students. Active Learning in Higher Education, 5(1), 87–100. https://doi.org/10.1177/1469787404040463      

Gilis, A., Clement, M., Laga, L., & Pauwels, P. (2008). Establishing a competence profile for the role of student-centred teachers in higher education in Belgium. Research in Higher Education, 49(6), 531–554. https://doi.org/10.1007/s11162-008-9086-7

Grabinger, R. S., & Dunlap, J. (1995). Rich environments for active learning: A definition. Research in Learning Technology, 3(2). https://doi.org/10.3402/rlt.v3i2.9606      

Grubic, N., Badovinac, S., & Johri, A. M. (2020). Student mental health in the midst of the COVID-19 pandemic: A call for further research and immediate solutions. International Journal of Social Psychiatry, 66(5), 517–518. https://doi.org/10.1177/0020764020925108

Hardie, K. (2007). On trial: Teaching without talking-teacher as silent witness. Art, Design and Communication in Higher Education, 5(3), 213–226. https://doi.org/10.1386/adch.5.3.213_7

Harju, A., & Åkerblom, A. (2017). Colliding collaboration in student centred learning in higher education. Studies in Higher Education, 42(8), 1532–1544. https://doi.org/10.1080/03075079.2015.1113954

Jenkins, D. (2014). Redesigning community colleges for student success. Overview of the Guided Pathways Approach. Teachers’ College Columbia, Community College Research Centre, Institute on Education and the Economy. https://www.templejc.edu/live/files/37-redesigning-community-colleges-for-student-success 

Kember, D. (1997). A reconceptualisation of the research into university academics’ conceptions of teaching. Learning and Instruction, 7(3), 255–275. https://doi.org/10.1016/S0959-4752(96)00028-X      

Lea, S. J., Stephenson, D., & Troy, J. (2003). Higher education students’ attitudes to student centred learning: Beyond ‘educational bulimia’. Studies in Higher Education, 28(3), 321–334. https://doi.org/10.1080/03075070309293

Lyon, D. C., & Lagowski, J. J. (2008). Effectiveness of facilitating small-group learning in large lecture classes. A general chemistry case study. Journal of Chemical Education, 85(11), 1571–1576. https://doi.org/10.1021/ed085p1571

Maclellan, E. (2008). The significance of motivation in student-centred learning: A reflective case study. Teaching in Higher Education, 13(4), 411–421. https://doi.org/10.1080/13562510802169681

Mazur, E. (2009). Education. Farewell, lecture? Science (New York, N.Y.), 323(5910), 50–51. https://doi.org/10.1126/science.1168927

McCabe, A. & O’Connor, U. (2014). Student-centred learning: the role and responsibility of the lecturer. Teaching in Higher Education, 19(4), 350–359. https://doi.org/10.1080/13562517.2013.860111

McCombs, B. L., & Miller, L. (2007). Learner-centred classroom practices and assessments, maximizing student motivation, learning and achievement. Corwin Press.

Metzger, K. J. (2015). Collaborative teaching practices in undergraduate active learning classrooms: A report of faculty team teaching models and student reflections from two biology courses. Bioscene, 41(1), 3–9.

Rogers, C., & Freiberg, J. (1994). Freedom to learn (3rd ed.). Macmillan College.

Samuelowicz, K., & Bain, J. (2001). Revisiting academics’ beliefs about teaching and learning. Higher Education, 41(3), 299–325. https://doi.org/10.1023/A:1004130031247

Tangney, S. (2014). Student-centred learning: A humanist perspective. Teaching in Higher Education, 19(3), 266–275. https://doi.org/10.1080/13562517.2013.860099

Trigwell, K., Prosser, M., & Taylor, P. (1994). Qualitative differences in approach to teaching first year university science. Higher Education, 27, 75–84. https://doi.org/10.1007/BF01383761

United Nations Educational, Scientific and Cultural Organization. (2002). Information and communication technologies in teacher education: A planning guide. UNESCO Division of Higher Education. http://unesdoc.unesco.org/images/0012/001295/129533e.pdf 

Weimer, M. (2002). Learner-centred teaching. Jossey-Bass.

Wilhelm, J. D. (2013). Opening to possibility: Reflectivity and reflexivity in our teaching. Voices from the Middle: The National Council of Teachers in English 20(3), 57–59.

Xulu-Gama, N., Nhari, S. R., Alcock, A. & Cavanagh, M. (2018). A student-centred approach: A qualitative exploration of how students experience access and success in a South African University of Technology. Higher Education Research & Development, 37(6), 1302–1314. https://doi.org/10.1080/07294360.2018.1473844 

YoungMinds. (2020). Coronavirus: Impact on young people with mental health needs. https://youngminds.org.uk/media/3708/coronavirus-report_march2020.pdf

Zeichner, K. M., & Liston, D. P. (2013). Reflective Teaching: An Introduction. Routledge.

© Copyright 2021 AEJMC Public Relations Division

To cite this article: Delmo, K. & Krikowa, N. (2021). Reflexive transformative approach to student-centred learning: Insights from the frontlines of Australian higher education teaching during COVID-19. Journal of Public Relations Education, 7(3), 68-99. https://aejmc.us/jpre/?p=2733

How to CARE for PRSSA Faculty Advisers: The Impact of Competence, Autonomy, Relatedness, and Equity on Role Satisfaction

Editorial Record: Original draft submitted June 2, 2020. Revisions submitted September 11, 2020. Accepted October 31, 2020. First published online December 2021.

Authors


Amanda J. Weed, Ph.D.
Assistant Professor, Digital and Emerging Media 
School of Communication & Media
Kennesaw State University
Kennesaw, GA
Email: aweed2@kennesaw.edu


Adrienne A. Wallace, Ph.D.
Associate Professor
School of Communications
Grand Valley State University
Allendale, MI,
Email: wallacad@gvsu.edu

Betsy Emmons, Ph.D.
Associate Professor
Howard College of Arts and Sciences
Samford University
Birmingham, AL
Email: ememmons@samford.edu

Kate Keib, Ph.D.
Assistant Professor
Communication Studies
Oglethorpe University
Brookhaven, GA
Email: kkeib@oglethorpe.edu

Abstract

PRSSA faculty advisers play a critical role in public relations education by facilitating experiential learning and professional networking that connect classroom learning with the practical application of knowledge, skills, and understanding of the public relations industry. Yet, many faculty advisers feel overworked, misunderstood, and under-appreciated in their role. A two-wave survey of current PRSSA faculty advisers examined the shared challenges that impact personal and professional satisfaction through the lens of Self-Determination Theory. Organizational recommendations provide new directions for national PRSSA programs that promote CARE for faculty advisers in the areas of competence, autonomy, relatedness, and equity.

Keywords: faculty adviser, student organization, tenure, promotion, pedagogy, equity, self-determination theory, Public Relations Student Society of America, PRSSA

Introduction

Undergraduate public relations students benefit from direct professional networking and industry introduction. One way to provide this industry exposure is via pre-professional societies such as the Public Relations Student Society of America (PRSSA). PRSSA supplements the traditional public relations curriculum by providing student members with enhanced learning and networking opportunities. Faculty advisers of PRSSA assume an advanced teaching and mentoring role in this organization by connecting students with unique experiences that link classroom learning to practical application of knowledge and skills in the public relations industry. 

As the Commission on Public Relations Education’s 2018 report on public relations education noted, pre-professional organizations “prepare students for their careers by providing an introduction to and understanding of the profession, as well as offering experiential learning and networking with other practitioners” (p. 133). Membership in university pre-professional organizations has been studied as a critical link between classroom instruction and entry into the profession (Pohl & Butler, 1994), and department and faculty support of those organizations is directly related to the beneficial outcomes for students (Nadler, 1997).

Because faculty advising duties can vary among organizations and campuses, a university-level disconnect might emerge between the service expectations of PRSSA advisers and those of other student organizations, such as a department honor society. Administrators often lump all student organization service efforts into similar labor expectations (Nadler, 1997). However, advising PRSSA often entails a heavier service load than other organizations, an issue of which administrators and tenure committees are often unaware (Waymer, 2014). Faculty must sometimes choose between the time-consuming effort of sustaining a PRSSA chapter and teaching or research activities that hold greater weight in the tenure-and-promotion process. While some PRSSA faculty advisers receive strong support from university administration, others face a hard choice between chapter success and career success. This research addresses the lived experiences of the PRSSA faculty adviser, investigates the gap in knowledge surrounding advising perspectives, and seeks to draw awareness to the key issues that impact the personal and professional satisfaction of PRSSA faculty advisers.

Literature Review

PRSSA and Benefits of Pre-Professional Association Membership
Started as an affiliate organization of the Public Relations Society of America (PRSA) in 1967, PRSSA now has 370 chapters at universities of all sizes internationally. PRSSA exists to support students studying public relations and communication and reports a membership of more than 10,000 students and advisers throughout the United States and its territories, as well as in Argentina, Colombia, and Peru (PRSSA, n.d.-c). More than 375 faculty advisers, including co-advisers, now serve university PRSSA chapters.

The PRSSA national chapter handbook (PRSSA, n.d.-c) states that a faculty adviser must be “a full-time teacher of at least one of the public relations courses offered” (p. 12). The specific duties of a typical PRSSA faculty adviser are explained in the national chapter handbook in 11 articulated areas, which include mentorship, liaison duties to various constituencies, and communication duties (PRSSA, n.d.-c). However, specific day-to-day duties, such as writing PRSSA student scholarship recommendation letters, chapter communication, and clerical duties, are not articulated in the handbook.

PRSSA chapters organize activities on- and off-campus to satisfy the national chapter requirements and serve the interests of members (PRSSA, 2017). Many chapters focus on networking activities, experiential learning, and participation in PRSSA-sponsored awards programs (Andrews, 2007). Students may also attend PRSA professional meetings and attend regional PRSA conferences. Nationwide competitions, such as the Bateman Case Study Competition, are sponsored by the PRSSA national organization. PRSSA members benefit from professional networking, educational opportunities, resume building, and monetary awards from scholarships.

The PRSSA national office sponsors several types of chapter activities including community service, PRSA outreach, diversity and inclusion initiatives, national/regional event conferences, student-run firms, as well as scholarship and award competitions (PRSSA, n.d.-d). Participating in those activities can qualify chapters for awards such as PRSSA Star Chapter or the Dr. F. H. Teahan Chapter Awards Program. The PRSA Foundation offers educational and conference scholarships to members (PRSA Foundation, n.d.). 

Previous PRSSA research has studied how satisfied students are with their PRSSA membership (Andrews, 2007), what students gain from membership (Pohl & Butler, 1994) and how PRSSA prepares students for careers in PR (Andrews, 2007; Sparks & Conwell, 1998). In a survey of students enrolled in PRSSA chapters in Ohio, Andrews (2007) found that PRSSA member students reported joining the organization to: 1) network, 2) build their resume, 3) learn career-related skills, and 4) gain hands-on experience. 

Defining Faculty and Faculty Service 
PRSSA requires faculty advisers to be full-time faculty members. The definition of a full-time faculty member varies, however, based on the type of contract under which a faculty member is hired. Tenure-track faculty often hold a Ph.D. and are expected to pursue an active research agenda. Professors-of-practice and non-tenure lecturers are often hired to capitalize on the industry knowledge that public relations executives bring to the classroom and to give executives an avenue to transition to higher education. Prior research has identified public relations executive knowledge as a great benefit to students (Todd, 2009), whether those faculty hold tenured or non-tenured positions.

Most full-time faculty must complete university service in addition to teaching and/or research. Carnegie-classified R1 universities generally place a strong emphasis on producing research and grant funding for tenured and tenure-track faculty, and service expectations are less robust than at more teaching-centric universities. As Boyer (1991) asserted, tenure-track faculty must often limit student-centric pursuits to meet research needs. Each university defines its own tenure guidelines, but research production often takes priority over service for tenure-track faculty at most universities. Non-tenured faculty may not have research requirements; instead, their workload is often supplemented with increased teaching and/or service expectations.

Fostering Role Satisfaction through Self-Determination Theory
Self-determination theory (SDT) explores the psychological motivations of organization members to work toward common goals. SDT has been applied in the context of student participation in university organizations (Filak & Sheldon, 2003) and faculty advisers’ perceived performance in their role (Filak & Pritchard, 2007). At the core of SDT is the human desire to satisfy three psychological needs—competence, autonomy, and relatedness—to feel valued as a group member and commit individual efforts to group outcomes (Ryan et al., 1996). Competence represents the need to feel capable of effectively navigating the environment and making successful steps toward improvement (Filak & Pritchard, 2007). In the context of PRSSA advising, competence might relate to issues of sufficient training, constructive feedback from peers, and positive support from department administration. Autonomy represents the need to function under personal power without the influence of external control (Deci & Ryan, 2013). PRSSA faculty advisers can perceive autonomy in a two-fold manner through the sense that a) they came to their role out of personal desire, and b) they have independence to advise the organization without unreasonable oversight. Relatedness represents the need to feel connection with others who hold importance to the organization or task at hand (Ryan et al., 1995). PRSSA faculty advisers are likely to feel relatedness to three distinct groups: a) members of the PRSSA chapter, b) peer faculty members, and c) department administration.

In addition to identifying need satisfaction, SDT also categorizes types of motivation along a spectrum from extrinsic to intrinsic. As the least self-determined motivation, extrinsic motivations satisfy needs through external sources and are often not aligned with the individual’s own interests. Introjected motivation occurs when the individual accepts extrinsic motivation due to emotional influence exerted by an external source. Those emotional influences might come into play through the application of guilt (“we need you”), loyalty (“be a team player”), or status tactics (“pay your dues”). Introjected motivations do not necessarily increase commitment to tasks, but are effective through appealing to an individual’s perception of relatedness with those who are in power positions. Identified motivation occurs when one values the outcomes of their actions but gains little enjoyment or fulfillment from the activity. For some PRSSA faculty advisers, identified motivation might come from the sense of engaging in an activity that is assessed for employment review but holds little personal interest. At the opposite end of the motivation spectrum is intrinsic motivation, in which the individual finds internal enjoyment and fulfillment in the activities (Deci et al., 1989; Filak & Pritchard, 2007).

This study explores the following questions about PRSSA faculty advising:

RQ1: What are the common qualities of faculty who assume the role of PRSSA adviser?

RQ2: What is the common level of knowledge about the roles and responsibilities related to PRSSA faculty advising?

RQ3: What are the most significant challenges for PRSSA faculty advisers?

RQ4: What factors have the greatest impact on PRSSA faculty advisers’ role satisfaction?

Method
This study used a two-phase online questionnaire of current PRSSA faculty advisers. Data were collected for phase one of the study in November 2019, and phase two data were collected in January and February 2020. Questionnaires were developed using Qualtrics software and distributed via individual emails to PRSSA faculty advisers. Survey procedures were approved by the respective institutional review boards of the authors.

Study Population 
An initial request was placed through the PRSSA national office for a list of current PRSSA faculty advisers, and the request was denied. Moving forward, the authors identified PRSSA faculty advisers through the national chapter directory, available through the PRSSA national website, to develop an internal contact database of faculty advisers. When faculty adviser information was not available in the PRSSA chapter directory, the authors searched university websites to identify the current PRSSA faculty adviser. In total, 381 PRSSA faculty advisers, including co-advisers, were identified at 370 U.S. university chapters. Participants were recruited for the phase one questionnaire through three unique tactics. First, a questionnaire information card with a QR code was given to advisers at the 2019 PRSSA National Conference. Second, three rounds of email invitations were sent to PRSSA faculty advisers over two months. Finally, questionnaire invitations were posted on private digital/social media groups such as the PRSSA Advisers Google group, PRSA Educators Academy social media channels, and Facebook groups for the Social Media Professors Community and Student-Run Agency Advisers. A qualifying question at the beginning of the questionnaire asked participants whether they were a current faculty adviser of their university PRSSA chapter. In total, 153 advisers completed the questionnaire for a response rate of 40.2%.

At the end of the phase one questionnaire, participants could opt-in to the phase two questionnaire through a separate sign-up link. Additional invitations were distributed to current PRSSA faculty advisers who: a) won the PRSSA Faculty Adviser of the Year award in the past decade, b) were members of the Commission on Public Relations Education, or c) were a Champion for PRSSA, a subgroup of PRSA “that brings together those who have special, ongoing interest in PRSSA, its student members and public relations education” (PRSA, n.d., para 1). In total, 44 invitations were distributed for the second-phase questionnaire, and 19 advisers completed the qualitative questions, for a response rate of 43.2%.

Phase One Questionnaire Design 
The first phase questionnaire included 70 items that measured five categories of information: a) general chapter information, b) faculty adviser information, c) PRSSA mission and requirements, d) faculty adviser insights, and e) personal and university demographic information. No identifying information was collected, though respondents were able to opt-in for a $40 Amazon gift card drawing through a separate link.

General Chapter Information
This section included 12 questions to collect PRSSA chapter data about: a) chapter size, b) chapter practices including the frequency of chapter meetings, executive board meetings, fundraisers, and attending PRSA sponsored chapter events, and c) chapter participation in PRSSA-affiliated competitions, national awards programs, scholarships, and grants.

Faculty Adviser Information
Sixteen questions covered topics such as a) the appointment process for PRSSA faculty advisers and the length of their term, b) faculty status and expected workload in teaching, research, and service, c) time commitment to PRSSA faculty advising duties, and d) compensation for faculty advising.

PRSSA Mission and Requirements
Participants were shown excerpts of the PRSSA 2019-2020 Chapter Handbook (PRSSA, n.d.-c) that included Mission Statement (p. 5), Minimum Chapter Standards (p. 9), and Faculty Adviser Responsibilities (p. 12). Participants answered 12 Likert-scale questions to indicate their level of agreement with statements related to their personal understanding of the above areas as well as their perceptions of how well PRSSA chapter members, department colleagues, and administrators understood those guidelines.

Faculty Adviser Insights
Participants answered six Likert-scale questions that assessed their level of agreement with statements related to a) personal satisfaction as a PRSSA faculty adviser, b) confidence in balancing PRSSA faculty advising with teaching, research, service and personal life, and c) their belief about whether first-year faculty should advise PRSSA.

Personal and University Demographic Information 
One personal demographic question related to gender was included to further examine Waymer’s (2014) findings of gender-based differences in PRSSA faculty advising. University demographic information included a) university location based on PRSA district chapter maps, b) university size, c) Carnegie classification, and d) program certification through the Accrediting Council on Education in Journalism and Mass Communication or PRSA Certification in Public Relations Education.

Phase Two Questionnaire Design
The phase two questionnaire included 13 open-ended questions to gain additional qualitative insights about PRSSA faculty advising. Two rounds of email invitations were sent over one month. Participants answered questions about various aspects of PRSSA faculty advising including: a) how PRSSA national organization expectations align with university expectations, b) how PRSSA faculty advisers’ workload compared to other service duties (including advising other student organizations), c) what parts of PRSSA faculty advising administrators do not understand or recognize, d) how support services from the PRSSA national office help with PRSSA faculty advising, and e) what a faculty member should be aware of regarding PRSSA advising before accepting the role.

Results

Who is the PRSSA Faculty Adviser? 
The vast majority of PRSSA faculty advisers are female at 69.9% (n = 107), followed by males at 29.8% (n = 44) and one respondent who declined to identify gender. PRSSA faculty advising duties primarily fall to full-time lecturers at 39.3% (n = 57) and tenure-track assistant professors at 29.0% (n = 42). Associate professors accounted for 19.3% (n = 28) of respondents, followed by full professors at 11.0% (n = 16), and one respondent who was a part-time lecturer. 

Most respondents advised small- to medium-size PRSSA chapters with 37.5% (n = 57) advising chapters with 10-19 dues-paid members and 27.6% (n = 42) for chapters with 20-49 members. Only 18.4% (n = 28) advised chapters of more than 50 members. Advisers of chapters with fewer than 10 members accounted for 16.4% (n = 25) of respondents. An information request was made with the PRSA national office to provide the breakdown of all PRSSA chapters by membership size for 2020 to provide comparison data. The request was denied because “The membership numbers for both, PRSA and PRSSA change daily – especially PRSSA given its dues deadline ends is December 1st which will change the numbers dramatically. Prefer the member numbers do not get published given they change so frequently” (J. Starr, personal communication, November 19, 2021).

When examining how PRSSA faculty advisers come into their role, the majority (53.7%) of respondents reported that it was part of their job duties: 34.7% (n = 51) were appointed by a supervisor, and 19.0% (n = 28) indicated advising was part of their official job description. Among the remaining responses, 27.9% (n = 41) volunteered for the role, 8.8% (n = 13) were elected by the PRSSA chapter, and 9.5% (n = 14) assumed the role by an “other” means such as founding the chapter (n = 5) or being the only faculty member available (n = 5).

When asked about the term length as PRSSA adviser, 72.1% (n = 106) of respondents indicated that no timeline was determined. Remaining respondents indicated defined term limits including 1 year at 2.7% (n = 4), 1 year with renewal at 8.2% (n = 12), two to three years at 6.1% (n = 9), four to five years at 2.7 % (n = 4), and five years or more at 8.2% (n = 12). 

In terms of teaching load, 38.5% (n = 55) of respondents teach three classes per semester, followed closely by four classes at 37.8% (n = 54). The teaching loads of the remaining respondents were two classes per semester at 13.3% (n = 19), five classes or more at 8.4% (n = 12), and one class at 2.1% (n = 3). 

What is the Common Level of Knowledge About the Roles and Responsibilities Related to PRSSA Faculty Advising?
Respondents were asked their level of agreement, from 1 = strongly disagree to 5 = strongly agree, with a statement that they understood the purpose of PRSSA and their perceptions that chapter members, colleagues, and administration understood the purpose of PRSSA. Faculty advisers agreed that they understood the purpose of PRSSA (M = 4.42, SD = .84), though they indicated less agreement that PRSSA chapter members (M = 3.83, SD = .948), colleagues (M = 3.12, SD = 1.11), and administration (M = 3.18, SD = 1.20) understood the purpose of PRSSA. A one-way analysis of variance (ANOVA) test determined no significant differences between groups along the factors of gender or employment status. No correlations were found for PRSSA chapter size or university size.

In a related question, respondents were asked their level of agreement, from 1 = strongly disagree to 5 = strongly agree, with a statement related to the understanding of the minimum chapter standards. Respondents indicated less agreement with their understanding of the minimum standards of PRSSA chapters, though they still somewhat agreed with the statement (M = 4.0, SD = 1.18). Lesser agreement was found in respondents’ perception of understanding of minimum PRSSA chapter standards among chapter members (M = 3.4, SD = 1.28), colleagues (M = 2.56, SD = 1.24), and administration (M = 2.57, SD = 1.26). An ANOVA test determined no significant difference between gender or employment status. A moderate positive correlation was found between chapter size and the respondents’ agreement that their administration understood the minimum chapter standards, r(132) = .195, p < .05, though the same relationship was not reflected in university size.
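The relationships reported above are Pearson product-moment correlations. For instructors who teach such analyses in Python, the computation can be sketched as follows; the helper name `pearson_r` and the data below are hypothetical illustrations, not the study's dataset:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sx = sqrt(sum((a - mean_x) ** 2 for a in x))
    sy = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical pairs: chapter size and a 1-5 agreement rating
chapter_size = [8, 12, 15, 25, 40, 60, 80]
agreement = [2, 3, 2, 3, 4, 4, 5]
r = pearson_r(chapter_size, agreement)  # positive: larger chapters, higher agreement
```

In practice, a statistics library such as scipy.stats would also return the p-value reported alongside each r in the results above.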

When asked about what training resources were used when assuming the role of PRSSA faculty adviser, respondents were most likely to use the PRSSA chapter handbook at 58.2% (n = 85), followed by advising materials on the PRSSA national website at 50.0% (n = 73). Respondents also consulted with a former PRSSA faculty adviser at the same university at 46.6%, or another university at 22.6% (n = 33). Respondents were least likely to reach out to the PRSSA national office at 17.8% (n = 26) or PRSA parent chapter office at 14.4% (n = 21). Respondents also indicated “other” training resources at 8.2% (n = 12) that included faculty adviser training available at the PRSA national conference (n = 2) or previous experience with professional or student organizations (n = 4). More than 17% (n = 25) of respondents did not use any training resources when assuming the role of PRSSA faculty adviser (see Figure 1).

Figure 1

Training Resources that PRSSA Faculty Advisers Used When Assuming Their Role

What are the Most Significant Challenges for PRSSA Faculty Advisers?

Workload
The first step in examining the impact of PRSSA faculty advising was to ask tenured and tenure-track respondents to report their expected workload breakdown across teaching, research, and service as described in their respective faculty handbooks. Overall, the mean expected workload was 52.9% teaching, 27.1% research, and 20.0% service. The second step was to ask the same respondents their actual workload to determine if PRSSA faculty advising caused deviations from the expected workload. The mean actual workload was 51.4% teaching, 19.1% research, and 29.5% service. Differences between expected and actual workload in research and service were noted among all respondents, regardless of the size of the chapter they advised (see Table 1).

Table 1

Expected and Actual Workloads of PRSSA Faculty Advisers by Chapter Size

Chapter Size    Statistic   Teaching            Research            Service
                            Exp.      Act.      Exp.      Act.      Exp.      Act.
Less than 10    Mean        57.8%     56.3%     25.0%     17.3%     17.4%     26.4%
                N           14        14        14        14        14        14
                SD          13.965    18.306    15.120    15.558    8.537     12.811
10-19           Mean        54.2%     52.2%     24.3%     16.8%     21.5%     31.0%
                N           43        42        43        42        43        42
                SD          12.292    16.439    11.674    9.460     7.001     12.139
20-49           Mean        52.5%     50.4%     29.7%     21.8%     17.9%     27.8%
                N           20        20        20        20        20        20
                SD          14.365    16.940    12.746    12.680    8.647     15.982
50-99           Mean        41.7%     44.7%     35.6%     30.1%     22.7%     25.1%
                N           7         7         7         7         7         7
                SD          9.895     9.499     9.217     6.362     10.452    8.194
100-149         Mean        34.0%     34.5%     47.0%     14.0%     19.0%     51.5%
                N           2         2         2         2         2         2
                SD          1.414     13.435    18.385    12.728    19.799    26.163
Total           Mean        52.8%     51.4%     27.1%     19.1%     20.0%     29.5%
                N           86        85        86        85        86        85
                SD          13.442    16.503    13.035    11.753    8.266     13.584

Note. Exp. = expected workload; Act. = actual workload; SD = standard deviation. Chapter size is measured in dues-paid members.

Time Commitment
When asked about their weekly time engaged in PRSSA faculty advising duties, 62.2% (n = 89) of respondents spent between one and three hours per week, followed by four to six hours per week at 16.8% (n = 24), and less than one hour per week at 16.1% (n = 23). Respondents who spent at least seven hours per week on advising duties came in at 4.9% (n = 7). A closer look at how that time was allocated shows 36.6% devoted to attending PRSSA chapter and executive board meetings, followed by chapter communication at 15.9%, planning on- and off-campus events at 13.7%, PRSSA member recruitment at 9.7%, completing and submitting documentation to maintain chapter status with the PRSSA national office or university at 8.8%, training the chapter executive board at 7.0%, and review and submission of documentation for PRSSA chapter awards at 3.8%. The remaining 5.0% of time was spent on other duties such as writing thank-you notes, advising individual PRSSA members, and writing recommendation letters for chapter members (see Figure 2). There was a moderate positive correlation between PRSSA chapter size and the amount of time faculty advisers spent on related duties each week, r(150) = .249, p < .001.

Figure 2

Percentage of Time Committed to PRSSA Faculty Advising Duties 

Compensation
Compensation was examined in terms of expected workload and financial accommodations. Most PRSSA faculty advisers received some type of workload compensation for their service. Partial fulfillment of service was the most common form of compensation at 59.4% (n = 85), followed by a course release at 7.7% (n = 11), or total fulfillment of service requirements at 5.6% (n = 8). In contrast, 22.4% (n = 32) of respondents received no workload compensation for their service as PRSSA faculty adviser. A one-way analysis of variance (ANOVA) test found no significant difference in workload compensation along the factors of gender or chapter size. A significant association existed between faculty status and workload compensation, χ²(8, N = 142) = 23.046, p = .003. More lecturers indicated that they received a course release (n = 10) than tenure-track (n = 1) or junior (n = 0) faculty. Lecturers were also more likely to receive no compensation (n = 16) than tenure-track (n = 5) or tenured (n = 10) faculty (see Figure 3).
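A chi-square test of association like the one above compares observed cell counts against the counts expected if the two variables (here, faculty status and compensation type) were independent. A minimal Python sketch of the statistic; the `chi_square_stat` helper and the table values are hypothetical illustrations, not the study's data:

```python
def chi_square_stat(table):
    """Chi-square statistic for a contingency table given as a list of rows."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2x3 table: faculty status (rows) by compensation type (columns)
table = [[10, 16, 31],   # lecturers: course release / none / partial service
         [1, 5, 36]]     # tenure-track
chi2 = chi_square_stat(table)  # df = (rows - 1) * (cols - 1) = 2
```

A library routine such as scipy.stats.chi2_contingency would additionally return the p-value and degrees of freedom reported in the results.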

Figure 3

Workload Compensation by Faculty Status

In terms of financial compensation, 66.0% (n = 89) of respondents indicated their university fully paid their PRSA membership dues and an additional 2.9% (n = 4) received partial payment. Advisers who received no financial compensation accounted for 32.4% (n = 45) and 14 respondents declined to answer the question.

What Factors have the Greatest Impact on PRSSA Faculty Advisers’ Role Satisfaction?
Respondents were asked to indicate the level of agreement, from 1 = strongly disagree to 5 = strongly agree, with the statement, “I find satisfaction in being a PRSSA faculty adviser.” Respondents at least somewhat agreed with the statement (M = 4.18, SD = 1.047). Various statistical tests (t-test, ANOVA, correlations) were conducted to determine what factors might impact role satisfaction among PRSSA faculty advisers. No significant differences were found along factors of gender, faculty status, chapter size, or university size. A moderate positive correlation was found with how many hours per week respondents engaged in PRSSA advising duties, r(130) = .232, p < .001.

Meeting Expectations
Respondents were asked their level of agreement, from 1 = strongly disagree to 5 = strongly agree, with statements about their confidence in meeting expectations as a PRSSA faculty adviser. Respondents indicated high confidence in meeting personal expectations (M = 4.43, SD = .910), as well as the expectations of their PRSSA chapter (M = 4.48, SD = .886), colleagues (M = 4.62, SD = .715), and administration (M = 4.58, SD = .742). An independent samples t-test found no differences in confidence between genders. A one-way analysis of variance (ANOVA) found a significant difference in confidence in meeting administration expectations between faculty statuses, F(2, 128) = 4.140, p = .018, with lecturers expressing the greatest confidence (M = 4.77, SD = .505), followed by tenured faculty (M = 4.56, SD = .852), and tenure-track faculty expressing the least confidence (M = 4.33, SD = .838). A moderate positive correlation was found between chapter size and meeting colleagues’ expectations, r(136) = .280, p < .001, as well as between chapter size and meeting administration expectations, r(136) = .305, p < .001. University size also had a positive, though smaller, correlation with meeting administration expectations, r(129) = .191, p < .05. Moderate positive correlations were found between role satisfaction and confidence to meet personal expectations and the expectations of others, with each correlation equal to or greater than r(130) = .364, p < .001 (see Table 2).
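A one-way ANOVA like the one above compares variance between group means against variance within groups. The F statistic can be sketched in Python as follows; the `f_oneway` helper name and the Likert-style scores are hypothetical illustrations, not the study's responses:

```python
def f_oneway(*groups):
    """One-way ANOVA F statistic for two or more groups of scores."""
    k = len(groups)                               # number of groups
    n = sum(len(g) for g in groups)               # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    # Within-group sum of squares: spread of scores around their own group mean
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical 1-5 confidence ratings by faculty status
lecturers = [5, 4, 5, 4, 5]
tenure_track = [4, 3, 4, 4, 3]
tenured = [4, 4, 5, 3, 4]
f_stat = f_oneway(lecturers, tenure_track, tenured)
```

The larger the F value relative to its degrees of freedom, the stronger the evidence that at least one group mean differs, which a library such as scipy.stats would convert to the p-value reported above.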

Table 2

Role Satisfaction and Meeting Expectations as PRSSA Faculty Adviser

Work and Life Balance
Respondents were asked their level of agreement, from 1 = strongly disagree to 5 = strongly agree, with statements about their ability to balance PRSSA faculty advising with teaching, research, and service responsibilities, as well as their personal life. Mean responses indicated more muted agreement about balancing PRSSA faculty advising with teaching (M = 3.68, SD = 1.321), research (M = 3.29, SD = 1.250), service (M = 3.96, SD = 1.261), and personal life (M = 3.78, SD = 1.198). An independent samples t-test found significant differences between male and female faculty advisers in their level of agreement toward balancing advising with teaching, as well as personal life. Female respondents (M = 3.55, SD = 1.333) indicated less agreement than males (M = 4.05, SD = 1.224) in balancing PRSSA faculty advising with teaching, t(129) = 1.980, p = .05. Additionally, female respondents (M = 3.60, SD = 1.176) indicated less agreement than males (M = 4.25, SD = 1.156) in balancing PRSSA faculty advising with their personal life, t(128) = 2.852, p = .005. A one-way analysis of variance (ANOVA) test found no significant difference between faculty statuses. A moderate positive correlation was found between chapter size and agreement about balancing PRSSA faculty advising with service, r(129) = .178, p < .05, though no significant correlation was found for university size. Moderate positive correlations were found between role satisfaction and confidence in balancing workload/personal life with PRSSA faculty advising, with each correlation equal to or greater than r(130) = .343, p < .001 (see Table 3).
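The independent-samples t-test above compares two group means relative to their pooled variability. A minimal Python sketch of Student's t with pooled variance; the `t_independent` helper name and the ratings are hypothetical illustrations, not the study's responses:

```python
from math import sqrt

def t_independent(a, b):
    """Student's independent-samples t statistic with pooled variance."""
    na, nb = len(a), len(b)
    mean_a, mean_b = sum(a) / na, sum(b) / nb
    # Sample variances (n - 1 denominator)
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    # Pooled variance weights each group's variance by its degrees of freedom
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    return (mean_a - mean_b) / sqrt(pooled * (1 / na + 1 / nb))

# Hypothetical 1-5 agreement ratings on work-life balance, by gender
female = [3, 4, 3, 4, 3, 4]
male = [4, 5, 4, 4, 5, 4]
t_stat = t_independent(female, male)  # negative: female mean below male mean
```

With degrees of freedom na + nb - 2, a library such as scipy.stats.ttest_ind would return both t and the two-tailed p-value reported above.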

Table 3

Role Satisfaction and Work/Life Balance as PRSSA Faculty Adviser

Advising PRSSA in the First Year on the Job 
Respondents were asked their level of agreement, from 1 = strongly disagree to 5 = strongly agree, with the statement, “First-year faculty should not advise PRSSA.” Respondents (n = 131) expressed limited agreement with the statement (M = 3.57, SD = 1.342). Various tests (t-tests, ANOVA, correlations) were conducted to determine differences among the factors of gender, faculty status, chapter size, university size, Carnegie classification of the university, compensation for advising, confidence in meeting expectations, balancing PRSSA advising with work/personal life, and personal satisfaction in advising PRSSA. Moderate negative correlations were found between agreement with the statement and balance with teaching responsibilities, r(129) = -.223, p < .05, balance with research responsibilities, r(129) = -.288, p < .001, and balance with personal life, r(129) = -.236, p < .001 (see Table 4).
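The correlations reported throughout these results are Pearson product-moment correlations. The sketch below computes one with `scipy.stats.pearsonr` on hypothetical paired responses (not the study’s data), constructed so that stronger agreement with the statement accompanies weaker agreement about research balance, mirroring the negative correlations in Table 4:

```python
# Hypothetical paired Likert responses (1-5): agreement that first-year
# faculty should not advise PRSSA vs. agreement that advising balances
# with research. Illustrative values only, not the study's data.
from scipy import stats

should_not_advise = [5, 4, 5, 2, 3, 5, 4, 1, 2, 4]
research_balance = [1, 2, 2, 4, 3, 1, 2, 5, 4, 3]

# Pearson correlation; a negative r means worse perceived balance goes
# with stronger agreement that first-year faculty should not advise.
r, p_value = stats.pearsonr(should_not_advise, research_balance)
print(f"r = {r:.3f}, p = {p_value:.3f}")
```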

Table 4

Correlations Between “First Year Faculty Should Not Advise PRSSA” and Work/Life Balance

Discussion
The current study provides a multidimensional perspective on the shared concerns and challenges of PRSSA faculty advisers. Through the theoretical lens of CARE—competence, autonomy, relatedness, and equity—the authors advocate for the following recommendations to benefit the advisers and members of the PRSSA organization.

Enhance Training and Support Services to Build the Feeling of Competence
PRSSA faculty advisers’ satisfaction in their roles was significantly correlated with two key factors: a) confidence in meeting expectations and b) ability to balance PRSSA advising duties with other workload requirements and personal life. Meeting expectations at each level—personal, chapter, colleagues, and administration—had a significant positive correlation with a PRSSA faculty adviser’s sense of satisfaction in their role. Meeting expectations reflects satisfaction of the SDT needs of competence (Filak & Pritchard, 2007) and relatedness (Ryan et al., 1995), as well as the emotional satisfaction that can stem burnout (Brown & Roloff, 2011; Brown et al., 2014). In examining the impact of faculty status on confidence in meeting the expectations of administration, lecturers expressed the greatest confidence. As lecturers often have significant industry experience and/or membership in PRSA, that experience might provide a better foundation of organizational knowledge and best practices for the PRSSA faculty advising role. Chapter size also demonstrated a smaller, yet significant, correlation with meeting the expectations of colleagues and administrators.

As membership recruitment can be a strong indicator of success, additional training resources, support services, and adviser mentorship programs should be proactively implemented for PRSSA faculty advisers who do not have previous experience with PRSA or PRSSA. Support services provide a strong foundation for chapter success and, in turn, improve satisfaction among faculty advisers (Filak & Pritchard, 2007), especially those who are junior faculty. A female assistant professor commented, “When I became an adviser last year, it would have been great to have some sort of guide…an idea of expectations would be nice.” While the PRSSA national website does contain written resources for faculty advisers, more effort is needed from PRSSA national leadership to proactively identify new faculty advisers and provide comprehensive support services. As a female lecturer shared, “I don’t seem to receive a lot of support, email, materials from PRSSA National. Often feel like I am on my own to figure it all out.”

There was a significant negative correlation between a PRSSA faculty adviser’s ability to balance advising duties with other work duties or personal life and their agreement that first-year faculty should not advise PRSSA. This is important because, while nearly 30% of the PRSSA faculty advisers who responded to this survey were tenure-track assistant professors, there was no correlation between faculty status and the level of agreement that first-year faculty should not advise PRSSA. That could be a potential indicator that advisers who are unable to balance advising with other work and/or personal duties are experiencing burnout and would not recommend the experience to others.

Recommendations
Four key initiatives should be implemented by the PRSSA national office to improve the feeling of competence among PRSSA faculty advisers, which is positively correlated with role satisfaction. First, the PRSSA national office should empower faculty advisers to manage their chapter directory listing on the organizational website and add a feature to the chapter information page that notes when it was last updated. By maintaining a current directory, the national office can ensure communication is reaching the correct individuals. Second, more video or synchronous training sessions should be offered by the PRSSA national office to ensure effective orientation of new faculty advisers and improve understanding of the PRSSA mission, minimum chapter standards, and best practices of chapter management. Those materials should be clearly identified on the PRSSA national website and distributed as an electronic orientation package to new faculty advisers. Third, a district ambassador program, similar to the PRSSA national committee (PRSSA, n.d.-e), would allow ambassadors to act as liaisons between faculty advisers and PRSSA national leadership. Fourth, a faculty adviser mentorship program should be established by the PRSSA national office to pair veteran advisers with new advisers at different universities. While informal mentorships within universities might pair outgoing and incoming PRSSA faculty advisers, these relationships might not be an option when a current faculty adviser leaves the university. By offering cross-university mentorship programs, the PRSSA national office can start new advisers on the right foot with community support and guidance. Finally, the authors recommend that first-year faculty should not advise PRSSA in a sole capacity but in a co-adviser capacity, when possible.
As first-year faculty are often acclimating to the expectations of a new university and possibly a new city, a one-year transition period of co-advising will offer new faculty the time to become acquainted with PRSSA members, understand chapter expectations, and build vital networks in the professional community.

Support Autonomy in Meeting Unique Chapter Needs
In examining how PRSSA faculty advisers came into their roles, there was a common conflict between the guidelines of the PRSSA national office and the internal practices of university departments. The national PRSSA Chapter Handbook states that the faculty adviser should be elected annually by the chapter membership (PRSSA, n.d.-c, p. 12), but fewer than 10% of advisers came into their role through an election process. In contrast, more than half of advisers have the role written into their job descriptions or were appointed by department supervisors. An appointment process circumvents chapter members’ input in selecting an adviser who understands the needs of the organization and can provide effective counsel for successful chapter management. A common challenge for smaller universities is that there might be only one or two faculty members who are qualified to assume the role of adviser. That scenario leads to another common aspect of faculty advising: more than 70% of advisers have no timeline determined for their role. An undetermined timeline can potentially lead to job burnout (Brown & Roloff, 2011), especially when no incentives or compensation exist for advising PRSSA.

Recommendation
As fewer than 10% of faculty advisers are currently elected to their role, this is an unnecessary policy that does not align with university needs. The authors recommend either eliminating the faculty adviser election requirement or engaging in stronger educational efforts that explain why yearly elections of PRSSA faculty advisers are necessary to the health of individual chapters.

Foster Relatedness between PRSSA Stakeholder Groups
Support from colleagues, administration, and the PRSSA national office is crucial to the success of chapters and can have a dramatic positive impact on the PRSSA faculty adviser’s confidence in meeting expectations and greater role satisfaction. As the results of this study demonstrate, greater understanding is needed from colleagues and administration about the mission and minimum standards of PRSSA. A female assistant professor shared, “I do not get any support. It is really hard to get other faculty members excited about what PRSSA is doing or encourage their students to get involved.” That understanding is especially important from administrators, as they are often in the position to assign the faculty adviser and provide financial support to the organization through departmental funding. Respondents disagreed that administrators understood the minimum standards of PRSSA. While a PRSSA chapter might meet its university’s standards for a student organization, administrators might not understand that the chapter does not meet the minimum standards of the national PRSSA organization and, thus, runs the risk of having its status revoked. Because PRSSA charged students $55 in national dues in 2019, it is also important that students receive value-added chapter programming and support that justifies their financial investment. A female lecturer shared, “I don’t think our university has any idea what the PRSSA National values or expectations are. In general, PRSSA National’s expectations are much more stringent than any the university requires of us.”

Recommendations
While the PRSSA national board does include one national faculty adviser representative, there is a missed opportunity to implement shared governance that is representative of the diverse community of PRSSA faculty advisers. The PRSSA national office should adopt an organizational philosophy that prioritizes stakeholder democracy (Deetz, 1995), where organization management, faculty advisers, student leaders, and university administration work in concert to address common concerns and find mutually beneficial solutions. The authors recommend the establishment of an advisory board composed of current PRSSA faculty advisers that includes broad representation based on chapter size, geographic location, faculty status, and university Carnegie classification. The advisory board should meet, at minimum, once per semester to address ongoing issues and to identify emerging issues that impact the PRSSA organization. In addition to the establishment of the advisory board of PRSSA faculty advisers, the PRSSA national office should implement a yearly stakeholder summit that includes representation of the national student executive board, university administration, college relations committees of PRSA local chapters, professional advisers, and faculty advisers.

Advocate for PRSSA Faculty Adviser Equity
When analyzing the common qualities of PRSSA faculty advisers, nearly half teach four or more classes in addition to their advising duties. That workload can create physical and emotional strain on advisers who feel they are asked to do more than their colleagues. Equity emerged as the common thread through many of the shared challenges of PRSSA faculty advisers.

PRSSA faculty advisers face specific challenges regarding their workload, time commitment, and financial obligations related to their role. In examining the breakdown of workload across the contexts of teaching, research, and service, survey respondents indicated their expected workload (as described in their faculty handbook) and actual workload. There was minimal difference between expected and actual workload for teaching. In contrast, there was an inversion when examining the expected and actual workloads for research and service. This is important to note because PRSSA faculty advising increases the service workload for faculty, taking away time that would be dedicated to research. This time imbalance includes the spontaneous demands of extra-role labor, such as student recommendation letters and award applications, that Brown and Roloff (2011) warned contribute to teacher stress and burnout. A male assistant professor offered this insight: “Advising PRSSA is at the bottom of my list. My other duties and workload is considered a higher priority by the university.”

In terms of actual time commitment, the vast majority of PRSSA faculty advisers spend between one and three hours per week on advising duties. In the context of a 40-hour week, that comprises between 2.5% and 7.5% of the workweek dedicated to PRSSA advising duties. Yet, 21.7% of faculty advisers spend more than four hours each week engaged in chapter duties. While PRSSA is commonly promoted as a “student-led organization,” it should be noted that faculty advisers might shoulder a significant share of day-to-day management duties when executive boards are small, thus increasing their time commitment beyond their service expectations. A male associate professor stated,

When you focus on the PRSSA Chapter, in building it and sustaining it, it becomes a part-time job that can easily consume 20 hours a week in peak periods of work. This has actually been an unhealthy tension that negatively impacts [the] service load, which puts the total workload out of balance.

In addition to the issue of time commitment, it is important to note the financial obligation required of PRSSA faculty advisers. As of 2020, national membership in the PRSA costs $260. Additional survey comments suggest that advisers are also active in local PRSA chapter, district, or national-level service commitments. Interest group or local chapter memberships may add $100 or more for each additional membership. A trip to the PRSSA or PRSA international conferences (including the PRSA Educators Academy’s Super Saturday conference) adds another layer to the financial investment: the adviser incurs expenses for hotel, airfare and ground transportation, the conference registration fee, meals, social events, and celebration dinners or other events, all charged a la carte, and then, per organizational policy, awaits reimbursement, if it is offered at all.

Despite the efforts of the national PRSSA office (PRSSA, n.d.-b) and PRSA Foundation (PRSA Foundation, 2020) to incentivize student engagement within PRSSA chapters through scholarships, grants, and awards, PRSA traditionally does not offer membership or conference discounts for PRSSA faculty advisers (though a limited PRSA national dues waiver was offered in the fall of 2020 due to COVID-19). Nearly one-third of faculty advisers who participated in this study indicated their university did not cover the cost of PRSA membership fees. Given the finding that the vast majority of advisers are lecturers or junior faculty, the expense of PRSA membership might be a financial hardship for those least able to afford it. The issue of financial compensation, minimally for dues, should be addressed by both the PRSSA national organization and university administrations to ensure PRSSA faculty advisers do not experience a financial burden as a result of their service.

As research is often prioritized over service in tenure-and-promotion review, PRSSA faculty advising poses a potential threat to maintaining an active research agenda. That aligns with Waymer’s (2014) finding that “females are carrying a larger service responsibility than their male counterparts at a potentially critical time in the tenure process” (p. 412). This study found the actual service load was significantly increased, and actual research load was decreased, in comparison to the stated expectations of the university faculty handbook. As a female tenure-track assistant professor shared, “One of the most frustrating parts is seeing the workload of other faculty members in the department. If they don’t advise an org like PRSSA, they are able to accomplish a lot more research, or have time to pursue other areas of service.”

Nearly 60% of PRSSA faculty advisers receive partial credit toward their service requirement for their advising duties, but 24.5% receive no time compensation. That inconsistency can lead to feelings of inequity and frustration among advisers because there is no consistency in how their role counts in the annual review or tenure-and-promotion process. One female assistant professor added context to this conundrum: “There are some schools that already grant their advisors course releases— so I do feel there should be consistencies and a recommendation by PRSSA— to recognize advisors.” That sentiment was also reflected by a female associate professor,

Frankly, if the strategic aim is to build a chapter that achieves Star status, regularly attends nationals, and generates teams for Bateman competitions, the faculty likely needs a course release to facilitate it, and the department needs to incorporate PRSSA into the annual budget to support the chapter.

Adding service assignments to advising can push PRSSA faculty advisers well beyond the expected service requirements, causing a situation where a) less time is given to research, b) there is a diminished work-life balance, or c) the PRSSA faculty adviser is not able to provide substantial counsel to maintain chapter success. The added stress of having to intentionally forego some PRSSA chapter advising standards to maintain career equilibrium ties to the emotional toll of not keeping promises (in this case, to the PRSSA chapter and stakeholders expecting chapter success) that Brown and Roloff (2011) warn contributes to burnout. Administrators need to communicate with PRSSA faculty advisers to understand how much time is spent advising and assign other service duties only in proportion to the overall expected service workload as determined by the university faculty handbook. This is best summarized by a male lecturer’s response: “I am not evaluated at all on PRSSA service for my evaluation. It’s all teaching evaluation. Those courses are often a priority, meaning I tackle PRSSA when everything else is done.”

An unexpected finding that emerged in this study was the impact of emotional labor on the role satisfaction of PRSSA faculty advisers. Job-focused emotional labor is the “emotional display” that employees perform in a “people-centric” job with expected emotional duties (Brown et al., 2014). Emotional labor is another possible concern for advising. Teaching is already known to impose a high emotional labor toll due to sustained interaction with students of varying needs (Brown & Roloff, 2011; Brown et al., 2014; Zhang & Zhu, 2008), and advising adds another service component requiring sustained student interaction. “Teachers experience repeated interactions with the same students in a way that is both long-term and intense” (Brown et al., 2014, p. 207). As a female full professor said, “There is a lot of coaching and supporting, and it cannot be done in absentia.” Administrators should be sensitive to the extra-role and emotional labor of advising a student organization, which can extend a faculty member’s service contribution beyond university expectations.

Recommendations
As the issue of equity emerged as the primary concern among PRSSA faculty advisers, the authors offer several recommendations to address this issue. First, the PRSSA national office should permanently waive a) the PRSA membership fee, b) local chapter membership fee, and c) PRSSA national conference registration fee. The waiving of those fees relieves the financial burden many faculty advisers personally shoulder and recognizes the value PRSSA faculty advisers bring in service to their respective chapters.

Second, the PRSSA national office should strongly advocate for time compensation for faculty advisers. As this study has demonstrated, PRSSA faculty advisers who receive little-to-no compensation in terms of time commitment often struggle to balance advising duties with other faculty job expectations. As a result, faculty advising might become a low priority, which can be detrimental to the growth of individual chapters. At minimum, the PRSSA national office should advocate for PRSSA faculty advisers to receive full credit toward service requirements or, ideally, a course release for advising PRSSA. Managing a successful chapter might be compared to teaching a year-long campaigns class that can be aligned to specific learning outcomes in the public relations curriculum. By advocating for equitable time compensation, the PRSSA national office will provide faculty advisers with the resources necessary to offer effective counsel to their chapters in support of membership growth, improved programming, and greater participation in national initiatives and events.

Finally, the PRSSA national office should issue an informational document that can be distributed to university administrations as an educational tool about the PRSSA organization and its expectations for university chapters. This document should provide a) the mission and scope of PRSSA, b) minimum PRSSA chapter standards, c) a detailed description of faculty adviser duties, d) minimum expectations of the time commitment of PRSSA faculty advising, e) the financial obligations of being a PRSSA faculty adviser, and f) recommendations to fairly compensate PRSSA faculty advisers. The document should be developed with the input of the PRSSA faculty adviser advisory board previously recommended in this paper.

Conclusion
This study represents a first wave of research by the authors about the opportunities and challenges of PRSSA faculty advising. As this study illustrates, PRSSA advising is an experience from which most faculty gain a strong sense of satisfaction. Yet, there are specific challenges that must be addressed to ensure that faculty are supported and compensated fairly. Confidence in meeting expectations has a direct impact on the role satisfaction of PRSSA faculty advisers. Greater efforts should be made, at both the university level and via the PRSSA national office, to provide advisers with the tools, resources, and support to help faculty advisers, especially those new to the role, succeed in their efforts. This paper serves as a collaborative tool for current and future advisers, university administrators, and PRSSA national leadership to understand the common challenges PRSSA faculty advisers experience. Likewise, this study allows faculty members to create strategies for chapter- and student-level improvements based on the reported experiences of other advisers and their chapters. This research also serves as a tool through which to create a more controlled investment of time and energy into the service realm of faculty requirements for promotion and/or tenure.

Certain limitations existed in this study. Though best efforts were made by the authors to ensure all faculty advisers could participate in the study, only a small number (n = 2) of faculty advisers of large PRSSA chapters (>100 members) participated in the study. Greater participation from large chapter advisers might have provided insights into best practices that could be shared to benefit small chapters’ development and growth. In addition, a parallel faculty adviser study was launched by the PRSSA national office during the same timeline of phase two of this study, which might have limited participation in the qualitative questionnaire. While the PRSSA national office did launch new initiatives in 2019 in an effort to address concerns expressed by faculty advisers through its own research, the results of this research were not made public. There are key issues found in this study related to role satisfaction, as well as work and life balance, that remained unaddressed by PRSSA national. Finally, information requests by the authors to provide organization membership data were denied by the PRSA national offices.

Future research by the authors will focus on solutions to address the challenges identified in the current study. Specifically, the issues of emotional and extra-role labor appear to hold importance for many PRSSA faculty advisers, and the authors will pursue additional research to explore those issues in more depth. In addition, further research should explore the role of the professional adviser as a partner who helps shoulder the load of advising duties. Through collaborative participation between PRSSA national leadership, university administration, past and present faculty advisers, and chapter leadership, future research holds the potential to create a more rewarding and successful experience for PRSSA faculty advisers and their chapters.

References 

Andrews, L. A. (2007). Should you join PRSSA? Public relations undergraduate students’ perceptions of the benefits of participating in professional student organizations through organizational assimilation theory in preparation of entering the professional workforce. [Masters Thesis, Kent State University]. Ohio Link. https://etd.ohiolink.edu/apexprod/rws_etd/send_file/send?accession=kent1185575909&disposition=inline

Boyer, E. (1991). Highlights of the Carnegie Report: The Scholarship of Teaching from “Scholarship Reconsidered: Priorities of the Professoriate.” College Teaching, 39(1), 11-13. https://www.jstor.org/stable/27558441 

Brown, E., Horner, C., Kerr, M., & Scanlon, C. (2014). United States teachers’ emotional labor and professional identities. KEDI Journal of Educational Policy, 11(2), 205-225. https://search.proquest.com/docview/1641932018?accountid=12948

Brown, L. A., & Roloff, M. E. (2011). Extra-role time, burnout, and commitment: The power of promises kept. Business Communication Quarterly, 74(4), 450–474. https://doi.org/10.1177/1080569911424202

Commission on Public Relations Education. (2018). Fast forward: Foundations + future state. Educators + practitioners: The Commission on Public Relations Education 2017 report on undergraduate education. http://www.commissionpred.org/wp-content/uploads/2018/04/report6-full.pdf  

Deci, E. L., Connell, J. P., & Ryan, R. M. (1989). Self-determination in a work organization. Journal of Applied Psychology, 74(4), 580–590. https://doi.org/10.1037/0021-9010.74.4.580 

Deci, E. L., & Ryan, R. M. (2013). Intrinsic motivation and self-determinism in human behavior. Springer.

Deetz, S. (1995). Transforming communication, transforming business: Stimulating value negotiation for more responsive and responsible workplaces. International Journal of Value-Based Management, 8(3), 255-278. https://doi.org/10.1007/BF00942839

Filak, V. F., & Sheldon, K. (2003). Student psychological-need satisfaction and college teacher evaluations. Educational Psychology, 23(3), 235-247.  https://doi.org/10.1080/0144341032000060084 

Filak, V. F., & Pritchard, R.S. (2007). The effects of self-determined motivation and autonomy support on advisers and members of a journalism student organization. Journalism & Mass Communication Educator, 62(1), 62-76. https://doi.org/10.1177%2F107769580706200106

Nadler, M. K. (1997). The value of student organizations and the role of faculty advisers. Journalism & Mass Communication Educator, 52(1), 16-25. https://doi.org/10.1177/107769589705200102 

Pohl, G. M., & Butler, J. M. (1994). Public relations in action: A view of the benefits of student membership in pre-professional organizations (ED384080). ERIC. https://eric.ed.gov/?id=ED384080

Public Relations Society of America. (n.d.). Champions. Retrieved June 2, 2020, from https://champions.prsa.org/

Public Relations Society of America Foundation. (n.d.). Scholarships & Grants. Retrieved June 2, 2020, from https://www.prsafoundation.org/scholarships-awards/

Public Relations Student Society of America. (2017, March). PRSSA chapter requirements. http://prssa.prsa.org/wp-content/uploads/2017/03/PRSSACharterApplication.pdf

Public Relations Student Society of America. (n.d.-a). About PRSSA. Retrieved June 2, 2020, from http://prssa.prsa.org/about-prssa/

Public Relations Student Society of America. (n.d.-b). National initiatives. Retrieved June 2, 2020, from https://prssa.prsa.org/about-prssa/national-initiatives/

Public Relations Student Society of America. (n.d.-c). PRSSA chapter handbook 2019-2020. Retrieved June 2, 2020, from https://prssa.prsa.org/wp-content/uploads/2019/08/PRSSA-Chapter-Handbook.pdf 

Public Relations Student Society of America. (n.d.-d). Scholarships & awards. Retrieved June 2, 2020, from https://prssa.prsa.org/scholarships-and-awards/

Public Relations Student Society of America. (n.d.-e). District ambassadors. Retrieved June 2, 2020, from https://prssa.prsa.org/chapter-firm-resources/tools-for-chapter-leaders/district-ambassadors/     

Ryan, R., Deci, E., & Grolnick, W. (1995). Autonomy, relatedness, and self: Their relations to development and psychopathology. In D. Cicchetti & D. Cohen (Eds.), Developmental Psychopathology (pp. 618-655). Wiley.

Ryan, R., Sheldon, K., Kasser, T., & Deci, E. (1996). All goals were not created equal: An organismic perspective on nature of goals and their regulation. In P. M. Gollwitzer & J. A. Bargh (Eds.), The Psychology of Action: Linking Motivation and Cognition to Behavior (pp. 7-26). Guilford.

Sparks, S. D., & Conwell, P. (1998). Teaching public relations – does practice or theory prepare practitioners? Public Relations Quarterly, 43(1), 41-44. 

Todd, V. (2009). PRSSA faculty and professional advisors’ perceptions of public relations curriculum, assessment of students’ learning, and faculty performance. Journalism & Mass Communication Educator, 64(1), 71-90. https://doi.org/10.1177/107769580906400106

Waymer, D. (2014). Shouldering the load: An analysis of gender-based differences in the undergraduate PR writing classes and advising undergraduate PRSSA chapters. Journalism & Mass Communication Educator, 69(4), 404-414. https://doi.org/10.1177/1077695814538824

Zhang, Q., & Zhu, W. (2008). Exploring emotion in teaching: Emotional labor, burnout, and satisfaction in Chinese higher education. Communication Education, 57(1), 105-122. https://doi.org/10.1080/03634520701586310

© Copyright 2021 AEJMC Public Relations Division

To cite this article: Weed, A.J., Wallace, A.A., Emmons, B., & Keib, K. (2021). How to CARE for PRSSA faculty advisers: The impact of competence, autonomy, relatedness, and equity on role satisfaction. Journal of Public Relations Education, 7(2), 170-205. https://aejmc.us/jpre/2021/08/31/analytics-in-pr-education-desired-skills-for-digital-communicators/

Special Issue Call for Papers | Leadership, Mentorship and DEI in the Post-Pandemic Public Relations Classroom

Special Issue – Volume 8 (4), Journal of Public Relations Education

Full manuscript submission deadline: June 1, 2022

Special Issue Co-Editors:

Juan Meng, Ph.D.  Department of Advertising & Public Relations, University of Georgia, jmeng@uga.edu

Nilanjana Bardhan, Ph.D. Department of Communication Studies, Southern Illinois University Carbondale, bardhan@siu.edu

Rationale:

The combined effects of COVID-19 and racial unrest following the killing of George Floyd have significantly changed how we teach, and the PR classroom is no exception. Numerous webinars have been hosted by the Public Relations Society of America, the Plank Center for Leadership in Public Relations and the Institute for Public Relations, to name a few, to discuss race and diversity, equity and inclusion (DE&I) in the classroom and industry. This special issue will add the topics of leadership and mentorship to the mix, and specifically focus on the intersections of leadership and mentorship in fostering DE&I in public relations education. Leadership and mentorship are especially important during times of upheaval, uncertainty and radical change. Educators and students are grappling with new pedagogical challenges, and we need scholarship that can aid in navigating these challenges and discovering opportunities (Bardhan & Gower, 2020).

The Commission on Public Relations Education (CPRE) has unequivocally emphasized the pressing need to make DE&I an integral part of public relations education, especially the undergraduate curriculum, for the purpose of “creating a more diverse school-to-industry pipeline” (Mundy et al., 2018, p. 144). This work requires proactive leadership and mentorship. Research over the decades shows a clear link between leadership engagement and DE&I. Educators play a critical and instructive role in enhancing students’ competitive advantage by incorporating leadership content and training into undergraduate curriculum (Meng, 2013, 2015).

The purpose of this special issue call is to invite research articles, teaching briefs, scholarly and critical essays, and case studies. We are especially interested in articles that explore both the challenges and opportunities for public relations pedagogy focusing on leadership and mentorship, and how the two together could foster a more diverse, equitable and inclusive environment in the public relations classroom. Submissions that offer practical knowledge and guidance for undergraduate and graduate public relations education are encouraged, as are articles that enhance our theoretical understanding of this topic. We invite original submissions, and areas of focus could include but are not limited to:

  • Pedagogical, theoretical and practical implications of jointly engaging leadership and mentorship to foster DE&I in undergraduate and/or graduate public relations education
  • Current challenges associated with teaching PR at the intersections of leadership/mentorship/DE&I
  • Resources that aid in teaching PR at the intersections of leadership/mentorship/DE&I
  • Best practices ranging from experiential learning, activities and cases for teaching PR at the intersections of leadership/mentorship/DE&I
  • PRSSA and other PR student organizations and extracurricular activities as a site for learning and teaching PR at the intersections of leadership/mentorship/DE&I
  • The role of leadership and mentorship in cultivating a diverse generation of future leaders
  • The role of student leaders in advancing DE&I
  • Creation of platforms and networks to connect educators, practitioner and students for enhancing leadership/mentorship/DE&I in PR pedagogy
  • Curricular issues related to teaching PR at the intersections of leadership/mentorship/DE&I
  • Faculty preparation/training and peer mentoring for teaching PR to advance DE&I in this time of great uncertainty
  • Structural issues for teaching PR at the intersections of leadership/mentorship/DE&I (e.g., how to recruit more diverse students and faculty)

Contributions that provide insights with robust pedagogical, practical and theoretical implications and recommendations on leadership, mentorship and DE&I in post-pandemic public relations education will be given the highest consideration.

Submission Guidelines:

Submissions should follow the Author Guidelines on the JPRE website. Authors should include the special call name in parentheses after their manuscript title to indicate that the submission is for this particular special call. Authors should submit their manuscripts through Scholastica, the online submission system for JPRE. All submissions will be reviewed anonymously, following JPRE guidelines. Authors must use APA style for citations, references, and table and figure captions. All identifying information must be removed before submission.

Timeline with Key Dates:

  • Deadline for full manuscript submission via JPRE’s Scholastica submission portal (https://jpre.scholasticahq.com/): June 1, 2022
  • Notification of review results, including invitations for revision and resubmission (R&R): August 1, 2022
  • Deadline for R&R submission: September 1, 2022
  • Scheduled Publication: Volume 8 Issue 4 (November/December 2022)

Selected References:

Bardhan, N., & Gower, K. (2020). Student and faculty/educator views on diversity and inclusion in public relations: The role of leaders in bringing about change. Journal of Public Relations Education, 6(2), 102-141. Available at https://journalofpreducation.com/wp-content/uploads/2022/12/d7c44-pdf-of-bardhan-and-gower-2020-from-jpre-6.2-1.pdf

Meng, J. (2013). Learning by leading: Integrating leadership in public relations education for an enhanced value. Public Relations Review, 39(5), 609-611.

Meng, J. (2015). Integrating leadership in public relations education to develop future leaders. Journal of Public Relations Education, 1(1), 31-37. Available at https://aejmc.us/jpre/2015/08/04/integrating-leadership-in-public-relations-education-to-develop-future-leaders/

Mundy, D., Lewton, K., Hicks, A., & Neptune, T. (2018). Diversity: An imperative commitment for educators and practitioners. In Fast Forward: The 2017 Report on undergraduate public relations education (pp. 139-148). Commission on Public Relations Education. Available at http://www.commissionpred.org/wp-content/uploads/2018/04/report6-full.pdf

Any questions or inquiries about the special issue?

Please contact guest editors by email: Dr. Juan Meng at jmeng@uga.edu and/or Dr. Nilanjana Bardhan at bardhan@siu.edu

An Examination of Student Perceptions of Teacher Social Media Use in the Classroom

Editorial Record: Original draft submitted August 17, 2019. Revision submitted November 23, 2019. Revision submitted March 16, 2020. Manuscript accepted for publication May 18, 2020. First published online May 2021.

Author

Pamela Brubaker, Ph.D.
Associate Professor, Public Relations
Brigham Young University
Provo, Utah
Email: pamela_brubaker@byu.edu

Diana C. Sisson, Ph.D.
Assistant Professor, Public Relations
Auburn University
Auburn, Alabama
Email: dcs0016@auburn.edu

Christopher Wilson, Ph.D.
Associate Professor, Public Relations
Brigham Young University
Provo, Utah
Email: chriswilson@byu.edu

Ai Zhang, Ph.D. 
Education Consultant, Classroom Without Walls; Adobe Education Leader; HubSpot Academy Instructor
Email: aiaddysonzhang@gmail.com

Abstract

Equipping students with knowledge, skills, and abilities in social media requires incorporating social media into communication classes. This study explores how teachers are adopting social media and the impact classroom adoption of social media is having on students’ perceptions of their teacher’s technological coolness and credibility. Survey data were collected from students at three U.S. universities. Data revealed that using social media platforms that are not widely adopted in communication classrooms (e.g., Instagram, Snapchat, Pinterest, and LinkedIn) positively influences perceptions of technological coolness (originality and attractiveness) more than the mainstream social media platforms students are accustomed to teachers integrating into the curriculum (i.e., Facebook, YouTube, and Twitter). Additionally, adopting non-mainstream social media platforms positively impacts teacher credibility (trustworthiness and goodwill) among students who use these platforms more frequently. Findings suggest students positively evaluate teachers who stay up-to-date on social media and experiment with newer platforms in their classes.

Keywords: Social media use, teacher credibility, technological coolness, pedagogy

Public relations professors often talk about being models for students (e.g., Remund & Freberg, 2013). However, changes in communication technology (Daniels, 2018; USC Annenberg Center for Public Relations, 2019; The Plank Center for Leadership in Public Relations, 2019; Wright & Hinson, 2017) and in the generational expectations of students (Kim, 2018) make it difficult for public relations educators to stay on top of new technology trends and simultaneously master them to the point that they can teach their students how to use them effectively. Nevertheless, public relations practitioners and academics recognize that new technologies, including social media, must now be an integral part of the public relations curriculum (Commission on Public Relations Education, 2018). In fact, The Commission on Public Relations Education’s (2018) latest report on undergraduate education recommends that “as much as possible, technology tools should be incorporated into courses” (p. 94) in order to “equip students with the needed knowledge, skills, and abilities (KSAs) to best serve the practice of public relations” (p. 85).

From a practitioner perspective, social media has widespread implications for organizations, particularly in terms of organizational reputation (Agozzino, 2012; Floreddu et al., 2014). Social media is defined as “open source (i.e., publicly accessible) media sites on the internet that accept user-generated content and foster social interaction” (Stacks & Bowen, 2013, p. 30). Scholars have argued that public relations professionals view social media use as a means of credibility building, as well as a venue for sharing transparent and accurate information on behalf of clients (Wright & Hinson, 2012). As a result, how public relations professors teach up-and-coming professionals about social media may have a significant impact on social media use in the public relations industry. A variety of studies have been conducted to understand how public relations educators are using social media in their undergraduate classrooms (e.g., Ewing et al., 2018; Zhang & Freberg, 2018) from the instructors’ perspective. Similar to research conducted by Tatone et al. (2017), this study examines students’ perspectives about teacher adoption and use of social media for educational purposes. Specifically, this study assesses student perceptions of social media use in the classroom and the effect of those perceptions on how students evaluate teachers in terms of technological coolness and credibility to offer practical and theoretical implications as a means of informing social media pedagogy.

Literature Review
This study is situated at the intersection of social media classroom trends, teacher credibility, and the technological coolness literature. 

Social Media Classroom Trends
In a national survey of higher education faculty (N = 7,969), Seaman and Tinti-Kane (2013) found respondents who reported using social media as a teaching tool (41%) lagged behind respondents’ professional (55%) and personal (70.3%) social media use. Among the faculty respondents who used social media in their teaching, middle-aged faculty members, ages 35-54, had higher rates of using social media for teaching purposes than younger faculty (under 35). Additionally, faculty in the disciplines of arts and humanities as well as applied sciences used social media as a teaching tool at a higher rate than faculty in other disciplines. The most frequently used social media platforms for teaching were: (1) blogs and wikis (26.9%), (2) podcasts (16.3%), (3) LinkedIn (11.1%), (4) Facebook (8.4%), and (5) Twitter (4.1%).

Seaman and Tinti-Kane (2013) explained that the lower adoption rate of social media in teaching is likely due to faculty concerns. Two of the top faculty concerns about these publicly accessible platforms were the integrity of student submissions and privacy.

Researchers have observed similar trends among mass communication faculty. McCorkindale (2013) found that only a third of the public relations professors who had a Facebook or Twitter account used those social media platforms in their classes. She also reported public relations professors were divided about whether it was appropriate to become “friends” with students on Facebook or connect with students on Twitter because of concerns about professionalism and privacy. However, according to Kothari and Hickerson (2016), nearly three-quarters of journalism faculty said they used Twitter in the classroom, while 42% reported using Facebook, to teach students about recruiting sources, crowd-sourcing ideas and promoting stories.   

Remund and Freberg (2013) suggested public relations professors should embrace the role of social connector as they prepare students for an increasingly interconnected, digital world. According to these scholars (Remund & Freberg, 2013), becoming a social connector requires professors to “[build] and [leverage] social networks to implement pedagogical methods much richer and dynamic than the traditional classroom experience” (p. 2). As a result, public relations professors must become active users of social media channels, model online reputation management, and facilitate collaboration between students and professionals.

Studies have evaluated the use of Twitter in public relations classrooms. Fraustino et al. (2015) conducted Twitter chat discussions and found that students reported learning about public relations concepts including professionalism, media influence, crisis communication, social media campaigns, and best practices. They also noted Twitter facilitated experiential learning because students were able to see learning as a process, as constructing and deconstructing knowledge and as conversation. Similarly, Tatone et al. (2017) tested Twitter use in a large lecture class. Subsequent focus groups with students revealed that using Twitter created a sense of classroom community and allowed them to learn from a variety of opinions. However, students also noted Twitter use during class could turn into a distraction because of the temptation to use their smartphones for non-academic purposes. Additionally, they noted this distraction sometimes caused some students to compete to be the most entertaining with their posts.

Teacher Credibility and Social Media
One of the most important concepts affecting the student-teacher relationship in the instructional literature is teacher credibility (Carr et al., 2013). Teacher credibility was originally derived from the rhetorical research on source credibility, which was defined as “the attitude toward a source of communication held at a given time by a communicator” (McCroskey & Young, 1981, p. 24). Building on this definition, scholars have defined teacher credibility as student attitudes toward a teacher that are based on observations of the teacher’s communication behavior (Schrodt et al., 2009; Teven & McCroskey, 1997). Also, researchers have identified three dimensions of teacher credibility: competence, trustworthiness and caring (DeGroot et al., 2015; McCroskey & Teven, 1999; Teven & McCroskey, 1997). Competence relates to the instructor’s perceived expertise in a given subject area. Trustworthiness describes a teacher’s perceived character and sincerity. Caring has been described as the degree to which an instructor shows concern for his/her students’ welfare. 

Finn and colleagues’ (2009) meta-analysis found that teacher credibility was related to a variety of student learning outcomes and teaching behaviors. For instance, student learning outcomes that have been shown to be related to teacher credibility include enhanced motivation to learn and improved cognitive learning. Additionally, teaching strategies, such as affinity-seeking, and teaching behaviors, including immediacy, assertiveness and humor, also have relationships with teacher credibility. Interestingly, moderate technology use has been shown to increase teacher credibility (Schrodt & Turman, 2005; Schrodt & Witt, 2006).   

With the proliferation of publicly accessible social media channels and their potential as learning and communication tools (Junco et al., 2011; Waters & Bortree, 2011), scholars have investigated the impact of instructors’ use of these channels on teacher credibility. For example, Johnson’s (2011) experimental study found that an instructor’s Twitter profile with socially-oriented posts produced higher perceived teacher credibility among student participants than a profile with only scholarly posts. The results also showed perceptions of teacher credibility were moderated by students’ level of comfort viewing a Twitter profile and whether students thought it was a good idea for a college professor to have a publicly accessible Twitter account. Her findings also showed that students were split on the question of whether professors should have a Twitter account that students can see. Those who thought it was a bad idea (47%) reported that the professor’s account may display unprofessional content, it may eliminate social boundaries, and it might decrease students’ respect for the professor. Those who felt that it was a good idea noted that the Twitter account could help the professor seem more approachable, more human, and up-to-date on the latest technology.

However, in a related experiment, DeGroot et al. (2015) reported students scored an instructor’s Twitter profile higher on teacher credibility when the tweets were strictly professional. Additionally, they found students were more likely to give the instructor higher credibility ratings when the students thought it was a good idea for instructors to use Twitter. As a result, DeGroot and colleagues identified three core reasons a professor should use Twitter: (1) to extend the classroom; (2) to improve student–instructor relationships; and (3) to teach students how to use Twitter in a professional manner. They also provided two reasons professors should not have a public Twitter account: (1) It can violate typical classroom and time expectations, and (2) the boundaries between students and instructors might be broken down in a negative way.

McArthur and Bostedo-Conway (2012) conducted a study of student-instructor interaction on Twitter. They operationalized this interaction as the student-reported frequency of reading instructor tweets and writing their own tweets. They reported that student perceptions of teacher credibility were related to student frequency of Twitter use. They explained, “students did not perceive greater feelings of character, competence, or caring from instructors using Twitter unless they used Twitter themselves” (p. 289).

Technological Coolness in the Classroom
Research shows beliefs, attitudes and subjective norms lead to behavior (Ajzen, 1991; Bean & Eaton, 2000). Likewise, students’ perceptions of their educational environment, including perceptions of their teacher, play a pivotal role in how receptive students are to learning (Carr et al., 2013; McCormick et al., 2013). These perceptions also influence students’ educational satisfaction, learning outcomes and the educational path they choose (Finn et al., 2009; Schrodt et al., 2009). 

One aspect of the educational environment is the technology instructors employ for teaching students. With public relations practitioners and scholars (Commission on Public Relations Education, 2018) encouraging professors to stay up-to-date with and incorporate communication technologies, including social media, into the curriculum, it becomes increasingly important to understand the influence these technologies are having on perceptions of teachers. Current research about pedagogy in public relations does not specify the impact of teachers incorporating newer versus older forms of communication technologies in the classroom on student perceptions. In order to examine perceptions of teachers who adopt different types of social media channels, this study adopts the concept of coolness from the consumer marketing literature and applies it to student perceptions of teacher’s technology use. 

While teachers don’t necessarily seek or even desire to be perceived by their students as a cool person, students formulate perceptions about their teacher’s use of technology. In general, coolness is a positive evaluation attributed to either a person, a thing (e.g., product or technology), or a brand that deviates from the norm and in doing so provides a unique or hip socially desirable contribution to the social environment (Dar-Nimrod et al., 2012; Sundar et al., 2014). Specifically, the focus of coolness in this research is centered on a thing (i.e., a social media platform) rather than on a person (i.e., the professor). Student perceptions of a teacher’s technology use, which are referred to in this study as perceived technological coolness, result from teachers adopting newer communication technologies (i.e., social media) in their classrooms. Students associate new technologies in the classroom as being attractive, hip, or unique. For example, Sundar and colleagues (2014) found users considered communication technology devices cool if they were “novel, attractive and capable of building a subculture around it” (p. 179).  In other words, technological coolness is not a popularity contest, nor is it about liking the technology or its degree of usefulness (Dar-Nimrod et al., 2012; Sundar et al., 2014). 

Student perceptions of classroom technology use can heighten expectations and can lead to negative evaluations, particularly when expectations are not met. Such is the case when cool communication technology devices come on the market and underwhelm consumers by failing to perform to expectations (Sundar et al., 2014; Sundar, 2008).

As new technology ages and more teachers adopt it for classroom use, student perceptions of the coolness of the technology evolve (Dar-Nimrod et al., 2012; Sundar et al., 2014).  The more widespread a trend, the less autonomous it becomes and the less cool it is perceived (Berger, 2008; Warren & Campbell, 2014; Sundar et al., 2014). Through a series of experiments Warren and Campbell (2014) explored the relationship between autonomy and coolness. In their research, consumers perceived a product design that deviated from the norm as being cooler than a typical product design that reflected the norm. However, deviating too far from the norm did not necessarily influence perceptions positively. Researchers found a curvilinear relationship between the level of autonomy and perceptions of coolness, with those ideas that deviated too far from the norm influencing perceptions negatively (Warren & Campbell, 2014). Essentially, when a trend or technology is widely adopted, it loses its coolness (Berger, 2008; Sundar et al., 2014; Warren & Campbell, 2014). 

Anik (2018) suggests one challenge of maintaining the perception of being cool is “keeping up with ever-changing trends and fads while still being perceived as autonomous, authentic and having an attitude” (para. 19). The same could be argued for faculty who aim to engage with students in meaningful ways and strive to enhance student learning by using newer social media platforms as pedagogical tools. Much like evaluations of cool technology, students’ perceptions of technological coolness (i.e., perceptions of teachers’ use of communication technology—social media—in the classroom) are likely to evolve, making it difficult for teachers to remain perceived as cool without adopting the latest technology trends in their classrooms (Anik, 2018; Sundar et al., 2014).

Research Questions 
Literature reviewed for this study presented opportunities for further research regarding students’ perceptions of teacher credibility, technological coolness, and social media use in communication classrooms. The following research questions are offered:

RQ1: How do students report that teachers use social media platforms for teaching purposes in communication courses?

RQ2: To what extent does teacher use of social media platforms in communication classes affect student perceptions of technological coolness?

RQ3: To what extent does teacher use of social media platforms in communication classes affect student perceptions of teacher credibility?

RQ4: To what extent are student perceptions of technological coolness related to their perceptions of teacher credibility?

Methodology

Participants
Participants were college students (N = 330) enrolled in communication programs at one of three universities across the United States. Communication students were recruited at universities ranging in size from 10,000 to 35,000 students, with two of the universities enrolling 30,000 to 35,000 students per year. Within the sample, 24% of the participants were male (n = 78), 62% (n = 206) were female, and 14% (n = 45) did not self-identify. Students ranged from 19 to 46 years in age (M = 22.36, SD = 3.05). A majority of the students were seniors (47%; n = 154) and juniors (33%; n = 108). Because students had to be taking classes within their major (i.e., public relations, journalism, advertising, etc.), they were more likely to be upperclassmen as opposed to freshmen (0.3%; n = 1) and sophomores (7%; n = 22).

Table 1

Use of Social Media Platforms Identified in this Study and from a National Study

Note. Data from U.S. adults reflects those people who said they have ever used the social media platform. This national survey data was collected by Pew Research Center from Jan. 8 to Feb. 7, 2019 (Perrin & Anderson, 2019). Data from the study’s sample reflects students’ typical use of these platforms at least one or more days per week as well as the social media platforms students reported their communication’s professor used most recently for teaching one of their classes. Other social media platforms reflect student reports of faculty use of Vimeo, Blogger, and Slack. 

As shown in Table 1, data collected from students in this study are reflective of national social media platform trends. Students primarily use Facebook (94%), Instagram (91%), YouTube (89%), and Snapchat (72%) at least one or more days per week. Students also reported their teachers are using Facebook (49%) and YouTube (19%) more than any other platform in their classes. A national study conducted by the Pew Research Center (Perrin & Anderson, 2019) revealed people 18-24 years old use YouTube (90%), Facebook (76%), Instagram (75%), and Snapchat (73%) the most, with U.S. adults using YouTube (73%) and Facebook (69%) more than any other platform.  

Procedures
Data for this study was collected from college students enrolled in communications programs at three universities in the western, eastern, and southeastern part of the United States. Students minoring in communications and pre-majors were not included in the study. The online survey was sent to a purposive sample of students majoring in communications at each of the respective universities. The survey was distributed to students after Institutional Review Board approval. As an incentive, participants were entered into a drawing for one of four $25 Amazon gift cards.

Measures
Only students who indicated they had a communications professor who used social media for teaching purposes were allowed to participate in the study. Before completing the survey, students were told to “think about the communications professor who most recently used social media for teaching one of your classes” and then indicate which platform their professor used the most: Facebook, Instagram, Twitter, Snapchat, YouTube, and LinkedIn. Afterwards, students described how the social media platform was used in class. As part of the qualitative analysis of the open-ended question of the survey instrument, common topics and ideas were identified when they were repeated throughout student comments. The topics and ideas were grouped into themes and then reported by social media platform.
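For instructors who teach this kind of qualitative coding alongside analytics, the tallying step (counting repeated topics and reporting their frequency by platform) can be sketched in a few lines of Python. The platform/theme pairs below are hypothetical stand-ins, not the study’s actual coded responses:

```python
from collections import Counter

# Hypothetical coded responses: (platform, theme) pairs such as a coder
# might produce from open-ended survey comments (illustrative data only).
coded = [
    ("Facebook", "submit assignments"), ("Facebook", "discussion prompts"),
    ("Facebook", "submit assignments"), ("Twitter", "in-class assignments"),
    ("Twitter", "how-to lesson"), ("Twitter", "in-class assignments"),
]

# Tally theme frequencies within each platform, then report percentages
# of mentions, mirroring the per-platform columns described above.
for platform in sorted({p for p, _ in coded}):
    themes = Counter(theme for p, theme in coded if p == platform)
    total = sum(themes.values())
    for theme, n in themes.most_common():
        print(f"{platform}: {theme} ({n / total:.0%})")
```

Because some respondents give multiple examples, the percentages are computed over mentions rather than over respondents, matching how the study reports its columns.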

Teacher Credibility. To measure student evaluations of teacher credibility, this study adopted McCroskey and Teven’s (1999) 18-item teacher credibility scale. This scale consists of three subscales that measure the three dimensions of teacher credibility: competence, trustworthiness (McCroskey & Young, 1981), and goodwill (Teven & McCroskey, 1997). Each subscale consists of six indicators that use seven-point semantic differential response scales. For example, indicators of trustworthiness are: (1) honest/dishonest, (2) untrustworthy/trustworthy, (3) honorable/dishonorable, (4) moral/immoral, (5) unethical/ethical, and (6) phony/genuine. The competence indicators are: (1) intelligent/unintelligent, (2) untrained/trained, (3) inexpert/expert, (4) informed/uninformed, (5) incompetent/competent, and (6) bright/stupid. The goodwill indicators are: (1) cares about me/doesn’t care about me, (2) has my interests at heart/doesn’t have my interests at heart, (3) self-centered/not self-centered, (4) concerned with me/unconcerned with me, (5) insensitive/sensitive, and (6) not understanding/understanding. The teacher credibility scale has been found to be valid and reliable (e.g., Teven & McCroskey, 1997; Thweatt & McCroskey, 1998) and has been used to evaluate teacher credibility in a variety of teaching contexts (e.g., DeGroot et al., 2015; Johnson, 2011; Schrodt & Turman, 2005). Cronbach’s alpha for each factor was satisfactory: competence (α = 0.91; M = 6.22, SD = 1.02), trustworthiness (α = 0.86; M = 6.40, SD = 0.90), and goodwill (α = 0.91; M = 5.99, SD = 1.15).

Technological Coolness. To gauge the impact of teachers’ social media use in the classroom on students’ perceptions, this study adapted the three-factor coolness measures (originality, attractiveness, and subculture) from Sundar et al. (2014). These measures were originally developed to assess perceptions of technology products. However, they are useful for gauging student perceptions of teachers’ pedagogical use of social media, as they have the potential to reveal the impact of adopting different forms of communication technology, or what is referred to in this study as technological coolness. Specifically, researchers adapted the five-item originality scale to measure whether college students felt their professor who used social media in the classroom was original, unique, out of the ordinary, novel, and stood apart from other communication professors.

To gauge whether or not students perceived teachers who employed social media within the classroom as being up-to-date and leveraging modern communication technologies, researchers employed two attractiveness measures identified by Sundar et al. (2014). After participants were prompted to think about the communications professor who most recently used social media in the classroom, students assessed whether or not they considered that professor to be hip or cutting edge. The other three attractiveness measures used by Sundar and colleagues (2014) were not employed, as they were more likely to produce evaluations of the teacher’s personal appearance (e.g., this instructor is stylish, sexy, and hot) rather than the teacher’s technology use (i.e., social media).

The subculture surrounding classroom social media use was assessed using five items. Specifically, students were asked if instructors who use social media for teaching purposes are different from instructors who do not. Students also indicated whether instructors who use social media for teaching stand apart from other communication instructors, and whether these instructors stand out from instructors outside communications. The last two questions assessed whether instructors who use social media for teaching are unique and whether students consider them to be better instructors than those who do not use social media for teaching purposes. All items were measured on seven-point Likert scales ranging from strongly disagree (1) to strongly agree (7). Cronbach’s alpha for each factor was satisfactory: originality (α = 0.90; M = 5.49, SD = 1.15), attractiveness (α = 0.87; M = 5.15, SD = 1.51), and subculture (α = 0.88; M = 5.23, SD = 1.18).
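For instructors teaching these measurement methods, the internal-consistency statistic reported for each factor can be computed directly from raw item responses. A minimal Python sketch of Cronbach’s alpha follows, using hypothetical response data rather than the study’s:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 respondents rating 6 items of one subscale (1-7 range).
scores = np.array([
    [6, 7, 6, 6, 7, 6],
    [5, 5, 6, 5, 5, 5],
    [7, 7, 7, 6, 7, 7],
    [4, 5, 4, 4, 5, 4],
    [6, 6, 6, 7, 6, 6],
])
print(round(cronbach_alpha(scores), 2))  # high alpha: items move together
```

Note that alpha rises when items covary strongly relative to their individual variances, which is why consistent row patterns in the toy matrix yield a value comparable to the alphas reported above.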

Findings

RQ1. Student Reports of Social Media Platform Use
Table 2 outlines the various ways students explained how teachers incorporated social media into their communication classes. These themes emerged from the analysis of qualitative data. 

Table 2

Student Explanations of How Teachers Use Social Media in Communication Classes

Note. Students were asked, “Which social media platform did your communication professor use the most for your class?” This was followed up with a question about “How did he/she use the social media platform for your class?” Percentages in each column represent the frequency of students’ mentions of how their professors used each social media platform. The number of responses varies depending upon how many people responded and because some people gave multiple examples of how the social media platform was used. 

Facebook. Half of the respondents said their instructors asked them to use Facebook to submit assignments. Additionally, students said their professors used Facebook for discussion prompts, receiving feedback, gathering assignments, and providing examples of concepts that were taught in class. Most students said they “loved” this, but a couple noted that it was just one more place to check notifications. One student said, “I hated it because along with all the million other things I had to keep tabs on, I then had to keep tabs on Facebook, too. Which I honestly don’t have time nor care to do.” Eight percent of respondents also said their professors used Facebook as a teaching aid to help students understand its features, such as Facebook ads, algorithms, insights and analytics, and live streaming.

Twitter. More than half of the time (54%) students reported professors were leveraging Twitter for individual or in-class assignments. In addition, when used as a teaching aid, students praised the use of this interactive platform and liked it when professors used Twitter for illustrating concepts. One student shared, “We were assigned to tweet at a company to see how fast they responded! An experiment that taught us the power of social media…Making time for it showed that this professor was actually experienced in the field and prioritized an effective application activity like this over book work.”

Twenty-one percent of students who identified Twitter said their professor used the platform to provide some kind of “how-to” lesson. These lessons included best practices for writing tweets, conducting research, and using analytics. For example, in one class, students had to write weekly tweets, and each week the student with the best tweet would win a prize. Some students said their professors used Twitter as a form of communication with them, and one respondent said their professor took attendance via Twitter by using a specific hashtag.

Snapchat. Students who responded to the survey did not provide much input about their professors’ use of Snapchat, but when they did provide more details, students indicated professors use the platform as a means of faculty-student communication. For example, one student said their professor held “Snapchat office hours” where the professor was available to provide students with out-of-class help while traveling for work.

Instagram. Thirty-six percent of respondents noted Instagram as being used as part of bigger assignments, such as campaign analytics or research projects. Students said their professors also used Instagram to show them how to create a personal branding page and how to do an Instagram story. One student shared, “I’ve had an art professor who has used Instagram to portray an artist’s layout and I’ve had professors use it to teach us about personal brands and your online image as well.”

Pinterest. Little information was provided by students about their professors’ use of Pinterest; it was only mentioned briefly as being used to show students the basics of the platform.

LinkedIn. Respondents (67%) said their professors used LinkedIn primarily to teach students about career development, job hunting, and networking. Students said their professors required them to create profiles and upload portfolios of their work. The respondents also said their professors taught them how to properly communicate with others on LinkedIn. Students found this helpful and worth their time. One respondent said, “I had not been familiar with the social media outlet before, and it turned out to be extremely helpful for networking.”

YouTube. Students overwhelmingly (77%) said their professors used YouTube as a teaching aid to show examples of concepts being taught. For example, respondents indicated they watched videos to see good and bad examples of advertisements, public relations, and visual concepts related to what they were discussing. Additionally, a few respondents said their professors had them upload video projects to YouTube, and then the students would watch these video assignments in class and discuss them.

Other. Students mentioned three additional digital platforms used by their professors: Slack, blog platforms, and Vimeo. Slack was used to communicate with students and upload assignments, in particular writing assignments. The blog platform was used to have students submit writing assignments. Similar to YouTube, Vimeo was used to upload video assignments and watch examples in class. Half of respondents who mentioned these platforms noted assignment submission as a reason for using them. 

RQ2. Student Perceptions of Technological Coolness
One-way ANCOVAs were run to determine whether students’ perceptions of technological coolness differed based upon the type of social media platform teachers used in the classroom while controlling for perceived credibility. Perceived credibility was used as a covariate because research suggests it influences student perceptions of teachers (DeGroot et al., 2015), which in this study are perceptions of technological coolness. The data revealed significant correlations between the three dimensions of coolness and credibility (see RQ4, Table 3). In order to run credibility as a covariate, credibility was reduced to a single dimension (M = 6.20, SD = 0.91). 

For the independent variable, social media platforms were divided into two groups. Researchers based these groups on the social media platforms students reported teachers using more and less frequently in the classroom. These groups were created because research suggests perceptions of coolness among technology devices are often diminished as technology adoption becomes more mainstream and widely adopted in society (Warren & Campbell, 2014). It was anticipated the same would be true for perceptions of teachers who use more mainstream social media channels. Therefore, social media that students perceived to be used more frequently in their communication classes were thereby considered mainstream.

Mainstream platforms were then compared with those platforms students reported teachers using less frequently. The mainstream social media platforms students reported teachers using more frequently than any other included Facebook, YouTube, and Twitter. The non-mainstream social media platforms teachers used less often in the classroom included Instagram, Snapchat, Pinterest, LinkedIn, and a few other self-reported channels. Table 1 shows the prevalence of each social media platform students identified communication teachers were using in their classes. The researchers did not report or examine differences among each platform individually as the prevalence of each platform differed so widely. For example, students reported half of the teachers (49.4%) were using Facebook compared to 1.8% who were using Snapchat.

Originality. After adjustment for perceived teacher credibility, there was a statistically significant difference in perceptions of originality among teachers who use different social media platforms, F(1, 280) = 7.09, p < .01, partial η2 = .025. The data provided includes the adjusted mean ± standard error. Teachers who used non-mainstream social media (5.83 ± 0.14) were perceived to be significantly cooler than those who used mainstream social media (5.42 ± 0.06), a mean difference of 0.41 (95% CI, 5.29/5.56 to 5.55/6.11), p < .05.

Attractiveness. After adjustment for perceived teacher credibility, there was a statistically significant difference in perceived attractiveness among teachers who use mainstream vs. non-mainstream social media platforms, F(1, 280) = 9.48, p < .01, partial η2 = .033. The data provided includes the adjusted mean ± standard error. Teachers who used non-mainstream social media (5.68 ± 0.19) were perceived to be significantly cooler than those who used mainstream social media (5.05 ± 0.08), a mean difference of 0.63 (95% CI, 4.88/5.31 to 5.21/6.04), p < .05.

Subculture. After adjustment for perceived teacher credibility, there was not a statistically significant difference in the cool subculture created by teachers who use mainstream versus non-mainstream platforms, F(1, 281) = 1.63, p > .05, partial η2 = .006. The data provided includes the adjusted mean ± standard error. Teachers who used non-mainstream social media (5.42 ± 0.16) were not perceived to be significantly cooler than those who used mainstream social media (5.19 ± 0.07), a mean difference of 0.23 (95% CI, 5.05/5.10 to 5.34/5.74), p > .05.

RQ3. Student Perceptions of Teacher Credibility
For each dimension of credibility, a three-way (2 x 2 x 2) ANOVA was run to determine whether the type of social media teachers used (mainstream vs. non-mainstream) and the frequency with which students used mainstream (light users vs. heavy users) and non-mainstream (light users vs. heavy users) social media sites influenced perceptions of teacher credibility. Frequency scores were calculated by adding the number of days a week students reported using each of the mainstream (Facebook, Twitter, and YouTube) and non-mainstream (Instagram, Snapchat, Pinterest, and LinkedIn) social media sites. Scores were then divided in half, with light users accessing the specified social media sites an average of zero to three days per week and heavy users accessing the sites an average of four to seven days per week.
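The frequency-scoring step above can be sketched in a few lines. The platform groupings match those named in the text; the function name, the example respondent, and the exact cutoff handling (classifying an average of 4+ days per week as heavy) are illustrative assumptions:

```python
# Platform groups as defined in the study
MAINSTREAM = ["Facebook", "Twitter", "YouTube"]
NON_MAINSTREAM = ["Instagram", "Snapchat", "Pinterest", "LinkedIn"]


def usage_group(days_per_week, platforms):
    """Classify a respondent as a light or heavy user of a platform group.

    `days_per_week` maps platform name -> self-reported days used per
    week (0-7); unlisted platforms count as 0. Light users average 0-3
    days/week across the group's platforms; heavy users average 4-7.
    """
    avg = sum(days_per_week.get(p, 0) for p in platforms) / len(platforms)
    return "heavy" if avg >= 4 else "light"


# Hypothetical respondent (not from the study's data)
student = {"Facebook": 7, "Twitter": 2, "YouTube": 6,
           "Instagram": 7, "Snapchat": 5}
print(usage_group(student, MAINSTREAM))      # avg 5.0 -> "heavy"
print(usage_group(student, NON_MAINSTREAM))  # avg 3.0 -> "light"
```

Each respondent thus receives two independent classifications, one per platform group, which become the two student-side factors in the 2 x 2 x 2 ANOVA.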

For goodwill, the omnibus test revealed a statistically significant simple two-way interaction between the type of social media teachers use and students who are heavy/light users of non-mainstream social media platforms, F(1, 278) = 5.89, p < .05, partial η2 = .021, but not for mainstream social media platforms, F(1, 279) = .67, p > .05. The main effects as well as the other two-way and three-way interactions were not significant. One potential reason for the lack of significance among the additional interactions might be due to the fact that the sample did not include students who were both light users of mainstream social media and heavy users of non-mainstream social media sites.

For trustworthiness, data showed a statistically significant simple two-way interaction between the type of social media teachers use and students who are heavy/light users of non-mainstream social media platforms, F(1, 279) = 5.41, p < .05, partial η2 = .019, but not for mainstream social media platforms, F(1, 279) = 1.41, p > .05. The main effects as well as the other two-way and three-way interactions were not significant.

For competence, the omnibus test did not reveal any significant main effects or interactions. 

Goodwill. To further investigate the significant two-way interaction for goodwill (teacher use of mainstream/non-mainstream social media and student use of non-mainstream social media platforms), a two-way ANOVA was run. The data revealed a significant interaction, F(1, 280) = 5.63, p < .05. Students who are light users of non-mainstream social media platforms consider teachers who use mainstream platforms to have more goodwill (M = 6.13, SE = 0.10) than students who use these platforms more often (M = 5.75, SE = 0.12). The opposite was true for teachers who use non-mainstream social media platforms. Teachers were perceived to have more goodwill by students who use non-mainstream social media platforms more frequently (M = 6.45, SE = 0.26) as opposed to students who did not use these platforms very much (M = 5.95, SE = 0.22).

Trust. A similar two-way ANOVA was used to further investigate the significant two-way interaction for trust (teacher use of mainstream/non-mainstream social media and student use of non-mainstream social media platforms). The data revealed a significant interaction, F(1, 281) = 3.99, p < .05. Students who are light users of non-mainstream social media platforms consider teachers who use mainstream platforms to be more trustworthy (M = 6.49, SE = 0.08) than those students who use non-mainstream platforms more often (M = 6.29, SE = 0.09). The opposite was true for teachers who use non-mainstream social media platforms. These teachers were perceived as more trustworthy by students who frequently use non-mainstream social media platforms (M = 6.61, SE = 0.21) as opposed to those who do not use these platforms very much (M = 6.21, SE = 0.17).

RQ4. Perceptions of Technological Coolness and Teacher Credibility
A Pearson product-moment correlation coefficient was run to assess the relationship between technological coolness and teacher credibility. The data revealed a positive, moderate-to-strong relationship between each dimension of credibility (competence, goodwill, and trust) and technological coolness (originality, attractiveness, and subculture). Table 3 shows the variables with the strongest relationships as being competence and attractiveness (r = .568) and competence and originality (r = .526).
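The coefficients in Table 3 are standard Pearson product-moment correlations. A self-contained sketch of the computation, run here on hypothetical ratings rather than the study’s data:

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length
    score lists: covariance divided by the product of the standard
    deviations (constant factors of n cancel)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)


# Hypothetical competence and attractiveness ratings for five students
competence = [6, 7, 5, 6, 4]
attractiveness = [5, 7, 4, 6, 3]
print(round(pearson_r(competence, attractiveness), 3))  # -> 0.971
```

As a bivariate statistic, r describes how the two perception measures vary together; as the authors note below, it does not by itself support a cause-and-effect claim.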

Table 3

Relationship between Teacher Credibility and Technological Coolness

Credibility    Coolness: Originality    Coolness: Attractiveness    Coolness: Subculture
               r (N)                    r (N)                       r (N)
Competence     .526** (284)             .568** (284)                .338** (285)
Goodwill       .460** (284)             .427** (284)                .321** (285)
Trust          .414** (285)             .366** (285)                .299** (286)

** p < 0.01 (2-tailed) 

Discussion
This study examined student perceptions of social media use in the classroom and technological coolness and their effect on teacher credibility. While some teachers may struggle with the topic of coolness as it relates to the classroom, it should be remembered that technological coolness is a measure of student perceptions of social media technology that has been adopted for classroom use. As seen in Table 1, more than three-fourths of all teachers adopted one of the current mainstream social media platforms in their classrooms: Facebook, YouTube, and Twitter. Facebook was teachers’ preferred social media platform, as half of the students reported teachers using it within the classroom.

Collectively, YouTube and Twitter were adopted by a third of the teachers. Primarily, they used YouTube to show curriculum-related videos in class and Twitter for one-off, in-class assignments. However, less than a fourth of teachers adopted one of the current non-mainstream platforms, even though these platforms were used by nearly two-thirds of the student sample. Of the few teachers who did adopt newer platforms, students reported these teachers were using Instagram as part of larger social media research projects, LinkedIn for career development, Snapchat for teacher-student communication, and Pinterest to teach students how to use the platform. Students also reported a small minority of professors using Slack, blog platforms, and Vimeo. 

These findings reveal a disconnect between the social media platforms students report teachers using and the social media platforms students use most often. For example, Twitter ranked third on the list of platforms used most often by teachers, but it was last on the list of platforms used by students. Moreover, Instagram, LinkedIn, Snapchat, and Pinterest were platforms that students reported teachers using the least, but students’ use of these platforms was high in comparison. Instagram in particular ranked second on the list of platforms used by students. Additionally, comparison of the student social media usage data in this study with the recent Pew data (Perrin & Anderson, 2019) show that a greater percentage of communication students use almost all of the social media platforms (except Snapchat) more frequently than the general population of U.S. adults and their 18-24 year-old cohort (see Table 1). 

Study findings also demonstrate that teacher use of social media in the classroom has a positive effect on student perceptions of teacher credibility and technological coolness. When teachers adopted social media platforms that were not widely used in the classroom by other professors (i.e., Instagram, LinkedIn, Snapchat, Pinterest, etc.), the perceived technological coolness of the instructor increased. This finding is not surprising considering when a trend or technology is widely adopted it loses its coolness (Warren & Campbell, 2014). 

Leveraging social media platforms that are not widely adopted helped communication professors’ classroom experiences stand apart from those of other communication professors. This occurred because the social media technologies that are not widely used were perceived as more unique and novel (i.e., original) and as considerably more hip and cutting edge (i.e., attractive). However, using different types of social media, whether or not they are widely adopted by other teachers, is not necessarily going to create a unique subculture in the classroom. That is, students did not think the technology experiences in the communication classrooms assessed in this sample were different or unique from the classroom technology experiences of those who teach other subjects inside or outside communications. To create a subculture, teachers have to do something that is totally different and outside students’ expectations within the classroom. Even adopting newer social media channels doesn’t help professors create a classroom experience with technology that stands apart, because these channels are the same options that everyone has (Sundar et al., 2014). 

Practitioners and educators agree that “staying up-to-date on technology is the single most important credential public relations educators can focus on” (Commission on Public Relations Education, 2018, p. 108), and deviating from the norm or expected social media platforms most other teachers are using can result in positive perceptions of technological coolness. Like other socially constructed concepts, perceptions of technological coolness evolve and change (Sundar et al., 2014). Therefore, teachers should continually work to stay current on social media and find innovative ways to incorporate newer platforms into the curriculum. Much like brands and products that appropriately diverge from the norm in an effort to be cool (Warren & Campbell, 2014), this study shows that teachers who deviate from the norm or expected social media platforms within the classroom can positively influence perceptions of technological coolness. 

When examining the impact of social media use on teacher credibility, the findings confirm and expand research by McArthur and Bostedo-Conway (2012), who found perceptions of teacher credibility were related to the instructor’s Twitter use. This study found that students who frequently use newer social media platforms evaluate teachers who use these same platforms as more trustworthy and as having more goodwill than teachers who do not use these platforms in their classes. If professors do not use these newer platforms, they run the risk of losing an opportunity to increase trust and goodwill among students who use them. However, there is no loss (or gain) of credibility for using social media that has become more ubiquitous.

Finally, this study revealed that there is a significant, positive correlation between teacher credibility and technological coolness, as it relates to instructor use of social media in the classroom. As this finding highlights, these two student perceptions do not exist in isolation, but they vary together. While the data do not support a cause-effect relationship, they do provide evidence that, no matter what teachers may think about students’ perceptions of technological coolness, perceptions of faculty member credibility seem to be intertwined with perceptions of technological coolness.

Pedagogical implications. Examinations of teacher social media platform use in the classroom provide opportunities for all teachers to: 1) see what other professors are using to engage and communicate with students, 2) learn new best practices, and 3) experiment with social media platforms that students taking communication courses are currently using.

Given this study’s findings, professors shouldn’t be afraid to experiment with platforms that are not mainstream among the general population but are widely adopted by students. Professors who were evaluated by students in this study are considered highly credible. By experimenting with different social media platforms, professors will not lose credibility, but by strategically choosing platforms that students frequently use, they can gain credibility in the classroom. Also, understanding what social media platforms students are using can help illuminate the dichotomy between teacher social media use and student use. Potential social media platforms for professors to consider including in pedagogical practices can be found in Table 1. The study’s qualitative data also provides insight into how professors can use these social media platforms (see Table 2).

Limitations. While this study provides a thorough statistical analysis of the data, more data from professors who use non-mainstream social media platforms would allow for broader statistical analyses and comparisons. Additionally, students were asked to respond about only one platform that one of their communication professors used, which limits data analysis regarding professors who used more than one social media platform in the classroom. Furthermore, students may not have understood the distinction between digital media and social media as they offered Blogger and Slack as other social media platforms in the open-ended question of the survey instrument.

Future research. Future research should examine when and how professors ought to adopt novel social media platforms as teaching tools, given that professors must make a significant investment of time and effort to learn how to incorporate these platforms into their classrooms to improve students’ perceptions of their credibility and technological coolness. In addition, while this study found evidence of a significant relationship between perceptions of teacher credibility and technological coolness, more research is needed to understand this correlation and the potential extraneous variables that could be contributing to the relationship. Also, future research should further examine the relevance of technological coolness by determining if it has an impact on learning outcomes, professor likability (e.g., official or informal student evaluations), course enrollment, and classroom engagement. Moreover, future research should explore whether technological coolness and credibility have implications for the professor’s perceived authenticity. Finally, future research should examine how social media use in the classroom affects perceptions of teacher autonomy and privacy.

References

Agozzino, A. (2012). Building a personal relationship through social media: A study of Millennial students’ brand engagement. Ohio Communication Journal, 50, 181–204.

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179-211. https://doi.org/10.1016/0749-5978(91)90020-T

Anik, L. (2018, January 31). A general theory of coolness. Brand Equity. https://brandequity.economictimes.indiatimes.com/be-blogs/a-general-theory-of-coolness/2860

Bean, J. P. & Eaton, S. B. (2000). A psychological model of college student retention. In J.M. Braxton (Ed.), Reworking the student departure puzzle (pp. 48-61). Vanderbilt University Press. 

Berger, J. (2008). Identity signaling, social influence, and social contagion. In M. J. Prinstein & K. A. Dodge (Eds.), Duke series in child development and public policy. Understanding peer influence in children and adolescents (pp. 181-199). Guilford Press.

Carr, C. T., Zube, P., Dickens, E., Hayter, C. A., & Barterian, J. A. (2013). Toward a model of sources of influence in online education: Cognitive learning and the effects of Web 2.0. Communication Education, 62(1), 61–85. https://doi.org/10.1080/03634523.2012.724535

Commission on Public Relations Education. (2018, April). Fast forward: Foundations + future state. Educators + practitioners: The Commission on Public Relations Education 2017 report on undergraduate education. http://www.commissionpred.org/commission-reports/fast-forward-foundations-future-state-educators-practitioners/

Daniels, C. (2018, October 1). PR is in a state of reinvention: Is it up to the challenge? PR Week. https://www.prweek.com/article/1493257/pr-state-reinvention-challenge 

Dar-Nimrod, I., Hansen, I. G., Proulx, T., Lehman, D. R., Chapman, B. P., & Duberstein, P. R., (2012). Coolness: An empirical investigation. Journal of Individual Differences, 33(3), 175–185. https://doi.org/10.1027/1614-0001/a000088 

DeGroot, J. M., Young, V. J., & VanSlette, S. H. (2015). Twitter use and its effects on student perception of instructor credibility. Communication Education, 64(4), 419–437. https://doi.org/10.1080/03634523.2015.1014386

Ewing, M., Kim, C., Kinsky, E. S., Moore, S., & Freberg, K. (2018). Teaching digital and social media analytics: Exploring best practices and future implications for public relations pedagogy. Journal of Public Relations Education, 4(2), 51-86.

Finn, A. N., Schrodt, P., Witt, P. L., Elledge, N., Jernberg, K. A., & Larson, L. M. (2009). A meta-analytical review of teacher credibility and its associations with teacher behaviors and student outcomes. Communication Education, 58(4), 516–537. https://doi.org/10.1080/03634520903131154

Floreddu, P. B., Cabiddu, F., & Evaristo, R. (2014). Inside your social media ring: How to optimize online corporate reputation. Business Horizons, 57(6), 737-745. https://doi.org/10.1016/j.bushor.2014.07.007

Fraustino, J. D., Briones, R., & Jansoke, M. (2015). Can every class be a Twitter chat?: Cross-institutional collaboration and experiential learning in the social media classroom. Journal of Public Relations Education, 1(1), 1–18. https://aejmc.us/jpre/2015/08/04/can-every-class-be-a-twitter-chat-cross-institutional-collaboration-and-experiential-learning-in-the-social-media-classroom-journal-of-public-relations-education/

Johnson, K. A. (2011). The effect of Twitter posts on students’ perceptions of instructor credibility. Learning, Media and Technology, 36(1), 21-38. https://doi.org/10.1080/17439884.2010.534798

Junco, R., Heiberger, G., & Loken, E. (2011). The effect of Twitter on college student engagement and grades. Journal of Computer Assisted Learning, 27(2), 119-132. https://doi.org/10.1111/j.1365-2729.2010.00387.x

Kim, C. M. (2018). Millennial learners and faculty credibility: Exploring the mediating role of out-of-class communication. Journal of Public Relations Education, 4(2), 1-24. https://aejmc.us/jpre/2018/08/17/millennial-learners-and-faculty-credibility-exploring-the-mediating-role-of-out-of-class-communication/

Kothari, A., & Hickerson, A. (2016). Social media use in journalism education: Faculty and student expectations. Journalism & Mass Communication Educator, 71(4), 413–424. https://doi.org/10.1177/1077695815622112

McArthur, J. A., & Bostedo-Conway, K. (2012). Exploring the relationship between student-instructor interaction on Twitter and student perceptions of teacher behaviors. International Journal of Teaching and Learning in Higher Education, 24(3), 286–292.      

McCorkindale, T. (2013). Will you be my friend? How public relations professors engage with students on social networking sites. Teaching Public Relations Monographs, 85.  http://aejmc.us/prd/wp-content/uploads/sites/23/2014/11/tpr85sp13.pdf

McCormick A. C., Kinzie, J., & Gonyea, R. M. (2013). Student engagement: Bridging research and practice to improve the quality of undergraduate education. In M. B. Paulsen (Ed.), Higher Education: Handbook of Theory and Research (Vol. 28, pp. 47-92). Springer. https://doi.org/10.1007/978-94-007-5836-0_2      

McCroskey, J. C., & Teven, J. J. (1999). Goodwill: A reexamination of the construct and its measurement. Communications Monographs, 66(1), 90–103. https://doi.org/10.1080/03637759909376464

McCroskey, J. C., & Young, T. J. (1981). Ethos and credibility: The construct and its measurement after three decades. Central States Speech Journal, 32(1), 24–34. https://doi.org/10.1080/10510978109368075

Perrin, A. & Anderson, M. (2019, April 10). Share of U.S. adults using social media, including Facebook, is mostly unchanged since 2018. Pew Research Center. https://www.pewresearch.org/fact-tank/2019/04/10/share-of-u-s-adults-using-social-media-including-facebook-is-mostly-unchanged-since-2018/

Remund, D., & Freberg, R. (2013). Scholar as social connector effectively linking public relations theory and practice in this fast-changing digital world. Teaching Public Relations Monographs, 86. http://aejmc.us/prd/wpcontent/uploads/sites/23/2014/11/tpr86su13.pdf 

Schrodt, P., & Turman, P. (2005). The impact of instructional technology use, course design, and sex differences on students’ initial perceptions of instructor credibility. Communication Quarterly, 53(2), 177–96. https://doi.org/10.1080/01463370500090399

Schrodt, P., & Witt, P. L. (2006). Students’ attributions of instructor credibility as a function of students’ expectations of instructional technology use and nonverbal immediacy. Communication Education, 55(1), 1–20. https://doi.org/10.1080/03634520500343335

Schrodt, P., Witt, P. L., Turman, P. D., Myers, S. A., Barton, M. H., & Jernberg, K. A. (2009). Instructor credibility as a mediator of instructors’ prosocial communication behaviors and students’ learning outcomes. Communication Education, 58(3), 350–371. https://doi.org/10.1080/03634520902926851

Seaman, J., & Tinti-Kane, H. (2013). Social media for teaching and learning. Pearson. http://200.3.145.35/rid=1N90YDCQV-W1Y0T6-2743/social-media-for-teaching-and-learning-2013-report.pdf 

Stacks, D. W., & Bowen, S. A. (Eds.). (2013). Social media. Dictionary of Public Relations Measurement and Research (3rd ed.). Institute for Public Relations Measurement Commission. http://amecorg.com/wp-content/uploads/2013/09/Dictionary-of-Public-Relations-Measurement-and-Research-3rd-Edition-AMEC.pdf 

Sundar, S. S. (2008). Social psychology of interactivity in human-website interaction. In A. N. Joinson, K. Y. A. McKenna, T. Postmes & U-D. Reips (Eds.), The Oxford handbook of internet psychology (pp. 89–104). Oxford University Press.

Sundar, S. S., Tamul, D. J., & Wu, M. (2014). Capturing “cool”: Measures for assessing coolness of technological products. International Journal of Human-Computer Studies, 72(2), 169-180. https://doi.org/10.1016/j.ijhcs.2013.09.008     

Tatone, J., Gallicano, T. D., & Tefertiller, A. (2017). I love tweeting in class, but…. A qualitative study of student perceptions of the impact of Twitter in large lecture classes. Journal of Public Relations Education, 3(1), 1–13. https://aejmc.us/jpre/2017/05/24/i-love-tweeting-in-class-but-a-qualitative-study-of-student-perceptions-of-the-impact-of-twitter-in-large-lecture-classes/

Teven, J.J., & McCroskey, J. C. (1997). The relationship of perceived teacher caring with student learning and teacher evaluation. Communication Education, 46(1), 1–9. https://doi.org/10.1080/03634529709379069

The Plank Center for Leadership in Public Relations. (2019). North American communication monitor 2018-2019. http://plankcenter.ua.edu/north-american-communication-monitor/

Thweatt, K. S., & McCroskey, J. C. (1998). The impact of teacher immediacy and misbehaviors on teacher credibility. Communication Education, 47(4), 348–358. https://doi.org/10.1080/03634529809379141

USC Annenberg Center for Public Relations. (2019). PR:Tech – The future of technology in communication. 2019 Global Communications Report. http://assets.uscannenberg.org/docs/2019-global-communications-report.pdf

Warren, C., & Campbell, M. C. (2014). What makes things cool? How autonomy influences perceived coolness. Journal of Consumer Research, 41(2), 543- 562.     https://doi.org/10.1086/676680

Waters, R. D., & Bortree, D. S. (2011). Exploring the impact of new media on out-of-class communication in public relations education. Teaching Public Relations Research, 80, 1-4. https://aejmc.us/wp-content/uploads/sites/23/2014/11/tpr80sp11.pdf

Wright, D. K., & Hinson, M. D. (2012). Examining how social and emerging media have been used in public relations between 2006 and 2012: A longitudinal analysis. Public Relations Journal, 6(4). https://instituteforpr.org/examining-how-social-and-emerging-media-have-been-used-in-public-relations-between-2006-and-2012-a-longitudinal-analysis/

Wright, D. K., & Hinson, M. D. (2017). Tracking how social and other digital media are being used in public relations practice: A twelve-year study. Public Relations Journal, 11(1), 1-30. https://prjournal.instituteforpr.org/wp-content/uploads/PRJ-2017-Wright-Hinson-2-1.pdf 

Zhang, A., & Freberg, K. (2018). Developing a blueprint for social media pedagogy: Trials, tribulations, and best practices. Journal of Public Relations Education, 4(1), 1-28. https://aejmc.us/jpre/2018/05/21/developing-a-blueprint-for-social-media-pedagogy-trials-tribulations-and-best-practices/

© Copyright 2021 AEJMC Public Relations Division

To cite this article: Brubaker, P.J., Sisson, D.C., Wilson, C. & Zhang, A. (2021). An Examination of Student Perceptions of Teacher Social Media Use in the Classroom. Journal of Public Relations Education, 7(1), 1-39.  http://aejmc.us/jpre/2020/12/22/accreditation-curriculum-and-ethics-exploring-the-public-relations-education-landscape/

A Simulation as a Pedagogical Tool for Teaching Professional Competencies in Public Relations Education

Editorial Record: Original draft submitted to JPRE October 2, 2019. R&R decision November 30, 2019. Revision submitted January 11, 2020. Manuscript accepted (with changes) for publication April 10, 2020. Changes received June 11, 2020. Final changes received July 17, 2020. First published online August 15, 2020.

Author

Aoife O’Donnell 
Faculty of Media Communications 
Griffith College
Dublin, Ireland
Email: aoife@vitalcommunications.ie

Abstract

Research indicates there are common competencies that are required by the public relations industry, such as business acumen, communication skills, and critical thinking. This study examined how the use of a simulation exercise could assist students in developing these competencies. The simulation exercise was blended with other pedagogical tools to assist in teaching crisis communications to a group of post-graduate public relations students in Ireland. A concurrent mixed-methods design was used. Situational judgment tests, designed specifically for this research in consultation with a team of public relations professionals, were used for the quantitative analysis, while a focus group and a reflective exercise were used for the qualitative analysis. The exercise was found to have a positive effect on the development of competencies in students. The findings are useful for establishing competency standards for entry-level preparation and for identifying pedagogical approaches that may assist students in preparing for careers in the industry.

Keywords: public relations, pedagogy, competencies, situational judgment tests, experiential learning, blended learning

A Simulation as a Pedagogical Tool for Teaching Professional Competencies in Public Relations Education 

Higher education institutions are grappling with the challenges of meeting the modern learning needs of students and the ever-evolving demands of industry. Research indicates there are competencies needed within the public relations industry such as critical thinking and communication skills (Barnes & Tallent, 2015; Commission on Public Relations Education, 2018; Flynn, 2014; Madigan, 2017).  The purpose of this study was to explore if specific pedagogical techniques, including a simulation exercise, could assist students in developing these competencies. To achieve this aim, a research study was designed based on Picciano’s (2009) multimodal model of blended learning and Kolb’s (2015) experiential learning cycle. A simulation exercise was blended with other face-to-face and online pedagogical tools and used to assist in teaching students to manage media communications in a crisis situation. 

The study was conducted with post-graduate public relations students, using a concurrent mixed-methods design. To assess the efficacy of this approach in developing the identified competencies, situational judgment tests were designed specifically for this study in consultation with a team of public relations professionals. A focus group and a reflective exercise formed the qualitative analysis. The findings of the research are of benefit to the public relations industry in that they could help with the testing of competencies required at entry level into the profession and in the identification of pedagogical approaches that have the potential to assist public relations students in preparing for careers in the industry.

Competencies Required by the Public Relations Industry

The higher education sector has been in a period of significant transition over the last two decades as a result of the evolution of technology, widespread participation in education, and changing competency demands from the industry regarding entry-level preparation (Price Waterhouse Coopers, 2018; Strategy Group, 2011). Flynn (2014) postulated that 21st century public relations practitioners are required to have a “different skill set and competencies [than] their counterparts” who practiced before them (p. 363). In its 2018 report on the Workforce of the Future, Price Waterhouse Coopers (PWC) stated: “We are living through a fundamental transformation in the way we work. Automation and ‘thinking machines’ are replacing human tasks and jobs and changing the skills that organisations are looking for in their people” (p. 3). In a survey of academic and industry leaders, the IBM Institute for Business Value found that 71% of industry recruiters had difficulty finding applicants with sufficient practical experience (King, 2015). The IBM study also revealed that the skills leaders required in the industry were the very skills graduates lacked: problem solving; collaboration and teamwork; business-context communication; and flexibility, agility, and adaptability. In its most recent report on the needs within the PR industry, the Commission on Public Relations Education (2018) echoed those desires for problem solving—“the most desired abilities are creative thinking, problem solving, and critical thinking” (p. 15). Alongside problem solving, communication skills are a requirement for senior public relations professionals, whose role is ultimately to communicate on behalf of an organization in a written or oral manner.

The essential skill of critical thinking is defined by the Foundation for Critical Thinking (n.d.) as:

That mode of thinking—about any subject, content, or problem—in which the thinker improves the quality of his or her thinking by skillfully analyzing, assessing, and reconstructing it. . . . Critical thinking is self-directed, self-disciplined, self-monitored, and self-corrective thinking. . . . It entails effective communication and problem-solving abilities. (para. 2) 

A study by Barnes and Tallent (2015) focused specifically on teaching critical thinking skills to Millennials (people born between 1981 and 2000) in public relations classes. They referred to an ability to think critically as vital in public relations professionals and recommended that it should be taught in communication courses.

Communication skills are clearly foundational in public relations, as can be seen in this definition of the field: “the art and social science of analyzing trends, predicting their consequences, counseling organization leaders and implementing planned programs of action which will serve both the organization’s and the public interest” (Theaker, 2016, p. 5). These “planned programs of action” can be interpreted as strategies that assist an organization in communicating its messages with its publics, including through the media. Thus, in addition to general oral and written communication skills, an ability to communicate specifically with the media is a vital skill required by all public relations professionals.

In addition to skills such as writing, content creation, and problem solving, the Commission on Public Relations Education’s (2018) latest report lists items entry-level PR practitioners need to know, including business acumen. The CPRE report defines the term business acumen as “understanding how business works, to provide the contextual significance of public relations” (p. 28). Following the Oxford Dictionary of English’s (n.d.) definition of acumen, competency in business acumen would indicate an ability to make good judgments and quick decisions that are appropriate in business. Business acumen has also been explained as a “good appreciation of business, business strategy, and business intelligence” (Gregory, 2008, p. 220). Flynn (2014) proffers that business acumen is a competency that has been widely reported in the literature and by industry professionals as important to public relations practice.  In their article published on the Institute of Public Relations’ (IPR) website titled “Public Relations and Business Acumen: Closing the Gap,” Ragas and Culp (2014) stated, “As the public relations industry evolves, the need for greater business acumen among professionals working in all levels of the field . . . has never been more important” (para. 1). They added that “to be a strategic partner to clients requires an intimate understanding of business, and how your counsel can advance organization goals and objectives” (Ragas & Culp, 2014, para. 1). 

Blended Learning and Learning Theory

While the industry is demanding graduates with more specific skills, higher education institutions are grappling with larger class sizes and a more diverse student population comprising a range of ages, genders, nationalities, and academic abilities (Strategy Group, 2011). To address the increased diversity of the student population, higher education is required to be more creative in its curricula design and in its teaching methods. As the Irish Strategy Group led by Dr. Colin Hunt stated, “we need new structures that better reflect the diverse learning requirements of our students” (p. 4). At an institutional level, this has resulted in a move away from the traditional didactic approach of teaching toward a more student-centered approach, involving a more interactive style of learning (Kember, 2009). This has translated to curriculum design that encourages active learning and employs pedagogical techniques that can assist in the development of what the Strategy Group refers to as the “high-order knowledge-based skills” (p. 4).  

Blended learning is an educational approach that combines traditional and contemporary teaching and learning methods. Cost effectiveness, access, flexibility, and an ability to address diverse student needs are cited among its benefits (Bonk & Graham, 2006). “Blended learning” is a term that has evolved in tandem with the evolution of technology over the last 20 years and although it has many definitions, it is most commonly used to describe a program or module where face-to-face and online teaching methods are combined (Partridge et al., 2011). 

There are many models of blended learning available, one of which is the multimodal model of blended learning (Picciano, 2009). This model offers clear direction on basic pedagogical objectives and approaches that can be employed to assist the instructor in achieving the required outcomes. This model recognizes the role of blended learning in addressing the varying learning needs in a group of learners. Picciano (2009) states that this model caters to the diverse needs of a modern classroom that may include different personalities, generations, and learning styles. In the multimodal model, six basic pedagogical objectives are recommended when designing a blended learning program, including content, social and emotional factors, dialectic/questioning, synthesis/evaluation (assignments/assessment), collaboration/student-generated content, and reflection. Picciano recommends teaching approaches to assist the learners and the teachers in meeting these objectives, including content management systems (CMS), multi-user virtual environments (MUVE), discussion boards, presentations, assessments, e-portfolios, wikis, blogs, and journals.

Blended learning is a style that is rooted in constructivist teaching and learning theory. According to Schunk (2012), “Constructivism requires that we structure teaching and learning experiences to challenge students’ thinking so that they will be able to construct new knowledge” (p. 274). Within the constructivist learning philosophy, several teaching and learning strategies have been proposed, with one of the most influential contemporary models being Kolb’s (2015) experiential learning cycle. The experiential learning cycle identifies four modes of learning the learner needs to transition through to develop a deep understanding of a topic. The modes are defined as concrete experience, reflective observation, abstract conceptualization, and active experimentation. Concrete experience involves the dissemination of information, for example, through a lecture or another means of content delivery. Abstract conceptualization refers to the development of the learner’s own thoughts. Reflective observation allows the learner to learn through reflecting on the information acquired, and during the active experimentation phase, the learner puts the learning into practice.

Public Relations Education 

The Commission on Public Relations Education (2018) made recommendations for designing and structuring higher education undergraduate public relations programs. It stated PR educational curricula should cover six essential topics, including introduction to or principles of public relations, research methods, writing, campaigns and case studies, supervised work experience or internships, and ethics. A course that has work experience incorporated into its curriculum would be ideally placed to provide students with the best opportunity to learn in the areas of campaigns and case studies, work experience/internships, and ethics, as they are topics that are more practical in nature. Many public relations courses offer a combination of theoretical content and work experience, presumably to prepare the students for industry by equipping them with both theoretical and practical knowledge. The question is then, what are the most appropriate pedagogical methods to use to equip students with this practical knowledge? 

Present research supports the use of creative teaching methods in the classroom to teach practical skills, such as simulations and what Barnes and Tallent (2015) referred to as “constructivist thinking tools” (p. 437). Their study offered examples of exercises, such as group work, discussions, reflective writing, and mind-maps in which students are encouraged to visualize information, group related items together, and identify problems and solutions as a result.

The word “simulation” can be used to define the “imitation of a situation or process” or “the production of a computer model of something, especially for the purpose of study” (Oxford Dictionary of English, n.d.). Evidence of the use of simulations in PR pedagogy as an “imitation of a situation or process” (Oxford Dictionary of English, n.d.) is more common. In an Australia-based study, Sutherland and Ward (2018) conducted research on the efficacy of using an immersive simulation as a pedagogical tool to provide students with practical experience of a media conference. In the study, they combined simulation tools such as role-play and immersive technology in which scenes from PR scenarios were projected onto the walls. They found that students enjoyed the experience, and it enhanced their learning and analytical skills. The students recommended the use of the pedagogical tools in the future. Similarly, Veil (2010) simulated a press conference held in response to a crisis. Role-based scenario simulations were the main simulation tool used in the study, which was conducted with communication students. Students reported finding the exercise beneficial to their learning, although some expressed reservations about the spontaneous nature of the activity. Another study in the U.S. found that crisis simulation can significantly increase students’ crisis management competencies; the author recommended that simulation-based training be used in other areas of public relations and become part of the “pedagogical toolbox” (Wang, 2017, p. 107).

When assessing the specific competencies required by the PR industry, more interactive tools than the common written assignments might be required. For example, Bartram (2004) links competencies to performance and identifies workplace assessments and simulations as appropriate measurement tools. An example of an assessment format that has been used in the medical profession to measure non-academic attributes in medical graduates is the situational judgment test (SJT) (Patterson et al., 2016). Specific competencies the SJT can test include reasoning, problem solving, and decision making. An SJT comprises a hypothetical scenario (presented in written or video format) that medical graduates are likely to encounter in the workplace. Candidates are asked to identify the appropriateness or effectiveness of various response options from a predefined list. Patterson et al. (2016) recommend that response instructions for SJTs should fall into one of two categories: knowledge-based (what is the best option?/what should you do?) and behavioral (what would you be most likely to do?). To ensure validity, the response options and scoring mechanism should be agreed upon in advance by industry experts.

This study set out to explore the use of a simulation as a pedagogical tool and its ability to assist students of public relations in developing competencies required by the public relations industry. A constructivist pedagogical approach that used a specifically designed blended learning model, comprising a practical simulation at its core, was designed for this research and to assist students in developing the competencies identified. A situational judgment test was specifically designed for this study and used to assess the development of these competencies in students alongside a focus group and a reflective exercise. 

Method

This study sought to examine if the use of a simulation as a pedagogical tool could assist students of public relations in developing competencies required by the public relations industry. The research used a concurrent mixed-methods design combining qualitative and quantitative techniques. The quantitative analysis was conducted through a situational judgment test (SJT) specifically designed to measure the competencies identified as required by the PR industry. The SJTs were ultimately designed to assess students on the competencies of critical thinking and media communication skills, and the questions were therefore centered on the development and communication of effective arguments in response to difficult questions. Qualitative methods included a focus group and a reflective exercise, which were used to analyze the students’ learning experiences and the development of the competencies of critical thinking, business acumen, and communication skills. The design of the research strategy was rooted in constructivist teaching and learning philosophy through the use of Kolb’s (2015) experiential learning cycle and Picciano’s (2009) multimodal model of blended learning.

Participants

The research was conducted over a two-month period as part of the standard curriculum delivery in a PR module. Sixteen full-time post-graduate students of public relations in Ireland volunteered to participate in the study. The group was split approximately evenly between male and female students, and all participants were in the 21-30 age bracket with limited or no relevant work experience. The participants were the researcher’s own students, in the final semester of a one-year post-graduate course in public relations. The course is registered as a Level 9 course on the National Framework of Qualifications Grid as set by the National Qualifications Authority of Ireland (2020). Because the lecturer for these students also conducted the research, there was potential for bias. However, conducting the simulation required a facilitator with specific knowledge of and skills in the practice of public relations, and it was deemed that this requirement outweighed the potential for bias. The study was submitted to the Ethics Committee at Griffith College Dublin, and all participants indicated their understanding and agreement to participate by signing a consent form. Participants were offered the opportunity to revoke their consent at any stage during the process. All information provided by the participants was treated in the strictest confidence. Data collected from the questionnaires were anonymized and were not identifiable during the research process or in the findings presented.

Research Design

The study was developed around the teaching of crisis management. A blended learning model was designed to ensure the simulation could be combined into the course in a manner that enabled the pedagogical objectives and learning outcomes of the public relations curriculum to be achieved. The program was designed using the pedagogical objectives and approaches outlined in the multimodal model of blended learning. These were then mapped against Kolb’s (2015) experiential learning cycle to direct the learning stages and approaches and ensure the model was rooted in learning theory. This process is illustrated and explained in Figure 1.  

Figure 1

Kolb’s Experiential Learning Cycle and the Multimodal Model of Blended Learning

In Picciano’s (2009) multimodal model of blended learning, six basic pedagogical objectives are recommended when designing a blended learning program, including content, social and emotional, dialectic/questioning, synthesis/evaluation (assignments/assessment), collaboration/student-generated content and reflection. Picciano proposes teaching approaches to assist the learners and the teachers in meeting these objectives, including CMS, MUVE, discussion boards, presentations, assessments, e-portfolios, wikis, blogs, and journals.

The process outlined above explains how this model was mapped against the four modes of learning identified in Kolb’s (2015) experiential learning cycle to produce a program of face-to-face and online pedagogical activity that could meet the learning objectives. The process can be broken down by examining each mode of the experiential learning cycle individually. For example, in this instance, the theory and relevant information delivered by the lecturer on the management of media relations in the event of a crisis provided the “concrete experience.” Figure 1 demonstrates the lecture and content that was made available on Moodle during these exercises also fulfilled two of the multimodal model’s pedagogical objectives of “content” and “social and emotional factors.” The multimodal model views “content” as “the primary driver of instruction” and states that it can be delivered and presented via numerous means (Picciano, 2009, p. 14). In this program, the content was delivered by using a lecture and PowerPoint slides and by making case studies and articles available on the course management software system. The delivery of the content through an in-class lecture also fulfills the “social and emotional” pedagogical objective of the multimodal model. The model stipulates that “social and emotional development is an important part of anyone’s education” and that even students on advanced graduate courses require “someone with whom to speak, whether for understanding a complex concept or providing advice” (Picciano, 2009, p. 14). Therefore, the diagram demonstrates that the delivery of the content using these face-to-face and online approaches meets the “content” and “social and emotional” pedagogical objectives of the multimodal model and falls under the “concrete experience” learning mode of Kolb’s experiential learning cycle. 
Figure 1 can continue to be followed in the same manner to examine each of the modes of experiential learning and the associated pedagogical objectives, as well as the learning approaches used to achieve them. 

To explain the timeline of the study, at the outset, students were presented with content on crisis management through a PowerPoint lecture. The lecture provided students with information about crisis management and steps as to how to communicate with the media on behalf of an organization in a time of crisis. Students were presented with a case study involving a data breach by an internationally renowned technology company. They watched a video relating to this crisis, followed by a Socratic discussion led by the lecturer. As identified in a study by Parkinson and Ekachai (2002), the leader of the Socratic discussion is required to have a knowledge of the subject, in addition to an understanding of how to conduct a Socratic discussion. The aim of this discussion was to assist the students in developing an understanding of the principles and concepts involved in representing an organization in the media in response to a crisis. Following the completion of the Socratic discussion, students were directed to work in groups to develop their media strategies to respond to the crisis.  Each group then worked together outside of the classroom and online in a collaborative forum where the groups posted their strategies to enable feedback from their peers and from the lecturer. 

An immersive simulation exercise then took place in which students assumed the role of the spokesperson for the organization in crisis in an interview with a professional news journalist. A camera was set up and operated by a professional camera technician. Microphones and lighting were connected to simulate a real-life television news interview situation. The students were split into two groups of eight. Each individual group member was then immersed in the experience as they were interviewed individually by the journalist and asked to put their learning into practice by responding to the crisis in a simulated live media interview. The journalist asked challenging questions, such as “When did you learn about this issue?”, “Why did it take so long to communicate with your customers?”, and “How do you plan to prevent this from happening again?” Students were required to think quickly and revert to their key messages and their preparation to respond. Students had been given 24 hours’ notice to prepare their key message to simulate a real-life situation in which a spokesperson would often be given very short notice before a media interview. Each student was recorded on camera and observed and assessed as the interview took place. On completion of all eight interviews, a selection of excerpts from videos were played back for discussion and formative feedback. The process was repeated with the second group. Students were assessed on their performance and the mark/grade represented a percentage of their overall grade for the module.

Measurement

The methods used in this research were evaluated using bespoke scenario-based multiple-choice questionnaires (situational judgment tests, known as SJTs), a focus group, and a reflective exercise. Students completed the SJT online twice: at the commencement of the study, prior to the first lecture, and again on completion of the study. An SJT template was designed for the purpose of this research by a team of senior PR professionals who were assembled to consult on the scenarios, questions and answers, and scoring method for each test (see Appendix A). Scenarios were drafted and questions were formulated around these scenarios. Critical thinking and media communication skills were the core competencies measured in the SJTs. The questions were designed to demonstrate an ability to make effective arguments to support the key messages that the students were attempting to communicate. In line with best practice as identified in the literature review on SJTs, the questions were set into the two categories of knowledge (what is the best option?/what should you do?) and behavior (what would you be most likely to do?). Answers were proposed for each question, and the expert team reached a consensus on the most appropriate answers for each question. A scoring key was then developed for each test in order to group student responses into the categories of excellent, good, satisfactory, and poor. Students’ answers were analyzed and counted on completion of the first test, and responses were compared to those of the second test on completion of the entire study to provide a quantitative analysis of the development of each of the predefined competencies. The process of designing the SJT in consultation with industry experts ensured the validity of these tests in their first use as measures of competencies in PR students. Examples of the tests and scoring key are available in Appendix A.

Following completion of the simulation and the second SJT, students were afforded the opportunity to reflect on their performance and the learning experience in an online exercise. All participants watched their performance through a secure video link on their own time and in privacy. Students then completed an online reflection, the objective of which was to inform the research as to the development of the competencies of critical thinking, business acumen, and communication skills. The reflection also served as a learning exercise for the students to encourage a deeper learning experience. The reflection consisted of a question asking the students to provide their opinions, in no more than 500 words, on the simulation exercise. 

Finally, on completion of the study, students participated in a focus group to discuss their perceptions of the learning experience and the impact they felt it had on the development of the competencies identified (see Appendix B). According to Daymon and Halloway (2011), the purpose of a focus group is “to concentrate on one or two clear issues or objects and discuss them in depth” (p. 241). In addition to offering insight as to students’ perceptions, the focus group was a useful exercise in itself for students in using and developing critical thinking skills. Eight students participated in the focus group, which was facilitated, recorded, and transcribed by the lecturer.  

Results 

The overall objective of this study was to ascertain if the use of a simulation could assist students in developing the competencies required by the public relations industry. Overall, the results show that the questions in the situational judgment test that were most focused on critical thinking and media communication skills showed slight improvements. The development of business acumen was not evidenced in the SJTs specifically; however, the development of this competency was inferred from the results of the qualitative analysis. 

In an effort to quantify any change in student performance from SJT I to SJT II, it was necessary to develop a standardization that was not sensitive to the different number of students in each. Direct comparison is not possible with two different student totals and the small data sets. Thus, each student who received a “poor” score was given one point, two points were given for each “satisfactory,” three for “good,” and four for “excellent.” The point total was then divided by the total number of students (16 for SJT I; 13 for SJT II) to determine the dimension’s mean score. Since the mean scores are a measure of the overall performance on the SJT and are not sensitive to different response totals, they allow for direct comparison.
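The standardization described above amounts to a weighted mean over the four score categories. The following minimal sketch illustrates the computation; the tallies shown are hypothetical and do not reproduce the paper’s actual counts, which appear in Table 1.

```python
# Standardization described above: weight each score category, sum the points,
# and divide by the number of test-takers so SJTs with different student
# totals (16 for SJT I, 13 for SJT II) can be compared directly.
POINTS = {"poor": 1, "satisfactory": 2, "good": 3, "excellent": 4}

def mean_score(tally):
    """Convert a {category: number_of_students} tally into a mean score."""
    total_points = sum(POINTS[category] * n for category, n in tally.items())
    return total_points / sum(tally.values())

# Hypothetical tallies for one dimension (not the paper's data):
sjt1 = {"poor": 2, "satisfactory": 5, "good": 6, "excellent": 3}  # 16 students
sjt2 = {"poor": 1, "satisfactory": 3, "good": 6, "excellent": 3}  # 13 students

# A positive difference indicates improved performance on SJT II.
difference = mean_score(sjt2) - mean_score(sjt1)
print(f"SJT I: {mean_score(sjt1):.2f}, SJT II: {mean_score(sjt2):.2f}, "
      f"difference: {difference:+.2f}")
```

Because each mean is normalized by its own student total, the two tests can be compared even though different numbers of students completed them.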

Figure 2 shows a comparison of the means between SJT I and SJT II for each of the six dimensions, while Table 1 presents the numerical scores for each measure of student performance. The number of students receiving that score for each of the six dimensions is reported. Using the scale described in the previous paragraph, a point total for the student performance for that SJT is attained and a mean score is calculated.  The final column gives the difference in the mean scores for SJT I and SJT II. The mean score differences for five of the six dimensions were positive, indicating improved student performance. The greatest increase in student performance was for Dimension 2: Key Messages. Only one dimension, Aftermath, showed a negative difference, demonstrating lower mean scores on the second SJT. 

Table 1

SJT I and II Scores and Standardized Mean Scores

Figure 2

Overall Means for SJT 1 and SJT 2

The first question in the SJTs centered on research, asking participants how they would approach the fact-gathering exercise involved in crisis communication management. Between the first and second questionnaires, one more participant achieved an excellent result on this question. The second question focused on the development of key messages and asked students to identify the three most important key messages; six more participants achieved an excellent result in the second questionnaire. Question three in each SJT asked participants to explain how they would approach the media. This question was the most difficult for participants with limited experience in media communication, and the results are perhaps indicative of this: five fewer students achieved an excellent result, although five more received a good result in SJT II. Questions four and five centered on the media response and the arguments to make within the media; both showed slight improvements, with three and two more participants, respectively, achieving an excellent result. Finally, in question six, participants were asked to explain how they would manage communications in the aftermath of the crisis. The majority of participants in both SJTs achieved a satisfactory to excellent result, with two fewer participants in the poor category in SJT II.

Analysis

Critical Thinking

Two questions within the SJTs were specifically focused on critical thinking (questions 4 and 5). These questions centered on responding to the media and making effective arguments there. In the first of these questions (question 4), three more participants achieved an “excellent” result in the second test.

The next of these questions related to the construction of arguments. In this question, two more students received an “excellent” result in SJT II compared to the same style of questions in SJT I. The content of these questions is detailed in Appendix A. 

The increase in the number of students achieving good and excellent scores in the second test for both these questions could indicate that the exercise had a positive impact on the participants’ ability to think critically. In addition, in the qualitative analysis through the reflection and the focus group, students pointed to critical thinking as one of the learning achievements from the exercises. For example, when asked what they learned from the experience, one student said, “being creative in thought—creative mentally,” while another mentioned “on-the-spot critical thinking.” The students’ observations indicate the immersive nature of the simulation exercise impacted their critical thinking skills; for example, one student cited “applying your skills in the outside world” as a key learning takeaway from the activity.

Business Acumen

In this study, business acumen was largely demonstrated through the students’ conveyed understanding of the challenges that businesses face in the event of a crisis and in communicating with the public through the media as a result. One student said, “It was an eye-opener to find solutions to problems other companies are facing. It was practical.” Another stated, “If I were working in a massive organization that had this crisis and I’m approached by the media, even without them informing me in time, I would have something to say. It was of immense benefit for me.” These comments indicate students developed a better understanding of how a situation like this might affect a business and how it could protect its reputation in the media as a result.  

Communications Skills

The effect of the exercise on media communication skills can be seen in the responses to the question on key message development. In SJT II, six more students selected all three most appropriate key messages (an “excellent” result). The content of this question is detailed in Appendix A. Students also referenced the importance of key messages several times within their feedback during the focus group and reflections. For example, one student said, “I learned how important it is to have key messages that you can refer to when answering tricky questions,” while another said: “I was pleased with how I communicated my message. I thought that I reverted back to the key messages when in a difficult corner.”

In addition to media communication skills, the qualitative analysis offered insight into the impact of the exercise on students’ verbal and non-verbal communication skills. Verbal communication was assessed through the students’ ability to make effective arguments during the simulation and to express the key messages they had prepared in advance. Non-verbal communication consisted of tone of voice, hand gestures, body language, and facial expressions. The majority of students referenced communication skills as a key takeaway and focused heavily on this in their reflections and the focus group. Students discussed the importance of content, such as communicating their top-line and three key messages, and they addressed style, such as speaking clearly and slowly in concise sentences. One student commented, “I sometimes talked more than needed, so going forward I could stop sooner when I was happy with my answer.”

The analysis also reveals that there was a tendency for students to be self-critical of their non-verbal communication skills, more so than their verbal communication skills. This is evidenced in the following comments: “I think that at times my facial expressions during the questioning were a little bit distracting, so I would try and keep a less expressive face next time” and “I assumed my body language was OK but I realized there were some mistakes after I watched the video.”

Discussion

The higher education sector worldwide is endeavoring to meet the learning requirements of a technologically savvy and increasingly diverse student demographic. Simultaneously, the sector is challenged with ensuring graduates bring the modern, relevant competencies required by industry into entry-level positions upon graduation. There is evidence to suggest the industry is actively seeking competencies in new entrants to the PR profession that can also be difficult to teach, such as business acumen, communication skills, and critical thinking. The objective of this research was to ascertain if a simulation could assist students in developing these competencies required by the public relations industry.

To investigate this, a blended learning model was designed that was based on the multimodal model of blended learning and mapped against Kolb’s (2015) experiential learning cycle. In this model, a simulation exercise was blended with other face-to-face and online pedagogical tools to teach students how to manage media communication in the event of a crisis. To analyze the efficacy of this model in assisting in developing these competencies, a concurrent mixed methodology was employed using both quantitative and qualitative data collection methods.  Qualitative methods included a reflection and a focus group. The quantitative method implemented in the research was an SJT. To ensure its validity for use in this research, the test was designed exclusively in consultation with a team of public relations professionals to test for competencies required at the entry level in the public relations profession.

Limitations and Future Research

This study was limited by the size of the sample group and the duration of the study. A more detailed study using a larger group, including a control group, over a longer period of time would offer further insight into the efficacy of the methods used here for developing the competencies required in the PR industry. The results of the exercises, however, indicated the activity had a positive impact on the development of key competencies in students. The qualitative analysis, which included the student reflections and focus groups, offered an indication that the students themselves felt the exercises had an impact on their learning experience and assisted them in developing their business acumen, critical thinking, and communication skills.

Although the students indicated they sensed an impact, the SJTs did not show an impact on the development of business acumen among participants. Future investigation would be required to ascertain the most appropriate measurement tool for analyzing the development of this competency. The tests did demonstrate slight improvements in the competencies of critical thinking and media communication skills. These tests could be developed further for use as an assessment tool within a public relations curriculum, teaching students to consider how they would respond to difficult questions in media interviews or in crisis situations. The tests require students to think of solutions or arguments to difficult scenarios quickly and, combined with a simulation exercise, this pedagogical approach may be particularly useful in teaching students how to manage common practical problems faced by public relations practitioners. It is worth noting that comments in the reflections indicated the students tended to be self-critical of their body language; future studies should explore how educators can guide students to recognize the importance of nonverbal communication without focusing on it to the exclusion of other elements of their message delivery.

In addition to the pedagogical benefits, the SJTs may also contribute positively to the PR industry in that they could be used by employers to test interviewees for competencies in specific areas. A standardized SJT could increase the employability of students. Such tests could be designed to complement CPRE’s list of competencies, serving both as tools for employers interviewing new entrants to the industry and as class assignments for more practical subjects such as crisis communication and media skills.

Another example of simulation as “the production of a computer model of something” can be seen in the emerging technologies of virtual reality (VR) and augmented reality (AR), which could be the topic of future research. VR allows users, through the use of a headset, to immerse themselves completely in an alternative reality. AR allows the user to bring elements of the artificial world into the real world. Both technologies are being used in education in the STEM disciplines, but there is little evidence cataloging their use in the teaching of public relations. Research in this area of PR education could offer insight as to whether simulations of this nature could be beneficial in teaching media communication skills and critical thinking by enabling learners to immerse themselves in computer- or video-generated common scenarios, such as press conferences or media events.

The results of this research benefit the public relations industry and public relations education. In relation to experiential and blended learning, this research offers insight into how simulations and situational judgment tests can be used as a form of active experimentation and assessment. In terms of public relations education, the findings offer educators insight into the most appropriate pedagogical and assessment approaches for helping students develop the competencies required by the public relations industry, thus increasing students’ employability. Further research at an industry level would help define the competencies and qualifications required, and additional research at an educational level could help set standards of best practice in public relations pedagogy.

References

Barnes, J. J., & Tallent, R. J. (2015). Think bubbles and Socrates: Teaching critical thinking to Millennials in public relations classes. Universal Journal of Educational Research, 3(7), 435-441. https://doi.org/10.13189/ujer.2015.030702 

Bartram, D. (2004). The SHL Universal Competency Framework [White paper]. SHL Group Limited. http://connectingcredentials.org/wp-content/uploads/2015/02/The-SHL-Universal-Competency-Framework.pdf

Bonk, C. J., & Graham, C. R. (2006). The handbook of blended learning: Global perspectives, local design. Pfeiffer.

Commission on Public Relations Education. (2018). Fast Forward: Foundations + future state. Educators + practitioners: The Commission on Public Relations Education 2017 Report on undergraduate education. http://www.commissionpred.org/wp-content/uploads/2018/04/report6-full.pdf

Daymon, C., & Holloway, I. (2011). Qualitative research in public relations and marketing communications (2nd ed.). Routledge.

Flynn, T. (2014). Do they have what it takes? A review of the literature on knowledge competencies and skills necessary for twenty-first-century public relations practitioners in Canada. Canadian Journal of Communication, 39, 361-384.

Foundation for Critical Thinking. (n.d.). Our conception and definition of critical thinking. http://www.criticalthinking.org/pages/our-conception-of-critical-thinking/411

Gregory, A. (2008). Competencies of senior communication practitioners in the UK: An initial study. Public Relations Review, 34(3), 215-223. https://doi.org/10.1016/j.pubrev.2008.04.005

Kember, D. (2009). Promoting student-centred forms of learning across an entire university. Higher Education, 58, 1-13. https://doi.org/10.1007/s10734-008-9177-6  

King, M. D. (2015, July 17). Why higher education and business need to work together. Harvard Business Review. https://hbr.org/2015/07/why-higher-ed-and-business-need-to-work-together 

Kolb, D. A. (2015). Experiential learning: Experience as the source of learning and development (2nd ed.). Pearson. 

Madigan, P. (2017). Practitioner perspectives on higher education as a preparation for employment in public relations in Ireland. [Doctoral thesis, University of Sheffield]. https://pdfs.semanticscholar.org/4b85/bf19a783bb645b80d78bf3232c35b4fe066e.pdf

National Qualifications Authority of Ireland. (2019). Grid of level indicators.   https://www.qqi.ie/Downloads/NFQLevelindicators.pdf

Oxford Dictionary of English. (n.d.). Acumen. https://en.oxforddictionaries.com/definition/acumen

Oxford Dictionary of English. (n.d.). Simulation. https://en.oxforddictionaries.com/definition/simulation

Patterson, F., Zibarras, L., & Ashworth, V. (2016). Situational judgement tests in medical education and training: Research, theory and practice: AMEE guide no. 100. Medical Teacher 38(1), 3-17. https://doi.org/10.3109/0142159X.2015.1072619 

Parkinson, M. G., & Ekachai, D. (2002). The Socratic method in the introductory PR course: An alternative pedagogy. Public Relations Review, 28(2), 167-174. https://doi.org/10.1016/S0363-8111(02)00123-6

Partridge, H., Ponting, D., & McCay, M. (2011). Good practice report: Blended learning. Australian Learning and Teaching Council. http://eprints.qut.edu.au/47566/1/47566.pdf 

Picciano, A. G. (2009). Blending with purpose: The Multimodal Model. Journal of Asynchronous Learning Networks, 13(1), 7-18. https://www.learntechlib.org/p/104026/ 

Price Waterhouse Coopers. (2018). Workforce of the future: The competing forces shaping 2030. https://www.pwc.com/gx/en/services/people-organisation/workforce-of-the-future/workforce-of-the-future-the-competing-forces-shaping-2030-pwc.pdf 

Ragas, M., & Culp, R. (2014, December 22). Public relations and business acumen: Closing the gap. Institute for Public Relations. https://instituteforpr.org/public-relations-business-acumen-closing-gap/ 

Schunk, D. (2012). Learning theories, An educational perspective (6th ed.). Pearson. 

Strategy Group. (2011). National strategy for higher education to 2030.  Department of Education and Skills. https://www.education.ie/en/publications/policy-reports/national-strategy-for-higher-education-2030.pdf

Sutherland, K., & Ward, A. (2018). Immersive simulation as a public relations pedagogical tool. Asia Pacific Public Relations Journal, 19, 66-82.

Theaker, A. (2016). The public relations handbook (5th ed.). Routledge.

Veil, S. R. (2010). Using crisis simulations in public relations education. Communication Teacher, 24(2), 58-62. https://doi.org/10.1080/17404621003680906

Wang, M. (2017). Using crisis simulation to enhance crisis management competencies: The role of presence. Journal of Public Relations Education, 3(2), 96-109. https://aejmc.us/jpre/2017/12/29/using-crisis-simulation-to-enhance-crisis-management-competencies-the-role-of-presence/

Appendix A

Situational Judgement Tests

The National Vegan Association has launched a national campaign to raise awareness on animal rights and promote veganism. The campaign includes high-visibility outdoor advertising activity that uses a range of emotive posters to encourage people to cease meat and dairy consumption and to convert to veganism.

The organisation’s spokesperson has been in the media (radio, TV, print and online) discussing the new ad campaign and the rationale behind it. The organisation’s central argument is that the widespread consumption of animal products is having a catastrophic effect on the environment. The source of the Vegan group’s funding is unclear. 

You are the public relations officer/consultant for the National Farmers’ Association, who view this as a potential crisis situation. The Farmers’ Association is concerned that the Vegan Association is communicating information that could be harmful to the business of its members.

Please outline your PR strategy in response to this crisis by responding to the following questions.

Please answer all questions with a view to what the best course of action should be and do not base your answers on your own personal beliefs. For example, if you yourself agree with the vegans or the meat-eaters, it is of no relevance to this test.

  1. Research

The first step in managing a crisis is to gather the facts. Rank the actions you would take in order of priority below. (1 = most effective, 2 = very effective, 3 = quite effective, 4 = slightly effective and 5 = least effective).

A: Check media (including social media) and analyse coverage.
B: Find out what the best practice is in your organisation and check if there is a precedent for this activity in other countries.
C: Pull together a crisis management team consisting of the most informed people in the organisation on this topic, brief them on the situation and acquire their feedback.
D: Contact a journalist for an “off-the-record” chat on the topic. Investigate the potential of running a negative story about the vegan group.
E: Contact the vegan group, away from media view, to discuss and try and silence the conversation.
  2. Key Messages

Your key messages should aim to present the organisation’s business objectives and protect the reputation of your organisation and its members.

Choose the THREE most appropriate key messages that you think would be most effective in your communication with the media (all three choices are equal in importance).

A: There are many benefits to eating meat and dairy.
B: Vegans are prone to various health issues.
C: The source of the vegan group’s funding is not clear.
D: The importance of farming and agriculture to the economy.
E: A list of top ten healthy meat and dairy recipes.
  3. Media Strategy

As part of its campaign, the vegan group has also cited a report stating that the public’s consumption of meat and dairy is harming the environment.

Please rank the most appropriate media approaches below (1 = most appropriate, 5 = least appropriate).

A: Host a press conference to announce your response to the ad campaign and state your case. Invite all media to attend.
B: Contact a select number of trusted journalists and arrange to set up feature interviews with them in which you set out your key messages and evidence-based arguments.
C: Contact a prime-time current affairs show and request a live debate between the heads of the two organisations.
D: Issue a press statement to all media criticising the vegan campaign and dismissing its arguments.
E: No comment.
  4. Response to Media

There has been some discussion in the media regarding the sources of funding for the vegan group’s campaign. The vegan group has not disclosed its sources.

During an interview, a journalist cites a recently published report in which it states that meat consumption must decrease significantly to avert a climate catastrophe. The journalist has asked you, as the representative for the Farmers’ Association, for your response to this report.

Choose the THREE most appropriate responses below:

A: Highlight the lack of transparency in the vegan group’s finances.
B: You agree that sustainable farming is important, but this country has one of the most sustainable records in the world.
C: Question the accuracy of the vegan group’s research.
D: Agree with the seriousness of some of the issues presented in the report, but outline the health benefits of meat and dairy consumption.
E: Present research and studies supporting meat and dairy consumption.
  5. Arguments

Your arguments should assist the interviewer and the listener/reader in understanding your key messages. Choose the THREE most appropriate arguments to support your key messages below:

A: An emeritus professor of agricultural policy at Trinity College Dublin has said that Ireland’s agriculture is mostly grassland-based and there is no need for a reduction of 90% in meat consumption.
B: A renowned economist from the London-based Institute of Economic Affairs, an organisation funded by the tobacco industry said that the potent combination of nanny state campaigners, militant vegetarians and environmental activists poses a real and present danger to a free society.
C: Prior to the release of the findings of this report, the Irish Prime Minister had said that he was cutting down on his meat consumption and increasing his intake of vegetables.
D: The Minister for the Environment, said it’s really important that agriculture has a long-term strategy as to how it can contribute to decarbonisation and be competitive in an environment when people’s choices and expectations may be different.
E: A report published by a renowned environmental group has outlined a clear strategy for the reduction of greenhouse gas emissions in this sector in Ireland.
  6. Aftermath

The immediate crisis is over and media attention has been diverted to another issue. Rank the most appropriate course of action for your organisation now (1= most appropriate, 5 = least appropriate).

A: Correct a journalist on one radio interview in which on one occasion, they used an incorrect name for one of your representatives.
B: Assess and analyse the media coverage and the reaction of your stakeholders/audiences.
C: Immediately launch a high visibility campaign informing people of the benefits of consuming meat and dairy.
D: Seek corrections in any significant inaccuracies in the media coverage.
E: Conduct research to support your arguments and launch a campaign promoting the benefits of consuming meat and dairy products.

Situational Judgement Test II

You are the public relations manager/communications officer for an international technology company and leading producer of smartphones.

One of your phone products, which is already on the market, has been found to have a defect in the batteries. The company has already sold over two million devices, but there have been reports of fires breaking out with some. As a result, all the phones now have to be recalled at a cost of over $5 million.

Please respond to the questions below to explain how you would manage this crisis.

Please answer all questions with a view to what the best course of action should be and do not base your answers on your own personal beliefs.

  1.  Research

The first step in managing a crisis is to gather the facts. Rank the first steps you would take to manage this crisis in order of priority below. (1 = most effective, 2= very effective, 3 = quite effective, 4 = slightly effective and 5 = least effective).

A: Check media (including social media) and analyse coverage.
B: Find out what the best practice is in your organisation and check if there is a precedent for this activity here or in other countries.
C: Pull together a crisis management team consisting of the most informed people in the organisation on this topic, brief them on the situation and acquire their feedback.
D: Contact a journalist for an “off-the-record” chat on the topic.
E: Contact the people who have been affected, away from the eyes of the media.
  2. Key Messages

Your key messages should aim to present the organisation’s business objectives.

Choose the THREE most appropriate key messages that you think would be most effective in your communication with the media (all three choices should be equal in importance).

A: We are conducting an investigation, which will result in the development of even better and safer phones.
B: Our phones aren’t the only ones on the market with safety concerns. There are some safety issues that we are aware of with competitor phones.
C: We have launched an investigation into the problem.
D: We can assure customers that there are no other phones or products at risk.
E: A list of the top five safety features of this product.
  3.  Media Strategy

You have conducted an extensive investigation into the issue and are now ready to release the results. Please rank the most appropriate media approaches below (1 = most appropriate, 5 = least appropriate).

A: Announce a press conference and invite all media to attend.
B: Contact a select number of trusted journalists and arrange to set up interviews with them in which you set out your key messages and evidence-based arguments.
C: Contact a prime-time current affairs show and request a live interview on the topic.
D: Issue a press statement to all media highlighting safety issues with competitor phones.
E: No comment.
  4.  Response to Media

In an interview about phone safety, a journalist has thrown you a curve-ball. The journalist has decided to ask you for your views on a recently published report from a reputable medical organisation into mobile phone usage. The report warns parents to limit screen-time for children due to health risks. The journalist has asked you, as the representative of a leading manufacturer of mobile devices, for your response to this report.

Choose the THREE most appropriate responses below:

A: Dismiss the findings of this report.
B: You agree that monitoring children’s phone usage is important.
C: Question the accuracy of this research.
D: Encourage responsible usage of phones amongst children.
E: Highlight some of the benefits of phone use for children, once usage is controlled by guardians.
  5.  Arguments

Choose the THREE most appropriate arguments to support your messages:

A: The Royal College of Paediatrics and Child Health recommended time-limits and a curfew on “screen-time,” but said parents need not worry that using the devices is harmful.
B: Experts say that looking at screens such as phones, tablets or computers in the hour before bed can disrupt sleep and impact children’s health and wellbeing. Spending long periods on the gadgets is also associated with unhealthy eating and a lack of exercise.
C: Parents are often told that gadgets can pose a risk to their children, but they can in fact be a valuable tool for children to explore the world. Nevertheless, screen time should not replace healthy activities such as exercising, sleeping and spending time with family.
D: A review published by the British Medical Journal found “considerable evidence” of an association between obesity and depression and higher levels of screen time.
E: Although there is growing evidence for the impact of phone usage on some health issues such as obesity, evidence on the impact of screen-time on other health issues is largely weak or absent.
  6.  Aftermath

The immediate crisis is over and media attention has been diverted to another issue. Rank the most appropriate course of action for your organisation now (1= most appropriate, 5 = least appropriate).

A: Correct a journalist on one radio interview in which on one occasion, they used an incorrect name for one of your representatives.
B: Assess and analyse the media coverage and the reaction of your stakeholders/audiences.
C:  Immediately launch a high visibility campaign informing people of the safety features of your phones.
D: Seek corrections in any significant inaccuracies in the media coverage.
E: Analyse the findings of the investigation and launch a campaign to communicate the findings and the new safety measures in place as a result.

Appendix B

Focus Group Questions

  1. Do you think you were well prepared for the interview simulation exercise? 
  2. Did you enjoy the interview simulation exercise?  
  3. What did you like most about it?  
  4. What did you like least about it? 
  5. Do you think you learnt from the exercise?
  6. What is the key thing that you think that you learnt from this experience and that you will take into the future when you graduate?  Give an example.
  7. Rate the experience from 1 (poor) to 5 (excellent) in terms of your enjoyment of the learning experience.
  8. Rate the experience from 1 (poor) to 5 (excellent) in terms of the learning you think you achieved.

© Copyright 2020 AEJMC Public Relations Division

To cite this article: O’Donnell, A. (2020). A simulation as a pedagogical tool for teaching professional competencies in public relations education. Journal of Public Relations Education, 6(2), 66-101. http://aejmc.us/jpre/2020/08/13/a-simulation-as-a-pedagogical-tool-for-teaching-professional-competencies-in-public-relations-education/

Millennial Learners and Faculty Credibility: Exploring the Mediating Role of Out-of-Class Communication

Editorial Record: Original draft submitted to the AEJMC-PRD Paper Competition by April 1, 2017. Selected as a Top Teaching Paper. Submitted to JPRE Nov. 27, 2017. Final edits completed July 13, 2018. First published online August 17, 2018.

Author

Carolyn Kim, Biola University

Abstract

Every generation experiences distinct events and develops unique values. As Millennial learners enter classrooms, they bring with them new views about education, learning and faculty/student communication. This study explores the mediating role of out-of-class communication (OCC) in relation to the historical dimensions known to compose faculty credibility. Findings indicate that OCC has a positive, mediating influence that enhances two of the three key dimensions of credibility for faculty members: trustworthiness and perceived caring. In addition, this study suggests that there is a fourth potential dimension that composes the construct of faculty credibility in the perspectives of Millennial learners: sociability, which should be included alongside the three historical dimensions scholars have used in previous studies.

Millennial Learners and Faculty Credibility: Exploring the Mediating Role of Out-of-Class Communication

The landscape of higher education constantly shifts. Shaping influences include increased faculty loads, diminished budgets, and limited resources (Kim, 2015; Swanson, 2008). A lesser-examined element, however, is the generational influence of Millennial learners. According to Pew Research Center, Millennials were born between 1981 and 1997 (Fry, 2016). As these students have filled classrooms, the educational environment and pedagogical approaches of faculty have pivoted to address the unique needs of Millennials (Kim, 2017b). One particular area of change is the emphasis on out-of-class communication (OCC) between faculty members and students. Scholars suggest OCC is a significant element for students, as it leads to increased learning and immediacy with faculty (Jaasma & Koper, 2002). Formerly, faculty were viewed as the “sage on the stage,” espousing wisdom for students to gain. Now they are viewed as a “guide on the side,” encouraged to facilitate a process in which students co-create the learning environment (Jaasma & Koper, 2002; Kim, 2017a). These changes have produced a new paradigm for learners, making it significant to re-examine the construct of faculty credibility in light of Millennial learners and to examine the mediating influence of OCC on that credibility.

LITERATURE REVIEW

In order to fully explore this issue, there are three significant bodies of scholarship to examine: 1) generational identity; 2) faculty credibility; 3) out-of-class communication.

Generational Identity

A growing focus among scholars has been the concept of how individuals self-subscribe into social groups within organizational settings. Scholars suggest social identities are self-designated by individuals “to impose order on the social environment and make sense of who they are” (Urick, 2012, p. 103). While there is significant focus in social identity theory that looks at classifications related to constructs such as in-groups and out-groups, race, and gender (Urick, 2012), there is an increasing need to understand generational identities, which can be defined as “an individual’s awareness of his or her membership in a generational group and the significance of this group to the individual” (Urick, 2012, p. 103).

Each generation has distinct values and attitudes that manifest in interactions with others in organizational settings (Smola & Sutton, 2002). Kowske, Rasch, and Wiley (2010) suggest that Millennial learners are connected because they shared key common experiences at significant developmental points, which led to unique characteristics:

Millennials embody an age-based generational identity that has grown through strong formative influences, including parental styles that allowed them a strong voice in family decisions, nurtured their egos and self-esteem, and encouraged cooperation and team-oriented behavior. (Gerhardt, 2016, p. 3)

Faculty have recognized these shaping influences in Millennial learners and suggest that a shift is required to provide “nuanced pedagogies” that will provide the strongest learning environment possible (Miller-Ott, 2016; Wilson & Gerber, 2008, p. 29).

Sociability and Millennial learners. With this shift in pedagogies, faculty are now tasked with creating learning environments in which Millennial learners feel comfortable contributing and voicing opinions, rather than approaching education as a lecture-based experience in which an instructor provides content for students to absorb (Gerhardt, 2016). In short, this kind of engaged learning environment is “essential to a successful experience for Millennials in the classroom, and this generation has a strong need to be heard, recognized and included” (Gerhardt, 2016, p. 4). Additionally, Millennial learners expect “more frequent, affirming communication with supervisors compared to previous generations” (Gerhardt, 2016, p. 4; Hill, 2002; Jokisaari & Nurmi, 2009; Martin, 2005). In other words, Millennial learners place a high value on sociability, or the opportunity to interact, connect, and engage with leaders. This value of sociability is higher than in previous generations and drastically influences their satisfaction, motivation, and commitment to environments (Gerhardt, 2016; Kim, 2017b). In some ways, the concept of sociability is closely aligned with the idea of immediacy.

Immediacy. Immediacy has been defined as “those communication behaviors that reduce perceived distance between people” (Thweatt & McCroskey, 1996, p. 198). A number of scholars have explored the influence of immediacy within the context of faculty/student relationships (e.g., Christensen & Menzel, 1998). In the context of Millennial learners, however, immediacy seems to incorporate concepts that were not as prevalent for earlier generations. Thus, sociability, or the desire to have a voice, receive feedback, and interact, is a key component of Millennial learners’ perspective of immediacy. In the context of this paper, sociability is used to represent immediacy viewed through the lens of Millennial learners’ expectation of two-way communication, which includes gaining a voice in decision making.

In summary, Millennial learners represent an age-based generational identity that is prevalent in higher education today. Millennial learners place a high value on participatory culture, having their voices heard, and developing immediacy with those who lead them, traits that distinguish them from previous generations of learners. It is reasonable, therefore, to expect that these values would influence the overall perception of a faculty person’s credibility.

Faculty Credibility

Research indicates that faculty credibility plays a significant part in the educational process (Kim, 2017b). For example, student perceptions of faculty credibility influence evaluations of courses (Tindall & Waters, 2017). With the new wave of technology, scholars have also examined how faculty use of social media within a course influences perceptions of the faculty member’s credibility (DeGroot, Young, & VanSlette, 2015). Examining the role of faculty credibility becomes more salient when placed in the larger context of a theoretical framework for credibility.

The construct of credibility has a rich history in communication scholarship. This construct is a composite of perspectives held by receivers of communication toward a particular source, message or medium (Newell & Goldsmith, 2001, p. 236). Credibility is a fluid construct, as it is based on perceptions held by individuals instead of a set state of being. Thus, scholars use dimensions that contribute to individuals perceiving something as credible in order to understand the specific components that enhance or diminish credibility (Kim & Brown, 2015). Scholars examine the construct of credibility through specific categories such as source credibility (Berlo, Lemert, & Mertz, 1969; Hovland, Janis, & Kelley, 1953; McCroskey, 1966), media or medium credibility (Gaziano & McGrath, 1986; Kiousis, 2001; Meyer, 1998; West, 1994) and message credibility (Appelman & Sundar, 2016; Kim & Brown, 2015). Scholars focusing on faculty credibility do so using the dimensions from source credibility.

Historically, scholars suggested that the two primary dimensions present in source credibility were trustworthiness and expertise (Hovland & Weiss, 1951; Teven & McCroskey, 1997). Trustworthiness is a dimension where receivers perceive that a source will keep promises, fulfill obligations, and act in a manner consistent with what is communicated. Expertise deals with competencies, qualifications, and skills. While these two dimensions have consistently been shown to be significant in a receiver’s perceptions of a source’s credibility, there is a third dimension that has recently been measured as a distinct dimension for faculty credibility: perceived caring.

Perceived caring. The concept of goodwill has been present in the construct of source credibility since its inception with Aristotle’s rhetoric and discussion of ethos (Teven & McCroskey, 1997). Scholars suggest that goodwill, caring, or affinity (all terms applied to the same concept) is the perception of whether someone genuinely cares about an individual, which is decidedly different from trustworthiness of the overall source (Kim, 2017a). Initially, scholars suggested that measurements of goodwill correlated too highly with the dimension of trustworthiness to be truly distinctly measurable. However, in 1997, Teven and McCroskey created a scale that successfully measured “perceived caring” as a distinct dimension, and thus they argued for its inclusion as a third piece of faculty credibility. The concept of “perceived caring” (McCroskey, 1992; McCroskey & Teven, 1999; Teven & McCroskey, 1997) for this study is defined as immediacy, or the feeling of closeness due to the perception of personal care.

While McCroskey and Teven (1999) argued for “perceived caring” to represent the third and final dimension of source credibility, this construct does not fully capture the new value Millennial learners place on interaction. While perceived caring is based on perceptions of the faculty member toward the student, sociability focuses on the two-way communication and role of student voice within interactions. This distinction is important to the overall construct of faculty credibility. Thus, sociability is used to represent a fourth dimension to perceived source credibility that will be unique to Millennial learners.

Lastly, in recent years, perceptions of faculty members’ credibility and their interest in students have been a growing focus among scholars. The concept of OCC is regularly identified as an influence in faculty/student relationships and may provide a powerful mediating influence on Millennial learners’ perspectives of credibility.

Out-of-Class Communication

What takes place inside a classroom is only a partial view of what influences student learning. Over the last several years, scholars have increasingly focused on understanding out-of-class communication and its impact on areas such as student motivation, student retention, student/faculty trust, and immediacy (Jaasma & Koper, 2002; Kim, 2017a; Kim, 2017b; Terenzini, Pascarella, & Blimling, 1996).

Dimensions of OCC.  Like many constructs that deal with humans, OCC is multi-faceted and cannot be understood simply as a one-dimensional activity. For example, OCC can be either formal or informal communication between a student and faculty member that occurs outside of the classroom. An example of formal OCC would be a student attending office hours, whereas an example of informal OCC would be a student sending a text to a professor (Furlich, 2016).  Beyond classifying OCC into formal or informal communication patterns, it is also evaluated on criteria such as frequency of occurrences, length, content, and student satisfaction (Jaasma & Koper, 1999). Building on these dimensions are also the perspectives, values and ideals of the individuals involved, including both faculty members and students.

Faculty behaviors and OCC. The behavior of an individual instructor also has an impact on OCC. Teacher behaviors in a classroom have been shown to influence students’ perceptions of quality, trust, and immediacy, and, ultimately, a student’s decision to engage in OCC with a specific faculty member (Faranda, 2015; Kim, 2017b). Conversely, just as faculty behaviors can enhance learning, Thweatt and McCroskey (1996) identified faculty “misbehaviors” as activities that interfere with learning. Misbehaviors need not be overtly intentional actions that interfere with students; they may also encompass more subtle activities, such as actions that communicate a sense of distance or disinterest in student interaction (p. 199).

Understanding the multi-faceted nature of OCC theory, it is logical to expect a connection between the perceptions students hold of OCC and the perceptions they hold of faculty credibility. Scholars have explored these two constructs and verified that they seem to be correlated in some manner (Gerhardt, 2016; Kim, 2017a; Myers, 2004). In light of this connection, examining the construct in light of Millennial learner expectations is also important.

In light of the existing body of research, as well as the gap in understanding Millennial learners’ perceptions of faculty credibility and the mediating role of OCC, the following research questions guided this study:

RQ1: In what ways does OCC influence Millennial learners’ perspectives of faculty credibility?

RQ2: In what ways does OCC enhance the perceived sociability between Millennial learners and their faculty?

H1: The more students believe that faculty are A) more trustworthy, B) more of an expert, and C) have a greater affinity for students because of OCC, the more likely they are to rate faculty higher on final evaluations.  

H2: The more students believe that faculty are genuinely interested in their lives because of OCC, the more likely they are to rate faculty higher on final evaluations.

H3: The more students believe that faculty are A) more trustworthy, B) more of an expert, C) have a greater affinity for students, and D) possess a genuine interest in their individual life because of OCC, the more likely they are to rate faculty higher on final evaluations.

METHOD

To address these research questions, an online survey was employed using Survey Monkey, a well-known survey platform. With approval from the Institutional Review Board, participants were recruited via email from a private university in the spring 2017 semester. Participants were recruited from all majors and class ranks and were not compensated for participation in the survey. In addition, participants could opt out at any point or skip questions on the survey instrument.

Participant Demographics

A total of 289 qualified responses were collected. Of those who reported gender, 29.9% (n = 86) were male and 69.9% (n = 201) were female. Of those who identified class rank, 13.1% were freshmen (n = 38); 22.1% were sophomores (n = 81); 34.9% were juniors (n = 81); and 34.9% were seniors (n = 101). Participants represented all seven schools at the university and 30 majors, including Public Relations, Journalism and Integrated Media, Business Administration, Communication Studies, Nursing, Intercultural Studies, Education, Cinema and Media Arts, Biological Sciences, Anthropology, and others. By sampling a variety of majors, participants were able to represent the diversity in degree programs and student personalities, allowing for the results to be more representative of an entire student body.

Instrument Design

In addition to the demographic information collected, participants also responded to Likert-scale items related to credibility and OCC. Three scale items related to previously identified dimensions of faculty credibility (trust, expertise, and perceived caring) were used in the survey instrument. Since scholars have previously identified that these three dimensions are present and distinct within the construct of faculty credibility, it was important to include them each as a scale item (Teven & McCroskey, 1997). Each item asked participants to evaluate whether OCC resulted in an increased perception of the particular dimension.

In addition, this study sought to measure the way in which OCC would influence all three of these dimensions as a unified construct. In order to evaluate the combined influence, a fourth scale item asked students to respond to whether OCC would likely lead them to rate faculty higher on evaluations. This is an important measurement, as previous research has shown that credibility is “positively correlated with students’ overall rating of the level of excellence of the course and instructor” (Beatty & Zahn, 1990, p. 275). Knowing that scholars previously found credibility to influence faculty evaluations, it was significant to measure whether OCC had a positive, mediating impact on the evaluation as well.

Finally, in light of the new findings related to Millennial learners (Gerhardt, 2016), this study incorporated a scale item related to sociability. Participants rated whether they felt that faculty who engaged with them through OCC “genuinely cared about their lives” more than faculty who did not engage in OCC.

ANALYSIS

RQ1: In what ways does OCC influence Millennial learners’ perspectives of faculty credibility?

In order to address the first research question, three Likert-scale questions were used, based on the three commonly identified dimensions of faculty credibility: trustworthiness, expertise and perceived caring. These questions were posed to assess whether students who experienced OCC were likely to have increased perceptions of specific dimensions related to faculty credibility. Each scale question specifically asked whether, in light of out-of-class communication, the participant perceived trustworthiness, expertise, or perceived caring to be greater.

Trust

Out of the 287 participants who responded, 78.4% (n = 225) either agreed or strongly agreed that they trust faculty who are willing to meet with students outside of class more than faculty who do not meet with students outside of class. The mean for this scale item was 4.02.

Expertise

Out of the 288 participants who responded, only 18.8% (n = 54) either agreed or strongly agreed that faculty who are willing to meet with students outside of class are more of an expert in their field than faculty who do not meet with students outside of class. The mean for this scale item was 2.54.

Perceived Caring

Out of the 288 participants who responded, 68.8% (n = 198) either agreed or strongly agreed that faculty who are willing to meet with students outside of class care more about students than faculty who do not meet with students outside of class. The mean for this scale item was 3.72.

Internal Reliability of Scale

While these three dimensions have previously been shown to influence faculty credibility within the classroom, it was important to verify the internal consistency, or reliability, of these dimensions in relation to the credibility scale and OCC. Cronbach’s alpha for the scale was .68, indicating moderate internal reliability. In addition, no pair of scale items was highly correlated (r > .60), indicating that they did, in fact, measure distinct dimensions.
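For instructors reproducing this kind of reliability check, Cronbach’s alpha and the inter-item correlations can be computed directly from raw item responses. A minimal sketch in Python using the standard alpha formula; the Likert responses below are simulated for illustration, not the study’s data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses for three credibility items
# (trust, expertise, perceived caring) -- simulated, NOT the study's data.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(50, 1))                     # shared tendency
data = np.clip(base + rng.integers(-1, 2, size=(50, 3)), 1, 5)

alpha = cronbach_alpha(data)
corr = np.corrcoef(data, rowvar=False)  # 3x3 inter-item correlation matrix
```

Checking that the off-diagonal entries of `corr` stay below a threshold such as .60 mirrors the distinctness check reported above.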

RQ2: In what ways does OCC enhance the perceived sociability between Millennial learners and their faculty?

A majority of students (84.75%; n = 239) agreed or strongly agreed with the statement that when a professor interacts with them outside of class, it indicates faculty are genuinely interested in individual students’ lives. The mean for this Likert-scale item was 4.12.

H1: The more students believe that faculty are A) more trustworthy, B) more of an expert, and C) have a greater affinity for students because of OCC, the more likely they are to rate faculty higher on final evaluations.

While 69.3% (n = 194) of the participants either agreed or strongly agreed that they rate faculty higher on course evaluations if they interact outside of class, it is useful to also examine the influence of the dimensions of credibility on this scale item. This hypothesis was used to examine the influence of OCC and credibility on perceived faculty excellence.  

This hypothesis was supported: F = 19.92, df = 3, p < .001. The factor with the greatest influence on whether students were likely to rate faculty higher on evaluations due to OCC was the belief that faculty who are willing to meet outside of class care more about students (affinity).

H2: The more students believe that faculty are genuinely interested in their lives because of OCC, the more likely they are to rate faculty higher on final evaluations.

Hypothesis 2 was also supported, F = 50.54, df = 1, p < .001.

H3: The more students believe that faculty are A) more trustworthy, B) more of an expert, C) have a greater affinity for students, and D) possess a genuine interest in their individual life because of OCC, the more likely they are to rate faculty higher on final evaluations.

The third hypothesis was supported as well, F = 22.94, df = 4, p < .001.
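The hypothesis tests above appear to be omnibus F-tests from linear models regressing the evaluation item on the credibility items (the df values match the number of predictors). A minimal sketch of how such an F statistic and p-value can be computed, using simulated data rather than the study’s responses:

```python
import numpy as np
from scipy import stats

def ols_f_test(X, y):
    """Omnibus F-test for a linear model y ~ X (X holds predictors only)."""
    n, p = X.shape
    X1 = np.column_stack([np.ones(n), X])          # prepend intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # OLS coefficients
    resid = y - X1 @ beta
    ss_res = resid @ resid                         # residual sum of squares
    ss_tot = ((y - y.mean()) ** 2).sum()           # total sum of squares
    f = ((ss_tot - ss_res) / p) / (ss_res / (n - p - 1))
    p_val = stats.f.sf(f, p, n - p - 1)            # upper-tail p-value
    return f, p_val

# Simulated example: three predictor items and an outcome that depends on them
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([0.5, 0.1, 0.8]) + rng.normal(size=200)
f, p_val = ols_f_test(X, y)
```

With real survey data, `X` would hold the Likert responses for the credibility items and `y` the evaluation-rating item.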

DISCUSSION

OCC and Faculty Credibility

While OCC has previously been shown to have a strong connection with faculty credibility and student learning (Jaasma & Koper, 2002; Kim, 2017a), this study leads to a more precise understanding of the way OCC enhances credibility. Participants indicated that they are much more likely to perceive faculty members as trustworthy and to perceive care from faculty who engage in OCC. However, expertise is not a dimension that seems to be particularly influenced through OCC. So, while OCC does enhance students’ perceptions of credibility, it does so by increasing perceptions of two of the three dimensions. While participants indicated that OCC would have the greatest influence on trust, when it comes to evaluating a professor, the perception that faculty who engage in OCC care more about students seems to play the greatest role in evaluations. This indicates that, while trust is built through OCC, when students determine the overall excellence of a faculty person, perceived care plays the most significant part. This study supports the idea that, while faculty credibility is a fluid set of perceptions that is heavily influenced by in-class behaviors, faculty who choose to engage in OCC have a significant opportunity to build trust and illustrate care for students.

OCC and Faculty Sociability

In addition to previously identified measures, faculty sociability seems to be a particularly poignant component of the educational experience for Millennials (Gerhardt, 2016). In light of this, it was important to understand how OCC may influence the perception of sociability. Participants reported not only that OCC would significantly influence their perception of a faculty person genuinely caring about their lives, but also that this would result in higher evaluations of that faculty member. This seems to indicate that, beyond the existing dimension of perceived caring, genuine interest in the individual student’s life is a shaping factor for student perceptions of faculty. Recognizing that source credibility is a construct that evaluates whether the perceptions of a receiver toward a source will result in changed attitudes, opinions, or behaviors, there is strong theoretical support for considering whether sociability should be a fourth dimension of faculty credibility (Hovland et al., 1953). Findings indicate that incorporating sociability alongside the three existing dimensions did not result in highly correlated variables and, as a unified construct, provided a model that led to higher evaluations of a faculty person.

Theoretical Contributions

This study provides two significant theoretical contributions. First, it expands the construct of faculty credibility in the context of Millennial learners to suggest the inclusion of a fourth dimension: sociability. Second, it advances the understanding of OCC as a pedagogical approach by identifying it as a positive, mediating influence on the perception of faculty credibility.

Faculty credibility theory. Historically, faculty focused largely on establishing expertise and trust with students. More recently, however, faculty have begun focusing on the dimension of perceived caring. With Millennials filling classrooms, it is more important than ever to understand which dimensions truly build their perceptions of credibility. Beyond simply goodwill or affinity for students, Millennials are looking for personalized interest and connection. They want a voice in their educational process and to know their contributions are heard. In addition, they want leaders, or, in this case, faculty, who are authentically interested in their personhood. This study goes beyond calling for sociability as something that Millennial learners value and instead identifies it as something at the heart of their perspective toward faculty. If faculty fail to demonstrate sociability, or a genuine interest and engagement with Millennial learners, their credibility will be diminished. Furthermore, this may result in misbehaviors (Thweatt & McCroskey, 1996) that ultimately diminish learning and reduce the impact of what faculty set out to do in the first place.

OCC mediating faculty credibility. A further contribution of this study is the finding that not only are OCC and faculty credibility interconnected, OCC actually mediates the perceptions of faculty credibility in Millennial learners. Participants identified that they view the trustworthiness, perceived care, and sociability of faculty members to be greater when they engage in OCC compared to those who do not. In other words, this study confirms that OCC is a direct mediator of increased perceptions of credibility. Moving forward, faculty may benefit from recognizing that OCC can play a pivotal role in pedagogical practices. Those who do not purposefully engage in OCC may end up experiencing students who perceive them as less credible, particularly when compared against other faculty who have adopted this pedagogical approach.

Future Research and Limitations

This study has made two significant contributions to theoretical frameworks. First, it has suggested that, for Millennial learners, sociability is a key dimension of faculty credibility. Second, it suggests that OCC is a positive, mediating factor in developing faculty credibility. Future research should explore these two constructs by examining them across a variety of college campuses, as well as incorporating additional scale components that may measure the validity of each of these elements in relation to the existing concept of faculty credibility.

There were several limitations within this study. First, the study took place at a private institution. It would be beneficial to expand the participants and include a variety of institutional types to validate the findings. Additionally, this study did not control for factors such as previous interactions with highly social (or not social) faculty members and the way those interactions might have influenced participants’ perceptions within this study. Finally, self-reported measures on behavioral outcomes have the potential to differ from ways people might actually respond. In light of this, while students reported certain behavioral intentions, it would be beneficial to conduct additional research to see if those self-reported concepts align with real-world application.

CONCLUSION

While source credibility has a rich history of scholarship, the presence of Millennial learners suggests that the current approach to faculty credibility needs to be adjusted. Their values are distinct compared to other generations and, thus, their perspectives on what makes faculty members credible are equally distinct. While trustworthiness, expertise, and perceived caring continue to be important, the addition of sociability changes the current model. Additionally, OCC is more than simply an enhancement to student motivation or learning. It, in fact, enhances perceptions of credibility by bolstering the dimensions of trustworthiness, perceived care, and sociability. Thus, engaging in OCC seems to be more than a pedagogical approach; this study indicates it may be a crucial component for faculty who hope to have a meaningful influence on Millennial learners.

REFERENCES

Appelman, A., & Sundar, S. S. (2016). Measuring message credibility: Construction and validation of an exclusive scale. Journalism & Mass Communication Quarterly, 93(1), 59-79. https://doi.org/10.1177/1077699015606057

Beatty, M. J., & Zahn, C. J. (1990). Are student ratings of communication instructors due to “easy” grading practices?: An analysis of teacher credibility and student‐reported performance levels. Communication Education, 39, 275-282. https://doi.org/10.1080/03634529009378809

Berlo, D. K., Lemert, J. B., & Mertz, R. J. (1969). Dimensions for evaluating the acceptability of message sources. Public Opinion Quarterly, 33, 563- 576. https://doi.org/10.1086/267745

Christensen, L. J., & Menzel, K. E. (1998). The linear relationship between student reports of teacher immediacy behaviors and perceptions of state motivations, and of cognitive, affective, and behavioral learning. Communication Education, 4, 82-90. https://doi.org/10.1080/03634529809379112

DeGroot, J. M., Young, V. J., & VanSlette, S. H. (2015). Twitter use and its effect on student perception of instructor credibility. Communication Education, 64, 419-437. https://doi.org/10.1080/03634523.2015.1014386

Faranda, W. T. (2015). The effects of instructor service performance, immediacy, and trust on student–faculty out-of-class communication. Marketing Education Review, 25, 83-97. https://doi.org/10.1080/10528008.2015.1029853

Fry, R. (2016, April 25). Millennials overtake Baby Boomers as America’s largest generation. Pew Research Center. Retrieved from http://www.pewresearch.org/fact-tank/2016/04/25/millennials-overtake-baby-boomers/

Furlich, S. (2016). Understanding instructor nonverbal immediacy, verbal immediacy, and student motivation at a small liberal arts university. Journal of the Scholarship of Teaching and Learning, 16(3), 11-22. https://doi.org/10.14434/josotl.v16i3.19284

Gaziano, C., & McGrath, K. (1986). Measuring the concept of credibility. Journalism Quarterly, 63, 451-462. https://doi.org/10.1177/107769908606300301

Gerhardt, M. W. (2016). The importance of being…social? Instructor credibility and the Millennials. Studies in Higher Education, 41, 1533-1547. https://doi.org/10.1080/03075079.2014.981516

Hill, R. P. (2002). Managing across generations in the 21st century: Important lessons from the ivory trenches. Journal of Management Inquiry, 11(1), 60-66. https://doi.org/10.1177/1056492602111020

Hovland, C. I., Janis, I. L., & Kelley, H. H. (1953). Communications and persuasion: Psychological studies in opinion change. New Haven, CT: Yale University.

Hovland, C., & Weiss, W. (1951). The influence of source credibility on communication effectiveness. Public Opinion Quarterly, 15, 635-650. https://doi.org/10.1086/266350

Jaasma, M., & Koper, R. (2002). Out-of-class communication between female and male students and faculty: The relationship to student perceptions of instructor immediacy. Women’s Studies in Communication, 25, 119-137. https://doi.org/10.1080/07491409.2002.10162443

Jokisaari, M., & Nurmi, J. E. (2009). Change in newcomers’ supervisor support and socialization outcomes after organizational entry. Academy of Management Journal, 52, 527-544. https://doi.org/10.5465/amj.2009.41330971

Kim, C. (2017a). Out-of-class communication and personal learning environments via social media: Students’ perceptions and implications for faculty social media use. Teaching Journalism & Mass Communication, 7(1), 62-76. Retrieved from http://aejmc.us/spig/wp-content/uploads/sites/9/2017/01/tjmc-w17-kim.pdf

Kim, C. (2017b). Millennial learners and out-of-class communication: Expectations and perceptions. Teaching Journalism and Mass Communication, 7(2), 23-31. Retrieved from http://aejmc.us/spig/wp-content/uploads/sites/9/2017/11/tjmc-2017-7-2-kim.pdf

Kim, C. (2015). Pedagogical approaches to student-run PR firms using service learning: A case study. Teaching Journalism & Mass Communication, 5(1), 57-68. Retrieved from http://aejmc.us/wp-content/uploads/sites/9/2015/07/tjmc-s15-kim.pdf

Kim, C., & Brown, W. (2015). Conceptualizing credibility in social media spaces of public relations. Public Relations Journal, 9(4). Retrieved from https://prjournal.instituteforpr.org/wp-content/uploads/2016v09n04KimBrown.pdf

Kiousis, S. (2001). Public trust or mistrust? Perceptions of media credibility in the information age. Mass Communication and Society, 4, 381-403. https://doi.org/10.1207/S15327825MCS0404_4

Kowske, B. J., Rasch, R., & Wiley, J. (2010). Millennials’ (lack of) attitude problem: An empirical examination of generational effects on work attitudes. Journal of Business and Psychology, 25, 265-279. https://doi.org/10.1007/s10869-010-9171-8

Martin, C. A. (2005). From high maintenance to high productivity: What managers need to know about Generation Y. Industrial and Commercial Training, 37, 39-44. https://doi.org/10.1108/00197850510699965

McCroskey, J. C. (1992). Reliability and validity of the willingness to communicate scale. Communication Quarterly, 40, 16-25. https://doi.org/10.1080/01463379209369817

McCroskey, J. C. (1966). Scales for the measurement of ethos. Speech Monographs, 33, 65-72. https://doi.org/10.1080/03637756609375482

McCroskey, J. C., & Teven, J. J. (1999). Goodwill: A reexamination of the construct and its measurement. Communications Monographs, 66, 90-103. https://doi.org/10.1080/03637759909376464

Meyer, P. (1988). Defining and measuring credibility of newspapers: Developing an index. Journalism Quarterly, 65, 567-572. https://doi.org/10.1177/107769908806500301

Miller-Ott, A. E. (2016). Helicopter parenting, family communication patterns, and out-of-class communication with college instructors. Communication Research Reports, 33, 173-176. https://doi.org/10.1080/08824096.2016.1154836

Myers, S. A. (2004). The relationship between perceived instructor credibility and college student in-class and out-of-class communication. Communication Reports, 17, 129-137. https://doi.org/10.1080/08934210409389382

Newell, S. J., & Goldsmith, R. E. (2001). The development of a scale to measure perceived corporate credibility. Journal of Business Research, 52, 235-247. https://doi.org/10.1016/S0148-2963(99)00104-6

Smola, K. W., & Sutton, C. D. (2002). Generational differences: Revisiting generational work values for the new millennium. Journal of Organizational Behavior, 23, 363-382. https://doi.org/10.1002/job.147

Swanson, J. (2008). Training future PR practitioners and serving the community through a “learn by doing” undergraduate university curriculum. Public Relations Quarterly, 52(3), 15-20.

Terenzini, P. T., Pascarella, E. T., & Blimling, G. S. (1996). Students’ out-of-class experiences and their influence on learning and cognitive development: A literature review. Journal of College Student Development, 37, 149-162.

Teven, J. J., & McCroskey, J. C. (1997). The relationship of perceived teacher caring with student learning and teacher evaluation. Communication Education, 46, 1-9. https://doi.org/10.1080/03634529709379069

Thweatt, K. S., & McCroskey, J. C. (1996). Teacher nonimmediacy and misbehavior: Unintentional negative communication. Communication Research Reports, 13, 198–204. https://doi.org/10.1080/08824099609362087

Tindall, N. T., & Waters, R. D. (2017). Does gender and professional experience influence students’ perceptions of professors? Journalism & Mass Communication Educator, 72, 52-67. https://doi.org/10.1177/1077695815613932

Urick, M. J. (2012). Exploring generational identity: A multiparadigm approach. Journal of Business Diversity, 12, 103-115.

West, M. D. (1994). Validating a scale of the measurement of credibility: A covariance structure modeling approach. Journalism Quarterly, 71, 159-168. https://doi.org/10.1177/107769909407100115

Wilson, M., & Gerber, L. E. (2008). How generational theory can improve teaching: Strategies for working with the “millennials.” Currents in Teaching and Learning, 1(1), 29-44. Retrieved from https://pdfs.semanticscholar.org/4aec/f98b4cd5c7dad19e27f1bd85d5befd3e3121.pdf