
Mind The Gap: An Exploratory Case Study Analysis of Public Relations Student Intern and On-Site Supervisors’ Perceptions of Job Skills and Professional Characteristics

Author

Thomasena Shaw, Bridgewater State University

Abstract

Internships have significant early career advantages for undergraduates, including less time spent finding a first position, increased monetary compensation and greater overall job satisfaction. Considerable professional and scholarly evidence highlights the important role of undergraduate internships, as well as gaps that exist between students and supervisors regarding the relative importance of specific job skills and professional characteristics. While previous studies have explored the underlying feelings and expectations of the two groups in professional and academic contexts, this exploratory case study uses coorientation as the theoretical framework to examine the levels of agreement, congruency and accuracy that exist between them in relation to key job skills and professional characteristics linked with career success; it also provides insight into the extent to which respondents perceive that the internship improved students’ college-learning outcomes. The key findings indicate that the majority of respondents believed the experience improved performance in relation to college learning outcomes. The study also found that students and supervisors are accurately cooriented with one another on job skills items, but less so when it comes to professional characteristics. This could be particularly problematic for student interns, as misperceptions and misunderstanding can lead to missed opportunities for collaboration and integration, and/or a self-fulfilling prophecy in which supervisors’ lack of coorientation damages the possibility of a cooperative relationship with current and future student interns, and with the academic programs that bring them together.


The internship experience is broadly regarded by practitioners and educators as a critical event that often serves as a transition to an entry-level position (Gault, Redington, & Schlager, 2000; Gibson, 2001) and better employment opportunities for students (Knouse & Fontenot, 2008; Knouse, Tanner, & Harris, 1999; Redeker, 1992; Taylor, 1988). Internships improve college performance via experiential learning (Cantor, 1997; Ciofalo, 1989; McCarthy, 2006), improve personal habits such as time management and dependability (Sapp & Zhang, 2009; Taylor, 1988), have the potential to strengthen academic programs via service learning and citizenship (Fall, 2006; Mendel-Reyes, 1998), and help students make valuable connections with industry (Tovey, 2001) and community partners (Bringle, 2002; Soska & Butterfield, 2013). Internships provide students with a unique opportunity to gain valuable interpersonal, social, and contextual attitudes necessary for entry into non-academic settings (Anson & Forsberg, 1992), and crystallize personal interests and career ambitions (Coco, 2000).

Professional and scholarly evidence suggests a gap exists between students and supervisors regarding the relative importance of specific job skills and professional characteristics (CPRE, 1999; CPRE, 2006; Daugherty, 2011; Neff, Walker, Smith, & Creedon, 1999; Todd, 2014). While these and other studies have explored the underlying feelings and expectations of the two groups in professional and academic contexts, this study uses coorientation as the theoretical framework. Specifically, the researcher examines the levels of accuracy, congruency and agreement that exist between the two groups in relation to a number of job skills and professional characteristics considered necessary for a positive internship experience and future career success. The results are intended to extend existing understanding of the topic and suggest intentional changes to course design and dialogue regarding teaching practices that could improve student learning outcomes – ultimately laying the groundwork for the two groups to “coorient” toward one another accurately.

In the next section of this paper, a review of literature defines and examines the benefits of the internship experience, explores it in a public relations program context, and outlines the study’s theoretical framework: coorientation. Next, the researcher outlines the survey methodology employed, describes results, and discusses implications of the findings.

LITERATURE REVIEW 

Benefits of the Internship Experience

Internships help students transition to entry-level positions (Gault, Redington, & Schlager, 2000; Gibson, 2001), improve interconnections between service learning and citizenship education (Fall, 2006; Mendel-Reyes, 1998), and have the potential to strengthen relationships between the academy and business and community partners (Tovey, 2001). An article in The Chronicle of Higher Education states that academic internships are valuable partnerships that allow students to collaborate closely with faculty, and strengthen ties between the academy and the community—whether students are paid or not (Westerberg & Wickersham, 2011). Regarding the benefits to the organization, internships provide direct business contact for students in an employment setting (Gupta, Burns, & Schiferl, 2010), prepare students with realistic expectations of their future careers, and an opportunity to gain on-the-job experience (Paulins, 2008). They provide additional well-educated, talented labor capacity (Brindley & Ritchie, 2000; Callanan & Benzing, 2004; Mihail, 2006), “compensation efficiencies” (Maertz, Stoeberl, & Marks, 2014), and an opportunity to see how much potential a student has in the field before hiring them (Coco, 2000). Indeed, Watson (1995) estimated that it is $15,000 per person less expensive to hire interns than to recruit and select candidates from an at-large pool. Maertz, Stoeberl, and Marks (2014) assert that interns are often more loyal toward the company and stay longer than the average non-intern hire.

College Internship Experiences Defined

The earliest recorded college-endorsed employment program was established in 1906 at the University of Cincinnati’s Cooperative Education Program (Thiel & Hartley, 1997). Typical contemporary internship programs have the following attributes: they offer a specific number of work hours, paid or unpaid employment, credit for college classes, supervision by a faculty coordinator or other university contact, and supervision by an organization mentor (DiLorenzo-Aiss & Mathisen, 1996; Gault, Redington, & Schlager, 2000; Roznowski & Wrigley, 2003). To maximize the internship experience, Coco (2000) asserts that students should be held accountable for projects and deadlines. Lubbers and Bourland-Davis (2012) suggest that on-site supervisors should provide incoming interns with some kind of orientation, where goals are clearly articulated, and with access to regular meaningful feedback. This type of internship experience resembles what Kuh (2008) describes as high-impact practices when they are effortful, help students build substantive relationships, help students engage across differences, provide students with rich feedback, help students apply and test what they are learning in new situations, and provide opportunities for students to reflect on the people they are becoming.

Divine, Linrud, Miller, and Wilson (2007) indicate that approximately 90% of U.S. colleges offer internships or similar experiential opportunities. In 2016, a US News and World Report survey of 324 ranked colleges and universities found that on average 40% of the undergraduate class of 2014 had internship experience. At the eight schools with the highest rates of participation, 100% of undergraduates completed an internship (Smith-Barrow, 2016). A National Association of Colleges and Employers (NACE, 2016) report found that more than 56% of students from the class of 2015 who participated in an internship had received at least one job offer by April of that year (compared to only 36.5% of undergrads who did not have an internship) and that the intern conversion rate was 51.7%.

The Internship Experience in a Public Relations Program Context

Internships are strongly encouraged and valued among both public relations educators and employers; the experience lends credibility to university public relations programs (Van Leuven, 1989a), and allows students to observe public relations practitioners in the roles of manager, strategist, planner, problem solver and counselor to management (Baxter, 1993). Lubbers, Bourland-Davis and Rawlins (2008) describe it as a process of socialization through which interns learn the values associated with the profession.

The industry’s largest organization of public relations professionals, the Public Relations Society of America (PRSA), encourages internships as a key way for students to enhance their education, résumé, portfolio, networking, and technical skills (Beebe, Blaylock, & Sweetser, 2009). A national study conducted by the Commission on Public Relations Education entitled “A Port of Entry” recommends a supervised work experience as one of the core courses for students majoring or pursuing an emphasis in public relations (CPRE, 1999); the Accrediting Council on Education in Journalism and Mass Communications (ACEJMC) also advocates and encourages opportunities for internship and other professional experiences outside the classroom (ACEJMC, 2013).

Research also supports the notion that a quality public relations internship increases job satisfaction after graduation (Horowitz, 1997), is a necessity for mass communication students making the transition from college to career (Beard & Morton, 1999), and is typically favored by students as a way to seek mentoring and make contacts (Basow & Byrne, 1993).

Beard and Morton (1999) identify six predictors of internship success in a public relations context: (1) academic preparedness, (2) proactivity/aggressiveness, (3) positive attitude, (4) quality of worksite supervision, (5) organizational practices and policies, and (6) compensation. With regard to the discipline-specific skills supervisors believed most necessary for public relations interns, Brown and Fall (2005) identified writing, oral, and organizational skills, and note that the most valued professional characteristics were intangible: motivation and “healthy, upbeat attitudes” (p. 303). The aforementioned “Port of Entry” report (1999, p. 12) identified the following as core skills: mastery of language in written/oral communication; community relations, consumer relations, employee relations and other practice areas; research methods and analysis; problem solving and negotiation; and informative and persuasive writing.

Disparities Regarding Learning Outcomes

Despite the obvious benefits of the internship experience, research does indicate that disparities exist between how public relations practitioners, academic programs, and students perceive the importance of job skills and professional characteristics, which has the potential to lead to missed opportunities for all parties.

A study conducted on behalf of the Association of American Colleges and Universities (Hart, 2016) indicated that the college learning outcomes employers considered top priorities include demonstration of “cross-cutting skills” related to communication, teamwork, ethical decision-making, critical thinking, and applying knowledge in real-world settings (p. 1). Sixty percent of employers indicated that they would be much more likely to consider a candidate who had recently completed an internship. However, 44% felt that recent college graduates were not well prepared to apply their knowledge in real-world settings, and gave students low scores for preparedness across a range of college learning outcomes, including oral communication, working effectively with others in teams, and critical thinking and analytical reasoning. There was alignment in the category referred to as staying current with new technologies; however, students were more than twice as likely as employers to think they were prepared in terms of oral communication, written communication, critical thinking, and creativity.

Two separate Commission on Public Relations Education reports (CPRE, 1999; CPRE, 2006) indicate that a number of key competencies and skills were weak or missing among entry-level public relations graduates, including: writing skills, understanding of business practices, and critical thinking and problem-solving skills. Neff, Walker, Smith, and Creedon (1999) assert that gaps exist between the outcomes educators and employers desire and those presently achieved in public relations education. They found that public relations graduates don’t always meet entry-level outcome competencies expected by employers, and recommended changes in curriculum, pedagogy and assessment.

It would appear that these disparities also spill over into the internship experience. Meng (2013) found differences between students and practitioners; practitioners ranked strategic decision-making capability, ability to solve problems and produce desired results, and communication knowledge and expertise highest. Meanwhile, public relations students rated ability to solve problems and produce desired results, being trustworthy and dependable, and relationship-building abilities highest. Sapp and Zhang (2009) found that industry supervisors rated students’ performance in the categories of attitude and interaction the highest, and skills related to the students’ writing skills, ability to take initiative, professional skills, spoken communication skills, and time management skills among the lowest. In Daugherty’s (2011) study, students indicated that they wanted more skill development and hands-on training, while on-site supervisors saw their role as more holistic. Todd (2014) found that public relations managers rated the job skills and professional characteristics of their entry-level millennial charges significantly lower than the latter group rated themselves.

Many of the research articles, studies and reports detailed above explore the public relations internship experience from a variety of perspectives, including those of student interns and their on-site supervisors, but none has explored the degree of coorientation (agreement, congruency and accuracy) between each group’s own evaluations and its perceptions of the other group’s evaluations of recognized job skills and professional characteristics. Coorientation rests on the assumption that a person’s behavior is based on a combination of his/her personal construction of the world and the perception of the orientations of those around them (Heider, 1946; Newcomb, 1953). As such, the theory suggests methods for measuring the degree of mutual orientation of individuals, groups or organizations toward an object, or the consensus among them about an object (Pearson, 1989). In this study, coorientation theory is used to explore whether perceptions regarding the job skills and professional characteristics necessary for a successful public relations internship experience are accurate. This will identify underlying disparities (if they exist), and facilitate discussion of implications for public relations educators, student interns, and on-site supervisors.

Theoretical Framework: Coorientation

Coorientation theory stems from the study of social psychology. Essentially, the term coorientation refers to simultaneous orientations, so if person A (on-site supervisor) feels negatively toward B (student intern) and positively about X (job skills and/or professional characteristics), and finds out that B feels positively about X as well, then the system can be said to be imbalanced, or asymmetrical. Ultimately, this imbalance can impede any moves toward balance or improvement of the relationship between the two parties. Therefore, coorientation can be seen as a relational term, and it is via communication that it is achieved. According to Johnson (1989), from this perspective, it is imperative that consensus is examined as an interaction between people rather than being the property of a single individual.

Perhaps the most recognizable names in this research stream are McLeod and Chaffee (1973), who developed a coorientation measurement model with three variables: agreement, congruency and accuracy. Perfect communication between the two groups (A and B), totally free of constraints, would not necessarily improve agreement, and it might even reduce congruency. However, if the two are motivated to coorient, communication should always improve accuracy, even to the point where each person knows exactly what the other is thinking; this would be perfect communication in a quite literal sense.

The model, outlined in Figure 1, provides a visual representation of coorientation in relation to this study, which explores the relationship between the two groups’ (on-site supervisors and student interns) self-reported attitudes toward an object (rating of job skills and professional characteristics) as well as their perceptions of each other’s self-report. This produces three coorientation variables: agreement, congruency and accuracy.

Figure 1. Coorientation model representation of supervisors’ and interns’ ranking of job skills and professional characteristics. (Adapted from McLeod, J. M., & Chaffee, S. H. (1973). Interpersonal approaches to communication research. American Behavioral Scientist, 16, 484.)

Coorientation Variables Defined: Agreement, Congruency and Accuracy

Agreement indicates the degree to which the two groups’ beliefs on the issue (rating of job skills and professional characteristics) are similar. Perceived disagreement/agreement on the issue by the two groups is described as congruency. Accuracy is the extent to which one group’s cognition (e.g., interns’ perception of supervisors’ ranking of job skills and professional characteristics) equals what the other group actually reported.

According to Kim (1986), of the three measurements, accuracy is considered to be the most important because it can provide a clear picture of the effects of communication. For example, in terms of this study, accuracy regarding the focal point (which job skills and professional characteristics are most important) must exist before true understanding can occur. Although communication may often produce some increase in accuracy, it rarely produces total agreement because each person arrives at his/her beliefs through personal experiences. Communication can produce marked increases in accuracy between the two groups because the more two parties coorient by communicating private values to each other, the more accurate their perceptions of those values have the potential to become (Chaffee & McLeod, 1968).

It is important to note at this point that the coorientation variables—agreement, congruency and accuracy—are not functionally independent of one another, since each is based on two measures. Thus, if agreement is low and congruency is high, accuracy is necessarily low; if agreement and congruency are both high (or low), accuracy is high. A change in one of these variables will affect change in another if the third is held constant (Chaffee & McLeod, 1968). For example, if a public relations program makes student interns more accurate in their perceptions of the rigors and demands of actual public relations practice, then congruency for that public will also change. The direction of the change, higher or lower congruency, depends on the degree to which the initial supervisors’ definition of the issue was similar to student interns’ views.
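To make these comparisons concrete, the minimal sketch below maps the four sets of measurements collected in a coorientation design (each group’s own ratings and its predictions of the other group’s ratings) onto the three variables for a single item. All numbers and variable names are hypothetical illustrations, not data from this study.

```python
# Hypothetical mean ratings of one item (e.g., "writing skills") on an
# importance scale used in a coorientation survey; not data from this study.
student_self = 5.6       # students' own rating
student_other = 5.9      # students' prediction of supervisors' rating
supervisor_self = 5.9    # supervisors' own rating
supervisor_other = 5.4   # supervisors' prediction of students' rating

# Agreement: students' own ratings vs. supervisors' own ratings.
agreement_gap = student_self - supervisor_self

# Congruency (one value per group): own rating vs. own projection of the other group.
student_congruency_gap = student_self - student_other
supervisor_congruency_gap = supervisor_self - supervisor_other

# Accuracy (one value per group): one group's projection of the other vs. what
# the other group actually reported.
intern_accuracy_gap = student_other - supervisor_self      # interns' view of supervisors
supervisor_accuracy_gap = supervisor_other - student_self  # supervisors' view of interns

print(agreement_gap, student_congruency_gap, supervisor_congruency_gap,
      intern_accuracy_gap, supervisor_accuracy_gap)
```

Gaps near zero would indicate higher agreement, congruency or accuracy on that item; the present study tests such differences statistically rather than inspecting raw gaps.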

Examples of the theory being used by public relations researchers include use in the exploration of public issues (Broom, 1977), media relations (Kopenhaver, Martinson, & Ryan, 1977), understanding between government organizations and interest groups (Grunig, 1972), non-profit organizations and donors (Waters, 2009), journalist and practitioner attitudes toward social media (Avery, Lariscy, & Sweetser, 2011), and international relations (Verčič & Verčič, 2007).

There can be no doubt that student interns are operating in more competitive and dynamic environments than ever before, and it is therefore imperative that both groups identify issues that may help or hinder their relationship. Expanding knowledge of the role and importance of the relationship that exists between them, as well as how each group reacts to similar stimuli/events (i.e., improving the level of coorientation), will potentially lead to improved student effectiveness and success, and more fruitful collaborations between academic programs and real-world industry/organizations.

Research Questions

This study will address the following research questions:

RQ1a: How do respondents rate/score specific job skills (JS) and professional characteristics (PC)?

RQ1b: Is there a significant difference in the levels of coorientation (agreement, congruency and accuracy) regarding JS and PC between the two groups?

RQ2: Do respondents perceive that the internship experience improves students’ learning outcomes?

METHOD

Survey Instrument

The researcher secured IRB approval, and pre-tested the survey with a small sample of faculty and students to verify categorical representation and assess validity and comprehension. A Qualtrics survey link was then distributed to all students listed as belonging to the Strategic Communication/Public Relations concentration (N = 135) at a mid-sized public Northeastern regional university in the final three weeks of a traditional 15-week fall (2015) semester. All of the students who participated had completed (or were currently taking) a public relations practicum class, which uses a required 120-hour field experience as a focal point (course prerequisites include Introduction to Public Relations and Strategic Writing). Students worked 6-8 hours per week at the job site with an on-site supervisor (who is employed in a public relations capacity there) and engaged in similar types of activities: event planning and coordination, strategic writing, preparing strategic awareness/promotion materials, etc. The on-site supervisor survey was emailed to students’ supervisors (students provided contact information in their survey). An initial solicitation email with a web link to the survey was distributed to both groups and followed up with one reminder email; this yielded 32 completed student surveys (n = 32; response rate = 22%) and 15 supervisor surveys (n = 15; response rate = 50%).

The survey comprised four sections. The first gathered relevant demographic data from respondents. The second asked respondents to rate/score eight job skills and 12 professional characteristics according to (1) their own perceptions and (2) how they predicted the other group would rank them (1 being most important, 12 least important).

This section has preliminary convergent validity as it adapts criteria presented in a study conducted by Todd (2014) that also divided tasks and responsibilities into these two constructs. The third section of the survey explored the extent to which the internship experience improved students’ abilities related to a number of college learning outcomes (5-point Likert scale; 1 = no improvement, 2 = slight improvement, 3 = moderate improvement, 4 = significant improvement, 5 = not applicable). This section has preliminary convergent validity because it uses several of the same constructs presented in a study conducted on behalf of the Association of American Colleges and Universities that identified the college learning outcomes employers considered top priorities. The Cronbach’s α score was 0.86, which demonstrates acceptable internal reliability. The final section of the survey asked respondents to answer open-ended questions related to the overall experience, and challenges/suggestions. The convenience nature of the survey and the small sample size mean that external validity for both the quantitative and qualitative parts of the study is low; therefore, only face validity can be assumed.
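As a rough illustration of the kind of reliability check reported above (not the author’s actual analysis code), the sketch below computes Cronbach’s α for a respondents-by-items matrix of learning-outcome ratings. The data, matrix size, and function name are invented; in the actual survey, “not applicable” responses (coded 5) would need to be excluded or recoded before such a calculation.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of ratings."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 6 respondents rating 4 learning-outcome items (1-4 scale).
ratings = np.array([
    [3, 4, 3, 4],
    [2, 3, 3, 3],
    [4, 4, 4, 3],
    [3, 3, 2, 3],
    [4, 4, 4, 4],
    [2, 2, 3, 2],
])
print(round(cronbach_alpha(ratings), 2))
```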

RESULTS

Description of Respondents

Of the 47 respondents participating in the study, 68% (n = 32) were student interns and 32% (n = 15) were on-site supervisors. Sixty-eight percent (n = 24) of the interns were female and 32% (n = 8) were male; on-site supervisors were 53% female (n = 8) and 47% male (n = 7). Student respondents were mostly aged 21-25 (93% of students; n = 28); on-site supervisors’ ages ranged from 26-65, with a median age of 39. The majority of both students and on-site supervisors identified as Caucasian (81%; students n = 26 and supervisors n = 13). The student respondents were mostly seniors (93%; n = 28); 19% (n = 6) were juniors. All on-site supervisors (n = 15) reported having a 4-year college degree, and two of them (20%) have a master’s degree. In the on-site supervisor group, 67% (n = 10) work in private not-for-profit (charitable) organizations; the remainder work in other non-profit settings (local government n = 2; state government n = 2). Just over half of the students (53%; n = 17) reported that this was their first internship; 22% (n = 7) had had two internships; 19% (n = 6) had had three. In terms of weekly hours worked at their internships, 66% (n = 21) worked under 10 hours and 19% (n = 6) worked over 15 hours. On-site supervisors indicated that 47% (n = 7) had had just one student intern, 33% (n = 5) had had more than three, and 20% (n = 3) had had two interns. The majority of supervisors indicated that interns worked fewer than 10 hours per week (80%; n = 12).

RQ1a: How do respondents rate/score the importance of related job skills and professional characteristics?

Job skills: student interns. With regard to the eight job skills (see Table 1), student interns reported their top four (in order of preference) as, quality of work (M = 6.28, SD = 1.37), overall performance (M = 5.72, SD = 2.55), writing skills (M = 5.56, SD = 1.62), and job task preparation (M = 5.06, SD = 2.15). Their bottom four were oral communication skills (M = 4.81, SD = 1.92), knowledge of social media (M = 3.19, SD = 2.07), computer skills (M = 3.0, SD = 1.66), and research skills (M = 2.4, SD = 1.38).

Table 1

Job skills – Students’ and Supervisors’ Self Mean

Job Skill Student self-mean Supervisor self-mean Difference in means
Research skills 2.4 3.84 -1.44
Computer skills 3.0 2.9 .1
Knowledge of social media 3.19 3.14 .05
Oral communication skills 4.81 5.48 -.67
Job task preparation 5.06 5.13 -.07
Writing skills 5.56 5.91 -.35
Overall performance 5.72 3.31 2.41
Quality of work 6.28 6.5 -.22

Job skills: on-site supervisors. On-site supervisors reported their top four job skills (see Table 1) in order of preference as, quality of work (M = 6.5, SD = 1.50), writing skills (M = 5.91, SD = 1.22), oral communication skills (M = 5.48, SD = 1.84), and job task preparation (M = 5.13, SD = 2.40). Their bottom four were research skills (M = 3.84, SD = 2.54), overall performance (M = 3.31, SD = 2.84), knowledge of social media (M = 3.14, SD = 1.33), and computer skills (M = 2.9, SD = 1.03).

Professional characteristics: students. As there are 12 professional characteristics (PC), the researcher divided them into two groups, top and bottom (see Table 2). Student interns reported the top PCs needed by interns as willingness to learn (M = 9.75, SD = 2.47), time management (M = 9.12, SD = 1.69), attention to details (M = 9.03, SD = 2.54), accept responsibility (M = 7.87, SD = 2.54), follow instructions (M = 7.84, SD = 2.7), and punctuality (M = 6.34, SD = 3.17). The lower-ranked items were take on new tasks (M = 6.12, SD = 2.98), cooperation (M = 5.96, SD = 2.23), accept criticism (M = 5.65, SD = 2.71), work independently (M = 5.25, SD = 3.3), aware of ethics (M = 2.65, SD = 2.85), and understand diversity (M = 2.37, SD = 1.94).

Table 2

Professional Characteristics – Students’ and Supervisors’ Self Mean

Professional Characteristics Student self-mean Supervisor self-mean Difference in means
Understand diversity 2.38 2.27 .11
Aware of ethics 2.66 3.07 -.41
Work independently 5.25 6.93 -1.68
Accept criticism 5.66 5.93 -.27
Cooperation 5.97 6.07 -.10
Take on new tasks 6.12 5.20 .92
Punctuality 6.34 3.93 2.41
Follow instructions 7.83 7.93 -.10
Accept responsibility 7.88 7.27 .61
Attention to details 9.03 10.13 -1.10
Time management 9.13 7.40 1.73
Willingness to learn 9.75 11.87 -2.12

Professional characteristics: on-site supervisors. On-site supervisors reported their top PC as (see Table 2), willingness to learn (M = 11.87, SD = 0.516), attention to details (M = 10.13, SD = 1.55), follow instructions (M = 7.93, SD = 2.54), time management (M = 7.4, SD = 2.13), accept responsibility (M = 7.27, SD = 1.94), and work independently (M = 6.93, SD = 3.47). The bottom ranked PCs were, cooperation (M = 6.07, SD = 1.86), accept criticism (M = 5.93, SD = 1.94), take on new tasks (M = 5.2, SD = 1.78), punctuality (M = 3.93, SD = 2.78), aware of ethics (M = 3.07, SD = 3.49), and understand diversity (M = 2.27, SD = .88).

RQ1b: Is there a significant difference in the levels of coorientation (agreement, congruency and accuracy) between the two groups?

Agreement. When respondents’ self-reports were compared to the self-reports of members of the other group, coorientational insight into the level of agreement between the two groups was obtained using a non-parametric statistical measure: the Mann-Whitney U test. The central question here is: Do students and supervisors agree on the rating/scoring of the items (student self vs. supervisor self)?

Mann-Whitney U-tests indicated that, for the most part, the two groups agreed with one another on the ratings/scores of the eight JS presented in the survey. The only exception relates to the item overall performance (z = -2.813, p = 0.005). Here, students’ mean scores were higher than supervisors’ self-reports (student mean = 5.7; supervisor mean = 3.30).

Regarding the 12 PCs, respondents’ scores were similar on the majority of the items, with three exceptions: (1) willingness to learn (z = -3.474, p = 0.001), which supervisors rated higher than students (supervisor mean = 11.80; student mean = 9.70); (2) time management (z = -2.601, p = 0.009), which students rated higher than supervisors (student mean = 9.1; supervisor mean = 7.40); and (3) punctuality (z = -2.503, p = 0.012), which students also rated higher than their on-site counterparts (student mean = 6.3; supervisor mean = 3.9).
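For readers who want to see what such a comparison looks like in practice, here is a minimal sketch of an agreement test for a single item using a Mann-Whitney U test in Python. The ratings are invented, and the scipy-based approach is an assumption made for illustration, not the software actually used in this study.

```python
# Sketch of an agreement check for one item: students' own ratings vs.
# supervisors' own ratings. All ratings are invented for illustration.
from scipy.stats import mannwhitneyu

student_self_ratings = [6, 5, 7, 4, 6, 5, 7, 6, 5, 6]
supervisor_self_ratings = [3, 4, 2, 5, 3, 4, 3, 2]

result = mannwhitneyu(student_self_ratings, supervisor_self_ratings,
                      alternative="two-sided")
print(f"U = {result.statistic:.1f}, p = {result.pvalue:.3f}")

# Congruency and accuracy can be examined the same way by swapping in the
# relevant pair of distributions (e.g., supervisor self vs. supervisor other
# for supervisor congruency; student self vs. supervisor other for accuracy).
```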

Congruency. To achieve coorientational insight into the level of congruency, respondents’ self-reports were compared to their projections of the other group’s responses, again using Mann-Whitney U-tests. The central question here is: How similar are respondents’ ratings/scores of job skills and professional characteristics to how they perceive their counterparts will rate/score the items (student self vs. student other; supervisor self vs. supervisor other)?

Student interns. Student intern ratings/scores were congruent with their perceptions of how supervisors would rate/score the items; no significant differences occurred in the JS category. Regarding professional characteristics, congruency also existed across all items: students’ ratings/scores were similar to their perceptions of how supervisors would rate/score them.

Table 3

Professional Characteristics – Supervisor Congruency

Professional Characteristics z score p value
Willingness to learn -4.670 .000
Attention to details -2.585 .010
Follow instructions -1.996 .046
Time management -2.936 .003
Accept responsibility -3.330 .001
Punctuality -4.037 .000
Cooperation -4.231 .000
Accept criticism -3.639 .000
Take on new tasks -4.648 .000
Work independently -1.827 .068
Understand diversity -4.670 .000
Aware of ethics -3.656 .000

 

On-site supervisors. Supervisors’ ratings/scores of job skills were congruent with their perceptions of how students would rate/score all JS items except social media (z = -1.900, p = 0.050). However, in the PC category, there was a distinct lack of congruency across all items except work independently (z = -1.827, p = 0.068; see Table 3); supervisors’ ratings/scores were significantly different from their perceptions of how students would rate/score the items.

The central question was: How similar are respondents’ ratings/scores of job skills and professional characteristics to how they perceive their counterparts will rate/score the items (student self vs. student other; supervisor self vs. supervisor other)? Students displayed high levels of congruency: how they rated all items in the job skills and professional characteristics categories matched how they perceived their supervisor counterparts would rate them. On-site supervisors also displayed high levels of congruency in the job skills section; however, in the professional characteristics category, supervisors perceived that students’ selections would differ from their own.

Accuracy. Finally, when student intern (or on-site supervisor) self-reports were compared to the other group’s projections of how they would respond, coorientational insight into the level of accuracy between the two groups was obtained. Mann-Whitney U-tests calculated accuracy within the student intern and on-site supervisor groups respectively. The central question here is: How do respondents’ (self) ratings/scores compare with their counterparts’ perceptions (other) of how they will rate/score the items (student self vs. supervisor other; supervisor self vs. student other)?

Student interns. Regarding JS, the comparison of student interns’ ratings/scores with on-site supervisors’ perceptions of how they would respond was mostly accurate, except in relation to the item overall performance (z = -2.447, p = 0.014). Regarding the PC items listed in the survey, the comparison of student interns’ ratings/scores with supervisors’ perceptions of how they would respond was accurate for just three items: willingness to learn, attention to details, and time management. Inaccuracy existed in relation to the ratings/scores of nine items: follow instructions (z = -2.338, p = 0.019), accept responsibility (z = -2.453, p = 0.014), punctuality (z = -3.320, p = 0.001), cooperation (z = -4.197, p = 0.000), accept criticism (z = -4.197, p = 0.000), take on new tasks (z = -3.680, p = 0.000), work independently (z = -3.982, p = 0.000), understand diversity (z = -5.362, p = 0.000), and aware of ethics (z = -4.801, p = 0.000).

On-site supervisors. Regarding JS items, the comparison of supervisors’ ratings/scores with student interns’ perceptions of how they would respond was mostly accurate; the only exceptions were the items oral communication (z = -2.754, p = 0.006) and overall performance (z = -2.716, p = 0.007). In relation to the PC items, the comparison of on-site supervisors’ ratings/scores with students’ perceptions of how they would respond was accurate across most of the items. Inaccuracy existed in relation to three: willingness to learn (z = -3.103, p = 0.002), time management (z = -2.556, p = 0.011), and punctuality (z = -2.687, p = 0.007).

The central question here is: How do respondents’ (self) ratings/scores compare with their counterparts’ perceptions (other) of how they will rate/score the items? In this study, the supervisor comparisons (supervisor self vs. student other) provided stronger evidence of coorientational accuracy than the student comparisons (student self vs. supervisor other). Students were relatively good at predicting on-site supervisors’ responses: inaccuracy occurred in only two job skills items (oral communication and overall performance) and three professional characteristics items (willingness to learn, time management and punctuality). Supervisors, in turn, accurately predicted students’ ratings of the job skills items (with the exception of one item, overall performance); however, they were very poor at predicting students’ responses in the majority (nine) of the professional characteristics categories, accurately predicting students’ ratings of only willingness to learn, attention to details and time management.

RQ2: Do respondents perceive that the internship experience improved students’ learning outcomes?

A Mann-Whitney U-test revealed that significant differences did not exist between the two groups regarding perceptions of whether the internship experience improved students’ learning outcomes; both groups reported that the experience resulted in moderate to significant improvement across all 12 recognized college learning outcomes (Cronbach’s α = 0.86).

Students. On a 5-point Likert scale (1 = no improvement, 2 = slight improvement, 3 = moderate improvement, 4 = significant improvement, 5 = not applicable), the majority of student respondents (N = 32) indicated that they improved across all college learning outcome categories while working as an intern (M = 3.43).

Supervisors. On a 5-point Likert scale (1 = no improvement, 2 = slight improvement, 3 = moderate improvement, 4 = significant improvement, 5 = not applicable), the majority of on-site supervisors indicated that students improved across all college learning outcome categories while working as interns (M = 3.49).

Responses to Open-Ended Questions

Students and supervisors were asked several open-ended questions about challenges they experienced related to the internship, and suggestions related to curriculum/coursework to make the internship experience more successful.

Students. Student interns indicated that the most significant challenges they faced related to time and work-load management, the unpaid nature of internships, the strong emphasis on writing ability, and adapting to working in a “professional” environment:

[My challenges] were definitely being able to balance the work load [while] still being a full time [sic] student. Being involved on campus, having 3 internships in total, and still trying to make money [with] a part time job. It was tough balancing everything, as all the work from each of these things was incredibly important…at times it was really hard to make [priority] decisions.

[When] the internship is unpaid, it makes it very difficult to make ends meet. This is especially true when having to travel to the job site.

I think one of my biggest challenges was being able to write press releases since I never [wrote] them at a professional level before. I definitely had trouble with certain types of writing such as creating brochures and news releases.

Learning the expectations of my co-workers/supervisor and making sure I always met, and/or exceeded them. This was a challenge at times because I was new to the real world [sic] environment and didn’t know what to expect.

With regard to suggestions to the curriculum/coursework to make the internship experience more successful, most students did not respond to this question. Those who did were very satisfied with their preparation and experience: “I wouldn’t change a thing, it was a great experience. I loved the balance between the classroom and the field experience.” Another student stated: “I can’t imagine it being more successful. I learned so much.”

Some student suggestions included: “[Adding] a writing refresher workshop prior to beginning [the] internship would be beneficial,” and “Taking a business management class may have really helped too.” Additionally:

Possibly a class with reminders on basic guidelines on how to write press releases and other basic PR writing tools. I found myself looking at past assignments from previous years for help, my writing was not always as strong as I wanted it to be.

When asked what the internship taught them about their major/discipline, students indicated that they learned more about the scope of public relations: “It taught me valuable writing skills and how to tailor wording to meet the needs of specific audiences. I think I improved my listening skills as well.

Two other students responded:

I learned that there are many different facets to public relations, and problems are always going to occur. Working for a non-profit was challenging, but there were also many benefits. I now know that it requires passion and a dedication not required in most regular office jobs.

I definitely learned how to communicate in a more professional setting, i.e. through emails, phone calls, person-to-person, etc. This experience opened my eyes to not only the inner workings of a real world business, but also to new workplace skills that I will definitely use in the future.

Supervisors. With regard to challenges, the biggest issue for supervisors seems to have been the limited time interns worked on-site: “[The] only challenge was that she only worked two days a week and [I] felt bad trying to reach her to follow up on items during days when she wasn’t working.” Another supervisor stated:

I couldn’t be happier with the experience I’ve had with my intern. All of her work has been of the highest quality and she never hesitates to take on new tasks and responsibilities. She consistently surpasses expectations and brings great insight and value to my department. The only challenge I may have encountered was keeping her busy because she was so efficient!

Regarding suggestions, supervisors indicated that perhaps more interaction with academic advisors would be helpful:

More correspondence from the advisers is always helpful – I like having a weekly bi-weekly or monthly check-in with the college staff to ensure the student, adviser, and internship supervisor are all on the same page.

I felt like my intern had a very strong grasp of communication principles, specifically in regards to public relations and social media. Her coursework absolutely prepared her for work in those fields. Communications work can often come with broad job descriptions and require the communicator to wear many ‘hats’ [sic]. It seems to me that my intern had a strong academic foundation that would be an asset in adapting to this kind of situation.

Finally, additional comments offered by supervisors were complimentary of interns:

Our experience so far has been awesome. We currently have two different interns here for different reasons and they are both very motivated, intelligent and helpful. They are a great addition to our organization.

I’d just like to compliment the faculty on offering a generation of new communicators such a high level of preparation for an industry that changes daily with the advent of new technologies and vehicles for messaging. I’m excited to see what these future professionals will bring to the table!

DISCUSSION AND CONCLUSION

Discipline-specific skills that supervisors consider most necessary for public relations interns include writing, oral, and organizational skills; research methods and analysis; problem solving and negotiation; and informative and persuasive writing (Brown & Fall, 2005; CPRE, 1999). Meng (2013) found that practitioners ranked strategic decision-making capability, ability to solve problems and produce desired results, and communication knowledge and expertise highest, while Sapp and Zhang (2009) found that industry supervisors rated interns’ writing, initiative, professional, spoken communication, and time management skills among the lowest. The results of this study indicate that students mostly agreed with on-site supervisors (and vice versa) in their ratings of job skills and professional characteristics. Students placed high ratings on quality of work, overall performance, writing skills, and job task preparation; oral communication, knowledge of social media, computer skills and research skills were rated lower. On-site supervisors’ top-rated job skills were quality of work, writing skills, oral communication and job task preparation; lower-rated items were research skills, overall performance, knowledge of social media and computer skills.

While there are many benefits related to the internship experience, disparities do exist between how students and supervisors perceive the importance of job skills and professional characteristics, which can lead to missed opportunities for all (Meng, 2013; Sapp & Zhang, 2009; Todd, 2014). This survey indicates that, regarding job skills, student interns and on-site supervisors are cooriented to one another across all three coorientation variables (agreement, congruency and accuracy). Regarding professional characteristics items, both groups were also cooriented to one another on the agreement variable (student self vs. supervisor self); however, significant differences exist among on-site supervisors regarding the congruency variable (supervisor self vs. supervisor other), and among students regarding the accuracy variable (student self vs. supervisor other). This finding is potentially more problematic for student interns than on-site supervisors because, according to Kim (1986), accuracy is the most important of the three measurements; it must take place before true understanding can occur. Misperceptions and misunderstanding have the potential to result in missed opportunities for collaboration and integration, and/or a self-fulfilling prophecy in which a lack of coorientation between students and supervisors damages the possibility of a cooperative relationship with current and future student interns, and with the academic programs that provide access to students.

With regard to college learning outcomes, the literature indicates that employers believe that engaging students in internships improves college-learning outcomes, makes students better prepared for career success, and can provide a high-impact learning experience that deepens learning (Hart, 2016; O’Neill, 2010). In this study, both groups reported that the experience produced moderate to significant improvement across all of the college learning outcomes measured. These findings differ from several reports that indicate that public relations graduates are not meeting entry-level outcome competencies (CPRE, 1999; CPRE, 2006; Neff, Walker, Smith, & Creedon, 1999). The high-impact focus of the internship experience in which this study’s respondents participated may have deepened perceptions of learning and successful outcomes for students.

In the open-ended portion of the survey, students stated that they valued the real-world nature of the experience and learned a lot about the scope of public relations; challenges mostly related to time and workload management, the unpaid nature of the experience, and the strong emphasis on writing ability. Supervisors identified the limited time interns worked on site as a key challenge, but for the most part they reported being very satisfied with their interns.

The findings of this study suggest that both groups were cooriented to one another in relation to perceptions of the job skills associated with the internship experience; however, in the professional characteristics category, supervisors indicated lower levels of congruency (supervisor self vs. supervisor other), which means that accuracy and overall coorientation between the two groups are low. Blindly assuming that all parties share a common understanding of goals, outcomes, tasks and responsibilities can lead to missed opportunities for collaboration and integration, and/or damage the possibility of a cooperative relationship with current and future student interns, and with the academic programs that provide access to students.

Suggestions to Overcome Discrepancies

  1. Faculty supervisors should clearly communicate to all parties (not just students) what practical expectations, roles, and responsibilities are associated with the experience. This can be achieved by encouraging collaboration between student and supervisor (prior to the start of the internship) in the learning goals and outcomes identification process.
  2. Details related to projects and deadlines, expectations regarding the degree of autonomy/independence versus teamwork/direction could also be established. This could be achieved by collaborating in the creation of a “contract” document in the opening days/weeks of the internship.

In addition to collaboration related to expectations, the provision of rich feedback to the student from both the faculty and on-site supervisors can benefit all parties and is a hallmark of high-impact internships. This feedback can relate to practical day-to-day tasks and responsibilities, but it can also engage students and their supervisors in reflective conversations about the interns’ career goals and the people they are becoming.

Scaffolding relevant prior learning (Introduction to Public Relations and Public Relations Writing classes as prerequisites) and encouraging reflection on challenges and opportunities can take the form of journals, shared with faculty and on-site supervisors, that hone writing skills and prompt students to engage in critical thinking related to the experience; such journals also provide an opportunity for the parties to coorient more accurately with one another.

To conclude, the two groups in this study have far more in common than not. Perfect communication may not necessarily improve agreement between them, but if the two are motivated to coorient, it should improve accuracy and facilitate understanding. For the public relations educator and student intern, the goal of communication must be to improve accuracy, even if they agree to disagree or choose not to coorient to the same things to the same degree. As such, greater dialogue about the finding that students are more cooriented to supervisors regarding the importance of job skills and professional characteristics than supervisors suspected will ultimately lead to greater understanding and opportunities for all parties involved.

Limitations and Future Study

Although the survey was distributed to 135 strategic communication/public relations concentration students, the response rate and subsequent sample size were small. The convenience nature of the supervisor sample (contact information was provided by the student interns) is also a limitation, and while that group’s response rate was relatively high, the researcher acknowledges that external validity for the study is low; the results may not be generalizable to the larger population of student interns and on-site supervisors. Another limitation is that the majority of students who participated in the study worked at the internship site fewer than 10 hours per week; their experiences would likely differ from those of students whose internships require significantly more hours. Despite these limitations, the results provide valuable exploratory insight into how respondents rate job skills and professional characteristics, the level of coorientation that exists between them, and the extent to which they perceive that the internship experience improves a variety of college learning outcomes.

Future research could expand this study by incorporating additional qualitative elements and by increasing the sample size (including other universities) to improve representativeness and generalizability. The researcher intends to adopt a longitudinal approach, continuing to gather and analyze information from student interns and their supervisors and exploring the implications of their orientations for the quality of the experience for both parties.

REFERENCES

Accrediting Council on Education in Journalism and Mass Communications (ACEJMC). (2013). ACEJMC accrediting standards. Retrieved from https://www2.ku.edu/~acejmc/PROGRAM/STANDARDS.SHTML

Anson, C. M., & Forsberg, L. L. (1990). Moving beyond the academic community: Transitional stages in professional writing. Written Communication, 7(2), 200-231.

Avery, E., Lariscy, R., & Sweetser, K. D. (2010). Social media and shared—or divergent—uses? A coorientation analysis of public relations practitioners and journalists. International Journal of Strategic Communication, 4(3), 189-205.

Basow, R. R., & Byrne, M. V. (1993). Internship expectations and learning goals. Journalism Educator, 47(4), 48-54.

Baxter, B. L. (1993). Public Relations Education: Challenges and Opportunities: Public Policy Committee of the Public Relations Society of America. New York.

Beard, F., & Morton, L. (1999). Effects of internship predictors on successful field experience. Journalism & Mass Communication Educator, 53(4), 42.

Beebe, A., Blaylock, A., & Sweetser, K. D. (2009). Job satisfaction in public relations internships. Public Relations Review, 35(2), 156-158.

Brindley, C., & Ritchie, B. (2000). Undergraduates and small and medium-sized enterprises: opportunities for a symbiotic partnership? Education+ Training, 42(9), 509-517.

Bringle, R. G., & Hatcher, J. A. (2002). Campus–community partnerships: The terms of engagement. Journal of Social Issues, 58(3), 503-516.

Broom, G.M. (1977). Coorientational measurement of public issues. Public Relations Review, 3(4), 110-118.

Brown, A., & Fall, L. T. (2005). Using the port of entry report as a benchmark: Survey results of on-the-job training among public relations internship site managers. Public Relations Review, 31(2), 301-304.

Callanan, G., & Benzing, C. (2004). Assessing the role of internships in the career-oriented employment of graduating college students. Education+ Training, 46(2), 82-89.

Cantor, J. A. (1997). Experiential learning in higher education: Linking classroom and community. ERIC Digest. Retrieved from http://files.eric.ed.gov/fulltext/ED404949.pdf

Chaffee, S.H., & McLeod, J.M. (1968). Sensitization in panel design: A coorientational experiment. Journalism Quarterly, 45, 661-669.

Ciofalo, A. (1989). Legitimacy of internships for academic credit remains controversial. Journalism Educator, 43(4), 25-31.

Coco, M. (2000). Internships: A try before you buy arrangement. SAM Advanced Management Journal, 65(2), 41-43.

CPRE (1999). Public relations education for the 21st century: A port of entry. Retrieved from http://www.prsa.org/_Resources/ resources/pre21.asp?ident=rsrc6.

CPRE (2006). Public relations education for the 21st century: The Professional Bond. http://www.commpred.org/theprofessionalbond/index.php (accessed April 27).

Daugherty, E. L. (2011). The public relations internship experience: A comparison of student and site supervisor perspectives. Public Relations Review, 37(5), 470-477.

Fall, L. (2006). Value of engagement: Factors influencing how students perceive their community contribution to public relations internships. Public Relations Review, 32(4), 407-415.

Gault, J., Redington, J., & Schlager, T. (2000). Undergraduate business internships and career success: Are they related? Journal of Marketing Education, 22(1), 45-53.

Gibson, D. C. (2001). Communication faculty internships. Public Relations Review, 27(1), 103-117.

Grunig, J.E. (1972). Communication in community decisions on the problems of the poor. Journal of Communication, 22, 5-25.

Gupta, P., Burns, D., & Schiferl, J. (2010). An exploration of student satisfaction with internship experiences in marketing. Business Education & Administration, 2(1), 27-37.

Hart Research Associates. (2016). Falling short? College learning and career success. NACTA Journal, 60(1a).

Heider, F. (1958). The Psychology of Interpersonal Relations. New York: John Wiley and Sons.

Horowitz, E. M. (1997, August). Does money still buy happiness: Effects of journalism internships on job satisfaction. Paper presented at the meeting of the Association for Education in Journalism and Mass Communication, Chicago, Il.

Johnson, D. J. (1989). The Coorientation Model and Consultant Roles. In: Botan, C.H. and Hazleton, Jr. (Eds.). Public Relations Theory (pp. 243-263). New Jersey: Lawrence Erlbaum Associates.

Kim, H. S. (1986). Coorientation and communication. In B. Dervin and M.I. Voight (Eds.), Progress in Communication Sciences (pp. 31-54). Norwood, NJ: Ablex.

Knouse, S.B, Tanner, J. R, & Harris, E. W. (1999). The relation of college internships, college performance, and subsequent job opportunity. Journal of Employment Counseling, 36(1), 35-43.

Knouse, S. B., & Fontenot, G. (2008). Benefits of the business college internship: A research review. Journal of Employment Counseling, 45(2), 61-66. doi: 10.1002/j.2161-1920.2008.tb00045.x

Kopenhaver, L.L., Martinson, D.L., & Ryan, R. (1984). How public relations practitioners and editors in Florida view each other. Journalism Quarterly, 61(4), 860-865,884.

Kuh, G. D. (2008). Excerpt from High-impact educational practices: What they are, who has access to them, and why they matter. Washington, DC: Association of American Colleges and Universities.

Lubbers, C., Bourland-Davis, P., & Rawlins, B. (2008). Public relations interns and ethical issues at work: Perceptions of student interns from three different universities. PRism 5(1&2). Retrieved from http://praxis.massey.ac.nz/prism_on-line_journ.html

Lubbers, C. A., Bourland-Davis, P. G., & DeSanto, B. (2012). An exploration of public relations internship site supervisors’ practices. In M. A. Goralksi and H. P. LeBlanc (Eds.), Business Research Yearbook, (511-518). International Academy of Business Disciplines and International Graphics, Beltsville, MD.

Maertz, C. P., Jr., Stoeberl, P. A., & Marks, J. (2014). Building successful internships: Lessons from the research for interns, schools, and employers. Career Development International, 19(1), 123-142.

McCarthy, P., & McCarthy, H. (2006). When case studies are not enough: Integrating experiential learning into business curricula. Journal of Education for Business, 81(4), 201-204.

McLeod, J., & Chaffee, S. (1973). Interpersonal approaches to communication research. American Behavioral Scientist, 16(4), 469-499.

Mendel‐Reyes, M. (1998). A pedagogy for citizenship: Service learning and democratic education. New Directions for Teaching and Learning, 1998(73), 31-38.

Meng, J. (2013). Learning by leading: Integrating leadership in public relations education for an enhanced value. Public Relations Review, 39(5), 609-611.

Mihail, D. (2006). Internships at Greek universities: An exploratory study. Journal of Workplace Learning, 18(1), 28-41.

National Association of Colleges and Employers (NACE). (2016). 2016 Internship & Co-op Survey. Retrieved from http://www.naceweb.org/intern-co-op-survey/

Neff, B., Walker, G., Smith, M., & Creedon, P. (1999). Outcomes desired by practitioners and academics. Public Relations Review, 25(1), 29-44.

Newcomb, T.M. (1953). The approach to the study of communication acts. Psychological Review, 60, 393-404.

O’Neill, N. (2010). Internships as a high-impact practice: Some reflections on quality. Peer Review, 12(4), 4-8.

Redeker, L. (1992). Internships provide invaluable job preparation. Public Relations Journal, 22(3), 20.

Sapp, D., & Zhang, Q. (2009). Trends in industry supervisors’ feedback on business communication internships. Business Communication Quarterly, 72(3).

Smith-Barrow, D. (2016). 10 colleges where almost everyone gets internships. US News and World Report. Retrieved from http://www.usnews.com/education/best-colleges/the-short-list-college/articles/2016-03-08/10-colleges-where-most-students-get-internships

Soska, T., Sullivan-Cosetti, M., & Pasupuleti, S. (2010). Service learning: Community engagement and partnership for integrating teaching, research, and service. Journal of Community Practice, 18(2-3), 139-147.

Taylor, S. (1988). Effects of college internships on individual participants. Journal of Applied Psychology, 73(3), 393.

Thiel, G., & Hartley, N. (1997). Cooperative education: A natural synergy between business and academia. SAM Advanced Management Journal, 62(3), 19.

Todd, V. (2014). Public relations supervisors and Millennial entry-level practitioners rate entry-level job skills and professional characteristics. Public Relations Review, 40(5), 789-797.

Tovey, J. (2001). Building connections between industry and university: Implementing an internship program at a regional university. Technical Communication Quarterly, 10(2), 225-239.

Verčič, D., & Verčič, A. T. (2007). A use of second-order co-orientation model in international public relations. Public Relations Review, 33(4), 407-414.

Waters, R. (2009). Comparing the two sides of the nonprofit organization–donor relationship: Applying coorientation methodology to relationship management. Public Relations Review, 35(2), 144-146.

Watson, B. (1995). The intern turnaround. Management Review, 84(6), 9-13.

Westerberg, C., & Wickersham, C. (2011). Internships have value, whether or not students are paid. The Chronicle of Higher Education. Retrieved from http://www.chronicle.com/article/Internships-Have-Value/127231/

© Copyright 2017 AEJMC Public Relations Division

Educating students for the social, digital and information world: Teaching public relations infographic design

Authors

Diana Sisson and Tara Mortensen

Diana C. Sisson, Ph.D., Auburn University
Tara M. Mortensen, Ph.D., University of South Carolina

Abstract

This study employs an exploratory content analysis of current public relations information graphics to examine variables within two concepts pertaining to public relations: transparency and clarity. These two concepts were chosen because they apply to traditional public relations practice and are also widely taught among contemporary infographic design experts. The subjects of the study are nonprofit organizations’ online informational graphics (N = 376) that have been released on Twitter. Findings suggest that nonprofit organizations are not applying traditional public relations principles to their design of online information graphics, demonstrating difficulty in translating these principles to visual design, a skill that is becoming more important. While the study is not intended to generalize, this snapshot of current practice is used to offer improvements in preparing public relations students for communication with information visualizations. This exploration illuminates the need for public relations education geared toward the social, visual, and data-driven environment. To this end, the study uses these findings to develop an initial set of practices for infographic design that can be incorporated into current public relations education.

Keywords: infographics, public relations, visual communication, nonprofit organizations, public relations education, visual literacy

SlideShare PDF

Educating students for the social, digital and information world:  Teaching public relations infographic design

Educating students for the social, digital and information world:  Teaching public relations infographic design

Social media have transformed public relations education, forcing students to apply traditional public relations principles, such as transparency and clarity, to new forms of communication such as infographics. Infographics are design pieces that may include “data visualizations, illustrations, text, and images together into a format that tells a complete story” (Krum, 2013, p. 6). In a contemporary mediascape that caters to short attention spans, infographics have become a hugely popular form of communication. Public relations firms are using the medium to build awareness of products and brands, provide information to shareholders, and increase the value of a brand or cause (Krum, 2013, p. 88). Effectively creating infographics requires an understanding of visual communication principles and, for niche fields such as public relations, the ability to translate legacy principles into new forms of communication. Data visualizations are compelling to audiences and “present the illusion of trustworthiness due to their visual nature and presentation of statistical information” (Toth, 2013, p. 449). Thus, understanding how to present data correctly in visual form is imperative.

The researchers found no prior research on how public relations professionals apply traditional principles to the design of information graphics, or on how students can better prepare to work in a modern media environment. Given the popularity of infographics among nonprofit organizations online, these are significant gaps. This exploratory study examines public relations graphics released via Twitter to identify how the principles of transparency and clarity are being applied, and ultimately to offer an initial list of suggestions for public relations educators.

The following sections review the relevant literature, from which opportunities for further study arise and research questions are proposed. Visuals as a form of communication in the contemporary visual-social mediascape are introduced first, with a focus on infographics, along with a discussion of the importance and communicative power of visuals. The variables within the concepts of transparency and clarity are then laid out as they pertain to public relations and visual communication, specifically infographics.

LITERATURE REVIEW

An Onion article jokes that people “shudder” at large blocks of uninterrupted text, requiring a colorful photo, an illustration, or a chart to comprehend the information (“Nation shudders,” 2010). Satire aside, contemporary news consumers are indeed skimmers, primarily reading exciting words and facts, as well as headlines and visuals (Nielsen, 2011; Rosenwald, 2014). This trend has contributed to a massive increase in the use of infographics to spread information, as well as a need for educators to teach new tools.

Between 2013 and 2015, Google searches for infographics increased 800% (Meacham, 2015). Infographics are intended to tell a story primarily in pictures, minimizing the number of words and maximizing visual impact (Meacham, 2015). The production of data and its graphic representation were once specialized trades but are now accessible to nearly everyone (Yaffa, 2011). Infographics harness the power of visuals to grab readers’ attention, reduce the amount of time it takes to understand data, provide context by showing comparisons, and make messages more emotional, memorable and accessible (Kimball & Hawkins, 2008; Kostelnick & Roberts, 2010; Schafer, 1995; Tufte, 2003).

In an age of “fake news” and audience mistrust of traditional media sources, understanding how to communicate truthfully in multiple forms is particularly important for students (Rutenberg, 2016). The 2016 presidential election brought the term “fake news” into mainstream awareness, raising widespread knowledge of the viral spread of untruthful information via social-networking sites (Wingfield, Isaac, & Benner, 2016). Twitter and Facebook have been urged to take their part of the responsibility for this spread, and tomorrow’s communicators, too, must be prepared to understand, identify, and create truthful and clear visual-statistical messages. Members of the media, following Kellyanne Conway, have used the term “alternative facts” to describe a problematic trend toward a growing perception of multiple truths, which affects the credibility of politicians, corporations and the media (Rutenberg, 2017, para. 7). Data design has special considerations in this regard (Kienzler, 1997; Rosenquist, 2012; Stallworth, 2008; Tufte, 2001). Visual content creators can easily and accidentally mislead their audience because visuals carry more weight and emotional impact than text (Kienzler, 1997). Infographics created to attract viewers can also unintentionally distort or obscure data (McArdle, 2011). As Toth (2013) noted, infographics represent an extension of fundamental issues, including “presenting information clearly and succinctly, targeting audiences, defining clear purposes, developing ethos, understanding document design principles, using persuasion techniques effectively, branding, and conducting and summarizing research” (p. 451).

Public Relations Education and Visual Communication

Educational materials for creating and disseminating infographics have only recently been developed and are not widely adopted within the various streams of communication education. Experts on infographics contend that there are thousands of poorly-constructed infographics online, but “the good designs rise to the top and are the designs that most often go viral in social networks” (Krum, 2013, p. 271). The challenge is melding the principles of various fields, including public relations, with the principles of infographic design and visual communication.

Researchers and professionals have noted the increased need for education in infographics in public relations due to employers’ demand for such skills and increased usage in the field (Gallicano, Ekachai, & Freberg, 2014). Advocates of visual literacy have long held that visual education, including knowledge of how to create visuals, is the missing piece of contemporary education (Metros, 2008; Sosa, 2009). Visuals have a powerful impact on audiences in ways that text does not. Visuals grab readers’ attention (Boerman, Smith, & van Meurs, 2011) and stick in the memory longer than other forms of communications (Graber, 1990). Krum (2013) referred to this as the “picture superiority effect” (p. 20). Further, images are subject to less scrutiny than other forms of communication (Messaris, 1994, p. x). In other words, viewers of images tend to believe what they see (Newton, 2013; Wheeler, 2001), and this is especially the case with visualized data (Cairo, 2012; Krum, 2013). While modern college students are consumers and producers of highly visual content on the web, they lack the skills to effectively communicate visually (Metros, 2008). Visual intelligence influences perceptions and interpretations of visual materials (Moriarty, 1996). Schools are encouraged to introduce concepts of visual literacy to understand, analyze, interpret and create effective visual information (Burns, 2006).

ACEJMC suggests, broadly, that all programs should teach students to apply the appropriate tools and technologies for the communication professions in which they work. Teaching students visual communication skills has become all the more important given the digital landscape and shorter attention spans (Lester, 2015). This need is particularly pertinent to public relations students and infographics. The Commission on Public Relations Education met in 2015 to discuss undergraduate public relations education, noting a need for better verbal as well as graphic communication (p. 8). Kent (2013), in his suggestions for using social media in public relations, states that publics are better served by thoughtful, thorough, and relevant information, including high-quality infographics that contain complete information rather than “eye-candy” (p. 343). Richard Edelman (2012) told public relations educators that “There is a huge place for deeper, more informative visuals . . . which infographics – visual representations of information, data or knowledge – provide” (p. 4).

The following sections of this paper review two principles of public relations and, within each principle, apply rules of effective infographic design. Transparency and clarity were examined because: 1) organizational transparency is necessary to provide coherence, visibility, and clarity (Albu & Wehmeier, 2014); and 2) clarity assures that the information communicated is easily understood by various publics and does not contain jargon (Rawlins, 2009). Further, the concepts of transparency and clarity each encompass variables that theoretically and practically derive from, and can be applied to, visual communications, specifically infographics. The researchers were interested in studying the junction of these two fields and extracting implications for students who will work within this increasingly popular professional niche.

Transparency

Public relations students are taught to be transparent, but may not know how this applies to infographic design. According to Rawlins (2009):

Transparency is the deliberate attempt to make available all legally releasable information—whether positive or negative in nature—in a manner that is accurate, timely, balanced, and unequivocal, for the purpose of enhancing the reasoning ability of publics and holding organizations accountable for their actions, policies, and practices. (p. 75)

Plaisance (2007) argued that while transparency “is not always a sufficient condition for more ethical behavior, its absence is a prerequisite for deception” (p. 193). Transparency has been studied from conceptual (Rawlins, 2006, 2009), journalistic (Plaisance, 2007), and social media campaign (Burns, 2008; DiStaso & Bortree, 2012) perspectives.

Rawlins (2006) argued transparency comprises three components: participation, substantial information, and accountability. Drawing on previous transparency literature, as well as on the Global Reporting Index (GRI) Guidelines and other guidelines promoting transparent communication, Rawlins (2006) found substantial information to be the “strongest predictor among transparency components” (p. 433). From this perspective, Rawlins (2009) noted that disclosure is about providing information, but that it can be used to distort perspectives rather than provide clarity.

Transparency has been studied from a social media campaign perspective (Burns, 2008) and from a dialogic perspective with particular focus on mutual understanding (Albu & Wehmeier, 2014). Using content analysis, Burns (2008) examined the Wal-Mart and Edelman “Wal-Marting Across America” blog crisis to argue that a lack of transparency in blogging leads to harsh criticism despite classic crisis response strategies such as apology. DiStaso and Bortree (2012) echoed similar sentiments about transparency through their evaluation of award-winning campaigns. DiStaso and Bortree (2012) found that many of the campaigns reflected transparency in that they “provid[ed] information that is useful for others to make informed decisions” (p. 513). Transparency in social media tactics kept organizations accountable to their publics (DiStaso & Bortree, 2012). Albu and Wehmeier (2014) argued that transparency and dialogue were “interconnected,” which was often overlooked in the literature (p. 129). Echoing Rawlins (2009), they posited that disclosure alone was insufficient for publics’ understanding; rather, true understanding was based in the coherence, clarity, and visibility of information (Albu & Wehmeier, 2014). In communicating transparently to foster mutual understanding, Albu and Wehmeier (2014) argued accountability, credibility, and loyalty of stakeholders may be heightened.

Transparency and visual communications. While transparency is a vital principle for public relations professionals to abide by, contemporary public relations educational materials fall short of teaching the application of transparency to infographic design. By the same token, textbooks specific to visual communication explain the importance of transparency in infographic design but do little to translate these principles to public relations (e.g., Knaflic, 2015; Krum, 2013; Smiciklas, 2012). Transparency with data is, in fact, of utmost importance in the creation of infographics. Viewers tend to see visualized data as both important and scientifically true, placing increased pressure on infographic designers to be transparent about the data. To be transparent, an infographic needs to “address the sources of the data included in the design in an open and honest manner” (Krum, 2013, p. 295). Sharing where the data came from, the age of the data, and the credibility of the data source helps establish the believability of the data. Further, copyright law requires that the designer of the infographic and any contributing illustrators and photographers be credited (Lester, 2015; Walter & Gioglio, 2014).

Still, a massive portion of information graphics appearing online list no data source at all, provide only vague sources, or cite questionable sources such as personal blogs and websites. Krum (2013) suggested infographic designers should track down and cite the original source of data, list the source, and provide a specific URL to the exact report or dataset that was used, as well as the date of the data. Once an infographic is released online, its whereabouts become unpredictable; in fact, a purpose of infographic design is to “go viral.” Therefore, in addition to source information, the bottom of an infographic should include the name of the company that originally released it and a landing-page URL that sends the viewer to the original source of the infographic.

Transparency measures. The Global Reporting Index offers guidelines for promoting transparent communication (Rawlins, 2009). The GRI indicated clarity, relevance, timeliness, neutrality, sustainability context, and comparability were important components in transparent communication (Rawlins, 2009).

Transparent communication should aid with decision-making by providing relevant information to members of key publics (Global Reporting Index, as cited in Rawlins, 2009). Transparent communication should be timely. The Global Reporting Index defined timeliness as providing “information within a time frame that makes the information usable” (as cited in Rawlins, 2009, p. 82). Transparent communication should be neutral in order to avoid perceptions of deception. The GRI defined neutrality as “avoid[ing] bias and striv[ing] for a balanced account of the company’s performance” (as cited in Rawlins, 2009, p. 81). While transparent communication should be neutral and timely, it should also provide a sustainability context to information. The Global Reporting Index defined sustainability context as “identify[ing] how organizational behavior is contributing to effects on the environment, economy, and/or social welfare” (as cited in Rawlins, 2009, p. 80). Furthermore, transparent communication should be comparable. The GRI defined comparability as “easily compar[ing] to both earlier performance of the company and to other similar organizations” (as cited in Rawlins, 2009, p. 81).

Clarity

Public relations students are taught about presenting information clearly, but infographic design has special implications for this principle, which may be less well understood. As delineated by the Global Reporting Index guidelines, information is clear, or has clarity, when the information communicated is easily understood by various publics and does not contain jargon (as cited in Rawlins, 2009). Furthermore, the GRI indicated that clarity enhances understanding of information (as cited in Rawlins, 2009). Jargon, or highly technical and industry-specific words or acronyms, hinders understanding of organizational communication by members of key publics. Marken (1996) contended that public relations professionals have a responsibility to communicate on behalf of their organizations in a clear and concise manner.

Clarity and infographic design. When creating infographics, several principles of design promote clarity. A primary purpose of creating infographics is to provide clarity to disorganized and difficult-to-understand data or ideas (Cairo, 2012). A well-designed infographic should present information in a way that readers can see, read, and explore information which would be too difficult to digest in its raw data form (Cairo, 2012). As Krum (2013) said, “Nobody wants to read a text article that has been converted into a JPG image file and then called an infographic” (p. 291), and further stresses: “Using big fonts in an infographic to make the numbers stand out is not data visualization . . . . Displaying the number in a large font doesn’t make it any easier for the audience to understand” (p. 219). Therefore, the visualization of data in order to increase comprehension of information is essential.

Charts (pie, line, bar), graphs, illustrations, maps, and diagrams, when used correctly, help make complex information clearer and more understandable (Cairo, 2012). Edward Tufte, considered by many the father of data visualization, is described by Yaffa (2011) as citing “the first grand principle of analytical design: above all else, always show comparisons” (para. 12). Doing so allows viewers to interpret the data clearly. According to Yaffa (2011), Tufte believes “there is no such thing as information overload . . . . Only bad design,” which impedes rather than enhances clarity (para. 36). In addition to choosing the proper visualization method for the given data, clarity is increased when viewers do not have to look back and forth to discern the meaning of the visualizations or colors. This is why pioneering infographic designer Scott Farrand advised designers to “avoid legends like the plague” (personal communication, March 23, 2016), and Randy Krum (2013) called legends “evil” (p. 293). Tufte (1983) coined the term “chartjunk” (p. 67) to refer to anything that gets in the way of a viewer interpreting the data.
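To make these rules concrete, the following minimal sketch (in Python with matplotlib, using invented figures rather than data from this study) labels bars directly instead of relying on a legend, shows a comparison between two years, and trims non-essential chart furniture.

```python
# Minimal sketch of the clarity rules above: direct labels, a comparison,
# and no legend or chartjunk. The figures are invented for illustration.
import matplotlib.pyplot as plt

years = ["2014", "2015"]
meals_served = [1.2, 1.8]  # hypothetical values, in millions

fig, ax = plt.subplots(figsize=(4, 3))
bars = ax.bar(years, meals_served)

# Direct value labels replace a legend the viewer would otherwise decode.
for bar, value in zip(bars, meals_served):
    ax.text(bar.get_x() + bar.get_width() / 2, value + 0.05,
            f"{value}M meals", ha="center")

ax.set_title("Meals served, 2014 vs. 2015")  # the comparison is the story
ax.set_ylabel("Millions of meals")
for side in ("top", "right"):                # strip non-essential spines
    ax.spines[side].set_visible(False)

plt.tight_layout()
plt.savefig("comparison.png", dpi=200)
```

Because each bar is labeled in place and the categories sit on the axis itself, no legend is needed, and the side-by-side bars supply the comparison Tufte calls for.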

Research Questions

Given the popularity of infographic use by nonprofit organizations and the calls from the Commission on Public Relations Education (2015) and other scholars, this area should be examined and improvements offered for the next generation of public relations practitioners. For this reason, the following research questions are offered:

RQ1: To what degree are nonprofit organizations’ information graphics transparent?

RQ2: To what degree do nonprofit organizations present the information in graphics clearly?

METHOD

A content analysis was conducted to systematically and quantitatively evaluate transparency and clarity strategies in nonprofits’ online information graphics (Stempel, 2003). Content analysis allowed for conclusions to be drawn from the observations that emerge from analysis of data (Stempel, 2003).

Sampling

Information graphics (N = 376) released by 18 nonprofit organizations on Twitter were analyzed. The researchers defined an infographic for this study as a graphic that contains information; this information did not need to be quantitative, but could also be words, facts, or illustrations. None of the infographics were “clickable” or led to other pages. The definition is deliberately broad: while Fernando (2012) defines an infographic as “a form of storytelling that people can use to visualize data in a way that illustrates knowledge, experiences, or events” (p. 2), a wider definition is adopted for the present study in order to accommodate infographics that fall outside the expert definition. Infographics distributed through Twitter were selected because 21% of American adults use the platform for news consumption (Greenwood, Perrin, & Duggan, 2016). As this is an exploratory study, only one social-networking website was used. Future studies should examine the transparency and clarity of nonprofit organizations’ infographics on other social networks, such as Facebook.

Nonprofit organizations were selected for analysis based on a sampling frame of TopNonprofits.com’s Top 100 Nonprofits on the Web list. The sampling frame was selected for its reliance on “publicly available web, social, and fiscal responsibility metrics” (Top 100 Nonprofits on the Web, n.d., para. 2), as well as for its methodology for ranking nonprofits online. Each of the chosen nonprofit organizations’ Twitter feeds was accessed to gather infographics. Data collection occurred from November 1, 2015 to November 30, 2015. All non-animated, non-clickable infographics collected had been released between May 2015 and November 2015 (up to six months prior to collection) in order to obtain a substantive sample. This time frame allowed the researchers to examine a snapshot of nonprofit organizations’ infographic use and design practices prior to December and January, which are traditionally peak fundraising periods. Duplicates were excluded.

Nonprofit organizations in the TopNonprofits.com Top 100 Nonprofits on the Web list were divided into “more than 10” and “fewer than 10” infographics categories. The rationale for this categorization was to ensure that the researchers were not pulling infographics from nonprofit organizations that used this form of visual communication only infrequently; it was intended to ensure representativeness of infographic use and frequency. The researchers collected infographics from the list using this categorization until an adequate sample size was met. The sample was not random, as generalizing to the broader social media sphere was not the purpose of the paper; rather, the purpose was to gather a snapshot of contemporary public relations infographics and offer suggestions for improvement in education.
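The authors do not describe their collection tooling; as an illustrative sketch only, the following Python snippet shows the kind of date-window filtering, de-duplication, and frequency-based categorization described above, assuming the collected tweets have been exported as dictionaries with hypothetical org, posted_at, and image_url fields.

```python
# Illustrative sketch of the sampling steps described above.
# Field names (org, posted_at, image_url) are assumptions, not the authors'.
from collections import Counter
from datetime import date

def sample_infographics(posts, start=date(2015, 5, 1), end=date(2015, 11, 30)):
    """Keep posts inside the collection window and drop duplicate images."""
    seen_images = set()
    kept = []
    for post in posts:
        if not (start <= post["posted_at"] <= end):
            continue                      # outside the May-November 2015 window
        if post["image_url"] in seen_images:
            continue                      # duplicate infographic
        seen_images.add(post["image_url"])
        kept.append(post)
    return kept

def categorize_orgs(posts, threshold=10):
    """Split organizations into frequent (> threshold) and infrequent users."""
    counts = Counter(post["org"] for post in posts)
    frequent = {org for org, n in counts.items() if n > threshold}
    infrequent = set(counts) - frequent
    return frequent, infrequent
```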

Nonprofit organizations analyzed in this study and listed in Top 100 Nonprofits on the Web include: Human Rights Campaign (15.2%, n = 57), UNICEF (15%, n = 55), Save the Children (8.2%, n = 31), ACLU (8%, n = 29), Conservation International (7%, n = 25), International Rescue Committee (7%, n = 25), Wounded Warrior Project (6.4%, n = 24), Amnesty International (6.1%, n = 23), Teach for America (5.1%, n = 19), Feeding America (5%, n = 18), Susan G. Komen (5%, n = 18), March of Dimes (5%, n = 17), Rotary International (4%, n = 13), ASPCA (3%, n = 10), Livestrong (1.1%, n = 4), Samaritan’s Purse (1%, n = 3), Ronald McDonald House (1%, n = 3), and Kiva (1%, n = 2).

Coding and Variables

Variables for measuring transparency and clarity were gleaned from the academic and professional literature on public relations and infographic design, as described in the literature review. Each concept contained variables pertaining to the intersection of infographic design and public relations. Transparency was the larger of the two concepts and was measured using the variables and definitions found in Table 1.

Table 1
Transparency variables and definitions

Variable | Definition
Data attribution | Whether or not the data was attributed at all
Data availability | Whether the original data itself is available to viewers: A link on infographic? Link on landing page? Spreadsheet on landing page? Data source not available at all? Each was coded as yes or no.
Data quality | Whether the data source is vague, questionable, reliable, or not identified. Vague data sources are those that only contain the name of the host site that publishes the data without any additional information about a specific report or article. Questionable data sources are those that are Wikipedia, blogs, or personal sites, and unclear sites are those where the source is not clearly identified. Each was coded as yes or no.
Data date | Whether the date of the data was provided. Coded as yes or no.
Designer credit | Whether credit was given to the individual that designed the infographic. Coded as yes or no.
Photographer or graphic credit | Whether credit was given to the individual(s) who created any graphical elements or photographs used in the infographic. Coded as yes or no.
Landing page | Whether a URL was provided that directs the user back to the original web location of the infographic. Coded as yes or no.
Relevance | Whether the infographic contains information specific to the organization. Coded as yes or no.
Sustainability context | Whether the infographic identifies how organizational behavior is contributing to effects on the environment, economy, and/or social welfare. Coded as yes or no.
Neutrality | Whether the infographic contains information from organizations other than itself. Coded as yes or no.
Comparability | Whether the infographic compares its performance to itself or to other/similar organizations. Coded as yes or no.
Timeliness | Whether the infographic contains information in a time frame usable to stakeholders. Coded as yes or no.

Clarity contained three variables: two derived from the infographic literature and one derived from public relations literature, which can be found in Table 2. They were infographic type, presence of a legend, and presence of industry jargon.

Table 2
Clarity variables and definitions

Variable | Definition
Infographic type | Timeline, pie chart, line graph, how-to diagram, bar graph, bubble chart, flow chart, list, numbers only, words/facts only, or other. Each type was coded as present or not present.
Legend | Whether or not the infographic contained a legend. Coded as yes or no.
Jargon | Whether or not the infographic contained jargon. Coded as yes or no.
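As an illustration of how one coded infographic could be stored for analysis, the sketch below mirrors the yes/no variables in Tables 1 and 2 as fields of a record; the data structure itself is hypothetical and not part of the authors’ codebook.

```python
# Hypothetical record mirroring the yes/no variables in Tables 1 and 2.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CodedInfographic:
    org: str
    # Transparency variables (Table 1)
    data_attribution: bool = False
    data_date: bool = False
    designer_credit: bool = False
    photographer_credit: bool = False
    landing_page: bool = False
    relevance: bool = False
    sustainability_context: bool = False
    neutrality: bool = False
    comparability: bool = False
    timeliness: bool = False
    data_quality: str = "not identified"  # vague / questionable / reliable / not identified
    # Clarity variables (Table 2)
    infographic_types: List[str] = field(default_factory=list)  # e.g., ["pie chart"]
    legend: bool = False
    jargon: bool = False

# Example of one coded item (values invented for illustration).
example = CodedInfographic(org="UNICEF", data_attribution=True,
                           infographic_types=["numbers only"])
```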

 

In addition, the researchers coded the organization that released the infographic, the topic, tone, and type of data visualization. Tone was coded as humorous/entertaining, informational, utility/how-to, serious/somber, other, and none, and each category was coded as yes or no, as these categories are not mutually exclusive. Humorous or entertaining infographics were light-hearted or comical; informational infographics were merely fact-based; utility-based infographics taught a user how to do something; serious or somber infographics contained serious information aimed at persuading users. As this article is aimed toward education, the infographic types examined (e.g., pie charts, maps) were selected from two leading textbook authors, Cairo (2012) and Krum (2013). A detailed visual codebook was developed and refined through five separate practice sessions by two independent coders. Following refinement of the codebook, three more practice coding sessions on a subsample of infographics were undertaken, with intermittent discussions and clarifications, until an acceptable level of agreement was achieved. Coders reached a good to excellent level of intercoder reliability: Cohen’s kappa values were all above .90, with three exceptions (type = .87; neutrality = .87; attribution = .87). To examine the data from the visual and textual content analysis, frequencies and descriptive statistics were calculated for each category.
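For readers unfamiliar with the reliability statistic mentioned above, the sketch below computes Cohen’s kappa for a single yes/no variable coded by two coders; the coding decisions shown are invented for illustration, not the study’s data.

```python
# Cohen's kappa for one yes/no variable coded by two coders (invented data).
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    # Observed agreement: proportion of items the two coders coded identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal proportions.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(coder_a) | set(coder_b))
    return (observed - expected) / (1 - expected)

coder_a = ["yes", "yes", "no", "no", "yes", "no", "no", "yes", "no", "no"]
coder_b = ["yes", "yes", "no", "no", "yes", "no", "yes", "yes", "no", "no"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # 0.8 for this toy sample
```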

FINDINGS

Findings from this study highlighted the nuances of how nonprofits approach transparency and clarity practices. The following sections address the results of each research question.

RQ1: To what degree are nonprofit organizations’ infographics transparent?

Frequencies were calculated for the infographic transparency variables, for the quality of data sources among infographics that list a source, and for data availability. As Table 3 shows, only 18.6% of the infographics attributed the source of their data. For every variable, positive instances of transparency were outnumbered by negative ones.

Table 3
Frequencies of infographic transparency variables

Variable | Yes | No | Total
Data attribution | 70 (18.6%) | 306 (81.4%) | 376 (100%)
Data date | 53 (14.1%) | 323 (85.9%) | 376 (100%)
Designer credit | 3 (0.8%) | 373 (99.2%) | 376 (100%)
Photographer credit | 28 (7.4%) | 348 (92.6%) | 376 (100%)
Landing page | 120 (31.9%) | 255 (67.8%) | 376 (100%)
Relevance | 65 (17.3%) | 311 (82.7%) | 376 (100%)
Sustainability context | 53 (14.1%) | 322 (85.6%) | 376 (100%)
Neutrality | 30 (8.0%) | 345 (91.8%) | 376 (100%)
Comparability | 9 (2.4%) | 366 (97.3%) | 376 (100%)
Timeliness | 39 (10.4%) | 336 (90%) | 376 (100%)
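Frequencies of this kind can be tabulated directly from coded records. The sketch below assumes records shaped like the CodedInfographic example earlier and reproduces the yes/no structure of Table 3; the variable names follow the study, but the helper and the coded_records collection are hypothetical.

```python
# Tabulate yes/no frequencies for each transparency variable (cf. Table 3),
# given records shaped like the CodedInfographic sketch above (hypothetical).
def frequency_table(records, variables):
    total = len(records)
    rows = []
    for var in variables:
        yes = sum(1 for r in records if getattr(r, var))
        rows.append((var, yes, total - yes, f"{100 * yes / total:.1f}%"))
    return rows

transparency_vars = ["data_attribution", "data_date", "designer_credit",
                     "photographer_credit", "landing_page", "relevance",
                     "sustainability_context", "neutrality",
                     "comparability", "timeliness"]

# Usage, assuming coded_records is a list of CodedInfographic objects:
# for var, yes, no, pct in frequency_table(coded_records, transparency_vars):
#     print(f"{var:<22} yes={yes} ({pct})  no={no}")
```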

The inclusion of a landing page was the transparency tool used most often by the nonprofits in this sample (31.9%). Very few infographics included a credit to the designer (0.8%) or image source (7.4%). Relevance, or whether the infographic contained information about an action taken by the organization, was present in 17.3% of infographics. Similarly, 14.1% of infographics contained information about how organizational behavior contributes to effects on the community, environment, or social welfare.

As Table 4 shows, of the infographics that list a data source (18.6%, n = 70), 64 of the sources were vague, or only listed the host site without additional information about the specific report or article; three were “questionable” (e.g., a blog, Wikipedia, or personal site); and three were not clearly identified.

Table 4
Infographic quality of data source of those that list a source

Data quality | Number of infographics | Total
Vague | 64 (91.4%) | 70 (100%)
Questionable | 3 (4.3%) | 70 (100%)
Unclear | 3 (4.3%) | 70 (100%)

As Table 5 shows, audiences wishing to clarify the source of data would be mostly unable to, as only 16 (4.2%) of the infographics in the total sample contained a way to find the source of the data.

Table 5
How nonprofit organizations make data available

Data availability | Number of infographics | Total
Link on infographic | 5 (1.3%) | 376 (100%)
Link on landing page | 8 (2.1%) | 376 (100%)
Spreadsheet on landing page | 3 (0.8%) | 376 (100%)
Data source not readily available | 360 (95.7%) | 376 (100%)

Finally, image source credits appeared most often on numbers-only infographics (4%) and on infographics containing only words and facts (4%). Designer credit (1%) was most associated with list infographics. Landing-page URLs were most associated with infographics containing only words and facts (12%) and with list infographics (8%).

RQ2: To what degree do nonprofit organizations present the information in infographics clearly?

For the present paper, the construct of clarity was measured using three variables culled from the literature: type of infographic, the use of jargon, and the use of legends. Inclusion of jargon (Figure 1, from our sample) and inclusion of a legend (Figure 3, from our sample) inhibit clarity.

Figure 1. Infographic example of jargon and avoiding legend (ACLU, 2015, May 31)

Nonprofit organizations used and disseminated different types of infographics through Twitter. Infographic types examined included: numbers only (66%, n = 248), words and facts (27%, n = 103), lists (13%, n = 103), pie charts (9%, n = 32), bar graphs (4%, n = 16), how-to (3%, n = 12), maps (3%, n = 12), line graphs (2%, n = 6), timelines (1.1%, n = 4), and flowcharts (1%, n = 2).

Figure 2. “Big numbers” (ACLU, 2015, October 28)

 

Figure 3. Infographic example of unnecessary legend (ACLU, 2015, May 21)

Most of the infographics (89.6%, n = 337) examined did not contain a data visualization, thus precluding the need to consider whether a legend must be used. Of the 39 (10.4%) infographics in this analysis that did contain data visualization (e.g., a chart or graph), 14 (3.7%) used a legend unnecessarily, while 25 (6.6%) did not use a legend, thus clarifying data interpretation. Of the infographics examined, 35 (9.5%) contained instances of jargon, or highly technical, industry-specific words or acronyms that may not be understood by all members of the lay audience.

DISCUSSION

The present study sits at the intersection of public relations, infographics, and education. By examining infographic design principles as applied to public relations practice, these exploratory findings contribute to the development of more effective education in visual literacy, particularly public relations infographic design. The study suggests that while the nonprofit organizations studied make heavy use of infographics on social media, they do not often translate the concepts of transparency and clarity into their infographic-based communications online. This finding reinforces educators’ and researchers’ calls for better visual literacy education among students and informs the suggestions offered here for such literacy in infographic design.

The nonprofit organizations in this study did not often practice transparency in their infographics. Only 19% of the infographics examined included the data source at all, and even fewer provided details such as the date of the data (14.1%). Those that did include a source were most often vague about it, including the name of a company (e.g., “Humane Society”) instead of directing the user to an actual dataset or name of a study. In fact, very few (4.2%) infographics made the dataset available to viewers, inhibiting the viewer’s ability to explore, ask questions, and assess credibility (Cairo, 2012). Nonprofit organizations were most opaque in their sourcing of photographers (7.4%) and designers (0.8%), an ethical and legal blunder (e.g., Lester, 2015; Newton, 2013). Only 32% of the infographics examined included at least a URL leading back to the landing page from where the infographic originated, leaving most viewers in the dark as to the origins of the graphic itself to fill in any of the transparency gaps.

Further, the infographics studied did not reflect transparent communication practices as outlined by the Global Reporting Index guidelines (as cited in Rawlins, 2009). Only 17.3% of the infographics released by nonprofit organizations in this study communicated their own actions (i.e., relevance), while even fewer (14.1%) communicated, through a sustainability context, how their actions affect the community, environment, or social welfare of groups or individuals. Given this finding, nonprofit organizations are missing an opportunity to communicate what they do and how they impact society, which could provide a competitive advantage and enhance relationships with current and potential donors.

Few (8%) infographics communicated neutral information about the nonprofit organization’s actions from a third party, which may create skewed perceptions. Third-party endorsements give organizations an additional layer of credibility with members of key publics; not incorporating this information may therefore affect perceptions of organizational credibility. Very few (2.4%) infographics provided comparable information about nonprofit organizations’ past and present performance, which would demonstrate effectiveness to donors and members of key publics. And although social media allow information to be provided quickly, few (10.4%) infographics provided timely information that would aid donors and key publics in decision-making. Timeliness here refers to the date of the information relative to when the infographic was distributed via Twitter.

The infographics examined could also improve clarity. Nonprofit organizations are not taking full advantage of the power of infographics to visualize otherwise difficult data or information, a primary purpose of using infographics (Cairo, 2012). Most nonprofits are releasing big numbers, big words, or lists, a strategy recommended against by experts on the topic (e.g., Cairo, 2012; Krum, 2013). Very few other types of data visualization were used, with pie charts being the most popular, present in 9% of the graphics. Other forms of visualization, while potentially more appropriate, were each used in less than 5% of the sample. Of the infographics using data visualizations, just over a third (14 of 39) used legends, adding unnecessary work for viewers trying to decipher the meaning of the visualization.

Practical Implications. Findings from this study inform public relations educators by presenting gaps in practice that can be addressed by teaching students about transparency and clarity with regard to infographics. Students should keep in mind that once an infographic is released onto the Internet, its eventual whereabouts are unpredictable. Students should also be prepared to conduct a communication audit of their infographic use to ensure that their communication is clear and demonstrates a commitment to transparency. In such audits, students can employ a thematic analysis of current messaging guided by the measures of transparency and clarity offered in this study; a simple checklist along those lines is sketched below.
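As a starting point for such an audit, the sketch below checks a draft infographic’s metadata against the transparency and clarity measures used in this study; the field names and the helper itself are hypothetical teaching aids, not tools described by the authors.

```python
# Sketch of a pre-release transparency/clarity checklist for a draft
# infographic, based on this study's measures (field names are hypothetical).
def audit_infographic(meta):
    issues = []
    if not meta.get("data_source_url"):
        issues.append("No link to the original dataset or report.")
    if not meta.get("data_date"):
        issues.append("Date of the data is missing.")
    if not meta.get("designer_credit"):
        issues.append("Designer is not credited.")
    if not meta.get("photo_credits"):
        issues.append("Photographers/illustrators are not credited.")
    if not meta.get("landing_page_url"):
        issues.append("No landing-page URL for viewers who encounter it elsewhere.")
    if meta.get("uses_legend"):
        issues.append("Uses a legend; consider labeling the data directly.")
    if meta.get("contains_jargon"):
        issues.append("Contains jargon; rewrite for a lay audience.")
    return issues

# Example usage with a hypothetical draft.
draft = {"data_source_url": "https://example.org/report-2015.pdf",
         "data_date": "2015", "uses_legend": True}
for issue in audit_infographic(draft):
    print("-", issue)
```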

In the same way that public relations professionals are trained in management, strategy, writing, and research, today’s visual, information-saturated landscape dictates a need for basic education in communicating ideas and data visually. Given the findings from this study, the following suggestions are offered as a first step toward teaching infographic design in public relations education:

  1. Data source, designer credit, and photographer credit must be included directly on all types of infographics to lend to transparency;
  2. Nonprofit organizations must enhance credibility with members of key publics through the use of neutral information or data to show unbiased impact on society through their organizational efforts;
  3. Nonprofit organizations must strive to communicate clearly by avoiding the use of legends and jargon, which may be confusing and add unnecessary work for members of key publics;
  4. Nonprofit organizations would improve their commitment to clarity and harness the power of visuals by incorporating more data visualization and fewer graphics that merely display large numbers, which can make figures seem important without aiding understanding. Tufte suggests always showing comparisons in data visualizations so that viewers can better understand the data. Showing, not telling, is at the heart of infographic design;
  5. And, designers and public relations professionals must consider the apparent believability of data visualizations and be vigilant in their transparency efforts by including a data source, a link to the dataset, the date, and a landing page link on the infographic itself, lending to credibility.

Limitations and Suggestions for Future Studies

Despite the relevant and important findings of this study, some limitations are worth noting. The study is intended as a snapshot of the current state of nonprofit infographics online, with the purpose of opening a dialogue about improvement, leading to future, more thorough studies on the topic, and developing an initial set of suggestions for teaching public relations students about infographic design. The sample was purposeful and not random; thus, these findings cannot be generalized to all infographics online, or even to those from public relations agencies. The goal of the paper was not to generalize, but to glean a snapshot of practices in order to offer best practices for students.

The shortcomings of this study and the issues it leaves unaddressed open the door for future studies. Infographics disseminated by nonprofit organizations on Twitter were the only type of infographic studied; other scholars could add to the literature by exploring other types of infographics and other social networks. Second, data deception is an important area in need of examination with regard to infographics. For example, bubble charts are infamous for misrepresenting size and scale of area, rendering data comparisons misleading (Cairo, 2012; Tufte, 1983). No studies, to the authors’ knowledge, have taken on the task of carefully examining the accuracy of data visualizations. This second, larger step would add richness to the present understanding of infographics, and offering students instruction in this regard is relevant.

Further study regarding best practices for visual, social, and primarily nonlinear, web-based forms of communication will enhance current practices in the public relations industry and help to bolster organizational credibility at a time when that is desperately needed. The buzz surrounding the proliferation of “fake news” and so-called “alternative facts” calls educators’ attention to the need to teach transparency and clarity as applied to all forms of communication. This study opens conversations and invites further study into best practices for performing public relations in the contemporary media landscape. Future studies can add to and move beyond the concepts examined here, and study not only infographics but the myriad other forms of online communication, including memes, GIFs, animations, and snaps.

REFERENCES

Albu, O. B., & Wehmeier, S. (2014). Organizational transparency and sense-making: The case of Northern Rock. Journal of Public Relations Research, 26(2), 117–133. http://doi.org/10.1080/1062726X.2013.795869

ACLU. (2015, October 28). #CriminalJustice system fails women survivors of domestic & sexual abuse. New ACLU report: https://www.aclu.org/feature/responses-field …. [Twitter post]. Retrieved from https://twitter.com/ACLU/status/659467248487133184

ACLU. (2015, May 31). #Minneapolis arrest rates are much higher for ppl of color #overcriminalization https://www.aclu.org/feature/picking-pieces#minneapolis …. [Twitter post]. Retrieved from https://twitter.com/ACLU/status/605037048928468993

ACLU. (2015, May 21). More than 50% of ppl executed in the US in ’14 were African Americans http://www.vox.com/2015/5/19/8625697/death-penalties-by-state … @voxdotcom @colorlines. [Twitter post]. Retrieved from https://twitter.com/ACLU/status/601395527830392835

Boerman, S. C., Smith, E., & van Meurs, L. (2011). Attention battle: The abilities of brand, visual, and text characteristics of the ad to draw attention versus the diverting power of the direct magazine context. In S. Okazaki, (Ed.), Advances in advertising research (Vol. 2): Breaking new ground in theory and practice (pp. 295–310). Wiesbaden: Gabler Verlag.

Burns, K. S. (2008). The misuse of social media: Reactions to and important lessons from a blog fiasco. Journal of New Communications Research, 3(1), 41–54.

Burns, M. (2006). A thousand words: Promoting teachers’ visual literacy skills. Multimedia and Internet@ Schools, 13(1), 16.

Commission on Public Relations Education. (2015). Educator Summit on Public Relations Education: Summary Report (pp. 1–38). Retrieved from http://www.commpred.org/_uploads/industry-educator-summit-summary-report.pdf

Cairo, A. (2015). Graphics lies, misleading visuals. In New Challenges for Data Design (pp. 103-116). London: Springer-Verlag.

Cairo, A. (2012). The Functional Art: An introduction to information graphics and visualization. San Francisco, CA: New Riders.

DiStaso, M. W., & Bortree, D. S. (2012). Multi-method analysis of transparency in social media practices: Survey, interviews and content analysis. Public Relations Review, 38(3), 511–514. doi:10.1016/j.pubrev.2012.01.003

Edelman, R. (2012, June). When all media is social: Navigating the future of communications. Speech presented at the 2012 Edelman Academic Summit, Palo Alto, CA. Available at http://www.newmediaacademicsummit.com/summit2012/agenda.asp

Fernando, A. (2012). Killer infographic! But does it solve TMI? Communication World, 29(2), 10-12.

Gallicano, T., Ekachai, D., & Freberg, K. (2014). The infographics assignment: A qualitative study of students’ and professionals’ perspectives. Public Relations Journal, 8(4).

Graber, D. A. (1990). Seeing is remembering: How visuals contribute to learning from television news. Journal of Communication, 40(3), 134-156.

Greenwood, S., Perrin, A., & Duggan, M. (2016, November 11). Social Media Update 2016. Retrieved from http://www.pewinternet.org/2016/11/11/social-media-update-2016/

Kent, M. L. (2013). Using social media dialogically: Public relations role in reviving democracy. Public Relations Review, 39(4), 337-345.

Kienzler, D. S. (1997). Visual ethics. Journal of Business Communication, 34, 171-187. doi:10.1177/002194369703400204

Kimball, M. A., & Hawkins, A. R. (2008). Document design: A guide for technical communicators. Boston, MA: Bedford/St. Martin’s.

Knaflic, C. N. (2015). Storytelling with Data: A Data Visualization Guide for Business Professionals. Hoboken, NJ: John Wiley & Sons.

Kostelnick, C., & Roberts, D. (2010). Designing visual language: Strategies for professional communicators (2nd ed.). Boston, MA: Allyn & Bacon.

Krum, R. (2013). Cool infographics: Effective communication with data visualization and design. Indianapolis, IN: John Wiley & Sons.

Lester, P. M. (2015). Photojournalism: An ethical approach. New York, NY: Routledge.

Marken, G. A. (1996). Public relations’ biggest challenge: Translation. Public Relations Quarterly, 41(3), 47–48.

Meacham, M. (2015, August). Use infographics to enhance training. TD Magazine. Retrieved from https://www.td.org/magazines/td-magazine/use-infographics-to-enhance-training

McArdle, M. (2011, December 23). Ending the infographic plague. The Atlantic. Retrieved from https://www.theatlantic.com/business/archive/2011/12/ending-the-infographic-plague/250474/

Messaris, P. (1994). Visual “literacy”: Image, mind, and reality. Boulder, CO: Westview Press.

Metros, S. E. (2008). The educator’s role in preparing visually literate learners. Theory into Practice, 47(2), 102-109.

Moriarty, S. E. (1996). Abduction: A theory of visual interpretation. Communication Theory, 6(2), 167-187.

“Nation shudders at large block of uninterrupted text.” (2010, March 9). The Onion. Retrieved from http://www.theonion.com/article/nation-shudders-at-large-block-of-uninterrupted-te-16932

Newton, J. (2013). The burden of visual truth: The role of photojournalism in mediating reality. New York, NY: Routledge.

Nielsen, J. (2011). How long do users stay on web pages? Retrieved from http://www.nngroup.com/articles/how-long-do-users-stay-on-web-pages/

Plaisance, P. L. (2007). Transparency: An Assessment of the Kantian Roots of a Key Element in Media Ethics Practice. Journal of Mass Media Ethics, 22(2-3), 187–207. doi:10.1080/08900520701315855

Rawlins, B. (2006). Measuring the Relationship Between Organizational Transparency and Trust. Presented at the 9th Annual International Public Relations Research Conference, Miami, FL. Retrieved from http://www.docunator.com/bigdata/1/1366449053_93ee43dea2/iprrc_10_proceedings.pdf#page=425

Rawlins, B. (2009). Give the Emperor a Mirror: Toward Developing a Stakeholder Measurement of Organizational Transparency. Journal of Public Relations Research, 21(1), 71–99. doi:10.1080/10627260802153421

Rosenquist, C. (2012). Visual form, ethics, and a typology of purpose: Teaching effective information design. Business Communication Quarterly, 75, 45-60. doi:10.1177/1080569911428670

Rosenwald, M.S. (April 6, 2014). Serious reading takes a hit from online scanning and skimming, researchers say. The Washington Post. Retrieved from https://www.washingtonpost.com/local/serious-reading-takes-a-hit-from-online-scanning-and-skimming-researchers-say/2014/04/06/088028d2-b5d2-11e3-b899-20667de76985_story.html

Rutenberg, J. (2016, November 6). Media’s next challenge: Overcoming the threat of fake news. The New York Times. Retrieved from https://www.nytimes.com/2016/11/07/business/media/medias-next-challenge-overcoming-the-threat-of-fake-news.html?_r=0

Rutenberg, J. (2017, January 22). “Alternative facts” and the costs of Trump-branded reality. The New York Times. Retrieved from https://www.nytimes.com/2017/01/22/business/media/alternative-facts-trump-brand.html

Schafer, C. (1995). Understanding the brains helps writers. Intercom, 14(9), 18-19.

Smiciklas, M. (2012). The power of infographics: Using pictures to communicate and connect with your audiences. Indianapolis, IN: Que Publishing.

Sosa, T. (2009). Visual literacy: The missing piece of your technology integration course. TechTrends, 53(2), 55.

Stallworth, W. L. (2008). Strengthening the ethics and visual rhetoric of sales letters. Business Communication Quarterly, 71, 44-52. doi:10.1177/1080569907312860

Stempel, G. H. (2003). Content Analysis. In G. H. Stempel, D. H. Weaver, & G. C. Wilhoit (Eds.), Mass communication research and theory (pp. 209–219). Boston: Allyn & Bacon.

Top 100 Nonprofits on the Web. (n.d.). Retrieved May 24, 2017, from https://topnonprofits.com/lists/best-nonprofits-on-the-web/

Toth, C. (2013). Revisiting a genre teaching infographics in business and professional communication courses. Business Communication Quarterly, 76(4), 446-457.

Tufte, E. (2001). The visual display of quantitative information (2nd ed.). Cheshire, CT: Graphics Press.

Walter, E., & Gioglio, J. (2014). The power of visual storytelling: How to use visuals, videos, and social media to market your brand. New York, NY: McGraw-Hill Professional.

Wheeler, T. H. (2005). Phototruth or photofiction?: Ethics and media imagery in the digital age. New York, NY: Routledge.

Wingfield, N., Isaac, M., and Benner, K. (2016, November 14). Google and Facebook take aim at fake news sites. The New York Times. Retrieved from https://www.nytimes.com/2016/11/15/technology/google-will-ban-websites-that-host-fake-news-from-using-its-ad-service.html?_r=0

Yaffa, J. (May/June 2011). The information sage: Edward Tufte, the graphics guru to the power elite who is revolutionizing how we see data. Washington Monthly. Retrieved from https://washingtonmonthly.com/magazine/mayjune-2011/the-information-sage/

Yeh, H. T., & Cheng, Y. C. (2010). The influence of the instruction of visual design principles on improving pre-service teachers’ visual literacy. Computers & Education, 54(1), 244-252.

© Copyright 2017 AEJMC Public Relations Division

Using Crisis Simulation to Enhance Crisis Management Competencies: The Role of Presence

Author

Bryan Ming Wang

Ming Wang, University of Nebraska-Lincoln

Abstract

Simulation-based training (SBT) is a useful pedagogical tool used in crisis management training. This paper explores the effects of a crisis simulation activity on students’ crisis management competencies. Pre- and post-test surveys indicated that students significantly improved crisis management competencies after the crisis simulation activity. Moreover, presence was found to be positively associated with post-simulation crisis management competencies, suggesting that presence is critical in designing an effective simulation activity.

Key words: crisis simulation, crisis management, presence

Using Crisis Simulation to Enhance Crisis Management Competencies: The Role of Presence

Effective crisis management is critical to the success of organizations. From the Volkswagen emissions-cheating scandal (Boston & Sloat, 2015) to the food-borne illness outbreak at Chipotle Mexican Grill (Jargon & Newman, 2016), crises, if not properly managed, can severely damage an organization’s reputation, hurt its bottom line, and stunt its long-term growth. It comes as no surprise that crisis management is a popular and important topic in public relations classes.

Simulation activities provide unique opportunities for students of crisis management to develop theory-grounded practice in the real world through problem-based learning (Hsieh, Sun, & Kao, 2006), experiential learning (Kolb, 1984; Rogers, 1996), and transformative learning (Clemson & Samara, 2013). One of the key factors that can potentially enhance the effectiveness of such activities is presence, an individual’s subjective sense of “being there” (Barfield, Zeltzer, & Slater, 1995; Minsky, 1980).

This study compares pre- and post-simulation assessment of students’ crisis management competencies in a senior-level public relations theory and strategy class to demonstrate the effectiveness of a crisis simulation activity in improving key learning outcomes. Furthermore, this project identifies presence as a key psychological outcome of the simulation activity and empirically tests whether presence is positively associated with post-simulation crisis management competencies.

Crisis Management Competencies

Effective crisis management involves a variety of skills, such as strategic planning, problem solving, message production, information management, communication management and issues management (Coombs, 2014).

A well-known certification for public relations practitioners is the Accreditation in Public Relations (APR) credential administered by the Universal Accreditation Board (UAB). The APR program delineates a set of competencies—detailed knowledge, skills and abilities (KSA)—in a study guide for its computer-based examination. The competencies in the 2015 guide cover (1) researching, planning, implementing and evaluating programs; (2) ethics and law; (3) communication models and theories; (4) business literacy; (5) management skills and leadership; (6) issues management and crisis communication; (7) media relations; (8) history and practice of public relations; (9) using information technology effectively; and (10) advanced communication skills. Its issues management and crisis communication unit encompasses (1) understanding phases of a crisis, (2) considering multiple perspectives, (3) engaging in issues management, (4) developing risk management capabilities, and (5) providing counsel to management.

To help students in the public relations theory and strategy class develop these competencies, a class session prior to the simulation activity focused on specific theories and topics in crisis management, such as conducting crisis assessment, defining key publics, composing key messages, compiling supporting facts, and understanding the situational theory of publics.

This study employs two separate measures of crisis management competencies discussed above as key learning outcomes: APR competencies and course competencies. The APR competencies items are based on the descriptions on the study guide for APR’s computer-based examination; the course competencies items tap more directly into the content covered during the class prior to the simulation activity.

Simulation-Based Training (SBT)

Viewed as a type of problem-based learning (Hsieh et al., 2006), simulation-based training (SBT) is more effective at imparting complex applied competencies, can lead to learning in a short period of time, is simple to learn, is learner-controlled, and is inherently more engaging (Salas, Wildman, & Piccolo, 2009).

SBT is commonly used in public relations and management training, especially crisis management and media relations, to help practitioners apply theoretical concepts to solving practical issues (Bland, 1995; Coombs, 2001, 2014; Dutta-Bergman, Madhavan, & Arns, 2005; Lane, 1995; Shifflet & Brown, 2006). A survey of 122 organizations found the desktop simulation exercise was the second most popular crisis management team-training activity and also the second most common type of media training (Lee, Woeste, & Heath, 2007). Dyer (1995) recommended that “once people are involved in developing, implementing, and evaluating the crisis response, then planning for ongoing simulations with the crisis plan can be a much more viable part of organizational practice” (p. 40).

SBT is also a popular pedagogical tool in classroom teaching. Asal and Blake (2006) claimed that “simulations, particularly human-to-human interactions, offer social science students the opportunity to learn from firsthand experience, and can be an important and useful addition to an educator’s teaching repertoire” (p. 1). SBT provides an experiential learning experience where students learn through “discussion, group work, hands-on participation and applying information outside the classroom” (Wurdinger & Carlson, 2012, p. 2). Crisis management is a great fit for the active learning of analytic skills through a simulation activity (Coombs, 2014).

Despite its popularity in workplace training and classroom teaching, SBT has surprisingly suffered from a lack of rigorous empirical evidence on its effectiveness (Raymond & Sorensen, 2008). Some claim that SBT, as an active learning tactic, is an effective pedagogical tool (Dorn, 1989; Shellman, 2001) that motivates students to study the materials harder (Rogers, 1996) and understand abstract concepts better (Smith & Boyer, 1996). However, much of this evidence relies upon instructors’ subjective impressions or select qualitative feedback from students (Fuller, 2016; Olson, 2012; Raymond & Sorensen, 2008; Shellman, 2001). Other research has reported less optimistic results. For instance, a gaming simulation in an economics class led to a less thorough understanding of the course content than a conventional introductory course did (Wentworth & Lewis, 1975).

SBT in Teaching Crisis Management

In previous studies of teaching crisis management with simulation activities, the findings were largely positive. Students who participated in crisis simulation activities reported positive overall impressions (Anderson, Swenson, & Kinsella, 2014), believed simulation made the class more realistic (Baglione, 2006), effectively applied theoretical concepts (Fuller, 2016), gained a better understanding of the tasks of a communication professional (Aertsen, Jaspaert, & Van Gorp, 2013), and demonstrated improved crisis management skills, as well as confidence, preparation and creativity in managing a crisis (Baglione, 2006).

However, none of these studies employed a rigorous pre/post-test design to examine the extent to which crisis simulation activities improved crisis management competencies. Moreover, none used APR competency measures. Given the promise of SBT in teaching crisis management, the next section describes the motivations for and details of the crisis simulation used in a public relations theory and strategy course that aims to address these limitations in the literature.

Background of the Class

The course that implemented this simulation activity was a senior-level class that targeted upperclassmen and graduate students. This class examined the public relations industry and discussed public relations models and theories early in the semester before devoting two weeks to crisis management strategies. The first week introduced students to key topics in crisis management: issues management, crisis assessment, analysis of key publics, situational theory of publics, and key messages and supporting facts. The second week began with discussions of crisis management strategies and the crisis management plan, after which the students participated in a crisis simulation activity.

Crisis Simulation Activity: Bed Bugs on Campus

The simulation activity followed a three-step process to maximize its effectiveness: instructions, simulation and debriefing (Baglione, 2006).

Effective teamwork is critical to crisis management (Waller, Lei, & Pratten, 2014). Students worked in small groups of five to six, acting as public relations agencies that handled a variety of tasks throughout the semester. For this activity, they were told to work in their own agencies to advise the client who approached them for counsel on the crisis.

To maximize the realism of the scenario and student involvement in the activity, a crisis of bed bugs on campus at a large Midwestern university was chosen. The event did happen on that campus several years earlier, but most students in the class were unaware that it had occurred, let alone of the specific details in the briefs. Hence, prior knowledge should not bias the study results.

The crisis escalated through three stages: Bed Bugs Suspected, Bed Bug Rumors, and Bed Bugs Confirmed (see Figure 1 for a scenario synopsis and key discussion questions for each stage). Quotes were adapted from news coverage of the actual crisis, and key events in the briefs were actual occurrences based on conversations with the university communications director who dealt with the crisis.

Figure 1

Crisis Synopsis (Left Column) and Discussion Points (Right Column)


Stage 1: bed bugs suspected. Students were given an initial brief at Stage 1 and asked to read it and discuss its questions in order to provide counsel to University Communications, the client. At the initial stage, a student reported seeing parasitic insects on her roommate’s bed and waking up with bite marks on her legs the next morning. She reported the incident to University Housing, which brought in examiners to study the situation; she was told that the presence of bed bugs could not be confirmed for at least a week.

The challenge for University Communications and University Housing was that investigation results would not be available for another week, which left a long information vacuum. Students were asked to assess the situation to decide whether this was a crisis at this stage, who the key publics were, what key messages and supporting facts needed to be prepared, and what plans needed to be in place for both short-term and long-term challenges.

Stage 2: bed bug rumors. Students received a Stage 2 brief about 15 minutes later, regardless of whether their teams had finished the Stage 1 discussion, to simulate the urgency and stress of a crisis.

This brief stated that University Housing decided to inform the student who reported the incident and her dorm of the investigation plan and not to alert the larger public while the investigation was still ongoing.

However, a local TV news crew heard of the rumor and sneaked into the residence hall where the incident occurred. The reporter interviewed students who claimed that there were bed bugs and that the university was trying to hide the issue. She also interviewed students on the street who said they had not heard anything about bed bugs on campus.

Given the development of the crisis, students were asked to assess the situation to redefine key publics and key messages along with supporting facts at this phase.

Stage 3: bed bugs confirmed. Students received the last brief about 10 minutes later, regardless of whether the teams had finished discussing the questions from the Stage 2 brief.

The update stated that after a thorough investigation, it was confirmed that the room where the incident occurred was indeed infested with bed bugs, along with several other dorm rooms. A story published in a local newspaper included student and university sources who provided their own accounts of what had transpired. The story reported that the university had allegedly asked one of the Resident Assistants (RAs) to lie about her own bed bug situation.

Given that the story had been covered by several mass media outlets, students in the class were told that the university decided to invite journalists from local media organizations for a media briefing session. They were instructed to brainstorm 10 potential questions that the journalists might ask and to prepare corresponding key messages and supporting facts to address these questions.

Then the students were asked to plan for a mock press conference where each team would send one student to form a committee of university administrators, communications professionals and housing staff to field questions from the rest of the class, who would role play as invited journalists. Incorporating a simulated news conference has been a popular tactic in teaching crisis management (Baglione, 2006; Foote, 2013; Olson, 2012) as it provides students with an opportunity to learn how to be crisis spokespeople (Coombs, 2014).

At the end of the mock press conference, the students and the instructor discussed appropriate plans at each stage of the crisis and critically analyzed the answers from the panel at the press conference as a debriefing for the whole simulation activity.

Given the largely positive effects of SBT documented in the literature, it was expected that the bed bug crisis simulation would enhance both students’ APR crisis management competencies and course-specific crisis management competencies.

H1a: Students will report higher levels of APR crisis management competencies after the simulation activity than before the activity.

H1b: Students will report higher levels of course crisis management competencies after the simulation activity than before the activity.

Presence

Having established the expectation that SBT would improve student learning, this study turns to the question of how simulation does so. Little research has explored what makes a simulation activity effective. Published work has largely discussed a specific case or scenario used for a particular class, failing to investigate which aspects of the activity are significantly related to positive learning outcomes.

Booth (1990) is one of the few researchers who has delved into which elements enhance learning in SBT. He pointed out two factors: interactiveness (decisions made by participants during the simulation become real situations for other participants) and stress (participants are put under stressful conditions to simulate real-life experiences).

Audience characteristics may also affect how much SBT enhances learning. For instance, in their experiment with a computer-based crisis communication activity, Shifflet and Brown (2006) found learning styles and prior exposure to public relations impacted student performance. This study examines another audience characteristic, presence.

Simulation activities differ from case studies, a popular pedagogical tool to teach crisis management (Friedman, 2013), in that students typically analyze case studies from the perspective of objective observers whereas they are expected to engage in role-playing to be immersed in a simulation activity (Bell, Kanar, & Kozlowski, 2008). This type of immersion in another scenario is presence.

Presence is a concept most commonly studied in virtual environment media (Slater & Wilbur, 1997). However, defined as an individual’s subjective sense of “being there” (Barfield et al., 1995; Minsky, 1980) and the “experience of being in one place or environment, even when one is physically situated in another” (Witmer & Singer, 1998, p. 225), the concept can be applied to other communication modes as well. Indeed, Ijsselsteijn, de Ridder, Freeman and Avons (2000) conceptualized presence more broadly as the sense of being there in a mediated environment. Schloerb (1995, p. 65) described subjective presence as the perception that a person is “physically present in a given environment.” Similar concepts in the study of narrative persuasion include transportation (Green & Brock, 2000), narrative engagement (Busselle & Bilandzic, 2009) and flow (Csikszentmihalyi & Csikszentmihalyi, 1988). Witmer and Singer (1998) related presence to involvement and immersion, concepts that are widely studied outside the area of virtual environments.

Presence is a multidimensional construct that has been conceptualized as transportation, realism, immersion, social richness, social actor within a medium, and medium as social actor (Lombard, Ditton, & Weinstein, 2009). In this crisis simulation activity, transportation, realism and involvement are the most relevant dimensions. Previous studies on crisis simulation activities actively discuss measures to enhance realism and involvement, such as incorporating prompts (Baglione, 2006), to transport participants to the role-playing world.

To enhance presence, this study employed several strategies: (1) the crisis briefs repeatedly used second-person voice and emphasized the roles that students played to transport them to the bed bug crisis world; (2) the activity went through three phases, reinforcing the simulated environment that students were in—the longer the students engaged themselves in the simulated story, the more likely they were going to be transported to the story world; (3) to increase realism, this simulated activity was adapted from a real-life crisis with real quotes, development of the event, and crisis management actions; (4) the bed bug crisis was a scenario that students could easily relate to as personal relevance increases the motivation to engage in elaborative processing, resulting in higher levels of involvement (Cacioppo & Petty, 1982; Petty & Cacioppo, 1986); (5) given the nature of crisis management, students were not given enough time to fully discuss the questions at each stage of the simulated activity. An accelerated pace with no break, but urgency at each phase, kept students immersed in the story; (6) students were asked to host and participate in a mock press conference as the conclusion of the crisis. This challenging behavioral task motivated students to more thoroughly research the crisis, resulting in higher levels of involvement.

Despite popular belief that presence increases task performance, there is no solid evidence to support it, claims Welch (1999). Some studies, however, do show that presence increases learning. For instance, Dunnington (2014) interviewed nursing students who participated in scenario-based human patient simulation and found that presence impacted the learning experience and outcomes. Richardson and Swan (2003) found that students reporting higher perceived social presence also perceived they learned more from a course and were more satisfied with the instructor.

Effective simulation activities should induce a high degree of presence among students. This heightened psychological state will improve student learning outcomes.

H2a: Presence will be positively associated with APR crisis management competencies in a crisis simulation activity.

H2b: Presence will be positively associated with course crisis management competencies in a crisis simulation activity.

METHOD

Data were collected from a senior-level public relations theory and strategy class in a large Midwestern university on March 7, 2016. Students in this class were mostly juniors and seniors in the advertising and public relations major. Thirty-three students were enrolled in the course, 31 completed the pre-test questionnaire, and 27 turned in the survey questionnaire after the simulation activity.

The week prior covered issues management and crisis management theories. The class on March 7 started with a discussion of crisis management strategies and components of a crisis management plan, after which students filled out a pre-test questionnaire that assessed crisis management competencies defined by the APR certification exam study guide and content covered in the class. Then students went through the three phases of the crisis simulation activity, including the mock press conference, before they answered the same set of crisis management competency questions in a post-test questionnaire along with a battery of questions on presence and two open-ended questions.

Measures

APR crisis management competencies. A battery of crisis management competency questions was adapted from the Issue Management and Crisis Communication section of the 2015 Detailed Knowledge, Skills and Abilities Tested on the Computer-Based Examination for Accreditation in Public Relations (Universal Accreditation Board, 2016). Students were asked to rate on a scale from 0 (“do not understand at all”) to 10 (“fully understand the topic”) how much they understood the following topics: (1) the roles and responsibilities of public relations at the pre-crisis, crisis, and post-crisis phases; (2) the messaging needs of each phase (i.e., pre-crisis, crisis, and post-crisis phases); (3) considering and accommodating all views on an issue or crisis; (4) factoring multiple views into communication strategy and messaging; and (5) the importance of providing counsel to the management team or client during all stages of a crisis (pre-crisis, crisis and post-crisis). The mean and standard deviation of each pre-simulation and post-simulation item are reported in Table 1. These questions were averaged to create an index of APR crisis management competencies (pre-test: α = .89, M = 5.92, SD = 1.41; post-test: α = .91, M = 7.65, SD = .95).

Course crisis management competencies. Another battery of crisis management competency questions was developed to assess the topics discussed in class. These questions are more specific than the APR crisis management competency items. Students were asked to rate on a scale from 0 (“not confident at all”) to 10 (“very confident”) how confident they were in: (1) doing crisis assessment; (2) defining key publics; (3) composing key messages; (4) composing supporting facts; (5) understanding situational theory of publics; and (6) applying situational theory of publics. The mean and standard deviation of each pre-simulation and post-simulation item are reported in Table 1. These questions were averaged to create an index of course crisis management competencies (pre-test: α = .93, M = 5.18, SD = 1.69; post-test: α = .89, M = 7.50, SD = 1.01).

Not surprisingly, APR and course crisis management competencies were positively correlated (pre-test: r = .81, p < .001, n = 31; post-test: r = .81, p < .001, n = 27).
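To make the index construction concrete, the following is a minimal sketch in Python of how averaged indices, Cronbach’s alpha, and the correlation between the two pre-test indices could be computed; the item-level ratings it generates are hypothetical stand-ins rather than the study’s data, so its output will not match the figures reported above.

```python
# Minimal sketch of index construction and reliability, using hypothetical ratings.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of summed scale)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 0-10 pre-test ratings: five APR items and six course items for 31 students.
rng = np.random.default_rng(0)
apr_items = pd.DataFrame(rng.integers(3, 9, size=(31, 5)), columns=[f"apr_{i}" for i in range(1, 6)])
course_items = pd.DataFrame(rng.integers(2, 9, size=(31, 6)), columns=[f"course_{i}" for i in range(1, 7)])

# Items are averaged into indices, as described for the two competency measures.
apr_index = apr_items.mean(axis=1)
course_index = course_items.mean(axis=1)

print("Cronbach's alpha (APR):", round(cronbach_alpha(apr_items), 2))
print("Cronbach's alpha (course):", round(cronbach_alpha(course_items), 2))
print("Pearson r between the two indices:", round(apr_index.corr(course_index), 2))
```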

Table 1

Crisis Management Competencies Pre-Post Simulation Comparisons

 

| Item | Pre-test α | Pre-test M (SD) | Post-test α | Post-test M (SD) |
| --- | --- | --- | --- | --- |
| APR crisis management competencies (0 to 10 scale) | .89 | 5.92 (1.41) | .91 | 7.65 (.95) |
| 1. the roles and responsibilities of public relations at the pre-crisis, crisis, and post-crisis phases | | 5.77 (1.45) | | 7.48 (1.01) |
| 2. the messaging needs of each phase (i.e., pre-crisis, crisis, and post-crisis phases) | | 4.65 (1.66) | | 7.63 (1.81) |
| 3. considering and accommodating all views on an issue or crisis | | 6.06 (1.98) | | 7.48 (1.16) |
| 4. factoring multiple views into communication strategy and messaging | | 6.35 (1.78) | | 7.74 (1.10) |
| 5. the importance of providing counsel to the management team or client during all stages of a crisis (pre-crisis, crisis and post-crisis) | | 6.77 (1.52) | | 7.93 (1.11) |
| Course crisis management competencies (0 to 10 scale) | .93 | 5.18 (1.69) | .89 | 7.50 (1.01) |
| 1. doing crisis assessment | | 5.00 (1.79) | | 7.19 (1.42) |
| 2. defining key publics | | 5.97 (1.91) | | 7.74 (.90) |
| 3. composing key messages | | 5.81 (2.06) | | 7.93 (1.41) |
| 4. composing supporting facts | | 5.42 (2.11) | | 7.93 (1.41) |
| 5. understanding situational theory of publics | | 4.52 (2.05) | | 7.15 (1.46) |
| 6. applying situational theory of publics | | 4.36 (1.94) | | 7.07 (1.36) |

Note. n (pre-test) = 31; n (post-test) = 27.

Presence. This construct was measured by asking students to indicate on a scale from 0 (“strongly disagree”) to 10 (“strongly agree”) their agreement with the following statements: (1) I had a sense of being in the crisis scenario; (2) I felt involved in the crisis scenario; (3) The crisis scenario seemed believable to me; (4) I had a strong sense that the characters and events were real; and (5) The scenario seemed real. These questions were only asked in the post-test questionnaire. They were averaged to create an index of presence (α = .94, M = 8.47, SD = 1.19).

FINDINGS

Analytical Strategy

Repeated-measures t tests were conducted to examine whether students reported higher post-simulation crisis management competencies than pre-simulation assessment.

To test the hypotheses on the presence effects, two ordinary least squares (OLS) regression models were run on two dependent variables: APR crisis management competencies and course crisis management competencies, controlling for respective pre-test competencies.
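As an illustration of this analytical strategy (not a reproduction of the study’s analysis), the sketch below uses Python with scipy and statsmodels on simulated pre/post index scores: a repeated-measures t test for the pre/post comparison, followed by an OLS model regressing post-test competencies on presence while controlling for the pre-test index.

```python
# Sketch of the analytical strategy on simulated data: paired t test, then OLS with a pre-test control.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(1)
n = 27
pre_test = rng.normal(5.8, 1.4, n)       # simulated pre-simulation competency index (0-10 scale)
presence = rng.normal(8.5, 1.2, n)       # simulated post-only presence index
post_test = 2.5 + 0.38 * pre_test + 0.35 * presence + rng.normal(0, 0.5, n)

# Repeated-measures (paired) t test comparing pre- and post-simulation competencies (H1).
paired = stats.ttest_rel(pre_test, post_test)
print(f"paired t({n - 1}) = {paired.statistic:.2f}, p = {paired.pvalue:.3f}")

# OLS regression of post-test competencies on presence, controlling for pre-test competencies (H2).
predictors = sm.add_constant(pd.DataFrame({"pre_test": pre_test, "presence": presence}))
model = sm.OLS(post_test, predictors).fit()
print(model.summary())
```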

Pre- and Post-Simulation Crisis Management Competencies

A repeated-measures t test showed that students reported higher APR crisis management competencies after the crisis simulation activity (M = 7.65, SD = .95) than before the crisis simulation activity (M = 5.78, SD = 1.37, t(26) = -11.40, p < .001, n = 27). H1a was supported.

Similarly, a repeated-measures t test showed that students reported higher course crisis management competencies after the crisis simulation activity (M = 7.50, SD = 1.01) than before the crisis simulation activity (M = 5.12, SD = 1.76, t(26) = -10.72, p < .001, n = 27). H1b was also supported.

Presence Effects

The results of the two OLS regression analyses are reported in Table 2.

Presence was indeed positively associated with both APR (b = .35, SE = .09, p < .001) and course (b = .34, SE = .09, p < .001) crisis management competencies. Hence, both H2a and H2b were supported.

Moreover, pre-test APR competencies and presence together explained 74% of the variance in post-test APR competencies; likewise, pre-test course competencies and presence accounted for 74% of the variance in post-test course competencies.

Table 2

Effects of Presence on APR and Course Crisis Management Competencies

 

| | Model I: APR Crisis Management Competencies | | Model II: Course Crisis Management Competencies | |
| --- | --- | --- | --- | --- |
| | B (S.E.) | β | B (S.E.) | β |
| Pre-test competencies | .382*** (.082) | .551 | .373*** (.061) | .649 |
| Presence | .352*** (.094) | .442 | .342*** (.090) | .404 |
| Constant | 2.471*** (.681) | | 2.700*** (.731) | |
| Adjusted R² | .740 | | .737 | |
| n | 27 | | 27 | |

 

Note. Entries are coefficients from OLS regressions.

* p < .05; ** p < .01; *** p < .001.

DISCUSSION

Through pre- and post-test surveys, this study finds that SBT indeed improved student learning outcomes and that presence was critical in enhancing that effect.

What Students Learned Most

As measured by two different indices of student learning outcomes, the crisis simulation activity boosted both APR and course crisis management competencies, reaffirming SBT as an effective pedagogical tool for teaching crisis management.

It is worth pointing out that the biggest improvements in learning involved messaging strategy and crisis communication theory. Before the simulation activity, students reported low ratings for the messaging and theory competency items (the messaging needs of each crisis phase, understanding the situational theory of publics, and applying the situational theory of publics), each averaging below the scale midpoint of 5. Encouragingly, these items saw the largest increases in post-simulation ratings: 2.98 points for “the messaging needs of each phase,” 2.63 points for understanding the situational theory of publics, and 2.71 points for applying the situational theory of publics (see Table 1).
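These gains follow directly from differencing the pre- and post-test means in Table 1; a quick check in Python:

```python
# Item-level gains computed from the Table 1 pre- and post-test means.
table1_means = {
    "the messaging needs of each phase": (4.65, 7.63),
    "understanding situational theory of publics": (4.52, 7.15),
    "applying situational theory of publics": (4.36, 7.07),
}
for item, (pre_mean, post_mean) in table1_means.items():
    print(f"{item}: +{post_mean - pre_mean:.2f} points")
# Output: +2.98, +2.63, and +2.71 points, matching the gains noted above.
```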

The post-test questionnaire also included two open-ended questions: “What lessons have you learned from this activity?” and “What are your other thoughts on this activity?” The qualitative feedback from students was overwhelmingly positive and showed some recurring topics that students felt they learned the most.

The importance of crisis planning. One student learned “just how important having a crisis plan is.” Another learned to “always have a pre-plan for any possible crisis that can arise.” One student also noticed that “there is a lot of planning done before a crisis even occurs.” Multiple students also emphasized the importance of being prepared for every type of question.

Key messages. One student learned the “importance of talking points in the interview.” Another pointed out that “key messages + supporting facts are important.” One echoed that “messaging is very important.”

Crisis phases. One student noted “the different phases that follow a crisis and which steps need to be accomplished within each of those phases.” Another saw “how the crisis evolved and learned what to do in each stage.” Similarly, one student learned the “key differences in the different stages of a crisis/possible crisis” and another understood “how to manage crisis in the best possible way in all phases of crisis.” One student hinted at the situational theory of publics by writing that “I learned the different phases that follow a crisis and which steps need to be accomplished within each of those phases.”

Comments revealed that students found the simulation activity fun and very hands-on.

Presence

This study also finds that feeling “present” in the simulation scenario enhances both APR and course crisis management competencies. In their qualitative feedback, many students commented on the realism and believability of the activity, which contributed to a higher degree of psychological presence. Students used such phrases as “real-life situation,” “really believable,” “real-life practice,” and “being in a crisis scenario.” They believed realism contributed to the effectiveness of the activity.

In designing SBT, instructors should strive to induce a high level of presence. The goal is to transport students to the simulated scenario so that they adopt and play the role of the actors in the case. Research insights from narrative persuasion and storytelling can help the instructors design better prompts.

Limitation

The simulation activity was designed so that every student had an opportunity to be engaged during all stages of crisis development. The crisis culminated in a press conference where a panel of six students, one from each team, addressed the questions from the rest of the class, who role-played as journalists. While many students mentioned learning from playing the role of journalists (e.g., “I have learned what a real news conference might be like and how to ask important questions.” and “I learned about the kinds of tough questions journalists should be asking.”), one student noted a desire to play the role of the university panelist (e.g., “Great activity, possibly get everyone a chance to be at the press table.”).

The challenge of rotating everyone in the class through the panelist role at the press conference can be daunting, but this could possibly be achieved in a class dedicated to crisis management where the instructor can use different simulation scenarios to grant every student the opportunity to role-play the organizational panelist who addresses the media.

Future Research

Instructors are encouraged to explore conducting a social-media-based crisis simulation. Public relations agencies such as Weber Shandwick and Hill+Knowlton Strategies have developed innovative social media crisis simulation platforms (Kiefer, 2012; Weber Shandwick, 2010) that have great potential to be adopted in classrooms (Anderson et al., 2014; Veil, 2010). The challenge is that such a simulation activity requires much more work in both preparation and implementation (Anderson et al., 2014). Nonetheless, with the growing relevance of social media in crisis management, this type of simulation will be of critical value to students and practitioners alike.

CONCLUSION 

Connecting theories and practice is crucial to public relations research and teaching (Cornelissen, 2004). Theories do not transfer perfectly to practice; they need transformation (Wehmeier, 2009). Designing effective pedagogical activities to facilitate this transformation is of great interest to instructors of public relations courses.

Overall, SBT offers an alternative pedagogical approach to traditional assignments in public relations courses. This study shows that a crisis simulation activity can significantly increase students’ crisis management competencies. Creating realistic, engaging simulation activities that enhance presence can help students achieve such competencies more effectively.

Arguably, the contribution of SBT to learning is not confined to crisis management. It can be applied in other areas of public relations, such as media relations, as well. SBT should become part of the pedagogical toolbox that instructors of public relations use to teach both applied and theoretical topics.

REFERENCES

Aertsen, T., Jaspaert, K., & Van Gorp, B. (2013). From theory to practice: A crisis simulation exercise. Business and Professional Communication Quarterly, 76, 322-328.

Anderson, B., Swenson, R., & Kinsella, J. (2014). Responding in real time: Creating a social media crisis simulator for the classroom. Communication Teacher, 28, 85-95.

Asal, V., & Blake, E. L. (2006). Creating simulations for political science education. Journal of Political Science Education, 2, 1-18.

Baglione, S. L. (2006). Role-playing in a public relations crisis: Preparing business students for career challenges. Journal of Promotion Management, 12, 47-61.

Barfield, W., Zeltzer, D., & Slater, M. (1995). Presence and performance within virtual environments. In W. Barfield & T. A. Furness (Eds.), Virtual environments and advanced interface design (pp. 473-541). Oxford: Oxford University Press.

Bell, B. S., Kanar, A. M., & Kozlowski, S. W. (2008). Current issues and future directions in simulation-based training in North America. International Journal of Human Resource Management, 19, 1416-1434.

Bland, M. (1995). Training managers to handle a crisis. Industrial and Commercial Training, 27, 28-31.

Booth, S. (1990). Interactive simulation and crisis management training: New techniques for improving performance. Contemporary Crises, 14, 381-394.

Boston, W., & Sloat, S. (2015). Volkswagen emissions scandal relates to 11 million cars. The Wall Street Journal. Retrieved from http://www.wsj.com/articles/volkswagen-emissions-scandal-relates-to-11-million-cars-1442916906

Busselle, R. W., & Bilandzic, H. (2009). Measuring narrative engagement. Media Psychology, 12, 321-347.

Cacioppo, J. T., & Petty, R. E. (1982). The need for cognition. Journal of Personality and Social Psychology, 42, 116-131.

Clemson, D., & Samara, K. (2013). Crisis management simulations – narrative inquiry into transformative learning. In A. Mesquita & I. Ramos (Eds.), Proceedings of the 12th European conference on research methodology for business and management studies (pp. 100-107).

Coombs, W. T. (2001). Teaching the crisis management/communication course. Public Relations Review, 27, 89-101.

Coombs, W. T. (2014). Ongoing crisis communication. Thousand Oaks, CA: Sage.

Cornelissen, J. (2004). Corporate communication: Theory and practice. London: Sage.

Csikszentmihalyi, M., & Csikszentmihalyi, I. (1988). Optimal experience: Psychological studies of flow in consciousness. Cambridge: Cambridge University Press.

Dorn, D. S. (1989). Simulation games: One more tool on the pedagogical shelf. Teaching Sociology, 17, 1-18.

Dunnington, R. M. (2014). Presence with scenario-based high fidelity human patient simulation. Nursing Science Quarterly, 27, 157-164.

Dutta-Bergman, M., Madhavan, K., & Arns, L. (2005). Responding to bio-terror: A strategic framework for crisis response pedagogy using 3D visualization. New York, NY: International Communication Association.

Dyer, S. C. (1995). Getting people into the crisis communication plan. Public Relations Quarterly, 40, 38-41.

Foote, L. (2013). Honing crisis communication skills: Using interactive media and student-centered learning to develop agile leaders. Journal of Management Education, 37, 79-114.

Friedman, M. (2013). Developing and teaching the crisis communication course. PRism, 10, 1-21.

Fuller, R. P. (2016). The big breach: An experiential learning exercise in mindful crisis communication. Communication Teacher, 30, 27-32.

Green, M. C., & Brock, T. C. (2000). The role of transportation in the persuasiveness of public narratives. Journal of Personality and Social Psychology, 79(5), 701-721.

Hsieh, J.-L., Sun, C.-T., & Kao, G. Y.-M. (2006). Teaching through simulation: Epidemic dynamics and public health policies. Simulation, 82, 731-759.

Ijsselsteijn, W. A., de Ridder, H., Freeman, J., & Avons, S. W. (2000). Presence: Concept, determinants and measurement. In B. E. Rogowitz & T. N. Pappas (Eds.), Human vision and electronic imaging V (pp. 520-529). San Jose: Proceedings of SPIE.

Jargon, J., & Newman, J. (2016). Federal authorities investigate Chipotle outbreak. The Wall Street Journal. Retrieved from http://www.wsj.com/articles/chipotle-sales-continue-tumbling-on-e-coli-outbreak-1452089571

Kiefer, B. (2012). H + K rolls out “flight school” crisis-simulation platform. PR Week. Retrieved May 5, 2016, from http://www.prweekus.com/hk-rolls-out-flight-school-crisis-simulation-platform/article/266379/

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs: Prentice-Hall, Inc.

Lane, D. C. (1995). On a resurgence of management simulations and games. The Journal of the Operations Research Society, 46, 604-625.

Lee, J., Woeste, J. H., & Heath, R. L. (2007). Getting ready for crisis: Strategic excellence. Public Relations Review, 33, 334-336.

Lombard, M., Ditton, T. B., & Weinstein, L. (2009). Measuring presence: The Temple Presence Inventory. Proceedings of the 12th Annual International Workshop on Presence.

Minsky, M. (1980). Telepresence. Omni, 2, 45-51.

Olson, K. S. (2012). Making it real: Using a collaborative simulation to teach crisis communications. Journal of Excellence in College Teaching, 23, 25-47.

Petty, R. E., & Cacioppo, J. T. (1986). Communication and persuasion: Central and peripheral routes to attitude change. New York: Springer.

Raymond, C., & Sorensen, K. (2008). The use of a Middle East crisis simulation in an international relations course. PS: Political Science and Politics, 41(1), 179-182.

Richardson, J., & Swan, K. (2003). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Network, 7, 68-88.

Rogers, Y. v. d. M. (1996). Role-playing exercise for development and international economics courses. The Journal of Economic Education, 27, 217-223.

Salas, E., Wildman, J. L., & Piccolo, R. F. (2009). Using simulation-based training to enhance management education. Academy of Management Learning & Education, 8, 559-573.

Schloerb, D. W. (1995). A quantitative measure of telepresence. Presence, 4, 64-80.

Shellman, S. M. (2001). Active learning in comparative politics: A mock German election and coalition-formation simulation. PS: Political Science and Politics, 34, 827-834.

Shifflet, M., & Brown, J. (2006). The use of instructional simulations to support classroom teaching: A crisis communication case study. Journal of Multimedia and Hypermedia, 15, 377-395.

Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence: Teleoperators and Virtual Environments, 6, 603-616.

Smith, E. T., & Boyer, M. (1996). Designing in-class simulations. PS: Political Science and Politics, 29, 690-694.

Universal Accreditation Board (2016). Study guide for the examination for Accreditation in Public Relations. Public Relations Society of America. Retrieved from https://www.prsa.org/wp-content/uploads/2016/07/apr-study-guide.pdf

Veil, S. R. (2010). Using crisis simulations in public relations education. Communication Teacher, 24, 58-62.

Waller, M. J., Lei, Z., & Pratten, R. (2014). Focusing on teams in crisis management education: An integration and simulation-based approach. Academy of Management Learning & Education, 13, 208-221.

Weber Shandwick. (2010). Weber Shandwick launches social media crisis simulator, Firebell. PR Newswire. Retrieved from http://www.prnewswire.com/news-releases/webershandwick-launches-social-crisis-simulator-firebell-108940364.html

Wehmeier, S. (2009). Out of the fog and into the future: Directions of public relations theory building, research, and practice. Canadian Journal of Communication, 34, 265-282.

Welch, R. B. (1999). How can we determine if the sense of presence affects task performance? Presence, 8, 574-577.

Wentworth, D. R., & Lewis, D. R. (1975). Evaluation of the use of the marketplace game in junior college economics. The Journal of Economic Education, 6, 113-119.

Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence, 7, 225-240.

Wurdinger, S. D., & Carlson, J. A. (2012). Teaching for experiential learning. Lanham, MD: Rowman & Littlefield.

© Copyright 2017 AEJMC Public Relations Division

 

Real World Career Preparation: A Guide to Creating a University Student-Run Communications Agency

Reviewer

Teddi A. Joyce, University of South Dakota

Real World Career Preparation: A Guide to Creating a University Student-Run Communications Agency

Author: Douglas J. Swanson
ISBN: 9781433137495
DOI: http://dx.doi.org/10.3726/978-1-4539-1681-0
Peter Lang, 2017
298 pp.
$52.95 (Amazon)

A 2015 study conducted for the Association of American Colleges & Universities noted that 60 percent of employers surveyed think all college students should complete a significant applied learning project before graduation. Whether it is called “project-based,” “active,” or “experiential” learning, the popular press, students, and today’s employers value applied learning. Exploring authentic problems in real situations enriches students’ academic experiences and engagement while allowing them to demonstrate readiness for the workplace. Real World Career Preparation: A Guide to Creating a University Student-Run Communications Agency presents an inside look at how to operate a significant hands-on learning experience. The book offers faculty and department chairs ideas about structuring an environment where students work collectively to build what can be described as a communication business.

Format

Swanson proposes that a student-run agency offers pedagogical, curricular, and resource advantages for a program. The text draws on his personal experience creating a student-run agency. Divided into three sections, the book also includes real case studies, presented as Agency Spotlights at the end of each chapter, which make it easy for those developing student-run agencies to see how the text applies to real-world cases.

Point of View

Upfront, Swanson emphasizes that the focus is on a student-run agency (defined by learning and workplace preparation), not on a student-staffed agency (defined by revenue generation and recognition). This critical distinction (p. 6) situates his approach to academic and career advising as part of a holistic education, and it comes through in the text and in the bit-too-numerous Spotlights of his own campus’ agency. However, Swanson’s “how-to” approach provides readers with a solid guide linking the curriculum to career preparation.

Structure

Section I (Chapters 1-8) frames the student-run agency as a place for students to develop skills for the workplace. Swanson draws on Kuh’s (2008) criteria for High-Impact Practices (HIPs), proposing that an agency creates a learning environment of increased engagement and success (p. 8). Throughout Section I, Swanson works to build a case for a student agency by linking it to curriculum (Chapter 2); discussing facilities (Chapter 3) and student engagement (Chapter 4); mentoring and working with graduate students (Chapters 5 and 6); and assessment and accreditation (Chapters 7 and 8).

While campus realities differ, these topics are familiar to most faculty, and it would be easy to bypass the first section. Yet, the Agency Spotlights and some specific examples make reading the first section worth the investment. For example, Chapter 5 (p. 77) outlines the practical procedures for dismissing a student with a concise overview of how to create and structure a performance plan. In addition, Swanson’s discussion on the assessment of student learning outcomes is valuable as every chair and department grapples with assessment. The Agency Spotlights presented after Chapters 7 and 8 (pp. 105-112 and pp. 125-129) offer helpful insights into data collection, learning goals, and the power of national recognition. How an agency adds to the curriculum will depend on each campus’ situation, yet these examples are helpful tools to maximize a reader’s understanding of the complexities—whether curricular or co-curricular—that often accompany creating a new program.

Chapters 9-14 (Section II) address many questions about how to build a student-run agency as a business. Creating a business within an academic environment may be uncharted territory, and several chapters in this section detail the concepts discussed in Section I. For example, Chapter 9 (Establishing a Business within the Academic Environment) provides a detailed list of questions to help faculty drive strategic planning discussions linked to the curriculum discussion (also addressed in Chapter 2). Chapter 10 (Dissent within the Ranks) discusses how programmatic change can create opposition. Again, Swanson pulls forward concepts addressed in Section I and his own experiences to provide commentary on how to address criticism. Chapter 11 (Establishing a Firm Foundation) discusses resources, space, supplies and equipment (see Chapter 3). Swanson’s list is comprehensive, and he even notes that the space and equipment recommended are basic to ensure the program’s success. However, given the varied nature of campuses and available resources for program start-up, prioritizing the list could have benefited programs with limited initial funding.

Section II is rounded out with chapters on billing (Chapter 12), recruiting and retaining clients (Chapter 13) and promoting the agency (Chapter 14). Chapter 13 presents a sample client services agreement (pp. 197-199) key to client recruitment and retention, while the Agency Spotlight at the conclusion of Chapter 14 (pp. 220-222) reinforces the value of branding and promotion for the actual agency.

The final section (Chapters 15-17) offers thoughts on a student-run agency co-existing with professional agencies in the area (Chapter 15), working with local nonprofits (Chapter 16), and creating a diverse, inclusive approach for recruiting clients (Chapter 17). Of significant help to anyone balancing organizational and accreditation issues with town/gown relationship building are the Agency Spotlights at the end of Chapter 15. Campuses may collect good ideas for their own unique circumstances from the spotlights “Working with an Advisory Board of Alumni” and “Bridging the Gap in a Small Town.”

Swanson also shares a directory of student-run communication agencies. The 2016 list, gleaned from a professional online directory and institutional websites, presents the university, the agency’s focus, name (if used) and URL. For those considering the development of a student-run agency, the directory, albeit a bit dated, complements the Spotlights and Swanson’s advice.

Conclusion

Wherever an institution is creating professional activities to equip students for the workplace, Real World Career Preparation: A Guide to Creating a University Student-Run Communications Agency can help move the conversation. Whether the discussion is about investing in an applied learning environment or about how the agency experience can demonstrate learning for accreditation, the book’s easy-to-read style offers an entry into that conversation. Using the Agency Spotlights, faculty can develop a deeper understanding of how to shape conversations about operational and curricular issues central to this type of student learning opportunity and better prepare the next generation of communication professionals.

© Copyright 2017 AEJMC Public Relations Division

A Dam(n) Failure: Exploring Interdisciplinary, Cross-Course Group Projects on STEM-Translation in Crisis Communication

Laura Elizabeth Willis, Assistant Professor of Health and Strategic Communication in the School of Communication at Quinnipiac University.

Laura E. Willis, Quinnipiac University

Abstract

This exploratory, quasi-experimental study examines whether incorporating an interdisciplinary, cross-course aspect to a group project on the Teton Dam failure in a crisis communication management course would impact public relations students’ ability to translate technical aspects of the crisis for media and public audiences. Results suggest the inclusion of an engineering student as a technical expert negatively impacted project grades and increased student frustration. Possible improvements and lessons for future interdisciplinary, cross-course projects are presented.

Keywords: science communication, STEM translation, cross-course projects, interdisciplinary projects

A Dam(n) Failure: Exploring Interdisciplinary, Cross-Course Group Projects on STEM-Translation in Crisis Communication

According to the U.S. Bureau of Labor Statistics (2014), employment in occupations related to STEM—science, technology, engineering, and mathematics—is projected to grow to more than 9 million between 2012 and 2022. With this surge in STEM fields in recent decades, a heightened focus on science communication has followed. Professional networks (such as Stempra) and university programs (from UC Santa Cruz to MIT) have been developed specifically for STEM public relations and communications practitioners. With this in mind, it is becoming increasingly appropriate for communications educators to integrate STEM-related coursework into their curriculum.

Effective science communication is informative (Fischhoff, 2013), and communications practitioners must develop the skills to create meaningful interactions with STEM professionals that result in the translation of scientific jargon for lay audiences (Woolston, 2014). While not all public relations students may envision a future working in science communication, the value of learning how to function effectively in interdisciplinary teams is understood to be universal across professions (Goltz, Hietapelto, Reinsch, & Tyrell, 2008). As such, the present paper describes an interdisciplinary, technical group project designed by a strategic communication professor and an engineering professor to provide students the experience of working in interdisciplinary groups within the safe environment of a classroom. Interdisciplinary groups, as used in this paper, refer to combinations of undergraduate students from two different courses in two different degree programs. Through this quasi-experimental design, the study examines the implementation of the cross-course, interdisciplinary group project and its effect on the project’s key learning objectives, specifically the translation of technical language.

As a part of the cross-course, interdisciplinary group project, undergraduate public relations majors and undergraduate engineering majors were required to collaborate, where each small group of public relations majors was teamed up with one engineering major. This paper may be beneficial in assisting other faculty who seek to initiate such interdisciplinary, cross-course teams. To begin, this paper reviews literature related to cross-course and interdisciplinary learning environments. The quasi-experimental study is then discussed, and results of the study are presented in conjunction with suggestions for future cross-course, interdisciplinary project development.

LITERATURE REVIEW

Cross-Course and Interdisciplinary Projects to Enhance Learning

Two strategies used to enhance learning at the university level include cross-course and interdisciplinary projects. Both tactics require significant prior planning on behalf of the faculty members developing the projects, and often feature a series of learning objectives that may be shared across all students or vary between students to directly connect with the specific learning objectives of the class in which the student is enrolled (Kruck & Teer, 2009; Waltermaurer & Obach, 2007).

Cross-course projects. Cross-course projects are regularly used to foster collaboration within a major and provide students the opportunity to work in teams that may mirror team work environments common in their career field (Flosi, Fraccastoro, & Moss, 2010). Extant research on this tactic also shows it is utilized to emphasize a discipline’s integrative nature, highlighting how material within a major should not be understood as compartmentalized by class (Waltermaurer & Obach, 2007). In addition to projects that are developed within a specific major or discipline, cross-course projects are frequently used to bolster the goals of a liberal arts education. These projects are developed to encourage students to integrate their learning in a general education context, instead of assuming that students will integrate ideas and practices on their own, outside of the classroom (Envick, Madison, & Priesmeyer, 2003; Wingert et al., 2011). While these cross-course projects generally include learning outcomes from multiple disciplines as well as interdisciplinary learning outcomes, it seems that communication learning objectives have not been included in the pedagogical examination of cross-course projects thus far.

Interdisciplinary projects. Interdisciplinary efforts require integration of disciplinary sub-contributions, and participants need to take into account their peers’ work in order to make their own contributions (Petrie, 1976). According to Repko (2008), interdisciplinary learning influences four cognitive abilities, including perspective-taking, the development of structural knowledge, the integration of conflicting insights from multiple disciplines, and the production of interdisciplinary understanding of a problem. These projects have also been used to bolster technical and employability skills (Juhl, Yearsley, & Silva, 1997). Interdisciplinary projects may be team-taught, although this is not a requirement (Little & Hoel, 2011).

Extant research examining the role of interdisciplinary group projects has found that cooperative work among students can increase learning (Birol, Birol, & Cinar, 2001; Jensen, Moore, & Hatch, 2002) and improve student attitudes toward coursework (Little & Hoel, 2011). However, most of the work examining the impact of interdisciplinary group projects has been conducted in business (Envick, Madison, & Priesmeyer, 2003; Kruck & Teer, 2009), science (Juhl, Yearsley, & Silva, 1997; Little & Hoel, 2011), and engineering courses (Jaccheri & Sindre, 2007; McCahon & Lavelle, 1998). The present paper seeks to fill this gap in the literature by considering the use of an interdisciplinary, cross-course project in an upper-level public relations course.

Crisis Communication  

The study of crisis communication focuses on the management of organizational communication during and after a crisis (Ulmer, Sellnow, & Seeger, 2014). According to Coombs (2001), crisis communication management courses should focus on three key objectives: (1) approaching crisis management, (2) understanding key concepts, including the core elements of the crisis sensing mechanism and guidelines for selecting crisis team members, and (3) developing essential skills and abilities, such as functioning as an effective spokesperson, constructing crisis management plans, and assessing information needs and resources during crisis situations. When critical details of a crisis are STEM in nature, such as the environmental effects of machinery or infrastructure failures, it is imperative that crisis communicators translate complicated technical information. Beyond issues of image repair, crisis communication courses should develop mindsets and skills that enable students to help organizations communicate clearly with their key publics, with the ultimate goals of transcending crises and strengthening relationships with those publics.

In the pedagogical literature for public relations professors, a balance between theory and practice has been recommended, as has the integration of individual and group case study work in upper-level courses (Sparks & Conwell, 1998). According to Coombs and Rybacki (1999, p. 57), “the most desirable teaching strategies and assignments are those which enable students to put theory into practice.” Building on this premise, the present study examines the implementation and effect of an interdisciplinary, cross-course technical group project between undergraduate geotechnical engineering and public relations students on the application of crisis communication theory and practice. This study is particularly concerned with the impact of working in an interdisciplinary team on public relations students’ translation of technical language and solutions for media and public audiences. With this in mind, the following research questions are proposed:

R1: How does the introduction of an interdisciplinary, cross-course project with engineering students impact public relations students’ success when dealing with an engineering crisis?

R2: How does the introduction of an interdisciplinary, cross-course project with engineering students impact public relations students’ ability to translate technical engineering language for lay audiences?

R3: How do students assess a STEM-related crisis communication case study project?

R4: How do students assess their own ability to translate technical language for lay audiences through the project?

METHOD

For the purposes of this study, an environmental crisis caused by an engineering failure served as the basis of a group project assignment in two sections of an upper-level crisis communication management course. Only the experimental section was given access to an engineering student who served as an ‘engineering expert’ throughout the assignment. This interdisciplinary, cross-course project blends crisis communication theory with the applied, practical skills required to work with a STEM client. The assignment was composed of two key components: an in-class scenario and a written crisis communication plan. The planning for this project was modeled after the recommendations put forth by McCahon and Lavelle (1998) in their discussion of implementing cross-disciplinary teams of business and engineering students.

According to Shadish, Cook, and Campbell (2002), quasi-experimental designs utilize pre-existing groups as a way of examining the effects of an experimental manipulation. When teaching multiple sections of the same course, professors have the opportunity to introduce a change in one course, such as an interdisciplinary, cross-course project, while holding other variables as constant as possible to examine the impact of the pedagogical manipulation (Carle, Jaffee, & Miller, 2009). Although this design does not allow for strong causal attribution (Shadish, 2006), it does provide professors the opportunity to gain systematic and empirical evidence of effectiveness. Therefore, this study provides preliminary data regarding the effectiveness of interdisciplinary, cross-course group projects on public relations students’ ability to utilize engineering “experts” to increase meaningful translation of relevant technical information for their crisis communication materials.

Participants

Upper-level public relations undergraduate students (N = 50) enrolled in two sections of a senior seminar course on crisis communication management at a small, private university in the northeastern United States participated in this study. The mean age of the participants was 21 years (SD = .85). All were seniors, in their fourth full year of undergraduate studies, and most were women (98%). Most had taken the science courses required for their general education requirements (88%); however, none had college-level experience with engineering material. Demographics and STEM course history did not differ between sections.

Procedure

At this university, the strategic communication department generally offers two sections of a crisis communications management senior seminar course each semester. The section to receive the experimental manipulation, i.e., the interdisciplinary, cross-course version of the engineering crisis assignment, was randomly selected, and the other section served as the control. The author taught both sections. Aside from the experimental manipulation, both sections received identical course materials, including assigned readings, lecture information, other assignments, and exams.

All participants completed one of two versions of the in-class engineering crisis activity. The activity was based on a case study of a geotechnical engineering failure commonly used in geotechnical engineering courses, the Teton Dam failure. The case study and related materials, including two scholarly readings and two videos, were selected by a geotechnical engineering professor. The control section (N = 26) was exposed to all of the related case study materials prior to class and asked to begin discussing the case and potential crisis response strategies with their group members for their in-class scenario. The experimental section (N = 24) was also exposed to all of the related case study materials prior to class; however, for their in-class scenario, each group was provided an “engineering expert” from a mid-level engineering course at the same university to serve as its engineering liaison. The engineering students were assigned randomly to the crisis communication groups. Each group’s “expert” supplied a supplemental technical report with further information about the crisis.

All student groups were then given a week to develop a crisis communication plan, using a crisis communication theory to propose a course of action featuring both technical and communication components. The final plan was expected to include a timeframe for crisis communication, discussion of spokesperson(s) and stakeholders, and image restoration strategies appropriate for the crisis. Additionally, the groups were to develop the opening statement for the initial news conference, the initial press release, and a plan of action for the fictional engineering team tasked with determining how to move forward with the failure site. The groups in the experimental section were encouraged to consult with their “experts” outside of the in-class scenario as questions arose.

Measures

Achievement. Using academic records from the sections’ gradebooks, students’ academic achievement on the assignment as a whole and on the subcomponent of the assignment focused on technical language translation, worth 15 of 100 points, was objectively examined (see Table 1 for descriptive statistics). Additionally, two independent scorers evaluated the technical language translation of the public-facing component of the assignment, the press release, using a rubric developed to assess effectiveness of translation. The reviewers showed a strong degree of agreement, with a Cronbach’s alpha of .93.

Table 1. Descriptive Statistics of Achievement Measures

Measure                                 Mean    SD     Range
Overall project grade (out of 100)      88.9    5.3    26.25
Translation subcomponent (out of 15)    11.9    1.4    6
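
For readers who want to see how an inter-rater agreement coefficient of this kind can be calculated, the following is a minimal sketch of Cronbach’s alpha with the two raters treated as “items.” This is an illustration only: the rater score lists are hypothetical placeholders, not the study’s data, and the study does not specify which tool was used for the computation.

```python
# Minimal sketch: Cronbach's alpha for two raters, treating each rater as an "item."
# The score lists below are hypothetical placeholders, not the study's data.
from statistics import pvariance

def cronbach_alpha(ratings):
    """ratings: a list of per-rater score lists, all of equal length."""
    k = len(ratings)                                     # number of raters
    item_variances = sum(pvariance(r) for r in ratings)  # sum of each rater's variance
    totals = [sum(scores) for scores in zip(*ratings)]   # each student's combined score
    return (k / (k - 1)) * (1 - item_variances / pvariance(totals))

rater_1 = [12, 14, 11, 15, 13, 10, 14, 12]  # hypothetical press-release scores
rater_2 = [13, 14, 10, 15, 12, 10, 13, 12]  # hypothetical press-release scores

print(round(cronbach_alpha([rater_1, rater_2]), 2))
```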

Assessment. Students anonymously evaluated their experience with the engineering crisis assignment as part of a larger end-of-the-semester class survey. Through an open-ended item, students were asked to describe their overall experience with the engineering crisis activity, making sure to touch on any concerns related to the translation of technical information for lay audiences.

FINDINGS

Achievement

Given the exploratory nature of the research questions, two-tailed t-tests were used to examine differences in achievement scores across the two sections. There was a significant effect of section on overall project score, t(48) = -3.49, p = .001: students in the experimental section, who were assigned and able to work with an engineering student serving as a technical expert for the project (M = 86.4, SD = 5.8), scored lower on the engineering crisis project assignment than students in the control section (M = 91.2, SD = 3.6). There was also a significant effect of section on the technical translation subcomponent, t(48) = -5.9, p < .001, with students in the experimental section (M = 10.9, SD = 1.2) scoring lower, on average, than students in the control section (M = 12.7, SD = 1.1). This effect was also found in the independent scorers’ evaluations, t(17) = -3.8, p = .05.
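
As an illustration of the analysis described above, the following is a minimal sketch of a two-tailed independent-samples t-test comparing section scores, written with scipy (an assumption; the study does not name its analysis software). The grade lists are hypothetical placeholders rather than the sections’ actual gradebook data.

```python
# Minimal sketch: two-tailed independent-samples t-test comparing two sections' scores.
# The grade lists are hypothetical placeholders, not the study's gradebook data.
from scipy import stats

experimental_scores = [86, 84, 90, 81, 88, 85, 92, 83, 87, 86]  # hypothetical
control_scores = [91, 93, 89, 92, 94, 90, 88, 92, 91, 90]       # hypothetical

t_stat, p_value = stats.ttest_ind(experimental_scores, control_scores)  # two-tailed by default
print(f"t({len(experimental_scores) + len(control_scores) - 2}) = {t_stat:.2f}, p = {p_value:.3f}")
```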

Assessment

The qualitative responses to the item in the end-of-the-semester survey regarding the engineering crisis assignment indicated that students in both sections saw value in a STEM-related assignment. One student from the control section wrote, “I hadn’t ever thought about how – no matter where I end up working – my organization might go through a crisis that has scientific or technical components. It was interesting to think about how I might need to first learn what the technical issues are so that I can explain them to others without jargon.”

The responses also highlighted that students in the experimental section were generally more aware that translation of technical information was a key learning objective of the assignment. One student explained that working with the expert “helped me realize how difficult it can be to translate technical information for non-experts.” That being said, students in this section also expressed frustration with their assigned ‘experts’ from the engineering course. One student who was in the experimental section wrote “I was so annoyed because it felt like he didn’t care to help us better understand the material. He would just explain the engineering failure in the same way, no matter how we asked for more or a different explanation.” Another student wrote that “it felt like she didn’t care as much about the project as we did – I don’t know maybe it wasn’t a big project for her course.” Another issue that students in the experimental section discussed in their responses was a desire for more in-class interaction with their experts: “I wish that we could have had more than 1 class period to meet with our expert since we didn’t have a time that worked for us to meet outside of class as a group.”

DISCUSSION

This preliminary investigation used a quasi-experimental design to examine the role of interdisciplinary, cross-course projects in public relations students’ ability to translate the key technical concerns of an engineering failure crisis for media and general public audiences. In the control section, the communication students developed a crisis communication management plan based on the case study materials alone, without contact with the engineering course. In the experimental section, the crisis situation was supplemented by an engineering student serving as an expert, who provided a technical report and answered the group’s questions about the technical aspects of the case study. Based on this report and the interaction with the engineering students, the communication students then developed a crisis communication management plan. All crisis communication plans were expected to apply theory and propose a course of action featuring both technical and communication components.

In contrast to the positive impacts on student learning reported in other pedagogical examinations of interdisciplinary and cross-course projects, the results of this study suggest that the interdisciplinary, cross-course aspect of the assignment negatively impacted students’ achievement on the assignment as a whole and on the key learning objective in particular. Moreover, the students who participated in the interdisciplinary, cross-course project appear to have found the experience frustrating, even while noting its potential parallels to future work situations. It is possible that instead of working with other undergraduate students, public relations students may be better served by working with professional engineers who, given their experience, may be more comfortable explaining an engineering case study such as the Teton Dam failure.

In terms of improvements for a future interdisciplinary, cross-course project, students voiced that they would have preferred more in-class meeting opportunities with their ‘expert.’ While this interaction may be preferable, logistical challenges could impede such a change. Ideally, the cross-course project would involve courses that share the same meeting date and time, so professors could ensure all students would be available to meet. Unfortunately, because course scheduling is generally outside the control of faculty, the likelihood of this occurring will vary depending on an institution’s scheduling constraints. Another major improvement would be to encourage all involved faculty members to weight the project equally across courses. Again, this may not be possible or appropriate depending on (1) the make-up of the courses outside of the interdisciplinary, cross-course project or (2) the time and energy required for the various roles different students may be asked to play in the project. However, when possible, equal weighting could help reduce students’ perception that the project is significantly more important to one class than to the other. Finally, it is clear from the students’ use of time during the in-class activity that the importance of the “up-front” work required to prepare for the meeting with the ‘expert’ was not adequately stressed. For future projects, professors should emphasize the importance of meeting preparation so that in-class meeting time is used efficiently, especially if multiple in-class meetings are not possible due to scheduling.

There are several limitations to this study. While all variables that the professor of the crisis communication management course could control (material provided, other assignments, exams, etc.) were held as constant as possible, the flow of lecture and discussion between the two sections was shaped by students’ questions and comments. Moreover, although both sections met for the same amount of time per week, one section met once a week while the other met twice a week due to university scheduling. Another limitation is the short study duration: a week-long project in a single semester. Although statistically significant differences were found, the full pedagogical effects of interdisciplinary, cross-course projects might be more apparent in an analysis spanning multiple semesters.

Unlike pedagogical studies in other disciplines, this exploratory investigation does not support the benefits of interdisciplinary, cross-course projects for public relations students. For public relations students, the project stressed several essential interpersonal and intellectual learning outcomes, including written and oral communication, social intelligence, critical thinking and reasoning, and creative thinking. However, the results suggest that the introduction of an engineering student into the group to serve as a technical expert only increased students’ confusion regarding the technical aspects of the Teton Dam failure, and subsequently hampered their ability to translate those concerns for media and public audiences. Of course, due to the study’s quasi-experimental design, it cannot be concluded definitively that the difference in achievement is directly linked to the experimental manipulation.

As group projects have become integral to the contemporary workforce (Hirsch et al., 2001), the present study speaks to the role of interdisciplinary and cross-course group projects in general. In light of the logistical lessons learned, this study has implications beyond the fields of engineering and public relations. With regard to the practical concerns of developing interdisciplinary, cross-course projects, this study suggests incorporating multiple in-class meeting opportunities if possible, making the project’s weight equivalent across the various courses involved, and maximizing the utility of shared group time by encouraging students to complete all preparatory work prior to the in-class meeting(s). One positive takeaway for public relations professors specifically is that the assessment results from both sections encourage the incorporation of STEM-related assignments. Students remarked on the importance of being able to communicate effectively with and on behalf of STEM professionals, which highlights their understanding of the changes to come in the workforce at large and how those changes will likely affect their roles as public relations practitioners.

REFERENCES

Birol, G., Birol, I., & Cinar, A. (2001). Student-performance enhancement by cross-course project assignments: A case study in bioengineering and process modeling. Chemical Engineering Education, 35(2), 128-133.

Carle, A. C., Jaffee, D., & Miller, D. (2009). Engaging college science students and changing academic achievement with technology: A quasi-experimental preliminary investigation. Computers & Education, 52(2), 376-380.

Coombs, W. T. (2001). Teaching the crisis management/communication course. Public Relations Review, 27(1), 89-101.

Coombs, W. T., & Rybacki, K. (1999). Public relations education: Where is pedagogy? Public Relations Review, 25(1), 55-63.

Envick, B. R., Madison, T., & Priesmeyer, R. (2003). An interdisciplinary approach to entrepreneurship education: The cross-course project model. Journal of Entrepreneurship Education, 6, 1-10.

Fischhoff, B. (2013). The sciences of science communication. Proceedings of the National Academy of Sciences, 110(3), 14033-14039.

Flosi, A., Fraccastoro, K., & Moss, G. J. (2010). Cross-course projects: Teaching students on changing business communication methods. American Journal of Business Education, 3(1), 65-70.

Goltz, S. M., Hietapelto, A. B., Reinsch, R. W., & Tyrell, S. K. (2008). Teaching teamwork and problem solving concurrently. Journal of Management Education, 32(5), 541-562.

Hirsch, P. L., Shwom, B. L., Yarnoff, C., Anderson, J. C., Kelso, D. M., Olson, G. B., & Colgate, J. E. (2001). Engineering design and communication: The case for interdisciplinary collaboration. International Journal of Engineering Education, 17(4/5), 343-348.

Jaccheri, L., & Sindre, G. (2007). Software engineering students meet interdisciplinary project work and art. In Proceedings of the 11th International Conference on Information Visualization (IV ’07) (pp. 925-934).

Jensen, M., Moore, R., & Hatch, J. (2002). Cooperative learning-Part 3: Electronic cooperative quizzes. The American Biology Teacher, 64(3), 169-174.

Juhl, L., Yearsley, K., & Silva, A. J. (1997). Interdisciplinary project-based learning through environmental water quality study. Journal of Chemical Education, 72(12), 1431.

Kruck, S. E., & Teer, F. P. (2009). Interdisciplinary student teams projects: A case study. Journal of Information Systems Education, 20(3), 325-330.

Little, A., & Hoel, A. (2011). Interdisciplinary team teaching: an effective method to transform student attitudes. The Journal of Effective Teaching, 11(1), 36-44.

McCahon, C. S., & Lavelle, J. P. (1998). Implementation of cross-disciplinary teams of business and engineering students for quality improvement projects. Journal of Education for Business, 73(3), 150-157.

Petrie, H. G. (1976). Do you see what I see? The epistemology of interdisciplinary inquiry. Educational Researcher, 5(2), 9-15.

Repko, A. F. (2008). Assessing interdisciplinary learning outcomes. Academic Exchange Quarterly, 12(3), 171-180.

Shadish, W. R. (2006). Critical thinking in quasi-experimentation. In R. J. Sternberg, H. Roediger, & D. Halpern (Eds.), Critical thinking in psychology. Washington, DC: American Psychological Association.

Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton-Mifflin.

Sparks, S. D., & Conwell, P. (1998). Teaching public relations–does practice or theory prepare practitioners? Public Relations Quarterly, 43(1), 41-42.

Ulmer, R. R., Sellnow, T. L., & Seeger, M. W. (2014). Effective crisis communication: Moving from crisis to opportunity. Sage.

U.S. Bureau of Labor Statistics. (2014). STEM 101: Intro to tomorrow’s jobs. Occupational Outlook Quarterly. Washington, DC: U.S. Government Printing Office.

Waltermaurer, E., & Obach, B. (2007). Cross course collaboration in undergraduate sociology programs. Teaching Sociology, 35(2), 151-160.

Wingert, J. R., Wasileski, S. A., Peterson, K., Mathews, L. G., Lanou, A. J., & Clarke, D. (2011). Enhancing integrative experiences: Evidence of student perceptions of learning gains from cross-course interactions. Journal of the Scholarship of Teaching and Learning, 11(3), 34-57.

Woolston, C. (2014). Public relations: For your information. Nature, 509, 123-125.

© Copyright 2017 AEJMC Public Relations Division

Math, Message Design and Assessment Data: A Strategic Approach to the Facebook Assignment

Author

Tiffany Derville Gallicano, UNC Charlotte

Math, Message Design and Assessment Data: A Strategic Approach to the Facebook Assignment

The purpose of this assignment is to take a strategic planning approach to creating engaging social media content in a real-world context. For this assignment, students work as a class to set a weekly research-based objective and work in teams to plan the communication department’s Facebook fan page content for every day of a work week (Monday-Friday) during the semester. Other fan page account administrators can post important departmental content throughout the semester without disrupting the week-by-week student takeovers of the fan page. The assignment has been popular in social media and public relations strategy classes. It provides an experiential way for students to apply basic statistical concepts, assessment data, and message design theories, and it has the added benefit of serving as a potential resume item and portfolio sample.

Application of the Assignment to ACEJMC Professional Values and Competencies

The fan page assignment contributes to the fulfillment of several professional values and competencies described by the Accrediting Council on Education in Journalism and Mass Communications (n.d.). It contributes to the professional value and competency about applying theories in how content and images are presented (ACEJMC, n.d.) because students are asked to apply message design concepts from Heath and Heath (2007), which include simplicity, unexpectedness, concreteness, credibility, emotional content, and stories. When reviewing initial drafts, the instructor commonly points to one or two message features that a team needs to improve upon for their final product.

In addition, the assignment contributes to ACEJMC’s (n.d.) professional value and competency about conducting research using appropriate methods adopted in the workplace because students use prior fan page performance data to set a weekly performance objective and determine qualities of successful and unsuccessful posts. Students also review the fan pages of comparison communication departments as part of their research (in accordance with the recommendation by Paine, 2011, about examining competitors’ performance). In addition, students review the metrics for the most popular and least popular posts from the prior semester and apply message design theory (i.e., Heath & Heath, 2007) and inductive logic to discuss best practices for engaging their key publics.

This assignment also contributes to three other communication-related professional values and competencies established by ACEJMC. Students gain practice in writing correctly and clearly in a format commonly used in the workplace through the text that accompanies their fan page posts (ACEJMC, n.d.). They are assigned a team grade, so they must critically assess their own and their teammates’ work “for accuracy and fairness,” as well as for clear, grammatically correct writing (ACEJMC, n.d., para. 13). Another communication-related competency relevant to this assignment is the call for students to apply the current technologies used by professionals and to understand the digital world (ACEJMC, n.d.). Students learn best practices for the digital world through their research on successful Facebook posts and by drafting their own digital content. Also, to earn an A, students must use their own images/videos for all posts and are encouraged to use resources such as Canva for images.

Finally, this assignment contributes to the ACEJMC (n.d.) competency about applying basic math and statistics. Students calculate the mean, median, mode, and standard deviation of data from the prior semester to set the weekly performance objective that will apply to all teams. They use basic percentage calculations to determine how many interactions would be needed to achieve particular percentage increases. Students are also encouraged to report the percentage by which they surpassed the weekly class objective on their resumes/LinkedIn profiles if relevant.

Connection to Best Measurement Practices

To contextualize the strengths and limitations of the assignment as they apply to the professional practice of public relations, students are taught the Barcelona Principles 2.0 in conjunction with the assignment (see the Institute for Public Relations, 2015). Students are told that the best objectives are tied to business results, and the number of interactions to a post is merely an output measure about whether a campaign is on the right track (in conjunction with an analysis of comments, which is another mid-campaign output measure). Questions about measuring social media and the Barcelona Principles also appear on the class study guide and exam to ensure that students are not confused about using an interaction count as an ultimate measure of a campaign’s success. The instructor explains to students that the assignment is designed in a truncated way to focus the class efforts on the course objectives. Additional survey and qualitative research could be added for a research methods class to tie the social media performance to business results. In conjunction with the assignment, students also share experiences with how they measure the success of their social media in their internships and compare these measures (or lack of any measure) to the Barcelona Principles. Students are shown an award-winning video about a Facebook campaign received from a PR agency, which is paused periodically to identify key terms (output, outcome), recognize message design strategies summarized by Heath and Heath (2007), and apply the Barcelona Principles to the campaign measurement.

Assignment Details

In addition to teaching the Barcelona Principles, additional best practices for measurement, and message design theory, the assignment introduction also involves a discussion about what makes public relations strategic. Ultimately, the assignment addresses the importance of goals, objectives, research about key publics, research-tested message design strategies, tactics that are appropriate to key publics, and assessment, which should occur during the campaign and at the end of the campaign.

Goals and Objectives

The class discusses the goal and sets the objective for weekly performance. The following goal is shared with them as part of the assignment: “Enhance the sense of community surrounding the UNC Charlotte Department of Communication Studies.” Next, the class is led through basic statistics to set an objective. Students examine the total number of weekly interactions for each week of the prior semester, which are included on the assignment handout, and calculate the median, mode, and mean. They then use a standard deviation website to calculate that value automatically and determine whether their distribution of weekly fan page interactions is normal (see EasyCalculation.com, n.d.). Kernler’s (2014) visual helps students understand the concept of standard deviation.

Once students have determined whether the weekly distribution of fan page interactions is normal based on the data’s standard deviation (extensive instructions are in the handout, which the class walks through together), they decide whether they can use the previous semester’s mean as an anchor for setting their objective or whether the median or mode might be a better choice. Once they have made their decision, as a class, they complete the following framework for the class objective: “To increase interaction on the fan page for the week (i.e., defined as the combined total of reactions, comments and shares) among members of any of our key publics by ________________, as compared with _________________________.” They calculate what a 10% increase would be from their anchoring metric and decide whether they think the increase is both meaningful and attainable. If the increase is not meaningful, they calculate what a 20% increase would be, and so forth. The class also acknowledges a major limitation of social media: we do not necessarily know whether the people interacting with the content represent the class’ key publics, which were defined as prospective, current, and graduated majors and the parents of all three groups; department faculty, staff, and administrators; and university administrators.
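
For instructors who would rather script this arithmetic than rely on a calculator website, the following is a minimal sketch of the objective-setting calculations described above: descriptive statistics, a rough empirical-rule check on normality, and candidate 10% and 20% increase targets. The weekly interaction counts are hypothetical placeholders, not the department’s data.

```python
# Minimal sketch of the objective-setting arithmetic described above.
# Weekly interaction counts are hypothetical placeholders, not the department's data.
from statistics import mean, median, mode, stdev

weekly_interactions = [34, 41, 38, 52, 29, 45, 40, 38, 61, 33, 38, 47]  # hypothetical prior semester

m = mean(weekly_interactions)
sd = stdev(weekly_interactions)
print(f"mean={m:.1f}, median={median(weekly_interactions)}, "
      f"mode={mode(weekly_interactions)}, sd={sd:.1f}")

# Empirical (68-95-99.7) rule check: a roughly normal distribution should have
# about 68% of weeks falling within one standard deviation of the mean.
within_one_sd = sum(abs(x - m) <= sd for x in weekly_interactions) / len(weekly_interactions)
print(f"share of weeks within 1 SD of the mean: {within_one_sd:.0%}")

# Candidate objectives: weekly interactions needed for a 10% or 20% increase over the anchor.
anchor = m  # or the median/mode, per the class discussion
for pct in (0.10, 0.20):
    print(f"{pct:.0%} increase over the anchor requires about {anchor * (1 + pct):.0f} interactions per week")
```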

Because of the modest number of the department’s fan page subscribers, a second goal was built into the assignment: “Increase awareness of the UNC Charlotte Department of Communication Studies fan page.” The predetermined objective for the class was “to increase page likes among members of any of our key publics by five people per team member.” Students recorded the names of the people they recruited and organized the list by key public. They were not allowed to recruit each other for the assignment. Fan page recruitment stretched some students beyond their comfort zones in promoting fan page content and might have played an important role in most students’ ability to reach their objective for the number of weekly fan page interactions.

Student Privacy, Assignment Timeline, Content, Rubric, and Teamwork

Each team’s Monday post includes an introduction of the team with a group picture and a quote for #MotivationMonday. To be in compliance with FERPA, students are informed that they need to tell the instructor prior to the deadline of their initial draft if they have any privacy preferences regarding the use of their name or picture. Drafts are due on Tuesday prior to the team’s week, feedback is provided within 24 hours, and students’ final submission for a revised grade is due via email Friday afternoon of the same week. The timeline is feasible because only one Facebook assignment is graded each week. Content is posted a week in advance, and the instructor emails the team to remind them to promote the fan page during the week and email anyone they featured on the day the relevant content appears if tagging was not possible. Students often share the Monday post on their feeds, which helps them exceed the weekly objective. Other themes for posts include Teach It Tuesday, Working Wednesday, Thursday Thoughts, and Forty-Niner Friday (named for the university mascot). The instructor maintains a list of content covered in the prior semester and restricts students from focusing on it (with some exceptions). The rubric for the assignment can be found in the Appendix. The complete handout exceeds the page limit of this article and can be requested via email (tgallica@uncc.edu).

REFERENCES

Accrediting Council on Education in Journalism and Mass Communications [ACEJMC]. (n.d.). Nine accrediting standards. Retrieved from http://www.acejmc.org/policies-process/nine-standards

EasyCalculation.com. (n.d.). Standard deviation calculator. Retrieved from https://www.easycalculation.com/statistics/standard-deviation.php

Heath, C., & Heath, D. (2007). Made to stick: Why some ideas survive and others die. New York: Random House.

Institute for Public Relations. (2015). Barcelona Principles 2.0 – updated 2015. Retrieved from http://www.instituteforpr.org/barcelona-principles-2-0-updated-2015

Kernler, D. (2014, October 30). A visual representation of the empirical (68-95-99.7) rule based on the normal distribution. Retrieved from https://commons.wikimedia.org/wiki/File:Empirical_Rule.PNG

Paine, K. D. (2011). Measure what matters: Online tools for understanding customers, social media, engagement, and key relationships. Hoboken, NJ: Wiley.

 

APPENDIX

Assignment Rubric

In nearly all cases, you and your team will share the same grade. Thus, you need to work together to brainstorm good content ideas and proof each other’s posts, which will help to ensure a consistently high quality.

An exception to sharing the same grade is if a team member is not meeting the internal deadlines that the team sets. If a member of your team is not keeping up with your internal timeline after at least one reminder and is not responsive to you within 24 hours, please email me or meet with me. Possible options I might take include lowering the teammate’s individual score or removing the individual from the team. Individuals who are removed from a team have the option of completing an alternate assignment (such as anonymously creating content for May 1-5) and will earn an assignment grade no higher than a C. Also, if I see that a team member did not author any of the posts, I will drop this person from the group.

5 points: Engaging, inviting, professional, human tone, including word choice. Use of up to one exclamation point per post to avoid sounding giddy.

10 points: Interesting content that is strategic with regard to the information covered in this worksheet and in our class discussion.

10 points: Quality of pictures or videos (aesthetic quality, lighting, sharpness, sound, if relevant) and how interesting they are (candid pictures and videos taken by you are preferred).

  • Any picture taken from the Internet that is not free to use (or that is free to use with attribution but is lacking the attribution) will result in a 0 for the individual author’s score and a maximum of 7/10 for the other team members’ scores. I will also file a plagiarism report with the university, even if I do not press charges.
  • For a score of 8-10, the Monday post picture must be taken of your group all together with sharp resolution and good lighting. The picture should enhance your professional footprint.
  • For a score of 9-10, high-quality original photos and videos must be included for every post. See me if you want to appeal for an exception. Remember that you can use Canva online to create free images for quotes.

10 points: Writing mechanics, factual accuracy, spelling (including the saved name of the document), AP style and brevity.

  • 10/10: Flawless
  • 9/10: 1-2 errors
  • 8/10: 3-4 errors
  • 7/10: 5-6 errors
  • 6/10: 7-8 errors

(and so forth)


 

How Do Social Media Managers “Manage” Social Media? A Social Media Policy Assignment

Author

Melissa Adams

Melissa Adams, North Carolina State University

How Do Social Media Managers “Manage” Social Media?: A Social Media Policy Assignment

As numerous public relations research studies have noted, social media communication by employees and other stakeholders often impacts public perceptions of their associated organizations, whether that communication is sanctioned by the enterprise or is a personal expression. Employees have been known to use social media to purposefully express anger or to attempt to harm an organization’s reputation through “venting” or negative “flaming” messages meant to be seen by potential clients or hires, presenting new challenges for public relations (Jennings, Blount, & Weatherly, 2014; Krishna & Kim, 2015).

As the resident social media “experts,” commonly charged with day-to-day management as well as monitoring and responding to such communication, public relations professionals are usually the primary resource for the development of social media policies (Lee, Sha, Dozier, & Sargent, 2015; Messner, 2014). Even though organizations may not have a policy in place when they become active on social media, they often realize the necessity of one after gaining some experience (Messner, 2014).

This assignment was developed to address the task of policy development with practical training that foregrounds professional ethical communication guidance, legal precedent, and collaboration with organizational stakeholders. Researching and crafting the policy also prepares students for the emergent public relations role of social media policy maker and manager (Neill & Moody, 2015).

Assignment Rationale

The social media policy assignment was designed to integrate knowledge gained from recent course material and discussion of ethical social media practice, a unit on the current legal environment (copyright, etc.), and a workshop on the basics of campaign planning. It challenges students to apply what they have learned to the development of a comprehensive policy that addresses organizational needs and includes all the appropriate information (i.e., they must think it through just as they would in an agency or professional project). The unit begins with the question “How do social media managers really ‘manage’ social media?” As the class moves through the ethics and legal units, this question continues to prompt discussion of the challenges that digital public relations practitioners must take into account as resident technical experts, planners, and policy advisors managing social media and organization-public relationships (Lee, Sha, Dozier, & Sargent, 2015; Neill & Moody, 2015). Legal case precedent and issues of copyright, fair use, and freedom of speech as expressed on social media (e.g., the Hispanics United versus National Labor Relations Board case) are the focus of class discussion leading up to the social media policy assignment (Lipschultz, 2014; Myers, 2014).

In addition, this assignment requires students to identify and work with a client organization, learn about the organization’s potential risks from inappropriate social media use, and then make analytical decisions to construct an ethical, comprehensive policy to address them. Finally, the completed social media policy provides students with a professional quality portfolio piece, and if the client chooses to adopt it, an impressive resume-builder.

Student Learning Goals

This assignment develops several communications practice competencies noted by public relations educators and practitioners as desired skills for young professionals. Through its blend of research and knowledge application, the social media policy assignment teaches students to think like practitioners who follow best practices and to appreciate the value of collaboratively developed policies (Freberg, Remund, & Keltner-Previs, 2013; Messner, 2014). Working through this assignment, students build practical research skills by conducting discovery interviews with organization practitioners or administrators, while simultaneously gaining experience working with a client and managing logistics and communication. The assignment also helps students develop analytic acumen by having them audit client social media assets with regard to organizational risk.

By conducting a working review of existing organizational social media and example documents, students learn and understand common objectives and components of social media policies. They are then challenged to apply their recently gained legal knowledge to the development of an ethical and compliant written social media policy document.

Finally, as advanced writing and presentation skills are core competencies for public relations practice, the social media policy assignment provides an opportunity to refine presentation skills and gain experience producing professional quality documents. For the last stage of the assignment, students are required to formally meet and present their final policies to their client organizations, who in turn complete a satisfaction form for assessment.

Connections to Public Relations Theory and Practice

This assignment comes from a course developed for seniors and advanced juniors enrolled in the public relations concentration. It connects to recent scholarship and research on the ethical practice of social media in public relations. As communications professionals, students will likely be required either to update existing social media policies or to develop new ones for clients or employer organizations. To do this, these young professionals will need to collaborate with stakeholders across the organization, in human resources, legal, and marketing, to develop, implement, promote, and enforce such policies across the enterprise, as noted in recent research (Neill & Moody, 2015). Crucially, they must be able to craft policies that both recognize the free speech rights of employees and provide a comprehensive guidelines document addressing all areas of possible use (Lipschultz, 2014; Myers, 2014).

In preparation for the social media policy assignment, students read and discuss a textbook chapter on the legal issues of social media practice (Lipschultz, 2014) and review National Public Radio’s Ethics Handbook (n.d.), which addresses the general ethical journalism practice concepts of fairness, transparency, and accuracy. They also review the Public Relations Society of America’s Member Code of Ethics (n.d.), which reinforces the journalistic principles covered by NPR’s Ethics Handbook, yet extends them to the role of ethical digital public relations practice by addressing practitioner duties such as the preservation of accurate information flow and safeguarding privacy (PRSA, n.d.). In addition to professional ethical guidance, these resources offer a framework for the students to refer back to as they work through the assignment and interact with their clients about the specific needs of their organizations.

Assignment Introduction and Execution

To introduce the assignment, two examples of actual (anonymized) social media policies of varying scope and audience (a university and a small business or student organization) are presented. Students form small groups to work through the example policies, comparing their components and listing the similarities and differences among the policy elements. Afterward, the class discusses the elements of each policy to determine their primary function and necessity. The social media policy assignment is then introduced with an in-depth handout (a brief version is provided in the Appendix) and a walk-through of the numerous questions students should ask to determine the needs and goals of their client organization, including the resources required for implementation and adoption.

Students are then charged with identifying a client organization to work with on this assignment—a nonprofit organization, student organization, or a small business they are affiliated with that needs such a policy. If needed, students receive help connecting with a potential client organization for the project.

From this point, students use the assignment instructions to work on their individual policy documents on their own time. After completion and grading, the policies are returned to the students so they can finalize them for their clients; students then email the final documents to the instructor for a proofread before delivery. This final step also allows a review of presentation points and the assessment form with the students.

Evidence of Learning Outcomes

Several of the client organizations have implemented their students’ policy documents following completion of this assignment. These include student organizations, two nonprofits, and two small businesses where students were employed or interning at the time. One small business, a massage studio and beauty spa, adopted the social media policy across its small chain of retail locations in the Southeastern US.

Additionally, students have noted in instructor feedback forms that this assignment was very useful as it gave them an opportunity to develop “real world” experience and a document they could use as both a portfolio piece and a professional writing sample.

REFERENCES

Freberg, K., Remund, D., & Keltner-Previs, K. (2013). Integrating evidence based practices into public relations education. Public Relations Review, 39(3), 235-237. doi: 10.1016/j.pubrev.2013.03.005

Jennings, S. E., Blount, J. R., & Weatherly, M. G. (2014). Social media—A virtual Pandora’s box: Prevalence, possible legal liabilities, and policies. Business and Professional Communication Quarterly, 77(1), 96-113. doi: 10.1177/2329490613517132

Krishna, A., & Kim, S. (2015). Confessions of an angry employee: The dark side of de-identified “confessions” on Facebook. Public Relations Review, 41(3), 404-410. doi: 10.1016/j.pubrev.2015.03.001

Lee, N., Sha, B. L., Dozier, D., & Sargent, P. (2015). The role of new public relations practitioners as social media experts. Public Relations Review, 41(3), 411-413. doi: 10.1016/j.pubrev.2015.05.002

Lipschultz, J. H. (2014). Social media communication: Concepts, practices, data, law and ethics. New York, NY: Routledge.

Messner, M. (2014). To tweet or not: Analysis of ethical guidelines for social media engagement of nonprofit organizations. In DiStaso, M. W., & Bortree, D. S. (Eds.), Ethical practice of social media in public relations (pp. 82-95). New York, NY: Routledge.

Myers, C. (2014). The new water cooler: Implications for practitioners concerning the NLRB’s stance on social media and workers’ rights. Public Relations Review, 40(3), 547-555. doi: 10.1016/j.pubrev.2014.03.006

National Public Radio (n.d.). NPR Ethics Handbook. Retrieved from http://ethics.npr.org/

Neill, M. S., & Moody, M. (2015). Who is responsible for what? Examining strategic roles in social media management. Public Relations Review, 41(1), 109-118. doi: 10.1016/j.pubrev.2014.10.014

Public Relations Society of America (n.d.). PRSA Member Code of Ethics. Retrieved from http://apps.prsa.org/AboutPRSA/Ethics/CodeEnglish/index.html

APPENDIX

 Assignment Worksheet

For this assignment you will create a formal, professional social media policy for an organization of your choice. If you need help identifying an organization, I will help you connect with a local nonprofit or student organization.

So how do you go about this? Just follow these steps.

First, research the social media footprint and assets of the organization: create a list of all of its platforms and note any apparent campaigns, strategies, and tactics.

  1. Identify, contact and talk to the person in charge of social media and brand administration for the organization (who will likely be in a communications function). If this individual can’t meet with you in person, you can connect with them via email or phone. Note that in smaller organizations, this contact might be someone in human resources or customer service.
    • Ask whether they have an existing social media policy. If so, does it fit their needs? If not, can you create one for them?
    • Then ask—what are the main concerns regarding social media for their organization? Also find out if there are any special regulations or legal issues you should be aware of when preparing your policy.
  2. Ask yourself (and your client organization when applicable) the following questions as you think through this assignment.
    • What is the “big picture” purpose of this policy? How will the policy meet certain organizational needs and align with business objectives?
    • What types of social media activities need to be addressed in the policy document? What platforms? What types of content?
    • Are there any special considerations (based on your organization) that you should consider and address in the policy?
    • Who is the audience for this policy?
    • What are the specific risks your organization hopes to mitigate with this policy and where might they come from? Employees? Other stakeholders?
    • Who will be in charge of policy administration? Who will monitor and report infractions? What will happen to violators? Who should be contacted with questions about the policy?
    • What resources might readers need to comply with this policy? (Example: A link to an organizational brand standards guide.)
    • How will your organization implement this policy? Who needs to review and approve it before dissemination?

 

Sections to include in your policy document:

Policy Overview – provide a rationale for the policy. Explain in clear terms why it is needed, how it will be implemented, etc. Explain its goal in positive terms (to maintain xxx, to promote xxx, etc.), and be sure to include a list of applicable social media assets. Explicitly state what is covered by the policy (and what isn’t).

Allowed Use – provide examples of approved use. This should include actual or example tweets/posts as well as brand elements. Use screenshots to illustrate as needed.

Disallowed Use – provide examples of what NOT to do! Use screenshots and descriptive language.

Legal – address any legal issues including copyright. (Example: the FERPA section in the university social media policy example.)

General Best Practices – create a short list based on the organization’s current social media assets. Follow the examples provided as well as those posted online by reputable and ethical organizations (such as the examples shared in class).

Resources – this section is for links or directions to internal resources such as legal documents or other policies, and for reference links to external sources.

Contact Information – for the administrator of the policy, legal, etc. as you see fit. Provide full information including email and phone number.

 

Assignment Rubric – 100 pts possible

  1. Research – 20 pts
  2. Planning/Organization – 25 pts
  3. Content (each section is addressed completely) – 35 pts
  4. Clarity (is it easy to follow?) – 10 pts
  5. Professional Presentation – 10 pts

 

 

Who Will Get Chopped?: Mystery Basket PR Challenge

Authors

• Mary E. Brooks, West Texas A&M University

• Emily S. Kinsky, West Texas A&M University

Who Will Get Chopped?: Mystery Basket PR Challenge

Based on Food Network’s Chopped challenge, the Mystery Basket PR Challenge is a competition focused on creativity, speed, and skill in which students are given a box of mystery “ingredients” (e.g., brand, crisis, strategy, channel, speaker, audience) that they must use to complete an assigned task (e.g., a tweet, an official statement, a headline). For example, a box might contain a brand name, a particular crisis, a group of people affected, and a celebrity, and the task would be to write a headline for a news release, keeping in mind which crisis response strategy from Benoit (1997) or Coombs (2007) might be most appropriate. Students open the box and have a limited time in their groups to complete the task, which they then pitch to the judges (faculty and local professionals). This requires teamwork and application of lessons learned in class as the student groups compete against each other.

The purpose of the Mystery Basket PR Challenge is for students to apply PR strategies to handle unexpected situations and solve problems collaboratively under a deadline. This challenge can also help prepare students to clearly and quickly articulate ideas.

Per Kolb’s (1984) experiential learning theory, learning through experience focuses on the process at hand and not necessarily the outcome of the project. By formatting the classroom as a simulated work environment, instructors can prepare students for greater success in their future careers when they face similar challenges (Ambrose, Bridges, DiPietro, Lovett & Norman, 2010; Svinicki & McKeachie, 2014). The challenge covers the five elements that are crucial to an experiential learning activity: the use of real-world situations; complexity (more than one answer may suffice); industry-specific concepts; student-led activity; and, finally, feedback and reflection (Svinicki & McKeachie, 2014). The benefits to students are numerous, especially in the PR industry, where strategy, creativity, spontaneous thinking, collaboration, and articulate wording are all pivotal to success.

This teaching tool is applicable to a variety of courses within the PR discipline (e.g., writing, campaigns, cases, ethics, social media) and to other strategic communication classes.

During fall 2016, a version of this challenge was successfully implemented in an advertising writing class as a final project. Student feedback was positive. For example, one student said, “the ‘Chopped’ final was also very intriguing! Having an interactive final that brings in industry professionals to critique our work will greatly help” students continuing in the field.

Assignment Instructions

The Mystery Basket PR Challenge includes three rounds. Each round consists of four mystery public relations components that groups of students must incorporate to produce a public relations solution for a specific organization. Students work in small groups to produce the solution in a short amount of time for a variety of situations, organizations, and media platforms, and the groups compete against each other. Working in a collaborative environment is essential in PR. Learning to meet deadlines is also pertinent, especially in the public relations industry, where clients expect work by a pre-set time. Further, PR practitioners must learn to handle unexpected crises in a timely manner.

Rules

The rules for each round include using all of the mystery basket components, creating the designated assignment within the time allotted, and making a persuasive pitch to the judges. In addition, students have a public relations “pantry” they can turn to for help, consisting of their textbooks, Internet access, cell phones, and laptops/tablets. This is similar to Chopped, where contestants have access to a modified grocery store in order to enhance a dish. Students are given one class period to practice prior to the real competition class period, with different ingredients than those used in the competition.

Components

Each group has a basket of mystery components during each round. The round assignments can change based on the class topic (see Appendix A for examples). For an introductory course, Round 1 could be the event planning round; Round 2 could be the social media round; and Round 3 could be the news release round. Just as in Chopped, the time allotted increases as the rounds increase in difficulty. During Round 1 for a social media class, the students will have 10 minutes to create a calendar-related promotion; during Round 2, students will have 20 minutes to create a hashtag campaign; and during Round 3, the students will have 30 minutes to write a blog post.

Professional Feedback

The student groups will be given live feedback on their work from industry professionals (see Appendix B for a sample judging rubric). The benefits of including public relations industry professionals in this challenge are many. Students have a chance to demonstrate their creative and innovative ideas, their presentation abilities, and their quick thinking skills to the professionals. In addition, students and professionals will begin to formulate relationships. This is important for potential future employment and/or mentorship.

When the time for each round expires, one person from each group must present the team’s final idea to the judges for one minute (or longer, depending on the challenge). The judges will deliberate and deliver their individual comments to each group. The judges will also choose a winner for every round. The class enrollment size and the division of groups will determine how many groups win per round. The winners from each round will be named the Mystery Basket PR Challenge champions.

Appendix A

Assignment Examples

The Mystery Basket PR Challenge can be modified for different PR courses (e.g., crisis, campaigns, writing, social media). Like Chopped, each successive round allows students more time (e.g., 10, 20 and 30 minutes). Some “ingredients,” like the brands, will be assigned, while others can be selected strategically by the students (e.g., which channel makes the most sense in this situation?).

Crisis Communication 

  • Round 1: Official statement
      • Component #1: Brand/Organization (this would be assigned to the group)
      • Component #2: An image restoration strategy from Benoit or Coombs
      • Component #3: Crisis (a type of crisis would be assigned to the group)
      • Component #4: Speaker (choose the title of the person who would share the statement)
  • Round 2: Social media post
      • Component #1: Brand/Organization
      • Component #2: An image restoration strategy
      • Component #3: Crisis
      • Component #4: Channel (assign or let them choose)
  • Round 3: News release
      • Component #1: Brand/Organization
      • Component #2: Crisis
      • Component #3: Audience
      • Component #4: A quote to include

Social Media

  • Round 1: Calendar promotion
      • Component #1: National ____ Day (choose a day that fits the brand/org; for example, if the students were given Bayer Aspirin as the brand, they might choose July 9 Rock ‘n’ Roll Day as the specific national day for a tied-in promotional post)
      • Component #2: Brand (company/organization assigned to the group)
      • Component #3: Social media site (choose the most appropriate site)
      • Component #4: Post (write copy, decide when it would be posted, sketch image)
  • Round 2: Hashtag campaign
      • Component #1: Organization
      • Component #2: Event
      • Component #3: Goal
      • Component #4: Social media platform
  • Round 3: Blog post
      • Component #1: Organization
      • Component #2: Audience
      • Component #3: Keywords
      • Component #4: Links

Appendix B

Judging Rubric Example 

Division A Judge Name:

Round 2: Social Media Post

Please circle which group in Division A is being judged:

Group 1                                           Group 2                                           Group 3

CREATIVITY

Please rate from 1-10 (with 10 being the best) the creativity of the social media post based on the components provided in the basket.

1    2    3    4    5    6    7    8    9    10

OVERALL IDEA

Please rate from 1-10 (with 10 being the best) the overall idea of the social media post based on the components provided in the basket.

1    2    3    4    5    6    7    8    9    10

PRESENTATION

Please rate the quality of presentation from 1-10 (with 10 being the best).

1    2    3    4    5    6    7    8    9    10

Please provide comments concerning the overall social media post results, the presentation, and/or anything regarding how the challenge was managed (both positive feedback and suggestions for improvement).

REFERENCES

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. San Francisco, CA: Jossey-Bass.

Benoit, W. L. (1997). Image repair discourse and crisis communication. Public Relations Review, 23(2), 177–186.

Coombs, W. T. (2007). Protecting organization reputations during a crisis: The development and application of situational crisis communication theory. Corporate Reputation Review, 10(3), 163–176.

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall, Inc.

Svinicki, M., & McKeachie, W. (2014). McKeachie’s teaching tips: Strategies, research, and theory for college and university teachers. Belmont, CA: Wadsworth Cengage Learning.