What if Professional Development Could Be…Fun!?

As has been oft-discussed on this blog, as well as among communities of teachers since time immemorial, professional development (PD) activities are quite frequently dull, irrelevant, time-consuming, and centered around a single expert delivering one-size-fits-all content to a group of workshop attendees.  This is not an effective approach to adult learning (or any learning, really), and it’s a poor approach to facilitating meaningful professional development. Higher education instructor Melissa Nicolas (2019) offers some particularly painful examples of poor PD from her own experience, which many educators can probably identify with: 

  • A three-hour lecture on active learning. 
  • An hourlong lecture on how ineffective lecturing is. 
  • A session on student engagement citing the fact that most people only retain the first 10 minutes and last 10 minutes of any lecture, delivered via a 60-minute lecture.  
  • A town hall meeting that left precisely two minutes for questions. 
  • A workshop that began with, “Everything I am about to share is in X document, so you can just read that.” 

Unfortunately, I would guess that almost all professionals who have engaged in PD, not just teachers, can recall examples of their own underwhelming, ineffective, and passive PD experiences.  While instructors at all levels are asked to promote and prioritize social-emotional learning for their students, it is less common for schools to simultaneously invest in fostering a dynamic, supportive staff environment that cultivates the social and emotional competence and capacity of their teachers (CASEL, 2022). 

Imagine a world where professionals are actually excited to participate in PD activities.  Imagine a world where instructors feel like PD truly supports their cultural and social-emotional learning needs while simultaneously helping them do their jobs better… 


I daresay it’s possible.  Thus, in this investigation I’d like to take a moment to explore the lighter side of PD and highlight a few great ideas for making PD fun (that’s right…fun) and engaging for educators in all contexts. 

For starters, if a PD initiative is going to be meaningful to the participants, they must have the opportunity to interact with one another and collaborate on authentic, relevant, problem-based activities (Teräs & Kartoğlu, 2017).  This means that formal presentations (i.e. lectures) should occupy the smallest portion of time in a PD workshop, and student-centered activities should occupy the largest.  If this is the starting place for any kind of PD initiative, there’s no need for a workshop coordinator to feel like they need to prepare anything flashy to ‘wow’ their attendees and trick them into thinking they’re having fun; prioritizing time for engagement and collaboration is the first and most important step.  From there, the rest might be icing on the cake.   

You’d like some recipe ideas for the icing, you say?  Well allow me to share a few tips and tools which promote collaborative, active learning for adults which could potentially ‘ice’ your PD ‘cake’ beautifully. 

Introduction/Icebreaker Activities 

All good lesson plans have an initial entry point—a hook—that helps students engage with prior learning, share background knowledge, establish social presence, and generally get curious about the content.  Adult learning is no different. Why not try incorporating some of these ideas from Ditch That Textbook (2019) to kick off your next PD gathering? 

  1. Digital Escape Rooms – Digital escape rooms get participants collaborating immediately over a shared problem/objective in a fun way.  The escape room theme can be tied into the learning objectives for the PD, or it can be a stand-alone activity which serves as an example of something to try with students. Here is a link to some free digital escape rooms to offer some initial inspiration. 
  2. Use Flipgrid for Short Intro Videos – Either some or all PD event attendees can be tapped to create short intro videos which help them introduce themselves to their colleagues. Videos can be shared with the whole group or in small groups.  Fun prompts for what to share (e.g. a hobby or a fun fact) can help colleagues connect with each other in new ways and engage on a social-emotional level. 
  3. Use a Google Jamboard, Kahoot, Quizizz, Gimkit, Quizlet, etc. – Have participants engage immediately with a starter quiz, poll, or brainstorm board based on an opening question or theme.  This helps activate prior knowledge, but it also serves as an opportunity for instructors to engage with a tool they may want to use in their classrooms. 

Gamification 

Custom-made Jeopardy games have been used as learning activities in K-12 classrooms for decades, and I’m here to say that they still hold up. When I was a 5th grade classroom teacher (not all that long ago), Jeopardy review day was a hallowed day…even for kids who had never seen an episode of the actual gameshow.  Likewise, I still remember playing Jeopardy when I was a student in my 6th grade social studies classroom many years ago (and our whole class loved it).  This is the power of gamification as a tool for learning.  Of course, kids aren’t the only ones who enjoy games. 

“…adults love to be playful, take risks, and experiment with new ideas just as much as children do.”

(Schmidt, 2015)

So why not gamify PD for adult learners? 

Carl Hooker (2015) offers an example of PD gamification through Interactive Learning Challenges (ILCs).  At its core, an ILC “…starts with the concepts of collaborative problem-solving and interactive creativity and adds an element of competition to learning.” An ILC can take place over the course of several days or in one hour, and it can be done with a relatively small group of people or a very large group of several hundred.  The challenges themselves can take many different forms, but Hooker (2015) describes one particularly successful activity, a scavenger hunt, which he facilitated in place of a keynote address during a PD event.  In this ILC, the goal was to help a large group of educators better leverage iPad and App use in their classrooms.  He created a custom scavenger hunt activity in which educators worked in groups to complete challenges related to the learning objective.  “I had the entire group line up and self-identify who was the most or least tech-savvy. After that, I paired and grouped the staff to ensure that each team of four included at least one ‘high tech’ person. The way I designed the challenges, every team member had to…participate in the creation of the final product, regardless of tech skills” (Hooker, 2015).  

After the activity, participants noted that they loved being engaged with the actual tools they were meant to be using, they valued learning from/with their peers, they felt empowered to try new things in a safe, collaborative space (especially those who identified as least tech-savvy), and…drumroll please…they had fun! 


Other ILC examples might include gameshow-style formats (Family Feud, Amazing Race, trivia, etc.), or perhaps a STEM-style design challenge in which participants work in groups to build or create a final product given a clear goal and building constraints (Ditch That Textbook, 2019).  These are, once again, activities which educators may choose to replicate or adapt for their own classroom environments.  

Also, offering prizes never hurts… 

EdCamps 

Finally, another option for creating an engaging PD experience is trying an EdCamp format. An EdCamp is an informal, peer-led, collaborative workshop (held either face-to-face or virtually) where educators gather to share stories, experiences, lesson plans, resources, and new ideas with each other, often around a shared theme or learning objective (Schmidt, 2015). EdCamps are usually grassroots “non-conferences” in that there typically aren’t any registration fees, vendors, or keynote speakers; they’re open to everyone, they’re often hosted locally, and sessions are usually led by volunteers (also workshop attendees) who are homegrown experts in their particular fields of interest (Schmidt, 2015).  In other words, at an EdCamp, everyone has something to share, and everyone has something to learn.   

EdCamps are a way for educators to take ownership of their own learning and simultaneously participate in something immediately relevant to their professional needs/interests.  EdCamps honor the rich reservoir of background experiences and professional expertise that all adult learners bring to the table, which is key to effective adult learning.  EdCamps help support and expand professional learning networks/communities and provide ‘playful’ learning environments where adult learners can take risks and experiment with new tools and ideas (Schmidt, 2015). No two EdCamps look alike, and there are endless possibilities for how to structure or format an EdCamp, making them particularly accessible, affordable, and customizable options for PD initiatives, no matter what the theme or learning objective(s) might be. 

Conclusion 

In short, in order to make PD ‘fun’ and engaging, coordinators should prioritize time for participant interaction, collaboration on problem-based activities, and opportunities to establish social presence.  There are a variety of ways to do this, and introducing games or an EdCamp format are just a couple of options.  No artificial gimmicks are needed for PD to be fun: just give professionals an opportunity to learn from and with each other in a space that encourages exploration and experimentation.  And did I mention that prizes never hurt? 

References: 

CASEL Guide to Schoolwide SEL (2022, March 1). Strengthen Adult SEL. CASEL. https://schoolguide.casel.org/focus-area-2/overview/ 

Ditch That Textbook (2019, August 16). 12 Engaging Activities to Rock Your Back-to-School PD. Ditch That Textbook. https://ditchthattextbook.com/12-engaging-activities-to-rock-your-back-to-school-pd/ 

Hooker, C. (2015, February 10). It’s Time to Make Learning Fun Again, Even for Adults. Edutopia. https://www.edutopia.org/blog/make-learning-fun-for-adults-carl-hooker 

Nicolas, M. (2019, November 19). Workshops that Work. Inside Higher Ed. https://www.insidehighered.com/advice/2019/11/19/how-academe-should-improve-its-professional-development-workshops-opinion 

Schmidt, P. (2015, March 20). Promoting Playful Learning Through Teacher Networks. Edutopia. https://www.edutopia.org/blog/changemakers-playful-learning-teacher-networks-philipp-schmidt 

Teräs, H., & Kartoğlu, Ü. (2017). A grounded theory of professional learning in an authentic online professional development program. International Review of Research in Open and Distributed Learning, 18(7). https://doi.org/10.19173/irrodl.v18i7.2923 

Meaningful Feedback in Online Professional Development


Just as online teaching and learning became a necessity for K-12 and postsecondary students during the COVID-19 pandemic, so too did online professional learning activities for educators at all levels.  Not only have professional development (PD) activities primarily been held in virtual spaces over the last two years (both synchronously and asynchronously), but often the learning goals themselves have orbited around the use of educational technology in service of the immediate improvement of online teaching and learning experiences. 

Of course, even prior to COVID-19, many professional learning and development enterprises took place online in order to meet the needs of busy educators. Asynchronous, “on demand” courses and workshops are commonly used for PD so that instructors may access materials whenever, and however frequently, they want.  Additionally, online PD allows instructors to access valuable resources they might not otherwise have access to, both locally and globally. In a case study performed by Gaumer Erickson, Noonan, and McCall (2012), special education teachers in a rural district were able to collaborate with educators and experts in a non-rural district via an online PD enterprise.  Participating teachers felt that the modality was an asset to their learning since they were given resources and feedback from experts the district might not otherwise be able to provide due to geography or lack of funding (Elliott, 2017).

As mentioned by participants in this case study, one of the key contributors to a successful professional learning experience is the opportunity to receive meaningful feedback.  Feedback and participant interaction are part of an active professional learning experience wherein an adult learner is implementing their learning in an authentic, problem-based activity (Teräs & Kartoğlu, 2017).  Feedback may come from a coach or instructor or from peers (or both), but regardless of the source, getting professional feedback is necessary in order to support learning implementation and critical reflection.  Feedback can feel like an automatic and organic part of the learning process in face-to-face settings.  For example, if an educator is being observed by a coach or mentor in their classroom, they would expect feedback to be shared directly following the observation.  Similarly, if a peer group is working on a project together in a shared space or workshop, they will naturally give instant formative feedback, usually verbally, to each other as they collaborate. 

What about with online PD?  For context, the operational definition I’m using for online PD is any Internet-based form of learning or professional growth that an educator is engaged in (Elliott, 2017). How might feedback for professional development look similar or different in an online learning context?  To what extent might feedback look different in an asynchronous environment? If PD is going to increasingly be situated in online environments, what tools are available to help assist in delivering meaningful feedback?

Teräs & Kartoğlu (2017) approach online professional development (OPD) through a framework called authentic e-learning.  Authentic e-learning has a nine-point framework, the points of which are well supported in PD research and adult learning theories independent of the mode or learning environment (Teräs & Kartoğlu, 2017).  The nine points for an authentic e-learning framework they propose are as follows:

1) Authentic context

2) Authentic tasks

3) Access to expert performances and the modeling of processes

4) Promoting multiple roles and perspectives

5) Collaborative construction of knowledge

6) Reflection

7) Articulation of understanding

8) Coaching & scaffolded support at critical times

9) Authentic assessment

Points 5, 8, and 9 on this list deserve particular attention because each requires interactions and communication among learners and instructors, which will often take the form of meaningful feedback through a virtual medium.  Within this learner-centered framework, technology should not be thought of primarily as a mode of delivering content.  Rather, it should be viewed as a platform for facilitating interactions.  Knowledge may be transferred using technology, but that’s not its most important role in e-learning.  When technology is a conduit for a dynamic web of collaborative interactions, authentic e-learning can take place (Teräs & Kartoğlu, 2017).  It’s certainly possible for information to be delivered in an asynchronous format using technology as the medium, but this shouldn’t be conflated with an authentic e-learning experience.  Interactions are key.

Perhaps one of the most effective ways to facilitate online interactions for professional development purposes is to create a Community of Practice (COP).  Names for similar groupings that surface in the literature include Professional Learning Community (PLC) and Community of Inquiry (COI).  Despite any nuanced differences that may exist among the three, COPs, PLCs, and COIs have quite a bit in common.  They are all entities distinct from formal learning and organizational structures, and they are particularly valuable for their ability to extend beyond them.  Members gather around shared experiences and/or goals and create their own communication channels and behavioral norms (Liu et al., 2009).  These communities can exist within an organization, or they might consist of professionals across multiple organizations, but they are meant to facilitate the sharing of knowledge and tools and to encourage critical discourse in a manner that benefits the professional growth of each of their members.  COPs are inherently collaborative and can often be formed around solving authentic, work-based problems (Liu et al., 2009).  Though coaches, mentors, or experts may participate in COPs, peer interaction and collaboration are at the heart of a COP, and thus feedback is most often sourced from peers.  COPs serve as a promising way to deliver timely, effective, relevant, and individualized support for adult learners while simultaneously decreasing the need for feedback coming solely from “experts.”

Online learning communities can be formed in a variety of different platforms, but regardless of the tech tool or medium, or whether the communities engage synchronously or asynchronously, COPs should have a medium in which they can engage in discussion, peer review, and collaborative problem-solving so that meaningful feedback may take place.  Referencing a prior post in March of 2021 (Global research collaboration and the pandemic: How COVID-19 has accelerated networked learning in higher education), some notable computer-based platforms for collaborative enterprises include:

This list represents nine powerhouse collaboration platforms, all of which rolled out between 2010 and 2020, and many of which depend heavily on the power and popularity of cloud storage or cloud computing, such that platform users may interact and build upon one another’s contributions in both synchronous and asynchronous ways.

When attempting to collaborate asynchronously, especially where coaching or mentoring is concerned, video review software can be another important tool to consider.  Teacher education programs or instructor professional development initiatives often use video review software to conduct remote classroom observations (though of course, video review may be used in a variety of fields for a variety of purposes).  GoReact is just one example of video review software.  This user-friendly review software offers users the opportunity to:

  • Record and share videos easily using any kind of device, including smartphones
  • Utilize cloud-based video storage so that recording, viewing, and grading can happen asynchronously
  • Integrate video evidence seamlessly within common Learning Management Systems
  • Give and receive time-stamped feedback on submitted video evidence, both written and recorded

Though I could likely spend a great many more hours discussing possible platforms to use in service of online learning communities, I wish to conclude with this simple, summative takeaway: quality PD requires feedback; therefore, effective PD conducted online must have ample space for interactions to take place among participants.  It really is that simple.

References:

Elliott, J. C. (2017). The evolution from traditional to online professional development: A review. Journal of Digital Learning in Teacher Education, 33(3), 114-125. https://www.tandfonline.com/doi/full/10.1080/21532974.2017.1305304

Gaumer Erickson, A. S., Noonan, P. M., & McCall, Z. (2012). Effectiveness of online professional development for rural special educators. Rural Special Education Quarterly, 31(1), 22–31.

Liu, W., Carr, R. L., & Strobel, J. (2009). Extending teacher professional development through an online learning community: A case study. Journal of Educational Technology Development and Exchange (JETDE), 2(1). https://aquila.usm.edu/cgi/viewcontent.cgi?article=1072&context=jetde

SpeakWorks, Inc. (2021). GoReact. GoReact. https://get.goreact.com/

Teräs, H., & Kartoğlu, Ü. (2017). A grounded theory of professional learning in an authentic online professional development program. International Review of Research in Open and Distributed Learning, 18(7). https://doi.org/10.19173/irrodl.v18i7.2923

Did It Work!? A Brief Look at Professional Development Evaluation in Higher Education & Beyond

As I continue to dive deeper into the research related to professional development (PD) and adult learning initiatives within higher education, one aspect of PD I’ve yet to explore is the evaluation of PD.  In other words, how do we determine if a PD enterprise was successful?  Is the learning having an ongoing, meaningful impact in the workplace?  Did it make a difference? 


In order to answer these questions, we must step back to think about two things: 1) what do we mean by ‘success’ in relation to PD (in other words, what particular indicators should we pay attention to), and 2) how/when should we gather data related to those indicators? 

As an entry point to this investigation, Guskey (2002) does an excellent job of pointing our attention to key indicators of effective PD and adult learning, as well as possible ways of gathering data related to those indicators.  These indicators (or “levels of evaluation”) are applicable for higher education instructors just as much as they are for K-12 teachers. Accordingly, Five Possible Indicators of PD Effectiveness—as I am referring to them—are summarized below: 

  • Participants’ Reactions. What’s measured: Did participants enjoy the learning experience? Did they trust the expertise of those teaching/leading? Is the learning target perceived as relevant/useful? Was the delivery format appropriate/comfortable? How data is gathered: exit questionnaire; informal narrative feedback from attendees. 
  • Participants’ Learning. What’s measured: Did participants acquire new knowledge/skills? Did the learning experience meet objectives? Was the learning relevant to current needs? How data is gathered: participant reflections; portfolios constructed during the learning experience; demonstrations; simulations; etc. 
  • Organization Support/Change. What’s measured: Was the support for the learning public and overt? Were the necessary resources for implementation provided? Were successes recognized/shared? Was the larger organization impacted? How data is gathered: structured interviews with participants; follow-up meetings or check-ins related to the PD; administrative records; increased access to needed resources for implementation. 
  • Participants’ Use of New Knowledge/Skills. What’s measured: Did participants effectively apply the new knowledge/skills? Is the impact ongoing? Were participants’ beliefs changed as a result of the PD? How data is gathered: surveys; instructor reflections; classroom observations; professional portfolios. 
  • Student Learning Outcomes. What’s measured: What was the impact on student academic performance? Are students more confident and/or independent as learners? Did it influence students’ well-being? How data is gathered: either qualitative (e.g. student interviews, teacher observations) or quantitative (e.g. assignments/assessments) improvements in student output/performance, behaviors, or attitudes. 

Table adapted from Figure 1.1, “Five Levels of Professional Development Evaluation,” in Guskey (2002) 

It is important to note that these indicators often work on different timelines and will be utilized at different stages in PD evaluation, but they should also be considered in concert with one another as much as possible (Guskey, 2002).  For example, data about participants’ reactions to PD can be collected immediately and is an easy first step towards evaluating the effectiveness of PD, but initial reactions reflected in an exit survey certainly won’t paint the whole picture.  Student learning outcomes are another indicator to consider, but this indicator cannot be measured right away and will require time and follow-up well beyond the initial PD activity or workshop. Furthermore, it can be harmful to place too much evaluative emphasis on any single indicator. If student learning outcomes are the primary measure taken into consideration, this puts unfair pressure on the “performance” aspect of learning (e.g. assessments) and ignores other vital evidence such as changed attitudes or beliefs on the part of the teacher or the role of context and applicability in the learning: 

“…local action research projects, led by practitioners in collaboration with community members and framed around issues of authentic social concern, are emerging as a useful framework for supporting authentic professional learning.”

(Webster-Wright, 2009, p.727)

In most instances, PD evaluation consists entirely of exit surveys or participant reflections collected shortly after a workshop or learning activity, with little follow-up (e.g. classroom observations, release time for collaboration using learned skills) occurring afterward (Lawless & Pellegrino, 2007).  This does nothing to ensure that professional learning is truly being integrated in a way that has meaningful, ongoing impact.  In their study evaluating faculty professional development programs in higher education, Ebert-May et al. (2011) found that 89% of faculty who participated in a student-centered learning workshop self-reported making changes to their lecture-based teaching practices.  When considered by itself, this feedback might lead some to conclude that the PD initiative was, in fact, effective.  However, when these same instructors were observed in action in the months (and years!) following their professional learning workshop, 75% of faculty attendees had made no perceptible changes to their teacher-centered, lecture-based teaching approach, demonstrating “…a clear disconnect between faculty’s perceptions of their teaching and their actual practices” (Ebert-May et al., 2011).  Participants’ initial reactions and self-evaluations can’t be considered in isolation.  Organizational support, evidence of changed practice, and impact on student learning (both from an academic and ‘well-being’ perspective) must be considered as well.  Consequently, we might reasonably conclude that one-off PD workshops with little to no follow-up beyond initial training will hardly ever be “effective.” 

It is also worth mentioning here that the need for PD specifically in relation to technology integration has been on the rise over the last two decades, and this need has accelerated even more during the pandemic.  In recent years the federal government has invested in a number of initiatives meant to ensure that schools—especially K-12 institutions—keep pace with technology developments (Lawless & Pellegrino, 2007). These initiatives include training the next generation of teachers to use technology in their classrooms and retraining the current teacher workforce in the use of tech-based instructional tactics (Lawless & Pellegrino, 2007). With technology integration so often at the forefront of PD initiatives, the question arises: should tech-centered PD be evaluated differently than other PD enterprises? 

I would argue no. In a comprehensive and systematic literature review of how technology use in education has been evaluated in the 21st century, Lai & Bower (2019) found that the evaluation of learning technology use tends to focus on eight themes or criteria:  

  1. Learning outcomes: academic performance, cognitive load, skill development 
  2. Affective elements: motivation, enjoyment, attitudes, beliefs, self-efficacy 
  3. Behaviors: participation, interaction, collaboration, self-reflection 
  4. Design: course quality, course structure, course content 
  5. Technology elements: accessibility, usefulness, ease of adoption 
  6. Pedagogy: teaching quality/credibility, feedback 
  7. Presence: social presence, community 
  8. Institutional environment: policy, organizational support, resource provision, learning environment 

It seems to me that these eight foci could all easily find their way into the adapted table of indicators I’ve provided above. Perhaps the only nuance to this list is an “extra” focus on the functionality, accessibility, and usefulness of technology tools as they apply to both the learning process and learning objectives. Otherwise, Lai & Bower’s (2019) evaluative themes align quite well with the five indicators of PD effectiveness adapted from Guskey (2002), such that the five indicators might be used to frame PD evaluation in all kinds of settings, including the tech-heavy professional learning occurring in the wake of COVID-19. 

References: 

Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., & Jardeleza, S. E. (2011). What we say is not what we do: Effective evaluation of faculty professional development programs. BioScience, 61(7), 550-558. https://academic.oup.com/bioscience/article/61/7/550/266257?login=true 

Guskey, T. R. (2002). Does it make a difference? Evaluating professional development. Educational Leadership, 59(6), 45. https://uknowledge.uky.edu/cgi/viewcontent.cgi?article=1005&context=edp_facpub 

Lai, J. W. M., & Bower, M. (2019). How is the use of technology in education evaluated? A systematic review. Computers & Education, 133, 27-42. https://doi.org/10.1016/j.compedu.2019.01.010 

Lawless, K. A., & Pellegrino, J. W. (2007). Professional development in integrating technology into teaching and learning: Knowns, unknowns, and ways to pursue better questions and answers. Review of Educational Research, 77(4), 575-614. https://journals.sagepub.com/doi/full/10.3102/0034654307309921 

Webster-Wright, A. (2009). Reframing professional development through understanding authentic professional learning. Review of Educational Research, 79(2), 702-739. https://journals.sagepub.com/doi/full/10.3102/0034654308330970 

EDTC 6104 Community Engagement Project – Professional Development Workshop for Resilient Pedagogy

For this quarter’s Community Engagement Project I have been tasked with creating a professional learning presentation or workshop on a topic of my choice which would be used to engage and provide professional growth for a selected audience. This project is meant to demonstrate my understanding of the performance indicators for ISTE Coaching Standard 3. The following is a framework for the construction of my professional development workshop for resilient pedagogy (RP).

ISTE Coaching Standard 3

  • 3a Establish trusting and respectful coaching relationships that encourage educators to explore new instructional strategies.
  • 3b Partner with educators to identify digital learning content that is culturally relevant, developmentally appropriate and aligned to content standards.
  • 3c Partner with educators to evaluate the efficacy of digital learning content and tools to inform procurement decisions and adoption.
  • 3d Personalize support for educators by planning and modeling the effective use of technology to improve student learning.

Intended Audience:  

The intended audience is higher education instructors from all over the world.  This will include attendees to a virtual Global Symposium and a group of instructors from a higher education institution in Indonesia participating in virtual PD workshops. 

Chosen Topic:  

The topic for this PD presentation will be Resilient Pedagogy (RP), both theory and practice. RP is an emerging instructional philosophy/framework with extremely timely implications for this current moment in education and the ongoing effects of the COVID-19 pandemic, as well as the many unknowns that may introduce themselves in the form of future crises or disruptions.  Though facets of RP have long been practiced by educators in the form of classroom differentiation, and though other frameworks like Universal Design for Learning (UDL) and Transparency in Learning and Teaching (TILT) inform resilient pedagogy, Rebecca Quintana and her colleagues at the University of Michigan have attempted to define a more expansive type of differentiation by building upon these approaches to instructional design and extending beyond them, bringing to the forefront the need for instructors to be agile and intentional in all educational contexts, but especially in moments of crisis and change.

More than just a fancy synonym for differentiation, resilient pedagogy can be defined as “…the ability to facilitate learning experiences that are designed to be adaptable to fluctuating conditions and disruptions.”  Resilient teaching is an approach that “take[s] into account how a dynamic learning context may require new forms of interactions between teachers, students, content, and tools,” and those who practice resilient pedagogy have the capacity to rethink the design of learning experiences based on a nuanced understanding of context (Quintana & DeVaney, 2020, para. 8).  The key to resilient teaching is a focus on the interactions that facilitate learning, including all the ways that teachers and students need to communicate with one another and actively engage with the learning material.  

My intent is to create a workshop that introduces the basic tenets of RP to participating instructors, offers practical examples of RP, provides inspiration and opportunities for implementing RP, and, ultimately, helps build resilience in educators in the long term.

Event Description:

This PD material will be used in two settings: 1) a virtual Global Research Symposium in which academics and higher education instructors from all over the world will be in attendance, sharing with and learning from one another’s research enterprises; and 2) a PD workshop for a university in Indonesia.  The Global Symposium will consist solely of a pre-recorded presentation, 12 minutes in length, with some opportunity for follow-up discussions in breakout rooms. The PD workshop will have opportunities for follow-up activities which extend beyond the pre-recorded presentation.

One concern for this project is that I must design for an international audience.  Providing transcripts, for example, will be important because, for many attendees, English is not their first language (though they are fluent).  Additionally, since I am not currently employed as a higher education instructor or learning designer, establishing trust and rapport with the intended audience may take some extra consideration.

Length:  

The pre-recorded presentation of RP content and case study examples is 12 minutes, reflecting the time constraint given for the Global Symposium. However, there will be extended activities and reflection questions for use in a PD workshop space.  Time allotted may vary depending on the workshop schedule, but my thought is that the presentation (12 minutes) and follow-up activities (30-45 minutes) might total 60 minutes in an active workshop session, not including any additional applications instructors may want to pursue on their own time.  In the workshop, time will be prioritized for instructor reflection and active participation with colleagues rather than lecture/presentation time.  I am creating follow-up activities under the assumption that attendees will participate synchronously in an online format, and I may or may not be the one leading the actual follow-up activities.

Active and engaged learning/collaborative participation

This slide deck is a framework for follow-up activities, engaging attendees with reflection questions, group discussions, and suggestions for practical application.  A summary of the framework is as follows:

  • Opportunity for brief social-emotional connection via a simple “What dog do you feel like today” slide; participants can respond in the chat with a number corresponding to the dog they associate with.  It’s silly and lighthearted.
  • Brief recap of the most important points from the RP presentation
  • 3 slides with reflection questions, each tailored to a different design principle of resilient pedagogy.  Ideally these questions would be discussed among peer instructors who work in the same department (i.e. group discussion).
  • An activity/exercise which asks instructors to workshop one of their own courses for extensibility.  A sample product is shown in the slide deck to model and help with direction.  The hope is that instructors will each work on their own course while actively collaborating and sharing ideas with members of their discussion group.

Address content knowledge needs

This presentation/workshop on the theory and practice of RP gives educators a chance to explore new instructional strategies (ISTE standard 3a), consider use of new digital tools and resources for varied mediums of instruction (ISTE standards 3c and 3d), and build their own resilience (ISTE standard 3b) so that they are better prepared to meet the fluctuating needs of their students–especially in moments of crisis–in the future.  Instructors will be invited to reflect upon barriers that may exist in their own contexts preventing them from practicing RP more robustly.  Additionally, instructors will be given the opportunity to collaborate with colleagues on possible applications of RP in their course designs.

Address teachers’ needs/presentation artifacts:

  • The original slide decks will be shared with attendees/instructors ahead of time so that they may keep them for their own reference, view them in advance, and have access to references and resources with live links. 
    • A link to the slide deck for the pre-recorded presentation is here.  It is only visible to those within my university organization for the time being.
    • A link to the workshop framework slide deck is, once again, here.
  • A recording of the presentation without captions will be provided separately as one of the deliverables for this project.  Captions have since been edited, and a version of the recording with closed captions is also available.  I will not be sharing a link to the presentation in this blog post until after a pending publication on this RP material is released.
  • Additionally, a written transcript for the recording is available here.  If a video platform does not easily allow for uploading the closed captions for presentation, the transcript document is both a back-up plan (redundancy!) and a possible supplement to the existing captions.

Standards reflection

As described above, this presentation/workshop on the theory and practice of RP aligns with ISTE Coaching Standards 3a through 3d.  Examples of each are provided below.

  • 3a Establish trusting and respectful coaching relationships that encourage educators to explore new instructional strategies.
    • The workshop invites educators to think about resilient pedagogy as an approach to instructional design which helps make courses resistant to disruption. 
    • This presentation and follow-up workshop gives practical guidance for what it looks like to design a course for extensibility, flexibility, and redundancy.
  • 3b Partner with educators to identify digital learning content that is culturally relevant, developmentally appropriate and aligned to content standards.
    • Takeaways can be immediately applicable, and the topic is especially relevant given the many ongoing challenges faced by schools and universities during the COVID-19 pandemic.  It meets higher education instructors where they are and speaks to the experiences and challenges they’ve already faced over the last year and a half.
    • Instructors are encouraged to make the learning relevant to their specific discipline and teaching role.  They are also encouraged to think critically about their own course design and reflect on their approach to teaching/learning, which is always valuable.
  • 3c Partner with educators to evaluate the efficacy of digital learning content and tools to inform procurement decisions and adoption.
    • Opportunities for reflection are provided, including questions which ask instructors to reflect on their relationships to the digital tools they use (or would like to learn to use) in their teaching. For example:
      • What new educational technology tools or platforms could you experiment with as you think about adapting a course for different modalities?
      • Do you know how you would approach teaching a course if students had unreliable internet access?
  • 3d Personalize support for educators by planning and modeling the effective use of technology to improve student learning.
    • The workshop slide deck models a possible way to plan a course with extensibility in mind.  It helps educators take concrete next steps toward resilience.

I look forward to delivering this project to real instructors in higher education with hopes that the theory and practical application of RP will be a source of inspiration, confidence, and clarity in the ever-changing landscape of teaching and learning, especially with the continued unknowns of the COVID-19 pandemic.

_______

May 23, 2022:

Building on the foundation set forth above, I have created a screencast on the theory and practice of RP to be used as a resource for higher education instructional faculty. This screencast is 13 minutes long and may serve as an asynchronous alternative when a live presentation/workshop isn’t possible.

“Put me in, Coach:” A Brief Look at Best Practices for Instructional Coaching in Higher Education

In 2020, Apple TV’s comedy series Ted Lasso, starring Jason Sudeikis, became an unlikely hit after its debut season, earning rave reviews, excellent ratings, and a platform as a quirky source of inspiration for coaches–and really just ordinary people–everywhere.  In the show’s premise, fictional American football coach, the mustachioed Ted Lasso, is recruited to coach an English Premier League football (soccer) team, a sport which he knows nothing about.  He’s been recruited by an embittered and recently divorced club owner who wants to see the team fail miserably in order to get back at her ex-husband.  As the season unfolds, Ted Lasso’s folksy wisdom, relentless optimism, vulnerability, and refreshing and profound decency eventually win over the skeptics, both in the show and in real life.

Image Source https://www.si.com/soccer/2021/07/22/ted-lasso-season-two-preview-jason-sudeikis-apple-tv

Indeed, professional athletic coaches such as Quin Snyder of the Utah Jazz and Steve Kerr of the Golden State Warriors have come out in support of the show and the coaching prowess of its lead (Cohen, 2021).  Snyder was even quoted saying that “[Ted Lasso] should be required watching for coaches” (Cohen, 2021).  Among Ted Lasso’s most notable coaching attributes are his:

  1. Commitment to having a short memory for failure while simultaneously building self-efficacy and resilience in his players
  2. Openness to modeling vulnerability, curiosity, and personal growth
  3. Ability to empower other leaders around him
  4. Varied approaches to coaching different individuals on the team depending on their individual needs, personalities, and backgrounds

Now of course I’m referencing a fictional television series about athletic coaching, and the analogy can only go so far, but I think some of these “stand out” tactics of Mr. Lasso’s are relatable for a reason. As journalist Ben Cohen puts it in his 2021 Wall Street Journal article, “Why Real Coaches Want to Be Ted Lasso”:

“There is a takeaway from the series…that applies to any line of work: The best coaches are the best managers of people.”

Ben Cohen, Wall Street Journal

Thus, as I turn to look at instructional coaching relationships in the world of higher education, there are many aspects of “what makes a good coach” that are universal to effective leaders and, dare I say, teachers in all kinds of contexts.  After all, what is a good coach if not a good teacher? What is a teacher if not a ‘manager of people’? Perhaps we are all wise to think on the attributes of a good coach, regardless of whether or not our job titles imply as much.

Anderson and Wallin (2018) offer some excellent “empirical tips” for instructional coaches to help us get started.  You’ll see that many of them, at their core, overlap nicely with some of Ted Lasso’s coaching “best practices” listed above.

  1. Build Relationships: Nothing meaningful can be accomplished without trust and respect between the coach and the coachee.  This takes time.  Establishing trust may even require that coaches acknowledge where their own shortcomings in skills/experience are in order to better listen to and learn from those whom they are coaching.  
  2. Remain Connected to Students: Stay active, connected, and relevant. In other words, “practice what you preach.” Trust and respect are harder to establish with instructors when there is a disconnect between the coach and what’s going on in classrooms.  If possible, continue teaching in some capacity while in a coaching position.
  3. Develop Leadership Skills: Maintain a growth mindset and actively seek out ways to improve upon the coaching and leadership skills you already possess.
  4. Model for Teachers: Be prepared to demonstrate.  Don’t just tell people to “go do” something; focus on active learning opportunities and practical takeaways that can be modeled for those being coached.
  5. Build in Planning Time: Allow for one-on-one time to come alongside those being coached and address their needs individually.  Collaboratively work with educators during this time to plan and implement new strategies. Treat instructors as partners in the process.
  6. Remain Focused on the Goal: It’s easy to be distracted by all the things that might be addressed by an instructional coach, and no single coach can be all things to all teachers.  Lean into your particular areas of expertise and stay attuned to the specific ways you’ll be best equipped to improve instruction.

Of course, when it comes to focusing on your goals as an instructional coach, much will depend on context, and higher education is its own unique ecosystem of teaching and learning.  Higher education is notoriously slow-moving in its shift away from a traditional, lecture-based classroom format.  Reasons for this are many, but they certainly include 1) time constraints for faculty, 2) lack of proper training in teaching practice, and 3) conflicts with instructors’ existing beliefs about teaching and learning (Czajka & McConnell, 2016).  There can also be a tension between faculty members’ prioritization of research activities over and above investing in the improvement of teaching practice, depending on their campus culture and which part of their job has a greater influence on their own professional identity (Czajka & McConnell, 2016).  This, then, is the unpredictable instructional landscape instructional designers/coaches in higher education must navigate.

Czajka & McConnell (2016) conducted a case study wherein an instructor from a large research institution was invited to work with an instructional coach to reform a course taught frequently to undergraduate students. The work in this case study is referred to as “situated instructional coaching,” in which a qualified collaborator (i.e., someone familiar with the subject/discipline/curriculum in question) works one-on-one with an instructor to change a course design over time, including class observations and feedback on real-time delivery, as well as opportunities for the instructor to reflect on the changes. In its original format, the instructor’s course for this case study was very much lecture-based and revolved around lengthy PowerPoint presentations delivered over the course of 75-minute class sessions twice per week. 

Image Source https://www.society19.com/free-things-to-do-when-youre-bored-and-in-college/

However, after working closely with an instructional coach and implementing changes to her course design and delivery in three phases, the instructor demonstrated measurable changes in her perceptions of “what worked” in teaching and learning while also implementing changes to her course that were notably more student-centered (such as breakout discussions in small groups).

According to this case study, the situated instructional coaching model seemed to work particularly well in a higher education setting. Having a collaborator who can aid in revisions can significantly reduce the time burden of recreating a course (which is often a major deterrent for faculty).  Additionally, since training is specific to the discipline and occurring in the classroom in real time, instructors are not investing time to attend general training workshops, listen to talks, or read and interpret unfamiliar literature as additive activities on top of their already-full schedules, nor are they having to wonder how certain strategies will actually translate into their specific field of study. Situated instructional coaching occurs in the midst of their normal teaching responsibilities, with feedback and opportunities for reflection provided immediately by the coach (Czajka & McConnell, 2016). It’s worthwhile to imagine how instructional coaches in higher education might ‘situate’ themselves on their own campuses to be the most helpful and relevant to the instructors they hope to assist.

If I may conclude with a return to Ted Lasso, we must remember that coaches and instructors are all on the same team, and they need one another to meet their goals. The coach succeeds when the player/instructor they’re working with succeeds. When coaches seek to openly collaborate with the individuals they’re coaching, faithfully modeling best practices in teaching and investing in those individuals’ specific needs, much growth is possible. And perhaps, at the end of the day, it’s more important to see players improve and grow over the course of the season than to walk away with the championship trophy.

References:

Anderson, V. & Wallin, P. (2018). Instructional Coaching: Enhancing Instructional Leadership in Schools. National Teacher Education Journal 11(2), 53-59.

Cohen, B. (2021, July 14). Why real coaches want to be Ted Lasso. Wall Street Journal. https://www.wsj.com/articles/ted-lasso-nba-coaches-11626232223

Czajka, C.D. & McConnell, D. (2016). Situated instructional coaching: a case study of faculty professional development. International Journal of STEM Education 3(10). https://doi.org/10.1186/s40594-016-0044-1

Using Canvas Analytics to Support Student Success

Though online teaching and learning are hardly new concepts in education, the pandemic has necessitated a massive shift to online learning such that educators worldwide–at all levels–have had to engage with it in new, immersive ways.  Online learning can take many forms (synchronous, asynchronous, hybrid, hyflex, etc.), but regardless of the form, educators with access to an LMS have had to lean into these platforms and leverage the tools within them in significant ways, continually navigating (perhaps for the first time) how best to support students in achieving their learning goals using technology.

Without consistent opportunities for face-to-face communication and the informal indicators of student engagement that are typically available in a classroom (e.g. body language, participation in live discussions, question asking), a common challenge faced by educators in online learning environments–especially asynchronous ones–is how to maintain and account for student engagement and persistence in the course.  Studies using Educational Data Mining (EDM) have already demonstrated that student behavior in an online course correlates directly with successful completion of the course (Cerezo et al., 2016). Time and again, these studies have supported the assertion that students who are more frequently engaged with the content and discussions in an online course are more likely to achieve their learning goals and successfully complete the course (Morris et al., 2005).  This relationship is, however, tricky to measure, because time spent online is not necessarily representative of the quality of the online engagement.  Furthermore, different students develop different patterns of interaction within an LMS which can still lead to a successful outcome (Cerezo et al., 2016).

Consequently, even as instructors look for insights into student engagement from their LMS, they must avoid putting too much emphasis on the available data, or taking a ‘one style fits all’ approach to interpreting it.  Instead, LMS analytics should be considered one indicator of student performance that contributes to the bigger picture of student learning and achievement.  Taken in context, the data that can be quickly gleaned from an LMS can be immensely helpful in identifying struggling or ‘at-risk’ students and/or those who could benefit from differentiated instruction, as well as possible areas of weakness within the course design that need addressing.
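To make that caveat concrete, here is a minimal sketch (in Python) of how several LMS indicators might be combined into a composite engagement score rather than relying on time online alone. Every name, record, and weight below is hypothetical and purely illustrative; no real LMS export looks exactly like this.

```python
# Hypothetical per-student engagement records, loosely modeled on the kinds
# of indicators an LMS surfaces (none of which is sufficient on its own).
students = [
    {"name": "A", "minutes_online": 420, "page_views": 15, "posts": 0, "submissions": 1},
    {"name": "B", "minutes_online": 90,  "page_views": 60, "posts": 5, "submissions": 3},
]

def engagement_score(s):
    """Weight active behaviors (posting, submitting) more heavily than
    passive ones (time online), since time spent is a weak proxy for
    the quality of engagement. The weights are arbitrary examples."""
    return (0.1 * s["minutes_online"] / 60   # hours online, lightly weighted
            + 0.3 * s["page_views"] / 10     # page views, lightly weighted
            + 2.0 * s["posts"]               # discussion posts, heavier
            + 3.0 * s["submissions"])        # submissions, heaviest

for s in students:
    print(s["name"], round(engagement_score(s), 1))
```

Note that student B, who spent far less raw time online than student A, comes out well ahead on this score, which is exactly the distinction that a time-online metric alone would miss.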

Enter LMS analytics tools and the information available within.  For the purposes of this post, I’ll specifically be looking at the suite of analytics tools provided by the Canvas LMS, including Course Analytics, Course Statistics, and ‘New Analytics.’

Sample Screenshot of Canvas New Analytics, https://sites.udel.edu/canvas/2019/11/new-canvas-analytics-coming-to-canvas-in-winter-term/
  • Course Analytics are intended to help instructors evaluate individual components of a course as well as student performance in the course.  Course analytics are meant to help identify at-risk students (i.e. those who aren’t interacting with the course material), and determine how the system and individual course components are being used.  The four main components of course analytics are: 
    • Student activity, including logins, page views, and resource usage
    • Submissions, i.e. assignments and discussion board posts
    • Grades, for individual assignments as well as cumulative
    • Student analytics, which is a consolidated page view of the student’s participation, assignments, and overall grade (Canvas Community(a), 2020).  With permission, students may also view their own analytics page containing this information.
  • Course Statistics are essentially a subset of the larger course analytics information pool.  Course statistics offer specific percentages/quantitative data for assignments, discussions, and quizzes.  Statistics are best used to offer quick, at-a-glance feedback regarding which course components are engaging students and what might be improved in the future (Canvas Community(b), 2020).
  • New Analytics is essentially meant to be “Course analytics 2.0” and is currently in its initial rollout stage.  Though the overall goal of the analytics tool(s) remains the same, New Analytics offers different kinds of data displays and the opportunity to easily compare individual student statistics with the class aggregate.  The data informing these analytics is refreshed every 24 hours, and instructors may also look at individual student and whole class trends on a week-to-week basis.  In short, it’s my impression that ‘New Analytics’ will do a more effective job of placing student engagement data in context.  Another feature of New Analytics is that instructors may send a message directly to an individual student or the whole class based on a specific course grade or participation criteria (Canvas Community(c), 2020). 

Of course, analytics and statistics are only one tool in the toolbelt when it comes to gauging student achievement, and viewing course statistics need not be the exclusive purview of the instructor.  As mentioned above, with instructor permission, students may view their own course statistics and analytics in order to track their own engagement.  Beyond viewing grades and assignment submissions, this type of feature can be particularly helpful for student reflection on course participation, or perhaps as an integrated part of an improvement plan for a student who is struggling.

Timing should also be a consideration when using an LMS tool like Canvas’ Course Analytics.  When it comes to student engagement and indicators of successful course completion, information gathered in the first weeks of the course can prove invaluable.  Rather than being used solely for instructor reflection or summative ‘takeaway’ information about the effectiveness of the course design, course analytics may be used as early predictors of student success, and the information gleaned may be used to initiate interventions from instructors or academic support staff (Wagner, 2020). Thus, instructors who use Canvas may find that their analytics tools actually prove most helpful within the first week or two of the course (University of Denver Office of Teaching & Learning, 2019).  For example, if a student in an online course is having internet access issues, the instructor can likely see this reflected early on in the student’s LMS analytics data. The instructor would then have reason to reach out and make sure the student has what they need in order to engage with the course content.  If unstable internet access is the issue, the instructor may flex due dates, provide extra downloadable materials, or modify assignments as needed throughout the quarter in order to better support the student.
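The early-warning idea above can be sketched in a few lines of Python. To be clear, this is not a Canvas feature or API; the student names, activity counts, and cutoff fraction are all invented for illustration, as if an instructor had transcribed first-week page views from the analytics screen:

```python
from statistics import median

# Hypothetical week-one page-view counts per student (invented data).
week1_views = {"Ana": 48, "Ben": 5, "Chen": 37, "Dee": 52, "Eli": 3}

def flag_at_risk(views, fraction=0.25):
    """Flag students whose early activity falls below a fraction of the
    class median. This is an early indicator, not a verdict: context
    (internet access, personal circumstances) still needs a human check."""
    cutoff = fraction * median(views.values())
    return sorted(name for name, v in views.items() if v < cutoff)

print(flag_at_risk(week1_views))  # → ['Ben', 'Eli']
```

Here the median is 37 views, so the cutoff is about 9, and Ben and Eli land on the short outreach list. The point of the sketch is the workflow (compare early activity to the class as a whole, then reach out) rather than any particular threshold.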

Finally, as mentioned above, in addition to gauging student performance, LMS analytics tools may be used by the instructor to evaluate the efficacy of their course design.  Canvas’ course analytics tools help instructors see which resources are being viewed/downloaded, which discussion boards are most active (or inactive), which components of the course are most frequented, etc.  Once an online course has been constructed, it can be tempting for instructors to “plug and play” and assume that the course will retain the same effectiveness in every semester it’s used moving forward. Course analytics can help instructors identify redundancies and course elements that are no longer needed or relevant due to lack of student interest.  They can also help instructors think critically about what seems to be working well in their course (i.e. what are students using, and where are they spending the most time?), why that might be, and how to leverage that when adding other course components or tweaks in the future.

In summary, the information available via an LMS analytics tool should always be considered in concert with all other factors impacting student behavior in online learning, including varying patterns or ‘styles’ in students’ online behaviors and external factors like personal or societal crises that may have impacted the move to online learning in the first place.  Student engagement data (as measured by LMS analytics tools) can be helpful for identifying struggling students, supporting student self-reflection, and providing insight into the effectiveness of the instructor’s course design.  As long as analytics tools aren’t treated as the “end all be all” of measuring student success, tools like Canvas Analytics are a worthwhile consideration for instructors teaching online who are invested in student success as well as their own professional development.

References:

Canvas Community(a). (2020). What are Analytics? Canvas. https://community.canvaslms.com/t5/Canvas-Basics-Guide/What-are-Analytics/ta-p/88

Canvas Community(b). (2020). How do I view Course Statistics? Canvas. https://community.canvaslms.com/t5/Instructor-Guide/How-do-I-view-course-statistics/ta-p/1120

Canvas Community(c). (2020). What is New Analytics? Canvas. https://community.canvaslms.com/t5/Canvas-Basics-Guide/What-is-New-Analytics/ta-p/73

Cerezo, R., Sanchez-Santillan, M., Paule-Ruiz, M., & Nunez, J. (2016). Students’ LMS interaction patterns and their relationship with achievement: A case study in higher education. Computers & Education 96, 42-54. https://www.sciencedirect.com/science/article/pii/S0360131516300264

Morris, L.V., Finnegan, C., & Wu, S. (2005). Tracking student behavior, persistence, and achievement in online courses. The Internet and Higher Education 8, 221-231. https://www.sciencedirect.com/science/article/pii/S1096751605000412 

Wagner, A. (2020, June 6). LMS data and the relationship between student engagement and student success outcomes. Airweb.org. https://www.airweb.org/article/2020/06/17/lms-data-and-the-relationship-between-student-engagement-and-student-success-outcomes 

Resilient pedagogy: The professional development opportunity educators need now more than ever

Resilient pedagogy is an emerging instructional philosophy with extremely timely implications for this current moment in education and the ongoing effects of the COVID-19 pandemic.  Facets of resilient pedagogy have long been practiced by educators in the form of classroom differentiation, and other frameworks like Universal Design for Learning (UDL) and Transparency in Learning and Teaching (TILT) inform it.  Building upon these approaches to instructional design and extending beyond them, Rebecca Quintana and her colleagues at the University of Michigan have attempted to define a more expansive type of differentiation, bringing to the forefront the need for instructors to be agile and intentional in all educational contexts, but especially in moments of crisis and change.  More than just a fancy synonym for differentiation, resilient pedagogy can be defined as “…the ability to facilitate learning experiences that are designed to be adaptable to fluctuating conditions and disruptions” (Quintana & DeVaney, 2020, para. 8). Resilient teaching is an approach that “take[s] into account how a dynamic learning context may require new forms of interactions between teachers, students, content, and tools” (Quintana & DeVaney, 2020, para. 8), and those who practice resilient pedagogy have the capacity to rethink the design of learning experiences based on a nuanced understanding of context (Quintana & DeVaney, 2020).  The key to resilient teaching is a focus on the interactions that facilitate learning, including all the ways that teachers and students need to communicate with one another and actively engage with the learning material (Hart-Davidson, 2020). 

“Teachers often plan carefully for delivering content…but when it comes to planning interactions, we can easily take this very important component of learning for granted.”

(Hart-Davidson, 2020, para. 5)

In 2020, Rebecca Quintana and the University of Michigan released a Massive Open Online Course (MOOC) via Coursera titled “Resilient Teaching Through Times of Crisis & Change.”  The MOOC is available in a free, open-access format and offers a flexible learning structure that makes it accessible to any educator wanting to engage with the topic. The registration process is simple, and because it is an asynchronous online learning experience, there are no time constraints on when a participant must register or complete the course.  Though the course is aimed at educators who may need to rethink how they teach in the immediate or near future due to the ever-changing circumstances of the pandemic, the course creators “…expect it will remain relevant to instructors who are faced with disruptions and change to their teaching for any number of reasons and must quickly adapt their course designs” (Quintana, 2020). Furthermore, though this MOOC is especially relevant to the higher education environment, the principles of resilient pedagogy can absolutely be applied in any type of classroom by any type of educator.

Interested educators may engage with the course casually by reviewing videos (thoughtfully ‘chunked’ into appropriately consumable lengths) and reading materials in whatever order and pacing–and to whatever depth–feels pertinent to their needs.  They can choose to purchase the full course and engage in all aspects of the learning experience, including submitting assignments and completing checks for understanding.  In this format, participants can receive a course completion certificate at the end.  This type of engagement may be especially helpful if participating in the course alongside colleagues in a more formal professional development venture.  My personal engagement has been decidedly less formal.

The course content focuses on three key components of resilient pedagogy: designing for extensibility, designing for flexibility, and designing for redundancy.  This three-principle framework helps flesh out the meaning and potential of resilient pedagogy while also serving as a practical guide to course design.

  1. Designing for Extensibility means that a course is designed in such a way that it has a clearly defined purpose and essential, unaltered learning goals, and yet the basic essence of the course content can be extended with new capabilities and functionality as needed.  This may involve the introduction of new tools or a change in format, moving fluidly from synchronous to asynchronous modalities, etc.  
  2. Designing for Flexibility means that a course is designed to respond to the individual needs of learners within a changing learning environment.  In a nod to the UDL framework, designing for flexibility means that a course is structured to meet a variety of student needs and learning styles, even before knowing specific individuals in a given class.  Flexibility will require a learner-centered approach with multiple means of engagement/expression and considerations for student needs which may arise within variable class sizes and modalities.  A course designed for flexibility will also allow instructor expectations and assessments to flex in response to these needs.
  3. Designing for Redundancy, simply put, means having contingency plans in place. Designing for redundancy asks instructors to analyze a course design for possible vulnerabilities.  For example, how will students accustomed to synchronous virtual meetings be given the opportunity to engage in course activities if their internet access becomes unpredictable?  In this design approach, instructors look for alternative ways of accomplishing goals with the hope of eliminating “single points of failure.” This is, of course, incredibly important when learning is situated in a time of crisis or emergency.

These three principles of resilient pedagogy do not stand alone. Rather, they inform one another and will naturally overlap in the instructional design process.  The MOOC contains excellent examples and practical applications of extensibility, flexibility, and redundancy throughout, but Rebecca Quintana and her team aren’t the only academics talking about resilient pedagogy, and examples of resilient pedagogy implemented during the pandemic can be found outside the MOOC.  For the reader who might be thinking about resilient pedagogy for the first time, here are a few examples of what resilient pedagogy may look like in practice:

  • Educators on a staggered schedule or a hybrid return-to-school plan may put together an in-person and virtual lesson plan that can be running at the same time on the same day with students engaging with the same content in two different modalities (Watson, 2020).
  • Instructors may create a spreadsheet for a course which helps track various contingencies and needed adjustments for various modalities: in person, hybrid or hyflex, and fully remote (Quintana, 2020).
  • Resilient pedagogy involves reducing complexity in any way possible.  This can look like establishing a predictable weekly pattern for remote students, having fewer due dates, simplifying assignments, etc. (Kaston Tange, 2020). Resilient pedagogy in practice means educators can scale up or down as needed according to student needs, understanding that crisis situations almost always call for some sort of scaling down. It’s OK to pare a course down to its most essential elements.
  • Resilient pedagogy requires an emphasis on feedback and interactions vs. assignments and grading.  Grading fewer assignments while also providing more opportunities for ongoing feedback increases the opportunity for interactions between instructors and students while also lowering the stakes for all parties (Watson, 2020).  It also keeps educators from getting stuck trying to stick a “square-pegged” assignment or assessment into a “round hole” of a specific digital tool, modality, or crisis context, simply because this assignment has always been done as part of the course in the past. 
  • As another way to emphasize the importance of interactions within a course, resilient pedagogy prioritizes small group interactions over and above large group instruction (Watson, 2020).  This can take many forms in both synchronous and asynchronous, online and in-person formats.
  • Resilient pedagogy requires educators to consider the use of digital tools carefully within their course design. If, for example, they are using a particular tool on which the success of their students rests, instructors may dedicate time within their learning activities to help students learn how to use that technology and not make assumptions about their students’ digital literacy (Gardiner, 2020).

Though the application of resilient pedagogy may feel particularly pertinent in this moment of crisis, resilient teaching will benefit students and instructors in all circumstances in the long run.  At the end of the day, resilient teaching forces instructors to examine student engagement carefully and intentionally and to develop a student-centered mindset.  It also helps instructors design a dynamic course once, so that they’re using their time and efforts efficiently and making their courses as resistant to disruption as possible (Gardiner, 2020).  Resilience has long been discussed as a trait that ‘successful’ students possess, but perhaps it’s time to shift that focus onto educators.  Successful educators must be resilient themselves.  It’s not only necessary for this moment, it’s the right thing to do for students in all contexts moving forward, and the “Resilient Teaching Through Times of Crisis & Change” MOOC is a great place to start.

“If it seems like resilient pedagogy is in line with calls for us all to be making learning more inclusive and accessible, it certainly is.”

(Hart-Davidson, 2020, para. 17) 

References:

Gardiner, E. (2020, June 25). Resilient Pedagogy for the Age of Disruption: A Conversation with Josh Eyler. Top Hat. https://tophat.com/blog/resilient-pedagogy-for-the-age-of-disruption-a-conversation-with-josh-eyler/

Hart-Davidson, B. (2020, April 6). Imagining a resilient pedagogy. Medium. https://cal.msu.edu/news/imagining-a-resilient-pedagogy/

Kaston Tange, A. (2020, June 8). Thinking about the humanities. https://andreakastontange.com/teaching/resilient-design-for-remote-teaching-and-learning/

Quintana, R. (2020).  Resilient teaching through times of crisis and change [MOOC]. Coursera. https://www.coursera.org/learn/resilient-teaching-through-times-of-crisis 

Quintana, R., & DeVaney, J. (2020, May 27). Laying the foundation for a resilient teaching community. Inside Higher Ed. https://www.insidehighered.com/blogs/learning-innovation/laying-foundation-resilient-teaching-community 

Watson, A. (2020). Flexible, resilient pedagogy: How to plan activities that work for in-person, remote, AND hybrid instruction.  Truth for Teachers. https://thecornerstoneforteachers.com/truth-for-teachers-podcast/resilient-pedagogy-hybrid-instruction-remote-learning-activities/

Exemplars of Computational Thinking in Higher Education Classrooms

Though the concepts and theory behind computational thinking (CT) have been around for decades in the realms of computer science and engineering, it is widely acknowledged that Jeannette Wing’s 2006 publication on computational thinking laid the groundwork for CT’s popularity and integration in 21st century education theory.  Wing (2006) suggested that CT might be considered essential to all human endeavors as it is a distillation of a way that we naturally approach solving problems, managing our daily lives, and communicating and interacting with people.  It need not be relegated only to the STEM fields and computer science majors, because CT is not about getting humans to think like computers.  Rather, CT harnesses the natural outpouring of human cleverness, creativity, and problem solving that laid the foundations for the field of computer science in the first place (Wing, 2006). CT is about “…solving problems, designing systems, and understanding human behavior” by drawing on, and leveraging, the concepts fundamental to computer science (Wing, 2006, p. 33).

Though academics continue to debate an authoritative definition for CT, certain common themes are generally accepted characterizations of CT across the board.  These characteristics include:

  • Abstraction — thinking through abstract concepts and ill-defined problems, at times breaking them into smaller, digestible pieces, in order to move towards a more concrete, real-world solution (Wing, 2006).
  • Pattern Recognition — recognizing useful patterns in data, filtering out the characteristics of patterns that aren’t needed, focusing on those that are (Wing, 2006).
  • Algorithmic Thinking — curating a list of steps that can be followed to solve a problem (Lyon & Magana, 2020).
  • Creative Problem Solving — developing a unique, context-based solution that is considered original, valuable, and useful (Romero et al., 2017).
  • Evaluating Solutions — considering the efficacy of a proposed solution to a problem, perhaps making considerations for factors like efficiency and resource consumption (Lyon & Magana, 2020).
Image sourced from https://koneilleci201.wordpress.ncsu.edu/2020/01/28/computational-thinking/
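To make these characteristics more concrete, here is a small toy example, my own illustration rather than anything drawn from the cited sources, that touches several facets of CT at once: the fuzzy question “what is this passage mostly about?” is abstracted into a computable one, the task is decomposed into ordered steps (algorithmic thinking), and repetition in the data is used as a pattern worth detecting.

```python
from collections import Counter

# Hypothetical illustration of CT facets (not from the cited sources):
# abstract "what is this passage mostly about?" into the computable
# question "which word appears most often?"

def most_frequent_word(text):
    # Decomposition into ordered steps (algorithmic thinking):
    words = text.lower().split()                 # step 1: normalize and tokenize
    words = [w.strip(".,;:!?") for w in words]   # step 2: filter out punctuation noise
    counts = Counter(words)                      # step 3: count repetitions (pattern recognition)
    word, count = counts.most_common(1)[0]       # step 4: select the strongest pattern
    return word, count

word, count = most_frequent_word(
    "Computational thinking is thinking about problems; thinking in patterns."
)
print(word, count)  # → thinking 3
```

Evaluating the solution is also part of CT here: a reader might notice that this sketch ignores stop words like “is” and “about,” and refine the algorithm accordingly.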

These facets of CT and the related skills are all integral parts of a 21st century education at all levels, including K-12 and postsecondary.  Indeed, “…computational sciences have been deemed essential to maintaining national competitiveness in the workplace and national security” (Lyon & Magana, 2020, p. 1174).  For these reasons, the fundamentals of CT have been championed in education theory over the last decade, and nationally recognized standards like the Common Core State Standards and ISTE Standards for Students have pointedly emphasized the importance of “21st century skills” in K-12 education while simultaneously offering some clear guidance for what CT can look like in action. 

But what about higher education?  The implementation of CT in higher education classrooms is noticeably harder to identify, especially outside of computer science and engineering classrooms.  In my opinion, this is likely due to a number of factors including, but not limited to:

  1. Lack of collegial collaboration:  higher education disciplines are notoriously siloed. Meaningful integration of CT concepts outside of computer science and engineering programs demands intentional professional development for faculty, as well as interdisciplinary cooperation, both of which can be less accessible in higher education.
  2. Lack of resources: there is relatively little literature available which provides ideas for practical application of CT outside of computer science programs (i.e. coding and computer programming) at the postsecondary level. Additionally, published standards often lean more heavily towards K-12 education.
  3. Questions of applicability: the humanities often resist algorithmic ways of knowing because there is so much value placed on interpretation, subjectivity, and open debate about meaning (Czerkawski & Lyman, 2015).
  4. Just getting started: there is growing interest in translating CT pedagogy into a wide variety of disciplines in higher education (and K-12 for that matter), but the research and discussions are just getting started.  There is much yet to be explored.

Many STEM instructors in higher education incorporate CT in their approach to teaching and learning almost automatically because of the nature of their fields, and engaging with CT in courses devoted to coding and programming is already integral to computer science and engineering majors.  With that in mind, I seek to offer a few alternative examples of CT as it has been used to enhance teaching and learning in other kinds of higher education environments:

  • In a professional writing course taught at the undergraduate level, CT was used to systemize the writing process.  It called for a “deconstructive approach, breaking down the task of structured authoring into multiple layers of abstraction, and teaching each layer independently” (Lyon & Magana, 2020, p. 1182).
  • In the fine arts, CT can be used as a tool to enhance creativity.  In one example, CT was used to create an organized system for tracing the origins of musical composers, which in turn inspired new creative endeavors based on the organized data. “…Algorithmic composition in music is, effectively, a human-computer collaboration–the computer serving as a tool that extends the composer’s ability to explore new musical ideas” (Edwards, 2011, p. 67).
  • The Stanford Literary Lab famously applied CT via Graph Theory to perform a “network analysis of character relationships and interactions” in a series of Shakespeare’s plays (Czerkawski & Lyman, 2015).
  • In the life sciences, CT has been used to inform systems theory and how to teach and understand biological processes, such as genetics, in an organized, logical fashion (Czerkawski & Lyman, 2015).
  • Utilizing a process dubbed “creative programming,” instructors may engage learners in the process of designing and developing an original work through coding. In this collaborative approach, learners are encouraged to co-construct knowledge in an interdisciplinary way. Examples might be to have students in a history course (co-)create a rendering of a city at a given historical period, or to present a traditional story in a visual programming tool like Scratch. In this kind of activity, learners must use skills and knowledge in mathematics, technology, language arts, and social sciences. (Romero et al., 2017, para. 3)
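As a rough illustration of the network-analysis idea mentioned above, a character-interaction graph can be sketched in a few lines of code.  To be clear, this is a toy example with invented scene data, not the Stanford Literary Lab’s actual method or dataset:

```python
from collections import defaultdict

# Toy character-interaction data (invented for illustration only):
# each pair records two characters appearing in a scene together.
interactions = [
    ("Hamlet", "Horatio"), ("Hamlet", "Ophelia"),
    ("Hamlet", "Claudius"), ("Claudius", "Gertrude"),
    ("Hamlet", "Gertrude"),
]

# Build an undirected adjacency list -- the "graph" in Graph Theory terms.
graph = defaultdict(set)
for a, b in interactions:
    graph[a].add(b)
    graph[b].add(a)

# A simple centrality measure: degree = number of distinct interaction partners.
degrees = {name: len(neighbors) for name, neighbors in graph.items()}
central = max(degrees, key=degrees.get)
print(central, degrees[central])  # → Hamlet 4
```

Even this minimal sketch exercises abstraction (reducing a play to pairs of co-appearing characters), pattern recognition (degree as a proxy for importance), and evaluation (asking whether degree really captures centrality).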

Modeling and simulation activities are excellent examples of CT at work, and these types of learning activities can certainly extend themselves to many types of fields and disciplines.  Consider a learning activity where a group of undergraduate philosophy majors create a simulated narrative presentation wherein a human “character” makes a series of daily choices based on their moral philosophy or framework–almost like a “Choose Your Own Adventure” novel meets systems theory within one, or multiple, philosophical frameworks.  The simulation itself could be a computer-based product (or not), but regardless, the learning activity would draw upon many tenets of CT while also demonstrating in-depth knowledge of the discipline-specific subject matter.
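A minimal sketch of how such a simulation might be structured follows.  Everything here is hypothetical: the frameworks are deliberately simplified caricatures, and the scoring rules are invented purely to show the shape of the activity.

```python
# Hypothetical sketch: a "character" evaluates a daily choice under two
# simplified moral frameworks. Scoring rules are invented caricatures.

def utilitarian(option):
    # Maximize total welfare across everyone affected.
    return sum(option["welfare_effects"])

def egoist(option):
    # Maximize only the character's own outcome
    # (first entry in the list, by convention in this sketch).
    return option["welfare_effects"][0]

def choose(options, framework):
    # The framework acts as a scoring function over the available options.
    return max(options, key=framework)["name"]

options = [
    {"name": "donate lunch money", "welfare_effects": [-1, 3, 3]},
    {"name": "keep lunch money",   "welfare_effects": [2, 0, 0]},
]

print(choose(options, utilitarian))  # → donate lunch money
print(choose(options, egoist))       # → keep lunch money
```

Students building something like this would have to abstract a philosophical framework into a decision rule, decompose a day into discrete choices, and evaluate whether the simulated behavior is faithful to the framework, all core tenets of CT applied to discipline-specific content.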

All fields and disciplines require problem solving in some form.  Thus, it is reasonable to assume that CT may be useful in expanding the human ability to effectively problem solve in all fields.  In one study comparing the use of CT by an undergraduate computer science student and an art student, the researchers found that both students “…used various CT skills when solving all [italics added] problems, and the application of CT skills was influenced by their background, experiences, and goals” (Febrian et al., 2018, para. 1).  Regardless of training, background, or chosen major, CT enables postsecondary students to become more efficient problem solvers in all areas of life, teaching them to recognize computable problems and approach the problem-solving process as skillfully as possible (Czerkawski & Lyman, 2015).

References

Czerkawski, B.C. & Lyman, E.W. (2015). Exploring issues about computational thinking in higher education. TechTrends 59(2), 57–65. https://doi.org/10.1007/s11528-015-0840-3 

Edwards, M. (2011). Algorithmic composition: Computational thinking in music. Communications of the ACM, 54(7), 58-67. doi:10.1145/1965724.1965742  

Febrian, A., Lawanto, O., Peterson-Rucker, K., Melvin, A., & Guymon, S. E. (2018). Does everyone use computational thinking?: A case study of art and computer science majors. Proceedings of the ASEE Annual Conference & Exposition, 1–16.

Lyon, J. & Magana, A. (2020). Computational thinking in higher education: A review of the literature. Computer Applications in Engineering Education 28(5), 1174-1189. https://doi.org/10.1002/cae.22295

Romero, M., Lepage, A., & Lille, B. (2017). Computational thinking development through creative programming in higher education. International Journal of Educational Technology in Higher Education 14(42). https://educationaltechnologyjournal.springeropen.com/articles/10.1186/s41239-017-0080-z

Wing, J. (2006). Viewpoint: Computational thinking. Communications of the ACM, 49(3), 33–35. https://doi.org/10.1145/1118178.1118215

A Few Best Practices for Online Learning & Adoption in Higher Education

Though the digital age may not actually be changing a student’s capacity to learn, it’s certainly changing how students access content and participate in learning environments. Digital technology thoroughly transforms the way in which we create, manage, transfer, and apply knowledge (Duderstadt, Atkins, & Van Houweling, 2002). Unsurprisingly, it’s also changing how educators teach, particularly with technology-mediated instruction in higher education. The demand for online instruction is on the rise.  In the United States alone, the number of higher education students enrolled in online courses increased by 21% between fall 2008 and fall 2009, and the rate of increase has only grown in recent years, both nationally and globally (Bolliger & Inan, 2012).  Of course, the COVID-19 pandemic of 2020 has also necessitated a radical—though in some cases temporary—shift to online learning modalities at all educational levels across the globe.

Fortunately, there’s evidence to support that incorporating digital education can enhance pedagogy and improve overall student performance at the college level.  An extensive, multi-year case study conducted at the University of Central Florida showed that student success in blended programs (success being defined as achieving a C- grade or higher) actually exceeded the success rates of students in either fully online or fully face-to-face programs (Dziuban et al., 2004).

In the switch to online teaching and learning, a clear challenge is presented: teaching faculty are faced with a need to move their programs and classes into online/flexible learning formats, regardless of their discipline or their expertise/ability to do so.  It is not uncommon for teachers, no matter the level at which they teach, to be asked to implement something new in their classroom without sufficient support, professional development, or resources to make the implementation successful.  The need for appropriate training becomes that much more pressing when educators are asked to engage with an entirely different instruction medium from that which they are accustomed to.  In the case of blended or online learning, many faculty will need to develop completely new technological and/or pedagogical skills.  While a number of scholars have conducted investigations into the effectiveness of blended or online learning, very few have provided guidance for adoption at the institutional level (Porter, Graham, Spring, & Welch, 2014). 

Far from being a comprehensive guide, this post seeks to explore a few major themes and best practices for online learning in postsecondary education which may prove helpful for teaching professionals and higher education institutions heading into an otherwise unfamiliar world of digital education.

Create a Learning Community:

Digital education is made possible by computers and the internet.  In the age of the Internet, the computer is ultimately used most to provide connection, whether that be through social media, e-commerce, gaming, publications, or education (Weigel, 2002).  Technology-mediated education is making it possible for students to participate in programs, access content, and connect in ways they were previously unable to.  Rather than viewing the Internet as a necessary evil for distance learning that ultimately begets isolated student learning experiences, digital education should, first and foremost, be connective and communal.  This means a professor accustomed to lecture-based learning in a physical classroom may need to consider a new approach in order to make space for student voice in the learning process.  In an online context, this means there should be dynamic opportunities for students to engage in debate, reflection, collaboration, and peer review (Weigel, 2002).

Beyond Information Transfer:

Learning and schooling no longer have the same direct relationship they had for most of the 20th century; devices and digital libraries allow anyone to have access to information at any time (Wilen, 2009). Schools, teachers, and even books no longer hold the “keys to the kingdom” as sources of information.  Higher education, then, will not function effectively as a large-scale effort to teach students information through a standardized curriculum.  Rather, education must be a highly relevant venture that enables individual students to do something with the virtually endless information and resources they have access to (Wilen, 2009).

Relevance:

If university instructors are going to seriously account for the rich background experiences, varied motivations, and personal agency of their postsecondary students, they must also take into account the larger “lifewide” learning that takes place within the life of most college students (Peters & Romero, 2019). Student learning at any age is both formal and informal, and what takes place in a formal classroom environment is influenced by informal learning and daily living that takes place outside of it.  Likewise, if deep learning takes place, a student’s world and daily life should be altered by the creation of new schemas and the learning that has taken place in a formal classroom environment. 

In a multicase and multisite study conducted by Mitchell Peters and Marc Romero in 2019, 13 different fully-online graduate programs in Spain, the US, and the UK were examined in order to analyze learning processes across a continuum of contexts (i.e., to understand to what extent learning was used by the student outside of the formal classroom environment).  Certain common pedagogical strategies arose across programs in support of successful student learning and engagement including: developing core skills in information literacy and knowledge management, community-building through discussion and debate forums, making connections between academic study and professional practice, connecting micro-scale tasks (like weekly posts) with macro-scale tasks (like a final project), and applying professional interests and experiences into course assignments and interest-driven research (Peters & Romero, 2019).  In many regards, each of these pedagogical strategies is ultimately teaching students to “learn how to learn” so that the skills they cultivate in the classroom can be applied over and over again elsewhere.

Professional Development:

Still there remains the question of implementation.  In order for the mature adoption of digital education to take place, faculty need to be given time and training to help them develop new technological and pedagogical skills.  If an institution fails to provide sufficient opportunities for professional development, many faculty members will likely fail to fully embrace the shift to an online format, and will instead replicate their conventional teaching methods in a manner that isn’t compatible with effective online instruction (Porter, et al., 2014).  If higher education institutions are committed to delivering high quality instruction in all contexts, it will be important for administrators to retain qualified instructors who are motivated to teach online and who are satisfied with teaching online (Bolliger, Inan, & Wasilik, 2014).

In a 2012-2013 survey of 11 higher education institutions reporting on their implementation of blended learning programs, Wendy Porter et al. found that every university surveyed provided at least some measure of professional development to support faculty in the transition.  Each university had its own customized approach, but the fact that developmental support was prioritized in some regard remained consistent across all of the institutions in the survey.  Strategies used for professional development in digital learning included presentations, seminars, webinars, live workshops, orientations, boot camps, instructor certification programs for online teaching, course redesign classes, and self-paced training programs (Porter et al., 2014).

Digital Literacy:

Digital literacy among higher education faculty can’t be taken for granted.  A recent action research study aimed at exploring the digital capacity and capability of higher education practitioners found that, though the self-reported digital capability of an individual may be relatively high, it did not necessarily correspond to the quality of their technical skills in relation to their jobs (Podorova et al., 2019).  Survey results from the study also showed that the majority of practitioners (41 higher education professors in Australia) were self-taught in the skills they did possess, receiving very little formal training or support from their employer, even with technology devices and tools directly pertaining to teaching and assessment (Podorova et al., 2019).  Though this data relates to a specific case study, it is not difficult to imagine that higher education faculty in institutions all over the world might report similar experiences.  If faculty aren’t given sufficient technological support and training, they will be less satisfied in their work and, ultimately, the student experience will suffer (Bolliger et al., 2014).

Institutional Adoption:

In addition to providing sufficient technological or pedagogical resources, it is important for university administrators to communicate the purpose for online course adoption.  In a 2016 study, Wendy Porter and Charles Graham found that higher education faculty more readily pursued effective adoption strategies when they were in alignment with the institution’s administrators and the stated purpose for doing so (Porter & Graham, 2016). If faculty members are, in essence, adult learners being asked to acquire new skills, it is essential to take their own motivations for learning into account.  Additionally, sharing data and course feedback internally from early adopters of online instruction can go a long way in helping hesitant faculty feel ready to approach online learning (Porter & Graham, 2016).  Institutional support is cited frequently in literature pertaining to faculty satisfaction in higher education. In the domain of online learning, institutional support looks like providing adequate release time to prepare for online courses, offering fair compensation, and giving faculty sufficient tools, training, and reliable technical support (Bolliger et al., 2014).

One effective approach to professional development for online learning places professors in the seat of the student.  At Hawaii’s Kapi’olani Community College on the island of Oahu, Instructional Designer Helen Torigoe was charged with training faculty in the process of converting courses for online delivery.  In response, Torigoe created the Teaching Online Prep Program (TOPP) (Schaffhauser, 2019). In TOPP, faculty participate in an online course model as a student, using their own first-hand experience to inform their course creation.  As they participate in the course, faculty are able to use the technology that they will be in charge of as an instructor (programs like Zoom, Padlet, Flipgrid, Adobe Spark, Loom, and Screencast-O-Matic), gaining comfort and ease with the tools and increasing their overall digital literacy.  Faculty also get a comprehensive sense of the student experience while concurrently creating an actual course template and receiving guidance and support from the TOPP course coordinator.  Such training is mandatory for anybody teaching online for the first time at Kapi’olani Community College. A “Recharge” workshop has also been created to help faculty engage in continued learning for best practice in digital education, ensuring that faculty do not become static in their teaching methods and are consistently exposed to new tools and strategies for digital education (Schaffhauser, 2019).  Institutions that participate in online education need to provide adequate training in both pedagogical issues and technology-related skills for their faculty, not only when developing and teaching online courses for the first time, but as an ongoing priority in faculty professional development (Bolliger et al., 2014).

Summary:     

The number of graduate courses and programs that must be offered in an online format is increasing in many higher education environments.  Effective online educators will acknowledge the unique needs of their postsecondary learners: that their students need to have their background experiences and context utilized in the learning process, that their learning needs to be relevant to their life and work, and that their learning needs to provide them with actionable skills and learning strategies that ultimately change how they interact with their world.  Effective online learning will also provide ample space for student connection and active participation.  This means there should be dynamic opportunities for students to engage in debate, reflection, collaboration, and peer review (Weigel, 2002).  Additionally, online learning ought to be a highly relevant venture that enables individual students to do something with the virtually endless information and resources they have access to (Wilen, 2009).  Yet in order for the mature adoption of digital education to take place, faculty need to be given time and training to help them develop new technological and pedagogical skills.  This training needs to happen with initial adoption and as an ongoing venture.  One example of highly effective faculty professional development can be found in Instructional Designer Helen Torigoe’s Teaching Online Prep Program (TOPP) (Schaffhauser, 2019).  In this program the instructors become the students as they familiarize themselves with a new learning system, create a customized course template, and get feedback and support from knowledgeable online educators.  In short, well-equipped, well-trained, and well-supported graduate faculty are fertile ground for effective online education.

References

Bolliger, D. U., Inan, F. A., & Wasilik, O. (2014). Development and validation of the online instructor satisfaction measure (OISM). Educational Technology Society, 17(2), 183–195.

Duderstadt, J., Atkins, D., Van Houweling, D. (2002). Higher education in the digital age: Technology issues and strategies for American colleges and universities. Praeger Publishers.

Dziuban, C., Hartman, J., Moskal, P., Sorg, S., & Truman, B. (2004). Three ALN modalities: An institutional perspective. In J. R. Bourne, & J. C. Moore (Eds.), Elements of quality online education: Into the mainstream (127–148). Sloan Consortium.

Peters, M. & Romero, M. (2019) Lifelong learning ecologies in online higher education: Students’ engagement in the continuum between formal and informal learning. British Journal of Educational Technology, 50(4), 1729.

Podorova, A., Irvine, S., Kilmister, M., Hewison, R., Janssen, A., Speziali, A., …McAlinden, M. (2019). An important, but neglected aspect of learning assistance in higher education: Exploring the digital learning capacity of academic language and learning practitioners. Journal of University Teaching & Learning Practice, 16(4), 1-21.

Porter, W., & Graham, C. (2016). Institutional drivers and barriers to faculty adoption of blended learning in higher education. British Journal of Educational Technology, 47(4), 748-762.

Porter, W., Graham, C., Spring, K., & Welch, K. (2014). Blended learning in higher education: Institutional adoption and implementation. Computers & Education, 75, 185-195.

Schaffhauser, D.  (2019). Improving online teaching through training and support. Campus Technology. https://campustechnology.com/articles/2019/10/30/improving-online-teaching-through-training-and-support.aspx

Weigel, V.B. (2002) Deep learning for a digital age. Jossey-Bass.

Wilen, T. (2009). .Edu: Technology and learning environments in higher education. Peter Lang Publishing.
