Professional Development & Technology in Higher Education: What’s Working?

As a former classroom teacher, I am deeply aware of the potential professional development (PD) activities have to improve teaching practice. I am equally aware of PD’s potential to overwhelm instructors and consume valuable time, energy, and resources that might have been better spent elsewhere in jam-packed school schedules.

When it comes to the effective use of educational technology, and online teaching in particular, thoughtful, engaging, and practical PD is essential. With the onset of COVID-19, schools and instructors at every level were required to make rapid, comprehensive pivots to online teaching and learning, and ed tech specialists, coaches, and instructional designers found their hands full with the overwhelming need for support and training in a condensed time frame. There’s no doubt that the emergency shift to online teaching and learning necessitated by the pandemic was immensely challenging for both students and educators, but it’s also fair to say that there have been more than a few success stories related to online teaching and learning, some of them the result of effective PD efforts made well in advance of the pandemic. Considering this, I am curious to explore some recent exemplars of professional development activities in higher education related to pivots to online teaching/learning, COVID-related or otherwise.

To frame this exploration, it’s helpful to first examine some of the research shaping current approaches to PD in education. In 2014, the Boston Consulting Group, working on behalf of the Bill & Melinda Gates Foundation, surveyed over 1,300 stakeholders in education (teachers, administrators, instructional coaches, etc.) on topics related to PD (BCG, 2014). The research suggested that teachers at all levels were overwhelmingly dissatisfied with the majority of PD offerings. Reasons cited included a disconnect between classroom observations by administrators and meaningful coaching interactions, a lack of trust in or authority from those leading the PD initiatives, PD presented as an exercise in compliance instead of a meaningful opportunity for growth, lack of opportunity for collaboration with peers, lack of choice, and lack of relevance to immediate needs (BCG, 2014). Suggestions for future practice included a decreased dependence on external vendors for PD workshops and increased attention to teacher-driven needs and collaboration time, as well as considerations for leveraging technology to boost collaboration and streamline workloads (BCG, 2014).

Image source: BCG (2014)

These findings were also supported by Cho & Rathbun’s 2013 case study on PD in higher education. Similar to the findings of the Boston Consulting Group, Cho & Rathbun (2013) found that a traditional workshop format for higher education PD constrained active participation, collaboration, and the creation of usable knowledge for teaching. Cho & Rathbun (2013) proposed a problem-based learning framework for PD in higher education which:

  1. Lets relevant problems guide the learning activities
  2. Has participants self-direct their learning and take responsibility for knowledge acquisition
  3. Encourages social interaction and collaborative knowledge construction among instructors. 

Data from this particular case study supported a teacher-centered approach to PD, which was favored by university instructors and facilitated the creation of usable knowledge that could be immediately applied in their own teaching contexts. In this case study, the PD opportunities were provided online and asynchronously in order to counteract constraints of time and place and to allow instructors to engage with the PD as best fit their individual departments (Cho & Rathbun, 2013).

In another look at PD initiatives in higher education, Schildkamp et al. (2020) note the presence of certain “building blocks” that made for effective professional development and use of educational technology during the COVID-19 pandemic. In this research, the two PD initiatives examined by Schildkamp et al. (2020) were effective because they prioritized:

  1. The effective use of technology and ways it might need to be customizable to specific content area needs
  2. Active learning activities supported by experts
  3. Clearly defined goals focused on the instructor’s own practice and use of technology with attention to long-term sustainability

Image source: https://www.eventbrite.com/blog/eventbrite-academy-create-better-events-ds00/

In an effort to highlight and streamline some of the similarities and standouts of the research initiatives mentioned above, I find it helpful to reference Vicki Davis’s list of tips for highly effective PD activities that can serve as a meaningful guide for PD facilitators and coaches in any academic environment (Davis, 2015):

1. Use What You Are Teaching: don’t just lecture about a helpful strategy or tool, model it and have participants actively engage with it

2. Develop Something That You’ll Use Right Away: if it’s relevant, instructors should be able to implement a takeaway within a few weeks

3. Receive Feedback: create opportunity for feedback on the PD “session” as well as peer-to-peer feedback on implementation of the takeaway

4. Improve and Level Up: create opportunities to workshop the initial takeaway with ongoing PD and support–effective PD isn’t “one and done” 

5. Local Responsibility and Buy-In: institutional/school-wide support is needed; it’s not just the responsibility of teachers/instructors to internalize and implement PD initiatives

6. Long-Term Focus: avoid the temptation to chase fads or take a “flavor of the week” approach to PD (especially in regards to technology) which can make takeaways feel disconnected, erratic, and short-lived; make sure PD aligns meaningfully with long-term goals of the school/district/institution 

7. Good Timing: consider the larger ebb and flow of the academic calendar and when instructors will be in the best position to be fully present for a PD initiative

8. Empower Peer Collaboration: give teachers/instructors the time and opportunity to learn from one another.

Finally, I’d like to highlight a comprehensive example of effective PD for online learning sourced from a community college in Hawaii. This approach to PD places professors in the seat of the student in an online learning context, and it puts many of the tips listed above into action. At Kapi’olani Community College on the island of Oahu, Instructional Designer Helen Torigoe was charged with training faculty in the process of converting courses for online delivery (this was prior to the onset of the pandemic). In response, Torigoe created the Teaching Online Prep Program (TOPP) (Schaffhauser, 2019). In TOPP, faculty participate in an online course model as students, using their own first-hand experience in the program to inform their course creation. As they participate in the course, faculty are able to use the technology that they will be in charge of as instructors (including programs like Zoom, Padlet, Flipgrid, Adobe Spark, Loom, and Screencast-O-Matic), gaining comfort and ease with the tools and increasing their overall digital literacy. Faculty also get a comprehensive sense of the student experience while concurrently creating an actual course template that they will use in the near future. Instructors receive guidance, feedback, and support from the TOPP course coordinator and their peers in the course. Such training is mandatory for anybody teaching online for the first time at Kapi’olani Community College. A “Recharge” workshop has also been created to help faculty engage in continued learning for best practice in digital education. This ensures that faculty do not become static in their teaching methods, as they are consistently exposed to new tools and strategies while also gleaning reminders and refresh opportunities in support of long-term sustainability (Schaffhauser, 2019).
Institutions that participate in online education need to provide adequate training in both pedagogical issues and technology-related skills for their faculty, not only when developing and teaching online courses for the first time, but as an ongoing priority in faculty professional development (Bolliger et al., 2014).

I am curious to know how Kapi’olani Community College fared during the worst of the COVID-19 pandemic and how faculty and students dealt with the switch to fully remote learning, especially those who weren’t previously involved with distance learning initiatives.  Was TOPP used to onboard instructors who previously only taught face to face?  Did faculty feel like they had the resources and training they needed to make the switch more effectively than colleagues at other institutions?  These aren’t questions I have answers to, but I venture to guess that faculty and instructional designers at Kapi’olani Community College did indeed have a leg up because of the prior investments the institution had already made in timely, meaningful, applicable, teacher-driven, problem-based, technology-rich, and sustainable PD.

References:

Bolliger, D. U., Inan, F. A., & Wasilik, O. (2014). Development and validation of the Online Instructor Satisfaction Measure (OISM). Educational Technology & Society, 17(2), 183–195.

Boston Consulting Group (2014). Teachers know best: Teachers’ Views on professional development. Bill & Melinda Gates Foundation. https://usprogram.gatesfoundation.org/news-and-insights/usp-resource-center/resources/teachers-know-best-teachers-views-on-professional-development

Cho, M. & Rathbun, G. (2013). Implementing teacher-centred online teacher professional development (oTPD) programme in higher education: a case study. Innovations in Education and Teaching International, 50(2), 144-156. https://doi.org/10.1080/14703297.2012.760868

Davis, V. (2015, April 15). 8 Top Tips for Highly Effective PD. Edutopia. https://www.edutopia.org/blog/top-tips-highly-effective-pd-vicki-davis

Schaffhauser, D. (2019, October 30). Improving online teaching through training and support. Campus Technology. https://campustechnology.com/articles/2019/10/30/improving-online-teaching-through-training-and-support.aspx

Schildkamp, K., Wopereis, I., Kat-De Jong, M., Peet, A. & Hoetjes, I. (2020). Building blocks of instructor professional development for innovative ICT use during a pandemic. Journal of Professional Capital and Community, 5(3/4), pp. 281-293. https://doi.org/10.1108/JPCC-06-2020-0034

Can a hybrid approach to teaching and learning organically foster digital citizenship?

As has been noted by many educators at this point in the pandemic, it seems likely that the massive shifts to online teaching/learning that have taken place in the last year have permanently altered attitudes towards, and usage of, educational technology. Though most teachers, students, and families will be eager to return to the physical classroom as soon as it’s safe to do so, there’s no doubt that a hybrid approach to teaching and learning, both in K-12 and in higher education, is here to stay. This is true because of pragmatic constraints (i.e. some students and families will elect to continue with online learning into the Fall, even if in-person options are available), but also because of a more pervasive level of adaptation: hybrid teaching & learning has asserted itself as part of the “new normal” and will, in my opinion, be utilized well after the pandemic ceases to mandate the use of technology-mediated modalities.

Hybrid learning (alternatively referred to as blended learning) at its most basic is an approach to teaching and learning that combines, or blends, face-to-face instruction with technology-mediated instruction (Saichaie, 2020).  The actual ratio of face-to-face to online instruction can differ greatly and still be considered hybrid instruction; thus, a K-12 classroom may still be meeting in-person during traditional school hours and be participating in hybrid learning.  Hybrid learning is also closely associated with the “flipped classroom” model of learning in which students are exposed to course content prior to class so that time in class allows students to “engage in higher-order thinking and application of the concepts in a group setting with the support of the instructor to foster deep and significant learning” (Saichaie, 2020, p.97).  Effective hybrid teaching/learning will intentionally leverage the use of technology in order to replace seat time in a classroom and creatively engage students in learning in a variety of ways.

Konopelko (2020) offers some compelling suggestions as to why teachers and school districts might be more inclined to maintain hybrid or fully online learning modalities moving forward:

  • Some students responded exceptionally well to the agency afforded by online, asynchronous learning.  It’s possible that some sort of online learning option will need to remain accessible to students in public school systems in perpetuity.
  • Advancements and investments in educational technology and, more specifically, interoperability have skyrocketed during the pandemic. Previously, it was common for teachers to be using many different curriculum software products, interactive whiteboards, device types, and student information systems that didn’t always communicate well with one another. Now, with tech giants like Microsoft investing major resources into expanding their virtual learning platforms, the integration of educational technology has become markedly easier and more effective in a short period of time (e.g. Microsoft Teams touting “Effective learning, all in one place”).
  • Improvements to audiovisual tools/platforms have also been noticeable during the pandemic, opening the doors to broader use of video recordings and live, synchronous meeting sessions held virtually.

Additionally, I would suggest that:

  • Many educators’ personal digital literacy and comfort with educational technology have increased dramatically during the pandemic as they’ve been forced to lean heavily (and creatively) on ed tech tools to continue teaching.
  • The disruptive nature of the pandemic has created space for teachers to rethink how certain courses/subjects are taught with a mind towards student needs and student-centered learning. Hybrid teaching invites educators to “trim the fat” from lessons that have always been done a certain way, even as they recalibrate learning goals and think critically about student engagement. Student engagement can’t be taken for granted in a hybrid learning approach the way it might be in a physical classroom.

Thus, if we assume that hybrid teaching/learning using educational technology will, at some level, be a permanent fixture in education moving forward, it is important to pause and seriously consider the impact and potential of technology-mediated instructional practices.  ISTE Standard 3 for educators asks instructors to consider how they can inspire students to positively contribute to, and responsibly participate in, the digital world.  This standard is often closely tied to the concept of digital citizenship in that an instructor’s goal is to help learners act in socially responsible ways in digital spaces, exhibit empathetic behaviors online, think critically about online resources and their ethical use, and ultimately, build relationships and community as citizens with a stake in the virtual world.   These aspects of digital citizenship are some of the many 21st century skills that are essential for learners to acquire in 2021.  

Of course, rather than adding another piece of content for instructors to cover in a classroom, digital citizenship development can and should be embedded in instructional practice in an organic way. The Edvolve Framework put forth by Lindsey (2018) does a nice job of situating digital citizenship education within existing educational structures, noting that digital citizenship education isn’t about delivering more content; it’s about putting core values into practice within naturally occurring educational activities and technology use. In the Edvolve Framework, learner agency (attitude towards learning and personal responsibility) and digital literacy (consuming, communicating, and creating with digital tools) work together to build a digital citizen (a socially responsible participant in online environments), even as practicing digital citizenship will, in turn, improve learner agency and digital literacy skills (Lindsey, 2018). Like cogs in a machine, these three elements inform and mutually influence one another in all learning activities. Thus, with the Edvolve Framework as an example, it’s reasonable to assert that a hybrid learning model and the effective use of educational technology can, and should, naturally lead to the cultivation of digital citizens.

Lindsey (2018), https://www.edvolvelearning.com/framework.html

Pedersen et al. (2018) refer to hybrid teaching as an entry point for practicing and expressing digital citizenship. More than any specific content, curriculum, or list of “dos and don’ts,” Pedersen et al. (2018) argue that digital citizenship is about becoming, belonging, and cultivating the capabilities to do so. Similarly, Emejulu and McGregor (2019) assert that

“…the cornerstone of a radical digital citizenship is the insistence that citizenship is a process of becoming – that it is an active and reflective state for individual and collective thinking and practice for collective action for the common good” (p.140). 

There is something unique to hybridity that develops student thinking towards digital citizenship. “Thinking and acting in hybrid ways change the scope and space for education, making it more inclusive and conducive to the fostering of a digital citizenship that opens up to something other” (Pedersen et al., 2018, p. 234). Hybridity within education is the acknowledgement, and indeed the valuing, of otherness and difference, and it develops a learner’s ability to exist in in-between spaces in a globalized world (Pedersen et al., 2018). Hybridity, by definition, is the combination of more than one thing, and thus hybrid teaching and learning will not subscribe to one set of rules; rather, it will ask students to be flexible and practice resilience, thinking critically about the nuance of context and the shifting roles and expectations for themselves and others therein. Pedersen et al. (2018) argue that digital citizenship provides the “why” behind a hybrid educational enterprise, and that a value-based approach to teaching will inherently impact instructional design and, therefore, the learning experiences and goals. Hybrid teaching/learning with a digital citizenship “why” driving it will embody the following core values:

Shared value foundation for Hybrid Education (Pedersen et. al., 2018)

Core Value: Underlying Values

Empathy: Care, Respect, Commitment, Compassion, Sensitivity, Invitational
Belonging & Building: Contribution, Sensitivity, Care, Generosity
Playfulness: Joy, Creativity, Curiosity, Exploration, Experimentation
Agency & Empowerment: Autonomy, Resourcefulness, Self-determination, Freedom, Courage
*Bildung: Thoughtfulness, Discipline, Professionalism
Discovery: Experimentation, Curiosity, Exploration
*Bildung refers to the German philosophical tradition of self-cultivation for both personal and societal maturation

Once again, these core values are not meant to be additive curricular elements; rather, these values and the practice of digital citizenship should be organically embedded in the learning process whenever educational technology or a hybrid approach to education is involved, and it is the essence of hybrid teaching/learning which will create space for these values to be cultivated in students.

It is also important to mention that hybridity, in contrast to a learning environment that is fully online, has unique benefits in that students are, to some extent, still situated in a physical environment, and therefore in the cultural and sociopolitical realities (and limitations) that are part of that physical environment. Emejulu & McGregor (2019) assert that a digital citizenship that is seemingly neutral, nomadic, agnostically tolerant, and primarily concerned with effective network extension and demonstrating ‘netiquette’ in virtual spaces is, at best, incomplete. They argue that digital citizenship must be situated within social, economic, and political contexts in order to be meaningfully practiced. Indeed, a ‘radical digital citizenship’ asks citizens to think about the consequences of digital technologies in everyday life and consider “who has the power” wherever technology is used (Emejulu & McGregor, 2019). The authors assert that digital citizenship can’t just be about improving virtual spaces; rather, active digital citizenship should be situated in the ‘real world’ and help us engage with “social, economic, and environmental inequalities in a new and different way” (Emejulu & McGregor, 2019, p. 143).

In summary, the specific tools and learning platforms used to engage in hybrid teaching/learning are less important than the overall vision for what hybridity, as an ethos, can accomplish.  A hybrid teaching/learning model is uniquely valuable in its ability to organically cultivate digital citizens, and will go a long way to helping students become resilient, socially responsible, empathetic, competent, and creative participants in both virtual and physical spaces.  After all, education makes both the “becoming of the individual and the renewal of the common world possible,” and digital education is no exception (Pedersen et al., 2018, p. 227). 

References

Emejulu, A. & McGregor, C. (2019). Towards a radical digital citizenship in digital education. Critical Studies in Education (60)1, 131–147 https://doi.org/10.1080/17508487.2016.1234494 

Konopelko, D. (2020, May 7). Post-Pandemic classrooms: What will they look like and how will they be different? EdTech. https://edtechmagazine.com/k12/article/2021/05/post-pandemic-classrooms-what-will-they-look-and-how-will-they-be-different

Lindsey, L. (2018). Edvolve Framework, Evolving in the Digital Age: Digital Citizenship, Digital Age Literacy, Learner Agency. Edvolve Learning. https://www.edvolvelearning.com/framework.html

Pedersen, A.Y., Nørgaard, R.T., & Köppe, C. (2018). Patterns of inclusion: Fostering digital citizenship through hybrid education. Educational Technology & Society 21(1), pp. 225-236. https://www.jstor.org/stable/26273882?seq=1#metadata_info_tab_contents

Saichaie, K. (2020). Blended, flipped, and hybrid learning: Definitions, developments, and directions. New Directions for Teaching & Learning 164, 95-104. https://doi.org/10.1002/tl.20428

Using Canvas Analytics to Support Student Success

Though online teaching/learning is hardly a new concept in education, the pandemic has necessitated a massive shift to online learning such that educators worldwide–at all levels–have had to engage with online learning in new, immersive ways. Online learning can take many forms (synchronous, asynchronous, hybrid, hyflex, etc.), but regardless of the form, educators with access to an LMS have been forced to lean into these platforms and leverage the tools within them in significant ways, continually navigating (perhaps for the first time) how to best support students in achieving their learning goals using technology.

Without the consistent opportunities for face-to-face communication and informal indicators of student engagement that are typically available in a classroom (e.g. body language, participation in live discussions, question asking), a common challenge faced by educators in online learning environments–especially asynchronous ones–is how to maintain and account for student engagement and persistence in the course. Studies using Educational Data Mining (EDM) have already demonstrated that student behavior in an online course correlates directly with successful completion of the course (Cerezo et al., 2016). Time and again, these studies have supported the assertion that students who are more frequently engaged with the content and discussions in an online course are more likely to achieve their learning goals and successfully complete the course (Morris et al., 2005). This relationship is, however, tricky to measure, because time spent online is not necessarily representative of the quality of the online engagement. Furthermore, different students develop different patterns of interaction within an LMS which can still lead to a successful outcome (Cerezo et al., 2016). Consequently, even as instructors look for insights into student engagement from their LMS, they must avoid putting too much emphasis on the available data, or taking a ‘one style fits all’ approach to interpreting it. Instead, LMS analytics should be considered one indicator of student performance that contributes to the bigger picture of student learning and achievement. Taken in context, the data that can be quickly gleaned from an LMS can be immensely helpful in identifying struggling or ‘at-risk’ students and/or those who could benefit from differentiated instruction, as well as possible areas of weakness within the course design that need addressing.

Enter LMS analytics tools and the information available within.  For the purposes of this post, I’ll specifically be looking at the suite of analytics tools provided by the Canvas LMS, including Course Analytics, Course Statistics, and ‘New Analytics.’

  • Course Analytics are intended to help instructors evaluate individual components of a course as well as student performance in the course.  Course analytics are meant to help identify at-risk students (i.e. those who aren’t interacting with the course material), and determine how the system and individual course components are being used.  The four main components of course analytics are: 
    • Student activity, including logins, page views, and resource usage
    • Submissions, i.e. assignments and discussion board posts
    • Grades, for individual assignments as well as cumulative
    • Student analytics, which is a consolidated page view of the student’s participation, assignments, and overall grade (Canvas Community(a), 2020).  With permission, students may also view their own analytics page containing this information.
  • Course Statistics are essentially a subset of the larger course analytics information pool.  Course statistics offer specific percentages/quantitative data for assignments, discussions, and quizzes.  Statistics are best used to offer quick, at-a-glance feedback regarding which course components are engaging students and what might be improved in the future (Canvas Community(b), 2020).
  • New Analytics is essentially meant to be “Course analytics 2.0” and is currently in its initial rollout stage.  Though the overall goal of the analytics tool(s) remains the same, New Analytics offers different kinds of data displays and the opportunity to easily compare individual student statistics with the class aggregate.  The data informing these analytics is refreshed every 24 hours, and instructors may also look at individual student and whole class trends on a week-to-week basis.  In short, it’s my impression that ‘New Analytics’ will do a more effective job of placing student engagement data in context.  Another feature of New Analytics is that instructors may send a message directly to an individual student or the whole class based on a specific course grade or participation criteria (Canvas Community(c), 2020). 
Sample Screenshot of Canvas New Analytics, https://sites.udel.edu/canvas/2019/11/new-canvas-analytics-coming-to-canvas-in-winter-term/
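
The criteria-based messaging in New Analytics can be thought of as a filter over grade and participation data. The sketch below is not Canvas code; the field names and thresholds are invented for illustration, and it simply shows the kind of rule an instructor configures when messaging, say, everyone below a grade floor:

```python
# Hypothetical roster rows; Canvas's real data model differs.
students = [
    {"name": "Ana",   "grade": 92, "weekly_views": 40},
    {"name": "Ben",   "grade": 58, "weekly_views": 6},
    {"name": "Chloe", "grade": 71, "weekly_views": 3},
]

def needs_outreach(student, grade_floor=70, views_floor=5):
    """Flag a student whose grade OR participation falls below a threshold."""
    return student["grade"] < grade_floor or student["weekly_views"] < views_floor

# Ben is flagged on grade, Chloe on participation.
to_message = [s["name"] for s in students if needs_outreach(s)]
print(to_message)  # ['Ben', 'Chloe']
```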

Of course, analytics and statistics are only one tool in the toolbelt when it comes to gauging student achievement, and viewing course statistics need not be the exclusive purview of the instructor.  As mentioned above, with instructor permission, students may view their own course statistics and analytics in order to track their own engagement.  Beyond viewing grades and assignment submissions, this type of feature can be particularly helpful for student reflection on course participation, or perhaps as an integrated part of an improvement plan for a student who is struggling.

Timing should also be a consideration when using an LMS tool like Canvas’ Course Analytics. When it comes to student engagement and indicators of successful course completion, information gathered in the first weeks of the course can prove invaluable. Rather than being used solely for instructor reflection or summative ‘takeaway’ information about the effectiveness of the course design, course analytics may be used as early predictors of student success, and the information gleaned may be used to initiate interventions from instructors or academic support staff (Wagner, 2020). Thus, instructors who use Canvas will likely find that their analytics tools prove most helpful within the first week or two of the course (University of Denver Office of Teaching & Learning, 2019). For example, if a student in an online course is having internet access issues, the instructor can likely see this reflected early on in the student’s LMS analytics data. The instructor would then have reason to reach out and make sure the student has what they need in order to engage with the course content. If unstable internet access is the issue, the instructor may flex due dates, provide extra downloadable materials, or modify assignments as needed throughout the quarter in order to better support the student.
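
The early-warning idea above can be sketched in a few lines. The student names, login counts, and the quarter-of-the-median threshold are all invented for illustration, not drawn from Canvas or from any cited study:

```python
from statistics import median

# Hypothetical login counts per student over the first two weeks of a course.
first_two_weeks_logins = {"Dev": 14, "Eli": 1, "Fay": 9, "Gus": 0}

# Use the class median as a rough activity baseline.
baseline = median(first_two_weeks_logins.values())

# Flag anyone at less than a quarter of the baseline for early outreach,
# e.g. to check for access or connectivity problems.
at_risk = sorted(name for name, logins in first_two_weeks_logins.items()
                 if logins < baseline / 4)
print(at_risk)  # ['Eli', 'Gus']
```

In practice the threshold and the window would be tuned to the course; the point is simply that a crude rule over first-week data is enough to prompt a timely, human check-in.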

Finally, as mentioned above, in addition to gauging student performance, LMS analytics tools may be used by the instructor to think about the efficacy of their course design. Canvas’ course analytics tools help instructors see which resources are being viewed/downloaded, which discussion boards are most active (or inactive), which components of the course are most frequented, etc. Once an online course has been constructed, it can be tempting for instructors to “plug and play” and assume that the course will retain the same effectiveness every semester it’s used moving forward. Course analytics can help instructors identify redundancies and course elements that are no longer needed or relevant due to lack of student interest. They can also help instructors think critically about what seems to be working well in their course (i.e. what students are using, where they are spending the most time), why that might be, and how to leverage that when adding other course components or tweaks in the future.
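
This kind of course-design review amounts to ranking components by usage. The resource names and view counts below are invented for illustration; a real export from an analytics tool would be richer, but the triage logic is the same:

```python
# Hypothetical per-resource view counts from a course analytics export.
resource_views = {
    "Week 1 lecture video": 120,
    "Optional reading PDF": 4,
    "Discussion: intro thread": 98,
    "Legacy 2019 syllabus": 2,
}

# Rank resources by views, ascending, to surface candidates for
# pruning or redesign before the next run of the course.
ranked = sorted(resource_views.items(), key=lambda kv: kv[1])
least_used = [name for name, views in ranked if views < 10]
print(least_used)  # ['Legacy 2019 syllabus', 'Optional reading PDF']
```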

In summary, the information available via an LMS analytics tool should always be considered in concert with all other factors impacting student behavior in online learning, including varying patterns or ‘styles’ in students’ online behaviors and external factors like personal or societal crises that may have impacted the move to online learning in the first place. Student engagement data (as measured by LMS analytics tools) can be helpful for identifying struggling students, providing material for student self-reflection, and offering insight into the effectiveness of the instructor’s course design. To the extent that analytics tools aren’t treated as the “end-all, be-all” of measuring student success, tools like Canvas Analytics are a worthwhile consideration for instructors teaching online who are invested in student success as well as their own professional development.

References:

Canvas Community. (2020a). What are Analytics? Canvas. https://community.canvaslms.com/t5/Canvas-Basics-Guide/What-are-Analytics/ta-p/88

Canvas Community. (2020b). What is New Analytics? Canvas. https://community.canvaslms.com/t5/Canvas-Basics-Guide/What-is-New-Analytics/ta-p/73

Canvas Community. (2020c). How do I view Course Statistics? Canvas. https://community.canvaslms.com/t5/Instructor-Guide/How-do-I-view-course-statistics/ta-p/1120

Cerezo, R., Sanchez-Santillan, M., Paule-Ruiz, M., & Nunez, J. (2016). Students’ LMS interaction patterns and their relationship with achievement: A case study in higher education. Computers & Education 96, 42-54. https://www.sciencedirect.com/science/article/pii/S0360131516300264

Morris, L.V., Finnegan, C., & Wu, S. (2005). Tracking student behavior, persistence, and achievement in online courses. The Internet and Higher Education 8, 221-231. https://www.sciencedirect.com/science/article/pii/S1096751605000412 

Wagner, A. (2020, June 6). LMS data and the relationship between student engagement and student success outcomes. Airweb.org. https://www.airweb.org/article/2020/06/17/lms-data-and-the-relationship-between-student-engagement-and-student-success-outcomes 

Resilient pedagogy: The professional development opportunity educators need now more than ever

Resilient pedagogy is an emerging instructional philosophy with timely implications for the current moment in education and the ongoing effects of the COVID-19 pandemic.  Facets of resilient pedagogy have long been practiced by educators in the form of classroom differentiation, and frameworks like Universal Design for Learning (UDL) and Transparency in Learning and Teaching (TILT) also inform it.  Building upon these approaches to instructional design and extending beyond them, Rebecca Quintana and her colleagues at the University of Michigan have attempted to define a more expansive type of differentiation, one that brings to the forefront the need for instructors to be agile and intentional in all educational contexts, but especially in moments of crisis and change.  More than just a fancy synonym for differentiation, resilient pedagogy can be defined as "…the ability to facilitate learning experiences that are designed to be adaptable to fluctuating conditions and disruptions" (Quintana & DeVaney, 2020, para. 8). Resilient teaching is an approach that "take[s] into account how a dynamic learning context may require new forms of interactions between teachers, students, content, and tools" (Quintana & DeVaney, 2020, para. 8), and those who practice resilient pedagogy have the capacity to rethink the design of learning experiences based on a nuanced understanding of context (Quintana & DeVaney, 2020).  The key to resilient teaching is a focus on the interactions that facilitate learning, including all the ways that teachers and students need to communicate with one another and actively engage with the learning material (Hart-Davidson, 2020). 

“Teachers often plan carefully for delivering content…but when it comes to planning interactions, we can easily take this very important component of learning for granted.”

(Hart-Davidson, 2020, para. 5)

In 2020, Rebecca Quintana and the University of Michigan released a Massive Open Online Course (MOOC) via Coursera titled "Resilient Teaching Through Times of Crisis & Change."  The MOOC is available in a free, open access format and offers a flexible learning structure, making it accessible to any educator wanting to engage with the topic. The registration process is simple, and because it is an asynchronous online learning experience, there are no constraints on when a participant must register or complete the course.  Though the course is aimed at educators who may need to rethink how they teach in the immediate or near future due to the ever-changing circumstances of the pandemic, the course creators "…expect it will remain relevant to instructors who are faced with disruptions and change to their teaching for any number of reasons and must quickly adapt their course designs" (Quintana, 2020). Furthermore, though the MOOC is especially relevant to the higher education environment, the principles of resilient pedagogy can absolutely be applied in any type of classroom by any type of educator.

Interested educators may engage with the course casually by reviewing videos (thoughtfully ‘chunked’ into appropriately consumable lengths) and reading materials in whatever order and pacing–and to whatever depth–feels pertinent to their needs.  They can choose to purchase the full course and engage in all aspects of the learning experience, including submitting assignments and completing checks for understanding.  In this format, participants can receive a course completion certificate at the end.  This type of engagement may be especially helpful if participating in the course alongside colleagues in a more formal professional development venture.  My personal engagement has been decidedly less formal.

The course content focuses on three key components of resilient pedagogy: designing for extensibility, designing for flexibility, and designing for redundancy.  This three-principle framework helps flesh out the meaning and potential of resilient pedagogy while also serving as a practical guide to course design.

  1. Designing for Extensibility means that a course is designed in such a way that it has a clearly defined purpose and essential, unaltered learning goals, and yet the basic essence of the course content can be extended with new capabilities and functionality as needed.  This may involve the introduction of new tools or a change in format, moving fluidly from synchronous to asynchronous modalities, etc.  
  2. Designing for Flexibility means that a course is designed to respond to the individual needs of learners within a changing learning environment.  In a nod to the UDL framework, designing for flexibility means that a course is structured to meet a variety of student needs and learning styles, even before knowing specific individuals in a given class.  Flexibility will require a learner-centered approach with multiple means of engagement/expression and considerations for student needs which may arise within variable class sizes and modalities.  A course designed for flexibility will also allow instructor expectations and assessments to flex in response to these needs.
  3. Designing for Redundancy, simply put, means having contingency plans in place. Designing for redundancy asks instructors to analyze a course design for possible vulnerabilities.  For example, how will students accustomed to synchronous virtual meetings be given the opportunity to engage in course activities if their internet access becomes unpredictable?  In this design approach, instructors look for alternative ways of accomplishing goals with the hope of eliminating “single points of failure.” This is, of course, incredibly important when learning is situated in a time of crisis or emergency.
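
The redundancy principle in particular lends itself to a systems-style check: list the ways each planned interaction can happen, then look for any interaction with only one path. The course map below is entirely hypothetical, a minimal sketch of the idea rather than a real course design:

```python
# Hypothetical course map: each interaction lists the modalities that can
# deliver it. An interaction with only one modality is a single point of
# failure under resilient design.

course_interactions = {
    "lecture content":  ["live Zoom session", "recorded video", "annotated slides"],
    "class discussion": ["in-person seminar", "asynchronous forum"],
    "office hours":     ["drop-in Zoom room"],   # vulnerable: one path only
    "final assessment": ["timed online exam"],   # vulnerable: one path only
}

def single_points_of_failure(interactions):
    """Return the interactions that have no backup modality."""
    return sorted(name for name, modes in interactions.items() if len(modes) < 2)

print(single_points_of_failure(course_interactions))
```

The items this check surfaces are exactly the places where a disruption (an outage, an illness, a campus closure) would halt learning outright, and where a contingency plan is most needed.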

These three principles of resilient pedagogy do not stand alone. Rather, they inform one another and will naturally overlap in the instructional design process.  The MOOC contains excellent examples and practical applications of extensibility, flexibility, and redundancy throughout, but Rebecca Quintana and her team aren’t the only academics talking about resilient pedagogy, and examples of resilient pedagogy implemented during the pandemic can be found outside the MOOC.  For the reader who might be thinking about resilient pedagogy for the first time, here are a few examples of what resilient pedagogy may look like in practice:

  • Educators on a staggered schedule or a hybrid return-to-school plan may put together an in-person and virtual lesson plan that can be running at the same time on the same day with students engaging with the same content in two different modalities (Watson, 2020).
  • Instructors may create a spreadsheet for a course which helps track various contingencies and needed adjustments for various modalities: in person, hybrid or hyflex, and fully remote (Quintana, 2020).
  • Resilient pedagogy involves reducing complexity in any way possible.  This can look like establishing a predictable weekly pattern for remote students, having fewer due dates, simplifying assignments, etc. (Kaston Tange, 2020). Resilient pedagogy in practice means educators can scale up or down as needed according to student needs, understanding that crisis situations almost always call for some sort of scaling down. It's OK to pare a course down to its most essential elements.
  • Resilient pedagogy requires an emphasis on feedback and interactions vs. assignments and grading.  Grading fewer assignments while also providing more opportunities for ongoing feedback increases the opportunity for interactions between instructors and students while also lowering the stakes for all parties (Watson, 2020).  It also keeps educators from getting stuck trying to stick a “square-pegged” assignment or assessment into a “round hole” of a specific digital tool, modality, or crisis context, simply because this assignment has always been done as part of the course in the past. 
  • As another way to emphasize the importance of interactions within a course, resilient pedagogy prioritizes small group interactions over and above large group instruction (Watson, 2020).  This can take many forms in both synchronous and asynchronous, online and in-person formats.
  • Resilient pedagogy requires educators to consider the use of digital tools carefully within their course design. If, for example, they are using a particular tool on which the success of their students rests, instructors may dedicate time within their learning activities to help students learn how to use that technology and not make assumptions about their students’ digital literacy (Gardiner, 2020).

Though the application of resilient pedagogy may feel particularly urgent in this moment of crisis, resilient teaching will benefit students and instructors in the long run, regardless of circumstance.  At the end of the day, resilient teaching forces instructors to examine student engagement carefully and intentionally and to develop a student-centered mindset.  It also helps instructors design a dynamic course once, so that they're using their time and efforts efficiently and making their courses as resistant to disruption as possible (Gardiner, 2020).  Resilience has been an oft-discussed trait that 'successful' students possess, but perhaps it's time to shift that focus onto educators.  Successful educators must be resilient themselves.  It's not only necessary for this moment; it's the right thing to do for students in all contexts moving forward, and the "Resilient Teaching Through Times of Crisis & Change" MOOC is a great place to start.

“If it seems like resilient pedagogy is in line with calls for us all to be making learning more inclusive and accessible, it certainly is.”

(Hart-Davidson, 2020, para. 17) 

References:

Gardiner, E. (2020, June 25). Resilient Pedagogy for the Age of Disruption: A Conversation with Josh Eyler. Top Hat. https://tophat.com/blog/resilient-pedagogy-for-the-age-of-disruption-a-conversation-with-josh-eyler/

Hart-Davidson, B. (2020, April 6). Imagining a resilient pedagogy. Medium. https://cal.msu.edu/news/imagining-a-resilient-pedagogy/

Kaston Tange, A. (2020, June 8). Thinking about the humanities. https://andreakastontange.com/teaching/resilient-design-for-remote-teaching-and-learning/

Quintana, R. (2020).  Resilient teaching through times of crisis and change [MOOC]. Coursera. https://www.coursera.org/learn/resilient-teaching-through-times-of-crisis 

Quintana, R., & DeVaney, J. (2020, May 27). Laying the foundation for a resilient teaching community. Inside Higher Ed. https://www.insidehighered.com/blogs/learning-innovation/laying-foundation-resilient-teaching-community 

Watson, A. (2020). Flexible, resilient pedagogy: How to plan activities that work for in-person, remote, AND hybrid instruction.  Truth for Teachers. https://thecornerstoneforteachers.com/truth-for-teachers-podcast/resilient-pedagogy-hybrid-instruction-remote-learning-activities/

Diigo as a tool for collaborative learning and research in higher education

There is significant opportunity within higher education environments–indeed, all education environments–to lean into a constructivist educational philosophy and approach knowledge as something co-created by both instructors and students.  Furthermore, as higher education courses and programs are increasingly offered in hybrid and fully online modalities, finding authentic ways for students to increase their social presence and overall engagement-level in coursework is essential (Baran, 2013). Digital tools can greatly assist in the act of socially constructing knowledge by helping to eliminate learning boundaries and extend opportunities for both formal and informal learning in a myriad of ways (Baran, 2013).  

Within higher education, one of the more important realms of knowledge construction between students and instructors, especially at the graduate level, is the academic research process and the conversations that, quite literally, take place in the margins of that process (Farber, 2019).  As a graduate student who does not currently use any specific annotation or research collaboration tools outside of Microsoft Teams or Google Suite, I am curious to explore the ways in which the social bookmarking tool Diigo (which allows learners to collect, annotate, organize, and share online resources) supports efficient, collaborative research among instructors, students, and their peers in higher education.  Furthermore, I am interested in anecdotally comparing this tool with the functionalities of more recently released (and decidedly more expansive) digital collaboration platforms like Teams and Google Suite.  Does Diigo hold its own in the digital collaboration tool market in 2021?

According to the product website (Diigo Inc., 2021), Diigo supports collaborative learning endeavors in four key ways.  It allows users to:

  • Collect: save and tag online resources to public or private curated libraries for easy access
  • Annotate: annotate web pages, PDFs, and other digital content directly while browsing online
  • Organize: organize links, references and personal input to create a structured research base
  • Share: share research with friends, classmates, colleagues or associates
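
Conceptually, this collect-annotate-organize-share workflow boils down to a shared, tag-searchable collection of annotated links. As a toy model (not Diigo's actual data format or API), it might look like:

```python
# Toy model of a social-bookmarking library: each entry carries a URL,
# user tags, and annotations. Not Diigo's real data format or API.

library = []

def collect(url, tags, annotation=None):
    """Save a resource to the shared library with tags and an optional note."""
    library.append({"url": url, "tags": set(tags),
                    "annotations": [annotation] if annotation else []})

def find_by_tag(tag):
    """The organize/search step: all saved URLs carrying a given tag."""
    return [entry["url"] for entry in library if tag in entry["tags"]]

collect("https://example.edu/paper1", ["pedagogy", "LMS"], "Key stats in Sec. 3")
collect("https://example.edu/paper2", ["pedagogy"])

print(find_by_tag("pedagogy"))
```

Everything Diigo layers on top (groups, sticky notes, highlights, educator accounts) is, at heart, an elaboration of this shared tagged-library structure.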

Originally released in 2011, Diigo (Digest of Internet Information, Groups and Other stuff) quickly found a dedicated user-base and distinguished itself from other types of bookmarking applications (most notably its 2003 bookmarking predecessor, Delicious) due to its user-friendly interface, emphasis on social engagement, and extensions specific to use in education, combined with more traditional bookmarking functionalities (Ruffini, 2011).  The table below shows a helpful comparison between Diigo, its initial competitor and predecessor Delicious (now defunct), and the typical bookmarking capabilities of a web browser.

Table 1. Social Bookmarking Comparison Chart (updated by Ruffini, 2011)

Feature                                        Diigo   Delicious     Browser
Organize bookmarks automatically with tags       X         X            X
Popular bookmarks                                X         X            X
Anytime, anywhere access to bookmarks            X         X
Share bookmarks with others                      X         X
Powerful, customizable search tools              X         X
Groups of people with similar interests          X         X
Post automatically to blog                       X         X
Tools and browser extensions for bookmarking     X         X
Lists of grouped bookmarks                       X         X
Free iPhone and Android apps                     X     Third party
iPad Safari browser bookmarklet                  X
Add and share sticky notes                       X
Capture, mark up, share images and text          X
Collect web pictures into albums                 X
Sync bookmarks                                   X
Tools for educators                              X
Original table by: Schmidt, Jason. (2010, July 30). Diigo and Delicious. Interactive Inquiry. https://iisquared.wordpress.com/2010/07/30/diigo-and-delicious/

As the table indicates, Diigo offers much more functionality in several categories, but most notably (pun intended) in the realm of annotation.  When installed as a browser extension, Diigo can be integrated fairly seamlessly into existing research habits, and perhaps most importantly, most Diigo tools can be accessed for free.

Since Delicious left the scene, new social learning and annotation tools have surfaced, such as Hypothes.is and Mendeley, both of which deliver many of the same features as Diigo, and both of which are largely open access.  Though a detailed direct comparison of these tools is beyond the scope of this post, a brief exploration suggests that Hypothes.is might be most appealing for educators who would like to incorporate the application into their Learning Management System (LMS).  Hypothes.is integrates nicely into all of the major LMS platforms, and it offers many resources and training videos for educators so that they can truly maximize their use of the tool within their planned learning activities (Guhlin, 2020). Mendeley, a tool primarily intended for use in higher education, has a handy "cite as you write" plugin that streamlines the reference process, automatically capturing author, title, and publisher information as needed (Guhlin, 2020).

Though it’s now been a decade since its release, Diigo seems to maintain its relevance and dedicated user base for several reasons:

  1. It is a bit simpler and more user-friendly than its competitors such that it is more easily adapted for use in a variety of learning environments and contexts, including both K-12 and higher education (Guhlin, 2020).
  2. Diigo seems to be the tool with the strongest integrated support for educators independent of an LMS.  Since its early stages, Diigo has offered special educator accounts that give registered teachers a variety of extra tools and features, allowing the tool to be used for collaboration with an entire class if desired (Educational Technology and Mobile Learning, 2015).
  3. Individual features like Diigo Outliner, which lets you create and share digital outlines within a document, add sustaining value to the tool; these features are more nuanced than the general commenting features or Track Changes available in so many other types of collaboration tools (Guhlin, 2020).
  4. Because of its longevity, Diigo has had time to build a large, lasting user base and dynamic Interest Groups (e.g., K-12 teachers, higher education instructors, researchers), which offer grassroots professional development tips and organic user insights accessible to the whole community (Ruffini, 2011).  
  5. Diigo’s longevity is also a testament to the creators’ ability to update the tool to best fit user needs over time, and there continue to be product and app updates on a regular basis. Diigo has evolved over the years, and today it is used more frequently for its collaboration/annotation capabilities than the social bookmarking services it was originally focused on (Guhlin, 2020).

Having recently worked on a collaborative research publication using Microsoft Teams, and as a frequent user of collaborative Google products for both academic and personal endeavors, I was curious if this exploration would support Diigo as a stand-alone tool to be considered for use in collaborative research endeavors, or whether its offerings were more or less synonymous with tools embedded within these larger platforms.  Anecdotally, I think the answer is yes, Diigo does stand alone, at least for specific use cases.

Both Teams and Google have many strengths when it comes to video conferencing and cloud-based word processing and document sharing, but I do not find that these tools go quite as far in aiding the initial research phase. According to Educational Technology and Mobile Learning (2015), with Diigo, student and faculty researchers may:

  • Search for online content relevant to their project, bookmark the websites and then add them to a shared ‘class’ or group
  • Organize bookmarks by tags and date to organize content around a particular topic and to make it easy to search for it later
  • Highlight specific segments of a webpage or add sticky notes to annotate them for others to read 
  • Take screenshots of useful online content and annotate them for use as well

In these cases, Diigo essentially cuts out a step (or several) for an instructor or student trying to share research. In the initial research phase, a researcher using Diigo would not need to download a PDF or copy a link of interest only to re-upload or paste it later into a general repository, losing the ability to annotate and organize the resource as efficiently as Diigo allows.  That said, I think Diigo's strongest value lies in working toward a specific research purpose with a specific group of people.  Its value is inherently collaborative; it is best used when trying to co-construct knowledge.  Consequently, it isn't a tool I'll be using regularly in my daily academic activities for just myself, but it is a tool I'll be reaching for when it comes time to spearhead my next research project.

References:

Baran, E. (2013). Connect, participate and learn: Transforming pedagogies in higher education. Bulletin of the IEEE Technical Committee on Learning Technology, 15(1), p. 9-12. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.681.1177&rep=rep1&type=pdf

Diigo Inc. (2021). Diigo. https://www.diigo.com/

Educational Technology and Mobile Learning. (2015, January 14). 7 ways students use Diigo to do research and collaborative project work. https://www.educatorstechnology.com/2015/01/7-ways-students-use-diigo-to-do-research.html

Farber, M. (2019, July 22). Social Annotation in the Digital Age. Edutopia. https://www.edutopia.org/article/social-annotation-digital-age

Guhlin, M. (2020, April 13). Note-taking and outlining: Five digital helpers. TechNotes. https://blog.tcea.org/note-taking-and-outlining/

Ruffini, M. (2011, September 27). Classroom collaboration using social bookmarking service Diigo. Educause Review. https://er.educause.edu/articles/2011/9/classroom-collaboration-using-social-bookmarking-service-diigo

Schmidt, Jason. (2010, July 30). Diigo and Delicious. Interactive Inquiry. https://iisquared.wordpress.com/2010/07/30/diigo-and-delicious/

Leveraging Digital Tools for Instruction Outside the Classroom: Community Engagement Project, EDTC 6102

It’s been a wonderful opportunity to invest time and energy into a project which ultimately helps me do my job better.  Though I am not currently a classroom teacher and do not have typical instructional responsibilities in my day to day work, I do constantly convey important information about the WA State teacher certification process to prospective educators, a process which can often feel overwhelming and convoluted to many would-be career changers. I feel confident that I’ve put together a blueprint for a meaningful, engaging informational session which will help students navigate the first steps of the certification process with confidence and clarity, ultimately helping them discern for themselves if becoming a teacher is the right step for them professionally at this time, and if so, how to make that a reality. Not only that, I’ve been able to think creatively about how to leverage digital tools to make the session interactive, student-centered, and practically useful to those who attend.

Lesson Plan for 60 Minute Information Session

This lesson plan is intended for a group of 10-20 prospective graduate students using a virtual teleconference platform like Zoom, Teams, Google Meet, etc.

Note: some hyperlinks may not be accessible to all due to permissions settings

Introduction 
10 min.
Students will be notified of my intention to record the session and their associated rights (turning off video, etc.).

Introduce Self and Learning Objectives of Info Session:
Objective 1: learn how someone becomes a teacher in WA State and determine personal readiness to begin the process.
Objective 2: learn what program options are available at SPU and which program is best-fit for personal context
Objective 3: learn what steps to take next with an application to a teacher certification program

10 Signs That You Should Become a Teacher, opening reflection video
Interactive Presentation
20 min.
Google Slide Deck
Students will be provided with access to the Google Slide Deck in advance of the information session so that they may conduct research ahead of time or follow along independently, clicking on hyperlinks, etc. in their own browser window. I will also use it to structure the presentation of information in the session.  For those who do not choose to follow along independently, hyperlinks for interactive elements will be provided directly in the session chat.

Within the slide deck, students will be introduced to SPU's various graduate teacher certification program options. Students will be given the opportunity to stop and reflect on what feels like their best-fit program halfway through, before new information about the application process is introduced.  They will also be given the opportunity to ask clarifying questions at this point via Jamboard.

Students will then be given detailed information about application requirements and due dates, including specific information about the endorsement verification process.  This is also a time where I intend to help students understand and navigate the various information systems which they will have to engage with throughout the certification and application process (online application, standardized tests, etc), and to what extent they’ll be expected to provide personal data in digital spaces.
Formative Assessment
10 min.
Prospective students will then have the opportunity to review and test their understanding of the material via a brief, 15-question Kahoot Quiz.  This will also serve as a formative assessment tool for me, the instructor, and may help point out areas that need further clarification. There will be time to pause and address these areas while going through the Kahoot Quiz.
Performance Task 
10 min.
Students will then have the opportunity to curate a personal To Do List outlining their next steps towards an application (and thus, towards becoming a teacher).  This is a performance task that helps students indicate their understanding of the material covered in the information session, but it is also meant to be a practical, relevant takeaway for each attendee.

Students will be able to make a copy of this Google Doc Template which contains a scaffolded “word bank” of application requirements.  Students will be able to copy/paste from the word bank in order to create their own, personalized To Do List, paying special attention to their specific program needs, endorsement requirements, and chronological order (i.e. which items need attention first). I will also provide the Google Doc Template in an alternate format (i.e. Word document) for any students that need it after the session. 
Self-Assessment & Reflection
10+ min.
Shortly after providing students with the Google Doc link, I will provide a link to the final Sticky Note Q & A w/ Jamboard so that students may ask any needed questions while they construct their lists without interrupting the thought processes of others in the session.  They may also choose to drop questions to me privately in the chat, depending on the immediacy of the need and/or the group appeal of their question. 

After the allotted 10 minutes passes for list construction, I’ll also offer a final few minutes for students to review their lists and ask final questions that need resolving on the Jamboard, OR live in the video conference, time permitting. The session will be recorded and the recording will be provided to students after the information session via email. This allows students to go back and review as needed.

Throughout this session, students will have the opportunity to demonstrate their understanding through application and self-knowledge.

  • Application: as students create their own, personally curated “To Do List” they will be able to apply their understanding of the session material by creating a useful tool that will guide them moving forward.  This includes discerning which information is most relevant to their particular context.  Students will be able to effectively navigate the various information systems which they will have to engage with throughout the certification and application process, especially in regards to the information they provide during the application process.  This also brings to mind the fact that students will literally apply to a program as part of their next steps towards becoming a teacher.
  • Self-Knowledge: students will be invited to reflect on their motivations to become a teacher, whether now is the time to take steps towards becoming a teacher, as well as what kind of program would be best suited for their needs. There will also be ample time for students to grapple with what they do not know, or what is confusing for them in this process.  The decisions they make moving forward will be rooted in the self-knowledge acquired from this session.
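
For the technically inclined reader, the filtering and sequencing that the To Do List task asks students to do by hand can be sketched in a few lines. The requirement names, order numbers, and program labels below are hypothetical placeholders, not SPU's actual application checklist:

```python
# Hypothetical word bank: (step, chronological order, programs it applies to).
# Program labels "MAT" and "ARC" are invented for illustration.
word_bank = [
    ("Submit online application", 1, {"MAT", "ARC"}),
    ("Register for basic-skills test", 2, {"MAT", "ARC"}),
    ("Verify subject-area endorsement", 3, {"ARC"}),
    ("Request official transcripts", 2, {"MAT"}),
]

def build_todo(program):
    """Filter the word bank to one program and sort by chronological order."""
    steps = [(order, step) for step, order, programs in word_bank
             if program in programs]
    return [step for order, step in sorted(steps)]

print(build_todo("MAT"))
```

The point of the performance task is that students do this filtering themselves, which is what makes the resulting list a demonstration of understanding rather than a handout.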

I do believe that the most helpful reflections on this "lesson plan" will come after I've had the opportunity to put it into action for the first time in a professional setting, likely in Fall 2021. That said, one area for potential improvement that I can already identify is the platform for the curated "To Do List."  Though I found some potentially sleeker, less cumbersome tools than a Google Doc, the ones I came across were not open access or would require a full account set-up to use. This wouldn't translate well to the timing and context of this particular lesson, nor to the student audience (i.e., prospective students attending an information session, not students enrolled in a class).  Thus, for now, the Google Doc format serves as a bit of a placeholder.  I'm quite open to tweaking this section of my lesson in favor of a better tool later on.

In summary, this particular project was a wonderful exercise in thinking about learning and instruction on a macro level and the many ways they take place outside of a formal classroom environment. Digital tools may be leveraged in a myriad of ways to help us do our jobs better, and this was an opportunity for me to think creatively about how to bring that home in my own context. I look forward to using this session format in the next recruiting cycle!

Global research collaboration and the pandemic: How COVID-19 has accelerated networked learning in higher education

Image courtesy of https://www.polyu.edu.hk/web/en/about_polyu/global_network/

According to the National Science Foundation (2019), one out of every five academic research articles is written by authors hailing from more than one country. This suggests that the value of international research collaboration was recognized and sought out well in advance of the global COVID-19 pandemic of 2020, and perhaps that was only the beginning.  Reasons to pursue global collaboration in higher education include reaching wider audiences and increasing the impact of published research, reducing bias and broadening perspectives with a diverse research team, and offsetting domestic skill shortages by collaborating across national borders (Lee & Haupt, 2020).  Networked learning in higher education can also encourage new levels of creativity and innovation across disciplines, and it expands the potential for authentic global and cultural learning experiences in an increasingly connected world (Cronina et al., 2016).  These benefits apply to even the most scholastically "productive" countries, like the USA and China (Lee & Haupt, 2020).  Understanding that international collaboration in higher education was valued–at least to a certain extent–prior to 2020, I am curious to explore how recent changes in technology and cultural shifts in academia during the pandemic have worked in tandem to build upon this trend, potentially accelerating technology's impact on global research collaboration and cooperation into the future.

With the onset of COVID-19, scientists and researchers from every corner of the world scrambled to understand the virus and seek a cure. Information sharing among countries quickly became essential, especially for those countries that were hardest hit early on (Lee & Haupt, 2020).  Though socio-political tensions between countries–and even domestically within countries–were hardly in short supply in 2020, the demands of the pandemic shifted priorities such that many international corporations and research institutions began working together rather than competing with each other to produce a vaccine, and large-scale exchanges of medical and public health data, including possible solutions, were (and still are) shared internationally using digital and online tools (Buitendijk et al., 2020).

When it comes to information sharing, one way of measuring an increase in international collaboration is through a country's participation in open access publication platforms.  Open access journals and publication platforms remove barriers to accessing information and research since they do not require payment or subscriptions in order to be read and cited.  Sometimes the cost of publishing is absorbed by these open access platforms through philanthropic efforts, sponsorship, or submission fees paid by the authors, but the bottom line is that there's typically little profit to be made by academics, researchers, and authors who publish in open access platforms. Thus, the motivation for researchers–either individually or nationally speaking–to publish in an open access platform is often more altruistic in nature, placing higher priority on the sharing of information than on any potential gains or notoriety received from publishing, monetary or otherwise.  According to Lee & Haupt (2020), countries with lower GDPs that were more severely affected by the pandemic were the most likely to increase participation in open access publishing and international research efforts. It would follow that decisions to increase open-access participation were also meant to elicit reciprocal behavior from other countries, and indeed, the majority of all "knowledge producing" countries increased their participation in open access publishing during the pandemic: "For each of the top 25 COVID-19 research-producing countries, there was a noticeably higher proportion of open-access articles on COVID-19 than during the past 5 years and on non-COVID-19…publications during the same period" (Lee & Haupt, 2020, para. 26).

In addition to the public health concerns that have motivated scholars and researchers to share more information during the pandemic, it must also be said that the act of collaboration has gotten dramatically easier in recent years.  In a 2010 publication, Iorio et al. discuss the use of digital tools designed to facilitate international collaboration and interaction amongst higher education scholars.  In this specific case, domestic teams in five different areas of the world were attempting to complete an integrative design task, which required synchronous virtual meetings; a way to exchange ideas, brainstorm, and problem solve (though not necessarily synchronously); and an appropriate digital repository for their work (building plans, model mapping, cost estimates, etc.) which could be accessed frequently by the members of each team in their respective countries.  The article focused on reviewing the virtual reality platform Second Life. To me, Second Life now feels woefully insufficient as a project management platform, at least by 2021 standards, but at the time, the authors found Second Life to be a comparatively "appealing choice" due to its options for customization and tools such as virtual white boards, voice and text chat, scheduling agents, etc., all in one centralized, virtual location. Second Life aside, the authors noted that "To date, very few technological options exist that provide all of [the needed] functionalities to distributed networks. Commonly used tools such as email, instant messaging, and teleconferencing do not provide a framework for interaction that fully satisfies the demands of geographically distributed projects."

Sample of a virtual meeting room in Second Life; image courtesy of https://marketplace.secondlife.com

In short, Second Life was more or less the best this research team could find in 2010. Since then, there’s been a massive influx of virtual project management/collaboration platforms introduced to the market.  Consider the list below of some of the “big names” in collaboration software along with their launch dates:

This list represents nine powerhouse collaboration platforms, all of which rolled out between 2010 and 2020, and many of which depend heavily on the power and popularity of cloud storage or cloud computing (which was also expanding significantly during this time frame).  And please note: this list is hardly exhaustive.  There are many more out there (and counting!), and even the ones on this list are constantly being updated and expanded.  Each platform or collection of tools on this list boasts its own strengths and weaknesses (the exploration of which is not the point of this post), but there can be no doubt that, no matter the platform, it is easier to choose to communicate, collaborate, and innovate with people all over the world in 2021 than it was even ten years ago.  Of course, not only have the tools themselves gotten better, but the pandemic has accelerated digital tool adoption for purposes of collaboration at an extraordinary rate, popularizing already existing tools (e.g., Zoom teleconferencing) in unprecedented ways, such that millions, if not billions, of students and professionals in myriad settings worldwide are collaborating and problem solving virtually across distances in ways they were not one year ago.

The COVID-19 pandemic has made starkly clear that we need ‘global solutions to global challenges’ (Buitendijk et al., 2020) and that we need not relegate the phenomenon of increased global collaboration in higher education to a particular moment in time.  Instead, we might view this as an opportunity to challenge the model of competition between higher education institutions and place lasting value on diversified bodies of knowledge production, dissemination, and consumption (Buitendijk et al., 2020).  We might also recognize that a philosophy of collaboration makes it possible for students and lecturers in all types of higher education settings to have more equal roles in creating content, sharing resources, and asking/answering important questions (Cronin et al., 2016).  As centers of research all over the world, universities have a crucial role to play in helping humans to better care for one another on a global scale, teaching us to become "more empathic, less competitive, and more networked in our research and educational activities" (Buitendijk et al., 2020).  Let us not lose the momentum of this moment to embrace a new norm in higher education, maintaining a sincere commitment to, and value of, community-minded research and collaboration across borders.

For further discussion on this topic, consider viewing this 60-minute webinar, “The impact of COVID-19 on University Research and International Collaborations” offered through the UC Berkeley Center for Studies in Higher Education. The webinar was recorded in August of 2020.

References:

Buitendijk, B., Ward, H., Shimshon, G., Sam, A., & Sharma, D. (2020). COVID-19: An opportunity to rethink global cooperation in higher education and research. BMJ Global Health. http://dx.doi.org/10.1136/bmjgh-2020-002790

Cronin, C., Cochrane, T., & Gordon, A. (2016). Nurturing global collaboration and networked learning in higher education. Research in Learning Technology, 24.

Iorio, J., Peschiera, G., Taylor, L., & Korpela, L. (2010). Factors impacting usage patterns of collaborative tools designed to support global virtual design project networks. Journal of Information Technology in Construction, 16, 209-230. https://itcon.org/papers/2011_14.content.08738.pdf

Lee, J., & Haupt, J. (2020). Scientific globalism during a global crisis: research collaboration and open access publications on COVID-19. Higher Education. https://doi.org/10.1007/s10734-020-00589-0


National Science Foundation (2019). Publications Output: U.S. Trends and International Comparisons. National Science Board: Science and Engineering Indicators. https://ncses.nsf.gov/pubs/nsb20206/executive-summary

Exemplars of Computational Thinking in Higher Education Classrooms

Though the concepts and theory behind computational thinking (CT) have been around for decades in the realms of computer science and engineering, it is widely acknowledged that Jeannette Wing’s 2006 publication on computational thinking laid the groundwork for CT’s popularity and integration in 21st century education theory.  Wing (2006) suggested that CT might be considered essential to all human endeavors as it is a distillation of a way that we naturally approach solving problems, managing our daily lives, and communicating and interacting with people.  It need not be relegated only to the STEM fields and computer science majors, because CT is not about getting humans to think like computers.  Rather, CT harnesses the natural outpouring of human cleverness, creativity, and problem solving that laid the foundations for the field of computer science in the first place (Wing, 2006). CT is about “…solving problems, designing systems, and understanding human behavior” by drawing on, and leveraging, the concepts fundamental to computer science (Wing, 2006, p. 33).

Though academics continue to debate an authoritative definition for CT, certain common themes are generally accepted characterizations of CT across the board.  These characteristics include:

  • Abstraction — thinking through abstract concepts and ill-defined problems, at times breaking them into smaller, digestible pieces, in order to move towards a more concrete, real-world solution (Wing, 2006).
  • Pattern Recognition —  recognizing useful patterns in data, filtering out the characteristics of patterns that aren’t needed, focusing on those that are (Wing, 2006).
  • Algorithmic Thinking — curating a list of steps that can be followed to solve a problem (Lyon & Magana, 2020).
  • Creative Problem Solving — developing a unique, context-based solution that is considered original, valuable, and useful (Romero et al., 2017).
  • Evaluating Solutions — considering the efficacy of a proposed solution to a problem, perhaps making considerations for factors like efficiency and resource consumption (Lyon & Magana, 2020).
Image sourced from https://koneilleci201.wordpress.ncsu.edu/2020/01/28/computational-thinking/
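To ground these facets in something concrete, here is a minimal sketch in Python. It is my own illustration, not drawn from the cited sources: the fuzzy question "what is this passage mostly about?" is decomposed into small, concrete steps (abstraction), counting surfaces useful regularities in the data (pattern recognition), the ordered steps form an algorithm, and the ranked result can be evaluated against the original text.

```python
from collections import Counter

def most_common_words(text, n=3):
    """Decompose a fuzzy task ("what is this text about?") into concrete steps."""
    words = text.lower().split()                  # step 1: normalize case, split into tokens
    words = [w.strip(".,;:!?") for w in words]    # step 2: strip surrounding punctuation
    counts = Counter(words)                       # step 3: pattern recognition via counting
    return [w for w, _ in counts.most_common(n)]  # step 4: rank and evaluate the result

print(most_common_words("To be, or not to be: that is the question."))
```

None of the five facets requires a computer, of course; the code simply makes the decomposition visible as an explicit sequence of steps that could just as easily be carried out on paper.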

These facets of CT and the related skills are all integral parts of a 21st century education at all levels, including K-12 and postsecondary.  Indeed, “…computational sciences have been deemed essential to maintaining national competitiveness in the workplace and national security” (Lyon & Magana, 2020, p. 1174).  For these reasons, the fundamentals of CT have been championed in education theory over the last decade, and nationally recognized standards like the Common Core State Standards and ISTE Standards for Students have pointedly emphasized the importance of “21st century skills” in K-12 education while simultaneously offering some clear guidance for what CT can look like in action.

But what about higher education?  The implementation of CT in higher education classrooms is noticeably harder to identify, especially outside of computer science and engineering classrooms.  In my opinion, this is likely due to a number of factors including, but not limited to:

  1. Lack of collegial collaboration:  higher education disciplines are notoriously siloed. Meaningful integration of CT concepts outside of computer science and engineering programs demands intentional professional development for faculty, as well as interdisciplinary cooperation, both of which can be less accessible in higher education.
  2. Lack of resources: there is relatively little literature available which provides ideas for practical application of CT outside of computer science programs (i.e. coding and computer programming) at the postsecondary level. Additionally, published standards often lean more heavily towards K-12 education.
  3. Questions of applicability: the humanities often resist algorithmic ways of knowing because there is so much value placed on interpretation, subjectivity, and open debate about meaning (Czerkawski & Lyman, 2015).
  4. Just getting started: there is growing interest in translating CT pedagogy into a wide variety of disciplines in higher education (and K-12 for that matter), but the research and discussions are just getting started.  There is much yet to be explored.

Knowing that many STEM instructors in higher education automatically incorporate CT in their approach to teaching and learning because of the nature of their field, and knowing that courses devoted to coding and programming are already integral to computer science and engineering majors, I seek to offer a few alternative examples of CT as it has been used to enhance teaching and learning in other kinds of higher education environments:

  • In a professional writing course taught at the undergraduate level, CT was used to systemize the writing process.  It called for a “deconstructive approach, breaking down the task of structured authoring into multiple layers of abstraction, and teaching each layer independently.” (Lyon & Magana, 2020, p. 1182)
  • In the fine arts, CT can be used as a tool to enhance creativity.  In one example, CT was used to create an organized system for tracing the origins of musical composers, which in turn inspired new creative endeavors based on the organized data. “…Algorithmic composition in music is, effectively, a human-computer collaboration–the computer serving as a tool that extends the composer’s ability to explore new musical ideas” (Edwards, 2011, p. 67)
  • The Stanford Literary Lab famously applied CT via Graph Theory to perform a “network analysis of character relationships and interactions” in a series of Shakespeare’s plays (Czerkawski & Lyman, 2015).
  • In the life sciences, CT has been used to inform systems theory and how to teach and understand biological processes, such as genetics, in an organized, logical fashion (Czerkawski & Lyman, 2015).
  • Utilizing a process dubbed “creative programming,” instructors may engage learners in the process of designing and developing an original work through coding. In this collaborative approach, learners are encouraged to co-construct knowledge in an interdisciplinary way. Examples might be to have students in a history course (co-)create a rendering of a city at a given historical period, or to present a traditional story in a visual programming tool like Scratch. In this kind of activity, learners must use skills and knowledge in mathematics, technology, language arts, and social sciences. (Romero et al., 2017, para. 3)

Modeling and simulation activities are excellent examples of CT at work, and these types of learning activities can certainly extend themselves to many types of fields and disciplines.  Consider a learning activity where a group of undergraduate philosophy majors create a simulated narrative presentation wherein a human “character” makes a series of daily choices based on their moral philosophy or framework–almost like a “Choose Your Own Adventure” novel meets systems theory within one, or multiple, philosophical frameworks.  The simulation itself could be a computer-based product (or not), but regardless, the learning activity would draw upon many tenets of CT while also demonstrating in-depth knowledge of the discipline-specific subject matter.
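As a thought experiment, the branching "moral framework" simulation imagined above could begin as something as simple as the toy sketch below. Everything here (the framework names, dilemmas, and choices) is invented purely for illustration; the point is that encoding a philosophical framework as a set of rules forces students to make its logic explicit, which is CT at work.

```python
# A toy, purely illustrative sketch of the "moral framework" simulation
# imagined above. Frameworks, dilemmas, and choices are all hypothetical.

FRAMEWORKS = {
    # Each framework maps a daily dilemma to the choice that framework favors.
    "utilitarian": {
        "found_wallet": "donate_the_cash",
        "friend_in_need": "maximize_overall_good",
    },
    "deontological": {
        "found_wallet": "return_the_wallet",
        "friend_in_need": "keep_your_promise",
    },
}

def run_day(framework, dilemmas):
    """Walk a 'character' through a day's dilemmas under one framework,
    returning (dilemma, choice) pairs, the simulation's branching narrative."""
    rules = FRAMEWORKS[framework]
    return [(d, rules.get(d, "no_guidance")) for d in dilemmas]

story = run_day("deontological", ["found_wallet", "friend_in_need"])
```

Deciding what belongs in each framework's rule table, and what to do when a framework offers "no guidance," is precisely the kind of abstraction and evaluation work that would demonstrate discipline-specific depth.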

All fields and disciplines require problem solving in some form.  Thus, it is reasonable to assume that CT may be useful in expanding the human ability to effectively problem solve in all fields.  In one study comparing the use of CT by an undergraduate computer science student and an art student, the researchers found that both students “…used various CT skills when solving all [italics added] problems, and the application of CT skills was influenced by their background, experiences, and goals” (Febrian et al., 2018, para. 1).  Regardless of training, background, or chosen major, CT enables postsecondary students to become more efficient problem solvers in all areas of life, teaching them to recognize computable problems and approach the problem-solving process as skillfully as possible (Czerkawski & Lyman, 2015).

References

Czerkawski, B.C. & Lyman, E.W. (2015). Exploring issues about computational thinking in higher education. TechTrends 59(2), 57–65. https://doi.org/10.1007/s11528-015-0840-3 

Edwards, M. (2011). Algorithmic composition: Computational thinking in music. Communications of the ACM, 54(7), 58-67. https://doi.org/10.1145/1965724.1965742

Febrian, A., Lawanto, O., Peterson-Rucker, K., Melvin, A., & Guymon, S. E. (2018). Does everyone use computational thinking?: A case study of art and computer science majors. Proceedings of the ASEE Annual Conference & Exposition, 1–16.

Lyon, J. & Magana, A. (2020). Computational thinking in higher education: A review of the literature. Computer Applications in Engineering Education 28(5), 1174-1189. https://doi.org/10.1002/cae.22295

Romero, M., Lepage, A., & Lille, B. (2017). Computational thinking development through creative programming in higher education. International Journal of Educational Technology in Higher Education 14(42). https://educationaltechnologyjournal.springeropen.com/articles/10.1186/s41239-017-0080-z

Wing, J. (2006). Viewpoint: Computational thinking. Communications of the ACM, 49(3), 33-35. https://doi.org/10.1145/1118178.1118215

The Changing Nature of Research in Higher Education

Research in higher education looks very different today than it did even ten years ago.  Academics who, not so very long ago, were well acquainted with physical library study spaces and large collections of peer-reviewed academic journals now find themselves in a digitized world of research with unprecedented access to information and virtual repositories of human meaning-making activity.  The nature and culture of research in higher education is shifting, including what is considered “worthy” content to explore when conducting research in all kinds of disciplines.  One need look no further than the APA reference guide and its ever-expanding list of possible resources (e.g., YouTube videos and TED talks, podcasts, blog posts, etc.) to note that the “rules” of research are expanding, and must expand, alongside our access to information.

jesadaphorn/Shutterstock.com

As a current doctoral student (and someone who received my initial graduate degree a decade ago), I am curious about the ways the research sector of higher education has changed over time. How are undergraduate students being taught to conduct research?  What kind of shifts have been made due to new tools and technology platforms that assist in the research process?  What cultural shifts are happening in graduate and doctoral programs, and are these cultural shifts impacting research strategy? 

Jongbloed et al. (2008) posit that higher education has an expanding set of stakeholders and thus a continually shifting societal expectation of what a university’s public obligation is.  Early universities provided education exclusively for clergy and societal elites, but over the centuries, higher education has been democratized such that there are many invested parties and participants with competing paradigms and priorities. Indeed, one of the major, ongoing, accelerated shifts in higher education is the diversification of students, staff, and faculty and the role that universities can/should play as advocates of–and vehicles for–social justice (Brennan & Teichler, 2008).  We also now live in a “knowledge society” where knowledge is considered the solution to everything and the key to personal and societal advancement (Jongbloed et al., 2008).  Thus, higher education institutions (HEIs) are driven to make teaching and research more publicly accountable, often restructuring programs and creating new ones to meet modern societal demands and forfeiting, or “reorienting,” long standing academic norms and values along the way (Jongbloed et al., 2008).

Even the doctorate, a terminal, research-based degree that is typically the highest academic degree a university can award, isn’t immune to change.  There is an increasing demand for doctoral programs to become more relevant, to produce academics with transferable skills in their field in addition to research skills, and even to be more sensitive to issues of employability that extend beyond creating new academics who scarcely step outside the “ivory tower” of a university campus (Park, 2005).  This requires attention to the course structure and modality of a doctoral program, the quality of the mentorship provided, the diversity of students within the program, and an expansion of what counts as sufficient, valuable evidence of research contributions in a given field.

At the undergraduate level, much focus is given to the development of research skills as a form of information or digital literacy.  K-12 schools and districts across the United States differ greatly in their approach to teaching digital literacy skills.  Thus, undergraduate students at HEIs come into lower division classes with a wide range of backgrounds and abilities informing their approach to research.  In a case study conducted at Texas Christian University (TCU) by Huddleston et al. (2019), faculty were surveyed to determine what research skills they felt were most needed and valuable for undergraduate students to have, and which skills undergraduate students tended to struggle with most.  A list of nine core skills for research success was produced based on faculty responses:

  1. Topic selection
  2. Search strategy
  3. Finding resources
  4. Differentiating source types
  5. Evaluating sources
  6. Synthesizing information
  7. Summarizing information
  8. Citing sources
  9. Reading and understanding citations

Perhaps unsurprisingly, faculty overwhelmingly felt that the skill they most wanted students to master by the time they graduated was the ability to critically evaluate information and sources.  This was, however, also found to be the weakest skill that undergraduate TCU students possessed, and that they were least likely to be able to do at a satisfactory level upon graduation (Huddleston et al., 2019).  It is no coincidence that the ability to think critically about an information source is needed now more than ever due to the overwhelming amount of information and sources available on the world wide web.  While access to valuable, credible sources of information expands, students need to be able to recognize “worthy” material in dynamic ways which allow them to differentiate their source types appropriately.  Certainly not all valuable research material is limited to the contents of academic journals, but neither is every blog post worthy of scholarly consideration. In this case study, Huddleston et al. (2019) note that the university library/librarians are important resources and guides when it comes to information literacy instruction, and a number of suggestions were made to help increase the visibility of librarians at the department level, leveraging their knowledge and training alongside faculty in a collaborative approach to teaching undergraduates needed research skills.

There is no denying that a certain level of digital and informational literacy is essential in all areas of higher education given that “research outputs across the academic disciplines are almost exclusively published electronically,” and therefore “organizing and managing these digital resources for purposes of review…are now essential skills for graduate study and life in academia.” (Lubke et al., 2017, p. 285) Of course, in the year 2021, there are also a myriad of digital tools available that not only assist in the research process, but make it easier to practice information literacy and grow a researcher’s individual technical savvy. Assuming the literature review (i.e. research paper) is the most frequent research-based activity conducted in higher education, especially at the graduate level, Lubke et al. (2017) propose a simple, 3-step framework which can become the essential workflow for a paperless research project.

Lubke et al. (2017)

As the image suggests, stage one begins with selecting a digital tool to store and analyze sources.  Some suggested platforms include Zotero, EndNote, F1000 Workspace, RefWorks, and Mendeley.   Each tool has its own strengths and weaknesses, but generally speaking, each is an example of a digital tool that assists researchers in methodically storing and organizing possible source material for consideration, both in the current research process and for possible future use (e.g., a dissertation).  Once sources have been selected and stored, researchers may then move to stage two, where they read, annotate, and analyze their sources.  This is where weak sources may be removed from consideration and where important pieces of information are mined and commented on in preparation for creating an academic argument (Lubke et al., 2017). In the annotation phase, digital tools like GoodReader can be used to take notes and highlight a text; then, annotated versions of sources may be saved separately from the original.  Finally, in stage three, researchers may choose to employ qualitative data analysis software (QDAS) like QSR NVivo to synthesize themes and pull together information from across sources, ultimately drawing conclusions for publication.
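The three stages map naturally onto a simple data model, which is part of why so many tools can support this workflow. The sketch below is my own illustration of the store/annotate/synthesize progression; the field names and sample entries are invented, not features of any of the tools named above.

```python
# Illustrative sketch of the three-stage paperless workflow
# (store -> annotate -> synthesize). All names and data are invented.

def store(source_id, title):
    """Stage 1: capture a source in the reference-management layer."""
    return {"id": source_id, "title": title, "stage": "stored", "notes": []}

def annotate(source, note):
    """Stage 2: read, highlight, and comment; weak sources can be dropped here."""
    source["notes"].append(note)
    source["stage"] = "annotated"
    return source

def synthesize(sources, theme):
    """Stage 3: pull notes together across sources under a common theme."""
    return [n for s in sources for n in s["notes"] if theme in n.lower()]

library = [store("doe2020", "A hypothetical sample article")]
annotate(library[0], "Equity concerns in online assessment")
equity_notes = synthesize(library, "equity")
```

The value of dedicated tools over this bare-bones model is, of course, everything they layer on top: citation formatting, full-text search, PDF annotation, and synchronization across devices.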

The nature of research in higher education–and really, higher education itself–has changed drastically over the course of the last couple of decades.  Higher education is expanding in its scope and purpose, and there is increasing demand for academic research to have immediate, practical value. When conducting research, the most frequent problem faced by students and academics at all levels is what to do with the vast amounts of information we now have access to: how to source it, organize it, and analyze it critically.  Direct instruction in digital and information literacy continues to be a need in postsecondary education (both undergraduate and graduate), but there are a number of tools available that can be powerful aids in the research process, expanding our knowledge base and extending our capacity to think critically about sources, thus also expanding our potential for innovation.  There is no doubt that the nature of research will continue to evolve alongside the digital world…are we ready to consider the possibilities?

References

Brennan, J. & Teichler, U. (2008).  The future of higher education and of higher education research. Higher Education, 56(3), 259-264.

Huddleston, B., Bond, J., Chenoweth, L., & Hull, T. (2019). Faculty perspectives on undergraduate research skills: Nine core skills for research success. Reference & User Services Quarterly, 59(2), 118-130.

Jongbloed, B., Enders, J., & Salerno, C. (2008). Higher education and its communities: Interconnections, interdependencies and a research agenda. Higher Education, 56(3), 303-324.

Lubke, J., Britt, V., Paulus, T., & Atkins, D. (2017).  Hacking the literature review: Opportunities and innovations to improve the research process. Reference & User Services Quarterly, 56(4), 285-295.

Park, C. (2005). New variant PhD: The changing nature of the doctorate in the UK. Journal of Higher Education Policy and Management, 27(2), 189–207. https://www.tandfonline.com/doi/abs/10.1080/1360080050012006

Assessment in higher education during COVID-19 and beyond: Will it ever be the same?

Ridofranz/Getty Images

Perhaps the word “unprecedented” has been overused in recent months, but it consistently seems to be the most fitting word to express the seismic shifts in all areas of life that have occurred during the COVID-19 pandemic. As K-12 and higher education institutions worldwide have grappled with rapid pivots to online teaching and learning (and have continued in these blended and fully remote modalities for much longer than anticipated), academics are now taking a moment to reflect on the past year and its lasting implications for the world of education.  After all, as business theory would posit, disruption leads to innovation.

As an educator interested in online teaching/learning in post-secondary education, I would specifically like to explore how the last year of remote learning has impacted assessment strategies in higher education. How have widespread shifts to online teaching/learning impacted college students’ abilities to demonstrate competency in varied and student-driven ways?  Higher education is notoriously “old school,” and post-secondary classes are most frequently lecture-based, led by instructors who are slower to adapt to more progressive, student-driven pedagogies.  And yet, as universities across the U.S. have made college and graduate school entrance exams like the SAT, ACT, and GRE optional for applicants in 2020 and 2021, a world beyond high stakes standardized testing can perhaps be imagined now more than ever.

Higher education instructors worldwide are already engaging with this issue and offering recommendations for ongoing change.  Perhaps surprisingly, some of the first publications I encountered came from educators in the graduate medical school community in Australia and Pakistan. This was particularly striking because the medical sciences require lab work and clinical assessments, which are especially challenging to deliver remotely, and because the field has long relied on high stakes testing at many stages of a medical student’s training.

According to Torda (2020), many medical school instructors in Australia have moved to lower the stakes of traditional written or multiple choice exams delivered online during the pandemic.  At the same time, a shift has been made to put more weight on multi-sourced feedback and student portfolios.  Where clinicals are concerned, simulation platforms such as the OSPIA (Online Simulated Patient Interaction and Assessment) system have been leveraged to bridge the gap until in-person clinicals may safely resume.  Another significant shift has been an emphasis on measuring a student’s ability to exhibit key professional skills, known in the medical community as “Entrustable Professional Activities,” over and above written examinations (Torda, 2020).  In other words, students are being assessed on their ability to apply their learning in professionally relevant contexts.  Some of these skills include, but are not limited to, recommending and interpreting common diagnostic and screening tests, providing a (virtual) oral presentation of a clinical encounter, forming clinical questions and retrieving evidence to advance patient care, and collaborating with professional colleagues (Torda, 2020).  It was noted that, taken as a whole, these measures go a long way toward easing test anxiety and motivations to cheat in an otherwise high stakes, demanding field of study (Torda, 2020).

Additional examples of altered assessment strategies in the medical community have been reported in Pakistan.  Similar to Torda (2020), Khan and Jawaid (2020) posit that the pandemic has necessitated lowering the stakes of online-proctored, traditional exams.  The authors advocate for the use of student portfolios and video evidence of professional tasks completed, as well as synchronous open book exams.  The authors note that the aim of synchronous open book exams “…is to assess the ability of students to analyze and solve a problem, [and to] assess critical thinking and creativity. With open book exams taken in real time, the issues of cheating can be minimized” (Khan & Jawaid, 2020, p. 109).

Changes in higher education assessment have also been reported in the United States. In June of 2020, Natasha Jankowski, in partnership with the National Institute for Learning Outcomes Assessment (NILOA), spearheaded a higher education survey meant to capture “a snapshot of assessment-related changes made during Spring 2020 in response to the sudden shift to remote instruction…” (Jankowski, 2020, p. 3). The survey included responses from faculty and staff at 624 different institutions, both public and private, with representation from all 50 states. The survey sought to record the learning changes that higher education instructors were making, the impacts of those changes on assessment culture, and the role of student voice in those decisions (Jankowski, 2020).  The survey results showed that 97% of respondents made learning, instructional, and assessment changes of some kind during Spring 2020. Changes included modifying assignments and assessments, allowing flexibility in assignment deadlines, shifting to a pass/fail grading model, and modifying assessment reporting deadlines.  Though some respondents accepted alternative assignments, this change was made less frequently (Jankowski, 2020).  The survey also showed “…that assessment-related changes were undertaken to address student needs” (p. 3).  However, these changes may have had more to do with faculty/staff perceptions of student needs than with action taken in direct response to student reports: “Information gathered from students was less likely to influence decisions on what to change, and students were less likely to be asked to identify their needs prior to decisions being made” (p. 3). Consequently, it might be hard to characterize many of these changes in assessment as authentically “student-driven.”

Nevertheless, it seems that the pandemic has disrupted “business as usual” in higher education such that many of the changes reported above may in fact have lasting impact, with increasing opportunity for student voice to take a front seat in decision-making.  Dr. Funmi Amobi of Oregon State University’s Center for Teaching and Learning puts forth compelling arguments in favor of “reimagining” assessment in higher education in light of the lessons learned in the pandemic (Amobi, 2020).  Amobi asserts that the radical move to remote instruction has “refocused attention on improving assessment practices to alleviate student stress and anxiety, emphasize learning, and redress inequities in student success” (par. 2).  The author goes further and provides seven practical strategies for reimagining assessment in higher education.  Though these strategies can certainly be used effectively in remote learning environments, they are not meant only to solve problems related to online teaching and learning.  The strategies presented by Amobi (2020) should be taken seriously by all higher education instructors wanting to diversify their approach to assessment and create more student-centered learning experiences:

  1. Use short, weekly quizzes to assess students formatively, and consider making the quizzes cumulative so that they may contribute to a summative assessment score.
  2. Ask for justification on multiple choice tests and grade the response instead of the answer.
  3. Create opportunities for collaborative, group tests.
  4. Have students construct exam questions themselves as a way of reviewing and exercising higher order thinking skills; then, include many of the student questions on the exam.
  5. Allow for notes or a study card and have students submit the prepared materials for credit along with the actual exam.
  6. Utilize practice tests.
  7. Spend time reviewing exams to address misunderstandings and improve future performance; consider giving credit for thoughtfully corrected exams where learning is evident.

In each of the reviewed publications, certain recurring themes were readily apparent: 1) it may be high time for colleges and universities to rethink the value of high-stakes testing; 2) varied assessment strategies allow for a more effective presentation of student learning; 3) assessment is part of the overall learning process and should not be divorced from student voice; and 4) varied assessment strategies reduce test anxiety and the motivation to cheat (the latter being oft-cited as an obstacle in online assessment).

We must avoid the underlying assumption that more technology is needed in order to solve the problems that technology introduces.  In other words, as the pandemic continues to require extended engagement in remote teaching, higher education instructors must not assume that the only way to assess online is to find a way to virtually proctor the same exam that would normally be given in a physical classroom (Kumar, 2020).  Instead, educators at all levels may take this opportunity to make meaningful changes to their use of assessments, both now and into the future, thinking critically and creatively about how best to meet students where they are.

References:

Amobi, F. (2020, November 12). Reimagining assessment in the pandemic era: Comprehensive assessment of student learning. OSU Center for Teaching and Learning. https://blogs.oregonstate.edu/osuteaching/2020/11/12/reimagining-assessment-in-the-pandemic-era-comprehensive-assessment-of-student-learning/

Jankowski, N. A. (2020). Assessment during a crisis: Responding to a global pandemic. National Institute for Learning Outcomes Assessment. https://public.uhcl.edu/education/centers-initiatives/planning-assessment/documents/niloa-covid-assessment-report.pdf

Khan, R. A. & Jawaid, M. (2020). Technology Enhanced Assessment (TEA) in COVID 19 Pandemic. Pakistan Journal of Medical Sciences 36, 108-110. 10.12669/pjms.36.COVID19-S4.2795

Kumar, R. (2020). Assessing higher education in the COVID-19 era. Brock Education Journal, 29(2), 37-4. https://journals.library.brocku.ca/brocked

Torda, A. (2020). How COVID‐19 has pushed us into a medical education revolution. Internal Medicine Journal, 50(9), 1150-1153. https://doi.org/10.1111/imj.14882