by Tricia Seifert
Assessment & You features a number of perspectives on assessment from across Canada and the US. Originally published on ryersonstudentaffairs.com from 2015 to 2017, this series dives into the depths of assessment knowledge and practice, aiming to build a culture of assessment for Student Affairs in Canada.
“Student affairs programs and services have been recast as learning engines, designed to foster learning and development.” This is one of the primary conclusions my co-authors and I advanced in How College Affects Students (volume 3: Mayhew, Rockenbach, Bowman, Seifert, & Wolniak; 2016).
Not an auxiliary set of support services. Not a complement to the curriculum. But an engine igniting learning and development! This is a serious mission, one that requires planning, execution, assessment, and evaluation.
Student affairs and services staff members are whizzes when it comes to planning. They have organizational charts, role descriptions, purchase orders, checklists, flow sheets, and minute-by-minute timelines. No one can plan programs like student affairs and services staff.
With a foolproof plan, the execution goes off without a hitch. Even when something unplanned happens, student affairs and services staff scramble and demonstrate nothing short of Olympics-worthy flexibility. Nothing thwarts the plan’s execution.
It is in the areas of assessment and evaluation that student affairs staff could use additional training and professional development.
Moving from support and service to an engine of learning is a substantial paradigm shift. The operational task changes from providing excellent student support services to achieving articulated learning outcomes. This change in focus demands a different skill set and provides a ready-made opportunity for collaboration with faculty.
A Little Help from my Friends
There is enormous value in people stretching to learn new skills, but there is also enormous value in asking for help. Help-seeking is a behaviour that student affairs and services staff members encourage students to develop. There is nothing wrong with staff members developing this same behaviour.
In the last decade, many faculty members have transitioned from presenting course objectives on their syllabus (a teacher-centred notion of education) to articulating learning outcomes (a student-centred notion). Student affairs and services are making a similar transition. Previously, student affairs and services staff focused primarily, perhaps even solely, on student service and satisfaction. Yet, the future of student affairs and services work lies in facilitating student learning.
Transitions present challenges, and it may be worthwhile to ask faculty friends how they made the change in their classes. What resources were available? Did they participate in faculty development workshops? What books or websites did they consult?
While faculty may reference Peggy Maki’s excellent Assessing for Learning (2010) or Linda Suskie’s Assessing Student Learning (2009), student affairs and services staff members may turn to Student Affairs Assessment: Theory to Practice (Henning & Roberts, 2016) or Demonstrating Student Success: A Practical Guide to Outcomes-Based Assessment in Student Affairs (Bresciani, Gardner, & Hickmott, 2009). They may subscribe to this very blog, which publishes ideas and suggestions emphasizing how student affairs and services staff can assess and communicate the division’s contributions to the institution’s mission. They may seek out expertise and resources available through CACUSS’s research, assessment, and evaluation Community of Practice and analogous communities through ACPA or NASPA.
Like our faculty colleagues who articulate what students will learn as a result of the course, student affairs staff need to identify what students will learn as a result of participating in a program or engaging with a particular service. Stating the intended learning outcomes from student affairs programs and services implicitly poses the questions: How do you know? Why do you think so? Simply put, these questions are at the heart of assessment.
I can sense readers’ hearts beginning to palpitate wildly. Answering the questions of how we know and why we think so requires collecting, analyzing, and interpreting data. Dealing with data is not necessarily a well-honed skill of student affairs and services staff, but it is part and parcel of faculty work. So why not ask for help?
Faculty members across the disciplinary spectrum may be keen collaborators, particularly when the assessment project recognizes the value of the data and methods used by the discipline. Imagine approaching a faculty member with a proposal to connect your program assessment project with a course that teaches disciplinary ways of thinking about, collecting, and interpreting data. The faculty member’s students then become consultants and apply the concepts they’ve learned to the assessment project.
At the beginning of any collaborative assessment project, it is important to articulate the research question and the desired learning outcome to be examined. The collaborative team of students, faculty, and student affairs staff then discuss how the learning outcome may be assessed or measured differently depending on the discipline.
For example, if the learning outcome is intercultural understanding, the type of data collected may vary from responses on a standardized survey instrument to narrative data collected through an interview or focus group, to photos or a short film elicited from a short prompt. Students and their faculty member bring a disciplinary perspective to analyzing the data, presenting results, and ultimately to answering the question: How does the student affairs program contribute to students’ development of intercultural understanding?
As a faculty member, I have collaborated frequently with my institution’s student affairs division on these types of assessment projects. I approach a senior leader and ask if they have data that they wish they had more time to examine in order to improve their practice. Senior leaders then meet with the students, identify the research questions and outcomes of interest, provide data, and attend the symposium at which the students present and interpret the results. The presentations have always concluded with rich conversations about implications for practice, a necessary component of closing the assessment loop.
The power of this approach has been in providing a space for students to apply disciplinary ways of thinking to data of interest. Students are excited to collect and analyze data when it contributes to the goal of improving student learning as well as institutional policy and practice. They work harder and are more persistent in troubleshooting challenges because they want to have the highest quality presentation to share with senior leaders. They are no longer motivated to master course concepts simply to earn an “A” but because they believe their best effort may have transformational benefits for students and their institution.
These benefits are not limited to graduate students. There is no reason why this type of collaborative assessment project could not have a similar effect with undergraduate students in psychology, sociology, education, film, photography, English and creative writing, economics, as well as other disciplines. From survey correlations to film interpretation and critique, collaborations across the disciplines may lend breadth and depth to the ways of knowing and also to the story of how student affairs is a powerful learning engine on campus.
These assessment collaborations give student affairs dual-engine power to facilitate student learning. By collaborating with faculty on these assessment projects, students actively apply course concepts in collecting, analyzing, and interpreting real data. One of the most salient take-away messages from How College Affects Students (volume 3) is that students who think and act in ways consistent with their discipline exhibit greater subject matter competence.
Student affairs and services is not just the little engine that could . . . it’s the little engine that will! Collaborating with faculty and students to assess student learning is the rocket fuel we need to blast out of orbit, transforming postsecondary education in a way that is truly out of this world.
Tricia A. Seifert
is associate professor and head of the Department of Education at Montana State University and maintains a faculty appointment at the Ontario Institute for Studies in Education at the University of Toronto. She is the Principal Investigator for the Supporting Student Success research project, https://supportingstudentsuccess.wordpress.com/. She tweets @TriciaSeifert and @CdnStdntSuccess.