
Faculty perspective: Connecting institute learning to statewide assessment work

Christi Camper Moore, Ph.D., associate professor of dance and graduate chair of the Arts Administration program in the Chaddock + Morrow College of Fine Arts, was grateful for the opportunity to attend the Assessment Institute.
In a conversation with Center for Teaching, Learning and Assessment (CTLA) Associate Director for Assessment Wendy Adams, she shared how the experience shaped her approach to assessment in both academic and statewide contexts.
Hosted by the Center for Leading Improvements in Higher Education at Indiana University Indianapolis, the Assessment Institute is the oldest event of its kind in the nation, and OHIO's CTLA has made faculty and staff attendance possible for the last two years.
Motivated by a desire to deepen her understanding of assessment as a critical and evolving element of teaching and learning, Camper Moore used the institute as an opportunity to explore assessment from a systems perspective: gaining practical tools, reinforcing a student-centered mindset, and strengthening her foundational knowledge.
She described the conference as a timely and energizing experience that not only validated the importance of authentic, embedded assessment practices, but also encouraged her to challenge assumptions and refine her approach to using assessment data to guide curricular change.
Soon after attending the institute, Camper Moore was invited by the Center on Education and Training for Employment at The Ohio State University to serve as a subject-matter expert on the Arts and Communication Standards (Dance) during a Standards Alignment Workshop for the Ohio Department of Education and Workforce (ODEW). This work involved realigning existing test items with revised course outlines, writing new questions, and collaboratively reviewing item rigor, all while applying the very strategies and perspectives she had sharpened through the institute.
Interview with Christi Camper Moore
What initially motivated you to attend the Assessment Institute conference?
Assessment is education's constant companion, and we're doing it all the time through formal and informal means, whether we're aware of it or not, and whether it's codified or not. It's the constant in what we do, and because assessment is supposed to inform teaching and learning, I wanted to really embrace the opportunity to expand my own knowledge, get new tools, and then also think about the way AI is changing assessment practices. Assessment is not going away: it's so embedded in what we're doing, and I really wanted to be better at it and understand it more fully. These factors really motivated me to attend.
In what ways did the conference content align with or challenge your prior understanding of assessment practices?
I learned a lot! It was a great reminder of just how complex and multi-layered assessment truly is. Sometimes with teaching or program evaluation or curriculum planning, it's easy to get bogged down and make assessment something that is only topical or superficial. The assessment conference really helped me push back against my implementation of assessment practices that aren't really rooted or embedded in what we're actually doing.
From a course-level perspective, the conference also reminded me that assessment has to take into account students' engagement and needs and the related implications for retention. It really helped me think more critically about alternative assessments and collaboration, and how assessment can inform our pedagogical practices and choices for course content. Ultimately, I think the conference was really helpful in thinking about the layers of assessment and then the connective tissue between course-level assessment, program-level assessment, et cetera: there was emphasis on understanding a kind of connected assessment system and how this impacts students at every level.
After attending the institute, how prepared did you feel to implement assessment strategies within your department?
I think that what I can confidently say is that I have a much clearer and stronger foundational knowledge of assessment. That broader foundation then gives me the confidence to try new strategies, build out different metrics, dive deeper into specific assessment practices, and understand how to evaluate those strategies to guide decision making moving forward. So, I think a stronger foundation, the confidence to apply and try different things, and then the ability to analyze the data we're getting and to better understand how to make changes in the assessment process.
How did the institute help you think differently about using assessment data to improve teaching or curriculum design?
Similar to what I just said, the institute helped (re)focus me on student learning. Specific to the Master of Arts Administration program, for example, I have previously used assessment data to revise and reimagine assignments but also to think about how student learning artifacts and program assessment plans can better align with broader program outcomes and goals. So, I've made course changes, assignment changes, curricular changes, and program changes based on assessment data, and now I'm really thinking more connectedly about how all of those things relate to ensuring that we have student-centered outcomes. And the institute also helped me think differently, or accept differently, that less is more.
I think previously, again, because assessment and education are constant companions, there always feels like a tension between assessing all the things all the time at one end of the assessment spectrum and the feeling that you aren't doing enough concrete assessment of student learning and outcomes at the other end. The institute really helped me think differently: if we're student centered throughout all assessment practices, and we're really clearly aligned in that assessment funnel, then assessment is authentic and can be used to improve course design, improve student engagement, and/or inform our curricular planning decisions.
You were recently invited to participate in some assessment work for the Ohio Department of Education and Workforce. You were specifically recruited by the Center on Education and Training for Employment at The Ohio State University. Can you tell us a little bit about the work you did on this team?
I was asked to be what they call a SME (subject matter expert), specifically for the Arts and Communication Standards (Dance). We reviewed existing test items and realigned them with revised course outlines and competencies. Our work was threefold: 1) we conducted a standards alignment, making edits and changes to bring test items to current standards, 2) we wrote new items based on the course blueprint and aligned competencies, and 3) we worked to review, edit, and judge the level of rigor for each item in relation to the test development process.
Can you describe how you've applied the knowledge or tools from the Institute in this work?
At a basic level, I think the knowledge and tools from the institute really prepared me to utilize and understand the assessment process: How does it keep us all moving in the same direction? How do I take what I learned at the institute and apply that when we're thinking about assessment item writing, scenario writing, the level of difficulty or rigor of an item, and how does this align with stated competencies or learning goals?
The process was really fast, and we reviewed very quickly. And so, for me, it was interesting to work with other SMEs in small groups because we were always thinking critically, asking ourselves questions, referencing back to the standards we were writing to. There were just so many considerations! It was really fun to bounce ideas around: you would think a scenario made sense, but then another SME would say, "But what about this? And what about this?" and you'd agree, "Ah, we have to throw that one out." That kind of learning and brainstorming in the moment with my writing team was beneficial, and what I learned at the institute better prepared me to engage with assessment at this level.
How did your participation in the instituteāand the assessment work within your own programsāenhance your ability to collaborate with colleagues on the ODEW project?
It provided me with a better understanding, a more common language, and the confidence to be able to ask questions when I didn't understand something as we developed the assessment questions. There are many layers to assessment. The institute definitely prepared me to be in a space where I might have been a subject matter expert, but I was also working with assessment professionals and thinking about how all of our knowledge could shape the work collaboratively and proactively.
Looking back on your experience with the ODEW assessment work, was there anything that stood out to youāeither as something you felt well-prepared for or areas where you felt less confident? To what extent did your prior experience and participation in the Assessment Institute shape that readiness?
It is not easy to do assessment well. It takes thoughtful consideration. It takes time. It takes a constant check and balance: this is what you say you're assessing, but is that method, is that tool, really assessing the standard, the outcome? Is it really student centered? Is it too easy? Are there important cultural considerations with the use of language or with the implications of the question? And so, I'm not sure what necessarily stood out, but the process really reminded me that assessment has to be a serious task and there has to be time allocated to do it well. And I think the Institute reinforced that: I went into the ODEW work with that kind of preparation and framework.
What advice would you give someone who is either starting out, brand new to assessment work or struggling with assessment work?
I think the three things that I would offer are: 1) Just start. Not being an assessment professional but having to do assessment, I think it can get overwhelming if you worry about whether you're doing it "right." 2) Relatedly, build your networks. There are so many people, like you and others at the CTLA, who are doing this assessment work and who are really open to looking at what you're doing, where your starting point might be, and helping you build out and scaffold a broader assessment strategy or practice. 3) Document everything. Whether that is simply course evaluations or copies of student assignments, those are really important data points that most faculty are collecting on a regular basis. And yet, they're also part of the assessment conversation, which makes it easier to go back and begin to identify what you're assessing, where you're assessing, and the outcomes of what you have been doing. And then you can start to build better alignment and clarity with your assessment goals or objectives.
Are there any other thoughts you would like to share about your experiences?
I hope I get to go to the institute again. I think there can be a misperception that assessment is somehow static, that the general framework is always there. However, the institute really pushed me to question even the assumptions of the existing framework for assessment and to see how it is evolving and changing and crumbling and being rebuilt with new tools and practices. I think that the Institute is a really exciting opportunity to get a current pulse on the depth and breadth of what's happening and changing in the field, and it allows you to be re-centered on how assessment ultimately impacts teaching and learning.
Faculty who are interested in attending future Assessment Institutes are invited to contact Adams.