Assessment, Evaluation, & Research

Focuses on the ability to design, conduct, critique, and use various AER methodologies and the results obtained from them; to utilize AER processes and their results to inform practice; and to shape the political and ethical climate surrounding AER processes and uses in higher education.

Development

Professional growth in this competency area is broadly marked by shifts from understanding to application, and then from smaller-scale applications focused on singular programs or studies to larger-scale applications that cut across departments or divisions. Many advanced-level outcomes involve the leadership of AER efforts.

Reflections

At multiple points during my graduate program, and annually since, I have reflected on my progression within each of the competency areas. They are listed below, with the most recent reflection first and the earliest reflection last.

Winter 2024 – Reflection

As the Director of Student Engagement, I have been able to place greater emphasis on data collection and reporting. This emphasis has resulted in a department review of learning-outcome-based programming, aligned with the Council for the Advancement of Standards (CAS) standards for Campus Activities Programs, and the initial development of a corresponding student-program learning assessment (set to be administered during the spring 2025 semester). During the 2023-2024 academic year, I was also able to present utilization data at the University Scholarship & Best Practices (USBP) Exposition. I continue to self-rate my AER competency as 'intermediate.'

Goals

  1. Administer the student-program learning assessment during the spring 2025 semester.
  2. Present learning outcome data at the spring 2026 USBP Expo or similar conference opportunity.

Summer 2023 – Reflection

The formality of assessment, evaluation, and research is much lower at a non-research-driven institution; however, I have been able to introduce some new AER skills to my role at Johnson & Wales. The utilization of Presence.io software for student engagement further enhanced data collection, which I leveraged into an end-of-year evaluative report for the department (available under artifacts on 'About My Experiences'). My supervisor, the department Director, has suggested that this end-of-year report will be instructive for years to come. Critically, however, the report overemphasizes utilization data and falls short of assessing student learning outcomes or more meaningful program engagement. My rating remains 'intermediate.'

Summer 2022 – Reflection

The choice to begin job searching (and eventually moving) ended my pursuit of a second master's degree in the History and Philosophy of Education (HAPE) at Florida State University (and its thesis, as mentioned in the "Summer 2021 – Reflection"). However, during my final year of programming in the Center for Leadership & Social Change, opportunities emerged to leverage the newly consolidated datasets to revitalize student engagement. Funnels and involvement pathways became visible for students who had not experienced in-person programming for nearly two years during the COVID-19 pandemic. I was also able to support multiple graduate student research projects, through the Higher Education master's program and graduate assistants/interns. My rating remains 'intermediate.'

Summer 2021 – Reflection

Following coursework (Fall 2020) based on the Econometric Model within Quantitative Research, I feel more comfortable describing my AER competence as 'intermediate.' I have also developed a clearer idea of a potential master's thesis, though this idea would require additional research competence development related to Historical Methods. As new datasets are drawn together within the Center for Leadership & Social Change, and as data work becomes more highly valued among my colleagues, I will continue to advocate for AER competence development.

Spring 2020 – Reflection

I do not believe that I have progressed to a level where I would feel comfortable suggesting solid intermediate competency, but I do know that I have mastered the foundational level. In the last year I have become increasingly familiar with the higher functions of Qualtrics, such as contact lists, piped text, and triggers. Due to changes in course offerings and limitations in my ability to enroll in certain offerings, I was unable to take EDH5668, but I successfully completed my single-method direct observational study, which resulted in a 37-page research report outlining patterns in inter-modal conflict on a heavily trafficked campus pathway. Moving forward, I would like to explore additional research methods as I prepare for my next potential graduate degree.

Summer 2019 – Reflection

I feel more confident in my ability to meet the foundational outcomes of the Assessment, Evaluation, and Research competency after my first year of graduate study. I was able to attend Qualtrics 101 but was unable to attend Qualtrics 201, although I did receive the notes from my supervisor, who had previously led that training workshop. Through using Qualtrics and Excel, I have been able to apply program assessment in much of my work at the Center for Leadership and Social Change. This work resulted in data-informed program revisions. Through Outcomes I (EDH5078) I was able to learn about different methodological approaches, and I have designed a single-method direct observational study examining inter-modal interactions on a heavily trafficked campus pathway. While I will take Outcomes II (EDH5079) in the Fall semester, I am no longer certain of my plan to enroll in EDH5668 in the Spring.

Fall 2018 – Reflection

Foundational – In my previous role, as an entry-level student affairs professional, I had experience conducting program assessment and evaluation. This previous experience also included a departmental review by a board of directors. Unfortunately, this review was not conducted according to professional standards, such as the Council for the Advancement of Standards (CAS) standards, and as such the applicability of my previous experience in meeting some of the foundational outcomes is questionable. Related to the research outcomes, I did conduct a brief undergraduate research study, although I was not expected to establish a full understanding of different methodological options or a refined sense of review board policies. I have also been limited in my use of technology in assessment, evaluation, and research, having never used research-intended software (such as SPSS or Qualtrics).