
Learning Assessment

Office of the Provost

Our Commitment

Welcome to the Learning Assessment website. We are committed to student learning assessment to ensure that our colleges and schools, programs/majors, and courses are successfully fulfilling their educational missions, goals, and objectives. This website provides faculty and staff with assessment-related tools and resources to guide the development and implementation of effective learning assessment plans.

The purpose of the AAC is to promote and support the University’s learning assessment efforts by:

  • Fostering a positive assessment culture throughout the University
  • Supporting and facilitating University-wide assessment activities such as (1) disseminating assessment information across colleges/schools, including identifying best models and practices, and (2) collecting, documenting, and sharing assessment information for program/major improvement

The Committee meets at least once each semester. Its members are as follows:

  • Dr. Rachel Ebner (Chair), Director of Assessment; Clinical Assistant Professor of Psychology
  • Dr. Selma Botman, Provost and Senior Vice President for Academic Affairs
  • Dr. Timothy Stevens, Deputy Provost; MSCHE ALO, NYSED CEO-Designee & NC-SARA Primary Contact
  • Dr. Sean McKitric, Director of Assessment and Quality Improvement, Katz School of Science and Health
  • Dr. Yuxiang Liu, Director of Institutional Research 
  • Dr. Rebecca Cypess, The Mordecai D. Katz and Dr. Monique C. Katz Dean, Undergraduate Faculty of Arts and Sciences
  • Dr. Avi Giloni, Associate Dean, Sy Syms School of Business
  • Dr. William Stenhouse, Associate Dean for Academic Affairs, Yeshiva College
  • Dr. Daniel Rynhold, Dean of the Bernard Revel Graduate School of Jewish Studies 
  • Dr. Sandy Moore, Director of University Libraries
  • Dr. John Vivolo, Executive Director of Academic Operations and Teaching and Learning, Katz School of Science and Health 


Program Mission Statements, Goals, & Objectives

FAQ

The Assessment Cycle

Assessment is "the systematic and ongoing process of gathering, analyzing, and using information from multiple sources to draw inferences about the characteristics of students, programs, or an institution for the purpose of making informed decisions to improve the learning process" (Linn & Miller, 2005). The principle that assessment is a systematic and continuous process, not an end product, is central to this definition.

Assessment is...

  • a cyclical process, not an end goal
  • planned and systematic, not random and variable
  • ongoing and cumulative, not one point in time
  • multifaceted, not singular
  • informative, not a judgment
  • objective, not subjective
  • transparent, not unclear or hidden
  • pragmatic, not useless
  • faculty designed and implemented, not imposed from the top down

Assessment takes place at three levels:

  • Classroom assessment: assessing an individual student’s learning experience in a course
  • Program assessment: assessing a group of students’ learning experience in relation to a program, departmental major, or unit of study
  • Institutional assessment: assessing campuswide factors

“If you don’t know where you are going, the best-made maps won’t help you get there” (Mager, 1997, p. vi).

  1. Assessment promotes self-reflection, which is essential for effective teaching and learning (Assessment: FAQ, Stanford University). It helps you to reflect on:
    • What goals you are trying to accomplish
    • How well you are meeting those goals
    • How you can improve
  2. Accreditation: Middle States Commission on Higher Education Standard 14: Assessment of Student Learning—“Assessment of student learning demonstrates that, at graduation, or other appropriate points, the institution’s students have knowledge, skills, and competencies consistent with institutional and appropriate higher education goals” (MSCHE, 2009).

Assessment involves collecting evidence of student learning and attainment of intended learning outcomes. To develop a more complete understanding of the extent of student learning, multiple pieces of evidence are needed. Evidence of student learning can be direct or indirect. To obtain the best indication of student learning, a combination of direct and indirect measures should be used.

  • Direct assessment: evidence based on directly examining and measuring students’ performance (e.g., exams, projects, papers, portfolio assignments, oral presentations, fieldwork observations)
  • Indirect assessment: evidence based on reports of perceived student learning (e.g., surveys and interviews with students, employers, faculty)

Taking action means using assessment results to make program-level improvements or decisions. This might include:

  1. Revising your program-level outcomes
  2. Changing curricula by adding or removing courses or program experiences, requiring prerequisite courses, changing instructional methods or assignments within courses
  3. Creating or modifying assessments
  4. Creating or modifying rubrics
  5. Using assessment results to support current program practices or to make other program policies or decisions

Transparency showcases evidence of student learning from program experiences. It also enables you to reflect on program practices and their effectiveness in meeting student outcomes.

Please contact us if there is any aspect of this website or student learning assessment that you would like to discuss.

Rachel J. Ebner, PhD
Director of Student Learning Assessment
Belfer Hall 1300A; 215 Lex. Room 606
212.960.5400, ext. 6138
rachel.ebner@yu.edu

Assessment Toolkit Resources

  1. Clearly define the program’s/major’s mission
  2. Identify student learning outcome goals that directly align with the program’s/major’s mission
  3. Define learning goals by stating objectives
  4. Map out which program/major courses and learning experiences will enable students to achieve program/major goals (curriculum mapping)
  5. Devise a program/major assessment plan and timeline
  6. Identify which goals you are going to assess and when
  7. Develop comprehensive methods for both directly and indirectly assessing students’ attainment of those goals (NOTE: no one assessment can evidence learning—multiple pieces of evidence are needed)
  8. Develop corresponding scoring rubrics to ensure consistency and accuracy in scoring assessments (NOTE: rubrics are not the assessment, but a tool for scoring assessments)
  9. Implement the assessment plan and continuously monitor its effectiveness, making changes or improvements when necessary
  10. Analyze assessment results and communicate/report findings
  11. Use assessment results to inform and improve the program’s/major’s effectiveness in meeting learning goals and objectives
  12. Document steps 1–11

Click Here for Program/Major Assessment Guidelines

  • The Association of American Colleges and Universities (AAC&U) has developed a set of peer-reviewed Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics, including critical thinking, written communication, oral communication, quantitative literacy, information literacy, and more.
  • A web-based interactive software application for designing rubrics
  • Presentation on the Design & Use of Scoring Rubrics
  • Using rubrics for program assessment, from Loyola Marymount University’s assessment website

  • Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations (2005). Middle States Commission on Higher Education. A brochure prepared by MSCHE detailing its expectations for meeting Standard 7: Institutional Assessment and Standard 14: Assessment of Student Learning.
  • Assessment Primer (University of Connecticut). Provides an overview of assessment-related concepts, including information on how to write effective program-level mission statements, goals, objectives, curriculum maps, and assessment plans.
  • An online handbook on program-based assessment, with chapters on why to assess, defining goals and objectives, designing assessment programs, assessment strategies and methods, and analyzing, reporting, and using results.
  • Student Learning Assessment: Options and Resources (2nd ed., 2007). Middle States Commission on Higher Education. A publication by MSCHE presenting assessment-related options and resources.

  • Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives (Complete ed.). New York: Longman.
  • Assessment: FAQ. Stanford University Office of Institutional Research and Division Support. Retrieved October 31, 2013.
  • Linn, R. L., & Miller, M. D. (2005). Measurement and assessment in teaching (9th ed.). Upper Saddle River, NJ: Pearson.
  • Mager, R. F. (1997). Preparing instructional objectives: A critical tool in the development of effective instruction. Atlanta, GA: Center for Effective Performance.