Category 7: Measuring Effectiveness

 

Reacting: Isolated tasks and activities address immediate needs
Systematic: Repeatable, proactive processes with clear and explicit goals
Aligned: Stable, consciously managed, regularly evaluated
Integrated: Regularly improved through analysis, innovation and sharing

As the underpinning AQIP Category, Measuring Effectiveness is something MCC has consciously worked to cultivate over the past several years.  The importance of data, metrics, and specific goals has been stressed by managers in every department.  We believe the current status of this aspect of MCC's systems and processes to be at the level of Aligned.

MCC has taken seriously the feedback received during the 2009 Systems Appraisal process about a focus on results and comparative data.  Specifically, there have been numerous efforts to be more proactive about selecting specific data to benchmark internally and against other institutions externally.  Increased participation in State and National projects and initiatives has placed greater emphasis on measurement and benchmarking.  Examples highlighted in this category are MCC's participation in Achieving the Dream (AtD), the Aspen Institute's College Excellence Program, and the Community College Survey of Student Engagement (CCSSE), as well as internal efforts on the part of AQIP Action Projects and other workgroups to select, analyze, and act upon measurements of institutional performance.

In addition to increased efforts in benchmarking, MCC continues to build upon the strengths within its Institutional Research (IR) and Information Technology Services (ITS) areas.  A great deal of technical and analytical capacity exists and is brought to bear on hundreds of reporting tasks and processes across all functional areas of the college.  MCC has greatly expanded its business intelligence capacity in the form of data warehouse and live ERP reporting of college data (visualized in Figure 5-4 above).

When MCC submitted its first AQIP Systems Portfolio in 2009, the data from its first administration of the Community College Survey of Student Engagement (CCSSE) were not yet available.  CCSSE is now administered every other year, and MCC has three separate administrations (2009, 2011 & 2013) for comparative analysis.  The data collected during CCSSE are also part of a "multiple measures" approach to assessment of student learning and engagement that is described in detail in 7I1 below.  MCC continues to administer and analyze results from the ETS Proficiency Profile, and has recently developed its own objective assessment of General Education outcomes.  These General Education outcomes were newly created in 2009, but now the Committee for the Assessment of Student Learning (CASL) regularly assesses students through its faculty-developed objective test of these outcomes, which include critical thinking, global awareness, and citizenship.

MCC has increased its Institutional Research capacity significantly since 2009.  A new Director and Senior Analyst have joined the department in recent years, and the staff has expanded to accommodate a growing need for compliance and improvement reporting.  Quality analysis of college data has become a staple of most work teams and projects; much of this demand stems from the data analysis performed for AQIP Action Project teams and the Achieving the Dream (AtD) effort in recent years.

In summary, the activities and results described for Measuring Effectiveness indicate that MCC is performing in an Aligned manner that is stable, consciously managed, and regularly evaluated.

7P1 Selection, Management and Distribution of Data. MCC uses a number of methods to select, manage and distribute performance data.  These include internal administrative identification of key indicators for the college, State of Michigan Performance Indicators, Federal Perkins Core Indicators, and the Federal IPEDS Data System.  Management of external compliance reporting information is coordinated primarily through the Institutional Research (IR) office and the Accounting/Finance office.  Distribution of data is accomplished by way of employee forums, web pages, memos, e-mails and submission/participation in external reporting sites.  Institutional Research and other functional areas maintain hundreds of regularly used business intelligence, Cognos, and Datatel data reports:

Report: Career Education Consumer Report, HLC Organizational Profile, Institutional Characteristics (IPEDS), Perkins Program Review of Occupational Education (PROE), Graduation Rates, HLC Annual Institutional Data Update, Michigan Governor's Dashboard Metrics, Student Right to Know, Year-End Program Enrollment, Gainful Employment (US Dept. of Education), Achieving The Dream, CEPI/STARR (Center for Educational Performance and Information, State of Michigan), Activities Classification Structure (ACS), Employee Group Information, Financial Aid/Student Cost, etc.

Departments Responsible: Institutional Research, Accounting/Finance, Human Resources, Financial Aid, Information Technology Services, Physical Plant, etc.

Strategic Plan Relationship: 2013-18 MCC Strategic Plan Overarching Goals (Ex: 1-1, 3-3, 5-1, 6-2, etc.)

Report Category: Benchmarking, Institutional Information, Finance, Employee Information, Enrollment, Awards, Program Status, ACS, Aid Awarded, Perkins Data, Transparency Reporting, etc.

Primary Users: External (State/Federal Governments), Internal (Staff, Faculty, Administration), Students, etc.

How Utilized: Program Development and Planning, Student Demographics, Cohort Counts, Planning and Budget, Course Change and Planning, Prerequisite Courses, Planning Staff Assignments, etc.

Report Audience: State of Michigan, National Center for Education Statistics, Student Services, US Department of Education, Higher Learning Commission, Financial Aid, etc.

Unit/Analysis: Student Cohorts, Courses, Students/Financial, Seat Counts, Building Information, Employee Information, Degrees Conferred, Financial Award, High Schools, etc.
Figure 7-1 List of Data Selection Fields in MCC's Report Taxonomy (selected)

These reports are widely used by managers, administrators, faculty and staff to manage numerous regular processes across the institution.
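To make the taxonomy concrete, the sketch below shows how one report entry might be represented using the selection fields from Figure 7-1.  This is purely illustrative: the field values are examples drawn from the figure, and the data structure itself is an assumption for this sketch, not MCC's actual implementation.

```python
# Hypothetical entry in a report taxonomy, using the selection fields
# listed in Figure 7-1. Values are examples taken from the figure;
# the dict structure is an illustration only.
ipeds_fall_enrollment = {
    "report": "Fall Enrollment (IPEDS)",
    "departments_responsible": ["Institutional Research"],
    "strategic_plan_relationship": ["1-1", "3-3"],          # overarching goals
    "report_category": "Enrollment",
    "primary_users": ["External (Federal Government)",
                      "Internal (Administration)"],
    "how_utilized": ["Planning and Budget", "Student Demographics"],
    "report_audience": ["National Center for Education Statistics"],
    "unit_of_analysis": "Student Cohorts",
}
```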

7P2 Data Support for Planning and Improvement. The two primary drivers of data used for planning and improvement are internal processes and State and Federal reporting.  While many of the data elements contained in these reports are set by external stakeholders, a number of reports are customized or tailor-made for internal analysis efforts.  A sampling of actual standard data reports may be found in Figure 7-1 above and in Figure 7-2 below.

Federal Reporting
  NCES
    Fall: Institutional Characteristics; Completions; 12 Month Enrollment
    Winter: Student Financial Aid; Graduation Rates (200%); Graduation Rates (150%)
    Spring: Fall Enrollment; Finance; Human Resources
  US Department of Education
    Winter: Gainful Employment (Data Reporting; Disclosure)
    Winter: ECAR (Eligibility and Certification Approval Report)
State of Michigan Reporting
  State of Michigan
    Fall: Fall Program Inventory; Fall Enrollment; Tuition and Fees; Awards Conferred; Technical Skills (1P1); Certificates, Credential, or Degree (2P1); Retention/Transfer (3P1); CTE Concentrator Completions; Special Populations by Gender; Year End Participants; CTE Concentrators; Tech Prep Participants; Year End Program Enrollment; Non-Program Enrollments; Unduplicated # of Students Within Award Level who Received an Award; Total Unduplicated Count of Students Having Received an Award
    Spring: PROE Self-Study Evaluation of Occupational Programs
  Grants Reports and Applications
    Fall: ACS-6; North American Tuition Waiver
  Governor Snyder
    Winter: Michigan Governor's Dashboard Metrics
  CEPI
    Year Round: UIC Upload
    Spring: STARR Data Submission
  MiWorks!
    Year Round: Career Education Consumer Report (CECR)
Accreditation
  HLC
    Spring: AIDU (Annual Institution Update)
    Year Round: AQIP Project Teams
Assessment
  CASL (Committee for Assessment of Student Learning)
    Fall: General Education Student Assessment
    Winter: CCSSE/CCFSSE (every 2 years); MAPP (every 2 years)
  Institution
    Winter: Program Review
Internal Audits
  Internal
    Year Round: Degrees Conferred; ACS Codes; Course Master; Sections; Person Demographic; Cities/Zip Codes; HS (CEEB Codes); Gender; Race/Ethnicity; Programs of Study
    Summer: Annual Institutional Title IV Eligibility
External Initiatives
  ATD (Achieving the Dream)
    Fall: ATD Cohort Upload
  Win Win/MCCA
    Summer: Win Win Data Set (Eligible/Potential Students)
Figure 7-2 Examples of Recurring Institutional Research Reports (selected)

The taxonomy also contains fields identifying report aliases, ID numbers, external web links, and submission methods, as well as requestor, agency, and cycle indicators.  Annual department-level action plans are prepared, approved, managed and assessed by internal administrators.  These action plans are aligned with the MCC strategic plan and other institutional priorities, such as AQIP Action Projects.

In addition to internal and external standing reports, MCC's Committee for the Assessment of Student Learning (CASL) regularly shares student learning data at faculty meetings, including the annual Assessment Update.  Assessment data are also shared online here:
http://www.mcc.edu/acad_affairs/aa_gen_ed.shtml

CC5D (1).  As an AQIP institution, MCC works to improve performance through a number of specific processes, including AQIP Action Projects, departmental goal setting, the strategic planning process, and other regular reports and presentations that are documented in publicly available spaces.  A primary repository for operational performance data is the MCC Fact Book, which is maintained by Institutional Research:
http://www.mcc.edu/mccfact/index.shtml

In addition to the Fact Book, each month the President's Office prepares an executive summary of college operations that is shared with the Board of Trustees and e-mailed internally to all employees.

7P3 Collection, Storage, and Accessibility of Data. MCC has considerable capacity for data collection, storage, and access as well as a system of formal structures for employees to use technology and data.  MCC continues to be a Datatel institution and maintains a Datatel Users Group (MCCDUG), which is chaired by the Chief Technology Officer (CTO).  The group is facilitated by the Director of Enterprise Services, a position that was created after the 2009 Systems Appraisal in an attempt to build on MCC's strength in technology infrastructure.  The group meets regularly to determine departmental need for data collection, storage, and accessibility.  MCC maintains a data warehouse and Operational Data Store (ODS) with archived "snapshots" of data at specific key dates during the fiscal and academic year.  Business intelligence software (currently IBM's Cognos) is used to query either warehouse data for historical/longitudinal reporting or live data in Datatel Colleague.  A more detailed explanation of MCC's data warehouse architecture appears in 5P6 above.  The institutional Report Taxonomy, maintained by Institutional Research, documents external and internal information requirements; custom reports are developed for particular requests by managers, project leaders, and other internal stakeholders.
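As a rough illustration of the warehouse/live split described above, the sketch below routes a headcount query either to an archived snapshot or to live ERP tables.  All table, column, and snapshot names are invented for this example; MCC's actual Cognos/Colleague configuration is not shown here.

```python
from datetime import date
from typing import Optional

# Hypothetical snapshot dates captured at key points in the fiscal and
# academic year (values invented for illustration).
SNAPSHOT_DATES = {
    "fall_census": date(2013, 9, 15),
    "fiscal_year_end": date(2013, 6, 30),
}

def enrollment_query(as_of: Optional[date] = None) -> str:
    """Build a SQL string against the warehouse snapshot store when a
    historical as-of date is given, otherwise against the live ERP tables."""
    if as_of is not None:
        return (
            "SELECT term, COUNT(DISTINCT student_id) AS headcount "
            "FROM warehouse.enrollment_snapshot "
            f"WHERE snapshot_date = DATE '{as_of.isoformat()}' "
            "GROUP BY term"
        )
    return (
        "SELECT term, COUNT(DISTINCT student_id) AS headcount "
        "FROM live.enrollment GROUP BY term"
    )

print(enrollment_query(SNAPSHOT_DATES["fall_census"]))  # historical/longitudinal
print(enrollment_query())                               # current live data
```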

Over the past two years, an increased emphasis on data integrity has driven a number of improvement efforts in the design of data collection and analysis.  One example of the way that regular reporting data are collected, stored and made accessible appears in a description of Institutional Research's report automation project in 8I1 below.  By standardizing and documenting reporting methods and structures, greater reliability and efficiency are being achieved for analyses that are regularly used across the institution.

7P4 Analysis of Performance Data. MCC continues to build upon its data warehouse architecture to have one common data set that can be analyzed at the department level by individual leaders and managers who track and chart progress in overall performance.  Recently, a number of institutional dashboards have been developed for tracking key performance data such as enrollment and section seat counts.  A daily Registration Dashboard Portal report runs throughout the entire year and is e-mailed directly to managers and other stakeholders on campus.  The body of the e-mail contains the top-level details of the report, including term, activity date, days before class, current unduplicated headcount, prior term unduplicated headcount, and percentage change from the same day in the prior academic year.  A PDF copy of the actual report is also contained in the e-mail.  In addition, a link to the source report in Cognos is included; this link takes the user to the actual dashboard portal, which allows for "drilling in" on key data elements for current and past terms.  In addition to the daily Registration Dashboard Portal report, deans have access to a number of Seats Filled reports in Cognos which populate with conditional formatting.  This conditional formatting uses a "traffic light" dashboard color-coding system for section fill rates: red is extremely low, yellow is marginal, and green is above 67% full.  Use of these data to promote efficient section management has allowed the Academic deans to save considerable resources during a time of declining property tax and state appropriation revenues.
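A minimal sketch (not MCC's actual implementation) of the two calculations described above: the daily report's same-day-prior-year percentage change in headcount, and the "traffic light" color for section fill rates.  The 67% green threshold comes from the text; the red/yellow cut-off below is an assumption for illustration, as are the sample values.

```python
def headcount_change(current: int, prior_year_same_day: int) -> float:
    """Percentage change in unduplicated headcount vs. the same day last year."""
    return (current - prior_year_same_day) / prior_year_same_day * 100

def fill_rate_color(seats_filled: int, capacity: int,
                    yellow_floor: float = 0.40) -> str:
    """Traffic-light code for a section: green above 67% full, red when
    extremely low (below the assumed yellow_floor), yellow in between."""
    rate = seats_filled / capacity
    if rate > 0.67:
        return "green"
    if rate < yellow_floor:
        return "red"
    return "yellow"

print(round(headcount_change(5150, 4905), 2))  # illustrative values: +4.99%
print(fill_rate_color(21, 30))                 # 70% full -> "green"
```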

CC5D (2).  MCC routinely uses college data and analysis to monitor results and make further improvements to existing processes (steps 6 & 7 of the CQI process described in Figure 5-3 above).  Examples of learning from operational experience to improve effectiveness include the use of dashboards for strategic enrollment and section management (described in 7P4 above) and routine surveys of students and program advisory committees (described in 3P2 above), as well as ongoing assessment of student learning outcomes (described in 7I1 below).

Data of all kinds are maintained on college web pages, college network servers, and Datatel Colleague/Cognos systems and are made openly available to employees and the community.

7P5 Comparative Data. MCC's priorities for comparative data are driven by environmental scanning, both formal and informal, to create a context for data that reflects our service area and the unique needs of the district and contiguous communities.  A number of significant opportunities for comparative data have been created through partnerships and recognition, such as Achieving the Dream (AtD) and the Aspen Institute College Excellence Program, as well as recent initiatives through the Michigan Community College Association (MCCA) and the Governor's Office.  State and Federal comparisons allow for identification of areas to strengthen, improve or discontinue.  State comparative data books and Federal IPEDS peer group analysis, as well as state AQIP group members, provide benchmark criteria against which institutional data are evaluated.  Analysis of local socio-economic factors, such as changes in population and the labor market, improved considerably through preparation for the 2010 Aspen Institute Top 10 recognition.  As part of a highly de-centralized community college system, MCC and other institutions have had little to no access to state-wide data such as UI wage information.  When MCC was in contention for the Aspen Prize, the State of Michigan for the first time shared such data; these are reflected in 8R1 and 8R4 below (the actual data appear in Figure 8-6).  Such comparative data were previously unavailable given the autonomous nature of community colleges and the structure of state government in Michigan.

Michigan's current Governor assumed office in January of 2011 and quickly worked to create a series of higher education "dashboards" of publicly-available information.  Part of the "Open Michigan" series of websites, the Mi Dashboard has a section devoted to postsecondary education that is populated with data submitted by individual colleges.  MCC's Director of Institutional Research was involved in the development of many of the metrics, and the Governor's office has worked closely with the Michigan Community College Association (MCCA) in the absence of a state-wide system for community colleges.  Among the reports on the Mi Dashboard site for postsecondary education are: Tuition & Fees as Percent of Median Family Income; Students Who Require Developmental Courses; Community College Retention Rate; University First-Year Retention Rate; Community College Graduation/Transfer Rate; University Graduation Rate; and Associate Degree or Higher.  The Mi Dashboard site is available here:
http://www.michigan.gov/midashboard/0,1607,7-256-58084_58245_58267---,00.html

Cohort   MCC Initial Cohort   Number Meeting Metric Benchmark   Percent Meeting Metric Benchmark   Mi Dashboard
2007     3212                 2203                              68.59%                             58%
2008     3206                 2142                              66.81%                             57%
2009     3748                 2535                              67.64%                             62%
2010     3356                 2294                              68.36%                             63%
2011     3522                 2446                              69.45%                             62%
Figure 7-3 Governor's Dashboard Incidence of Developmental Education

As the Governor's Dashboard information shows, MCC consistently has a higher incidence of developmental education than the Mi Dashboard percentage.  As is noted on the Mi Dashboard site, data are not yet available for all 28 community colleges; Mi Dashboard plans to update the data as the appropriate metrics become available.
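The percentage column in Figure 7-3 follows directly from the first two columns: students meeting the metric benchmark divided by the initial cohort.  A quick check using the published 2007 row:

```python
# Worked check of the 2007 row in Figure 7-3.
initial_cohort = 3212   # MCC initial cohort, 2007
meeting_metric = 2203   # students meeting the developmental metric benchmark

percent = meeting_metric / initial_cohort * 100
print(f"{percent:.2f}%")  # 68.59%, matching the table
```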

MCC also compares its own data for trend analysis over time.  Due to participation in Achieving the Dream (AtD) and various AQIP Action Projects, considerable focus has been placed on developmental education, as described in 1I2.  For example, trend data are collected for developmental placement and broken out by subject area.  These are compared to external benchmarks such as the Mi Dashboard percentages above.  Below are recent placement rates for developmental education from 2007-2011:

Cohort Dev Reading Dev Writing Dev Math
2007 78.6% 29.4% 55.5%
2008 79.8% 34.0% 57.6%
2009 77.7% 37.1% 60.8%
2010 76.8% 40.7% 60.5%
2011 79.1% 47.0% 62.5%
Figure 7-4 Developmental Recommendations for Students (Accuplacer)

These numbers are tracked and used by standing groups such as the Developmental Education Steering Committee to drive decision making and curriculum development.  Comparative data are also available through institution-wide student learning outcomes assessments such as MAPP results, which are discussed in greater detail in 1R1, 1R2, and 1R6 above.

7P6 Alignment of Data Analysis With Organizational Goals. Most data needs and analyses are in alignment with one or more components of MCC's overall strategic plan.  In addition, standing groups such as the BI users group, MCCDUG, and the Leadership Group coordinate the use of institutional data.  Also, MCC has a centralized IR function, and nearly all important data analyses, surveys, and institutional information are coordinated through the office of Planning, Research and Quality and Institutional Research.  All department administrators and many other department staff have direct access to standard and ad hoc reports in Datatel Colleague and Cognos.  Lead administrators set direction for departments to analyze and compare their local data to college-wide expectations.  An example is the budget modification process: institutional targets are set for reductions, and department data are aligned to accomplish the necessary adjustments.  This is coordinated by the Accounting office through a simple set of Excel spreadsheets accessible to managers on the college network.

7P7 Information Systems and Processes. MCC continues to invest significant human and technological resources to ensure the timeliness, accuracy, reliability, and security of information systems.  An entire section of MCC's strategic plan is devoted to technology-related initiatives:

2-0 Technology Initiatives
  • Commit the funds to maintain user-centered, state-of-the-art technology and staffing support that enhances student learning, supports faculty/staff productivity, maximizes student success, and ensures organizational effectiveness.
  • Establish systematic processes and practices to maintain data integrity.
  • Promote a culture of data security awareness across all areas of the organization to support student privacy and limit college liability.
Figure 7-5 2013-2018 Overarching Goals for Technology Initiatives

MCC maintains powerful systems for managing and analyzing information.  The ITS department is accountable for purchasing, installing, configuring, and maintaining the primary college information systems; the Executive Cabinet routinely reviews systems needs and usage and makes recommendations to the Chief Technology Officer (CTO) for changes and improvements.  Overarching Goals and Enabling Objectives in the Strategic Plan set data standards and development objectives.

7R1 Measures of Knowledge Management. Members of the Business Intelligence work group routinely analyze the systems in place for data warehousing and reporting.  Recently, a Data Integrity team convened to examine numerous internal measures of data and information management.  Cognos systems utilization studies and analyses provide benchmarks.  Institutional Research has undertaken a comprehensive report automation project (described in detail in 8I1 below) in an attempt to measure and analyze informational reporting, with the goal of increasing accuracy and shortening turnaround time.  In addition, ITS staff regularly conduct performance audits and usage reports for their analysis of system performance and effectiveness.  Data standards are established and upheld by groups of trained staff.

7R2 and 7R3 Results for Measuring Effectiveness. Members of the Executive Cabinet have varied methods for understanding and using system tools and data.  External reports are completed in a timely and accurate fashion as described in 7P1 and 7P2 above.  Student and graduate satisfaction measures (described in 3R1 and 3R2 above) are analyzed to identify institutional processes that need to be improved.  In addition, ITS conducts user groups and service evaluation questionnaires.  Other system processes, such as Physical Plant service requests, are analyzed for effective outcomes.  The Director of Institutional Research (IR) and the Chief Technology Officer (CTO) participate in state and national groups that serve to develop and improve system processes such as data warehouse utilization.  As mentioned in 7P5 above, the Michigan Community College Association (MCCA) has become increasingly involved in coordinating the work of the 28 community colleges in Michigan on data issues.

While MCC has made strides in adding comparative data and benchmarks through its participation in state, regional, and national initiatives, there are still relatively few opportunities to compare performance results for measuring effectiveness.  Perhaps the most recent and rigorous comparison was made during the Aspen Institute recognition, described at various points during this Portfolio.

7I1 Recent Improvements in Measuring Effectiveness. MCC continues to build on the strength of the ITS merger cited in the 2009 Systems Appraisal Feedback Report.  A comprehensive list of recent innovations in ITS can be found in 6I1 above.  Since 2009, MCC has made dramatic improvements to measuring effectiveness in the assessment of student learning outcomes.  New general education objectives and requirements, which include critical thinking, global awareness and citizenship, were developed beginning with the Summer 2009 semester.  Accordingly, MCC recently revised its assessment processes.  The CCSSE is used biennially to assess student engagement in a number of areas, including experience with the three overarching general education objectives.  MCC also administers the MAPP on a biennial basis as an assessment of the general education objectives.  In addition, a local assessment instrument was developed to assess specifically the three general education objectives.  This local instrument was piloted with a small group of students during the spring 2010 semester and with a larger group during the fall of 2010.  It has been used to assess a larger group of MCC students on an annual basis since the fall semester of 2011.  These and other assessment results are used to drive curriculum change as well as professional development.  It is anticipated that:

  • General Education Assessment (a locally prepared instrument) will be administered every Fall
  • Educational Testing Service's Proficiency Profile (formerly "MAPP") will be administered biennially in even-numbered Winter semesters
  • Community College Survey of Student Engagement/Community College Faculty Survey of Student Engagement will be administered biennially in odd-numbered Winter semesters

Below is a five-year calendar of institution-wide deployment of assessment instruments; the alternation it shows is sketched programmatically after Figure 7-6.

Academic Year   GenEd Assessment   ETS Proficiency Profile   CCSSE/CCFSSE
2012-2013       Fall 2012          -                         Winter 2013
2013-2014       Fall 2013          Winter 2014               -
2014-2015       Fall 2014          -                         Winter 2015
2015-2016       Fall 2015          Winter 2016               -
2016-2017       Fall 2016          -                         Winter 2017
Figure 7-6 Multiple Measures Schedule for Assessment of Student Learning
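The alternation visible in Figure 7-6 follows mechanically from the deployment rule stated above.  The sketch below is an illustration of that rule, not an MCC system; it reproduces the calendar in the figure.

```python
def instruments_for_year(fall_year: int) -> dict:
    """Instruments deployed in the academic year beginning in fall_year:
    GenEd every Fall; ETS Proficiency Profile in even-numbered Winters;
    CCSSE/CCFSSE in odd-numbered Winters."""
    winter_year = fall_year + 1
    return {
        f"Fall {fall_year}": ["GenEd Assessment"],
        f"Winter {winter_year}": (
            ["ETS Proficiency Profile"] if winter_year % 2 == 0
            else ["CCSSE/CCFSSE"]
        ),
    }

for y in range(2012, 2017):
    print(instruments_for_year(y))  # matches the five rows of Figure 7-6
```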

MCC's comprehensive plan to assess student learning outcomes using multiple measures was cited by the Aspen Institute during the 2010 Top 10 selection process.  During the review process, the Aspen Institute engaged Dr. Peter Ewell and Dr. Karen Paulson of the National Center for Higher Education Management Systems (NCHEMS) to review the learning outcomes portion of MCC's submission for the Aspen Prize for Community College Excellence.  As part of the review, Ewell and Paulson conducted a telephone interview with a small group of faculty and staff knowledgeable about MCC's student assessment efforts.  The reviewers were impressed with the multiple measures approach and in particular cited the locally-developed General Education assessment as a strength.

7I2 Improvement Efforts for Measuring Effectiveness. One significant recent improvement effort for measuring effectiveness centers on participation in the Community College Survey of Student Engagement (CCSSE) and the Community College Faculty Survey of Student Engagement (CCFSSE).  MCC has administered CCSSE/CCFSSE three times since the last Systems Appraisal.  Below are the most recent benchmark scores for the main survey of CCSSE.

Benchmark                           MCC Score   Large Colleges Score   Difference   2013 Cohort Score   Difference
Active and Collaborative Learning   49.4        49.3                   0.1          50                  -0.6
Student Effort                      47.1        49.3                   -2.2         50                  -2.9
Academic Challenge                  46.8        49.7                   -3.0         50                  -3.2
Student-Faculty Interaction         45.3        49.1                   -3.8         50                  -4.7
Support for Learners                49.1        49.0                   0.1          50                  -0.9
Figure 7-7 2013 CCSSE Benchmark Scores (Main Survey)
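The two Difference columns in Figure 7-7 are simply MCC's benchmark score minus the comparison group's score.  For example, verifying the Student Effort row:

```python
# Worked check of the Student Effort row in Figure 7-7.
mcc_score = 47.1
large_colleges_score = 49.3
cohort_2013_score = 50.0

print(round(mcc_score - large_colleges_score, 1))  # -2.2 vs. large colleges
print(round(mcc_score - cohort_2013_score, 1))     # -2.9 vs. 2013 cohort
```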

Numerous work groups and teams have made use of the CCSSE/CCFSSE data since 2009.  Administered every other year, CCSSE also provides an opportunity to ask custom questions as part of the survey instrument.  During each administration, MCC uses its custom questions to solicit student perceptions of how often they encounter the three General Education outcomes (outlined in Figure 1-19 above) in their courses.

The culture and infrastructure of MCC continues to move toward greater reliance on a systems orientation to performance improvement.  A growing number of departments and areas routinely develop goals, and priority is given to efforts to improve systems, infrastructure, and activities leading to improved student success and service to the community.  These institutional strategies undergo periodic/annual review and modification.