
Evaluating Student Learning Across the Mechanical Engineering Curriculum—Sara Wilson, Peter TenPas, Ronald Dougherty, Christopher Depcik, & Kenneth Fischer (2010)

Overview

In this project, the mechanical engineering department at the University of Kansas examined presentations of senior capstone design posters to assess student learning across the curriculum. The program’s faculty members created a rubric later used by members of the department’s industrial advisory board to evaluate student performance. Following the assessment, ME faculty members reviewed the board members’ feedback for further discussion and evaluation.

 

Background

The Accreditation Board for Engineering and Technology (ABET) has a number of educational objectives for engineering programs. In assessing an engineering program’s success in meeting these objectives, various measures are traditionally used: course performance, faculty assessments, standardized test scores, GPA, and surveys of graduates and employers. In this project, the mechanical engineering department at the University of Kansas examined presentations of senior capstone design posters in order to assess student learning across the curriculum.

Implementation

To evaluate the senior design projects, the faculty developed a rubric to assess students’ performance along a number of dimensions that were deemed important by ME faculty members and our outside advisory board. After the first implementation, we refined the rubric form based on comments from the users.

Student Work

Student performance on rubric areas remained consistent across 2008 and 2010. In general, faculty rated the creativity of the student groups higher than the advisory board did; however, both faculty and advisory board members rated evaluation and testing lower than the other dimensions.

Reflections

Senior capstone design projects offer a snapshot of the skills students have learned during their educational careers. First, by using design poster presentations together with a carefully designed rubric, a department can easily assess a number of educational objectives. Second, by including outside reviewers from industry, a department can obtain external validation of the quality of the program and identify areas of improvement needed to prepare students for the workplace.



Background

In Spring 2009, 326 students were working toward a Bachelor of Science in Mechanical Engineering. Like other engineering and science majors, but unlike many majors across the University, these students follow a fairly prescribed schedule of courses: they start with courses in the basic sciences and mathematics and move toward applying this material to the design of mechanical systems. In addition, a series of design and development courses allows students to learn the methodology of the design process, which includes identifying functional objectives, developing appropriate budget and time plans, and communicating design ideas in written, graphic, and oral forms. For more information, please see this chart of the 2009 Mechanical Engineering Curriculum.

As a department, the mechanical engineering faculty is tasked by the advisory board (alumni and industry representatives), by the School of Engineering, and by the accrediting agency (ABET) with a series of educational objectives in which graduating students must demonstrate competency:

  1. The ability to apply knowledge of mathematics, science, and engineering.
  2. The ability to design and conduct experiments, as well as to analyze and interpret data.
  3. The ability to design a system, component, or process to meet desired needs.
  4. The ability to function on multi-disciplinary teams.
  5. The ability to identify, formulate, and solve engineering problems.
  6. An understanding of professional and ethical responsibility.
  7. The ability to communicate effectively.
  8. The broad education necessary to understand the impact of engineering solutions in a global and societal context.
  9. Recognition of the need for, and the ability to engage in, life-long learning.
  10. Knowledge of contemporary issues.
  11. The ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.

The department collects a wide range of information to assess performance on these objectives (grades, employment data, FE licensing exam scores, and student/alumni/faculty surveys). To process the information, the department developed a system that includes an Outcomes Committee to assess the data and make recommendations, a Curriculum Committee to determine curricular changes, and regular faculty meetings to approve the recommendations of these committees. The interaction of these committees is illustrated in this flowchart.

Solid processes are in place to analyze these outcome measures; however, the measures themselves are often inadequate for objectively evaluating progress toward our educational goals. Senior capstone design projects, on the other hand, offer an opportunity to develop more quantifiable measures. These projects are completed by teams of students (ranging from three to thirty members) and cover a wide array of work, from designing medical device testing equipment to designing and building a race car. Regardless of the project, the students’ work has the design process in common, as well as the need to apply knowledge from a wide range of classes. The goals of this project were, therefore, to develop a measure that could be used across different types and sizes of design projects, and one that would be useful in examining our educational objectives.



Implementation

To evaluate the senior design projects, the faculty developed a rubric to assess students’ performance along a number of dimensions that were deemed important by ME faculty members and our outside advisory board (see Background). These dimensions are:

 

  • Identifying functional objectives
  • Engineering analysis and methodology
  • Evaluation and testing
  • Inventiveness and creativity
  • Team chemistry—interest and passion for the work
  • Written and visual presentation
  • Oral presentation and questions

Within each dimension, we developed categories that described student performance from low to high.
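To make the rubric’s structure concrete, the sketch below (in Python, purely illustrative) encodes the seven dimensions listed above with a 1–4 proficiency scale; the level descriptors are placeholders, not the wording of the department’s actual form.

```python
# Illustrative sketch of the rubric's structure: seven dimensions, each scored
# on a 1-4 proficiency scale. The level descriptors are placeholders; the
# department's actual category wording is not reproduced here.
RUBRIC_DIMENSIONS = [
    "Identifying functional objectives",
    "Engineering analysis and methodology",
    "Evaluation and testing",
    "Inventiveness and creativity",
    "Team chemistry (interest and passion for the work)",
    "Written and visual presentation",
    "Oral presentation and questions",
]

PROFICIENCY_LEVELS = {
    1: "beginning",    # placeholder descriptor
    2: "developing",   # placeholder descriptor
    3: "proficient",   # placeholder descriptor
    4: "mastery",      # the 2008 scoring treated 4 as mastery
}

def blank_form():
    """Return an empty one-page evaluation form: one unscored entry per dimension."""
    return {dimension: None for dimension in RUBRIC_DIMENSIONS}
```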

The process of developing the rubric began with Dr. TenPas and Dr. Wilson, from mechanical engineering, and Dr. Dan Bernstein, from the Center for Teaching Excellence, who put together a first draft of the objectives and proficiency levels to be assessed. The goals of this draft were to highlight the key dimensions of the projects while keeping the form to one page to prevent evaluator fatigue. Four professors from the design courses (ME 640-643) examined the form, and their feedback was incorporated into a second draft, which we implemented in Spring 2008 at the final poster presentations of the design students. At these presentations, all student groups created a poster describing their project and the engineering work involved. The students were asked to attend a lunch session where they showed their posters to fellow students, faculty, alumni on the advisory board, and others who could review the work and ask questions. We asked both external reviewers (alumni advisory board members and industrial sponsors of the design projects) and internal reviewers (instructors and faculty in the department) to evaluate the design projects using the rubric.

After the first implementation, we refined the rubric form based on comments from the users. One comment was that the rubric was “too wordy,” so to streamline the form we italicized differences between proficiency levels. In addition, we created a second version of the form for oral (rather than poster) presentations, so that it could be used in other design presentations throughout the year. We presented both the form and the composite scores from the reviewers to the mechanical engineering faculty at one of our regular faculty meetings and at the annual faculty retreat.



Student Work

Graduating seniors must take capstone courses (ME 640, 641, 642, 643, 644, and 645), which require a major project completed in a group setting. The posters shown here summarize the results of these group projects. Examples of the projects and their posters are linked below (pdfs):

 

In the first implementation of the rubric, we quantified the scores on a 1–4 scale (where 4 was mastery) to obtain the following table:

Composite Scores from the 2008 Annual Student Design Poster Presentations for All Design Groups

                  Objectives  Analysis  Evaluation  Creativity  Interest  Written  Oral
Advisory Board    3.48        3.27      3.00        3.00        3.54      3.45     3.69
Faculty           3.62        3.50      2.85        3.42        3.77      3.77     3.81
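
The composite values in the table above are averages of the individual rubric forms, grouped by reviewer type. As a minimal sketch of that aggregation (assuming each completed form is recorded as a reviewer group plus a dimension-to-score mapping; the actual record-keeping format is not described in this portfolio), the averaging could look like this:

```python
from collections import defaultdict

def composite_scores(completed_forms):
    """Average 1-4 rubric scores by reviewer group and dimension.

    completed_forms: iterable of (group, {dimension: score}) pairs.
    Returns {group: {dimension: mean score rounded to 2 decimals}};
    unscored (None) dimensions are ignored. The record format is an
    assumption for illustration, not the department's actual data format.
    """
    collected = defaultdict(lambda: defaultdict(list))
    for group, scores in completed_forms:
        for dimension, score in scores.items():
            if score is not None:
                collected[group][dimension].append(score)
    return {
        group: {dim: round(sum(vals) / len(vals), 2) for dim, vals in dims.items()}
        for group, dims in collected.items()
    }

# Hypothetical example with two reviewers (scores are made up):
forms = [
    ("Advisory Board", {"Evaluation and testing": 3, "Oral presentation and questions": 4}),
    ("Faculty",        {"Evaluation and testing": 2, "Oral presentation and questions": 4}),
]
print(composite_scores(forms))
# {'Advisory Board': {'Evaluation and testing': 3.0, 'Oral presentation and questions': 4.0},
#  'Faculty': {'Evaluation and testing': 2.0, 'Oral presentation and questions': 4.0}}
```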
 

From this, we were able to draw a number of observations:

  1. Faculty, in general, rated the creativity of the student groups higher than the advisory board did. From faculty discussions, we concluded that faculty members are more familiar with the students’ abilities and with the limitations of the projects’ scope and duration. While some projects lend themselves to greater creativity, there are a number of common limitations: sponsors often prescribe the choice of project, and both sponsorship funding and the time available can limit the possible solutions.
  2. Both faculty and advisory board members rated evaluation and testing lower than the other dimensions. In faculty discussions, some noted that evaluation and testing are often performed in the final stages of a design project, so presentations in April, for example, may not reflect the final levels of evaluation and testing performed by the end of the semester. The faculty also discussed how exposure to evaluation and testing could be improved, as several courses include experimental work that would develop these skills. In many of these courses, however, the experimental work is pre-designed so that students can complete an experiment quickly; in the future, it may be possible to incorporate student-designed experiments into these courses to give students more exposure to such work.
  3. Many of the prerequisites for the capstone courses are in the areas of engineering analysis and methodology. All but one of the student groups received scores of 3 or 4 in this area from both the faculty and the alumni; however, because the score spans many different subject areas and courses, it would be desirable to expand this assessment (for example, can students perform engineering analysis in fluid dynamics?). To do this, other student projects may need to be assessed, since not all design projects require analysis from every engineering area.

In Spring 2010, the students were again evaluated using the revised rubric (one of the 12 groups did not participate in the poster session this year), and members of the industrial advisory board performed the assessments. The scores were as follows:

Composite Scores from the 2010 Annual Student Design Poster Presentations for 11 of 12 Design Groups (one group was unable to participate)

                  Objectives  Analysis  Evaluation  Creativity  Interest  Written  Oral
Advisory Board    3.46        3.32      3.20        3.01        3.52      3.62     3.65
 

In general, performance remained consistent with that of 2008. Advisory board members rated evaluation higher in 2010, perhaps because the design faculty encouraged students to include more evaluation components in their presentations, or because of changes in the measurement and instrumentation curriculum.

A challenge for the department during this period was the growth of the undergraduate program, which placed considerable strain on the laboratory classes, including the measurement and instrumentation classes. These results suggest, however, that despite the larger class sizes, instructors in these courses have been able to maintain and even improve student learning in these areas.



Reflections

The senior design courses in mechanical engineering are required of all seniors and cover material taught in many courses across the curriculum. As such, they are a good place to assess student learning comprehensively, allowing a large-scale picture of students’ overall preparedness and its variability across the student body. In this setting, a single, simply written, one-page rubric allows for easy and quantifiable assessment, and using both faculty and outside observers (advisory board members and industrial sponsors) makes it possible to compare these perspectives.

Based on completed work, there are clearly some strengths and limitations to using senior capstone design presentations as a measure of educational outcomes. These presentations offer an opportunity to capture student work across the entire senior class, within the major. However, since the work is done in groups, it is harder to observe variations in individual performance or to relate that performance to other measures of an individual (such as classroom performance in other courses). The presentations show a wide variety of engineering, communication, economic, and broader impact skills in one place, but the level and completeness of these can vary across projects. For example, one project may have more thermodynamic elements to the work, while another may focus more on solid mechanics and machine design. As such, it can be difficult to consistently evaluate skills that are unevenly present across different projects—with the primary exception being communication skills (both oral and written), since all project presentations share this requirement.

The future goals of this work include: a) incorporating this measure into the departmental outcomes assessment processes, and b) using this measure in other design courses, including a freshman course that introduces design (ME 228) and a junior course that focuses on learning design principles (ME 501), so as to develop a longitudinal measure and improve the rubric. Also, variations of the rubric have been created to use in other senior capstone design presentations, including interim oral reports and final oral design reports.

In the end, it is clear that the senior capstone design projects offer a snapshot of the skills students have learned during their educational careers. Using the design poster presentations and a carefully designed rubric, a department can easily assess a number of educational objectives. By including outside reviewers from industry, a department can obtain external validation of the quality of the program and identify areas of improvement needed to prepare students for the workplace.

Contact CTE with comments on this portfolio: cte@ku.edu.


