Course Evaluation Phase Two project

CSCF Master ST Item


Project Charter


The Faculty of Math is responsible for collecting and reporting student evaluation statistics for all Math instructors. Historically, a faculty member in CS was responsible for generating statistics for all instructors in the School of Math (the "Tompa Scores"), and this responsibility passed to CSCF on that faculty member's retirement. Phase Two covers user requests following on from Phase One regarding back-data, and a conversion to the Science Computing "evaluate" database, to support a Web UI that displays statistics to faculty members and faculty administration. This database change will also support a process simplification, if the Math Dean's office proceeds with a plan to run all course evaluations online using the Science Computing database.


This project will fulfil requests generated by client usage of Phase One and provide a user-friendly Web UI to display current and historical statistics. It will also support the Dean's Office's plans to run course evaluations online, although the Dean's Office has been very clear that they are not requesting work from CSCF in this project.

Project Goals

  1. prepare for faculty self-service data lookups via a web UI
  2. work with Science Computing to import back-data from Math
  3. work with Science Computing to enable categories of administrative users in the back-end, to support Math and other faculties operating the software without working through Science Computing.
  4. work with Math/CS clients to identify and resolve their needs for Course Evaluation data-reporting; iterating as many times as necessary

Project Objectives

The project goals translate into the following objectives:

  1. import back data from 1998 to 2005 into normalized form
    1. first, in the database set up in Phase One [2000-2005 are complete, '98 and '99 remain]
    2. second, in the 'evaluate' database designed by Science Computing
  2. develop a web front-end that interfaces with the 'evaluate' database DONE
  3. assist Science Computing in building the API for the Course Evaluation platform DONE
  4. implement client requests for command line tools
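Objective 1's normalization step can be sketched as follows. This is a minimal sketch, not the project's actual importer: the column names (`term`, `course`, `section`, `instructor`, `q1`, `q2`) and the wide one-row-per-section legacy layout are assumptions for illustration only.

```python
import csv
import io

# Hypothetical legacy layout: one wide row per class-section, with
# per-question averages flattened into columns (q1, q2, ...).
LEGACY = (
    "term,course,section,instructor,q1,q2\n"
    "1051,CS135,001,jdoe,4.2,4.5\n"
    "1051,MATH135,002,asmith,3.8,4.1\n"
)

def normalize(rows):
    """Split wide legacy rows into normalized section and response records."""
    sections = []
    responses = []
    for i, row in enumerate(rows):
        sec_id = i + 1
        sections.append({
            "id": sec_id,
            "term": row["term"],
            "course": row["course"],
            "section": row["section"],
            "instructor": row["instructor"],
        })
        # Each qN column becomes its own row, keyed to the section.
        for key, val in row.items():
            if key.startswith("q"):
                responses.append({
                    "section_id": sec_id,
                    "question": key,
                    "mean": float(val),
                })
    return sections, responses

rows = list(csv.DictReader(io.StringIO(LEGACY)))
sections, responses = normalize(rows)
```

Normalizing the question scores out of the wide rows is what lets new questions be added per-term without schema changes, which is presumably why the 'evaluate' database wants the data in this shape.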

Project Sponsor

Ed Lank, Associate Director, School of Computer Science

Project Group

Daniel Allen (CSCF), Chirag Gada (CSCF Spring 2014), Spondon Banday (CSCF Fall 2014), Mirko Vucicevich (Science Computing)

Project advisory group

Ed Lank (Associate Director of CS), Dan Brown (Director, Undergrad Studies), Charles Clarke (Associate Director, Undergrad Studies), Steve Furino (Faculty of Math, Associate Dean Undergrad Studies)

Risks, assumptions and mitigating factors

  1. Risk: Failure to complete project by December 2014. Mitigation: Administrative Faculty Members use the querying tool from Phase One; regular faculty members query CSCF for their own evaluation results.
  2. Risk: Fundamental Incompatibility of Phase One Data with Science Computing API. Mitigation: Work with Science Computing to improve the API.
  3. Risk: Math/CS unwilling to store Instructor Evaluations on a Science Computing Database. Mitigation: Run the program to get data from CSCF's copy of the database.
  4. Risk: CS Faculty are unhappy with the Web-UI. Mitigation: Since this is an API-based application, we can revamp the front-end according to faculty members' needs.

Project Architecture Overview


From Phase One (CourseEvaluation#output), CSCF produces:

  1. evaluations per instructor/class-section (for use by CS Director for Undergraduate Studies, Math Associate Dean for Undergraduate Studies, CS Director, and Math Department Chairs; and eventually available to each instructor directly)
  2. aggregation per instructor over previous 5 years and over all years available (for use by CS Tenure and Promotions Committee, Math Department Chairs, and eventually available to each instructor directly)
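The second output, aggregation per instructor over a trailing window and over all available years, can be sketched as below. This is a minimal sketch under assumed inputs: a flat (instructor, year, score) record format stands in for the real schema, and the actual Tompa-score formula is not reproduced here.

```python
from collections import defaultdict

# Hypothetical per-section scores; the real data model is richer than this.
records = [
    ("jdoe", 2009, 4.0), ("jdoe", 2012, 4.4), ("jdoe", 2013, 4.6),
    ("asmith", 2013, 3.9),
]

def aggregate(records, current_year, window=5):
    """Mean score per instructor over the trailing window and over all years."""
    recent = defaultdict(list)
    overall = defaultdict(list)
    for instr, year, score in records:
        overall[instr].append(score)
        # Trailing window: the `window` years ending at current_year inclusive.
        if current_year - window < year <= current_year:
            recent[instr].append(score)
    mean = lambda xs: sum(xs) / len(xs) if xs else None
    return {i: (mean(recent[i]), mean(overall[i])) for i in overall}

stats = aggregate(records, current_year=2013)
```

With `current_year=2013` the window covers 2009 through 2013, so all of jdoe's sample scores fall inside it and both aggregates coincide; only older records would separate the two numbers.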

An export from these tools will be used to populate the Web-UI database built by Science Computing/CSCF. The Science Computing database also uses OAT API calls to gather course and instructor information.


The project delivers a Web-UI containing course evaluations, for use by faculty members and Chairs (privileged users). Requirements:

  • Easy to use for regular faculty members to look up their own statistics
  • Easy to use for privileged users to look up and summarize faculty members' statistics
  • Allows users to download .csv of above data

A command-line tool for privileged users. Requirements:

  • Fulfills all of the functions of the tool written in Phase One via the same database as the web UI

Functional Milestones

Winter 2014

  • 2014-02-20: Give a Demo for Ed Lank DONE
  • 2014-03-03: Complete revisions to database model DONE
  • 2014-04-07: Give a Demo for Ed Lank and other users. DONE
  • 2014-04-08: Modify the Web-UI based on first-round of user feedback at the demo. DONE
  • 2014-04-15: Modify the Web-UI based on second-round of user feedback. DONE
  • 2014-04-21: sysadmin documentation is complete. DONE
  • 2014-04-22: Modify the Web-UI based on any remaining user feedback. [none received]
  • 2014-04-22: Decision whether to keep running the Phase One tools for Spring 2014 term [Yes]

Fall 2014

  • 2014-10-13: Decide what feature-enhancements will be implemented in the Science Computing administrative interface (e.g., so Math can work independently of Science/Engineering/AHS)
  • 2014-11-07: Give a demo for Ed Lank (or other users)
  • 2014-11-14: Complete revisions
  • 2014-11-21: Give a demo for Ed Lank (or other users)
  • 2014-11-28: Sysadmin documentation is complete.
  • 2014-12-05: Modify UI based on any remaining user feedback



Winter 2014

2014-02-20 - Ed Lank, Daniel Allen, Chirag Gada: Reviewed Student/Prof/Chair Views. Summary:

  • Student Views: The survey inputs need to be modified to reduce bias; consider replacing the sliders with plain radio buttons. Plan to accommodate a 'Did not seek help' option for relevant questions. There should be a clear demarcation between Course and Professor evaluation questions. Ed said he would consider talking to the Dean's office about the phrasing and options for a few questions, to implement better survey design.
  • Professor Views: Remove the spider charts or push them towards the bottom. Tooltips should spell out the entire question for histograms, and the equations for formulae. Consider implementing better course navigation, and drop the 'show more' bar. Display a 5-year statistics history to provide a scale against which professors can compare themselves.
  • Chair Views: Two views will likely be needed: one for the faculty performance review committee, which requires more in-depth information per instructor, and one for the hiring committee, which requires a concise set of information. An option to export data externally would be a good add-on for both kinds of chairs.

It was decided that after the necessary changes have been made, a meeting of the stakeholders should be arranged.

2014-03-26 - Dean's Council - the web system which is the software basis for this project was demoed by Mirko Vucicevich (Science Computing) to the Deans; Chirag Gada participated as well. The discussion was largely about performing the student evaluations using the web system, as Science is doing this term. There is no commitment that Math Faculty do so; and there is no issue with using our existing processes to perform student evaluations and then importing to the web front-end.

2014-04-07 - Dan Vogel, Daniel Allen, Chirag Gada: As Ed Lank has been unavailable for a demo, we met with Dan Vogel (HCI Group) to review the user interface. He made a number of UI recommendations, which will be reasonably easy fixes.

2014-04-15 - Ed Lank, Daniel Allen, Chirag Gada: Ed Lank was available for a followup demo. He made a few additional suggestions for the user interface, but overall likes what he's seen.

2014-04-17 - Daniel Allen, Byron Weber Becker: Byron says the OAT access for this project authenticates via shared secrets; he would strongly prefer that OAT projects switch to a private-key based authentication model. This will be an additional factor in bringing this project to completion, and won't be finished before the term is over.
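The distinction Byron raises can be illustrated with a toy request-signing sketch. Everything here (the secret, the payload format) is hypothetical and not the actual OAT protocol; the point is that with an HMAC shared secret, any party able to verify a signature holds the same secret and can also forge one, whereas with public-key signatures (e.g. Ed25519) the verifier holds only the public key and cannot forge requests.

```python
import hashlib
import hmac

# Hypothetical shared secret; in the shared-secret model both client and
# server hold this value.
SECRET = b"shared-secret-example"

def sign(payload: bytes) -> str:
    """Client side: sign a request payload with the shared secret."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """Server side: recompute and compare in constant time.

    Note the asymmetry problem: because verification requires SECRET,
    the server could equally well have produced the signature itself.
    """
    return hmac.compare_digest(sign(payload), signature)

sig = sign(b"GET /courses?term=1149")
```

Moving to private-key authentication replaces `sign` with a signature under the client's private key and `verify` with a public-key check, so a compromise of the server no longer exposes credentials that can impersonate clients.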

2014-04-24 - Daniel Allen, Mirko Vucicevich: Mirko wants to refactor the back-end code to allow more flexibility between faculties' front-end code; he hopes to have code finished in May that he will share with us. It will include Chirag's contributions to the back-end; there will be a common code-base for all faculties; it can also switch to public-key authentication as Byron requested. When this is complete, Daniel can update Chirag's code to use the new back-end.

2014-04-24 - Daniel Allen, Chirag Gada: Project handover to Daniel. The project is stalled waiting for above code updates from Science Computing.

Spring 2014

2014-07-07 - Dean Goulden has indicated he would like to use Science's system for evaluations starting this term. Dan Brown asked whether this is straightforward; the timing is that evaluations should start in two weeks, Monday 21 July. He and David suggest we should use this for 1xx/2xx courses and any other volunteer instructors.

2014-07-07 - Daniel Allen sat in on 'evaluate' onboarding meeting for AHS. Science Computing (Mirko Vucicevich, his supervisor, 2 co-ops); AHS (Ron McCarville, AD Undergrad Studies, Terry Stewart, Director for IT, and two Dean's office admin staff). Overall, things look favourable for having this ready to run in time. AHS is proceeding to use 'evaluate' for all undergrad courses for S2014. Their start-date is the same as ours. Their existing evaluation procedure has 70% return rate. Process for both AHS and Math:

  1. We supply the start/end dates for evaluations (AHS's is July 21st - 30th; ours can be different).
  2. We supply a list of all course prefixes of relevance (Daniel can supply). Next week Sci Comp will give us back a list of all courses; we verify that none are missing and identify any we do NOT want to use with online evaluation.
  3. The text that appears on the front page, and the text that is emailed to students to say the evaluation period is open, will be shared with us for any necessary edits. It must be the same for all faculty using the system.
  4. We meet with them this Friday (the 11th) to address any questions we have; and on Wednesday the 16th to wrap up any last issues.

2014-07-14 - Daniel Allen met with Mirko to clarify status since Friday meeting didn't happen. Mirko confirmed that: he's received courses from Dean Goulden; Chirag's changes are the front-end they're using; CSV export is possible; will require work from their coder. Tomorrow's meeting is 10:30. Mirko also noted that in the Fall, he hopes to not touch the code; he's put up git repos for the front-end and back-end and would like CS to review it all.

2014-08-22 - Daniel Allen sat in on term wrapup meeting, which included Science Computing (Mirko Vucicevich, Paul Miskovsky, two co-ops), AHS (Ron McCarville, Terry Stewart, AHS Dean's Office Staff) and Engineering (Jack Turong, Engineering Computing). Cyntha Struthers was invited but couldn't attend from Math Dean's Office. Topics included: reviewing the response-rates; a desire for protocol changes to make sure instructors give class-time for evaluations; looking at a few new administrative features in the web UI including time-of-response; and a bit of discussion of standardizing a single score for each instructor/course - which we more or less agreed was difficult since each faculty has different questions and similar but not identical calculations for rating the instructors. Action items for Math (communicated to Cyntha Struthers and Jack Rehder): telling Mirko what usernames should have administrative access to the course-results data; and what date the results should be released for admin viewing and for faculty members to look up their own results.

2014-08-27 - Email from Jack Rehder: MUO evaluation books are still to be produced via the old method, and these are due to be produced for mid-September.

2014-08-28 - Email from Cyntha Struthers to Mirko: Jack Rehder and Cyntha Struthers are to get administrative access; the Math release date should be September 17th.

Fall 2014

2014-09-11 - Daniel Allen and Spondon Banday met with Mirko to clarify status. Mirko immediately wanted the math calculation for the Tompa score; Daniel Allen emailed him the details. Mirko identified the following back-end improvements that we would mutually find useful: adding an importer so Math can load our back-data; and adding Administrative Levels to the back-end code, so Math and other faculties could create the lists of courses to include for each term, and change survey contents, without going through Mirko. Mirko gave Daniel and Spondon access to the code repository; we agreed not to start working on the code until he's made his last batch of changes, after Wednesday 17 September.
Topic revision: r10 - 2015-09-11 - DanielAllen