Course Evaluation instructions for end-users

These instructions will be emailed to end-users of the data. Clients have so far been fairly clear in their desire for software that doesn't need much documentation.

There are two sets of instructions:

  • for end-users who access the data from the command-line (members of unix group cs-eval);
  • for end-users who are emailed the data once per term.

Email Instructions for command-line users

There is now a web-based interface for the course evaluation data, at: https://evaluate.uwaterloo.ca/

It presents the same back-end data as the command-line tools, with more detail explained explicitly at the cost of more pointing and clicking. Your web access is limited to CS (whereas command-line access covers the whole faculty). The web UI should be fairly straightforward; if there are any questions you can refer to: https://cs.uwaterloo.ca/cscf/teaching/evaluate/administrators.shtml

The Course Evaluation results ("Tompa Scores") are available via the command line.

Using the interface:

  1. log in to linux.cs.uwaterloo.ca and add an alias that will execute ~cs-eval/evals/report.py
  2. run the command report.py with various options
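
For example, step 1 might add a line like the following to your shell startup file (a sketch; the alias name is arbitrary, and the path is the shared report script mentioned above):

```shell
# Add to ~/.bashrc (or your shell's equivalent) on linux.cs.uwaterloo.ca;
# this points at the shared report script so you can invoke it by name.
alias report.py='~cs-eval/evals/report.py'
```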

Examples

report.py --term W2012-F2013 - returns all undergrad evaluations for Winter 2012 to Fall 2013

report.py --term 2010.1 --subject "CS|SE" - returns CS and SE evaluations for Winter 2010

report.py --userid jsmith --tag grad - returns all grad courses taught by userid jsmith

report.py --userid jsmith - returns all undergrad courses taught by userid jsmith

CSV Output

The CSV output columns are: 'course_id', 'term_id', 'instructor_name', 'userid', 'course_label', 'section', 'nresp', 'rresp', 'percent_classes', 'hrs_wk', 'prep', 'delivery', 'effect', 'summary', 'course_avg_prep', 'course_avg_delivery', 'course_avg_effect', 'course_avg_summary', 'course_count', 'prof_avg_prep', 'prof_avg_delivery', 'prof_avg_effect', 'prof_avg_summary', 'prof_count', 'tag'

  • course_id is a unique identifier matching the same course over all terms regardless of the cross-listings, as determined by the maintainers of Quest data.
  • term_id is returned in the Quest format. The first digit is 0 for years before 2000 and 1 for 2000 onward. The next two digits are the last two digits of the year. The last digit is the month.
  • Columns with _avg_ are over the previous 15 terms (5 years).
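
As a sketch of the term_id rules above (decode_term_id is a hypothetical helper for illustration, not part of report.py):

```python
def decode_term_id(term_id: str) -> tuple[int, int]:
    """Return (year, month) for a 4-digit Quest term_id string."""
    century = 1900 if term_id[0] == "0" else 2000  # leading digit: 0 = 19xx, 1 = 20xx
    year = century + int(term_id[1:3])             # middle two digits: last two of the year
    month = int(term_id[3])                        # final digit: the month
    return year, month

print(decode_term_id("1105"))  # → (2010, 5), i.e. May 2010
print(decode_term_id("0969"))  # → (1996, 9), i.e. September 1996
```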

Notes on limitations

  • The new system has undergrad data going back to Fall 2001. Grad data goes back to 2009.

Help page

The following help text may be viewed by running report.py -h:
usage: report.py [-h] [-v] [--precision {1,2,3,4,5}]
                 [--tag {grad,all,not-grad}] [--term TERM]
                 [--avg_terms [AVG_TERMS]] [--userid USERID] [--out OUT]
                 [--subject SUBJECT] [--top_instructors]

Report evaluation data for a given term, userid, name, and/or subject.

optional arguments:
  -h, --help            show this help message and exit
  -v, --verbose         display verbose output
  --precision {1,2,3,4,5}
                        precision of results (1-5, default 2)
  --tag {grad,all,not-grad}
                        tag to identify graduate or online data. Accepts a
                        single value or negated value. Default 'not-grad' for
                        not-grad courses. Note that online tag is not
                        currently used.

Action:
  Supply one or more of the following required actions:

  --term TERM           term number (eg., S2010 or 2010.2 or 2010.5 or 1105
                        for May 2010) or range (eg., S2009-F2012)
  --avg_terms [AVG_TERMS]
                        the number of terms to calculate the avg scores. The
                        highest number of terms would be 15 (5 years). The
                        Default is 15. (eg., 1,2,5,7,14)
  --userid USERID       instructor userid
  --out OUT             the format of the output. Printed in either CSV or
                        Table. Default csv
  --subject SUBJECT     course subject (eg., 'AMATH|CS|MATH') - regex format
                        accepted; special parsing so 'MATH' does not include
                        AMATH/PMATH and 'CO' does not include COMM
  --top_instructors     Generate a report for top instructors with 70%
                        response or 25 replies, and all ratings at least 4.1

If you have any questions, please feel free to drop me an email.

Email Instructions for users emailed the data once per term

The following was emailed to the Math Chairs, who receive the results once a term.

The Course Evaluation results ("Tompa Scores") will be emailed shortly; as mentioned previously, they will be in CSV format, with a descriptive header as listed below.

Improvements to note:

  • input data is normalized against the campus Quest database, as a second source besides hand-entered MUO data. We have an audit trail and a unique course_id regardless of whether the course changed numbers over time.
  • Fall 2013 first-year courses have been evaluated online instead of on paper; for future use, we have a tag field to note these (and any additional courses) as 'online' such that we can track whether their evaluations are different compared to paper.

The CSV output columns are: 'course_id', 'term_id', 'instructor_name', 'userid', 'course_label', 'nresp', 'rresp', 'percent_classes', 'hrs_wk', 'prep', 'delivery', 'effect', 'summary', 'course_avg_prep', 'course_avg_delivery', 'course_avg_effect', 'course_avg_summary', 'course_count', 'prof_avg_prep', 'prof_avg_delivery', 'prof_avg_effect', 'prof_avg_summary', 'prof_count', 'tag'

  • course_id is a unique identifier matching the same course over all terms regardless of the cross-listings, as determined by the maintainers of Quest data.
  • term_id is returned in the Quest format. The first digit is 0 for years before 2000 and 1 for 2000 onward. The next two digits are the last two digits of the year. The last digit is the month.
  • Columns with _avg_ are over the previous 15 terms (5 years).

Questions? Feel free to drop me an email.

-- DanielAllen - 2014-01-30
