
CS 854: Advanced Topics in Computer Systems

Experimental Performance Evaluation

Fall 2021

Course Schedule and Readings

UNDER CONSTRUCTION: I'm still adding more information, and many of the links on this page will not work yet.

If you have suggestions for papers you would like to read or topics you would like to cover, please let me know.

Paper Summaries

There are two different types of summaries for the different types of papers. Research papers require a research paper summary; other papers (e.g., survey papers) require a survey paper summary. The format for each can be found by following the links below.

Recorded Class Videos

The recorded videos of the course meetings from each week can be accessed using the link below.

Week 1: Tuesday Sept. 14, 2021 (Topic: Introduction)

Week 2: Tuesday Sept. 21, 2021 (Topic: Measuring WiFi Throughput)

Week 3: Tuesday Sept. 28, 2021 (Topic: Variability)

  1. Some Statistics / Background (watch some Khan Academy Videos)
    NOTE: For now, just go through Part I: Statistics on that page. A small worked example using these statistics appears after this list.
    [NO SUMMARY REQUIRED]
  2. How to read a paper
    [NO SUMMARIES REQUIRED]
    1. Paper Reading Check List,
      Sugih Jamin, jamin@eecs.umich.edu.
    2. How to Read a Research Paper,
      by Spencer Rugaber
    3. Efficient Reading of Papers in Science and Technology
      Prepared by Michael J. Hanson, Updated by Dylan J. McNamee, January 6, 2000.
    4. How to Read a Paper
      S. Keshav, ACM SIGCOMM Computer Communication Review, Volume 37, Number 3, July 2007.
  3. Conducting Repeatable Experiments and Fair Comparisons using 802.11n MIMO Networks
    [RESEARCH SUMMARY REQUIRED]
    Ali Abedi, Andrew Heard and Tim Brecht
    Operating Systems Review, 2015.
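
A minimal sketch (in Python, standard library only) of the kind of summary statistics covered in Part I of the Khan Academy material, applied to repeated measurements of a single experiment. The throughput values are made up for illustration.

    import math
    import statistics

    # Hypothetical throughput measurements (Mbps) from 10 repeated runs.
    runs = [94.1, 95.3, 93.8, 96.0, 94.7, 95.1, 93.5, 94.9, 95.6, 94.2]

    mean = statistics.mean(runs)
    stdev = statistics.stdev(runs)        # sample standard deviation
    sem = stdev / math.sqrt(len(runs))    # standard error of the mean

    # Rough 95% confidence interval using the normal z-value 1.96; with only
    # 10 samples a t-value (2.262 for 9 degrees of freedom) is more
    # appropriate -- exactly the kind of detail the readings cover.
    half_width = 1.96 * sem
    print(f"mean = {mean:.2f} Mbps, stdev = {stdev:.2f}")
    print(f"approx. 95% CI: [{mean - half_width:.2f}, {mean + half_width:.2f}]")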

Week 4: Tuesday Oct. 5, 2021 (Topic: Background and Motivation)

  1. From Repeatability to Reproducibility and Corroboration
    (Survey Paper Summary required -- skip the last question, i.e., just cover questions 1-3.)
    Dror Feitelson
    Operating Systems Review, 2015.
  2. Producing Wrong Data Without Doing Anything Obviously Wrong!
    (Research Paper Summary required)
    Todd Mytkowicz, Amer Diwan, Matthias Hauswirth, and Peter F. Sweeney
    ASPLOS, 2009. (A small sketch of the paper's measurement-bias setup appears below.)
  3. Come up with a potential group and potential project. See more details in the CHECKLIST below.
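
A minimal sketch (in Python, on a Unix-like system) of the kind of experiment behind "Producing Wrong Data Without Doing Anything Obviously Wrong!": the paper shows that merely changing the size of the UNIX environment can shift a program's measured performance. The child command (/bin/ls) and padding sizes below are placeholders; a real study would use the benchmark of interest and many repeated runs.

    import os
    import subprocess
    import time

    for pad_bytes in (0, 1024, 4096):
        env = dict(os.environ)
        env["PADDING"] = "x" * pad_bytes   # grow the environment block
        start = time.perf_counter()
        subprocess.run(["/bin/ls", "/"], env=env, stdout=subprocess.DEVNULL)
        elapsed = time.perf_counter() - start
        print(f"env padding {pad_bytes:5d} bytes: {elapsed * 1000:.2f} ms")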

Reading Week: Tuesday Oct. 12, 2021

Week 5: Tuesday Oct. 19, 2021 (Topic: Workloads and Benchmarks)

  1. Workloads (Creation and Use),
    (No Summary required)
    Alan Jay Smith,
    CACM, November, 2007.
  2. httperf -- a tool for measuring web server performance
    (No Summary required; a sample httperf run appears after this list),
    David Mosberger and Tai Jin,
    ACM SIGMETRICS Performance Evaluation Review (PER), Volume 26, Issue 3, Dec. 1998.
  3. Methodologies for Generating HTTP Streaming Video Workloads to Evaluate Web Server Performance,
    (Research Paper Summary required)
    Jim Summers, Tim Brecht, Derek Eager, and Bernard Wong,
    5th Annual International Systems and Storage Conference (SYSTOR), 2012.
  4. Characterizing the Workload of a Netflix Streaming Video Server,
    (No Summary required)
    Jim Summers, Tim Brecht, Derek Eager and Alex Gutarin,
    IEEE International Symposium on Workload Characterization (IISWC), 2016.
    You are not required to do a really detailed reading of this paper and don't need to understand all of the results. Focus on what is being done, why, and how. Read the questions below first and ensure that you read the paper well enough to answer all of them.
  5. Answer the following questions about this week's papers.
    1. Why is it important to understand workloads?
    2. Do you think httperf is a good tool or not? Justify/explain your answer.
    3. What is the difference between a workload generator, a workload and a benchmark?
    4. Explain why you think you've been asked to read (in order) papers 2, 3, and 4.
    5. Describe *** exactly two *** (no more and no less) ideas for research that could be conducted as followup work to the "Characterizing the Workload of a Netflix Streaming Video Server" paper.
    The answers to all questions must fit on a single page (I expect most will be much shorter). Your solution must be submitted to Crowdmark as a PDF file.
  6. Course Project Proposal: Submit a PDF slide presentation for your course project proposal. See the information on the course project page for details and please follow the instructions there.
    **** NOTE: This is due Wed. Oct. 20, 11:59 pm Eastern (just before midnight).
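
A minimal sketch of an httperf run, driven from Python so each option can be commented. The server name and load parameters are placeholders, not values from the paper; an httperf binary must be on the PATH.

    import subprocess

    cmd = [
        "httperf",
        "--server", "myserver.example.com",  # web server under test (placeholder)
        "--port", "80",
        "--uri", "/index.html",   # object requested on each connection
        "--rate", "100",          # new connections per second (the offered load)
        "--num-conns", "1000",    # total connections to attempt
        "--num-calls", "1",       # HTTP requests per connection
        "--timeout", "5",         # seconds before a request counts as an error
    ]
    print(" ".join(cmd))
    subprocess.run(cmd)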

Week 6: Tuesday Oct. 26, 2021 (Topic: Conducting Evaluations and Common Mistakes)

  1. The Truth, The Whole Truth, and Nothing But the Truth: A Pragmatic Guide to Assessing Empirical Evaluations
    (Survey Paper Summary required -- skip the last question, i.e., just cover questions 1-3.)
    Blackburn et al.,
    ACM Transactions on Programming Languages and Systems (TOPLAS), 2016.
  2. Systems Benchmarking Crimes
    (No summary required)
    Gernot Heiser
  3. Watch this video until the 19:19 mark (feel free to watch more if you wish).
    Common Mistakes and How to Avoid Them by Raj Jain.
    (No summary required)
    Here is a copy of the slides used in the video (they approximately match the video's slides):
    Common Mistakes and How to Avoid Them

    The lecture and slides are based on Raj Jain's textbook:
    The Art of Computer Systems Performance Analysis: Techniques for Experimental Design, Measurement, Simulation, and Modeling, Wiley-Interscience, New York, NY, April 1991.
    A small example of one technique from this material appears after this list.
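
A minimal sketch (in Python) of one technique from Jain's text for avoiding a common mistake: declaring one system faster based on means alone. With paired observations, examine the confidence interval of the differences; if it includes zero, the data do not support a conclusion. The measurements are made up for illustration.

    import math
    import statistics

    system_a = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3]   # e.g., response times (ms)
    system_b = [9.9, 10.0, 10.2, 9.7, 10.1, 9.8]

    diffs = [a - b for a, b in zip(system_a, system_b)]
    mean_d = statistics.mean(diffs)
    sem = statistics.stdev(diffs) / math.sqrt(len(diffs))
    t = 2.571   # t-value for 95% confidence with 5 degrees of freedom
    lo, hi = mean_d - t * sem, mean_d + t * sem
    print(f"95% CI of the difference: [{lo:.3f}, {hi:.3f}]")
    print("not significant" if lo <= 0 <= hi else "significant difference")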

Week 7: Tuesday Nov. 2, 2021 (Topic: Miscellaneous)

  1. Learning in situ: a randomized experiment in video streaming
    (Research Paper Summary required; also answer the questions below)
    Francis Y. Yan, Hudson Ayers, Chenzhi Zhu, Sadjad Fouladi, James Hong, Keyi Zhang, Philip Levis, and Keith Winstein, NSDI 2020.
    If you follow the link below you can also get access to the talk slides and a video of the talk.
    Link to the NSDI page that contains talk slides and the video.
    Be sure that you read the paper. Watching the video is not required, but you may find it useful for understanding the paper.
  2. This is not a separate paper but a set of questions to answer about the paper above. This will be completed and handed in using a separate document.
    1. How would you rate the performance evaluations in this paper?
      Be sure to state what is good and bad about the evaluations and be clear which evaluations you are referring to.
    2. How would you rate the paper in terms of the claims it is making? Describe if you think the claim(s) made are sound or not and explain your reasoning. Be sure to describe which claim(s) you are referring to.

    It is not required, but it is fine if there is some overlap with some of the points in your summary when answering these questions.
    Your answers must fit on one page (single sided). Use the same font size and margin restrictions as for summaries. Produce a PDF file to be uploaded via Crowdmark. Using point form is fine.
  3. An introduction to Docker for reproducible research
    (Survey Paper Summary required; a minimal Dockerfile sketch appears after this list)
    Carl Boettiger
    Operating Systems Review, 2015.
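
A minimal Dockerfile sketch of the approach the Boettiger paper advocates: packaging code together with a pinned environment so an analysis can be rerun elsewhere. The base image, package versions, and script name are placeholders, not taken from the paper.

    FROM python:3.9-slim

    # Pin dependency versions so the environment is reconstructible later.
    RUN pip install numpy==1.21.2 matplotlib==3.4.3

    # Copy the experiment code into the image and make it the default command.
    COPY analysis.py /work/analysis.py
    WORKDIR /work
    CMD ["python", "analysis.py"]

Building the image (docker build -t myexperiment .) and then running it (docker run myexperiment) reruns the full analysis in the same environment every time.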

Week 8: Tuesday Nov. 9, 2021 (Topic: Cloud Computing)

  1. Conducting Repeatable Experiments in Highly Variable Cloud Computing Environments,
    (Write a Research Paper Summary)
    Ali Abedi and Tim Brecht, 8th ACM/SPEC International Conference on Performance Engineering (ICPE), 2017. (A sketch of the paper's randomized trial ordering appears after this list.)
  2. Is Big Data Performance Reproducible in Modern Cloud Networks?
    (Write a Research Paper Summary)
    Alexandru Uta, Alexandru Custura, Dmitry Duplyakin, Ivo Jimenez, Jan Rellermeyer, Carlos Maltzahn, Robert Ricci, and Alexandru Iosup,
    NSDI, 2020.
    Please read the paper. If you are interested, you can also get access to the talk slides and a video of the talk.
    Link to the NSDI page that contains talk slides and the video.
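
A minimal sketch (in Python) of the randomized, interleaved trial ordering the Abedi and Brecht paper advocates: rather than running all trials of alternative A and then all trials of B, run the alternatives in a freshly randomized order within each round so that slowly changing interference in the cloud affects all alternatives roughly equally. This is an illustration, not the authors' code; run_trial() is a placeholder for the actual experiment.

    import random

    def run_trial(alternative):
        # Placeholder: run one trial of this configuration and return
        # the measured value.
        ...

    alternatives = ["A", "B", "C"]
    results = {alt: [] for alt in alternatives}

    for _ in range(10):                  # 10 rounds of trials
        order = alternatives[:]
        random.shuffle(order)            # fresh random order each round
        for alt in order:
            results[alt].append(run_trial(alt))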

Week 9: Tuesday Nov. 16, 2021 (Topic: Reproducibility in Machine Learning)

  1. Improving Reproducibility in Machine Learning Research (A Report from the NeurIPS 2019 Reproducibility Program)
    (No summary required. Instead, answer the questions below about this paper)
    Joelle Pineau, Philippe Vincent-Lamarre, Koustuv Sinha, Vincent Larivière, Alina Beygelzimer, Florence d'Alché-Buc, Emily Fox, Hugo Larochelle

    Answer the following questions about this paper.

    1. In your own words, briefly (in one paragraph) summarize the paper.
    2. What, if anything, new did you learn from this paper?
    3. Is there anything from this paper that you think you can apply to your course project? If so, briefly describe what applies and why it is relevant to your project.
    4. Provide any other points that you would like to make about the paper (optional).
  2. The NIPS experiment
    (No summary required. Instead, answer the questions below about this posting)
    Eric Price, Dec 15, 2014,
    If you know of or find a better source (starting point), please let me know.
    This is the experiment NIPS conducted to examine the consistency of its reviewing process. You can find a starting point for reading below.
    Answer the following questions about this paper.
    1. What is your understanding and/or interpretation of the results of this experiment?
    2. Do you think that this will or will not influence how you think about paper acceptances and rejections? Justify and/or explain your answer.
  3. Choose a tool that your group will describe for Week 10.
    See the information about Week 10 for more information about what will be required.
    This is being done via Piazza.
    • Working together as a group (using the same groups as for the course project), choose a tool that is used to help measure and/or understand performance, and create a slide deck (presentation) describing that tool and what it can be used for. If possible, include a small example of how it can be used and an example of its output.
    • Examples of some possible tools include but are not limited to: gprof, Instruments, vmstat, iostat, oprofile, vtune, TAU, JMeter, iperf
    • NVIDIA seems to have several GPU profiling tools.
    • Here is a list of performance analysis tools
    • This seems to refer to some tools for profiling Python programs (a small cProfile example appears below).
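
For example, here is a minimal sketch using cProfile, one of the profilers in Python's standard library (the workload function is a placeholder):

    import cProfile
    import pstats

    def workload():
        # Placeholder: the code whose performance you want to understand.
        return sum(i * i for i in range(1_000_000))

    profiler = cProfile.Profile()
    profiler.enable()
    workload()
    profiler.disable()

    # Show the 10 functions with the largest cumulative time.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)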

Week 10: Tuesday Nov. 23, 2021 (Topic: Performance Tools)

  1. As a group, create a slide deck describing the tool you have chosen to present and upload them to the Piazza thread about the tools.
  2. As individuals, pick and read two slide decks and be prepared to discuss and ask questions about those tools in class.

Week 11: Tuesday Nov. 30, 2021 (Topic: Dynamic Instrumentation and Miscellaneous)

  1. Dynamic Instrumentation of Production Systems,
    (Research Paper Summary required)
    Bryan M. Cantrill, Michael W. Shapiro and Adam H. Leventhal
    USENIX ATC 2004. (A sample DTrace one-liner appears after this list.)

    HIGHLY RECOMMENDED READING/LISTENING:
    Bryan Cantrill, Keynote Talk at USENIX ATC 2016.
    A Wardrobe for the Emperor: Stitching Practical Bias into Systems Software Research
    Follow the link to the slides and/or the audio of the talk.
    This talk presents some interesting views from one of the authors of DTrace on USENIX, conferences, the review process, program committee meetings, and publishing. I would recommend listening to the audio while following along with the slides (the slides are interesting but the audio is much more illuminating).

  2. A Few Short Lessons and Tips,
    (No Summary Required. See notes below)
    Tim Brecht
    The slides for this component are not meant to be stand-alone, and I will go over them during class. However, to help the class discussion, spend some time (e.g., 30-60 minutes) reading over the slides and trying to understand what you can from them. See the checklist below for more of what is expected.
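
To get a feel for what the DTrace paper describes, here is a classic one-liner from the DTrace documentation (it must be run as root on a system that has DTrace, e.g., Solaris, macOS, or FreeBSD); it counts system calls by the name of the process issuing them:

    dtrace -n 'syscall:::entry { @counts[execname] = count(); }'

The probe fires on entry to every system call, and the aggregation (@counts) is printed when the script exits -- dynamic instrumentation of a running system with no recompilation, which is the paper's central idea.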

Week 12: Tuesday Dec. 7, 2021 (Topic: Project Paper Submission and Wrapup)

NOTE THAT THERE ARE MULTIPLE DIFFERENT DUE DATES THIS WEEK AS NOTED BELOW.

Please have a look at the form reviewers will be using to provide information about their view of your paper (this may be modified somewhat for the final form).
Review Form in Text Format
Note that this form is for completing a review offline. There is a more user-friendly alternative available when you do your review online.

Note that you can continue to submit new versions of your paper and abstract right up to the deadlines (i.e., your abstract can change after your initial submission).

I suggest that you submit early and update your paper as you near the deadline.

Saturday December 11, 2021 (Topic: Paper Reviews)

Tuesday Dec. 14, 2021 (Topic: Program Committee Meeting)

After submitting your review for each paper, and well in advance of the program committee meeting, you should:

Last modified: Tue Nov 30 17:47:37 EST 2021