Related Pages

This document contains general instructions for adding new projects (assignments) to Marmoset and for writing test cases for Marmoset. As a caveat, some of the methods and information in this document may be antiquated.

Omar also maintains MarmosetCourseConfiguration covering UW-specific setup of Marmoset and running the buildservers.


Marmoset is a testing suite operated through a web interface (for both ISG and students); it provides an easy way to create programming problems and assign marks to each test case. The advantage of Marmoset over other test suites is that it gives students instant test feedback letting them know how they did. It is also possible to use Marmoset as a submission bank with no automatic testing if you would like to throw in written problems as well (students can submit .txt or equivalent files). Student marks are also easily downloadable from Marmoset into convenient .csv files.

The basic mechanism behind Marmoset is to have "public tests" and "release tests" (and sometimes secret tests). Public tests can be thought of as sanity checks for students. These tests are "free" and their results are instantly visible; students will automatically see these for every submission. Release tests require students to spend a token (these are limited and take a long time to recharge) to "release test" their submission. In general, most of the marks are allocated to release tests. Secret tests behave the same way as release tests; however, students do not get to see the results of these tests (hence they are 'secret').

It is worth noting that Marmoset was not created by UW and has been modified from its original version to work with the UW systems. That being said, this internal documentation should be trusted over documentation found elsewhere on the internet.

How Marmoset Works:


  • Tutors upload test setup and canonical solution via web interface
  • Canonical solution and test setup are loaded into the database
  • Assignment is made visible to students (via the web interface)
  • Students submit assignment via web interface
  • Assignment gets loaded into database and copied to buildservers for testing
    • Auxiliary files are copied to the buildserver from course account and cs_build
    • Student submission is compiled
    • Test setup gets run against the student's submission as csNNNt (see the note on csNNNt below)
    • Test results are returned to the database
  • Students receive feedback via web interface
  • Tutors may download marks via web interface in csv format
csNNNt is designed as a 'sandbox' account to test student submissions. Its purpose is to minimize problems during execution (e.g. security issues, malicious code, etc.). The goal is to give csNNNt just sufficient privileges to perform submission testing and to make it as disjoint as possible from the csNNN course account (we do not want people probing any sensitive material in csNNN).

Getting Marmoset Setup for your Course:

To get Marmoset for your course, you should contact CSCF who can set up an instance of Marmoset for you for the term. Typically, whoever contacted CSCF will be added as an instructor to the instance.

Once this is set up, you will need to add all the tutors and instructors with 'instructor' privileges. You can do this by navigating to the home page for Marmoset and clicking 'Instructor View'. Then click on "Register an Instructor for this course using a web interface". Enter their first and last name, and fill the remaining fields with their UW userid (e.g. kabob56).

Students will also need to be added with 'student' privileges. This can be done manually in the same way instructors are registered if you need to add only a few students. To add lots of people at once, use the "Register students or groups for this course using a CSV file" link.

Creating a new project:

The following videos illustrate the basics outlined below: (0:00 - 1:47) (4:49 - 7:25)

You can create a new project by logging into Marmoset as an instructor, navigating to the main page, and clicking "create a new project". Fill the fields in as desired.


  • Project Number should be alphanumeric. A quote (") character in it will 'break' the html source (viewing the html source will reveal why: it causes quote mismatching).
  • If you do not want to accept lates, make the on-time and late deadlines the same.
  • Project title supports html formatting. You could have an image for the title if you wanted.
  • Description is usually just a list of the files to submit (e.g. "Files to submit: file1.c file2.c . . .")
  • Stack trace policy is usually left as "the entire stack trace". The test setup is what is currently used to control and format output.
  • The rest of the fields can be left as default which are respectively (Best, 2, 3, 12, Upload and test, constant, 0, 0.0).
  • If you want to change the release token policy, you can also do that here.
Once you have created a project, you will need to upload a Test Setup and a Canonical Solution. The Canonical Solution is simply a submission which passes the Test Setup (described below).

After these two things have been uploaded and the Canonical Solution passes the Test Setup, you can "Activate" the Test Setup (this allows people with student privileges to submit and have their submissions tested by the Test Setup). However, until you make the project "Visible", students will not be able to see the project. WARNING: Once you make a project visible, it cannot be made 'invisible' again. Be sure to thoroughly verify that your test setup and canonical solution are correct before doing this!

Configuring the (Static) Test Setup

This method is DEPRECATED and the Dynamic Test Setup is preferred (see below). However, understanding this method is key for understanding the dynamic method.

At bare minimum, Marmoset requires every test setup to include:

  • A test.properties file
  • A Makefile
  • A bash script for every Marmoset test (i.e. public, release, or secret) you want to perform
These things should be zipped into an archive (since there are multiple files) and submitted as one package. BE CAREFUL about hidden files in the directory you zip up, as they might confuse Marmoset!
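For instance, a setup with two public tests and one release test could be packaged like this (the file names are illustrative, and the scratch directory is only for demonstration):

```shell
cd "$(mktemp -d)"     # scratch directory, just for this demonstration
touch test.properties Makefile public0 public1 release0   # stand-in setup files
touch .DS_Store       # a hidden file we do NOT want shipped

# List the files explicitly rather than zipping the whole directory,
# so hidden junk stays out of the archive Marmoset will unpack.
zip testsetup.zip test.properties Makefile public0 public1 release0
```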

*** test.properties ***

This file contains metadata about how to build and test a submission. It must include the following lines (and these are sufficient):

test.class.public=(space separated list of bash scripts which correspond to public tests)
test.class.release=(space separated list of bash scripts which correspond to release tests)
test.class.secret=(space separated list of bash scripts which correspond to secret tests)
build.make.command=(usually /usr/bin/make when the buildservers are set up on the linux servers)
build.make.file=(if your makefile has a name other than makefile or Makefile you can specify it here)


  • If there are no secret tests, you may omit that line (The same is true if there are no public/release tests).
  • For the make command, if /usr/bin/make fails, contact CSCF for the correct path
Sample file contents:

test.class.public=public0 public1
test.class.release=release0 release1 release2
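A fuller sample that also pins down the build command could look like this (the Makefile name and make path shown are the usual defaults):

test.class.public=public0 public1
test.class.release=release0 release1 release2
build.make.command=/usr/bin/make
build.make.file=Makefile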

*** Makefile ***

The Makefile with respect to test suites can be thought of as a set of instructions for how to compile or pre-process a student's submission. Currently, it is processed using GNU Make, located at /usr/bin/make. More can be read about GNU Make and Makefiles in the GNU Make manual.

Here is a very simple example of a Makefile (see the template links below for more complex Makefile setups):

TEST_INPUT_FILE = file_1.c file_2.c ... file_n.c
CHECK_FILE = /u/cs_build/bin/marmoset_test_tools/checkfiles

all:
	@$(CHECK_FILE) $(TEST_INPUT_FILE)
	@chmod u+x *

What it does:

  • Line 1: Define a variable TEST_INPUT_FILE to be a space separated list of files a student is expected to submit.
  • Line 2: Define a variable CHECK_FILE to be the path to a script called 'checkfiles'

  • The lines after 'all:' do the following (respectively):
    • Check if a student has submitted the expected files
    • Add execute permissions for 'user' to all files in the current directory (in this case, 'user' will be csNNNt where NNN is your course number)
Note: The @ in front of a command prevents it from being echoed when it is executed.
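If the course-provided checkfiles helper is not at hand, the same file-presence check can be sketched directly in shell. This stand-in is purely illustrative (the function name is made up); it is not the real checkfiles script:

```shell
# check_submission FILE... -- succeed only if every expected file exists.
# Illustrative stand-in for the course-provided 'checkfiles' helper.
check_submission() {
    missing=0
    for f in "$@"; do
        if [ ! -f "$f" ]; then
            echo "Missing required file: $f"
            missing=1
        fi
    done
    return $missing
}
```

Saved as a script and called from the all: rule with $(TEST_INPUT_FILE) as its arguments, it would play the same role as the $(CHECK_FILE) line above.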

*** public/release/secret test bash scripts ***

Since these tests are bash scripts, there is huge flexibility for configuring testing and test output. At bare minimum, the script should return one of the following exit codes to Marmoset:

exit code   outcome   meaning
0           Correct   student's code passed the test
1           Error     student's code crashed
2           Timeout   student's code ran too long
3           Failed    student's code produced the wrong answer

A very simple test which checks if a student's submission (called sub.txt) contains the text "hello" could look like the following:

export CORRECT=0
export ERROR=1
export TIMEOUT=2
export WRONG=3

grep 'hello' sub.txt > /dev/null
RET=$?

if [ $RET == 0 ] ; then
   echo "Your submission contains 'hello' - good job!"
   exit $CORRECT
else
   echo "Your submission does not contain 'hello'.  Try again."
   exit $WRONG
fi


  • All output generated by the bash script (echo/head/tail etc.) is shown to students via the Marmoset interface
  • It's worth noting that the variable $0 contains the name of the bash script during its execution (this can be put to use by things such as RunC)
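A slightly more realistic test script runs the student's program under a time cap and diffs its output against an expected answer, mapping each outcome to the exit codes above. Everything here is illustrative; the 'submission' and data files are fabricated inline only so the sketch is self-contained (a real test would use the files already in the working folder):

```shell
CORRECT=0; ERROR=1; TIMEOUT=2; WRONG=3

# Fabricate a trivial 'submission' plus test data (illustration only).
printf '#!/bin/sh\ncat\n' > student_prog   # stand-in program: echoes stdin
chmod +x student_prog
printf 'hello\n' > test01.in
printf 'hello\n' > expected.out

# 'timeout' exits with 124 if the command is still running after 10s.
timeout 10 ./student_prog < test01.in > actual.out 2> /dev/null
RET=$?

if [ $RET -eq 124 ] ; then
   echo "Your program ran too long on test01."
   exit $TIMEOUT
elif [ $RET -ne 0 ] ; then
   echo "Your program crashed on test01 (exit code $RET)."
   exit $ERROR
elif diff -q actual.out expected.out > /dev/null ; then
   echo "test01 passed."
   exit $CORRECT
else
   echo "Wrong output on test01."
   exit $WRONG
fi
```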
*** Order of Operations when Testing ***

  • 1. Copies the student's submission to a working folder
  • 2. Copies the contents of the test setup into the same working folder (OVERWRITING any files with the same name)
  • 3. Parses the test.properties file for information
  • 4. Runs the Makefile as specified by the test.properties file
  • 5. Runs each bash script listed in the test.properties file (in listed order: public tests first, then release, then secret)
  • 6. Test results are returned to Marmoset and entered in the database
  • 7. Students may see their test results via Marmoset
  • Students can't tell, but ALL tests (public/release/etc.) are run on every submission and students are auto-graded over all tests. Release tests are only made visible to students by "release testing".
  • NEVER leave solutions in the test setups (the solution would overwrite the student's submission!)

Configuring the (Dynamic) Test Setup

The dynamic test setup is in principle the same as the static setup, but instead of zipping the test configuration and submitting it, a 'stub' is submitted instead. All the 'stub' contains is the test.properties file with some subtle modifications:

  • build.make.command (in test.properties) is changed from /usr/bin/make to a special "Makefile" called "dynamic_test"
  • build.make.file (in test.properties) is set to the file path of the actual test setup located on the course account (or the csNNNt equivalent)
What the dynamic setup does is create a test setup on the course account (or the csNNNt equivalent) that is effectively "live"; any changes to the test setup on the course account are immediately reflected in future tests run by Marmoset (without Marmoset detecting any changes).
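A stub's test.properties might then look something like this (both paths are hypothetical; the real locations of dynamic_test and of your course's setup will differ):

test.class.public=public0 public1
test.class.release=release0 release1
build.make.command=/u/cs_build/bin/dynamic_test
build.make.file=/u/csNNNt/marmoset/a01/Makefile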



Advantages:

  • If something needs to be changed in the test setup, it can be changed immediately without having to upload a new test setup. This is very useful because reuploading a test setup causes Marmoset to retest all submissions for that project, which has the potential to cause a heavy load on the buildservers.
  • Slightly easier and less cumbersome to create and test (there are even scripts to create these stubs for you!).

Disadvantages:

  • This method can be hazardous; it is easy to accidentally tamper with the test setup in the course account and break it.
  • Changes to the live setup may cause students to get different marks for the same submission; this could cause confusion.

Marmoset ISG Standards

To prevent unnecessary stress on the buildservers and to keep systems reasonably secure, ISG imposes certain limitations and security precautions.

Acceptable Limits for Submissions

Be aware that although the software that makes up Marmoset is maintained and run by CSCF, you are responsible for using it reasonably and within the resource usage limits and guidelines established campus-wide. For example, your tests have the potential to cause an unreasonable load increase on hosts running Marmoset buildservers, which violates established policies. So please be aware of this, especially around deadlines for your assignments/projects.

While it's difficult to define "acceptable" limits for student submissions, it has been adopted that 256-512MB of RAM+stack, a 1MB maximum file size, and 10 seconds of execution time are "reasonable". To impose these, we often use "wrapper" programs which set up an environment with the imposed limitations and execute student submissions inside it. An example of this is cwrapper, which appears in sample templates and (hopefully) in any test setups from previous terms.

In any case, your tests must use custom-built timeout commands!


  • Many courses over the last couple years have modified cwrapper's source to impose different limitations on submissions (sometimes less, sometimes more). It isn't clear which cwrapper is being referenced when it's called.
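If no cwrapper is at hand, a rough approximation of those limits can be had from shell built-ins. This is only a sketch (the function name is made up), and cwrapper does more than this:

```shell
# limit_run COMMAND [ARGS...] -- run a command under rough ISG-style caps.
# The body runs in a subshell so the ulimits do not leak into the caller.
limit_run() (
    ulimit -v $((512 * 1024))   # virtual memory cap: 512 MB (-v takes KB)
    ulimit -f 1024              # file size cap: 1 MB (bash counts 1024-byte blocks)
    timeout 10 "$@"             # wall-clock cap: 10 s; exits 124 on expiry
)
```

A test would then call, e.g., limit_run ./student_prog < test01.in > actual.out.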

csNNNt Accounts and Security

As a security measure, csNNNt accounts exist as 'sandbox' accounts. The idea is to keep these disjoint from the csNNN course accounts to safeguard against malicious code or probing (by having all Marmoset tests run as the csNNNt account). The downside of this is that the test setups and solutions need to be kept on, or duplicated to, the csNNNt account, which can be an inconvenience. Some scripts may exist (and if they don't, they are easy to write) which will duplicate the test setups for you.

If your test setup is not automatically run as csNNNt (which it probably is), then your tests MUST use sudo to execute student submissions as csNNNt.

  • sudo -u csNNNt command
The above invocation runs command as csNNNt.

Monitoring Buildservers

Sometimes buildservers will behave strangely or "hang". This is most likely to occur during submission deadlines. If you suspect unusual behavior, you can monitor the buildservers by logging onto Marmoset and following one of the two links (depending on whether your course is on marmoset or marmoset2).

To restart hanging buildservers follow the instructions here:


Standardize cwrapper and move it to cs_build.


Replace cwrapper and lockout.o with 'Box'. Box was written by Martin Mareš and has been used extensively for programming contests. No one has formally followed up on this yet, but with some testing and template creation, it may replace cwrapper and lockout.o.

More can be read here:

A slightly modified version of Box exists in the CS145 course account:
  • ~cs145/sandbox/eval/box

Miscellaneous and Tips

In case students submit a Makefile to be compiled, the Makefile bundled with the test setup should have a cryptic name. Recall that files in the test setup will overwrite any files submitted by students! If you do this, it's important to specify the name of your cryptically named Makefile using the build.make.file line in test.properties.
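For instance, if the bundled Makefile were renamed to something students are unlikely to submit, the relevant test.properties lines might read (the cryptic name here is made up):

build.make.command=/usr/bin/make
build.make.file=Makefile.isg_a1_x7q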


Avoid creating "Upload-only" Marmoset projects, even if the "tests" do nothing of value with a submission.

Whenever something unexpected happens, determine whether the tools you are using or servers you are relying on have log files. Open them up and look through them. Most logs have time-based entries to help you home in on error conditions. The log entries usually have integer identifiers as well, to simplify searching for problems if a particular entity (e.g. a student's submission) is experiencing problems.

  • The buildservers have logs under /u/csNNNt/buildserver/bsN.csNNNt.*.student.cs/log/. (use csNNN if submissions are not automatically run as csNNNt)
  • The submitserver has additional logs under /fsys2/tomcat5.5/logs/marmoset.

To create and test a project, use the $user account rather than ${user}-student or ${user}-canonical. After selecting a course, you will counter-intuitively be seeing a student view from the instructor perspective; you must click "instructor view" on the top bar to do anything productive. Note also that this student view is different from the real student view; you must log in as ${user}-student to see what the students see!


All test setups and templates can be roughly categorized into two flavours:

  • The Makefile only serves as a sanity check and the bash tests do all the heavy lifting
  • The Makefile does compilation and most of the work; bash tests only run the program







The following is documentation from the old Marmoset documentation page. It still needs to be read through and consolidated into the above document. The contents of the old page are here:

By default, you will not be able to read any files created by sudo.  While appropriate permissions can be set up at startup, any careless change in permissions can prevent your tests from running.  Because of this, your Makefile should set permissions similarly to the following before any calls requiring sudo:

   while [[ `/software/.admin/bins/bin/absolute ..` != `/software/.admin/bins/bin/absolute .` ]]; do cd ..; chmod ug+rX * 2> /dev/null; chgrp "$(whoami)t" * 2> /dev/null; done; exit 0
   chmod ug+rwx .
   chgrp "$(whoami)t" .
   chgrp -R "$(whoami)t" *
   chmod -R ug+rx *
   chmod -R a-w *

To make this work properly, you should make sure that your Makefile is executed in bash.  To enforce this, one of the initial variables you set at the start of the Makefile should be:

SHELL = "/xhbin/bash"

---++++ Your test files must be writeable

This is a weird one.

Marmoset was throwing completely unhelpful errors of the form
Compile/Build unsuccessful
         edu.umd.cs.buildServer.BuilderException: Couldn't build project

Dropping temporary files revealed that the Makefile was completing successfully, but that none of the tests (listed in =test.class.*= in the =test.properties= file) were being executed.
The buildserver log file came to the rescue here, with the more helpful information:
ERROR - IOException trying to build submission 360561 [...path...] (Permission denied)

Permissions were loosened from 500 (user-only read and execute) to 555 (everybody read and execute), but the test still would not run.  As soon as the permissions were changed to 755, Marmoset happily ran them.

So, even though there should be no good reason for test files to be writeable, make sure your Makefile makes them writeable.  Further testing will be needed to determine if permissions like 750 or 700 are also acceptable.

---++++ Your tests must be _paranoid_ about permissions

Not only do you need to ensure that files are readable from sudo (meaning enforcing group permissions beforehand), you must also ensure that files are writeable from the course account (meaning enforcing group permissions through sudo after testing is done).  The current assumption is that because Marmoset reuses hardcoded build directories instead of a subdirectory of =/tmp=, it *must* have permission to modify all files in that build directory as the course account user.  If it does not, it will die with an unhelpful error message; although Java exceptions regarding file permissions are written to the log, any indication of why it cares is not.

---+++ Updating the buildserver runtime environment variables

While logged in as the course account:

   1 Modify =~csNNN/.marmoset_buildserver_environment= and add *bash-style* =export= commands to set the variables you need.
   1 Restart the buildservers by executing the following command:
      * =~cs_build/bin/buildserver_hosts_command pkill -u csNNN java=
---+++ Avoid using =student.cs= for any sort of work

Do *not* use =student.cs= for any sort of development or testing.  See [[TAUnix#AvoidingCertainHosts]] for more information.

---+++ Avoid creating "Upload-only" Marmoset Projects

Marmoset allows one to create projects where students can submit files with no expectation of testing. Converting such projects into regular testable projects, after students have started to submit to them, requires a bit of database hackery that may result in more harm than good. It is therefore advisable that one never really create "Upload-only projects" and always create testable ones, even if the tests do nothing of value with a student's submission.


---+++ Uploading a project


Once you've created a project (see the documentation), you can access information by clicking the 'view' link for the project under 'Projects', and then click the 'Utilities' link near the top of the screen.  It can be tested from here by uploading a file under 'Canonical solutions'.  *Both of these files should be JAR files*!  While Marmoset will accept zip files and they will often work, occasionally they will just mysteriously fail with Marmoset claiming the zip file cannot be extracted, even though these files can be downloaded, extracted, and run perfectly fine.  No such problem has yet occurred with jar files.

Note also that if you are using a browser like Safari, you will normally have to click this twice to make Marmoset acknowledge you.  A link will be added under 'Canonical Submissions' near the top of the page.  If you click on this immediately you will see a completely unhelpful java exception; this is Marmoset's way of telling you "I'm currently testing this; try again later."  After a couple of minutes, this will contain potentially helpful error output in the event of a failure.

If you have a dependency on a utility outside of the Makefile that is causing the error, you may wish to click "Retest" on the canonical solution failure page instead of re-uploading the zip file.  Don't bother; "Retest" appears to change the text on the page to tell you that it's retesting, but it does not actually appear to do anything.  No other status will change, and there's no evidence a retest will actually ever complete.

If the project configuration fails, click 'mark broken' to make the testing setup disappear.  This does not appear to be undoable if you do this accidentally, as the entry disappears entirely instead of just being flagged; because of this, be *very*  careful clicking anything on any page.  After uploading a new one, the retesting of the canonical solution will be queued.  Often this is tested relatively quickly, but particularly if some of the build servers fall asleep this can take roughly a quarter of an hour to start testing; if you give it a bit to come back from vacation and keep clicking refresh, you'll see it start to test eventually.

After a test is complete, a link with appropriate text (such as "failed") will appear in the status column of the "Testing setups" section.  The line at the bottom lets you know what build server was used.  If you look at the path

(substituting that string for $bs), you can examine the state of that run.  The =log= subdirectory contains logs for that buildserver which may give you helpful information about what happened on the test run.  The =build= subdirectory is where Marmoset ran the test, so you can examine the final state of any files there.

---+++ Interpreting Marmoset build errors

=edu.umd.cs.buildServer.BuilderException: test script returned unknown exit code: 127=

127 is the bash shell's exit code when a file you attempt to run does not exist. This likely means that one of the tests listed in your =test.properties= file is not actually in the build directory (which means it was probably not in the zip file you uploaded).

=edu.umd.cs.buildServer.BuilderException: test script returned unknown exit code: 126=

126 is the bash shell's exit code when you attempt to run a file that does not have execute permissions. This means you likely need to add appropriate chmod (and potentially chgrp) commands to your Makefile to ensure that all test files in the build directory have appropriate permissions before Marmoset tries to run them.

=edu.umd.cs.buildServer.BuilderException: Couldn't build project=

This is a useless catch-all message that translates into "look at the buildserver logs". It can signal that Marmoset does not have write permission on files it does not need to write (see above), which has absolutely nothing to do with whether or not the project was built.
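Both exit codes are easy to reproduce by hand, which helps when deciding whether a missing file or a missing execute bit is to blame:

```shell
cd "$(mktemp -d)"    # scratch directory for the demonstration

# 127: the file you tried to run does not exist
./no_such_test 2> /dev/null || echo "exit code: $?"    # exit code: 127

# 126: the file exists but lacks execute permission
touch not_executable
chmod -x not_executable
./not_executable 2> /dev/null || echo "exit code: $?"  # exit code: 126
```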
---+++ Post-Upload

After uploading a successful canonical submission, you will be allowed to assign marks for each test by clicking the appropriate link under the "Activate/Inactivate" heading. Once you have done this, *never click here again*. If you accidentally click inactivate, the project becomes inactive again, and there's no clear way to re-activate it other than uploading the canonical solution and re-assigning marks yet again. This would be far less irritating if there were an automated way to do mark assignment. No such way is currently known.

---++ Adding Group Accounts

If your students are working on an assignment that allows them to work in groups, and they are made to be a part of a specific UNIX group on =student.cs= (e.g. =cs246_NN=), then these instructions will help you set up Marmoset accounts for group submissions. It is assumed that the UNIX groups have already been assigned to the students. The examples given will use cs246, but make sure you swap that course id for whatever course you are using.

*NOTE:* When you download the submissions from Marmoset, their folders will begin with dashes, which causes problems with basically every UNIX command. To rename them to not have the dashes, enter something like the following at a bash command prompt in the directory which contains all submissions (works for partners):

=for dir in -*; do mv -- "$dir" "`echo $dir | cut -f2,3 -d-`" ; done=

---+++ How To

Since Marmoset supports adding accounts by uploading a classlist file, we will generate a classlist file that will add the group accounts. If users =pbeshai= and =tblancha= are part of the same group, say =cs246_01=, then their Marmoset group id is =-pbeshai-tblancha-=. Their entry in the class list: =99999999:-pbeshai-tblancha-:Group cs246_01 - pbeshai, tblancha:000:study:plan:group:0:degree:initials:family:R:F:1:lec=000,tut=000,tst=000=. The following scripts will generate the classlist for you.

All you need to do is replace cs246 with whatever course you're working on and then upload the outputted classlist to Marmoset. Run the scripts by typing: =./marmosetGroupsClasslist > groupsClasslist=. Then upload the outputted file =groupsClasslist= to Marmoset. Note that these scripts make some necessary formatting tweaks; for example, the third colon-delimited field (officially the "name" field) *must* contain a comma character, which is why =marmosetGroupsClasslist= explicitly includes one. You must make sure that any data you modify yourself retains the comma.

---+++ Necessary Scripts

---++++ listGroups

#!/bin/ksh
# Lists the groups in cs246 and their members
grep "^cs246_" /etc/group | sort | cut -d: -f1,4

---++++ marmosetGroups

#!/usr/bin/env bash
#
# Translates unix groups into Marmoset user group syntax
#
# e.g. cs246_46:c32liu,yh2zhao becomes -c32liu-yh2zhao-

for group in `listGroups | cut -d: -f2 | sed 's/,/-/'`; do
   echo -$group-
done

---++++ marmosetGroupsClasslist

#!/usr/bin/env bash
#
# Translates unix groups into Marmoset user group syntax then into classlist format
# The classlist can then be uploaded to Marmoset
#
#id:userid:name:lec:study:plan:group:year:degree:initials:family:status:time:session:sections

echo "# cs246 classlist updated: Thursday May 28 10:14:23 2009"
echo "# Do not make changes to this file."
echo "# It will be automatically overwritten by the next update."
echo
echo "#Id:Userid:Name:Lecture:Study:Plan:Group:Year:Degree:Initials:Family:Status:Time:Session:Sections"
for group in `listGroups`; do
   userid="-`echo $group | cut -d: -f2 | sed 's/,/-/'`-"
   groupid=`echo $group | cut -d: -f1`
   member1=`echo $userid | cut -d- -f2`
   member2=`echo $userid | cut -d- -f3`
   echo "99999999:$userid:Group $groupid - $member1, $member2:000:study:plan:group:0:degree:initials:family:R:F:1:lec=000,tut=000,tst=000"
done

Topic revision: r49 - 2023-03-09 - YiLee