Many instructors asked for a way to detect plagiarism, so we added a "similarity checker" feature that finds similar program submissions among students in a class and highlights the similarities (like Stanford's MOSS tool).
However, we also encourage instructors to focus on preventing plagiarism in the first place, by reducing students' temptation. Thus, we also released a zyLab "coding trail" feature. A coding trail is a compact representation of a student's history of effort on a zyLab. Seeing that the system records effort, and knowing instructors can view that record, students may be less likely to plan on copy-pasting another student's code (or code written by a "tutor" from those nice student websites).
zyLabs now display a coding trail of work just below the student's code output (in Develop mode) or the Submit button (in Submit mode):
Below is a coding trail for one student working on a zyLab:
The above student started on 3/26 (March 26), a Thursday (R).
- Each "-" indicates a develop run, meaning the zyLab is configured for students to code in the zyBook, and the student ran their code in the zyBook with their own input values.
- Each "|" indicates a submission, meaning the student submitted their code for grading against the zyLab's test cases. The | is followed by the score that code received, so |5 means the submission earned 5 points.
Above, the student made three develop runs "---", submitted but earned 0 points "|0", developed twice more "--", submitted and earned 0 again "|0", and so on. The student eventually earned 5 points and, after two more develop runs, a final submission earned 10 points. The student finished on 3/26.
Below, the student started on 3/10, a Sunday (U), and continued on Monday, eventually earning 8 points:
Note: M = Monday, T = Tuesday, W = Wednesday, R = Thursday, F = Friday, S = Saturday, U = Sunday.
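To make the notation concrete, here is a minimal sketch that renders a trail string from a list of develop/submit events, using the dash, pipe, and day-letter conventions described above. This is our own illustration, not zyBooks code: the function name, event format, and exact spacing are assumptions.

```python
# Hypothetical sketch of rendering a coding trail from an event log.
# Day letters follow the legend above (R = Thursday, U = Sunday, etc.).
DAY_LETTERS = {"Mon": "M", "Tue": "T", "Wed": "W", "Thu": "R",
               "Fri": "F", "Sat": "S", "Sun": "U"}

def render_trail(events):
    """events: list of (day, kind, score) tuples, where kind is
    'develop' or 'submit'; score applies only to submits."""
    parts = []
    last_day = None
    for day, kind, score in events:
        if day != last_day:
            parts.append(f" {DAY_LETTERS[day]} ")  # mark the start of a new day
            last_day = day
        if kind == "develop":
            parts.append("-")            # a develop run
        else:
            parts.append(f"|{score}")    # a submit, followed by its score
    return "".join(parts).strip()

# Events matching the start of the trail narrated above (all on Thursday 3/26):
events = ([("Thu", "develop", None)] * 3 + [("Thu", "submit", 0)]
          + [("Thu", "develop", None)] * 2 + [("Thu", "submit", 0)])
print(render_trail(events))  # R ---|0--|0
```

A trail spanning two days, like the second example above, would simply insert the new day letter mid-string (e.g., "U --|0 M --|8").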
If a zyLab is configured for students to code outside the zyBook and upload files, the coding trail will only contain submits "|". In that case, we recommend instructors tell students to submit their code throughout development, such as after every 20 minutes of effort, even if the score is low. That way, instructors can see when students started and the history of their effort.
Coding trails can reduce the temptation among some students to cheat. This is akin to how many retail stores have a camera and video display visible to entering customers. Such displays are known to reduce shoplifting, as the entering would-be shoplifter sees the display and thinks "Hmm, there's video, maybe I shouldn't".
Of course, we also don't want students feeling like they are being constantly monitored. Thus, we've placed and sized the coding trail so that students notice it but it does not dominate their attention, just as most customers entering a store don't really notice or care about the video display -- unless they were thinking of shoplifting.
Of course, coding trails can be faked by submitting bogus programs, but doing so takes effort, so the trails may still deter some students.
Coding trails do not currently indicate time spent, but we plan to add that information too. We also plan to flag develops/submits where the code differs dramatically from the previous code, as might happen if a student at some point replaces their own unsuccessful code with a copy-paste of somebody else's code. And we might even develop some AI that detects coding trails likely to be fake.
Nothing can entirely prevent cheating. But we encourage instructors to take steps to reduce cheating before it happens, in part by reducing temptation. Coding trails can help, as can informing students of the similarity checker. Many students are young and may make bad decisions, especially when overwhelmed or desperate. Reducing the temptation to cheat can not only keep otherwise good students on track, but also reduce the headache instructors must endure when punishing detected cheating. We hope coding trails help!