
What I'd like to do:

I would like to use r-exams in the following procedure:

  1. Provide electronic exams in PDF format to students (using exams2pdf(...))
  2. Let the students upload an Excel file with their answers
  3. Grade the answers (using nops_eval(...))

My Question:

Is calling the function nops_eval() the preferred way to manually grade questions in R/exams?

If not, which way is to be preferred?

What I have tried:

I'm aware of the exams2nops() function, and I know that it gives back an .rds file where the correct answers are stored. Hence, I basically have what I need. However, I found that procedure not very straightforward, as the correct answers are buried rather deeply inside the .rds file.

Sebastian Sauer
  • I can't see a question, here. – SiHa Nov 02 '20 at 14:00
  • > "How to grade exams/questions manually?" "I have not found a procedure/function to easily grade a PDF exam where the answer came in electronically (not scanned)." So, how could I easily grade a PDF exam where the answers came in _electronically_? – Sebastian Sauer Nov 02 '20 at 14:09
  • This is far too broad for SO, I'm afraid. – SiHa Nov 02 '20 at 14:40
  • Maybe I wasn't clear enough. Here's my own attempt at solving it; probably someone more into the system will have a better solution: https://gist.github.com/sebastiansauer/c942b2dded75620a67269cbc9aa66a14 – Sebastian Sauer Nov 02 '20 at 14:53
  • Hi everyone, the question is clear to me, I'll post an answer when I'm at a proper computer again, please don't close the question. @SebastianSauer please don't cross-post with the R/exams forum to avoid duplication of efforts. – Achim Zeileis Nov 02 '20 at 18:48
  • Ok, thanks @AchimZeileis, sorry for cross-posting. – Sebastian Sauer Nov 02 '20 at 20:01
  • No worries. I've answered now, focusing on the objective "hard facts" which StackOverflow is most appropriate for. For further open discussions the R-Forge forum is probably more suitable. – Achim Zeileis Nov 02 '20 at 23:33
  • Sigh, @SebastianSauer so the question has been closed despite my answer. Could you please edit it to focus on the question where the meta information is and how it can be processed? Hopefully it gets reopened again ... – Achim Zeileis Nov 03 '20 at 07:13

1 Answer


Overview

You are right that there is no readily available system for administering/grading exams outside of a standard learning management system (LMS) like Moodle, Canvas, etc. R/exams does provide some building blocks for the grading, though, especially exams_eval(). This can be complemented with tools like Google Forms etc. Below I start with the "hard facts" regarding exams_eval(), even though this is a bit technical, and then provide some comments regarding such approaches.

Using exams_eval()

Let us consider a concrete example:

library("exams")
eval <- exams_eval(partial = TRUE, negative = FALSE, rule = "false2")

indicating that you want partial credits for multiple-choice exercises but that the overall points per item must not become negative. A correctly ticked box yields 1/#correct points and an incorrectly ticked box cancels 1/#false points. The only exception is when there is only one false item (which would then cancel all points): in that case 1/2 is used instead (illustrated at the end of this section).

The resulting object eval is a list containing the input parameters (partial, negative, rule) and three functions: checkanswer(), pointvec(), and pointsum(). Imagine that you have the correct answer pattern

cor <- "10100"

The associated points for correctly and incorrectly ticked boxes would be:

eval$pointvec(cor)
## pos neg
## 0.5000000 -0.3333333

Thus, for the following answer pattern you get:

ans <- "11100"
eval$checkanswer(cor, ans)
## [1] 1 -1 1 0 0
eval$pointsum(cor, ans)
## [1] 0.6666667

The latter would still need to be multiplied by the overall points assigned to that exercise.
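For instance, assuming the exercise is worth 2 points overall (a hypothetical weight, not part of the example above), the scaled score would be:

eval$pointsum(cor, ans) * 2
## [1] 1.333333

For numeric answers you can only get 100% or 0%: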

eval$pointsum(1.23, 1.25, tolerance = 0.05)
## [1] 1
eval$pointsum(1.23, 1.25, tolerance = 0.01)
## [1] 0

Similarly, string answers are either correct or false:

eval$pointsum("foo", "foo")
## [1] 1
eval$pointsum("foo", "bar")
## [1] 0
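Finally, to see the single-false-item exception from above in action, here is a quick check (the pattern "110" is just an illustrative choice): with only one false alternative, rule = "false2" cancels 1/2 of the points instead of all of them.

eval$pointvec("110")
## pos neg
## 0.5 -0.5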

Exercise metainformation

To obtain the relevant pieces of information for a given exercise, you can access the metainformation from the nested list that all exams2xyz() interfaces return:

x <- exams2xyz(...)

For example, you can then extract the metainfo for the i-th random replication of the j-th exercise as:

x[[i]][[j]]$metainfo

This contains the correct $solution, the $type, and also the $tolerance etc. Sure, this is somewhat long and inconvenient to type interactively but should be easy enough to cycle through programmatically. This is what nops_eval(), for example, does based on the .rds file containing exactly the information in x.
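As a minimal sketch of such a programmatic cycle (the exercise file "capitals.Rmd" and all object names here are placeholders, not something prescribed by R/exams), the logical $solution vectors of choice exercises can be turned into the "10100"-style strings that exams_eval() works with:

## sketch only: extract solution patterns from an exams2xyz() result
x <- exams2pdf("capitals.Rmd", n = 2)
## one "10100"-style string per random replication and exercise
sol <- sapply(x, function(repl) sapply(repl, function(ex)
  paste(as.integer(ex$metainfo$solution), collapse = "")))

Together with eval$pointsum() from above, these strings can then be compared against the students' submitted answer patterns.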

Administering exams without a full LMS

My usual advice here is to try to leverage your university's services (if available, of course). Yes, there can be problems with bandwidth/stability etc., but you can have all of the same if you're running your own system (been there, done that). Specifically, a discussion of Moodle vs. PDF exams mailed around is available in the R/exams forum on R-Forge.

If I were to provide my exams outside of an LMS, I would use HTML rather than PDF, though. In HTML it is much easier to embed additional information (data, links, etc.) than in PDF. Also, HTML can be viewed on mobile devices much more easily.
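Switching the output format only requires swapping the exams2xyz() interface; a minimal sketch, again with a placeholder exercise file:

## HTML output instead of PDF, same exercise files
exams2html("capitals.Rmd", n = 1)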

For collecting the answers, some R/exams users employ Google forms, see e.g.: https://R-Forge.R-project.org/forum/forum.php?thread_id=34076&forum_id=4377&group_id=1337. Others have been interested in using learnr or webex for that: http://www.R-exams.org/general/distancelearning/#going-forward.

Regarding privacy, though, I would be very surprised if any of these are better than using the university's LMS.

Achim Zeileis
  • Thanks for the explanation. That's now clear to me. Some thoughts: HTML is superior to PDF here, good idea! Watching students via webcam is too strong a measure for me in terms of privacy rights. Hence, an open-book approach is what I like. For reasons of stability/bandwidth, I think that downloading the exam, reading it offline, and then uploading the answers to the LMS is preferable. – Sebastian Sauer Nov 04 '20 at 08:52
  • That's where I disagree (as argued in the posts I linked). I personally think that using the official login/authentication and getting fully automatic evaluation of a broader range of exercises outweighs the little bit of extra bandwidth needed. But, of course, your mileage may vary. I don't do supervision via webcam either, though, but for stability/bandwidth concerns rather than privacy rights. P.S.: Would you consider trying to make the question more focused? Then we might get it re-opened. – Achim Zeileis Nov 04 '20 at 09:54
  • I've tried to make the question more focused. My cohort is quite large (350 students for one exam), so our IT department was nervous. I see your point weighing bandwidth against comfort, but I think the first priority should be to ensure the system does not break. I'll test-drive it. Maybe the number of students won't be any problem for the server. – Sebastian Sauer Nov 04 '20 at 12:09
  • Re: question update. Thanks! // Re: size. 350 shouldn't be too large for Moodle - at least if the exercises are simple and do not embed large datasets or other large supplementary files. If you can, make a trial exam with all of your students. Of course, this is extra work but will make everybody more relaxed about the actual exam. // Re: comfort. This is just one aspect. More importantly, you would go through the official authentication and standard workflow via your university's system. – Achim Zeileis Nov 04 '20 at 22:30