langtest.metrics.llm_eval.EvalTemplate

class EvalTemplate

Bases: object

The EvalTemplate class provides a method for building a prompt that evaluates student answers against a given rubric. The prompt asks a teacher to grade a quiz by comparing the student’s answer with the true answer and scoring it according to the specified criteria.

build_prompt(rubic_score: Mapping[str, str] = {'CORRECT': None, 'INCORRECT': None}) → str

Constructs and returns a grading prompt based on the provided rubric scores.
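
A minimal usage sketch, assuming an installed langtest package. The rubric criterion strings below are illustrative placeholders (not part of the langtest API), and the example assumes build_prompt can be called on an instance created with the no-argument constructor documented here.

    from langtest.metrics.llm_eval import EvalTemplate

    # Map each grade label to the criterion an answer must satisfy to earn it.
    # These criterion strings are placeholders chosen for illustration.
    rubric = {
        "CORRECT": "The student's answer matches the true answer in substance.",
        "INCORRECT": "The student's answer contradicts or omits the true answer.",
    }

    # Build the grading prompt from the rubric; build_prompt returns a string.
    prompt = EvalTemplate().build_prompt(rubic_score=rubric)
    print(prompt)

The returned string can then serve as the instruction prompt for whichever LLM performs the grading.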

__init__()

Methods

__init__()

build_prompt([rubic_score])
    Constructs and returns a grading prompt based on the provided rubric scores.