langtest.metrics.prometheus_eval.RelativeGrading#

class RelativeGrading(instruction: str, response_a: str, response_b: str, reference_answer: str, criteria_description: Dict[str, str])#

Bases: object

Class for relative grading of the Prometheus model.

Relative Grading (Comparative Assessment): Prometheus requires four components in the input: an instruction, two responses to compare (response_a and response_b), a reference answer, and a criteria description. You could refer to the prompt format below. You should fill in each of these components.
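The prompt-assembly pattern can be sketched in plain Python. This is a minimal, self-contained illustration: the field names mirror the constructor arguments, but the template wording and section headers are assumptions, not the exact prompt that RelativeGrading.get_prompt() produces.

```python
from typing import Dict


def build_relative_prompt(
    instruction: str,
    response_a: str,
    response_b: str,
    reference_answer: str,
    criteria_description: Dict[str, str],
) -> str:
    """Assemble a comparative-assessment prompt from the required components.

    Illustrative only: the real RelativeGrading.get_prompt() may use
    different wording and section ordering.
    """
    # Flatten the criteria mapping into "name: description" lines.
    criteria = "\n".join(
        f"{name}: {desc}" for name, desc in criteria_description.items()
    )
    return (
        "###Task Description:\n"
        "Compare Response A and Response B against the reference answer "
        "according to the score rubric, and decide which is better.\n\n"
        f"###Instruction:\n{instruction}\n\n"
        f"###Response A:\n{response_a}\n\n"
        f"###Response B:\n{response_b}\n\n"
        f"###Reference Answer:\n{reference_answer}\n\n"
        f"###Score Rubric:\n{criteria}\n"
    )


prompt = build_relative_prompt(
    "Summarize the article in one sentence.",
    "Summary A ...",
    "Summary B ...",
    "The article argues that ...",
    {"conciseness": "Is the summary short and to the point?"},
)
```

The four components map one-to-one onto the constructor arguments, so a prompt built this way always contains every section the evaluator needs.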

__init__(instruction: str, response_a: str, response_b: str, reference_answer: str, criteria_description: Dict[str, str]) → None#

Methods

__init__(instruction, response_a, ...)

get_prompt()

Get the prompt for the model.

get_score_rubric()

Get the score rubric for the model.

Attributes

instruction

response_a

response_b

reference_answer

criteria_description

get_prompt() → str#

Get the prompt for the model.

Returns:

The prompt for the model.

get_score_rubric() → Dict[str, str]#

Get the score rubric for the model.

Returns:

The score rubric for the model.
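Since get_score_rubric() returns a Dict[str, str], the rubric is a flat mapping of string keys to string descriptions. The keys below are hypothetical, shown only to illustrate the shape; the actual keys are derived from the criteria_description passed to the constructor.

```python
# Hypothetical rubric dict illustrating the Dict[str, str] return shape;
# the real keys and wording come from criteria_description and the
# Prometheus score-rubric format.
rubric = {
    "criteria": "Which response better follows the instruction?",
    "conciseness": "Is the summary short and to the point?",
}

# Every key and value is a plain string.
assert all(
    isinstance(k, str) and isinstance(v, str) for k, v in rubric.items()
)
```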