langtest.modelhandler.jsl_modelhandler.PretrainedModelForQA#

class PretrainedModelForQA(model: NLUPipeline | PretrainedPipeline | LightPipeline | PipelineModel)#

Bases: PretrainedJSLModel, ModelAPI

Pretrained model for question answering tasks.

__init__(model: NLUPipeline | PretrainedPipeline | LightPipeline | PipelineModel)#

Constructor method.

Parameters:

model (NLUPipeline | PretrainedPipeline | LightPipeline | PipelineModel) – Loaded SparkNLP pipeline for inference.
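
A minimal usage sketch, assuming a SparkNLP QA pipeline is loaded first; the pipeline name "roberta_base_qa_squad2" is a placeholder for illustration, not a name confirmed by this reference.

>>> import sparknlp
>>> from sparknlp.pretrained import PretrainedPipeline
>>> from langtest.modelhandler.jsl_modelhandler import PretrainedModelForQA
>>> spark = sparknlp.start()
>>> pipeline = PretrainedPipeline("roberta_base_qa_squad2", lang="en")  # placeholder name
>>> qa_model = PretrainedModelForQA(model=pipeline)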

Methods

__init__(model)

Constructor method.

is_qa_annotator(model_instance)

Check whether the QA model instance is supported by langtest.

load_model(path)

Load the QA model into the model attribute.

predict(text, *args, **kwargs)

Perform predictions with SparkNLP LightPipeline on the input text.

predict_raw(text)

Perform predictions on the input text.

Attributes

hub

model_registry

task

static is_qa_annotator(model_instance) bool#

Check whether the QA model instance is supported by langtest.
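
A minimal sketch, assuming `pipeline` is the SparkNLP pipeline loaded in the constructor example above; the check is used here to guard wrapping before inference.

>>> if PretrainedModelForQA.is_qa_annotator(pipeline):
...     qa_model = PretrainedModelForQA(model=pipeline)
... else:
...     raise ValueError("Pipeline has no QA annotator supported by langtest")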

classmethod load_model(path) NLUPipeline#

Load the QA model into the model attribute.

Parameters:

path (str) – Path to a local pretrained SparkNLP model, or the name of a model on the NLP Models Hub.
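
A minimal sketch, where "roberta_base_qa_squad2" stands in for a local path or NLP Models Hub name; per the signature above, the loaded pipeline is returned and can be passed to the constructor.

>>> pipeline = PretrainedModelForQA.load_model("roberta_base_qa_squad2")  # placeholder path/name
>>> qa_model = PretrainedModelForQA(model=pipeline)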

predict(text: str | Dict, *args, **kwargs) Dict[str, Any]#

Perform predictions with SparkNLP LightPipeline on the input text.

Parameters:

text (str | Dict) – Input text, or a dict of inputs, to perform question answering on.

Returns:

Question answering output from SparkNLP LightPipeline.

Return type:

Dict[str, Any]
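
A minimal sketch; the dict keys "question" and "context" are assumptions about the accepted mapping input, not confirmed by this reference.

>>> output = qa_model.predict(
...     {"question": "Where is the Eiffel Tower located?",
...      "context": "The Eiffel Tower is located in Paris."}
... )  # returns Dict[str, Any] with the pipeline's QA output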

predict_raw(text: str) List[str]#

Perform predictions on the input text.

Parameters:

text (str) – Input text to perform question answering on.

Returns:

Predictions as a list of strings.

Return type:

List[str]
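
A minimal sketch; passing the question and context together as one string is an assumption about how the pipeline expects raw text.

>>> answers = qa_model.predict_raw(
...     "Where is the Eiffel Tower located? The Eiffel Tower is located in Paris."
... )  # returns List[str] with the pipeline's raw predictions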