langtest.transform.security.PromptInjection#

class PromptInjection#

Bases: BaseSecurity

PromptInjection is a class that implements the model security for prompt injection.

__init__()#

Methods

__init__()

async_run(sample_list, model, **kwargs)

Abstract method that implements the model security.

run(sample_list, model, **kwargs)

Abstract method that implements the model security.

transform(*args, **kwargs)

Attributes

alias_name

supported_tasks

test_types

async classmethod async_run(sample_list: List[Sample], model: ModelAPI, **kwargs)#

Abstract method that implements the model security.

abstract async static run(sample_list: List[Sample], model: ModelAPI, **kwargs)#

Abstract method that implements the model security.