⛏️ checkToxicityText

Description

Check the toxicity of a piece of text. This functionality is built on the [unitary/toxic-bert](https://huggingface.co/unitary/toxic-bert) model.

Parameters

The method takes a single argument: the text string to analyse.

Response

The response contains a `message`, a `status` code, and a `response` object with a boolean flag for each toxicity category: `toxic`, `severe_toxic`, `obscene`, `threat`, `insult`, and `identity_hate`.

Example Request and Response

Prerequisites

Before making requests, you must have the NEST® SDK installed. Install it with either npm or yarn:

npm install @nest25/ai-core-sdk
OR
yarn add @nest25/ai-core-sdk

Request

Here is an example of how to make a `checkToxicityText` request using the NEST® SDK:

// import the ai-core-sdk
import {AIServices} from '@nest25/ai-core-sdk';

// create a new instance of the sdk
const aiServices = new AIServices();

async function main() {
  // get the result of the test
  const result = await aiServices.checkToxicityText('this is a prompt');
  console.log(result);
}

main();

Response

{
    "message": "Request successful",
    "response": {
        "identity_hate": true,
        "insult": false,
        "obscene": false,
        "severe_toxic": false,
        "threat": false,
        "toxic": true
    },
    "status": 1
}
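In practice you will often want to reduce this response to the list of categories that were flagged. Here is a minimal sketch; the `ToxicityResponse` interface and `flaggedCategories` helper are illustrative, not part of the SDK, and the response shape is assumed from the example above:

```typescript
// Illustrative type for the response shown above (not exported by the SDK).
interface ToxicityResponse {
  message: string;
  response: Record<string, boolean>;
  status: number;
}

// Hypothetical helper: return the toxicity categories flagged `true`,
// sorted alphabetically for stable output.
function flaggedCategories(result: ToxicityResponse): string[] {
  return Object.entries(result.response)
    .filter(([, flagged]) => flagged)
    .map(([category]) => category)
    .sort();
}

// Using the example payload from above:
const example: ToxicityResponse = {
  message: 'Request successful',
  response: {
    identity_hate: true,
    insult: false,
    obscene: false,
    severe_toxic: false,
    threat: false,
    toxic: true,
  },
  status: 1,
};

console.log(flaggedCategories(example)); // → [ 'identity_hate', 'toxic' ]
```

You could then, for example, reject or queue a prompt for review whenever this list is non-empty.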
