⛏️ checkToxicityImage

Description

Checks the toxicity of an image given a valid URL. It uses two different models to score toxicity: the [Bumble private detector](https://github.com/bumble-tech/private-detector) and Yahoo's [Open NSFW](https://github.com/yahoo/open_nsfw) model.

Parameters

| Parameter | Type   | Description                   |
| --------- | ------ | ----------------------------- |
| imageUrl  | string | the URL of the image to check |

Response

| Parameter | Type   | Description                                                                                                                                                                                |
| --------- | ------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| message   | string | message about the response                                                                                                                                                                 |
| status    | int    | `1` for successful execution; `-1` if any error occurs during the execution of the request                                                                                                 |
| response  | object | has three properties: `bumble` (toxicity score from the Bumble model), `nsfw` (toxicity score from the Open NSFW model), and `is_toxic` (`true` for toxic, `false` for not toxic)          |
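For reference, the documented shape can be sketched as a TypeScript interface. This is only an illustration based on the table above and the example response below; the SDK does not necessarily export such a type:

```typescript
// Hypothetical sketch of the documented response shape (not exported by the SDK).
interface CheckToxicityImageResult {
  message: string;      // message about the response
  status: 1 | -1;       // 1 on success, -1 on error
  response: {
    bumble: number;     // toxicity score from the Bumble model
    nsfw: number;       // toxicity score from the Open NSFW model
    is_toxic: boolean;  // true if the image is considered toxic
  };
}
```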

Example Request and Response

Prerequisites

Before making requests with the NEST® SDK, you must have it installed. You can install it using either npm or yarn:

```bash
npm install @nest25/ai-core-sdk
# or
yarn add @nest25/ai-core-sdk
```

Request

Here is an example of how to make a `checkToxicityImage` request using the NEST® SDK:

```javascript
// import the ai-core-sdk
import { AIServices } from '@nest25/ai-core-sdk';

// create a new instance of the SDK
const aiServices = new AIServices();

async function main() {
  // check the toxicity of the image at the given URL
  const result = await aiServices.checkToxicityImage('https://ik.imagekit.io/BIOSPHERE/1678716455079_PTj9bkO9d.jpeg');
  console.log(result);
}

main();
```

Response

```json
{
    "message": "Request successful",
    "response": {
        "bumble": 94.26363110542297,
        "is_toxic": true,
        "nsfw": 99.0408182144165
    },
    "status": 1
}
```
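Building on the documented fields, a caller can branch on `status` and `is_toxic`. The sketch below is illustrative only; `moderateImage` is a hypothetical helper, not part of the SDK:

```typescript
import { AIServices } from '@nest25/ai-core-sdk';

const aiServices = new AIServices();

// Hypothetical helper: returns true if the image is flagged as toxic.
async function moderateImage(imageUrl: string): Promise<boolean> {
  const result = await aiServices.checkToxicityImage(imageUrl);
  // status is documented as 1 on success and -1 on error
  if (result.status !== 1) {
    throw new Error(`checkToxicityImage failed: ${result.message}`);
  }
  return result.response.is_toxic;
}
```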
