⛏️checkToxicityImage
Description
Checks the toxicity of an image, given a valid URL. Two models are available for the check: [bumble](https://github.com/bumble-tech/private-detector) and Yahoo's [Open NSFW](https://github.com/yahoo/open_nsfw).
Parameters
Parameter | Type | Description
---|---|---
imageUrl | string | The URL of the image to check
Response
Parameter | Type | Description
---|---|---
message | string | A message describing the outcome of the request
status | int | 1 for successful execution; -1 if an error occurs while processing the request
response | object | Has three properties: `bumble` (toxicity score from the bumble model), `opennfsw` (toxicity score from the Open NSFW model), and `is_toxic` (`true` if the image is toxic, `false` otherwise)
Example Request and Response
Prerequisites
Before making requests with the NEST® SDK, you must have it installed. You can install the NEST® SDK using either npm or yarn:
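The original install snippet is not reproduced on this page. Assuming the SDK is published under a package name such as `nest-sdk` (hypothetical — check the actual package name in the SDK's distribution docs), the commands would look like:

```shell
# Hypothetical package name — replace with the actual NEST® SDK package.
npm install nest-sdk

# or, with yarn:
yarn add nest-sdk
```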
Request
Here is an example of how to make a checkToxicityImage request using the NEST® SDK:
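The SDK's import path and client construction are not shown on this page, so the client below is a stub that illustrates only the request and response shapes documented in the tables above — swap it for the real NEST® SDK client when wiring this up.

```javascript
// Sketch only: `nest` stands in for the real SDK client, whose import
// path and constructor are not documented here. The method name and
// response shape follow the Parameters and Response tables above.
const nest = {
  // Hypothetical signature; resolves to the documented response shape.
  checkToxicityImage: async ({ imageUrl }) => ({
    message: `Checked ${imageUrl}`,
    status: 1, // 1 = successful execution, -1 = error
    response: { bumble: 0.02, opennfsw: 0.01, is_toxic: false },
  }),
};

nest
  .checkToxicityImage({ imageUrl: "https://example.com/photo.jpg" })
  .then((result) => {
    if (result.status === 1 && !result.response.is_toxic) {
      console.log("Image is safe:", result.message);
    } else {
      console.log("Image flagged or request failed:", result.message);
    }
  });
```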
Response
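A successful call might return a payload shaped like the following (the scores are illustrative, not taken from a real request):

```json
{
  "message": "Request executed successfully",
  "status": 1,
  "response": {
    "bumble": 0.03,
    "opennfsw": 0.01,
    "is_toxic": false
  }
}
```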