# ⛏️ checkToxicityImage
Check the toxicity of an image, given a valid URL. Two models are available for the check: bumble's [private-detector](https://github.com/bumble-tech/private-detector) and Yahoo's [Open NSFW](https://github.com/yahoo/open_nsfw) model.
Request parameters:

Parameter | Type | Description |
---|---|---|
imageUrl | string | URL of the image to check |

Response fields:

Parameter | Type | Description |
---|---|---|
message | string | Message about the response |
status | int | 1 for successful execution; -1 if any error occurs during the execution of the request |
response | object | Has three properties: `bumble` (toxicity score from the bumble model), `opennfsw` (toxicity score from the Open NSFW model), and `is_toxic` (true for toxic, false for not toxic) |
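As a reading aid, the response shape can be written as a TypeScript type. This is a sketch derived only from the table above, not a type exported by the SDK:

```typescript
// Sketch of the checkToxicityImage result, derived from the tables above.
interface CheckToxicityImageResult {
  message: string;      // message about the response
  status: 1 | -1;       // 1 on success, -1 on any error
  response: {
    bumble: number;     // toxicity score from the bumble model
    opennfsw: number;   // toxicity score from the Open NSFW model
    is_toxic: boolean;  // true for toxic, false for not toxic
  };
}
```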
Before making requests with the NEST® SDK, you must have it installed. You can install the NEST® SDK using either npm or yarn. Use the following commands:
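The commands below assume the SDK is published as `nest-sdk`; that package name is not given on this page, so substitute the actual name if it differs.

```bash
# Install with npm ("nest-sdk" is an assumed package name)
npm install nest-sdk

# Or with yarn
yarn add nest-sdk
```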
Here is an example of how to make a checkToxicityImage request using the NEST® SDK:
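The following is a minimal sketch rather than a verified snippet: the package name `nest-sdk`, the `Nest` client class, and the `apiKey` option are assumptions, while `checkToxicityImage`, `imageUrl`, and the result fields follow the tables above.

```typescript
// Minimal sketch; package name, client class, and auth option are assumed.
import { Nest } from 'nest-sdk'; // hypothetical package and export name

async function main(): Promise<void> {
  // "apiKey" is an assumed auth option; configure per the actual SDK docs.
  const nest = new Nest({ apiKey: process.env.NEST_API_KEY });

  // checkToxicityImage and imageUrl come from this page's tables.
  const result = await nest.checkToxicityImage({
    imageUrl: 'https://example.com/image.jpg',
  });

  if (result.status === 1) {
    // Scores from both models, plus the combined verdict.
    console.log('bumble score:', result.response.bumble);
    console.log('Open NSFW score:', result.response.opennfsw);
    console.log('is_toxic:', result.response.is_toxic);
  } else {
    // status -1: something went wrong; message explains the failure.
    console.error('Request failed:', result.message);
  }
}

main();
```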