
Update Usage Section Of Toxicity Classifier Model #1336

Open · wants to merge 2 commits into master
Conversation

@gaikwadrahul8 (Contributor) commented on Jan 19, 2024

I have updated the usage section of the TFJS Toxicity Classifier Model on this page because the result was not displayed as expected, as mentioned in the comment section. I modified the code snippet so that the result is displayed correctly, and I tested the snippet in a Node.js environment as well as in the Google Chrome browser; in both, the model result displays as expected. This enhancement should improve the clarity of the model output and the usability of the model for community members.

Kindly review the updated section and provide any feedback or suggestions you may have. If the changes are satisfactory, please merge them. Thank you.
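For reference, here is a minimal sketch of the kind of change described above; it is not the PR's actual diff (see the changed files for that). It uses the toxicity model's documented API, `toxicity.load(threshold)` and `model.classify(sentences)`, and prints each label with its probabilities and match flag, producing output like the log below. The input sentence and the 0.9 threshold are assumptions borrowed from the model's README example.

```js
// Minimal sketch, not the PR's actual diff. Assumes @tensorflow/tfjs and
// @tensorflow-models/toxicity are installed; the input sentence and the
// 0.9 threshold are borrowed from the model's README example.
const toxicity = require('@tensorflow-models/toxicity');

// Labels whose winning probability falls below the threshold are
// reported with `match: null`, as seen for `obscene` and
// `sexual_explicit` in the log below.
const threshold = 0.9;

toxicity.load(threshold).then((model) => {
  const sentences = ['you suck'];

  model.classify(sentences).then((predictions) => {
    // `predictions` has one entry per label; each entry holds one
    // result per input sentence.
    predictions.forEach((prediction) => {
      console.log(`Label: ${prediction.label}`);
      prediction.results.forEach((result) => {
        // Passing the Float32Array as a separate argument (rather than
        // inside a template string) lets Node print it as
        // `Float32Array(2) [ ... ]`.
        console.log('\t- probabilities :', result.probabilities);
        console.log('\t- match :', result.match);
      });
      console.log();
    });
  });
});
```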

Here is a screenshot of the model result with the current (unmodified) code snippet in a Node.js project:

(screenshot: output of the unmodified snippet)

And here is the model result with the modified code snippet in the same Node.js project:

```
(base) gaikwadrahul-macbookpro:test-8145 gaikwadrahul$ node index.js

============================
Hi there 👋. Looks like you are running TensorFlow.js in Node.js. To speed things up dramatically, install our node backend, which binds to TensorFlow C++, by running npm i @tensorflow/tfjs-node, or npm i @tensorflow/tfjs-node-gpu if you have CUDA. Then call require('@tensorflow/tfjs-node'); (-gpu suffix for CUDA) at the start of your program. Visit https://github.com/tensorflow/tfjs-node for more details.
============================
Label: identity_attack
        - probabilities : Float32Array(2) [ 0.9659663438796997, 0.034033700823783875 ]
        - match : false

Label: insult
        - probabilities : Float32Array(2) [ 0.08124702423810959, 0.9187529683113098 ]
        - match : true

Label: obscene
        - probabilities : Float32Array(2) [ 0.3993152379989624, 0.6006847620010376 ]
        - match : null

Label: severe_toxicity
        - probabilities : Float32Array(2) [ 0.9970394968986511, 0.002960436511784792 ]
        - match : false

Label: sexual_explicit
        - probabilities : Float32Array(2) [ 0.7053251266479492, 0.2946748435497284 ]
        - match : null

Label: threat
        - probabilities : Float32Array(2) [ 0.9106737971305847, 0.08932614326477051 ]
        - match : false

Label: toxicity
        - probabilities : Float32Array(2) [ 0.031176716089248657, 0.9688233137130737 ]
        - match : true

(base) gaikwadrahul-macbookpro:test-8145 gaikwadrahul$
```
