Safe Space - GitHub Action
GitHub Action that checks the toxicity level of comments and PR reviews to help make repos safe spaces.

GitHub Action that uses machine learning to detect potentially toxic comments added to PRs and issues, so authors have a chance to edit them and keep repos a safe space.

It uses the TensorFlow.js toxicity classification model.

It currently works when comments are posted on issues and PRs, as well as when pull request reviews are submitted.
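A workflow wiring an action to those two event types might look like the following sketch; the action reference, version tag, and job name are assumptions based on typical GitHub Actions usage, not a verbatim copy of the project's README:

```yaml
name: Safe Space

# Trigger on new issue/PR comments and on submitted PR reviews
on: [issue_comment, pull_request_review]

jobs:
  check-toxicity:
    runs-on: ubuntu-latest
    steps:
      # Action reference is illustrative; pin to a released version in practice
      - uses: charliegerard/safe-space@master
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```

The `GITHUB_TOKEN` secret is provided automatically by GitHub Actions and lets the action read the comment text and respond on the repo.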

