OpenNSFW classification function for store proposal #103
Hello,

I have created a function that wraps Yahoo's Open NSFW classifier, which I feel would fit well in the store. (What defines a good fit for the store?)

https://github.com/servernull/openfaas-opennsfw

It takes a URL to an image as an argument; it could also accept a base64-encoded image if that were requested.

Open to additional collaboration, criticism, and feedback.
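For illustration, a minimal sketch of how such a function might be invoked through an OpenFaaS gateway, with the image URL passed as the request body. The gateway address and function name here are assumptions, not taken from the repo:

```python
import requests

# Invoke the function via an OpenFaaS gateway; the gateway URL and the
# function name "opennsfw" are assumptions, not taken from the repo.
response = requests.post(
    "http://127.0.0.1:8080/function/opennsfw",
    data="https://example.com/sample.jpg",  # image URL as the request body
)
print(response.text)  # expected to contain the classifier's NSFW score
```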
Sounds like a great sample. Can you make it non-root, and only write to /tmp?
Updated it to a non-root user container. I believe nothing is written to the filesystem, since the input request is handled like this:

```python
req = urllib2.Request(image_entry, None, headers)
with contextlib.closing(urllib2.urlopen(req)) as stream:
```

Do you agree, or are there other places I am overlooking that write outside of /tmp? I feel the model paths shouldn't be in /tmp, though.
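For context, a minimal sketch (Python 2, matching the urllib2 usage above) of how the whole request could be handled in memory without touching the filesystem. The `handle` signature, the header contents, and the `classify` helper are hypothetical stand-ins, not taken from the repo:

```python
import contextlib
import urllib2

def handle(image_url):
    # Assumed header contents; the original snippet passes a `headers` dict.
    headers = {"User-Agent": "openfaas-opennsfw"}
    req = urllib2.Request(image_url, None, headers)
    with contextlib.closing(urllib2.urlopen(req)) as stream:
        image_bytes = stream.read()  # image stays in memory, never on disk
    return classify(image_bytes)  # hypothetical OpenNSFW scoring helper
```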
Reading from outside of /tmp is fine. Thanks for the modifications. Do you have an image that is safe to use, but which flags as NSFW, for testing? Feel free to send over your PR.
Added a failing sample image to the repo and README. Will do a PR now; thank you very much for your feedback.