
Netguru Presents: Picguard - Visual Content Processing in Rails


Szymon Baranowski

Updated Feb 20, 2023 • 9 min read

Nowadays, visual content is a major part of most websites. This means that every time we, at Netguru, build a Rails application for our client, we have to think about how to make visual content both awesome and safe.

Every picture uploaded to the server should be appropriate to the idea behind the application. We’ve recently found out about Google Cloud Vision API and thought that this might be just the right tool for the job. And indeed it is.

Problems with Image Processing in Apps

The Risk of Publishing Inappropriate Content

Imagine an application that allows users to upload a photo when creating their profile. The user is free to upload any kind of picture, including ones showing violent scenes or adult content. The most popular solution is a feature that lets other users report inappropriate content, which means you rely on your users to keep your application safe. This post-factum moderation approach is also time- and resource-consuming, and while you wait for your users to do the job, the inappropriate visual content is damaging your business's reputation.

New Solution: Google Cloud Vision API

The newly released Google Cloud Vision API is able to process an image and conclude how likely it is to show violent or adult content. It is also capable of recognising faces, which is particularly useful when you want to be sure that a profile photo really does show a human face. Another advantage of using the Google Cloud product is that Google will keep improving it thanks to feedback from millions of users.
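To give you an idea of what happens under the hood, here is a rough sketch of a raw SafeSearch and face detection request made with Google's google-cloud-vision Ruby client (a newer client than the one available when Picguard was written, so treat the exact calls as an illustration only):

require 'google/cloud/vision'

# Illustration only: the modern google-cloud-vision client, assumed to be
# authenticated via GOOGLE_APPLICATION_CREDENTIALS.
annotator = Google::Cloud::Vision.image_annotator

# SafeSearch returns likelihoods such as VERY_UNLIKELY, POSSIBLE or VERY_LIKELY.
safe = annotator.safe_search_detection(image: 'path/to/image.jpg')
annotation = safe.responses.first.safe_search_annotation
puts "adult: #{annotation.adult}, violence: #{annotation.violence}"

# Face detection returns one annotation per face, each with a confidence score.
faces = annotator.face_detection(image: 'path/to/image.jpg')
faces.responses.first.face_annotations.each do |face|
  puts "face detected with confidence #{face.detection_confidence}"
end

Picguard hides this plumbing behind a single method call and compares the returned likelihoods against the thresholds you configure.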

Picguard: Using Google API for Rails Apps

Inspired by the release of such a powerful tool, we decided to go one step further and create an open-source Ruby gem that makes validating an image even simpler and hopefully saves Rails developers some time when building an application that needs safe visual content uploads.

The Google Vision API is currently in beta, which means it is not yet as stable and accurate as we would like it to be. During development, we noticed some inaccuracies in the results returned by the API. Because of this, we added a feature that lets developers using our gem define custom thresholds for face detection, adult content and violent content.

You can define global thresholds for all models using Picguard, as well as separate thresholds for each individual model. This is comprehensively explained in the gem's documentation to make setup as easy as possible.
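For illustration, assuming Picguard exposes a typical configure block (the exact option names are listed in the documentation), a global setup could look roughly like this:

# config/initializers/picguard.rb
# Illustrative sketch only; check the gem's documentation for the exact
# configuration keys. The threshold names mirror the analyze call shown below.
Picguard.configure do |config|
  config.google_api_key     = ENV['GOOGLE_API_KEY'] # assumed credential setting
  config.threshold_adult    = 'POSSIBLE'            # likelihood at which adult content is rejected
  config.threshold_violence = 'UNLIKELY'            # likelihood at which violent content is rejected
  config.threshold_face     = 0.7                   # minimum face-detection confidence (0.0 to 1.0)
end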

From our experience, we know that most Rails apps rely on dedicated gems for uploading and storing images, so we documented ready, out-of-the-box solutions for the most popular ones (see the sketch after this list):

  • Carrierwave
  • Paperclip
  • Dragonfly
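For example, with CarrierWave the integration boils down to validating the mounted uploader. The snippet below is only a hypothetical sketch of that shape; the validator options are assumptions, and the exact syntax for each uploader lives in the gem's documentation.

# Hypothetical sketch of a CarrierWave integration; the picguard validator
# options shown here are assumptions, so check the documentation for the
# exact syntax for CarrierWave, Paperclip and Dragonfly.
class User < ActiveRecord::Base
  mount_uploader :avatar, AvatarUploader

  validates :avatar, picguard: {
    safe_search: true,     # reject adult and violent content
    face_detection: true   # require a human face in the avatar
  }
end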

Nevertheless, we also created a generic solution that can be used in any setup. Whenever you need full control, the analyze method comes in handy:

Picguard.analyze(
  image_path: 'path/to/image',
  safe_search: true,
  face_detection: true,
  threshold_adult: 'POSSIBLE',
  threshold_violence: 'UNLIKELY',
  threshold_face: 0.7
)


Sometimes an application needs to handle bigger images or multiple uploads at once. To keep the user experience smooth, we recommend running the analysis as a background job (a sketch follows after the list below). To get it running, you only need to provide the image_path attribute; the remaining arguments fall back to the default options, so the image is analysed for adult and violent content as well as for a human face. The default thresholds are set in the initialiser, and if you are not happy with them, you can provide custom values yourself:

  • UNKNOWN
  • VERY_UNLIKELY
  • UNLIKELY
  • POSSIBLE
  • LIKELY
  • VERY_LIKELY

These are the available likelihood thresholds for adult and violent content. The face-detection threshold is a float between 0 and 1 (think of 0 as 0% and 1 as 100% confidence).
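If you move the analysis to a background job, a minimal ActiveJob wrapper around the analyze call could look like the sketch below. The job class and the way you react to the result are our assumptions; only the Picguard.analyze call with an image_path matches the usage shown above.

# Minimal ActiveJob sketch; the job name and result handling are assumptions.
# With only image_path given, Picguard falls back to the default thresholds
# set in the initialiser.
class ImageModerationJob < ApplicationJob
  queue_as :default

  def perform(image_path)
    result = Picguard.analyze(image_path: image_path)
    # React to the result here, e.g. flag the record or remove the image.
    Rails.logger.info("Picguard result for #{image_path}: #{result.inspect}")
  end
end

# Enqueue the analysis so the upload request is not blocked:
# ImageModerationJob.perform_later(user.avatar.path)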

Try Picguard Yourself - Demo App Is Here!

We have created a simple demo app where you can try Picguard yourself by uploading an image to a model validated by our gem. The goal is to prevent users from uploading avatars that lack a human face. You can also find the source code and some basic information on the Picguard GitHub page.

If you enjoy Picguard, feel free to fork our repository and add any extra feature you find useful. We believe Picguard can bring value to any business web application, and we are curious how other developers out there apply it.
