Voight-Kampff


Voight-Kampff is a Ruby gem that detects bots, spiders, crawlers and replicants. It relies on a user agent list for its detection and can easily tell you whether a request is coming from a crawler, spider or bot. This is especially helpful for analytics such as page hit tracking.

Installation

gem install voight_kampff

Configuration

A JSON file is used to match user agent strings to a list of known bots.

If you'd like to use an updated list or make your own customizations, run rake voight_kampff:import_user_agents. This will download a crawler-user-agents.json file into the ./config directory.

Note: The pattern entries in the JSON file are evaluated as regular expressions.
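
For illustration, an entry in that file looks roughly like this (the pattern and instances fields follow the upstream list's schema; this particular entry is made up):

    [
      {
        "pattern": "ExampleBot\\/\\d",
        "instances": ["ExampleBot/1.0 (+http://example.com/bot.html)"]
      }
    ]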

Usage

There are three ways to use Voight-Kampff:

  1. Through Rack::Request such as in your Ruby on Rails controllers:
    request.bot?

  2. Through the VoightKampff module:
    VoightKampff.bot? 'your user agent string'

  3. Through a VoightKampff::Test instance:
    VoightKampff::Test.new('your user agent string').bot?

All of the above forms respond to both human? and bot?. Each of these methods returns true or false.
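
For instance, a bot-aware page hit tracker in a Rails controller might look like this (a minimal sketch; track_page_hit stands in for whatever analytics helper you use and is not part of this gem):

    class PagesController < ApplicationController
      after_action :track_hit, only: :show

      private

      # Skip analytics for crawlers so bot traffic doesn't skew hit counts.
      def track_hit
        track_page_hit(request.path) unless request.bot?
      end
    end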

Types

We needed a way to categorize our bots, so we added another level of granularity to bot differentiation by allowing you to attach a list of types to each bot definition in config/crawler-user-agents.json. You can add additional definitions by forking the repo or by specifying your own custom crawler-user-agents.json file. The list bundled with this library gives some bots a bad designation, an internal distinction we made between various bots. You can use that designation or add your own.

To query these designations, call bot?(:bad) in any of the forms described above; it will return true if the user agent is a bot that you have designated bad.
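
For example, assuming the bundled file's format, a bad-typed entry and a query against it might look like this (the entry itself is made up):

    { "pattern": "EvilScraper", "types": ["bad"] }

    VoightKampff::Test.new('EvilScraper/1.0').bot?(:bad)  # => true
    request.bot?(:bad)  # the Rack::Request form works the same way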

Specifying A Custom List of Bot Regexes

  • Create a config directory somewhere in your software project
  • Copy the existing crawler-user-agents.json or start fresh and customize at will
  • Set the environment variable VOIGHT_KAMPFF_ROOT to the folder you placed the config directory in
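
For example, if your customized file lives at /srv/my_app/config/crawler-user-agents.json (a hypothetical path):

    export VOIGHT_KAMPFF_ROOT=/srv/my_app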

Upgrading to version 1.0

Version 1.0 uses a new source for its list of bot user agent strings, since the old source was no longer maintained. This new source, unfortunately, does not include as much detail, so the following methods have been deprecated:

  • #browser?
  • #checker?
  • #downloader?
  • #proxy?
  • #crawler?
  • #spam?

In general, #bot? covers all of these, and it's unlikely that anyone was checking bots at that level of granularity, so this seems a small price to pay for an open and up-to-date bot list.

Also, the gem no longer extends ActionDispatch::Request; instead it extends Rack::Request, which ActionDispatch::Request inherits from. This preserves the same functionality for Rails while opening the gem up to other Rack-based projects.
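
For example, a plain Rack app can use the module form from the Usage section (a minimal sketch, assuming only that the gem is on the load path):

    # config.ru
    require 'voight_kampff'

    run lambda { |env|
      # Classify the incoming user agent with the module-level helper.
      bot = VoightKampff.bot?(env['HTTP_USER_AGENT'].to_s)
      [200, { 'Content-Type' => 'text/plain' }, [bot ? 'replicant' : 'human']]
    }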

Publishing gem

  • Make sure that you have the ruby-local gem source configured: gem sources -a https://<USERNAME>:<PASSWORD>@repo.socrata.com/artifactory/api/gems/ruby-local/
  • Build the gem: gem build voight_kampff.gemspec
  • Publish the gem: gem push voight_kampff-<VERSION>.gem --host https://repo.socrata.com/artifactory/api/gems/ruby-local

FAQ

Q: What's with the name?
A: It's the machine in Blade Runner that is used to test whether someone is a human or a replicant.

Q: I've found a bot that isn't being matched
A: The list is being pulled from github.com/monperrus/crawler-user-agents. If you'd like to have entries added to the list, please create a pull request with that project. Once that pull request is merged, feel free to create an issue here and I'll release a new gem version with the updated list. In the meantime you can always run rake voight_kampff:import_user_agents on your project to get that updated list.

Q: Why don't you use the user agent list from ______________?
A: If you know of a better source for a list of bot user agent strings, please create an issue and let me know. I'm open to switching to a better source or supporting multiple sources. There are others out there, but I like the openness of monperrus' list.

Thanks

Thanks to github.com/monperrus/crawler-user-agents for providing an open and easily updatable list of bot user agents.

Contributing

PRs without tests will not get merged. Make sure you write tests for both the API and the Rails app. Feel free to ask for help if you don't know how to write a particular test.

Running Tests

  • bundle install
  • bundle exec rspec
