I’m a heavy user of the Logstash geoip features, which rely on the GeoLite database. To keep up to date with the latest mappings, I updated my Logstash Ansible role to check the current database and retrieve a new one if it’s older than a certain number of days. This was super easy to do with Ansible. To get started I defined a couple of variables in group_vars:
geoip_directory: "/elk/logstash/geoip"
geoip_source: http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.mmdb.gz
geoip_upgrade_days: 30
These variables define where to put the geoip database, the URL of the latest database, and how often to update the file. To check whether the file is outdated I used the stat module’s mtime attribute along with a when conditional:
- name: Get the GeoIP database file name
  set_fact: geoip_compressed_file_name="{{ geoip_source | basename }}"

- name: Get the uncompressed GeoIP database file name
  set_fact: geoip_uncompressed_file_name="{{ geoip_compressed_file_name | replace('.gz', '') }}"

- name: "Retrieving file stat data from {{ geoip_directory }}/{{ geoip_uncompressed_file_name }}"
  stat:
    path: "{{ geoip_directory }}/{{ geoip_uncompressed_file_name }}"
  register: stat_results
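
A quick throwaway debug task is an easy way to sanity-check the epoch/mtime arithmetic before wiring it into the conditional (this isn’t part of the role, just a sketch I’d run while developing):

- name: Show the age of the current GeoIP database in days
  debug:
    # Same epoch/mtime arithmetic as the upgrade check below, expressed in days
    msg: "{{ ((ansible_date_time.epoch|int - stat_results.stat.mtime) / 86400) | round(1) }} days old"
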
- name: "Download the latest GeoIP database from {{ geoip_source }}"
get_url:
url: "{{ geoip_source }}"
dest: "{{ geoip_directory }}"
mode: 0600
when: ((ansible_date_time.epoch|int - stat_results.stat.mtime) > (geoip_upgrade_days * 60 * 60 * 24))
register: downloaded_geoip_file
- name: "Uncompressing the GeoIP file {{ geoip_directory }}/{{ geoip_compressed_file_name }}"
shell: gunzip -f "{{ geoip_directory }}/{{ geoip_compressed_file_name }}"
when: downloaded_geoip_file.changed
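
One edge condition I know about already: on the very first run the database file doesn’t exist, so stat_results.stat.mtime is undefined and the age comparison fails. A minimal sketch of how the download task’s when clause could guard against that (a guess at the eventual fix, not what the role does today):

- name: "Download the latest GeoIP database from {{ geoip_source }}"
  get_url:
    url: "{{ geoip_source }}"
    dest: "{{ geoip_directory }}"
    mode: 0600
  # Download when the file is missing entirely, or when it is older than geoip_upgrade_days
  when: >-
    (not stat_results.stat.exists) or
    ((ansible_date_time.epoch|int - stat_results.stat.mtime) > (geoip_upgrade_days * 60 * 60 * 24))
  register: downloaded_geoip_file

With that guard in place, the first run always downloads the file and later runs only refresh it once it ages out.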
I still need to add a couple of checks like that to deal with edge conditions, but this is definitely a step up from what I was doing previously. Viva la ansible!