geonode_logstash.logstash

Attributes

log

IS_ENABLED

GZIP_COMPRESSED

DATA_TYPES_MAP

CUSTOM_DATA_TYPES_MAP

Classes

LogstashDispatcher

Dispatcher of GeoNode metric data to a Logstash server

GeonodeAsynchronousLogstashHandler

Extends AsynchronousLogstashHandler to allow gzip compression

GeonodeLogstashFormatter

Extends LogstashFormatter to allow gzip compression

GeonodeTcpTransport

Extends TcpTransport to avoid loss of messages

GeonodeLogProcessingWorker

Extends LogProcessingWorker to use GeonodeDatabaseCache

GeonodeDatabaseCache

Extends DatabaseCache with additional methods

Module Contents

geonode_logstash.logstash.log[source]
geonode_logstash.logstash.IS_ENABLED[source]
geonode_logstash.logstash.GZIP_COMPRESSED[source]
geonode_logstash.logstash.DATA_TYPES_MAP[source]
geonode_logstash.logstash.CUSTOM_DATA_TYPES_MAP[source]
class geonode_logstash.logstash.LogstashDispatcher[source]

Bases: object

Dispatcher of GeoNode metric data to a Logstash server

_centralized_server = None[source]
_logger = None[source]
_handler = None[source]
_interval = 0[source]
_collector = None[source]
_init_server()[source]

Initializing the dispatcher with basic information :return: None

static _get_centralized_server()[source]

Get the Centralized Server instance :return: Centralized Server

static get_socket_timeout()[source]

Getting the SOCKET_TIMEOUT from the model :return: SOCKET_TIMEOUT

static get_queue_check_interval()[source]

Getting the QUEUE_CHECK_INTERVAL from the model :return: QUEUE_CHECK_INTERVAL

static get_queue_events_flush_interval()[source]

Getting the QUEUED_EVENTS_FLUSH_INTERVAL from the model :return: QUEUED_EVENTS_FLUSH_INTERVAL

static get_queue_events_flush_count()[source]

Getting the QUEUED_EVENTS_FLUSH_COUNT from the model :return: QUEUED_EVENTS_FLUSH_COUNT

static get_queue_events_batch_size()[source]

Getting the QUEUED_EVENTS_BATCH_SIZE from the model :return: QUEUED_EVENTS_BATCH_SIZE

static get_logstash_db_timeout()[source]

Getting the DATABASE_TIMEOUT from the model :return: DATABASE_TIMEOUT

dispatch_metrics()[source]

Sending the messages :return: None

_update_server()[source]

Updating the CentralizedServer instance :return: None

_set_time_range()[source]

Set up the time range as valid_to/valid_from and interval :return: None

_get_message(data_type)[source]

Retrieving data by querying the MetricValue model :param data_type: field mapping to keep only the interesting information :return: data dictionary

static _build_data(item, key)[source]

Extract interesting data from the query result :param item: query result item :param key: interesting key :return: interesting value

static _get_registered_users()[source]

Retrieving the users currently registered in GeoNode :return: users count

static _get_layers()[source]

Retrieving all the existing datasets :return: datasets count

static _get_maps()[source]

Retrieving all the existing maps :return: maps count

static _get_documents()[source]

Retrieving all the existing documents :return: documents count

_get_errors()[source]

Retrieving errors :return: errors count

static _get_country_center(iso_3)[source]
test_dispatch(host=None, port=None)[source]

Testing connection to the centralized server :return: None
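The `_get_message`/`_build_data` pair above reduces raw query rows to the fields named in a data-type mapping. The sketch below illustrates that pattern only; the mapping and row shape are hypothetical examples, not the module's actual `DATA_TYPES_MAP` or query results:

```python
# Hypothetical query-key -> output-key mapping (NOT the real DATA_TYPES_MAP).
FIELDS = {"name": "resource", "count": "hits"}

def build_data(item, key):
    # Extract one interesting value from a query-result item.
    return item.get(key)

def get_message(rows, fields=FIELDS):
    # Keep only the mapped keys, renamed for the outgoing message.
    return [{out: build_data(row, src) for src, out in fields.items()}
            for row in rows]
```

Fields not present in the mapping (e.g. internal bookkeeping columns) are dropped from the resulting message.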

class geonode_logstash.logstash.GeonodeAsynchronousLogstashHandler(*args, **kwargs)[source]

Bases: logstash_async.handler.AsynchronousLogstashHandler

Extends AsynchronousLogstashHandler to allow gzip compression

formatter[source]
_start_worker_thread()[source]

Super method override to use GeonodeLogProcessingWorker :return: None

_format_record(record)[source]

Super method override to allow gzip compression :param record: message to be formatted :return: formatted message

get_last_entry_date()[source]

Get the entry date of the last queued event :return: last event entry date
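`GeonodeAsynchronousLogstashHandler` follows the standard asynchronous-handler pattern: `emit()` formats each record and queues it for a background worker to ship later. A stdlib-only toy stand-in of that pattern (the class name and queue layout here are illustrative, not the `logstash_async` API):

```python
import logging
import queue

class AsyncQueueHandler(logging.Handler):
    """Toy asynchronous handler: format on emit(), queue for a worker."""

    def __init__(self):
        super().__init__()
        self.events = queue.Queue()  # a worker thread would drain this

    def emit(self, record):
        # Format now; the (omitted) worker ships queued events later.
        self.events.put(self.format(record))
```

Decoupling formatting from delivery this way keeps the logging call non-blocking even when the Logstash endpoint is slow or down.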

class geonode_logstash.logstash.GeonodeLogstashFormatter(gzip=False, *args, **kwargs)[source]

Bases: logstash_async.formatter.LogstashFormatter

Extends LogstashFormatter to allow gzip compression

_gzip[source]
format(record)[source]

Super method override to allow gzip compression of the serialized JSON :param record: message :return: gzip-compressed message :ref: https://stackoverflow.com/questions/8506897/how-do-i-gzip-compress-a-string-in-python

json_gzip(data)[source]

Gzip compression of serialized JSON :param data: input JSON to be compressed :return: compressed object
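The `format`/`json_gzip` combination boils down to serializing the record to JSON and gzip-compressing the resulting bytes. A self-contained sketch of that round trip (the `json_gunzip` helper is added here only to demonstrate decompression; it is not part of the module):

```python
import gzip
import json

def json_gzip(data):
    # Serialize to JSON, then gzip-compress the UTF-8 bytes.
    return gzip.compress(json.dumps(data).encode("utf-8"))

def json_gunzip(blob):
    # Inverse helper (illustrative only): decompress and parse.
    return json.loads(gzip.decompress(blob).decode("utf-8"))
```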

class geonode_logstash.logstash.GeonodeTcpTransport[source]

Bases: logstash_async.transport.TcpTransport

Extends TcpTransport to avoid loss of messages

_send(events)[source]

Super method override to avoid loss of messages :param events: events to be processed :return: None
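Avoiding message loss in a `_send` override usually means retrying a failed batch instead of discarding it. A hedged sketch of that retry pattern, assuming a generic `send` callable and a simple fixed-attempt policy (not the module's actual logic):

```python
import time

def send_with_retry(events, send, retries=3, delay=0.0):
    # Re-attempt the same batch on transport errors rather than
    # discarding it; report failure so the caller can re-queue.
    for _attempt in range(retries):
        try:
            send(events)
            return True
        except OSError:
            time.sleep(delay)  # back off before retrying
    return False
```

Returning `False` instead of raising lets the caller keep the events queued for the next flush interval.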

class geonode_logstash.logstash.GeonodeLogProcessingWorker[source]

Bases: logstash_async.worker.LogProcessingWorker

Extends LogProcessingWorker to use GeonodeDatabaseCache

_setup_database()[source]

Override of the super method to use GeonodeDatabaseCache :return: None

get_last_queued_event_date()[source]

Get the entry date of the last queued event :return: last event entry date

class geonode_logstash.logstash.GeonodeDatabaseCache[source]

Bases: logstash_async.database.DatabaseCache

Extends DatabaseCache with additional methods

get_from_query(query_fetch)[source]

Method to execute a query and retrieve its results :return: query results
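`logstash_async`'s `DatabaseCache` is backed by SQLite, so a `get_from_query`-style helper amounts to executing a read query and fetching all rows. A minimal sketch against an in-memory database (the `event` table schema below is illustrative, not the library's real cache schema):

```python
import sqlite3

def get_from_query(connection, query_fetch):
    # Execute a read query against the cache database and
    # return every matching row.
    cursor = connection.cursor()
    cursor.execute(query_fetch)
    return cursor.fetchall()
```

A `get_last_queued_event_date()`-style lookup could be built on exactly this kind of helper, e.g. with an `ORDER BY ... LIMIT 1` query over the queued events.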