Kolab:Winterfell
bonnie
We truncated the diff of some files because they were too big.
Changes of Revision 10
bonnie.spec
Changed
@@ -17,7 +17,7 @@
 %global bonnie_group_id 415
 
 Name: bonnie
-Version: 0.3.7
+Version: 0.3.8
 Release: 1%{?dist}
 Summary: Bonnie for Kolab Groupware
 
@@ -366,6 +366,9 @@
 %defattr(-,root,root,-)
 
 %changelog
+* Thu Jun 28 2018 Jeroen van Meeuwen <vanmeeuwen@kolabsys.com> - 0.3.8-1
+- Release version 0.3.8
+
 * Fri Jun 22 2018 Jeroen van Meeuwen <vanmeeuwen@kolabsys.com> - 0.3.7-1
 - Don't run out of fields
 - Fix storing user data to the side
bonnie-0.3.7.tar.gz/docs/architecture-and-design.rst
Deleted
@@ -1,1869 +0,0 @@ -:tocdepth: 2 - -.. _architecture-and-design: - -======================= -Architecture and Design -======================= - -The design principles could be summarized as follows: - -* Eventual consistency is more important than real-time availability, - -* Scalability is more important than one particular technology's individual - efficiency, - -* Reliability of the audit trail - -Bonnie receives and parses :term:`event notifications` issued by Cyrus -IMAP 2.5. - -An event notification is issued when a user logs on or off, and when a -user makes a change to a mailbox or message. - -These event notifications are used to build the data sets for each of -the following features included with Bonnie; - - * :ref:`about-archival` - - * :ref:`about-backup-and-restore` - - * :ref:`about-e-discovery` - - * :ref:`about-data-loss-prevention` - -These features are (each of them) optional, in that a deployment used -solely for e-Discovery does not imply the inclusion of the other -features, and most feature descriptions use super-nomenclature for -otherwise limited, self-contained facilities that tend to serve a more -narrow purpose. - -One example is a ``changelog`` module, which can be used to preserve -selected subsets of data, such as "who changed what in an event". This -particular module would fall under the category of e-Discovery, but not -include the full feature-set of e-Discovery. - -.. rubric:: Technology Base - -The initial design of Bonnie is created with the following technologies -as the basis: - - * Elasticsearch - * ZeroMQ - -The functional components that make up a complete Bonnie environment are -designed to be pluggable such that one or more extensions can be used -for either of the following channels: - -**input** - - Used to receive new event notifications, or jobs, to be handled. - -**storage** - - Used to store event notifications, queues and/or metadata. - -**output** - - The final and persistent record(s) of state. 
- -To parse information in to useful, digestible chunks, Bonnie uses -so-called *handlers*, that subscribe to event notifications using -*interests*. - -For example, a *Logout* event is only stored to the output channel, -should a corresponding *Login* event be available. After all, there must -not be a *Logout* event without a *Login* event also having occurred. - -A handler interested in the *Logout* event uses the storage channel to -determine the corresponding *Login* event is indeed available. - -This storage channel may be configured to be an intermediate buffer, -that holds all *Login* events until after the corresponding *Logout* -event notification is received, or a timeout occurs, or may be -configured to be the same channel as the output channel. - -Overview of Components -====================== - -The Bonnie infrastructure components are an add-on to an existing Cyrus -IMAP environment, typically as part of a `Kolab Groupware`_ deployment. - -More detailed diagrams of communication flows in a standard Kolab -Groupware environment are available in the -`Kolab Groupware documentation`_, and are outside of the scope of the -design documentation for Bonnie. Suffice it to say, that Kolab Groupware -components communicate with Cyrus IMAP: - -.. graphviz:: - - digraph { - rankdir = LR; - splines = true; - overlab = prism; - fontname = Calibri; - - edge [color=gray50, fontname=Calibri, fontsize=11]; - node [shape=record, fontname=Calibri, fontsize=11]; - - "Kolab Groupware"; - - "Cyrus IMAP"; - - "Kolab Groupware" -> "Cyrus IMAP" [dir=both]; - - } - -As Bonnie is added on top of this (existing) infrastructure the design -overview for the infrastructure becomes: - -.. 
graphviz:: - - digraph { - rankdir = LR; - splines = true; - overlab = prism; - fontname = Calibri; - - edge [color=gray50, fontname=Calibri, fontsize=11]; - node [shape=record, fontname=Calibri, fontsize=11]; - - "Kolab Groupware" [color=gray50,style=filled]; - - subgraph cluster_bonnie { - label = "Bonnie Infrastructure"; - - "Broker"; - - subgraph cluster_imap { - label = "IMAP Server"; - - "Cyrus IMAP" [color=gray50,style=filled]; - - "Collector"; - "Dealer"; - } - - "Worker"; - } - - "Kolab Groupware" -> "Cyrus IMAP" [dir=both]; - - "Cyrus IMAP" -> "Dealer"; - - "Dealer" -> "Broker"; - - "Broker" -> "Worker" [dir=both]; - - "Collector" -> "Broker" [dir=both]; - - "Worker" -> "Storage" [dir=both]; - - } - -The **Dealer** Component -======================== - -A dealer is a script executed once for each event notification, and is -used to dispatch the event notification as fast and as efficient as -possible. - -.. graphviz:: - - digraph { - rankdir = LR; - splines = true; - overlab = prism; - fontname = Calibri; - - edge [color=gray50, fontname=Calibri, fontsize=11]; - node [shape=record, fontname=Calibri, fontsize=11]; - - "Kolab Groupware" [color=gray50,style=filled]; - - subgraph cluster_bonnie { - label = "Bonnie Infrastructure"; - - "Broker"; - - subgraph cluster_imap { - label = "IMAP Server"; - - "Cyrus IMAP" [color=gray50,style=filled]; - - "Collector"; - "Dealer" [color="green",style=filled]; - } - - "Worker"; - }
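The handler/interest subscription model the removed document describes (handlers subscribe to event notifications via *interests*, and each event is fanned out to the interested handlers) can be sketched roughly as follows. This is an illustrative stand-in only — `Registry` and the lambda handler are made-up names, not Bonnie's actual classes — but the `{event: {'callback': fn}}` shape mirrors what Bonnie's handlers pass to their `register()` callback:

```python
# Illustrative sketch of Bonnie's handler/interest subscription model.
# Registry and the lambda handler are stand-ins, not Bonnie's real classes.
class Registry:
    def __init__(self):
        self.interests = {}

    def register(self, interests):
        # 'interests' maps an event name to {'callback': fn}, mirroring
        # the shape Bonnie handlers pass to their register() callback.
        for event, how in interests.items():
            self.interests.setdefault(event, []).append(how['callback'])

    def notify(self, event, notification):
        # Fan the notification out to every handler interested in it.
        return [cb(notification) for cb in self.interests.get(event, [])]


registry = Registry()
registry.register({'Logout': {'callback': lambda n: ('Logout', n['user'])}})
```

A handler registered this way only ever sees the events it declared an interest in; events nobody subscribed to are simply dropped.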
bonnie-0.3.7.tar.gz/bonnie/__init__.py -> bonnie-0.3.8.tar.gz/bonnie/__init__.py
Changed
@@ -18,34 +18,50 @@
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 #
 
+"""
+    Main entry point to the Bonnie module.
+
+    Provides getLogger(name) and getConf().
+"""
+
 import logging
 import threading
 
+# pylint: disable=import-error
+# pylint: disable=no-name-in-module
 from bonnie.logger import Logger
 
 logging.setLoggerClass(Logger)
 
 API_VERSION = 1
 
+
+# pylint: disable=invalid-name
 def getLogger(name):
     """
         Return the correct logger class.
     """
     logging.setLoggerClass(Logger)
-    log = logging.getLogger(name=name.replace(".", "_"))
-    return log
+    _log = logging.getLogger(name=name.replace(".", "_"))
+    return _log
+
 
+# pylint: disable=invalid-name
 log = getLogger('bonnie')
 
-from bonnie.conf import Conf
+# pylint: disable=wrong-import-position
+from bonnie.conf import Conf  # noqa
 
 conf = Conf()
 
+
 def getConf():
+    """
+        Get an instance of our configuration, safe from threading
+        discrepancies.
+    """
     _data = threading.local()
 
     if hasattr(_data, 'conf'):
-        log.debug(_("Returning thread local configuration"))
+        log.debug("Returning thread local configuration", level=7)
         return _data.conf
 
     return conf
-
-
bonnie-0.3.7.tar.gz/bonnie/broker/__init__.py -> bonnie-0.3.8.tar.gz/bonnie/broker/__init__.py
Changed
@@ -21,13 +21,14 @@
 """
     This is the broker for Bonnie.
 """
 
-import brokers
+from bonnie.broker import brokers
 
 from bonnie.daemon import BonnieDaemon
 
+
 class BonnieBroker(BonnieDaemon):
     """
-
+        A basic broker.
     """
     pidfile = "/var/run/bonnie/broker.pid"
 
@@ -49,13 +50,18 @@
 
             Register a broker based on interests
         """
-        for interest,how in interests.iteritems():
-            if not self.broker_interests.has_key(interest):
+        for interest, how in interests.iteritems():
+            if interest not in self.broker_interests:
                 self.broker_interests[interest] = []
 
             self.broker_interests[interest].append(how)
 
+    # pylint: disable=unused-argument
     def run(self, *args, **kw):
+        """
+            Run the Broker.
+        """
+        # pylint: disable=unused-variable
         for interest, hows in self.broker_interests.iteritems():
             for how in hows:
                 how()
bonnie-0.3.7.tar.gz/bonnie/collector/__init__.py -> bonnie-0.3.8.tar.gz/bonnie/collector/__init__.py
Changed
@@ -43,7 +43,6 @@
         self.handler_interests = {}
 
         self.num_threads = int(conf.get('collector', 'num_threads', 5))
-        self.num_threads_busy = 0
 
         sys_version = version.StrictVersion(sys.version[:3])
 
@@ -66,8 +65,6 @@
         """
             Dispatch collector job to the according handler(s)
         """
-        self.num_threads_busy += 1
-
         # execute this asynchronously in a child process
         self.pool.apply_async(
             async_execute_handlers,
@@ -125,21 +122,15 @@
     def _execute_callback(self, result):
         (notification, job_uuid) = result
 
-        self.num_threads_busy -= 1
-
         # pass result back to input module(s)
         input_modules = conf.get('collector', 'input_modules').split(',')
 
        for _input in self.input_modules.values():
             if _input.name() in input_modules:
                 _input.callback_done(
                     job_uuid,
-                    notification,
-                    threads=self._threads_available()
+                    notification
                 )
 
-    def _threads_available(self):
-        return (self.num_threads - self.num_threads_busy)
-
     def _worker_process_start(self, *args, **kw):
         log.info(
             "Worker process %s initializing" % (
bonnie-0.3.7.tar.gz/bonnie/collector/handlers/imapdata.py -> bonnie-0.3.8.tar.gz/bonnie/collector/handlers/imapdata.py
Changed
@@ -61,16 +61,15 @@
 
         # split the uri parameter into useful parts
         uri = parse_imap_uri(notification['uri'])
-        folder_path = imap_folder_path(uri)
 
         # get metadata using pykolab's imap module
         metadata = {}
 
         try:
             self.imap.connect()
-            metadata = self.imap.get_metadata(folder_path)[folder_path]
+            metadata = self.imap.get_metadata(uri['path'])[uri['path']]
             self.imap.disconnect()
         except Exception, e:
-            log.warning("Failed to get metadata for %r: %r", folder_path, e)
+            log.warning("Failed to get metadata for %r: %r", uri['path'], e)
 
         notification['metadata'] = metadata
 
@@ -82,16 +81,15 @@
 
         # split the uri parameter into useful parts
         uri = parse_imap_uri(notification['uri'])
-        folder_path = imap_folder_path(uri)
 
         # get folder acls using pykolab's imap module
         acls = {}
 
         try:
             self.imap.connect()
-            acls = self.imap.list_acls(folder_path)
+            acls = self.imap.list_acls(uri['path'])
             self.imap.disconnect()
         except Exception, e:
-            log.warning("Failed to get ACLs for %r: %r", folder_path, e)
+            log.warning("Failed to get ACLs for %r: %r", uri['path'], e)
 
         notification['acl'] = acls
bonnie-0.3.7.tar.gz/bonnie/collector/inputs/zmq_input.py -> bonnie-0.3.8.tar.gz/bonnie/collector/inputs/zmq_input.py
Changed
@@ -64,14 +64,27 @@
         pass
 
     def report_state(self, new_timeout=False):
-        log.debug("[%s] Reporting state %s, %r" % (self.identity, self.state, self.interests), level=9)
+        log.debug(
+            "[%s] Reporting state %s, %r" % (
+                self.identity,
+                self.state,
+                self.interests
+            ),
+            level=9
+        )
 
         if self.report_timestamp < (time.time() - 10):
-            self.collector.send_multipart([b"STATE", self.state, " ".join(self.interests)])
+            self.collector.send_multipart(
+                [b"STATE", self.state, " ".join(self.interests)]
+            )
+
             self.report_timestamp = time.time()
 
         if new_timeout:
-            self.ioloop.add_timeout(datetime.timedelta(seconds=10), self.report_state_with_timeout)
+            self.ioloop.add_timeout(
+                datetime.timedelta(seconds=10),
+                self.report_state_with_timeout
+            )
 
     def report_state_with_timeout(self):
         self.report_state(new_timeout=True)
 
@@ -84,7 +97,11 @@
         self.notify_callback = callback
 
         self.stream.on_recv(self._cb_on_recv_multipart)
-        self.ioloop.add_timeout(datetime.timedelta(seconds=10), self.report_state_with_timeout)
+        self.ioloop.add_timeout(
+            datetime.timedelta(seconds=10),
+            self.report_state_with_timeout
+        )
+
         self.ioloop.start()
 
     def _cb_on_recv_multipart(self, message):
 
@@ -105,14 +122,8 @@
         if not self.notify_callback == None:
             self.notify_callback(cmd, job_uuid, notification)
 
-    def callback_done(self, job_uuid, result, threads = 0):
+    def callback_done(self, job_uuid, result):
         log.debug("Handler callback done for job %s: %r" % (job_uuid, result), level=8)
-        log.debug("Threads available: %d" % (threads), level=8)
-
-        if threads > 0:
-            self.state = b'READY'
-        else:
-            self.state = b'BUSY'
 
         self.collector.send_multipart([b"DONE", job_uuid, result])
         log.info("Job %s DONE by %s" % (job_uuid, self.identity))
bonnie-0.3.7.tar.gz/bonnie/daemon.py -> bonnie-0.3.8.tar.gz/bonnie/daemon.py
Changed
@@ -175,6 +175,13 @@
         raise SystemExit
 
     def drop_privileges(self):
+        """
+            Drop our privileges if not already done so.
+
+            Most commonly, for actual daemons that use this class as their
+            base, you would use ``--user`` and ``--group`` to indicate to
+            what user and supplemental groups this daemon should run.
+        """
         try:
             try:
                 (ruid, euid, suid) = os.getresuid()
bonnie-0.3.7.tar.gz/bonnie/utils.py -> bonnie-0.3.8.tar.gz/bonnie/utils.py
Changed
@@ -74,6 +74,7 @@
 
     # Take everything after the first slash, and omit any INBOX/ stuff.
    path_str = '/'.join([x for x in split_uri.path.split('/') if not x == 'INBOX'][1:])
     path_arr = path_str.split(';')
+    result['path'] = urllib.unquote(path_arr[0])
 
     # parse the path/query parameters into a dict
 
@@ -95,6 +96,10 @@
 
         result['user'] = '%s@%s' % (username, domain)
 
+    if 'UIDVALIDITY' in result:
+        if '/' in result['UIDVALIDITY']:
+            result['UIDVALIDITY'] = result['UIDVALIDITY'][:-1]
+
     return result
 
@@ -218,11 +223,7 @@
     if isinstance(uri, str):
         uri = parse_imap_uri(uri)
 
-    folder_name = uri['path']
-
-    # Translate the folder name in to a fully qualified folder path such as it
-    # would be used by a cyrus administrator.
-    folder_path = imap_folder_path(uri)
+    folder_path = uri['path']
 
     # Through filesystem
     # To get the mailbox path, use:
 
@@ -236,6 +237,7 @@
         mailbox_path = subprocess.check_output(
             ["/usr/lib/cyrus-imapd/mbpath", folder_path]
         ).strip()
+
     else:
         # Do it the old-fashioned way
         p1 = subprocess.Popen(
 
@@ -247,10 +249,5 @@
 
     (stdout, stderr) = p1.communicate()
     mailbox_path = stdout.strip()
 
-    # TODO: Assumption #4 is we use altnamespace
-    if not folder_name == "INBOX":
-        if not len(folder_name.split('@')) > 0:
-            mailbox_path = os.path.join(mailbox_path, folder_name)
-
     return mailbox_path
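The `result['path']` value this change introduces is what the collector handlers above now use directly as the folder path. The derivation can be sketched as a standalone function; `parse_imap_uri_path` is a made-up helper name, and this sketch uses Python 3's `urllib.parse` rather than the Python 2 `urllib` of the original, but it mirrors the split/unquote steps visible in the diff:

```python
from urllib.parse import urlparse, unquote


def parse_imap_uri_path(uri):
    # Take everything after the first slash, omit any INBOX/ component,
    # strip ;KEY=VALUE parameters, and percent-decode what remains.
    parsed = urlparse(uri)
    parts = [p for p in parsed.path.split('/') if p != 'INBOX'][1:]
    return unquote('/'.join(parts).split(';')[0])
```

For a notification URI like the `AclChange` example further down, this yields a folder path such as `user/jane.doe@klab.cc`.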
bonnie-0.3.7.tar.gz/bonnie/worker/__init__.py -> bonnie-0.3.8.tar.gz/bonnie/worker/__init__.py
Changed
@@ -20,23 +20,27 @@ import json import time -import handlers -import inputs -import outputs -import storage import signal +from multiprocessing import Process + +from bonnie.worker import handlers +from bonnie.worker import inputs +from bonnie.worker import outputs +from bonnie.worker import storage from bonnie.translate import _ from bonnie.daemon import BonnieDaemon -from multiprocessing import Process import bonnie +# pylint: disable=invalid-name conf = bonnie.getConf() log = bonnie.getLogger('bonnie.worker') + class BonnieWorker(BonnieDaemon): """ - Bonnie Worker specific version of a :class:`Bonnie Daemon <bonnie.daemon.BonnieDaemon>` + Bonnie Worker specific version of a + :class:`Bonnie Daemon <bonnie.daemon.BonnieDaemon>` """ #: The process ID file to use. @@ -46,13 +50,13 @@ worker_group = conf.add_cli_parser_option_group("Worker Options") worker_group.add_option( - "-n", - "--num-children", - dest = "num_children", - action = "store", - default = None, - help = "Number of child processes to spawn" - ) + "-n", + "--num-children", + dest="num_children", + action="store", + default=None, + help="Number of child processes to spawn" + ) super(BonnieWorker, self).__init__(*args, **kw) @@ -108,16 +112,20 @@ Stop the worker daemon. """ self.running = False - for p in self.children: - p.terminate() + for process in self.children: + process.terminate() if self.manager: - for p in self.children: - p.join() + for process in self.children: + process.join() + class BonnieWorkerProcess(object): + """ + A single worker process. 
+ """ #: Holder of interests registered by handlers - handler_interests = { '_all': [] } + handler_interests = {'_all': []} #: Holder of interests registered by input channel modules input_interests = {} @@ -138,6 +146,8 @@ output_modules = {} output_exclude_events = [] + + # pylint: disable=unused-argument def __init__(self, as_child=False, *args, **kw): if as_child: signal.signal(signal.SIGTERM, self.terminate) @@ -152,7 +162,9 @@ __class.register(callback=self.register_input) self.input_modules[_class] = __class - output_modules = [x.strip() for x in conf.get('worker', 'output_modules', '').split(',')] + output_modules = [] + for module in conf.get('worker', 'output_modules').split(','): + output_modules.append(module.strip()) for _class in outputs.list_classes(): _output = _class() @@ -160,7 +172,10 @@ _output.register(callback=self.register_output) self.output_modules[_class] = _output - storage_modules = [x.strip() for x in conf.get('worker', 'storage_modules', '').split(',')] + storage_modules = [] + for module in conf.get('worker', 'storage_modules', '').split(','): + storage_modules.append(module.strip()) + for _class in storage.list_classes(): _storage = _class() if _storage.name() in storage_modules: @@ -170,8 +185,11 @@ output_exclude_events = conf.get('worker', 'output_exclude_events', '') - self.output_exclude_events = [x.strip() for x in output_exclude_events.split(',')] + self.output_exclude_events = [ + x.strip() for x in output_exclude_events.split(',') + ] + # pylint: disable=too-many-branches def event_notification(self, notification): """ Input an event notification in to our process. 
@@ -187,113 +205,130 @@ jobs = [] - if self.handler_interests.has_key(event): + if event in self.handler_interests: for interest in self.handler_interests[event]: (notification, _jobs) = self.interest_callback( - interest, - notification - ) + interest, + notification + ) - if len(_jobs) > 0: + if _jobs: log.debug( - "Handler interest %r for event %s returns jobs: %r" % ( - interest, - event, - _jobs - ), - level = 6 - ) + "Handler interest %r for event %s returns jobs: %r" % ( + interest, + event, + _jobs + ), + level=6 + ) jobs.extend(_jobs) for interest in self.handler_interests['_all']: (notification, _jobs) = self.interest_callback( - interest, - notification - ) + interest, + notification + ) - if len(_jobs) > 0: + if _jobs: log.debug( - "Handler interest %r for _all returns jobs: %r" % ( - interest, - _jobs - ), - level = 6 - ) + "Handler interest %r for _all returns jobs: %r" % ( + interest, + _jobs + ), + level=6 + ) jobs.extend(_jobs) jobs = list(set(jobs)) - if len(jobs) > 0: + if jobs: log.debug( - "Handler interests included jobs: %r" % (jobs), - level = 6 - ) + "Handler interests included jobs: %r" % (jobs), + level=6 + )
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/aclchange.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/aclchange.py
Changed
@@ -24,9 +24,33 @@
 
 from bonnie.worker.handlers import MailboxHandlerBase
 
+
 class AclChangeHandler(MailboxHandlerBase):
     """
-        Handler for an event that changes the ACL on an object.
+        Handler for an event that changes the ACL on a mailbox.
+
+        When an ACL is updated, the current objects/folder will
+        need to be considered obsolete, and updated with the
+        contents of this notification.
+
+        An example of such notification is:
+
+        .. parsed-literal::
+
+            {
+                "aclRights": "lrswite",
+                "aclSubject": "john.doe@klab.cc",
+                "event": "AclChange",
+                "mailboxID": "f0dd4cf3-bc06-450b-8fcf-06b1c5e07167",
+                "pid": 24807,
+                "service": "imaps",
+                "timestamp": "2018-06-19T10:51:03.508+02:00",
+                "uri": "imap://kolab013.klab.cc/user/jane.doe%40klab.cc;UIDVALIDITY=1529320204",
+                "user": "jane/doe@klab.cc",
+                "user_data": null,
+                "user_id": null,
+                "vnd.cmu.sessionId": "cyrus-imapd-24807-1529398263-1-6720352549146616500"
+            }
     """
 
     event = 'AclChange'
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/base.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/base.py
Changed
@@ -19,13 +19,22 @@
 #
 
 import bonnie
 
+# pylint: disable=invalid-name
 conf = bonnie.getConf()
 
 
 class HandlerBase(object):
+    """
+        A base handler for events.
+    """
+
+    event = None
     features = []
 
     def __init__(self, *args, **kw):
+        if self.event is None:
+            return
+
         self.features = conf.get('bonnie', 'features')
 
         if self.features is None:
 
@@ -41,15 +50,21 @@
         self.log = bonnie.getLogger(log_name)
 
     def register(self, callback):
+        """
+            A generic registration function for specific handlers.
+        """
         interests = {
-                self.event: {
-                        'callback': self.run
-                    }
+            self.event: {
+                'callback': self.run
             }
+        }
 
         self.worker = callback(interests)
 
     def run(self, notification):
+        if self.event is None:
+            return
+
         # resolve user_id from storage?
         resolve = False
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/flagsset.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/flagsset.py
Changed
@@ -19,12 +19,33 @@
 #
 
 """
-    Base handler for an event notification of type 'FlagsSet'
+    Base handler for an event notification of type 'FlagsSet'.
 """
 
 from bonnie.worker.handlers import HandlerBase
 
+
 class FlagsSetHandler(HandlerBase):
+    """
+        {
+            "event":"FlagsSet",
+            "mailboxID":"c1417483-e54d-43e0-88a4-06a832377aab",
+            "modseq":4,
+            "messages":1,
+            "flagNames":"\\\\Answered",
+            "pid":19681,
+            "service":"imaps",
+            "timestamp":"2018-06-19T09:56:40.116+02:00",
+            "uidnext":2,
+            "uidset":"1",
+            "uri":"imap://kolab013.klab.cc/user/john.doe%40klab.cc;UIDVALIDITY=1529317681",
+            "user":"john.doe@klab.cc",
+            "vnd.cmu.midset":["<7a3a1149356e7705a4322bb44cce2ffe@klab.cc>"],
+            "vnd.cmu.sessionId":"cyrus-imapd-19681-1529395000-1-3957065946565248241",
+            "vnd.cmu.unseenMessages":0,
+        }
+    """
+
     event = 'FlagsSet'
 
     def __init__(self, *args, **kw):
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/login.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/login.py
Changed
@@ -25,6 +25,7 @@
 
 from bonnie.worker.handlers import HandlerBase
 
 
+# pylint: disable=too-few-public-methods
 class LoginHandler(HandlerBase):
     """
         A *Login* event notification handler.
 
@@ -33,19 +34,18 @@
         layout::
 
             {
-                "event":"Login",
-                "timestamp":"2014-11-27T10:45:38.201+01:00",
-                "service":"imap",
-                "serverDomain":"10.8.14.13",
-                "serverPort":143,
-                "clientIP":"10.8.13.10",
-                "clientPort":45735,
-                "uri":"imap://imapb13.example.org",
-                "pid":14210,
-                "user":"alexander.aachen@example.org",
-                "vnd.cmu.sessionId":"cyrus-imapd-14210-1417081537-1-10541965771286434889"
-            }
-
+                "clientIP":"10.8.13.10",
+                "clientPort":45735,
+                "event":"Login",
+                "pid":14210,
+                "serverDomain":"10.8.14.13",
+                "serverPort":143,
+                "service":"imap",
+                "timestamp":"2014-11-27T10:45:38.201+01:00",
+                "uri":"imap://imapb13.example.org",
+                "user":"alexander.aachen@example.org",
+                "vnd.cmu.sessionId":"cyrus-imapd-14210-1417081537-1-10541965771286434889"
+            }
     """
 
     event = 'Login'
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/logout.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/logout.py
Changed
@@ -40,8 +40,6 @@
         # logout_time and suppress separate logging of this event with
         # notification['_suppress_output'] = True.
         if 'vnd.cmu.sessionId' in notification:
-            now = datetime.datetime.now(tzutc())
-
             results = self.worker.storage.select(
                 query=[
                     ('event', '=', 'Login'),
 
@@ -57,9 +55,9 @@
                 login_event = results['hits'][0]
 
                 try:
-                    timestamp = parse(login_event['@timestamp'])
+                    login_timestamp = parse(login_event['@timestamp'])
                     self.log.debug(
-                        "Timestamp parsed to %r" % (timestamp),
+                        "Timestamp parsed to %r" % (login_timestamp),
                         level=8
                     )
 
@@ -70,10 +68,11 @@
                             errmsg
                         )
                     )
+                    return (notification, [b"POSTPONE"])
 
-                    timestamp = now
+                logout_timestamp = parse(notification['timestamp'])
 
-                delta = now - timestamp
+                delta = logout_timestamp - login_timestamp
 
                 duration = delta.days * 24 * 3600
                 duration += delta.seconds
 
@@ -86,7 +85,7 @@
                     doctype=login_event['_doctype'],
                     value={
                         'logout_time': datetime.datetime.strftime(
-                            now,
+                            logout_timestamp,
                             "%Y-%m-%dT%H:%M:%S.%fZ"
                         ),
                         'duration': duration
 
@@ -94,7 +93,9 @@
                 )
 
                 notification['_suppress_output'] = True
+
                 return (notification, [])
+
             else:
                 self.log.warning(
                     "Could not find related Login event for session ID: %r" % (
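The corrected Logout handler pairs the stored Login timestamp with the Logout notification's own timestamp instead of "now". The duration arithmetic can be sketched as a standalone function; `session_duration` is an illustrative name, and this sketch uses stdlib `strptime` where the real code uses dateutil's more lenient `parse`:

```python
from datetime import datetime


def session_duration(login_ts, logout_ts):
    # Whole seconds between a Login and its matching Logout, computed the
    # same way as the handler: days * 24 * 3600 + seconds of the delta.
    fmt = "%Y-%m-%dT%H:%M:%S.%f%z"  # e.g. "2018-06-19T10:51:03.508+02:00"
    delta = datetime.strptime(logout_ts, fmt) - datetime.strptime(login_ts, fmt)
    return delta.days * 24 * 3600 + delta.seconds
```

Using the Logout notification's timestamp rather than the wall clock keeps the duration correct even when the event is processed long after it was emitted.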
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/mailboxbase.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/mailboxbase.py
Changed
@@ -37,17 +37,8 @@
         # mailbox notifications require metadata
         if 'metadata' not in notification:
             jobs.append(b"GETMETADATA")
-            return (notification, jobs)
 
-        # extract uniqueid from metadata -> triggers the storage module
-        if 'folder_uniqueid' not in notification:
-            ann_uniqueid = '/shared/vendor/cmu/cyrus-imapd/uniqueid'
-
-            if ann_uniqueid in notification['metadata']:
-                folder_uniqueid = notification['metadata'][ann_uniqueid]
-                notification['folder_uniqueid'] = folder_uniqueid
-                del folder_uniqueid
-
-            del ann_uniqueid
+        if 'acl' not in notification:
+            jobs.append(b"GETACL")
 
         return (notification, jobs)
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/mailboxdelete.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/mailboxdelete.py
Changed
@@ -22,10 +22,10 @@
     Base handler for an event notification of type 'MailboxDelete'
 """
 
-from bonnie.worker.handlers import HandlerBase
+from bonnie.worker.handlers import MailboxHandlerBase
 
-class MailboxDeleteHandler(HandlerBase):
+class MailboxDeleteHandler(MailboxHandlerBase):
     event = 'MailboxDelete'
 
     def __init__(self, *args, **kw):
-        HandlerBase.__init__(self, *args, **kw)
+        super(MailboxDeleteHandler, self).__init__(self, *args, **kw)
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/messageappend.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/messageappend.py
Changed
@@ -40,7 +40,7 @@
 
         if 'messageContent' not in notification or \
                 notification['messageContent'] in [None, ""]:
-            self.log.debug("Adding FETCH job for " + self.event, level=8)
+            self.log.debug("Adding FETCH job for %r" % (self.event), level=8)
             return (notification, [b"FETCH"])
 
         return (notification, jobs)
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/messagebase.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/messagebase.py
Changed
@@ -56,7 +56,7 @@
 
         # message notifications require message headers
         if not notification.has_key('messageHeaders'):
-            self.log.debug("Adding HEADER job for " + self.event, level=8)
+            self.log.debug("Adding HEADER job for %r" % (self.event), level=8)
             jobs.append(b"HEADER")
 
         return (notification, jobs)
bonnie-0.3.7.tar.gz/bonnie/worker/handlers/messagenew.py -> bonnie-0.3.8.tar.gz/bonnie/worker/handlers/messagenew.py
Changed
@@ -53,7 +53,7 @@
 
         if 'messageContent' not in notification or \
                 notification['messageContent'] in [None, ""]:
-            self.log.debug("Adding FETCH job for " + self.event, level=8)
+            self.log.debug("Adding FETCH job for %r" % (self.event), level=8)
             return (notification, [b"FETCH"])
 
         return (notification, jobs)
bonnie-0.3.7.tar.gz/bonnie/worker/outputs/elasticsearch_output.py -> bonnie-0.3.8.tar.gz/bonnie/worker/outputs/elasticsearch_output.py
Changed
@@ -62,7 +62,7 @@
         'serverDomain': 'domain',
         'aclRights': 'acl_rights',
         'aclSubject': 'acl_subject',
-        'mailboxID': 'mailbox_id',
+        'mailboxID': 'folder_id',
         'messageSize': 'message_size',
         'messageHeaders': None,
         'messageContent': None,
bonnie-0.3.7.tar.gz/bonnie/worker/outputs/riak_output.py -> bonnie-0.3.8.tar.gz/bonnie/worker/outputs/riak_output.py
Changed
@@ -62,7 +62,7 @@
         'serverDomain': 'domain',
         'aclRights': 'acl_rights',
         'aclSubject': 'acl_subject',
-        'mailboxID': 'mailbox_id',
+        'mailboxID': 'folder_id',
         'messageSize': 'message_size',
         'messageHeaders': None,
         'messageContent': None,
bonnie-0.3.7.tar.gz/bonnie/worker/storage/elasticsearch_storage.py -> bonnie-0.3.8.tar.gz/bonnie/worker/storage/elasticsearch_storage.py
Changed
@@ -75,7 +75,7 @@ 'uidset': { 'callback': self.resolve_folder_uri }, - 'folder_uniqueid': { + 'folder_id': { 'callback': self.resolve_folder_uri }, 'mailboxID': { @@ -117,13 +117,14 @@ except elasticsearch.exceptions.NotFoundError, errmsg: log.debug( - "ES entry not found for %s/%s/%s: %r" % ( - _index, - _doctype, - key, - errmsg - ) - ) + "ES entry not found for %s/%s/%s: %r" % ( + _index, + _doctype, + key, + errmsg + ), + level=8 + ) result = None @@ -384,6 +385,7 @@ # insert a user record into our database del user_data['id'] + user_data['user'] = user user_data['@timestamp'] = datetime.datetime.now( tzutc() @@ -413,6 +415,13 @@ """ log.debug('notification_to_folder on %r' % (notification), level=8) + # This is incomplete + if 'acl' not in notification: + return False + + if 'metadata' not in notification: + return False + # split the uri parameter into useful parts uri = parse_imap_uri(notification[attrib]) @@ -424,24 +433,20 @@ folder_uri = templ % uri + urllib.quote(uri['path']) - if 'metadata' not in notification: - return False - key = '/shared/vendor/cmu/cyrus-imapd/uniqueid' - if 'folder_uniqueid' not in notification and \ - key in notification: - - notification['folder_uniqueid'] = notification['metadata'][key] + if key in notification['metadata']: + notification['folder_id'] = notification['metadata'][key] - if 'folder_uniqueid' not in notification: - notification['folder_uniqueid'] = hashlib.md5( - notification[attrib] - ).hexdigest() + # despite the former, still incomplete? 
+ if 'folder_id' not in notification: + return False + # obtain the folder type folder_type = 'mail' ann_folder_type = '/shared/vendor/kolab/folder-type' + if ann_folder_type in notification['metadata']: folder_type = notification['metadata'][ann_folder_type] @@ -450,25 +455,23 @@ else: owner = None + metadata = [] + for k, v in notification['metadata'].iteritems(): + metadata.append({k: v}) + + acl = [] + for k, v in notification['acl'].iteritems(): + acl.append({self.resolve_username(k, force=True): v}) + body = { '@version': bonnie.API_VERSION, '@timestamp': datetime.datetime.now( tzutc() ).strftime("%Y-%m-%dT%H:%M:%S.%fZ"), - 'metadata': list(set( - { - k: v - } for k, v in notification['metadata'] - )), - 'acl': list(set( - { - self.resolve_username(k, force=True): v - } for k, v in notification['acl'].iteritems() - )), - + 'metadata': metadata, + 'acl': acl, 'folder_type': folder_type, - 'owner': owner, 'server': uri['host'], 'name': uri['path'], @@ -482,11 +485,11 @@ '/shared/vendor/cmu/cyrus-imapd/size' ] - return dict(id=notification['folder_uniqueid'], body=body) + return dict(id=notification['folder_id'], body=body) def resolve_folder_uri(self, notification, attrib='uri'): """ - Resolve the folder uri (or folder_uniqueid) into an + Resolve the folder uri (or folder_id) into an elasticsearch object ID. """ # no folder resolving required @@ -559,7 +562,7 @@ folder = None # update entry if name changed - elif folder['_id'] == existing['_id'] and \ + elif folder['id'] == existing['_id'] and \ not folder['body']['name'] == existing['name']: try:
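The rewritten storage code flattens the `metadata` and `acl` dictionaries into lists of single-entry dictionaries before indexing them. The earlier `list(set({...} for ...))` form could never have worked — dicts are unhashable, so `set()` would raise a `TypeError`. The new shape can be sketched as a small helper; `to_entry_list` is an illustrative name, not one from the codebase:

```python
def to_entry_list(mapping):
    # Flatten {'k1': 'v1', 'k2': 'v2'} into [{'k1': 'v1'}, {'k2': 'v2'}],
    # the shape the storage code now builds for 'metadata' and 'acl'.
    return [{key: value} for key, value in mapping.items()]
```

Each annotation or ACL entry then becomes its own small object in the indexed folder document, rather than one large nested mapping.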
bonnie-0.3.7.tar.gz/docs/about/archival.rst -> bonnie-0.3.8.tar.gz/docs/about/archival.rst
Changed
@@ -3,3 +3,14 @@
 ========
 Archival
 ========
+
+Archiving, in the context of this application suite, principally revolves
+around the act of preservation of records.
+
+It therefore implies these records do not necessarily need to be rendered
+available in a near real-time production environment, and could be submitted
+to slower, cheaper, larger-volume storage.
+
+.. seealso::
+
+    * :ref:`about-backup-and-restore`
bonnie-0.3.8.tar.gz/docs/about/audit-trail.rst
Added
@@ -0,0 +1,5 @@
+.. _about-audit-trail:
+
+===========
+Audit Trail
+===========
bonnie-0.3.7.tar.gz/docs/about/backup-and-restore.rst -> bonnie-0.3.8.tar.gz/docs/about/backup-and-restore.rst
Changed
@@ -3,3 +3,8 @@
 ================
 Backup & Restore
 ================
+
+Contrary to :ref:`about-archival`, backup and restore capabilities need to be
+available to day-to-day operations of production environments more promptly, in
+order to facilitate a work-flow against a variety of common issues; broken
+hardware, mistakes by users, etc.
bonnie-0.3.8.tar.gz/docs/about/lawful-interception.rst
Added
@@ -0,0 +1,5 @@
+.. _about-lawful-interception:
+
+===================
+Lawful Interception
+===================
bonnie-0.3.8.tar.gz/docs/administrator-guide
Added
+(directory)
bonnie-0.3.8.tar.gz/docs/administrator-guide/configuring-index-templates.rst
Added
@@ -0,0 +1,174 @@
+=========================================
+Configuring Elasticsearch Index Templates
+=========================================
+
+``headers.Subject``
+===================
+
+Adjust the template for ``logstash-*`` indexes to no longer analyze the
+``headers.Subject`` field.
+
+Normally, the field is tokenized:
+
+.. parsed-literal::
+
+    $ :command:`curl -XGET /_analyze?text=1234-5678 | python -mjson.tool`
+      % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
+                                     Dload  Upload   Total   Spent    Left  Speed
+    100   166  100   166    0     0    646      0 --:--:-- --:--:-- --:--:--   648
+    {
+        "tokens": [
+            {
+                "end_offset": 4,
+                "position": 1,
+                "start_offset": 0,
+                "token": "1234",
+                "type": "<NUM>"
+            },
+            {
+                "end_offset": 9,
+                "position": 2,
+                "start_offset": 5,
+                "token": "5678",
+                "type": "<NUM>"
+            }
+        ]
+    }
+
+Get the original template definition:
+
+.. parsed-literal::
+
+    $ :command:`curl -XGET /_template/logstash | python -mjson.tool`
+    {
+        "logstash" : {
+            "order" : 0,
+            "template" : "logstash-\*",
+            "settings" : {
+                "index.refresh_interval" : "5s"
+            },
+            "mappings" : {
+                "_default_" : {
+                    "dynamic_templates" : [ {
+                        "string_fields" : {
+                            "mapping" : {
+                                "index" : "analyzed",
+                                "omit_norms" : true,
+                                "type" : "string",
+                                "fields" : {
+                                    "raw" : {
+                                        "index" : "not_analyzed",
+                                        "ignore_above" : 256,
+                                        "type" : "string"
+                                    }
+                                }
+                            },
+                            "match_mapping_type" : "string",
+                            "match" : "\*"
+                        }
+                    } ],
+                    "properties" : {
+                        "geoip" : {
+                            "dynamic" : true,
+                            "path" : "full",
+                            "properties" : {
+                                "location" : {
+                                    "type" : "geo_point"
+                                }
+                            },
+                            "type" : "object"
+                        },
+                        "@version" : {
+                            "index" : "not_analyzed",
+                            "type" : "string"
+                        }
+                    },
+                    "_all" : {
+                        "enabled" : true
+                    }
+                }
+            },
+            "aliases" : { }
+        }
+    }
+
+Define the new version:
+
+.. parsed-literal::
+
+    $ :command:`curl -XPUT http://es.lhm.klab.cc:9200/_template/logstash -d '`
+    {
+        "logstash" : {
+            "order" : 0,
+            "template" : "logstash-\*",
+            "settings" : {
+                "index.refresh_interval" : "5s"
+            },
+            "mappings" : {
+                "_default_" : {
+                    "dynamic_templates" : [ {
+                        "string_fields" : {
+                            "mapping" : {
+                                "index" : "analyzed",
+                                "omit_norms" : true,
+                                "type" : "string",
+                                "fields" : {
+                                    "raw" : {
+                                        "index" : "not_analyzed",
+                                        "ignore_above" : 256,
+                                        "type" : "string"
+                                    }
+                                }
+                            },
+                            "match_mapping_type" : "string",
+                            "match" : "\*"
+                        }
+                    } ],
+                    "properties" : {
+                        "geoip" : {
+                            "dynamic" : true,
+                            "path" : "full",
+                            "properties" : {
+                                "location" : {
+                                    "type" : "geo_point"
+                                }
+                            },
+                            "type" : "object"
+                        },
+                        **"headers.Subject": {**
+                            **"type": "string",**
+                            **"index": "not_analyzed"**
+                        **},**
+                        "@version" : {
+                            "index" : "not_analyzed",
+                            "type" : "string"
+                        }
+                    },
+                    "_all" : {
+                        "enabled" : true
+                    }
+                }
+            },
+            "aliases" : { }
+        }
+    }
+
+.. parsed-literal::
+
+    $ :command:`curl -XGET /logstash-2014.12.04/_analyze?field=headers.Subject -d '1234-5678' | python -mjson.tool`
+      % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
+                                     Dload  Upload   Total   Spent    Left  Speed
+    100   102  100    93  100     9    364     35 --:--:-- --:--:-- --:--:--   364
+    {
+        "tokens": [
+            {
+                "end_offset": 9,
+                "position": 1,
+                "start_offset": 0,
+                "token": "1234-5678",
+                "type": "word"
+            }
+        ]
+    }
+
+http://www.elasticsearch.org/guide/en/elasticsearch/guide/current/_finding_exact_values.html
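The difference the template change makes can be illustrated without a running cluster. The following is a minimal Python sketch in which the regex is a crude stand-in for Elasticsearch's standard analyzer (an assumption for illustration only, not the actual tokenizer implementation):

```python
import re

def analyze(text, analyzed=True):
    # An "analyzed" string field is split into lowercase tokens; a
    # "not_analyzed" field is indexed as one verbatim term.
    if not analyzed:
        return [text]
    # Crude stand-in for the standard tokenizer: split on non-alphanumerics.
    return [t.lower() for t in re.split(r'[^0-9A-Za-z]+', text) if t]

# Analyzed, "1234-5678" becomes two tokens, so an exact-value query for the
# whole subject cannot match; not_analyzed keeps the value whole.
print(analyze("1234-5678"))                  # ['1234', '5678']
print(analyze("1234-5678", analyzed=False))  # ['1234-5678']
```

This mirrors the two `_analyze` calls above: the tokenized field yields `<NUM>` tokens ``1234`` and ``5678``, while the `not_analyzed` field yields the single term ``1234-5678``.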
bonnie-0.3.8.tar.gz/docs/architecture-and-design
Added
+(directory)
bonnie-0.3.8.tar.gz/docs/architecture-and-design/index.rst
Added
@@ -0,0 +1,1873 @@
+:tocdepth: 2
+
+.. _architecture-and-design:
+
+=======================
+Architecture and Design
+=======================
+
+The design principles could be summarized as follows:
+
+* Eventual consistency is more important than real-time availability,
+
+* Scalability is more important than one particular technology's individual
+  efficiency,
+
+* Reliability of the audit trail.
+
+Bonnie receives and parses :term:`event notifications` issued by Cyrus
+IMAP 2.5.
+
+An event notification is issued when a user logs on or off, and when a
+user makes a change to a mailbox or message.
+
+These event notifications are used to build the data sets for each of
+the following features included with Bonnie:
+
+  * :ref:`about-archival`
+
+  * :ref:`about-audit-trail`
+
+  * :ref:`about-backup-and-restore`
+
+  * :ref:`about-data-loss-prevention`
+
+  * :ref:`about-e-discovery`
+
+  * :ref:`about-lawful-interception`
+
+These features are (each of them) optional, in that a deployment used
+solely for e-Discovery does not imply the inclusion of the other
+features, and most feature descriptions use super-nomenclature for
+otherwise limited, self-contained facilities that tend to serve a more
+narrow purpose.
+
+One example is a ``changelog`` module, which can be used to preserve
+selected subsets of data, such as "who changed what in an event". This
+particular module would fall under the category of e-Discovery, but not
+include the full feature-set of e-Discovery.
+
+.. rubric:: Technology Base
+
+The initial design of Bonnie is created with the following technologies
+as the basis:
+
+  * Elasticsearch
+  * ZeroMQ
+
+The functional components that make up a complete Bonnie environment are
+designed to be pluggable such that one or more extensions can be used
+for either of the following channels:
+
+**input**
+
+  Used to receive new event notifications, or jobs, to be handled.
+
+**storage**
+
+  Used to store event notifications, queues and/or metadata.
+
+**output**
+
+  The final and persistent record(s) of state.
+
+To parse information into useful, digestible chunks, Bonnie uses
+so-called *handlers* that subscribe to event notifications using
+*interests*.
+
+For example, a *Logout* event is only stored to the output channel,
+should a corresponding *Login* event be available. After all, there must
+not be a *Logout* event without a *Login* event also having occurred.
+
+A handler interested in the *Logout* event uses the storage channel to
+determine that the corresponding *Login* event is indeed available.
+
+This storage channel may be configured to be an intermediate buffer
+that holds all *Login* events until after the corresponding *Logout*
+event notification is received, or a timeout occurs, or may be
+configured to be the same channel as the output channel.
+
+Overview of Components
+======================
+
+The Bonnie infrastructure components are an add-on to an existing Cyrus
+IMAP environment, typically as part of a `Kolab Groupware`_ deployment.
+
+More detailed diagrams of communication flows in a standard Kolab
+Groupware environment are available in the
+`Kolab Groupware documentation`_, and are outside of the scope of the
+design documentation for Bonnie. Suffice it to say that Kolab Groupware
+components communicate with Cyrus IMAP:
+
+.. graphviz::
+
+    digraph {
+        rankdir = LR;
+        splines = true;
+        overlap = prism;
+        fontname = Calibri;
+
+        edge [color=gray50, fontname=Calibri, fontsize=11];
+        node [shape=record, fontname=Calibri, fontsize=11];
+
+        "Kolab Groupware";
+
+        "Cyrus IMAP";
+
+        "Kolab Groupware" -> "Cyrus IMAP" [dir=both];
+
+    }
+
+As Bonnie is added on top of this (existing) infrastructure, the design
+overview for the infrastructure becomes:
+
+.. graphviz::
+
+    digraph {
+        rankdir = LR;
+        splines = true;
+        overlap = prism;
+        fontname = Calibri;
+
+        edge [color=gray50, fontname=Calibri, fontsize=11];
+        node [shape=record, fontname=Calibri, fontsize=11];
+
+        "Kolab Groupware" [color=gray50,style=filled];
+
+        subgraph cluster_bonnie {
+            label = "Bonnie Infrastructure";
+
+            "Broker";
+
+            subgraph cluster_imap {
+                label = "IMAP Server";
+
+                "Cyrus IMAP" [color=gray50,style=filled];
+
+                "Collector";
+                "Dealer";
+            }
+
+            "Worker";
+        }
+
+        "Kolab Groupware" -> "Cyrus IMAP" [dir=both];
+
+        "Cyrus IMAP" -> "Dealer";
+
+        "Dealer" -> "Broker";
+
+        "Broker" -> "Worker" [dir=both];
+
+        "Collector" -> "Broker" [dir=both];
+
+        "Worker" -> "Storage" [dir=both];
+
+    }
+
+The **Dealer** Component
+========================
+
+A dealer is a script executed once for each event notification, and is
+used to dispatch the event notification as quickly and efficiently as
+possible.
+
+.. graphviz::
+
+    digraph {
+        rankdir = LR;
+        splines = true;
+        overlap = prism;
+        fontname = Calibri;
+
+        edge [color=gray50, fontname=Calibri, fontsize=11];
+        node [shape=record, fontname=Calibri, fontsize=11];
+
+        "Kolab Groupware" [color=gray50,style=filled];
+
+        subgraph cluster_bonnie {
+            label = "Bonnie Infrastructure";
+
+            "Broker";
+
+            subgraph cluster_imap {
+                label = "IMAP Server";
+
+                "Cyrus IMAP" [color=gray50,style=filled];
+
+                "Collector";
+                "Dealer" [color="green",style=filled];
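The handler *interests* mechanism and the *Login*/*Logout* correlation described in this chapter can be sketched in a few lines of Python. The class and method names here are illustrative assumptions, not Bonnie's actual API:

```python
class LogoutHandler:
    """Stores a Logout event to the output channel only if the matching
    Login event is available in the storage channel."""

    interests = ['Logout']          # event types this handler subscribes to

    def __init__(self):
        self.storage = {}           # intermediate buffer: user -> Login event
        self.output = []            # final, persistent records

    def buffer_login(self, event):
        # A Login event is held in storage until its Logout arrives.
        self.storage[event['user']] = event

    def handle(self, event):
        login = self.storage.pop(event['user'], None)
        if login is None:
            # No Login on record: there must not be a Logout without a
            # corresponding Login, so the event is not written out yet.
            return False
        self.output.append((login, event))
        return True

handler = LogoutHandler()
handler.buffer_login({'event': 'Login', 'user': 'john.doe'})
handler.handle({'event': 'Logout', 'user': 'john.doe'})   # True: pair stored
handler.handle({'event': 'Logout', 'user': 'jane.doe'})   # False: no Login seen
```

The in-memory dictionary plays the role of the intermediate storage channel; in a real deployment that buffer could equally be the output channel itself, as the text notes.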
bonnie-0.3.7.tar.gz/docs/conf.py -> bonnie-0.3.8.tar.gz/docs/conf.py
Changed
@@ -73,9 +73,9 @@
 # built documents.
 #
 # The short X.Y version.
-version = '0.2'
+version = '0.3'
 # The full version, including alpha/beta/rc tags.
-release = '0.2.3'
+release = '0.3.7'
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
@@ -274,5 +274,5 @@
 # Example configuration for intersphinx: refer to the Python standard library.
 intersphinx_mapping = {
     'python': ('http://docs.python.org/', None),
-    'pyzmq': ('https://zeromq.github.io/pyzmq/', None),
+    'pyzmq': ('https://pyzmq.readthedocs.io/en/latest/', None),
 }
bonnie-0.3.7.tar.gz/docs/index.rst -> bonnie-0.3.8.tar.gz/docs/index.rst
Changed
@@ -2,17 +2,19 @@
 About Bonnie
 ============
 
-Bonnie builds an audit trail from events that occur against an IMAP server.
+Bonnie is a transcriptor for events that occur against an IMAP server, and
+correlates events occurring in a Kolab environment.
 
-As such, Bonnie is the answer to the questions about :ref:`about-archival`,
-:ref:`about-backup-and-restore`, :ref:`about-e-discovery` and
-:ref:`about-data-loss-prevention` for electronic communications.
+As such, Bonnie is the answer to the questions about :ref:`about-audit-trail`,
+:ref:`about-archival`, :ref:`about-backup-and-restore`,
+:ref:`about-data-loss-prevention`, :ref:`about-e-discovery` and
+:ref:`about-lawful-interception` for electronic communications.
 
 .. toctree::
    :maxdepth: 1
 
    getting-started
-   architecture-and-design
+   architecture-and-design/index
    technical-documentation/index
 
 .. The following are placeholders for documentation that is not to be
@@ -22,9 +24,11 @@
    :hidden:
 
    about/archival
+   about/audit-trail
    about/backup-and-restore
-   about/e-discovery
    about/data-loss-prevention
+   about/e-discovery
+   about/lawful-interception
 
 Indices and tables
 ==================
bonnie-0.3.8.tar.gz/docs/technical-documentation/broker.rst
Added
@@ -0,0 +1,5 @@
+======
+Broker
+======
+
+.. automodule:: bonnie.broker
bonnie-0.3.8.tar.gz/docs/technical-documentation/collector.rst
Added
@@ -0,0 +1,5 @@
+=========
+Collector
+=========
+
+.. automodule:: bonnie.collector
bonnie-0.3.8.tar.gz/docs/technical-documentation/dealer.rst
Added
@@ -0,0 +1,21 @@
+======
+Dealer
+======
+
+A dealer is used as the means for an IMAP server to emit
+:term:`event notifications` to the Bonnie environment. Two forms of dealing
+exist:
+
+* Synchronous (meaning blocking),
+
+* Asynchronous (meaning non-blocking).
+
+A blocking dealer ensures the action placed against an IMAP server must await
+the successful acceptance of the event having occurred, and is particularly
+cumbersome and therefore dangerous to implement. As such, at this point it
+should be considered an academic exercise.
+
+The asynchronous dealer should be used, and renders Bonnie a near-real-time,
+eventually consistent application suite.
+
+.. automodule:: bonnie.dealer
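The asynchronous (non-blocking) form described in this file can be sketched with a queue and a background thread. This is an illustrative sketch under assumed names, not Bonnie's actual dealer, which dispatches over ZeroMQ:

```python
import queue
import threading

class AsyncDealer:
    """Fire-and-forget dispatch: the IMAP action never waits on delivery."""

    def __init__(self, deliver):
        self.jobs = queue.Queue()
        self.deliver = deliver      # e.g. a function pushing to the broker
        worker = threading.Thread(target=self._run, daemon=True)
        worker.start()

    def notify(self, event):
        # Non-blocking: enqueue the notification and return immediately,
        # so the IMAP operation is never held up by Bonnie.
        self.jobs.put(event)

    def _run(self):
        while True:
            event = self.jobs.get()
            self.deliver(event)
            self.jobs.task_done()

delivered = []
dealer = AsyncDealer(delivered.append)
dealer.notify({'event': 'MessageAppend', 'user': 'john.doe'})
dealer.jobs.join()                  # only this demo waits; callers do not
print(delivered[0]['event'])        # MessageAppend
```

A synchronous dealer would call ``deliver()`` inline and block until the broker acknowledges, which is exactly the behavior the text warns against.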
bonnie-0.3.7.tar.gz/docs/technical-documentation/index.rst -> bonnie-0.3.8.tar.gz/docs/technical-documentation/index.rst
Changed
@@ -6,4 +6,7 @@
    :maxdepth: 1
 
    daemon
+   broker
+   collector
+   dealer
    worker
bonnie-0.3.8.tar.gz/docs/technical-documentation/worker-handlers
Added
+(directory)
bonnie-0.3.8.tar.gz/docs/technical-documentation/worker-handlers/index.rst
Changed
(renamed from docs/technical-documentation/worker-handlers.rst)
bonnie-0.3.7.tar.gz/docs/technical-documentation/worker.rst -> bonnie-0.3.8.tar.gz/docs/technical-documentation/worker.rst
Changed
@@ -5,6 +5,6 @@
 .. toctree::
    :maxdepth: 1
 
-   worker-handlers
+   worker-handlers/index
 
 .. automodule:: bonnie.worker
bonnie-0.3.8.tar.gz/requirements.txt
Added
@@ -0,0 +1,31 @@
+alabaster==0.7.11
+Babel==2.6.0
+backports-abc==0.5
+certifi==2018.4.16
+chardet==3.0.4
+docutils==0.14
+elasticsearch==5.3.0
+futures==3.2.0
+idna==2.7
+imagesize==1.0.0
+Jinja2==2.10
+ldap==1.0.2
+ldap3==2.5
+MarkupSafe==1.0
+packaging==17.1
+pyasn1==0.4.3
+Pygments==2.2.0
+pyparsing==2.2.0
+python-dateutil==2.7.3
+pytz==2018.4
+pyzmq==17.0.0
+requests==2.19.1
+singledispatch==3.4.0.3
+six==1.11.0
+snowballstemmer==1.2.1
+Sphinx==1.7.5
+sphinxcontrib-websupport==1.1.0
+SQLAlchemy==1.2.8
+tornado==5.0.2
+typing==3.6.4
+urllib3==1.23