...
The operator can save the learned tree to a database via its second output port. This port continuously emits a serialized version of the tree, which can be stored in a database such as MongoDB. When the analysis is restarted, the information from the database can be reused. This is useful if a lot of knowledge has been learned and regaining that knowledge would take a long time. On startup, the information from the database is read a single time (if any is available) and loaded into the operator. The PQL query in the code block shows how to use the backup functionality.
```
#PARSER PQL
#DEFINE BACKUPSCHEMA [['tree', 'String'],['backupId', 'String']]
#RUNQUERY
/// Read the backup data from the database
backupMongo = MONGODBSOURCE({
    database = 'odysseus',
    port = 27017,
    host = 'localhost',
    collectionname = 'condition'
})
/// Convert the backup data into tuples
backupTuple = KEYVALUETOTUPLE({
    schema = ${BACKUPSCHEMA},
    TYPE = 'Backup',
    KEEPINPUT = 'false'
}, backupMongo)
stateAnalysis = RARESEQUENCE({
    treedepth = 100,
    minrelativefrequencyPath = 0.1,
    firsttupleisroot = 'true',
    UNIQUEBACKUPID = 'rareSequence_1'
}, state, backupTuple)
analysis = SELECT({PREDICATE = 'anomalyScore > 0'}, stateAnalysis)
/// Convert the backup data from port 1 into a key-value object
keyValueOp = TUPLETOKEYVALUE({
    type = 'KEYVALUEOBJECT'
}, 1:stateAnalysis)
/// Save the backup data in the database
mongoSink = MONGODBSINK({
    database = 'odysseus',
    port = 27017,
    host = 'localhost',
    collectionname = 'condition',
    batchsize = 1,
    deletebeforeinsert = 'true',
    deleteequalattribute = 'backupId'
}, keyValueOp)
```
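Because the sink is configured with `deletebeforeinsert = 'true'` and `deleteequalattribute = 'backupId'`, each newly serialized tree replaces the previous backup that carries the same `backupId`, so the collection holds only the latest state per operator instead of an ever-growing history. A minimal Python sketch of this replace-on-key behavior (the collection is simulated as an in-memory list of documents; the function name is illustrative, not part of Odysseus):

```python
# Simulate the MONGODBSINK behavior with deletebeforeinsert='true' and
# deleteequalattribute='backupId': first delete all documents whose key
# matches the incoming document, then insert the new one.
def backup_insert(collection, doc, key="backupId"):
    # Drop any earlier backup with the same key value ...
    collection[:] = [d for d in collection if d.get(key) != doc.get(key)]
    # ... then store the fresh serialized tree.
    collection.append(doc)

collection = []
backup_insert(collection, {"backupId": "rareSequence_1", "tree": "tree-v1"})
backup_insert(collection, {"backupId": "rareSequence_1", "tree": "tree-v2"})
# Only the most recent backup for 'rareSequence_1' remains in the collection.
```

On restart, the `MONGODBSOURCE` / `KEYVALUETOTUPLE` pair at the top of the query then finds exactly one backup document per `backupId` to feed back into the operator.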