Export and Import of User Activity

In this article
  • Export and Import of User Activity
  • Exporting Activity
  • Importing Activity
  • Important Considerations

Export and import of user activity are intended for transferring monitoring data between servers, creating backups, or restoring data in the event of an emergency shutdown. These features are used:

  • During migration to another server or cluster
  • For backing up activity archives
  • When transferring activity between test and production environments
  • For integrating monitoring data with other analytics systems

The system supports exporting and importing user activity. To export or import activity, you need the Export/Import of User Activity privilege with the E (Execute) operation.

Additionally, in the system configuration file com.operavix.subsystem.monitoring.json, the parameter that controls parsing and synchronizing activity archives received from agents, and subsequently transferring them to ClickHouse, must be disabled: set parsing_activity_enabled to false.
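For reference, the relevant fragment of com.operavix.subsystem.monitoring.json could look like this. Only the parsing_activity_enabled key is taken from this article; the file's other contents are omitted:

```json
{
  "parsing_activity_enabled": false
}
```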

Exporting Activity

To export activity data to a specified directory on the disk, enter a GraphQL query:

mutation{
  activity_exchange{
    export_activity_queue(directory_path:"C:\\work\\Business-Projects\\activity\\monitoring\\data\\backup", limit_per_file:25000){
      time_ms
      total_size
      count
      files
    }
  }
}
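If you call the API over HTTP rather than through the GraphiQL console, the mutation is sent as an ordinary JSON POST. The sketch below uses only the Python standard library; the endpoint URL and api_key value are placeholders to adapt to your deployment:

```python
import json
import urllib.request

# Placeholder endpoint; substitute your server address and API key.
GRAPHQL_URL = "https://automation-dev.corp.com/graphql?&api_key=<key>"

EXPORT_MUTATION = """
mutation {
  activity_exchange {
    export_activity_queue(
      directory_path: "C:\\\\work\\\\Business-Projects\\\\activity\\\\monitoring\\\\data\\\\backup",
      limit_per_file: 25000
    ) { time_ms total_size count files }
  }
}
"""

def build_request(url: str, query: str) -> urllib.request.Request:
    """Package a GraphQL query as a JSON POST request (not yet sent)."""
    payload = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it:
# with urllib.request.urlopen(build_request(GRAPHQL_URL, EXPORT_MUTATION)) as resp:
#     print(json.load(resp))
```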

Specify:

  • directory_path — the path to the directory where activity data is exported
  • limit_per_file — the maximum number of activity archives per export container. If not specified, the exported data is split into containers of 25000 activity archives each
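As a rough sketch, the per-container limit determines how many container files the export produces. The helper below is our own illustration, not part of the API:

```python
import math

def containers_needed(archive_count: int, limit_per_file: int = 25000) -> int:
    """Number of export containers produced for archive_count activity archives."""
    if archive_count <= 0:
        return 0
    return math.ceil(archive_count / limit_per_file)

print(containers_needed(60000))         # 3 containers with the default limit
print(containers_needed(60000, 10000))  # 6 containers with limit_per_file: 10000
```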

Query fields are explained in the table below.

Field       Data Type  Definition
time_ms     Long       Operation execution time in milliseconds
total_size  Long       Size of raw bytes read from the built-in file database, including the name, but excluding external container archive metadata
count       Int        Total number of activity archives exported from the built-in file database
files       String     List of file names generated during the export process in the specified directory

During export, writing new archives to the built-in file database is blocked. The file is exported with the name activity_queue_({C}-{CN})_{yyyy_MM_dd_HH_mm_ss_SSS}.zip, where:

  • {C} — the first key
  • {CN} — the last key
  • {yyyy_MM_dd_HH_mm_ss_SSS} — the archive creation time

Example archive name: activity_queue_(1-3)_2022_12_21_16_22_46_000.zip.
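If you need to process exported files programmatically, the name can be parsed back into its parts. This Python sketch is our own illustration, not part of the product, and assumes the exact format shown above:

```python
import re
from datetime import datetime

ARCHIVE_RE = re.compile(
    r"activity_queue_\((?P<first>\d+)-(?P<last>\d+)\)_"
    r"(?P<ts>\d{4}_\d{2}_\d{2}_\d{2}_\d{2}_\d{2}_\d{3})\.zip"
)

def parse_archive_name(name: str):
    """Extract the first key, last key, and creation time from an archive name."""
    m = ARCHIVE_RE.fullmatch(name)
    if m is None:
        raise ValueError(f"unexpected archive name: {name}")
    created = datetime.strptime(m["ts"], "%Y_%m_%d_%H_%M_%S_%f")
    return int(m["first"]), int(m["last"]), created

first, last, created = parse_archive_name(
    "activity_queue_(1-3)_2022_12_21_16_22_46_000.zip"
)
print(first, last, created)  # 1 3 2022-12-21 16:22:46
```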

Important
  • When exporting data from the embedded file-based database, do not delete the corresponding records from ClickHouse.
  • Data in the embedded database is synchronized with ClickHouse, and during subsequent import, the system relies on ClickHouse to restore the information.
  • If records are missing in ClickHouse, the import process triggers a lengthy reverse synchronization procedure or fails with data consistency errors.

Activity data will not be exported if:

  • The path to which the data is exported is not specified in the GraphQL query
  • The path does not exist on the disk, or a file is specified instead of a directory
  • The activity queue write process could not be locked within 800 ms; in this case, retry the request
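Since a lock failure only means the 800 ms window was missed, a simple client-side retry is usually enough. The generic helper below is a sketch, not part of the product API:

```python
import time

def retry(operation, attempts: int = 3, delay_s: float = 1.0):
    """Call operation() until it succeeds or attempts run out,
    sleeping delay_s seconds between tries; re-raises the last error."""
    last_exc = None
    for _ in range(attempts):
        try:
            return operation()
        except RuntimeError as exc:  # e.g. activity queue could not be locked within 800 ms
            last_exc = exc
            time.sleep(delay_s)
    raise last_exc

# Usage: wrap the call that sends the export mutation.
# result = retry(lambda: send_export_mutation(...), attempts=5, delay_s=2.0)
```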

Importing Activity

To upload activity data of existing users and activity sources from a specified directory on the disk, enter a GraphQL query:

mutation{
  activity_exchange{
    import_activity_queue(directory_path:"C:\\work\\Business-Projects\\activity\\monitoring\\data\\backup", is_create_by_login:true){
      time_ms
      total_size
      count
      files
    }
  }
}

Specify:

  • directory_path — the server path from where activity data is imported
  • is_create_by_login: true/false — controls whether the system checks that sources exist for the user and creates them if necessary:
    • If a source with the given login and domain already exists in the database, it is considered valid, and no further action is required
    • If a source with the given login exists but the domain does not match, the user ID is extracted from the incoming source, and a new source is created for that user
    • If no match is found for both login and domain, a new source is created automatically
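The three branches above can be sketched as a small decision function. This is our own illustration; the field names are assumptions, not the product's data model:

```python
def resolve_source(existing_sources, incoming):
    """Return the action taken for an incoming source when is_create_by_login
    is true. Sources are dicts with the assumed keys 'login' and 'domain'."""
    same_login = [s for s in existing_sources if s["login"] == incoming["login"]]
    if any(s["domain"] == incoming["domain"] for s in same_login):
        return "keep"                  # login and domain both match: source is valid
    if same_login:
        return "create_for_same_user"  # login matches, domain differs
    return "create_new"                # no match on login at all

existing = [{"login": "jdoe", "domain": "corp"}]
print(resolve_source(existing, {"login": "jdoe", "domain": "corp"}))    # keep
print(resolve_source(existing, {"login": "jdoe", "domain": "other"}))   # create_for_same_user
print(resolve_source(existing, {"login": "asmith", "domain": "corp"}))  # create_new
```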

The query fields are described in the table below.

Field       Data Type  Definition
time_ms     Long       Operation execution time in milliseconds
total_size  Long       Size of raw bytes read from the archive container, including the name
count       Int        Total number of activity archives imported into the built-in file database
files       String     List of file names read by the import operation
Warning

If the activity source and user are missing, execute the GraphQL query using the API key: remove the letter i from "graphiql" in the URL, for example automation-dev.corp.com/graphql?&api_key=<key>, and refresh the page after modifying the URL.

To import activity, place the monitoring agent archives into a single archive named activity_queue_({C}-{CN})_{yyyy_MM_dd_HH_mm_ss_SSS}.zip, where:

  • {C} — the first key
  • {CN} — the last key
  • {yyyy_MM_dd_HH_mm_ss_SSS} — the archive creation time
Note

You can specify an approximate archive creation time; it is only necessary to adhere to the indicated format.
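Building a compliant archive name from code could look like the following Python sketch; the helper is illustrative, not part of the product:

```python
from datetime import datetime

def build_archive_name(first_key: int, last_key: int, created: datetime) -> str:
    """Format an import archive name: activity_queue_({C}-{CN})_{timestamp}.zip."""
    ts = created.strftime("%Y_%m_%d_%H_%M_%S_") + f"{created.microsecond // 1000:03d}"
    return f"activity_queue_({first_key}-{last_key})_{ts}.zip"

print(build_archive_name(1, 3, datetime(2022, 12, 21, 16, 22, 46)))
# activity_queue_(1-3)_2022_12_21_16_22_46_000.zip
```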

Activity data will not be imported in the following cases:

  • The server path from which the data is imported is not specified in the GraphQL query
  • The path does not exist on the disk, or a file is specified instead of a directory

Important Considerations

  • Input/output errors are written to the event log
  • For a user's activity from one server to correspond to the same user on another server after import, a source with the matching login and domain must be created for that user
  • After import, the files are deleted from the disk, and the activity data is written to the built-in file database
  • After loading activity onto the server, set the parsing_activity_enabled parameter back to true
