Export and Import of User Activity
The system supports exporting and importing user activity. To download and upload activity data, you must be granted the Export/Import of User Activity privilege with the E (Execute) operation.
Additionally, in the com.operavix.subsystem.monitoring.json configuration file, the parameter that controls parsing and synchronizing activity archives received from agents (and subsequently transferring them to CH) must be disabled: set the parsing_activity_enabled parameter to false.
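As a sketch, the relevant fragment of com.operavix.subsystem.monitoring.json might look as follows (only the parsing_activity_enabled key is named in this article; any surrounding keys in the real file are omitted here):

```json
{
  "parsing_activity_enabled": false
}
```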
Exporting Activity
To export activity data to a specified directory on the disk, enter a GraphQL query:
mutation {
  activity_exchange {
    export_activity_queue(directory_path: "C:\\work\\Business-Projects\\activity\\monitoring\\data\\backup", limit_per_file: 25000) {
      time_ms
      total_size
      count
      files
    }
  }
}
Specify:
- directory_path — the path to which the activity data is exported
- limit_per_file — the maximum number of activity archives per export container. If the limit is not specified, the exported data is split into containers of 25,000 activity archives each
| Field | Data Type | Definition |
|---|---|---|
| time_ms | Long | Operation execution time in milliseconds |
| total_size | Long | Size of raw bytes read from the built-in file database, including the name but excluding external container archive metadata |
| count | Int | Total number of activity archives exported from the built-in file database |
| files | String | List of file names generated in the specified directory during the export |
During export, writing new archives to the built-in file database is blocked. The file is exported with the name activity_queue_({C}-{CN})_{yyyy_MM_dd_HH_mm_ss_SSS}.zip, where:
- {C} — the first key
- {CN} — the last key
- {yyyy_MM_dd_HH_mm_ss_SSS} — the archive creation time
Example archive name: activity_queue_(1-3)_2022_12_21_16_22_46_000.zip.
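The naming format above can be parsed back into its parts with a short sketch (the regular expression and helper name below are illustrative, derived from the format described in this article):

```python
import re
from datetime import datetime

# Pattern for activity_queue_({C}-{CN})_{yyyy_MM_dd_HH_mm_ss_SSS}.zip
NAME_RE = re.compile(
    r"activity_queue_\((?P<first>\d+)-(?P<last>\d+)\)_"
    r"(?P<stamp>\d{4}_\d{2}_\d{2}_\d{2}_\d{2}_\d{2}_\d{3})\.zip"
)

def parse_export_name(name: str):
    """Extract the first key, last key, and creation time from an export file name."""
    m = NAME_RE.fullmatch(name)
    if m is None:
        raise ValueError(f"unexpected export file name: {name}")
    # %f parses the trailing millisecond digits as a fraction of a second.
    stamp = datetime.strptime(m["stamp"], "%Y_%m_%d_%H_%M_%S_%f")
    return int(m["first"]), int(m["last"]), stamp

first, last, created = parse_export_name(
    "activity_queue_(1-3)_2022_12_21_16_22_46_000.zip"
)
print(first, last, created)  # → 1 3 2022-12-21 16:22:46
```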
Activity data will not be exported if:
- The path to which the data is exported is not specified in the GraphQL query
- The path does not exist on the disk, or a file is specified instead of a directory
- The activity queue write process could not be locked within 800 ms; in this case, retry the request
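Any GraphQL client can send the export mutation over HTTP. A minimal sketch that builds the JSON request body follows; the mutation text comes from this article, while the helper name and default are assumptions, and the transport (endpoint URL, authentication) is left out:

```python
import json

def build_export_request(directory_path: str, limit_per_file: int = 25000) -> str:
    """Build the JSON body for the export_activity_queue mutation."""
    # Double the backslashes so the Windows path survives inside the GraphQL string literal.
    escaped = directory_path.replace("\\", "\\\\")
    query = (
        "mutation { activity_exchange { "
        f'export_activity_queue(directory_path:"{escaped}", limit_per_file:{limit_per_file})'
        " { time_ms total_size count files } } }"
    )
    return json.dumps({"query": query})

body = build_export_request(r"C:\work\Business-Projects\activity\monitoring\data\backup")
print(body)
```

The resulting string can be POSTed to the server's GraphQL endpoint with any HTTP client.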
Importing Activity
To import activity data of existing users and activity sources from a specified directory on the disk, enter a GraphQL query:
mutation {
  activity_exchange {
    import_activity_queue(directory_path: "C:\\work\\Business-Projects\\activity\\monitoring\\data\\backup", is_create_by_login: true) {
      time_ms
      total_size
      count
      files
    }
  }
}
If the activity source or user is missing, execute the GraphQL query using an API key: in the link, remove the letter "i" from the word graphiql and append the API key. Example: example.operavix.com/graphql?&api_key=key. After changing the link, refresh the page.
Specify:
- directory_path — the server path from which the activity data is imported
- is_create_by_login: true/false — a parameter that controls the check for the existence of sources for the user and creates them if necessary:
  - If a source exists in the database by login and domain, the source is considered to exist, and no further action is required for it
  - If a source exists by login but there is no match by domain, the user ID is extracted from the received source, and a new source is created for this user
  - If there is no match by login and domain, the source is created automatically
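The decision rules above can be sketched as plain logic (the data structures, function name, and return values are illustrative assumptions, not the product's API):

```python
def resolve_source(sources, login, domain):
    """Decide how an incoming (login, domain) pair is handled when is_create_by_login is true.

    `sources` is a list of dicts with 'login', 'domain', and 'user_id' keys.
    Returns a tuple (action, user_id).
    """
    by_login = [s for s in sources if s["login"] == login]
    # Match on both login and domain: the source already exists, nothing to do.
    for s in by_login:
        if s["domain"] == domain:
            return ("exists", s["user_id"])
    # Login matches but domain does not: reuse the user, create a new source for them.
    if by_login:
        return ("create_source_for_user", by_login[0]["user_id"])
    # No match by login and domain: create the source automatically.
    return ("create_automatically", None)

sources = [{"login": "jdoe", "domain": "CORP", "user_id": 7}]
print(resolve_source(sources, "jdoe", "CORP"))    # → ('exists', 7)
print(resolve_source(sources, "jdoe", "LAB"))     # → ('create_source_for_user', 7)
print(resolve_source(sources, "asmith", "CORP"))  # → ('create_automatically', None)
```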
| Field | Data Type | Definition |
|---|---|---|
| time_ms | Long | Operation execution time in milliseconds |
| total_size | Long | Size of raw bytes read from the archive container, including the name |
| count | Int | Total number of activity archives imported into the built-in file database |
| files | String | List of file names read by the import operation |
To import activity, place the monitoring agent archives into a single archive named activity_queue_({C}-{CN})_{yyyy_MM_dd_HH_mm_ss_SSS}.zip, where:
- {C} — the first key
- {CN} — the last key
- {yyyy_MM_dd_HH_mm_ss_SSS} — the archive creation time
You can specify an approximate archive creation time; it is only necessary to adhere to the indicated format.
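A valid import archive name can be generated from the key range and any timestamp in the required format; a sketch (the helper name is an assumption):

```python
from datetime import datetime

def import_archive_name(first_key: int, last_key: int, created: datetime) -> str:
    """Build a name matching activity_queue_({C}-{CN})_{yyyy_MM_dd_HH_mm_ss_SSS}.zip."""
    # strftime has no millisecond directive, so derive it from the microseconds.
    stamp = created.strftime("%Y_%m_%d_%H_%M_%S_") + f"{created.microsecond // 1000:03d}"
    return f"activity_queue_({first_key}-{last_key})_{stamp}.zip"

name = import_archive_name(1, 3, datetime(2022, 12, 21, 16, 22, 46))
print(name)  # → activity_queue_(1-3)_2022_12_21_16_22_46_000.zip
```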
Activity data will not be imported in the following cases:
- The server path from which the data is imported is not specified in the GraphQL query
- The path does not exist on the disk, or a file is specified instead of a directory
Important Considerations
- Input/output errors are recorded in the event log
- For a user's activity from one server to correspond to the same user on another server after import, a source with login and domain must be created for them
- After import, the files are deleted from the disk, and the activity data is written to the built-in file database