Administration Guide

Data backup

FortiAnalyzer-BigData supports disaster recovery and data portability. You can back up all the data within a Storage Pool to a Hadoop Distributed File System (HDFS) cluster in Parquet file format.
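Because the backup is written as standard Parquet files, the data remains portable to other tools in the Hadoop ecosystem. The following sketch shows one way such a backup could be read back for analysis with PySpark; the NameNode address and backup directory are illustrative placeholders, not paths produced by FortiAnalyzer-BigData.

    # Minimal PySpark sketch: read Parquet backup files directly from HDFS.
    # The hdfs:// host, port, and directory below are hypothetical examples.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("read-storage-pool-backup").getOrCreate()

    # Point Spark at the HDFS directory that holds the exported Parquet files.
    backup_df = spark.read.parquet("hdfs://namenode.example.com:8020/backups/storage_pool_1")

    backup_df.printSchema()    # inspect the exported columns
    print(backup_df.count())   # row count of the backed-up data

    spark.stop()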

To back up data:
  1. From the Data page, locate the Storage Pool you want to back up and select Actions > Backup.

    The Backup Storage Pool Configuration dialog loads with the following fields:

    HDFS Url

      Defines the target directory of the HDFS cluster. By default, the field is set to the built-in HDFS in the Security Event Manager.

      Note: If the URL points to an external HDFS cluster, all of its hosts must be made accessible to the FortiAnalyzer-BigData hosts (see Backup and restore to external HDFS).

    Clean Previous Backup Data

      Enable to delete any previous backup data and start a new backup. Do not enable this option if you want to create an incremental backup.

    Backup Timeout

      Enter the number of hours before the backup job times out. After the timeout, the job aborts.

    Enable Safe Mode

      By default, the backup job processes multiple tables in parallel and ignores any intermediate errors. Enable Safe Mode to back up the Storage Pool tables sequentially and to fail as soon as any error occurs.

      Note: Safe Mode may take longer to complete the backup, so enable it only when the normal backup job fails.

    Advanced Config

      These settings define the resources used for the backup job. Most users should keep the default values.

    Enable Scheduled Backup

      Enable to run the backup automatically on a schedule.

  2. When you are finished, click Save & Backup to begin the backup process.
  3. You can monitor the status of your backup by navigating to Jobs > Storage Pool Backup.
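After the job completes, one way to confirm that Parquet files landed in the target directory is to list it over HDFS. Below is a minimal sketch using PyArrow's Hadoop filesystem client, assuming the machine running it has the Hadoop client libraries (libhdfs) available; the host, port, and path are placeholders for the HDFS URL configured in the dialog.

    # Hypothetical check of the backup target directory; adjust host/port/path
    # to match the HDFS Url used in the Backup Storage Pool Configuration dialog.
    from pyarrow import fs

    hdfs = fs.HadoopFileSystem(host="namenode.example.com", port=8020)

    # Recursively list everything under the backup directory and print file sizes.
    selector = fs.FileSelector("/backups/storage_pool_1", recursive=True)
    for info in hdfs.get_file_info(selector):
        if info.type == fs.FileType.File:
            print(f"{info.path}  {info.size} bytes")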
