Incremental backups

We recommend that you create incremental backups by consistently backing up new data to the same HDFS directory.

The first time a backup job is run, a full backup of the storage group data will be saved to the HDFS directory. Subsequent runs will perform incremental backups which only contain the rows that have changed since the initial full backup. Thus, the subsequent backups will be faster and more efficient.
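
The sketch below is a minimal, product-independent illustration of this pattern; it is not the product's backup implementation, and every path and helper name in it is hypothetical. The first run against a directory writes everything, while later runs against the same directory write only the rows that changed since the previous run.

    # Conceptual sketch only -- not the product's backup code. It shows why the
    # first run to a directory is a full backup and later runs are incremental:
    # only rows that changed since the previous run are written.
    import json
    import os


    def backup(rows, backup_dir):
        """Write rows that are new or changed since the last run to backup_dir."""
        os.makedirs(backup_dir, exist_ok=True)
        state_path = os.path.join(backup_dir, "last_state.json")

        # State captured by the previous run; empty on the first (full) run.
        previous = {}
        if os.path.exists(state_path):
            with open(state_path) as f:
                previous = json.load(f)

        # Only rows that differ from the previous snapshot need to be written.
        changed = {key: value for key, value in rows.items()
                   if previous.get(key) != value}

        delta_count = len([n for n in os.listdir(backup_dir) if n.startswith("delta_")])
        with open(os.path.join(backup_dir, "delta_%d.json" % delta_count), "w") as f:
            json.dump(changed, f)

        # Persist the full current state so the next run can diff against it.
        with open(state_path, "w") as f:
            json.dump(rows, f)
        return sorted(changed)


    if __name__ == "__main__":
        # First run against an empty directory: every row is "changed",
        # so this is effectively a full backup.
        print(backup({"a": "1", "b": "2"}, "/tmp/demo_backup"))
        # Second run to the same directory: only the modified row is written.
        print(backup({"a": "1", "b": "3"}, "/tmp/demo_backup"))

In this analogy, cleaning the previous backup data corresponds to discarding the saved state, which is why the next run becomes a full backup again.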

To create manual incremental backups:

If you have already created a previous backup, you can manually create an incremental backup against it.

  1. From the navigation bar, go to Jobs and click Storage Group Backup to view all the completed backups.
  2. Select the backup that you want to create an incremental backup against, and click View Config.
    The Job Instance Configuration dialog opens, showing the backup's configuration, including the HDFS Url field.
  3. In the HDFS Url field, copy the URL.
    For example: hdfs://cluster/backup/7o7T
  4. Go to Data, select the same Storage Group as the previous backup, and click Actions > Backup.
  5. In the HDFS URL field, paste the HDFS Url that you copied in step 3.
    Note

    You can check the number of existing backups in the Backup Storage Group Configuration dialog.

  6. Ensure that the Clean Previous Backup Data option is disabled so that the previous backup data is preserved and this backup runs as an incremental backup.
    Note

    You can enable this option to make a full backup to the HDFS directory; however, a full backup job is more time consuming than an incremental backup.

  7. When you are finished, click Save & Backup to begin the backup process. To optionally confirm what the job wrote to HDFS, see the sketch after this procedure.
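
After the backup job finishes, you can confirm that new files appeared under the same HDFS directory. The sketch below is one way to do that from a host where the standard hdfs command-line client is installed and your user can read the backup path; it simply wraps hdfs dfs -ls, and the URL shown is the example from step 3, so substitute your own.

    # Optional verification sketch: list the contents of the HDFS backup
    # directory. Assumes the `hdfs` client is on PATH and the path is readable;
    # this is not part of the UI workflow described above.
    import subprocess
    import sys


    def list_backup(hdfs_url):
        """Print the contents of the HDFS backup directory."""
        result = subprocess.run(
            ["hdfs", "dfs", "-ls", hdfs_url],
            capture_output=True,
            text=True,
        )
        if result.returncode != 0:
            # Surface the HDFS client's error, for example a missing directory.
            sys.stderr.write(result.stderr)
            return
        print(result.stdout)


    if __name__ == "__main__":
        # Replace with the HDFS URL you copied in step 3.
        list_backup("hdfs://cluster/backup/7o7T")
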
To create scheduled incremental backups:

You can also schedule incremental backup jobs by enabling the Enable Scheduled Backup option. This schedules recurring incremental backup jobs to the HDFS directory you specified. Fortinet strongly recommends scheduling maintenance jobs at off-peak hours.
