Configuring bucket backup options

Before you start protecting data in buckets, you can adjust bucket protection to the needs of your data protection environment by configuring bucket backup options.

Backup options

  • Pre/post Scripts: Enables you to specify the pre-backup and post-backup scripts that perform the necessary actions before and/or after the backup of a bucket is performed.
  • Data Movers: Enables you to specify where you want R‑Cloud to create a data mover during the backup. By default, the data mover is created in the original AWS account, Azure resource group, or Google Cloud project of the bucket.

Prerequisites

  • Only if you plan to specify pre-backup and post-backup scripts.

    • The #!/usr/bin/env python3 header must be specified in the script.

    • For Google Cloud:

      • The HYCU Managed Service Account (HMSA) must have access to the bucket where the script is located.

      • Only if you are using a service account to run the scripts. The following line of code must be present in the script:

        os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/tmp/hycu/serviceAccount.json'

  • Only if you plan to configure backup options for multiple buckets. All buckets must have the same values set for each option that you plan to configure.

Limitations

Only if you plan to specify pre-backup and post-backup scripts:

  • Only Python scripts are supported.

  • For AWS: The pre-backup and post-backup scripts must be located in the same AWS account and region as the bucket.

  • For Azure: The pre-backup and post-backup scripts must be located in the same Azure resource group as the bucket.

  • For Google Cloud: Only the googleapiclient Python library can be used for making Google Cloud API calls.
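
Example  The following is a minimal sketch of a Google Cloud pre-backup script that follows these requirements. It is only an illustration: the bucket name hycu-scripts and the metadata call are placeholders, while the script header, the credentials line, and the use of the googleapiclient library come from the requirements above.

  #!/usr/bin/env python3
  import os
  import googleapiclient.discovery

  # Required only if a service account is used to run the script.
  os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/tmp/hycu/serviceAccount.json'

  # Google Cloud API calls must go through the googleapiclient library.
  storage = googleapiclient.discovery.build('storage', 'v1')

  # Illustrative action: read the metadata of a bucket before the backup.
  bucket = storage.buckets().get(bucket='hycu-scripts').execute()
  print(bucket['name'], bucket.get('location'))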

Considerations

Only if you plan to configure the data mover. Unless specified otherwise, the data mover is created:

  • For AWS: In the same region as the bucket (for example, US-EAST-1).

  • For Azure: In the same resource group and region as the bucket (for example, East US).

  • For Google Cloud: In the following region (based on the location type of the bucket):

    • Region: In the same region as the bucket (for example, US-CENTRAL1).
    • Dual-region:

      Dual-region name    Data mover region
      ASIA1               ASIA-NORTHEAST1
      EUR4                EUROPE-NORTH1
      NAM4                US-CENTRAL1
    • Multi-region:

      Multi-region name    Data mover region
      ASIA                 ASIA-EAST1
      EU                   EUROPE-WEST1
      US                   US-CENTRAL1

Procedure

  1. In the Buckets panel, select the buckets for which you want to configure backup options.

  2. Click Configuration. The Bucket Configuration dialog box opens.

  3. Depending on whether you want to specify the pre-backup and post-backup scripts for a single bucket or multiple buckets, or configure the data movers, do the following:

    • Only if you want to specify the pre-backup and post-backup scripts for a single bucket. On the Pre/post Scripts tab, specify the scripts to perform necessary actions before and/or after the backup of the bucket is performed:

      • In the Pre-backup Script field, enter the path to the script that R‑Cloud will run before it performs the backup of the bucket.

      • In the Post-backup Script field, enter the path to the script that R‑Cloud will run after it performs the backup of the bucket.

      Important  The path to the script is case sensitive, so make sure to enter it exactly, including lowercase and uppercase letters. You must specify the path in the following format:

      • For AWS: s3://bucket-name/script.py parameter1 parameter2 ...

      • For Azure: az://storage-account-name/container-name/script.py parameter1 parameter2 ...

      • For Google Cloud: gs://bucket-name/script.py parameter1 parameter2 ...

      Example  The following is an example of the first lines of a pre-backup script for a Google Cloud bucket (a sketch of how such a script might continue is shown after this procedure):

      #!/usr/bin/env python3
      import os
      import googleapiclient.discovery


      os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/tmp/hycu/serviceAccount.json'

      storage = googleapiclient.discovery.build('storage', 'v1')

    • Only if you want to specify the pre-backup and post-backup scripts for multiple buckets. On the Pre/post Scripts tab, do the following:

      1. Specify the scripts to perform the necessary actions before and/or after the backup of the buckets is performed. To do so, choose one of the following:

        • If you want to use a new script, select Add New, enter the path to the script, and then click Save.

        • If any of the selected buckets already have a pre-backup or post-backup script set and you want to use the same script for all other selected buckets, select the preferred script.

      2. Only if any of the selected buckets already have a pre-backup or post-backup script set. Select the Override these buckets check box if you want the specified script to be used for all the selected buckets.

      Important  The path to the script is case sensitive, so make sure to enter it exactly, including lowercase and uppercase letters. You must specify the path in the following format:

      • For AWS: s3://bucket-name/script.py parameter1 parameter2 ...

      • For Azure: az://storage-account-name/container-name/script.py parameter1 parameter2 ...

      • For Google Cloud: gs://bucket-name/script.py parameter1 parameter2 ...

      Example  The following is an example of the first lines of a pre-backup script for a Google Cloud bucket (a sketch of how such a script might continue is shown after this procedure):

      #!/usr/bin/env python3
      import os
      import googleapiclient.discovery


      os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/tmp/hycu/serviceAccount.json'

      storage = googleapiclient.discovery.build('storage', 'v1')

    • Only if you want to configure the data movers. On the Data Movers tab, provide the following information:

      1. From the Region drop-down menu, select the preferred region.

        Note  It is recommended that you select the same region as the one where the bucket resides. Otherwise, you will be charged for outbound data transfer. For details, see Amazon S3, Azure, or Google Cloud pricing.

      2. For Azure buckets: From the Network drop-down menu, select the preferred network.

      3. From the Subnet drop-down menu, select the preferred subnet. By default, the data mover is created in the default subnet of the preferred region and zone.

      4. For Amazon S3 buckets: Optionally, from the Security Group drop-down menu, select the preferred security group. By default, the data mover is created in the default security group of the preferred subnet.

  4. Click Save.
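
Example  The Example blocks in step 3 show only the first lines of a pre-backup script. The following is a sketch of how such a script might continue for a Google Cloud bucket. It assumes that the parameters appended after the script path (parameter1, parameter2, and so on) are passed to the script as command-line arguments; the bucket name and the object listing are illustrative only.

  #!/usr/bin/env python3
  import os
  import sys
  import googleapiclient.discovery

  os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/tmp/hycu/serviceAccount.json'

  storage = googleapiclient.discovery.build('storage', 'v1')

  # Assumption: the first parameter after the script path is the bucket name.
  bucket_name = sys.argv[1] if len(sys.argv) > 1 else 'example-bucket'

  # Illustrative action: log the first few objects before the backup starts.
  response = storage.objects().list(bucket=bucket_name, maxResults=5).execute()
  for item in response.get('items', []):
      print(item['name'], item.get('updated'))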