You can import data to a database from the following location types:

  • HTTP/S
  • FTP
  • SFTP
  • Amazon S3
  • Google Cloud Storage
  • Microsoft Azure Storage
  • NAS/Local Storage

The source file to import must be in the RDB format. It can also be a gzip-compressed (gz) RDB file.

Supply an array of dataset import source objects to import data from multiple files.
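As a sketch, a request body that imports from two RDB files over FTP might look like the following. The `dataset_import_sources` wrapper key is an assumption based on the cluster's import API, and the host, credentials, and paths are placeholders:

```json
{
  "dataset_import_sources": [
    {
      "type": "url",
      "url": "ftp://user:password@example-host:21/backups/db-part1.rdb.gz"
    },
    {
      "type": "url",
      "url": "ftp://user:password@example-host:21/backups/db-part2.rdb.gz"
    }
  ]
}
```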

Basic parameters

For all import location objects, you need to specify the location type via the type field.

| Location type | `type` value |
|---|---|
| FTP/S | `"url"` |
| SFTP | `"sftp"` |
| Amazon S3 | `"s3"` |
| Google Cloud Storage | `"gs"` |
| Microsoft Azure Storage | `"abs"` |
| NAS/Local Storage | `"mount_point"` |

Location-specific parameters

Additional required parameters differ based on the import location type.

FTP

| Key name | Type | Description |
|---|---|---|
| url | string | A URI that represents the FTP/S location, in the format `ftp://user:password@host:port/path/`. The user and password can be omitted if not needed. |
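For example, an FTP import source object combines the `type` value with the `url` parameter (host, credentials, and path below are placeholders):

```json
{
  "type": "url",
  "url": "ftp://user:password@ftp.example.com:21/backups/dump.rdb"
}
```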

SFTP

| Key name | Type | Description |
|---|---|---|
| key | string | SSH private key to secure the SFTP server connection. If you do not specify an SSH private key, the autogenerated private key of the cluster is used, and you must add the SSH public key of the cluster to the SFTP server configuration. (optional) |
| sftp_url | string | SFTP URL in the format `sftp://user:password@host[:port]/path/filename.rdb`. The default port number is 22 and the default path is `/`. |
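A hypothetical SFTP import source object (host, user, and path are placeholders). The optional `key` parameter is omitted here, so the cluster's autogenerated private key would be used:

```json
{
  "type": "sftp",
  "sftp_url": "sftp://user@sftp.example.com:22/backups/dump.rdb"
}
```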

Amazon S3

| Key name | Type | Description |
|---|---|---|
| access_key_id | string | The AWS Access Key ID with access to the bucket |
| bucket_name | string | S3 bucket name |
| secret_access_key | string | The AWS Secret Access Key that matches the Access Key ID |
| subdir | string | Path to the backup directory in the S3 bucket (optional) |
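A sketch of an S3 import source object using only the parameters listed above (bucket name, directory, and credentials are placeholders):

```json
{
  "type": "s3",
  "bucket_name": "example-bucket",
  "subdir": "backups",
  "access_key_id": "AKIAIOSFODNN7EXAMPLE",
  "secret_access_key": "<secret-access-key>"
}
```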

Google Cloud Storage

| Key name | Type | Description |
|---|---|---|
| bucket_name | string | Cloud Storage bucket name |
| client_email | string | Email address for the Cloud Storage client ID |
| client_id | string | Cloud Storage client ID with access to the Cloud Storage bucket |
| private_key | string | Private key for the Cloud Storage account, matching the private key ID |
| private_key_id | string | Cloud Storage private key ID with access to the Cloud Storage bucket |
| subdir | string | Path to the backup directory in the Cloud Storage bucket (optional) |
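A sketch of a Google Cloud Storage import source object using only the parameters listed above; the bucket name, service account details, and key values are placeholders:

```json
{
  "type": "gs",
  "bucket_name": "example-bucket",
  "subdir": "backups",
  "client_id": "<client-id>",
  "client_email": "service-account@example-project.iam.gserviceaccount.com",
  "private_key_id": "<private-key-id>",
  "private_key": "<private-key>"
}
```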

Azure Blob Storage

| Key name | Type | Description |
|---|---|---|
| account_key | string | Access key for the storage account |
| account_name | string | Storage account name with access to the container |
| container | string | Blob Storage container name |
| sas_token | string | Token to authenticate with a shared access signature |
| subdir | string | Path to the backup directory in the Blob Storage container (optional) |

Note: `account_key` and `sas_token` are mutually exclusive.
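A sketch of an Azure Blob Storage import source object authenticating with an account key (container, account name, and key are placeholders). To authenticate with a shared access signature instead, supply `sas_token` in place of `account_key`, since the two are mutually exclusive:

```json
{
  "type": "abs",
  "container": "example-container",
  "subdir": "backups",
  "account_name": "examplestorage",
  "account_key": "<account-key>"
}
```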

NAS/Local Storage

| Key name | Type | Description |
|---|---|---|
| path | string | Path to the locally mounted file to import. You must create the mount point on all nodes, and the `redislabs:redislabs` user must have read permissions on the local mount point. |
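A hypothetical NAS/local storage import source object; the mount path is a placeholder and must exist with the required read permissions on all nodes:

```json
{
  "type": "mount_point",
  "path": "/mnt/backups/dump.rdb"
}
```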