February 8, 2019 By Phil Alger 4 min read

Migrating your data to IBM Cloud Databases for Redis

If you’re moving your data over to IBM Cloud Databases for Redis, you’ll need to take some steps to successfully migrate all of your data. We’ve got you covered. In this post, we’ll show you a quick way to start migrating your data across to Databases for Redis, whether your database is on-premises or in the cloud.

If you’re a Redis user, you already know why Redis is a great database for quickly storing and retrieving in-memory data. If you’re thinking about moving into the cloud or transitioning from Compose for Redis (or another cloud provider) to Databases for Redis, we’d like to guide you through the migration process. The good thing is that there isn’t much for you to do if you’re migrating from your local or Compose for Redis database to Databases for Redis.

Let’s talk about how to migrate.

The migration script

Migrating to Databases for Redis involves running a simple Python script that we’ve made available on GitHub. The script will copy all the keys from your source database over to your Databases for Redis deployment. You’ll want to download the script and make sure you have Python 3.x installed. If you’re on macOS, you can use Homebrew to install it by running brew install python3, which will give you the latest version.

Note that you will also need the Python dependencies listed in the comments of the script.

Next, we recommend creating a migration window to let your users know that you’ll be doing some maintenance. That way you’ll have some time to migrate all your data over to your new Databases for Redis deployment. If you’re using Redis as a key-value store with expire times on keys, rest assured that these expiry times will be copied over to your new database. We tested this with 10 million keys in our database; depending on your bandwidth, a migration of that size won’t take much time.
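
To give you a sense of what happens under the hood, here’s a minimal sketch of that kind of copy loop using the redis-py client. The hostnames, ports, passwords, and certificate path below are placeholders, and the actual script handles more options and edge cases than this:

import redis

# Placeholder connection details -- substitute your own source and destination.
src = redis.Redis(host='source-host', port=25000, password='source-password',
                  ssl=True)
dst = redis.Redis(host='destination-host', port=30000, password='destination-password',
                  ssl=True, ssl_ca_certs='/path/to/dbredisCA')

# Walk the source keyspace incrementally, then replay each key onto the
# destination with its remaining expiry intact.
for key in src.scan_iter(count=1000):
    ttl = src.pttl(key)        # remaining lifetime in milliseconds (-1 = no expiry)
    dumped = src.dump(key)     # serialized value; None if the key vanished mid-scan
    if dumped is not None:
        # RESTORE takes a TTL of 0 for keys that should never expire.
        dst.restore(key, ttl if ttl > 0 else 0, dumped, replace=True)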

Getting the destination and source database credentials

Now, you’ll need the credentials of both the source database and your Databases for Redis deployment. You can get the credentials for the Databases for Redis deployment by clicking on your database from the IBM Cloud resources panel. Then click Service credentials in the left-hand menu to open the Service credentials view. From there, you can create a new credential by clicking New credential, or use any credentials that you’ve already created.

Another way to get this information is through the IBM Cloud CLI. Using the cdb plugin, you’d run:

ibmcloud cdb deployment-connections <Redis deployment name>

This will provide you with your Databases for Redis connection URI that includes the hostname and port. To get the decoded CA certificate for the database, you’d run:

ibmcloud cdb deployment-cacert <Redis deployment name>
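
The command prints out the decoded certificate, so a convenient way to capture it is to redirect the output straight into a file (the file path here is just an example):

ibmcloud cdb deployment-cacert <Redis deployment name> > ~/dbredisCA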

With the CA certificate saved to a file, you’ll have it on hand to connect to the database later. If you don’t know the password for your Redis deployment, you’ll need to either get it from your generated service credentials or create a new password by running:

ibmcloud cdb deployment-user-password <Redis deployment name> admin <new password>

If your destination database is running Redis 6 or greater, you will need both a username and a password.
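
Before running the migration, it can be worth confirming those destination credentials with a quick connection test. Here’s a small redis-py sketch with placeholder values, assuming a Redis 6 deployment that expects a username:

import os
import redis

# Placeholder values -- use the hostname, port, and credentials of your deployment.
dst = redis.Redis(host='000.000.databases.appdomain.cloud', port=30000,
                  username='admin', password='dbredispassword1',
                  ssl=True, ssl_ca_certs=os.path.expanduser('~/dbredisCA'))
print(dst.ping())  # True means the credentials and CA certificate check out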

With this information, we have what we need for the destination. For the source, let’s say that we want to migrate our data from Compose for Redis to our new Databases for Redis deployment. To do that, you’ll need the host, port, and password for your Compose for Redis database. You can get those by following the same steps we covered above: click on your Compose for Redis database from the IBM Cloud resources panel, then create or use your service credentials.

Running the script and migrating data

Now that you have all the credentials for both databases (Compose and Databases for Redis), we’ll show you how to run the script. We’ve named the Python script file pymigration.py. All you need to do now is run it from your terminal with the credentials you’ve gathered above:

python pymigration.py <source host> <source password> <source port> <destination host> <destination password> <destination port> <destination ca certificate path> --sslsrc --ssldst

If your destination database is running Redis 6 or greater, your “destination password” should be formatted as username:password. If it’s running a version earlier than Redis 6, you can simply provide the password on its own.

Since we’re copying data from a Compose for Redis database, you’ll need to add the --sslsrc flag if your Compose for Redis database is SSL/TLS-enabled; it tells the script to connect to the source over SSL/TLS. If it isn’t, leave the flag off. You also need to add --ssldst, since the destination is Databases for Redis, which is always SSL/TLS-enabled. Supplementary flags you could add are --db and --flush. Using --db, you can indicate the database your keys are copied from, which is also the database they’ll be copied into on your Databases for Redis deployment. The --flush flag lets you flush the destination database before importing the keys from the source database. If you want to keep things fresh in your Databases for Redis deployment, --flush will delete all the keys first, then import the new keys from your source database.
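
For instance, to copy keys from database 0 on the source and clear the destination before importing, the invocation might look like the following (the flag values here are examples, so double-check the script’s usage notes for the exact syntax):

python pymigration.py <source host> <source password> <source port> <destination host> <destination password> <destination port> <destination ca certificate path> --sslsrc --ssldst --db 0 --flush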

Running the above script using Compose for Redis as the source for the data migration and Databases for Redis as the destination for the migrated data, we’d get something like:

python pymigration.py portal0.0000.composedb.com composepassword1 88888 000.000.databases.appdomain.cloud dbredispassword1 99999 ~/dbredisCA --sslsrc --ssldst

10000000 keys: 100% |###################################################| Time: 0:00:00
Keys disappeared on source during scan: 0
Keys already existing on destination: 0

As you can see from the results, we copied 10 million keys from Compose for Redis to Databases for Redis, and no keys were deleted from the Compose for Redis database. If we add a new key to the Compose for Redis deployment and attempt to migrate the data again, we’ll see the Keys already existing on destination count change to 10000000, since the original 10 million keys already exist on that database.

10000000 keys: 100% |###################################################| Time: 0:00:00
Keys disappeared on source during scan: 0
Keys already existing on destination: 10000000

Conclusion

Migrating your data couldn’t be simpler. After your migration, all you need to do is swap out your application’s database connection strings with your Databases for Redis connection string and credentials. That’s all it takes!
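
As an illustration, if your application uses redis-py, that swap can be as small as updating the connection URL (the URI and certificate path below are placeholders for the values in your service credentials):

import redis

# Placeholder connection string -- substitute the one from your service credentials.
client = redis.Redis.from_url(
    'rediss://admin:dbredispassword1@000.000.databases.appdomain.cloud:30000/0',
    ssl_ca_certs='/path/to/dbredisCA',
)
client.ping()  # verify the application can reach the new deployment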
