Customer Identity and Access Management (CIAM) is a hot topic of late for enterprises that directly communicate online with their customers, attracting attention from analyst firms including Gartner, Forrester and KuppingerCole.
However, a CIAM system isn’t much use if it contains no customer identities, or if those identities don’t carry enough data about each customer to be actionable. When a company migrates to a new CIAM solution, it is very common to also migrate data from its old site, and perhaps merge data from multiple digital properties. Here’s how Gigya helps our customers with these sometimes challenging transitions.
In 2016, the Gigya Global Services team carried out over 160 data migration projects into our cloud-based software-as-a-service (SaaS) platform, moving more than 110 million users out of legacy systems; many of these projects also involved merging disparate data from multiple sources. Our single biggest project: migrating 30 million user records, all at once.
If you have to maintain an existing user management system or database while migrating its data to a new system, then a temporary feedback loop will be necessary to keep the original system updated. Depending on the complexity of this feedback loop and the performance of the old system, this sort of mechanism can have considerable impact on the performance of a site at web-scale.
When our clients engage with Gigya, we migrate all users into our platform to reduce the overhead of maintaining another database of user data and credentials. This also reduces the security risks of maintaining those databases and adds value to platform features such as single sign-on (SSO), preference management, federation support and more.
The key objective is a transition where customers don’t know that a change has occurred, with usernames and passwords or their social login credentials remaining the same. The end customer should only notice features from the CIAM implementation, such as SSO, risk-based authentication, preference management and so on.
The migration journey at Gigya starts when a technical consultant reviews your existing password hashing format. Out of the box, Gigya supports a wide range of password hashing algorithms and, if yours is not among them, we have the ability to migrate a custom hash into the platform.
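This is why keeping usernames and passwords working across the cutover is possible without ever knowing the plaintext passwords. As an illustration of the general technique (a sketch of a common industry pattern, not Gigya's specific mechanism), the legacy hash can be verified as-is at login time and the credential transparently re-hashed with a stronger key-derivation function on first successful login. The salted SHA-256 legacy scheme below is a hypothetical example; the real format is whatever the source system used and must be confirmed during the review.

```python
import hashlib
import hmac
import os

def legacy_verify(password: str, salt: bytes, stored_hash: bytes) -> bool:
    # Hypothetical legacy scheme: salted SHA-256 of the password.
    candidate = hashlib.sha256(salt + password.encode("utf-8")).digest()
    # Constant-time comparison avoids leaking timing information.
    return hmac.compare_digest(candidate, stored_hash)

def rehash_on_login(password: str, salt: bytes, stored_hash: bytes):
    # Verify against the legacy hash; on success, re-hash with a modern
    # KDF (PBKDF2 here) so the legacy format can eventually be retired.
    if not legacy_verify(password, salt, stored_hash):
        return None
    new_salt = os.urandom(16)
    new_hash = hashlib.pbkdf2_hmac(
        "sha256", password.encode("utf-8"), new_salt, 200_000
    )
    return new_salt, new_hash
```

The customer notices nothing: the same password works before and after, and the weaker hash disappears from storage over time as users log in.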
When it comes to the actual migration for multiple sites and multiple markets, the question we are often asked is “Do we need to go for a big bang approach or a phased approach?” The answer is “yes”: we can support and design either method, and have done so for many of our existing clients.
The big bang data migration approach will typically involve four stages:
- Test Import – Typically around 10 percent of production accounts. This gives the client the opportunity to validate data quality and confirm that logins for existing accounts work.
- Staging Import – Typically a full import. This provides the client with the opportunity to validate the data quality on a larger scale and perform volume tests.
- Production Import – Typically we advise that this be completed two to three days before the “go live” date to ensure all accounts are in place and ready for seamless migration.
- Production Delta Import – This will take place on the day of the migration or the day after, and will typically include a small number of accounts which have been updated or created since the production import.
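The delta import step above amounts to selecting only the accounts that changed after the full production import. A minimal sketch of that selection, assuming each account record carries created and last-updated timestamps (the field names here are illustrative and depend on the source schema):

```python
def select_delta(accounts, production_import_time):
    # Keep only accounts created or updated after the full production
    # import, so the delta import stays small. Timestamps can be epoch
    # seconds or datetime objects, as long as they compare consistently.
    return [
        a for a in accounts
        if a["created"] > production_import_time
        or a["lastUpdated"] > production_import_time
    ]
```

Because the full import ran only two to three days earlier, this delta is typically a tiny fraction of the user base and can be loaded on go-live day with minimal risk.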
The phased data migration approach will typically involve the same four stages, repeated for each phase. It usually includes merging data from the various data sources into a single identity in Gigya, which enriches the customer’s identity when collating from multiple legacy sources. With a phased approach, the design will also typically include a synchronization of new data back to the legacy platform to ensure that new customers will still be able to log in there.
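The merge of several legacy records into one enriched identity can be illustrated with a simple “newest field wins” sketch. This is an assumption about one reasonable merge policy, not the Gigya schema or merge engine; the keys and field names below are hypothetical:

```python
def merge_identities(records):
    # Merge records from multiple legacy sources into one identity per
    # email address. Records are applied oldest-first, so on conflicting
    # fields the most recently updated record wins, while older records
    # still contribute any fields the newer ones lack.
    merged = {}
    for rec in sorted(records, key=lambda r: r["lastUpdated"]):
        identity = merged.setdefault(rec["email"].lower(), {})
        identity.update(rec)
    return merged
```

In practice the merge key (email, loyalty ID, etc.) and the conflict-resolution rules are agreed with the client per data source, since they determine which system is treated as the source of truth for each attribute.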
Data migrations can be daunting. However, the Gigya Global Services team has a great deal of experience guiding clients to a smooth transition and a successful outcome with even the most complex data project.
By Stephen Purvis