When you think of DevOps, Agile development, CI/CD, or even just release management in general, Salesforce isn’t the first platform that comes to mind. While Salesforce was born in the cloud and pioneered the concept of software-as-a-service, Salesforce teams are generally considered “behind the curve” in adopting DevOps practices, largely because Salesforce is a “walled garden,” and open ecosystems innovate much faster. There’s no real way to escape progress, however, especially since Salesforce is so much more than just a CRM platform today. A better description would be a platform for building highly customized applications on top of the Salesforce framework. Where there’s a build, there’s deployment, and where there’s deployment, there needs to be DevOps if you want to stay relevant and competitive.
Customization at a cost
Now, in addition to separate clouds for sales, marketing, community, apps, and IoT, Salesforce also features automated marketing solutions, CPQ solutions, and Einstein Analytics. With great customizability comes great complexity, and as projects scale and expand, Salesforce teams find themselves in a situation similar to that of their counterparts in the public cloud. Whether you call it cloud complexity or API sprawl, it’s the same problem: the result of using traditional methods to deal with modern environments.
Salesforce requires some heavy lifting when it comes to change management in the context of multiple development tracks and parallel releases, and keeping track of the implications of those changes can be especially taxing. This is where DevOps tools come to the rescue with features like automated release management and version control that take away the “guesswork” and ensure that all changes are effective and delivered to production as desired.
Metadata can be described quite simply as “data about data,” and in Salesforce it exists as XML files that describe the structure of both standard and custom objects, along with their fields and page layouts. The need for version control arises from Salesforce’s heavy dependence on metadata, along with the fact that metadata is shared among custom objects and profiles, causing a lot of overwriting and ensuing chaos. Additionally, any change made to a sub-metadata type causes the entire custom object to be retrieved, further complicating an already complex situation.
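To make the XML point concrete, here’s a minimal sketch in Python that parses a simplified, illustrative object-metadata file in the Salesforce style. The object and field names are invented for the example, and the structure is pared down; real metadata files carry many more elements.

```python
import xml.etree.ElementTree as ET

# Illustrative Salesforce-style object metadata: one custom object
# with two custom fields (hypothetical names, simplified structure).
OBJECT_XML = """<?xml version="1.0" encoding="UTF-8"?>
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <label>Invoice</label>
    <fields>
        <fullName>Amount__c</fullName>
        <type>Currency</type>
        <label>Amount</label>
    </fields>
    <fields>
        <fullName>Due_Date__c</fullName>
        <type>Date</type>
        <label>Due Date</label>
    </fields>
</CustomObject>"""

NS = {"md": "http://soap.sforce.com/2006/04/metadata"}

def list_fields(xml_text):
    """Return (fullName, type) pairs for every field in the object XML."""
    root = ET.fromstring(xml_text.encode("utf-8"))
    return [
        (f.find("md:fullName", NS).text, f.find("md:type", NS).text)
        for f in root.findall("md:fields", NS)
    ]

print(list_fields(OBJECT_XML))
# → [('Amount__c', 'Currency'), ('Due_Date__c', 'Date')]
```

Note that the fields live inside the object’s single XML file — which is exactly why touching one “child” field drags the whole custom object along with it.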
Ideally, restrictions need to be set in place that prevent changes to a sub-metadata type from pushing the entire object. AutoRABIT has a pretty efficient Version Control System for Salesforce that not only allows users to maintain different versions of changes and keep track of them but also features EZ-Check-ins so developers can retrieve changes made to the last sub-metadata “child.” AutoRABIT also “appends” the XML file as opposed to the traditional overwriting technique that doesn’t work for modern complex environments.
In case of emergency
AutoRABIT also features a backup and recovery solution that’s metadata aware. This is a pretty big deal, to say the least, not just due to the complexity of Salesforce environments today but also because Salesforce describes its own recovery service as “time-consuming, expensive and to be used as a last resort.” Even with version control and release management, no backup and recovery means that even a minor event could affect your production environment and leave you incapable of a rollback.
A good example is the Target registers that went offline earlier this year, costing the company over $50 million. AutoRABIT offers features such as one-click recovery that syncs transactional data with the corresponding metadata. This ability is powered by an entire parallel backup environment that can help you either selectively recover from a minor event or completely recover from a breach or a crash, with unlimited hierarchy depth.
With regard to backup and recovery in Salesforce, it’s the recovery process that’s the hard part. This is because there’s always so much happening: accounts being updated, changes being made, transactions being processed, and hotfixes being pushed to production. The ability to compare data between the live environment and the backup environment is critical here so that the best route back online can be determined quickly.
To ensure your recovery is the most current and relevant, AutoRABIT Vault has a live comparison utility that frequently compares your backup environment to production and makes sure metadata is in sync. This is essential in the case of a minor event where only a few accounts are corrupted and you would rather selectively restore just the affected accounts as opposed to the entire object. The latest release of Vault (version 19.3) is especially focused on maintaining this integrity with metadata during data transfer. It does this with Metadata Mastery, a proprietary technology that flows through all DevOps processes as a core element in AutoRABIT.
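As a rough illustration of this kind of live comparison — a generic sketch, not AutoRABIT’s actual implementation — the snippet below diffs a backup snapshot against production to work out which records a selective restore would need to touch. The record shapes and IDs are invented; a real tool would also consult transaction history to distinguish corruption from legitimate changes.

```python
def diff_snapshots(backup, production):
    """Compare two {record_id: record_dict} snapshots and report what
    a selective restore would need to touch."""
    # Records present in the backup but gone from production.
    missing = {k: backup[k] for k in backup.keys() - production.keys()}
    # Records present in both but with differing field values.
    changed = {
        k: backup[k]
        for k in backup.keys() & production.keys()
        if backup[k] != production[k]
    }
    return {"restore_missing": missing, "restore_changed": changed}

backup = {
    "001A": {"Name": "Acme", "Tier": "Gold"},
    "001B": {"Name": "Globex", "Tier": "Silver"},
}
production = {
    "001A": {"Name": "Acme", "Tier": None},  # corrupted field
    # 001B was deleted in production
}

plan = diff_snapshots(backup, production)
print(plan)
```

The payoff of a diff like this is precision: you restore only `001A` and `001B` rather than rolling the whole object back and clobbering the thousands of records that were fine.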
Flexibility and compliance
Salesforce applications can often create difficulties with compliance with government and industry regulations like HIPAA, SOX, or GDPR. This is because many of these regulations include geographic restrictions that users cannot comply with, simply because they do not have full control over their data. That’s why it’s important to choose a backup and recovery solution that both complies with government and industry regulations and gives you control over where your data is stored.
Applications that interact with customers around the clock cannot afford any downtime whatsoever. So, in addition to making sure your backup and recovery solution covers the government regulations you are working under, you need to make sure your backups are flexible, or “agile.” Frequent automatic backup is a key feature to look for, especially if your application is online 24/7. Delta or “lean” backups are another core element of AutoRABIT, allowing users to back up selectively and reducing the resource footprint considerably. To ensure compliance, additional features include encrypted backup on-premises, in a public cloud, or in a data lake for analysis.
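The delta idea itself fits in a few lines. This is a generic illustration, not AutoRABIT’s code; it leans on `LastModifiedDate`, the standard Salesforce record timestamp, as the change marker.

```python
from datetime import datetime, timezone

def delta_backup(records, last_backup_at):
    """Keep only records modified since the previous backup run,
    instead of copying the full snapshot every time."""
    return [r for r in records if r["LastModifiedDate"] > last_backup_at]

records = [
    {"Id": "001A", "LastModifiedDate": datetime(2019, 7, 1, tzinfo=timezone.utc)},
    {"Id": "001B", "LastModifiedDate": datetime(2019, 7, 9, tzinfo=timezone.utc)},
]
last_run = datetime(2019, 7, 5, tzinfo=timezone.utc)

changed = delta_backup(records, last_run)
print([r["Id"] for r in changed])  # → ['001B']
```

With frequent runs, each delta stays small, which is what makes “24/7” backup schedules affordable in the first place.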
Fodder for tests
Perhaps one of the greatest benefits of having a parallel backup environment, which is basically captured production data, is the ability to use it as a testing ground. When you pull production data into QA, however, you need to be careful since you’re dealing with real customer data and need to take into account privacy regulations. AutoRABIT enables users to basically have their cake and eat it too with data-masking (or privacy-masking) technology that lets you use real data for testing while also making sure sensitive information isn’t available to developers.
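A minimal sketch of the masking idea — generic, not AutoRABIT’s technology, with a hypothetical field list — is to replace sensitive values with deterministic tokens, so test records stay distinct and joinable without exposing the originals.

```python
import hashlib

# Fields treated as sensitive in this hypothetical schema.
SENSITIVE_FIELDS = {"Email", "Phone", "SSN__c"}

def mask_record(record):
    """Replace sensitive values with deterministic tokens so records
    remain distinct (and joinable) in QA without exposing real data."""
    masked = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS and value is not None:
            token = hashlib.sha256(str(value).encode()).hexdigest()[:10]
            masked[field] = f"masked-{token}"
        else:
            masked[field] = value
    return masked

prod_record = {"Name": "Acme", "Email": "ceo@acme.example", "Phone": "555-0100"}
qa_record = mask_record(prod_record)
print(qa_record["Name"])   # non-sensitive field passes through unchanged
print(qa_record["Email"])  # sensitive field replaced by a token
```

Because the token is derived from the value, the same email masks to the same token everywhere, so relationships across objects survive masking — which is what keeps the QA data realistic enough to test against.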
Salesforce and DevOps: It’s about time
While Salesforce took its time to jump on the DevOps wagon, high release velocity and improved time to market are now essential across platforms. Additionally, with the complexity that comes from a heavy sprawl of files due to Salesforce’s 39MB deployment file-size limit, version control and automated release management are the only escape from long hours of manual labor and heavy guesswork. Backup and restore is a key factor in Salesforce DevOps, especially in the case of lean deployments and selective backups that save time and keep the organization as simple as possible.