
[Salesforce / Release Management] YOU SHALL NOT PASS! (Unless you pass the Quality Gates)

 

What happens if, once you learned how to walk, you start to run?

Well, you might stumble and fall.

Depending on how well you react, you might get a bruise or a bloody nose.

What happens if you have a tool that can increase the speed of your releases?

Well, you might run into some issues during releases.

Depending on whether you are using version control, you may or may not be able to recover your work.

But from a project management perspective the main question is:
How do I avoid stumbling, falling and having a discussion about delivery capabilities with project stakeholders (aka “those who pay the bills”)?

Going back to the running example: we probably stumble because at some point we lose control of the movement. Maybe because we did not practice walking long enough, or because the floor was uneven and “surprised” us. In any case, it is because we lost control.

Believe it or not, it’s similar in software development: If you lose control over what happens in your process ( = movement) you can damage the product you deliver.

We have learned how to “walk” with Copado in the first post of this series, we outlined how all team members contribute to an efficient movement in the second one, and started to “run” in the third post.

So before we even come close to falling, let’s talk about how to maintain control and achieve reproducible and predictable release outcomes.

Our final goal is to achieve that German thing everyone mentions once in a while: Quality.
(Hehe, World Cup 2018, that will stick for a while)

Quality gates are not there to annoy – they are a safety net

Working with Copado, with its seamless Git integration in the background, we avoid the worst-case scenario (falling on your face and breaking the fall with your forehead), as we always have a way to roll back to a stable version.

But the process we introduced also contains two review points, which is something no release process should miss:

  • A review of your developments from a technical perspective by a peer or a lead developer. This will ideally prevent devs from introducing sketchy solutions and you get to keep your job.
  • A review of your developments from an end user perspective (Test). This will make sure end users don’t complain, and you get to keep your job.

There is a reason both steps are manual:

First of all, as of now, no AI is able to understand how and why you developed your features. If a robot could review your code then, admit it, a robot could also write your code, and a lot of us would be out of business.

It also ensures personal accountability for approvals, so people take them seriously.

If you work in an agile project environment the technical and business review might be detailed as a Definition of Done (DoD, aka “the list of things you need to accomplish to burn the story points”). A sample DoD can look like the following list:

  • Document your feature
  • Get peer approval
  • Ensure business test script is available
  • Deploy to QA
  • Test and approve story by business

So wouldn’t it be nice to tie the technical ability to move a feature to the next environment, and to release it, to the completion of your DoD/quality gates?

Validation rules & history tracking to save the day

Before we jump into any config work, let’s get an overview of what we need and what elements (fields, automations) Copado offers already:

Document your feature

  • Required: a field to indicate the document location
  • Available: yes, but not in the way we would like it
  • To configure: create a hyperlink (URL) field; validate that the field is not empty

Get peer approval

  • Required: a field to indicate approval
  • Available: yes, but we don’t jump into pull requests. Maybe later. Start small now.
  • To configure: create a checkbox; validate that the field is checked

Ensure business test is available

  • Required: an indicator for an approved test script
  • Available: yes, but I would like to automate the checkbox, and I don’t like the current name
  • To configure: create a checkbox on the User Story to indicate approved scripts; update the field when a test script is approved; validate that the field is checked

Deploy to QA

  • Required: an indicator for the current org of a feature
  • Available: automated by Copado
  • To configure: nothing to do

Test approved by business

  • Required: an indicator for approved tests
  • Available: yes, but the automation needs to be configured
  • To configure: create a checkbox on the User Story to indicate a successful test; update the field upon a successful test execution; validate that the field is checked
To sum up, in order to implement our quality gates we need:

  • 4 fields
  • 4 validation rules
  • 2 processes

And to make sure we can track who changed the checkboxes, we will enable Feed Tracking for all of them, so that we get a nice history of the DoD in the Chatter feed of the User Story. We also need to modify the page layout to make sure the fields are displayed.

Create the required fields

Starting with the fields, we create the following custom fields on the Copado User Story object:

  • PeerReviewPassed__c, Checkbox
  • TestScriptReady__c, Checkbox
  • TestScriptPassed__c, Checkbox
  • DocumentationLocation__c, URL

For those who need a deeper explanation, click here.

Ensure accountability through history tracking

Next, we will enable those fields in Feed Tracking (Setup → Customize → Chatter → Feed Tracking → User Story). Enable field tracking for the object if it is not enabled yet, and select the fields we just created. Add more fields if you consider them worth tracking; Status is always a good one.

Enforce process adherence with validation rules

Next, we need to tackle the validation rules.

As explained in the first post, a user story is deployed to the next environment when the “Promote & Deploy” checkbox is checked. It can also be selected for a manual promotion by checking the “Promote Change” checkbox. So our validation rules should fire with the following logic (a sample formula follows the list):

  • If the story is still in the Dev environment, fire when “Promote & Deploy” or “Promote Change” is checked and any of these gates is not met:

    • Documentation Location is empty
    • Peer Review Passed is unchecked
    • Test Script Ready is unchecked

  • If the story is already in the UAT environment, fire when “Promote & Deploy” or “Promote Change” is checked and Test Script Passed is unchecked.
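
As an illustration, here is a sketch of the Dev-stage peer review rule in validation rule formula syntax; the other three rules follow the same pattern with their respective field. The copado__ API names (the two promote checkboxes and the Environment lookup) and the environment name “Dev1” are assumptions based on a typical Copado setup and may differ in your org.

    AND(
        /* the story is about to be promoted */
        OR( copado__Promote_and_Deploy__c, copado__Promote_Change__c ),
        /* ...while it still sits in the development environment */
        copado__Environment__r.Name = "Dev1",
        /* ...and this quality gate has not been met yet */
        NOT( PeerReviewPassed__c )
    )

Give each rule its own specific error message, so users immediately see which gate is still open.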

Modify the layout to make it easy for users to follow

We added a lot of fields, so let’s modify the layout to make it easier for the end user to manage. Copado already provides some fields that we do not use in our case, such as Documentation Complete, because they do not fully match our scenario (and this post is also about showing what can be done, right?).

However, there is a field called “Apex Tests Passed”. It is a checkbox and it is set automatically if you hit the “Run Apex Tests” button on the story and the classes in your user story have sufficient coverage. There is also already a page layout section called “Definition of Done”. We will simply remove the unwanted fields and add our new ones.

Looks nice, and has a certain logic. Let’s check the validation rule:

Perfect.

We are done here, so the only thing left is to set up the process builder automations and go for an after work drink with colleagues and/or friends.

Reduce redundant clicks with process builder

The key here is to fit into the Copado approach a little to get our automations started. Our favorite release management tool assumes that a test script is not necessarily written by a single developer, but possibly by a larger team, where a review process can be part of creating the test script. (Yes, those testers again…)

To indicate that a test script is ready, its status needs to be set to “Complete”. And that is all we need in order to create our process:

  • Object: Test Script (copado__Test_Script__c)
  • Event: On record creation or update
  • Conditions: Status equals Complete

As an immediate action, we want to update a record related to the one that triggered the process: the object we update is the User Story, and the field we set is Test Script Ready, which should be checked ( = TRUE).

We could also set the status of the story accordingly (e.g. “Ready for Testing”), but that is not required for now.
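
If you prefer code over clicks, the same logic can also be expressed as a small Apex trigger. This is only a sketch: copado__Status__c (test script status) and copado__User_Story__c (lookup to the story) are assumed managed field names, so check the actual API names in your org first.

    // Sketch of the "test script is ready" automation as an Apex trigger.
    // Field API names on the managed objects are assumptions and may differ.
    trigger TestScriptReadyFlag on copado__Test_Script__c (after insert, after update) {
        Set<Id> storyIds = new Set<Id>();
        for (copado__Test_Script__c script : Trigger.new) {
            if (script.copado__Status__c == 'Complete' && script.copado__User_Story__c != null) {
                storyIds.add(script.copado__User_Story__c);
            }
        }
        if (storyIds.isEmpty()) {
            return;
        }
        List<copado__User_Story__c> stories = new List<copado__User_Story__c>();
        for (Id storyId : storyIds) {
            // Check the quality gate field we created earlier on the User Story.
            stories.add(new copado__User_Story__c(Id = storyId, TestScriptReady__c = true));
        }
        update stories;
    }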

To automate checking the “Test Script Passed” field, we will create a different but similar process.

  • Object: Test Run (copado__Test_Run__c)
  • Event: On record creation or update
  • Conditions: Status equals “Passed” or “Passed with Comments”

The immediate action in this case needs to update the Test Script Passed field on the related story. So it is more or less the same as the process created before; only the target field changes, to “Test Script Passed” = TRUE.
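
For completeness, the same pattern as an Apex sketch for this second automation, again with assumed managed field names (copado__Status__c for the test run status, copado__User_Story__c for its story lookup):

    // Sketch: check "Test Script Passed" on the story once a related test run passes.
    trigger TestRunPassedFlag on copado__Test_Run__c (after insert, after update) {
        Map<Id, copado__User_Story__c> storyUpdates = new Map<Id, copado__User_Story__c>();
        for (copado__Test_Run__c run : Trigger.new) {
            Boolean passed = run.copado__Status__c == 'Passed'
                || run.copado__Status__c == 'Passed with Comments';
            if (passed && run.copado__User_Story__c != null) {
                storyUpdates.put(run.copado__User_Story__c,
                    new copado__User_Story__c(Id = run.copado__User_Story__c, TestScriptPassed__c = true));
            }
        }
        if (!storyUpdates.isEmpty()) {
            // The business test gate is now met for these stories.
            update storyUpdates.values();
        }
    }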

Done. Well, you should test it, of course 🙂

Try that with Jenkins. Or Bamboo. Or TeamCity.

Gosh, I love this tool.

That’s all nice, but what else could you do?

Although this is a rather simple scenario, it already helps any user follow the process, even if they have not studied it intensively beforehand: simply by setting up the tool to support what we outlined with boxes and arrows, and by providing information on how to proceed.

We can now easily check the DoD status of stories with list views or reports to find gaps. If someone asks “Who approved this?”, we just need to check the Chatter posts generated by Salesforce. And if we start discussing user stories in the Chatter feed as well, the team will be best buddies with any internal project audit initiative.

Going a step further, validation rules can help to cover more complex scenarios, e.g. allowing a story to be validated against a specific org, but not deployed to it.

And if your project has issues with code quality, a static analysis tool can support your code reviews. Copado has built-in support for an open-source Apex analyzer (PMD), and if your team uses CodeScan or Checkmarx, I have been told that CodeScan support will be available in the next version (Copado v12) and that a Checkmarx integration is on the roadmap.

For those who want a more structured technical review process, Copado can be set up to mirror pull requests as records linked to the user story, so with a checkbox, a process and a validation rule you would be able to prevent deploying stories whose pull request has not been approved yet.

I am aware that following a release process is sometimes annoying for developers, but there are good reasons for certain checks, even when the change seems small.

But thanks to tools like Copado, following the required steps is a breeze, and by automating repetitive field updates and notifications you can ensure frictionless releases.

[Salesforce / Release Management] Automation for the win!

You hate manual tasks and you are working on a Salesforce.com implementation? Then get a pen, a piece of paper and install & set up Copado.

Why Copado? Because it’s a great solution to manage releases of and with Salesforce.

Why pen and paper? Because it’s a great and versatile tool for drawing your current process and highlighting automation possibilities.

Things are ok. But can we make it better?

A couple of weeks ago, we installed and set up Copado to manage our release process.

Committing to and deploying based on Git version control turned out to be easier and more secure than working with change sets (finally you can track who screwed up your stuff and restore it).

Next, we took some time to think about the people involved in an implementation, and how projects can work better as a team to release faster and better.

Now that our team is aligned and everyone has confidence in the process, the team decided during the retrospective to improve the process further and automate some of the steps:

  • Avoid the manual step of changing endpoints after deploying a custom setting
  • Commit and deploy dashboards with their running users, because doing it manually is just annoying
  • Mitigate the bad development practice of hard-coding IDs
  • Send a notification to testers once a user story reaches the test environment

Deploy the same, but different

There are items in software development that need to be modified based on your environment. Typical cases are integration endpoints, which change depending on whether you want to connect to a development, test, or production instance of e.g. SAP. In Salesforce, you can add further items to the list, like IDs, for example if you want to exempt users with an admin profile from a validation rule. You can hardcode the Id as part of the formula (bad, think about kittens and unicorns), or you can put it in a custom setting, which is better but still requires a manual step after deployment: updating the setting record in the target org.

This is where Copado can help you automate those steps, with the concept of Environment Variables.

If this is too technical and not emotional enough: we will get rid of any manual changes related to hard-coded IDs or URLs, regardless of whether they are part of a Custom Setting, Validation Rule, Visualforce page or class. Believe me.

How does it work?

If you define a specific string, Copado will recognize it and translate it to a variable name that you define. Upon deployment, it replaces the variable with the string you defined for the target environment.
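
To make this tangible, here is a hedged before/after illustration for a hard-coded profile Id in a validation rule, using an illustrative variable name profileId_admin (the placeholder values are hypothetical, and the exact representation of the variable in the committed file may vary by Copado version):

    In Dev1 (source org):      $Profile.Id <> "<hard-coded Dev1 admin profile Id>"
    In Git (after the commit): $Profile.Id <> "profileId_admin"
    In UAT (after deployment): $Profile.Id <> "<admin profile Id defined for UAT>"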

Set up Environment Variables

In our case, we will set up an enVar (Environment Variable) for an endpoint, a user and the Admin profile Id.

  1. Get a clear view of your environment-specific strings. A table works best. Feel free to use pen and paper, if you like.

  2. In Copado, go to your active Deployment Flow, and click on the “Manage Environment Variables” button.
  3. Create a new Variable with the name you prefer. To keep an overview, I prefer to use a naming convention: elementType_usageOrItem
  4. Populate the string per environment or copy&paste from your table to Copado.

Follow the same steps for the other strings, and you will end up with something like this:

Use Environment Variables

So, let’s go ahead and deploy those items. The dashboard’s running user is part of the XML file, so it is fine to just commit it. The same is true for hard-coded IDs in classes or validation rules. However, the custom setting value is stored in a record, so it needs to be handled differently.

  1. Create a Copado User Story.
  2. Go to “Commit Changes”, select the dashboard, underlying report, endpoint custom setting (in case it is not deployed yet) and the validation rule with a hardcoded Id.

  3. Provide a commit message, and click on “Commit Changes” to finish the process.
    Take your time to review the commit and check whether the items have been added to Git as expected. On the dashboard, you will see that the Running User tag has been replaced with the variable name. The same happened to the Id in the validation rule.

  4. Ok, so the commit is done, but we still need to account for the custom setting. On the User Story, scroll down to the “Deployment Tasks” related list and create a new task to run after the deployment, with type “Custom Setting”; select your setting and click on “Get Custom Setting Values”.
  5. Once you have the list, select the setting records you need, and save the step.

Well, this is the moment!

Will it work, will it deploy and replace?

Check the “Promote & Deploy” checkbox on the story to see Copado Magic at work.

Deployment Done.

Log into UAT.

A quick check…

All items were resolved. Perfectly!

Besides deploying the custom setting itself, Copado has moved the setting record and correctly replaced the URL in the Target field.

The Id in the Validation rule was replaced:

The Dashboard in UAT has the expected CEO running user:

Since Copado applies this mechanism to all files, a single entry lets us account for hard-coded IDs or URLs in any Salesforce metadata, including Visualforce pages and classes.

Process Automation

Ok, now the deployment is done, the user story is in UAT, and although we don’t have to update IDs, we still need to notify the tester that the story is now available for review.

Easy, because we are working on the Force.com platform and Copado provides all the information required.
When a story is deployed to the next environment, Copado updates the story’s Environment field automatically (Dev1 → UAT in this case). There is also a Test Script Owner field on the story, which is a lookup to a user record.

In technical terms: we have a DML operation and, behind a lookup, a user record with an email address, so we have tons of options.

Which Salesforce automation tool do you prefer? We can use Workflow Rules, Process Builder or Apex to fire an email to the user story’s tester or to a generic email address of the test team.

But email? This is so SAP. What about a chatter message instead?! Process it is.

Create a new Process on Creation or Edit of the story record, to fire for all Stories where:

  • The “Test Script Owner” field is not empty (Test Script Owner → User Id).
  • The “Environment Name” field equals “UAT” (Environment → Environment Name).

Next, create an immediate action of type “Post to Chatter”.

To get the notification right, you only need to remember that @mentions from Process Builder require square brackets: @[ mergeFieldOfAUserId ]

If you cannot find the Chatter action on the User Story, enable it for the User Story object in Setup → Feed Tracking.

From now on, when we deploy a user story to UAT, there will be a chatter message on the record and the tester will get notified too (also via email, depending on the settings).
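
And if you ever need the same notification from code instead of Process Builder (for example from a trigger or a batch job), ConnectApi can post the @mention programmatically. This is only a sketch: copado__Test_Script_Owner__c is an assumed API name for the Test Script Owner lookup on the Copado User Story.

    // Sketch: post "@tester ... ready for testing" on the user story's own feed.
    public with sharing class NotifyTesterOnUat {
        public static void notify(copado__User_Story__c story) {
            // @mention the tester (assumed lookup to the User object)
            ConnectApi.MentionSegmentInput mention = new ConnectApi.MentionSegmentInput();
            mention.id = story.copado__Test_Script_Owner__c;

            ConnectApi.TextSegmentInput text = new ConnectApi.TextSegmentInput();
            text.text = ' the user story has been deployed to UAT and is ready for testing.';

            ConnectApi.MessageBodyInput body = new ConnectApi.MessageBodyInput();
            body.messageSegments = new List<ConnectApi.MessageSegmentInput>{ mention, text };

            ConnectApi.FeedItemInput post = new ConnectApi.FeedItemInput();
            post.subjectId = story.Id;   // post on the User Story record itself
            post.body = body;

            ConnectApi.ChatterFeeds.postFeedElement(null, post);
        }
    }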

Copado & Force.com: A lot of value for little time invested

Now you might want to know how much time and effort it really takes to set up this type of automation.

To be honest, it took me longer to go through all environments and get the correct values than to set up the environment variables. But in most cases, projects have this information as part of their org-refresh steps documentation.

The main “difficulty” in setting up the process was getting the @mention syntax right (you are welcome 🙂 ) and fixing some typos.

It is so incredibly easy to automate even complex scenarios that I set up another action for user stories in UAT, to get ahead of my team’s requirements: run all unit tests in UAT. But not as part of the deployment itself (that would be too easy, and it might slow the deployment down). Instead, it should happen after the deployment has finished successfully.

The only thing to do here is to create another condition in our Process (Environment = UAT), and create an action of type “Apex”.

Copado provides a set of helpful methods, which can be triggered in Apex or from the Process Builder, and the one we pick is “Invoke Run All Apex Tests for an Org”. As a parameter, we provide the Org Credential Id of the user story.

Now all Apex tests will run once a story has successfully arrived in UAT.

Done. It took longer to write about it than to actually set it up.

One platform. A lot of possibilities

Usually, the tools for committing, deploying, tracking/managing stories and testing are separate. But Copado puts all of it on the highly flexible Force.com platform, so that users can tweak it however they like.

  • Automatic deployments to Stage once testing is done? Create a Process Builder to check “Promote & Deploy” once there is a successful test execution record.
  • A nightly validation deployment of all stories in Stage to Prod, just to check if there would be any issues? Well, you would need Apex to bundle the user stories and fire the deployment. Basically Query & create records.
  • Auto-execute regression testing after a deployment? Record your Selenium Script and add it to be executed after deployment to a certain environment.
  • You can even go crazy: read the information on the committed metadata and auto-deploy only if less critical items such as report types have been committed, from Dev all the way to Prod including a sync back to Dev2, with the commit as the only human intervention (include an approval on QA though; we still have to follow the process, and you don’t want the testers to be mad at you).

If this is too technical and not emotional enough, maybe this shows better how it feels to release features now:
