What happens if, once you learned how to walk, you start to run?
Well, you might stumble and fall.
Depending on how well you react you might get a bruise or a bloody nose.
What happens if you have a tool that can increase the speed of your releases?
Well, you might run into issues during releases.
Depending on whether you use version control, you may or may not be able to recover your work.
But from a project management perspective the main question is:
How do I avoid stumbling, falling and having a discussion about delivery capabilities with project stakeholders (aka “those who pay the bills”)?
Staying with the running example: we stumble because at some point we lose control of the movement. Maybe we did not practice walking long enough, or the floor was uneven and “surprised” us. Either way, we lost control.
Believe it or not, it’s similar in software development: If you lose control over what happens in your process ( = movement) you can damage the product you deliver.
We have learned how to “walk” with Copado in the first post of this series, we outlined how all team members contribute to an efficient movement in the second one, and started to “run” in the third post.
So before we even come close to falling, let’s talk about how to maintain control and achieve reproducible and predictable release outcomes.
Our final goal is to achieve that German thing everyone once in a while mentions: Quality.
(Hehe, World Cup 2018, that will stick for a while)
Quality gates are not there to annoy – they are a safety net
Working with Copado, with its seamless Git integration in the background, we avoid the worst-case scenario (falling on your face and dampening the hit with your forehead), because we always have a way to roll back to a stable version.
But the process we introduced also contains two review points, something no release process should lack:
- A review of your developments from a technical perspective by a peer or a lead developer. This will ideally prevent devs from introducing sketchy solutions and you get to keep your job.
- A review of your developments from an end user perspective (Test). This will make sure end users don’t complain, and you get to keep your job.
There is a reason both steps are manual:
First of all, as of now, no AI is able to understand how and why you developed your features. If a robot could review your code then, admit it, a robot could also write your code, and a lot of us would be out of business.
Second, manual steps ensure personal accountability for approvals, so people take them seriously.
If you work in an agile project environment, the technical and business reviews might be spelled out in a Definition of Done (DoD, aka “the list of things you need to accomplish to burn the story points”). A sample DoD can look like the following list:
- Document your feature
- Get peer approval
- Ensure business test script is available
- Deploy to QA
- Test and approve story by business
So wouldn’t it be nice to tie the technical ability to move a feature to the next environment, and ultimately release it, to the completion of your DoD/quality gates?
Validation rules & history tracking to save the day
Before we jump into any config work, let’s get an overview of what we need and what elements (fields, automations) Copado offers already:
| DoD Element | Required | Available | To Configure |
| --- | --- | --- | --- |
| Document your feature | Field to indicate document location | Yes, but not in the way we would like it. | Create a hyperlink field. Validate that the field is not empty. |
| Get peer approval | Field to indicate approval | Yes, but we don’t jump into pull requests. Maybe later. Start small now. | Create a checkbox. Validate that the field is checked. |
| Ensure business test script is available | Indicator for an approved test script | Yes. | Create a checkbox on the User Story to indicate approved scripts. Update the field on an approved test script. Validate that the field is checked. |
| Deploy to QA | Indicator for the current org of a feature | Automated by Copado | Nothing to do. |
| Test approved by business | Indicator for approved tests | Yes, but the automation needs to be configured. | Create a checkbox on the User Story to indicate a successful test. Update the field upon a successful test execution. Validate that the field is checked. |
To sum up, in order to implement our quality gates we need:
- 4 fields
- 4 validation rules
- 2 processes
And to make sure we can track who changed the checkboxes, we will enable Feed Tracking on all of them, so that a nice DoD history shows up in the Chatter feed on the User Story. We also need to modify the layout to make sure the fields are displayed.
Create the required fields
Starting with the fields, we create the following custom fields on the Copado User Story object:
- PeerReviewPassed__c, Checkbox
- TestScriptReady__c, Checkbox
- TestScriptPassed__c, Checkbox
- DocumentationLocation__c, URL
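If you prefer to deploy these fields via the Metadata API rather than clicking through Setup, the definition of the URL field could look like the sketch below. The file path and the assumption that the fields live on Copado's `copado__User_Story__c` object should be verified in your org:

```xml
<!-- objects/copado__User_Story__c/fields/DocumentationLocation__c.field-meta.xml -->
<!-- Sketch of a Metadata API field definition; object and path are assumptions -->
<CustomField xmlns="http://soap.sforce.com/2006/04/metadata">
    <fullName>DocumentationLocation__c</fullName>
    <label>Documentation Location</label>
    <type>Url</type>
</CustomField>
```

The three checkboxes follow the same pattern with `<type>Checkbox</type>` and a `<defaultValue>false</defaultValue>` element.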
Ensure accountability through history tracking
Next, we will enable those fields in Feed Tracking (Setup → Customize → Chatter → Feed Tracking → User Story). Enable field tracking for the object if required, and select the fields we just created. Add more fields if you consider them worth tracking; Status is always a good one.
Enforce process adherence with validation rules
Next, we need to tackle the validation rules.
As explained in the first post, a user story is deployed to the next environment when the “Promote & Deploy” checkbox is checked. It can also be selected for a manual promotion if you check the “Promote Change” checkbox. So our validation rules should fire with the following logic:
- If the story is still in the Dev environment, fire when “Promote & Deploy” or “Promote Change” is checked and:
  - Documentation Location is empty, or
  - Peer Review Passed is unchecked, or
  - Test Script Ready is unchecked
- If the story is in the UAT environment, fire when:
  - “Promote & Deploy” or “Promote Change” is checked, and
  - Test Script Passed is unchecked
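As a sketch, the Dev-stage documentation check could be written as the following validation rule formula on the User Story. The Copado API names used here (`copado__Promote_and_Deploy__c`, `copado__Promote_Change__c`, `copado__Environment__r`) and the environment name "Dev" are assumptions you should verify in your own org:

```
AND(
  /* fires only when the story is flagged for promotion */
  OR( copado__Promote_and_Deploy__c , copado__Promote_Change__c ),
  /* only while the story still sits in the Dev environment */
  copado__Environment__r.Name = "Dev",
  /* block if no documentation link has been provided */
  ISBLANK( DocumentationLocation__c )
)
```

The rules for Peer Review Passed, Test Script Ready, and the UAT-stage Test Script Passed follow the same pattern, swapping the `ISBLANK` check for e.g. `NOT( PeerReviewPassed__c )` and adjusting the environment name.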
Modify the layout to make it easy for users to follow
We added a lot of fields, so let’s modify the layout to make things easier for the end user. Copado already ships some fields we will not use here, such as Documentation Complete, because they do not quite fit our case (also, it’s about showing what can be done, right?).
However, there is a field called “Apex Tests Passed”. It is a checkbox that is set automatically if you hit the “Run Apex Tests” button on the story and the classes in your User Story have sufficient coverage. There is also already a section called “Definition of Done”. We will simply remove the unwanted fields and add our new ones.
Looks nice, and has a certain logic. Let’s check the validation rule:
Perfect.
We are done here, so the only thing left is to set up the process builder automations and go for an after work drink with colleagues and/or friends.
Reduce redundant clicks with process builder
The key here is to align with the Copado approach a little to get our automations started. Our favorite release management tool assumes that a test script might not necessarily be written by a single developer, but also by a larger team, where a test script review process can be part of test script creation. (Yes, those testers again…)
To indicate that a test script is ready, its status needs to be set to “Complete”. And that is all we need to know in order to create a process builder:
- Object: Test Script (copado__Test_Script__c)
- Event: On record creation or update
- Conditions: Status equals Complete
As an immediate action, we want to update a record related to the one that triggered the process. The object we want to update is the User Story, and the field we want to set is Test Script Ready, which should be checked (= TRUE).
What we also could do is to set the status of the story accordingly (e.g. “Ready for Testing”), but that is not required for now.
To automate the “Test Script Passed” checkbox, we will create a different, but similar, process on the Test Run object.
- Object: Test Run (copado__Test_Run__c)
- Event: On record creation or update
- Conditions: Status equals “Passed” or “Passed with Comments”
The immediate action in this case updates the Test Script Passed field on the related story. So it is more or less the same as the process created before; only the target changes, to “Test Script Passed” = TRUE.
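If your team prefers code over Process Builder, the same automation could be sketched as an Apex trigger. This is a hypothetical sketch, not Copado's delivered logic: the API names (`copado__Test_Run__c`, `copado__Status__c`, `copado__User_Story__c`) and the picklist values are assumptions to verify in your org before use:

```apex
// Hypothetical alternative to the Process Builder described above.
// Checks the Test Script Passed DoD flag when a related test run passes.
trigger SetTestScriptPassed on copado__Test_Run__c (after insert, after update) {
    // Collect the parent stories of all test runs that passed
    Set<Id> storyIds = new Set<Id>();
    for (copado__Test_Run__c run : Trigger.new) {
        Boolean passed = run.copado__Status__c == 'Passed'
            || run.copado__Status__c == 'Passed with Comments';
        if (passed && run.copado__User_Story__c != null) {
            storyIds.add(run.copado__User_Story__c);
        }
    }
    // Check the checkbox on each related User Story in one bulk update
    List<copado__User_Story__c> updates = new List<copado__User_Story__c>();
    for (Id storyId : storyIds) {
        updates.add(new copado__User_Story__c(Id = storyId, TestScriptPassed__c = true));
    }
    update updates;
}
```

The first process (Test Script → Test Script Ready) would follow the same bulk-safe pattern on `copado__Test_Script__c`, filtering on a status of “Complete”.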
Done. Well, you should test it, of course 🙂
Try that with Jenkins. Or Bamboo. Or TeamCity.
Gosh, I love this tool.
That’s all nice, but what else could you do?
Although this is a rather simple scenario, it will already help any user follow the process, even if they did not study it intensively beforehand, simply because the tool is set up to support what we outlined with boxes and arrows and provides information on how to proceed.
We can now easily check the DoD status of stories with list views or reports to find gaps. If someone asks “Who approved this?”, we just need to check the Chatter posts generated by Salesforce. And if you start discussing User Stories in the Chatter feed, your team will be best buddies with any internal project audit initiative.
Going a step further, validation rules can help cover more complex scenarios, e.g. allowing a story to be validated against, but not deployed to, a specific org.
And if your project has issues with code quality, a code analysis tool can be an enabler for your code review. Copado has built-in support for an open-source framework to analyze Apex (PMD), and if your team uses CodeScan or Checkmarx for analyzing code, I have been told that CodeScan support will be available in the next version (Copado v12) and Checkmarx integration is on the roadmap.
For those who want a more structured technical review process, Copado can be set up to mirror pull requests as records linked to the User Story. With a checkbox, a process builder, and a validation rule, you would then be able to prevent deploying stories whose pull request has not been approved yet.
I am aware that following a release process sometimes is annoying for developers, but there are good reasons for certain checks and why to perform them, even though the change seems small.
But thanks to tools like Copado, following the required steps is a breeze, and by automating repetitive field updates and notifications you can ensure frictionless releases.