For the #MadeInItaly series, where I want to showcase amazing artisanal Italian products from our incredible Italian Ohana, today’s guest post is delivered by Paolo Carrara: software developer, tech enthusiast, and Scrum Master. After a Master’s degree in Software Engineering, he approached the Salesforce ecosystem early in his career at a consulting firm, and since then he’s been learning, coding, and thinking about new ways to improve his and his team’s work. He’s a huge fan of the Agile movement and an active collaborator in the Italian agile community. You can visit his page (watch out for a hidden easter egg) here.
Many times things don’t go as planned; we certainly don’t wish for it, but we must be prepared for such occasions, and when problems happen, there’s just one key factor: time.
That’s why I developed a tool that helps me get to the core of a problem quicker than whatever I was doing before.
And that tool is SFDX Lens, a VS Code Extension available for free on the marketplace.
The typical scenario is this: you are happily developing the next new business logic in Salesforce when suddenly you get three different issues from three different users in your ticketing system, all marked as high priority. Ugh. Once you’ve read the ticket and understood the problem, you have to switch from your code to your QA environment, go to Setup > Debug Logs > New > fill in all the required fields… oh no! You have just created the tenth trace flag for the same user who was in debug last week. Anyway, now you’re ready and can (if possible) replicate the issue, switch back to VS Code, and inspect the log.
And you have to do it two more times.
Or, you can leverage SFDX Lens’s command SFDX Lens: Debug user from Org. With this command, the extension asks you to pick just three options:
An Org from the ones configured in your VS Code
An active user to debug
An amount of time: 15, 30 or 60 minutes
That’s it.
The extension will take care of setting a trace flag for that user with the maximum precision allowed (which is, more or less, FINEST on everything), a process that usually takes 1-2 seconds, and then you’re ready to replicate the issue and get the log in VS Code.
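Under the hood this boils down to creating a DebugLevel and a TraceFlag through the Tooling API. For reference, a rough manual equivalent with the sfdx CLI looks like this (a sketch, not the extension’s actual code; the field values are assumptions based on the Tooling API objects):

sfdx force:data:record:create -t -s DebugLevel -v "DeveloperName=SFDXLens MasterLabel=SFDXLens ApexCode=FINEST ApexProfiling=FINEST Database=FINEST System=FINEST"
sfdx force:data:record:create -t -s TraceFlag -v "TracedEntityId=<user id> DebugLevelId=<debug level id> LogType=USER_DEBUG ExpirationDate=<now + 30 minutes>"

Two commands instead of a whole Setup round trip, which is exactly the tedium the extension removes.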
>Can you ask your user to replicate the issue themselves?
Even better: “let me just activate the extension”, and two seconds later, “go ahead”.
>Are you already connected to the QA environment?
Even better, you can skip point 1 with the command SFDX Lens: Debug user, which creates a trace flag in the environment you’re connected to.
>Don’t want to search for the command in VSCode?
There’s a neat button for the command SFDX Lens: Debug user just in the activity bar below your code.
You don’t have to worry about trace flag pollution anymore: the extension ensures there will be just one trace flag per user (which is the minimum needed to debug them).
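If you’re curious, you can verify this yourself by listing the trace flags in the org via the Tooling API (an optional check, assuming you have the sfdx CLI connected to that org):

sfdx force:data:soql:query -t -q "SELECT Id, TracedEntityId, ExpirationDate FROM TraceFlag"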
Here’s a demo:
Right, now you have the log in VS Code and it’s a monolith of 7,500 rows, so… what exactly is happening here?
To address this question I often ask myself, I’ve developed a new command, currently in beta: SFDX Lens: Log Analysis (Beta).
This command helps split the log into its components, each displayed visually and proportionally to its duration, so you can focus on a single execution event at a time.
This is also particularly useful for performance tuning: you can now see, for example, how many times a trigger fires per execution.
Ivano Guerini is a Salesforce Senior Developer at Webresults, part of Engineering Group, since 2015. He started his career on Salesforce during his university studies and based his final thesis on it. He’s passionate about technology and development; in his spare time he enjoys developing applications, mainly on Node.js.
In this article, I’m going to walk you through the steps to set up CI with Salesforce DX. For this, I decided to take advantage of Bitbucket and its integrated tool Bitbucket Pipelines. This choice was not made after a comparison of the various version control systems and CI tools; it was driven by business needs that led us to fully embrace cloud solutions, and in particular the Atlassian suite, of which Bitbucket is part.
What is Continuous Integration?
In software engineering, continuous integration (often abbreviated to CI) is a practice applied in contexts where software development takes place through a versioning system. It consists of frequently aligning the developers’ work environments with a shared environment.

In particular, it is generally assumed that automated tests are in place that developers can execute immediately before releasing their contributions to the shared environment, so as to ensure that the changes do not introduce errors into the existing software.
Let’s apply this concept to our Salesforce development process using sfdx.
First of all, we have a production org where we want to deploy and maintain the application; then, typically, we have one or more sandboxes, for example for UAT, integration testing, and development.
With sfdx, we also have the concept of the scratch org: disposable and preconfigured organizations where we, as developers, can deploy and test our work before pushing it into the deployment process.
In the image below you can see an approach to CI with Salesforce DX. Once a developer has finished a feature, they can push it to the main developer branch; from there the CI takes place, creating a scratch org to run automated tests, such as Apex unit tests or even Selenium-like test automations. If there are no errors, the developer can create a pull request, moving forward in the deployment process.
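As a sketch, the scratch-org part of that flow maps to a handful of CLI commands like the following (assuming the default config/project-scratch-def.json generated with the project; the alias and timeout are just examples):

sfdx force:org:create -f config/project-scratch-def.json -a ci-scratch -s
sfdx force:source:push -u ci-scratch
sfdx force:apex:test:run -u ci-scratch --resultformat human --wait 10
sfdx force:org:delete -u ci-scratch -p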
In this article, I’ll show you how to set up all the required tools and, as an example, we will only set up an auto-deploy to our Salesforce org on every git push operation.
Toolbox
Let’s start with a brief description of the tools we’re going to use:
Git – a version control system for tracking changes in files and coordinating work on those files across the team. All metadata items, whether modified on the server or locally, are tracked via Git. This provides us with a version history as well as traceability.
Bitbucket – a cloud-based Git server from Atlassian used for hosting our repository. It provides a UI to navigate the Git repository and has many additional features, like pull requests, which are used for approving and merging changes.
Docker – provides a way to run applications securely, packaged with all their dependencies and libraries. We will be using it to create an environment for running sfdx commands.
Bitbucket Pipelines – an add-on for Bitbucket Cloud that will allow us to kick off deployments and validations when updates are made to the branches in Bitbucket.
If you have always worked in Salesforce, it’s quite possible that Docker containers sound alien to you. So what is Docker? In simple terms, Docker can be thought of as a virtual machine in the cloud: it provides an environment in the cloud where applications can run. Bitbucket Pipelines supports Docker images for running Continuous Integration scripts. So, instead of installing sfdx on your local system, you specify it to be installed in your Docker image, so that your CI scripts can run.
Create a developer Org and enable the DevHub
We’ve made a brief introduction to what CI is and the tools we’re going to use; now it’s time to get to the heart of it and start configuring those tools, beginning with our Salesforce org.

We are going to enable the Dev Hub so we can work with sfdx, and we are going to set up a connected app that allows us to handle the login process inside our Docker container.

For this article, I created a dedicated Developer Org in order to have a clean environment; this way, we get a fresh environment on which to perform all the tests we want.

Let’s enable the Dev Hub right away: Setup → Development → Dev Hub, then click the Enable Dev Hub toggle. Once enabled it can’t be disabled, but this is a requirement for working with SFDX.
Now you can install the sfdx CLI tool on your computer.
Create a connected app
Now that we have our new org and the sfdx CLI installed, we can run sfdx commands that make it easy to manage the entire application development life cycle from the command line, including creating scripts that facilitate automation.

However, our CI will run in a separate environment over which we have no direct control, for example over the login process. So we need a way to manage the authorization process inside the Docker container when the CI automation job runs.

To do this we’ll use the OAuth JSON Web Token (JWT) bearer flow supported by the Salesforce CLI. This OAuth flow lets you authenticate with the CLI without having to log in interactively, which makes this headless flow perfect for automated builds and scripting.
Create a Self-Signed SSL Certificate and Private Key
For a CI solution to work, you’ll generate a private key for signing the JWT bearer token payload, and you’ll create a connected app in the Dev Hub org that contains a certificate generated from that private key.

To create an SSL certificate you need a private key and a certificate signing request. You can generate these files using the OpenSSL CLI with a few simple commands.
If you use a Unix-based system, you can install the OpenSSL CLI from the official OpenSSL website.
If you use Windows instead, you can download an installer from Shining Light Productions, although there are plenty of alternatives.
We will follow some specific commands to create a certificate for our needs; if you want to better understand how OpenSSL works, you can find a handy guide in this article.
Create a folder on your PC to store the generated files: mkdir certificates
Generate a passphrase-protected private key (pick a password and remember it): openssl genrsa -des3 -passout pass:<password> -out server.pass.key 2048
Create a key file from the server.pass.key file using the same password from before: openssl rsa -passin pass:<password> -in server.pass.key -out server.key
Delete the server.pass.key: rm server.pass.key
Request the certificate; when prompted for the challenge password, press Enter to skip the step: openssl req -new -key server.key -out server.csr
Generate the self-signed certificate: openssl x509 -req -sha256 -days 365 -in server.csr -signkey server.key -out server.crt
Now create the connected app that will use this certificate: from Setup, enter App Manager in the Quick Find box, select App Manager, then click New Connected App. Give it a name (e.g. sfdx ci), select Enable OAuth Settings, set the callback URL to http://localhost:1717/OauthRedirect, and check Use digital signatures. To upload your server.crt file, click Choose File.
For OAuth scopes, add:
Access and manage your data (api)
Perform requests on your behalf at any time (refresh_token, offline_access)
Provide access to your data via the Web (web)
Click Save.
Edit Policies to avoid authorization step
After you’ve saved your connected app, edit the policies to enable the connected app to circumvent the manual login process.
Click Manage.
Click Edit Policies.
In the OAuth policies section, for Permitted Users select Admin approved users are pre-authorized, then click OK.
Click Save.
Create a Permission Set
Lastly, create a permission set and assign pre-authorized users to this connected app.
From Setup, enter Permission in the Quick Find box, then select Permission Sets.
Click New.
For the Label, enter: sfdx ci
Click Save.
Click sfdx ci | Manage Assignments | Add Assignments.
Select the checkbox next to your Dev Hub username, then click Assign | Done.
Go back to your connected app.
From Setup, enter App Manager in the Quick Find box, then select App Manager.
Next to sfdx ci, click the list item drop-down arrow, then select Manage.
In the Permission Sets section, click Manage Permission Sets.
Select the checkbox next to sfdx ci, then click Save.
Test the JWT Auth Flow
Open your Dev Hub org.
If you already authorized the Dev Hub, open it: sfdx force:org:open -u DevHub
If you haven’t yet logged in to your Dev Hub org: sfdx force:auth:web:login -d -a DevHub
Adding the -d flag sets this org as the default Dev Hub. To set an alias for the org, use the -a flag with an argument.
To test the JWT auth flow you’ll use some of the information that we asked you to save previously. We’ll use the consumer key that was generated when you created the connected app (CONSUMER_KEY), the absolute path to the location where you generated your OpenSSL server.key file (JWT_KEY_FILE) and the username for the Dev Hub (HUB_USERNAME).
On the command line, create these three session-based environment variables:
export CONSUMER_KEY=<connected app consumer key>
export JWT_KEY_FILE=../certificates/server.key
export HUB_USERNAME=<your Dev Hub username>
These environment variables facilitate running the JWT auth command.
Enter the following command as-is on a single line: sfdx force:auth:jwt:grant --clientid ${CONSUMER_KEY} --username ${HUB_USERNAME} --jwtkeyfile ${JWT_KEY_FILE} --setdefaultdevhubusername
This command logs in to the Dev Hub using only the consumer key (client ID), the username, and the JWT key file. Best of all, it doesn’t require you to interactively log in, which is important when you want your scripts to run automatically.
Congratulations, you’ve created your connected app and you can log in with it using the SFDX CLI.
Set up your development environment
In this section we will configure our local environment, creating a remote repository in Bitbucket and linking it to our local sfdx project folder.

If you are already familiar with these steps you can skip ahead to the next section.
If you don’t have a Bitbucket account yet, sign up at bitbucket.org: just insert your email and follow the registration procedure.
Once logged in you will be able to create a new Git repository from the plus button in the right-hand menu.
You will be prompted with a window like the following; just insert a name for the repository (in my case I’ll name it sfdx-ci), leaving Git selected as the version control system.
We’re in, but our repo is totally empty. Bitbucket provides some quick commands to initialize our repo; select the clone command:

Open the command line tool, then paste and execute the git clone command. This will create a folder named like the Bitbucket repository, already linked to it as a remote.
Initialize SFDX project
Without moving from our position, execute the sfdx project create command: sfdx force:project:create -n sfdx-ci
Use the -n parameter with the same name as the folder we just cloned from Git.
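For reference, the generated sfdx-project.json will look roughly like this (the sourceApiVersion depends on your CLI release, so treat the values as an example):

{
  "packageDirectories": [
    {
      "path": "force-app",
      "default": true
    }
  ],
  "namespace": "",
  "sfdcLoginUrl": "https://login.salesforce.com",
  "sourceApiVersion": "45.0"
}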
Try deploy commands
Before we move on to configuring our CI operations, let’s try the deploy commands in our local environment, using the sfdx project we just created.

The general sfdx deployment flow into a sandbox or production org is:
Convert from source format to Metadata API format: sfdx force:source:convert -d <target directory>
Use the Metadata API to deploy: sfdx force:mdapi:deploy -d <same directory as step 1> -u <username or alias>
These commands are the same ones we are going to use inside our Bitbucket Pipelines; you can try them in your local environment to see how they work.
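For example, using the DevHub alias we authorized earlier (the directory name is arbitrary, and -w makes the deploy command wait up to 10 minutes for the result):

sfdx force:source:convert -d mdapi_output
sfdx force:mdapi:deploy -d mdapi_output -u DevHub -w 10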
Set up Continuous Integration
In the previous sections, we talked mostly about common Salesforce project procedures. Next, we go deeper into the CI world, starting with a brief introduction to Docker and Bitbucket Pipelines.
Lastly, we’ll see how to create a Docker image with SFDX CLI installed and how to use it in our pipeline to run sfdx deploy commands.
Docker
Wikipedia defines Docker as:

an open-source project that automates the deployment of software applications inside containers by providing an additional layer of abstraction and automation of OS-level virtualization on Linux.
In simpler words, Docker is a tool that allows developers, sys-admins, etc. to easily deploy their applications in sandboxes (called containers) that run on the host operating system, i.e. Linux. The key benefit of Docker is that it allows users to package an application with all of its dependencies into a standardized unit for software development.
Docker Terminology
Before we go further, let me clarify some terminology that is used frequently in the Docker ecosystem.
Images – The blueprints of our application which form the basis of containers.
Containers – Containers offer a logical packaging mechanism in which applications can be abstracted from the environment in which they actually run.
Docker Daemon – The background service running on the host that manages building, running and distributing Docker containers. The daemon is the process running in the operating system to which clients talk.
Docker Client – The command line tool that allows the user to interact with the daemon.
Docker Hub – A registry of Docker images. You can think of the registry as a directory of all available Docker images.
Dockerfile – A Dockerfile is a simple text file that contains a list of commands that the Docker client calls while creating an image. It’s a simple way to automate the image creation process. The best part is that the commands you write in a Dockerfile are almost identical to their equivalent Linux commands.
Build our personal Docker Image with SFDX CLI installed
Most Dockerfiles start from a parent image, that is, an image your image is based on: it is what the FROM directive in the Dockerfile refers to, and each subsequent declaration in the Dockerfile modifies it. If you need to completely control the contents of your image, you can instead create a base image from scratch. We will start from a parent image, in our case a Node image.
Create a folder on your machine, create a file named Dockerfile in it, and paste the following code:
FROM node:lts-alpine
RUN apk add --update --no-cache git openssh ca-certificates openssl curl
RUN npm install sfdx-cli --global
RUN sfdx --version
USER node
Let’s explain what this code means, in order:
We use an official Node parent image, which comes with Node.js and NPM preinstalled; the tag must be an Alpine-based one (here lts-alpine), because the next line uses apk, Alpine’s package manager;
Next, with the apk add command we install some additional utility tools, mainly git and openssl, to handle the sfdx login using certificates;
Then, using npm, we install the SFDX CLI;
Just a check of the installed version;
And finally, the USER instruction sets the user name to use when running the image.
Now we have to build our image and publish it to Docker Hub, so it’s ready to use in our Pipelines.
Log in to Docker Hub with your credentials: docker login --username=yourhubusername --password=yourpassword
Build your Docker image (note the trailing dot, which sets the build context to the current folder): docker build -t <your_username>/sfdxci .
Test your Docker image locally: docker run -it <your_username>/sfdxci sfdx --version
Push your Docker image to your Docker Hub repository: docker push <your_username>/sfdxci
Pushing a Docker image to Docker Hub makes it available for use in Bitbucket Pipelines.
Bitbucket Pipelines
Now that we have a working Docker image with sfdx installed, we can continue by configuring the pipeline, which is the core of our CI procedure.
Bitbucket Pipelines is an integrated CI/CD service built into Bitbucket. It allows you to automatically build, test and even deploy your code, based on a configuration file in your repository. Essentially, it creates containers in the cloud for you.

Inside these containers, you can run commands (like you might on a local machine) but with all the advantages of a fresh system, custom configured for your needs.
To set up Pipelines you need to create and configure the bitbucket-pipelines.yml file in the root directory of your repository. If you are working with branches, this file must be present in the root directory of each branch for it to be executed.
A bitbucket-pipelines.yml file looks like the following:
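At its most basic it can be a minimal sketch like this one:

image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        script:
          - echo "Hello, Pipelines!"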
There is a lot you can configure in the bitbucket-pipelines.yml file, but at its most basic the required keywords are:
image – the Docker image that will be used to create the Docker container. You can use the default image (atlassian/default-image:latest), but a personal image is preferred, to avoid wasting time installing the required tools (e.g. the SFDX CLI) on every run. To specify an image, use image: <your_dockerHub_account/repository_details>:<tag>
pipelines – contains all your pipeline definitions.
default – contains the steps that run on every push, unless they match one of the other sections.
branches – specify the name of a branch on which to run the defined steps, or use a glob pattern (to learn more about glob patterns, refer to the Bitbucket official guide).
step – each step starts a new Docker container with a clone of your repository, then runs the contents of your script section.
script – a list of cli commands that are executed in sequence.
Besides default and branches there are more trigger keywords that identify which steps must run, such as pull-requests, but I’ll leave those to the official documentation; we are going to use only these two.
Keep in mind that each step in your pipeline runs in a separate Docker container, and that the script runs the commands you provide in that environment, with the repository folder available.
Configure SFDX deployment Pipelines
Before configuring our pipeline, let’s review for a moment the steps needed to deploy to a production org using the sfdx CLI.
First of all we need to log in to our Salesforce org. To do this, we created a Salesforce connected app that allows us to log in without any manual operation, simply using the following command:
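The command is along these lines (the key path matches the keys folder described below):

sfdx force:auth:jwt:grant --clientid $CONSUMER_KEY --username $SFDC_PROD_USER --jwtkeyfile keys/server.key --instanceurl $SFDC_PROD_URL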
As you can see, there are three parameters that we have to set in this command line:
CONSUMER_KEY
SFDC_PROD_USER
SFDC_PROD_URL
Bitbucket offers a way to store variables that can be used in our pipelines, so we can avoid hard-coded values. Under the Bitbucket repository Settings → Pipelines → Repository Variables, create the three variables and fill them in with the data at your disposal.
Another parameter required by this command is the server.key file; in this case I simply added it to my repository under the keys folder. That’s not good practice, and I will move it to a more secure place, but for this demonstration it’s enough.
Now that you are logged in, you need only two sfdx commands to deploy your metadata: one to convert your project to Metadata API format, and one to deploy it to the Salesforce org:
sfdx force:source:convert -d mdapi
sfdx force:mdapi:deploy -d mdapi -u <SFDC_PROD_USER>
As with the login command, we are going to use a pipeline variable for the target org username passed to the -u parameter.
OK, now that we know how to deploy an SFDX project, we can put all of this into our pipeline.

Move to the root of your sfdx project, create the bitbucket-pipelines.yml file, and paste the following code (replace the image name with your own Docker image):
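Putting together the login and deploy commands from above, the file can look like this sketch (again, swap in your own image name):

image: <your_username>/sfdxci:latest

pipelines:
  default:
    - step:
        script:
          - sfdx force:auth:jwt:grant --clientid $CONSUMER_KEY --username $SFDC_PROD_USER --jwtkeyfile keys/server.key --instanceurl $SFDC_PROD_URL
          - sfdx force:source:convert -d mdapi
          - sfdx force:mdapi:deploy -d mdapi -u $SFDC_PROD_USER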
Commit and push these changes to the git repository.
Test the CI
OK, we have our CI up and running; let’s do a quick test.
In your project create a new apex class and put some code in it. Then commit and push your changes.
git add .
git commit -am "Test CI"
git push
As we said, the pipeline will run on every push to the remote repository; you can check the running status under the Pipelines menu. You will see something like this:
As you know, the mdapi:deploy command is asynchronous, so to check whether there were errors during the deploy you have to run the mdapi:deploy:report command, specifying the job ID (sfdx force:mdapi:deploy:report -i <jobId> -u <username>), or, if you prefer, you can check the deploy directly in the Salesforce org under the Deployment Status section.
Conclusions
With this article I wanted to give you the knowledge you need to start configuring CI using Bitbucket Pipelines.

Obviously, what I showed you is not enough for a CI process ready for an enterprise project; there is still a lot to do.

Here are some starting points to improve what we have seen:
Store the server.key in a safe place so that it is not directly accessible from your repository (see the sketch after this list).
Manage the CI in the various sandbox environments used.
For the developer branch, consider automating the creation of a scratch org and running Apex unit tests.
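One possible approach for the first point, as a minimal sketch: store the key in a secured repository variable (here a hypothetical SERVER_KEY_B64 holding the base64-encoded contents of server.key) and recreate the file at the start of the pipeline script:

# Recreate keys/server.key from a secured pipeline variable
mkdir -p keys
echo $SERVER_KEY_B64 | base64 -d > keys/server.key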
Let’s talk about a great new addition to the Salesforce dev world from the Spring ’19 platform release, the Lightning Web Components framework, with our guest blogger Priscilla Sharon, Salesforce Business Solution Executive for DemandBlue.
DemandBlue is in the business of helping its customers maximize their Salesforce investment through predictable outcomes. As we thrive in an era of cloud-based Infrastructure, Platform and Software services, DemandBlue has pioneered “Service-as-a-Service” through a value-based On Demand Service model that drives bottom-line results. They foster innovation through “Continuous Engagement and On Demand Execution” that offers their customers Speed, Value and Success to achieve their current and future business objectives.
Salesforce launched Lightning Web Components as part of the Spring ’19 pre-release to enable a quicker and easier way to program applications on the Salesforce Lightning platform. It embraces modern JavaScript innovations such as web components, custom elements, shadow DOM and more. Lightning Web Components is the Salesforce implementation of a lightweight framework built to web standards. It provides specialized Salesforce services on top of the core stack, such as Base Lightning Components, Lightning Data Service, User Interface API, etc.

Read on to discover how Lightning Web Components fuses the web components programming model with Salesforce metadata and services to deliver unparalleled performance and productivity.
With Lightning Web Components, we are giving developers a standards-driven JavaScript model for building enterprise apps on Lightning. Every time we release a new platform capability we see an acceleration of innovation in our 150,000 customer base, and we are excited to see what our community of developers will do with Lightning Web Components.
Mike Rosenbaum, EVP of Product, Salesforce
Why Lightning Web Components
Lightning Web Components is like a newer version of Lightning Components with additional features.
Knowledge Domain – Developers who know Web Components will find Salesforce Lightning Web Components familiar out of the box. Aura is proprietary, so the better you know the web standards, the more of your skills can be reused outside Salesforce.
Better Execution – Lightning Web Components leverages built-in browser security features from the Web Components standards, which reduces the amount of custom code, so components run faster and are more consistent in how they ensure security. Moreover, events have a limited scope, so less processing is required to handle them.
New Security Features – It gives better CSS isolation, DOM isolation, script isolation and limited event scope that facilitate a more consistent component design.
ES6+ – There is better support for ES6 and ES7 than in Aura, which enables you to do more with less code. Code is also transpiled to work in IE 11 and other browsers that were not supported earlier.
More Consistent Data Binding – The not-so-user-friendly two-way data binding has been eliminated. This pushes developers to coordinate the way data moves between components, and it means that data binding works as expected, without the unforeseen problems of Aura.
Mixins – You can import accessible methods from other components, and import specific Apex methods from multiple classes. Moreover, Apex methods can be cached for improved performance (see the sketch after this list).
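As a rough illustration of importing a cached Apex method into a component (a minimal sketch with hypothetical names: ContactController.getContacts stands for any @AuraEnabled(cacheable=true) Apex method, not something from this article):

// contactList.js
import { LightningElement, wire } from 'lwc';
// Import a specific Apex method; cacheable=true lets the platform cache its result
import getContacts from '@salesforce/apex/ContactController.getContacts';

export default class ContactList extends LightningElement {
    // @wire provisions the (possibly cached) Apex result into this.contacts
    @wire(getContacts) contacts;
}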
What Lightning Web Components means for Developers and Customers
Cutting-Edge Advantages of Lightning Web Components
Boosted Performance – developing Lightning Web Components does not involve complex abstractions to run on the browser, providing better performance to end users.
Ease of Use – after development, admins can deploy Lightning Web Components to applications with clicks, not code.
Standardized – Salesforce Lightning Web Components is built on ES6+ that provides developers with modern and advanced JavaScript features.
How to create a Lightning Web Component?
LWC (Lightning Web Components) cannot be created directly from the Developer Console. You need to set up Salesforce DX to create a Lightning Web Component. After the SFDX setup, you need to do a few more things:
Get your Salesforce DX plugin updated with the latest release (Spring’19). Run the command below in your terminal or command prompt.
Command:
sfdx update
Once you finish this process, follow the Trailhead link to set up the basic project and create a basic Lightning Web Component.
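Once the project is set up, you can also scaffold a component from the CLI (the component name here is just an example):

sfdx force:lightning:component:create --type lwc -n myFirstComponent -d force-app/main/default/lwc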
Transition from Aura Components to Lightning Web Components
Developers using the Aura framework to build Lightning components can continue to work with it, as Aura components will keep functioning as before. New components, however, can be created using either Aura or the Lightning Web Components framework. For future development, it is best to use Lightning Web Components.
Lightning Web Components Availability
Lightning Web Components has been available since February 2019 in the Enterprise, Unlimited, Performance and Developer editions.