When Salesforce is life!


Warning: Google Chrome’s SameSite Cookie Behaviour Changes – How to face it properly in Salesforce

This post is brought to you by Luca Miglioli, an Information System Analyst who works at WebResults (Engineering group) in the Solution Team, a highly innovative team devoted to Salesforce product evangelization.


Some months ago, Google announced a secure-by-default model for cookies, enabled by a new cookie classification system. The changes concern in particular the SameSite attribute, which controls a cookie’s cross-site behavior: starting with the Chrome 80 release, a cookie with no SameSite attribute is treated as SameSite=Lax by default, whereas prior to Chrome 80 (the current release) the default is SameSite=None.
Ok, but what does it mean?

To safeguard more websites and their users, the new secure-by-default model assumes all cookies should be protected from external access unless otherwise specified. This is important in a cross-site scenario, where websites typically integrate external services for advertising, content recommendations, third-party widgets, social embeds, and so on, and those external services may store cookies in your browser and subsequently access those files.

The cross-site scenario, where an external resource on a web page accesses a cookie that does not match the site domain – courtesy of Google ©

These changes are being made in Chrome, but it’s likely other browsers will follow soon: Mozilla and Microsoft have also indicated their intent to implement similar changes in Firefox and Edge, on their own timelines. While the Chrome changes are still a few months away, it’s important that developers who manage cookies assess their readiness as soon as possible: that’s why Salesforce promptly notified its customers and partners with an announcement (contained in the latest release notes, Spring ’20).
In particular, the announcement explains that:

  • “Cookies don’t work for non-secure (HTTP) browser access. Use HTTPS instead.”

    Check the URL of your website: if it starts with http:// and not https://, you’ll need to get some form of SSL certificate. It’s probably worthwhile checking all of the links to your pages to make sure they are directing to the https:// version of the page. For example, make sure you are using the HTTPS links if you are embedding Pardot forms on your websites: this was not enabled for our organisation by default, so it’s likely that your organisation may need to do this.
  • “Some custom integrations that rely on cookies no longer work in Google Chrome. This change particularly affects but is not limited to custom single sign-on, and integrations using iframes.”

    First-, second- and third-party integrations might be seriously impacted. Salesforce recommends testing any custom Salesforce integrations that rely on cookies owned and set by your integration. For example, an application not working as expected could be Marketing Cloud’s Journey Builder not rendering in the browser, or Cloud Pages/Landing Pages/Microsites returning blank pages. If you determine that your account is affected by the SameSite cookie change, you need to investigate your implementation code to ensure cookies are being utilized appropriately.

Ok, this looks a little bit scary, but don’t worry!

First, developers and admins can already test the new Chrome cookie behavior on the sites or cookies they manage: simply go to chrome://flags in Chrome (type that in the URL bar) and enable the “SameSite by default cookies” and “Cookies without SameSite must be secure” experiments.

Test the changes by enabling these features in the “Experiments” section of Google Chrome

Second, developers can still opt in to the status quo of unrestricted use by explicitly setting SameSite=None; Secure: only cookies with the SameSite=None; Secure setting will be available for external access, provided they are being accessed from secure connections.
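On the Salesforce side, here’s a hedged Apex sketch of this opt-in, assuming the Cookie constructor overload with a SameSite argument that Salesforce introduced alongside these changes (cookie name and value are made up):

// Sketch: explicitly opt a cookie in to cross-site access from a Visualforce controller.
// Assumes the Cookie constructor overload with a SameSite parameter; name/value are illustrative.
Cookie crossSiteCookie = new Cookie(
    'myAppCookie',  // name (illustrative)
    'someValue',    // value (illustrative)
    null,           // path: null uses the default
    -1,             // maxAge: -1 means a session cookie
    true,           // isSecure: required, since SameSite=None demands HTTPS
    'None'          // SameSite: opt in to cross-site access
);
ApexPages.currentPage().setCookies(new Cookie[] { crossSiteCookie });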

Third, if you manage cookies that are only accessed in a same-site context (same-site cookies), no action is required on your part; Chrome will automatically prevent those cookies from being accessed by external entities, even if the SameSite attribute is missing or no value is set.

That’s all! You can still find more detailed info here:

Integrating Salesforce with AWS: A beginner’s guide

Today’s guest post is delivered by Gilad David Maayan, a technology writer who has worked with over 150 technology companies including SAP, Samsung NEXT, NetApp and Imperva, producing technical and thought leadership content that elucidates technical solutions for developers and IT leadership.


Salesforce is a Platform as a Service (PaaS) that provides developers with cloud-based environments and resources for building and deploying cloud-based applications. The platform offers a number of modules, each of which provides different resources.
AWS is a cloud computing vendor that offers a variety of cloud-based compute resources, such as object storage, data lakes, Artificial Intelligence (AI), and development resources. Salesforce and AWS are partners, and offer a number of connectivity options. 
This article provides an overview of Salesforce services and technology, and key integrations you can create to establish a connection between Salesforce and AWS.

Salesforce for Developers: Main Services

Below, you’ll find a brief overview of the main Salesforce offerings for developers.

Lightning Platform

A PaaS module geared for fast app delivery. Lightning’s cloud architecture is based on multitenancy, which means you share cloud resources with other users. Lightning comes with ready-made solutions, automation processes, and API integrations. You can create your own customizations with metadata, custom fields, and Apex code.

Heroku 

A container-based cloud PaaS. It supports languages and frameworks such as Python, Clojure, Node.js, PHP, Java, Ruby, Scala, and Go. The main advantage of Heroku DX (Developer Experience) is the user-friendly dashboard, which provides easy-to-use metrics, and API and automation controls.

You can extend the capabilities of Heroku with services such as Enterprise (24x7x365 support with a 30-minute SLA) and Elements (additional languages and fully managed services). You can also host your Heroku operations in a private cloud, while maintaining smooth integration with Salesforce.

Einstein

Einstein Platform Services provide Artificial Intelligence (AI) resources for Salesforce developers. The purpose of the platform is to make your applications smarter, as the name implies. To that end, Einstein provides the following services:

  • Einstein Vision—a computer vision module that enables image recognition
  • Einstein Language—a natural language processing module 

You can use Einstein Platform Services to train deep learning models in the cloud. As with any Salesforce service, it comes with APIs and Apex for integration and customization. Einstein Analytics enables AI-based data analyses.

Trailhead

A learning center for Salesforce skills, which offers a variety of educational modules. You can read guides, sign up for classes, and earn certifications. There are modules for companies, as well as a huge community-based hub. Everything is available online through the Trailhead website. 

Salesforce Technology

Salesforce uses metadata, APIs, and containers to enable its technology. To enable quick deployment and scaling, Salesforce runs Kubernetes in production on bare-metal throughout their cloud infrastructure. This kind of architecture enables Salesforce to provide fine-grained microservices for Salesforce end-users and developers alike.

You use containers to pack your apps, metadata to describe the structure of your development artifacts, and APIs to enable connectivity between Salesforce services, third-party vendors, and connected devices and technology. These three main technologies make the Salesforce app development process fast and simple. 

Salesforce and AWS Integrations

There are many use cases and integration methods. This section focuses on AWS, but you can find more information about Salesforce integration here.

1. Integrating Salesforce with Amazon S3

Amazon Simple Storage Service (Amazon S3) is an affordable object storage and archiving service offered by AWS. Typically, you would do this integration in order to build a data lake on S3. To move data from Salesforce to S3, you need to use the Amazon S3 connector. You’ll find more information here.

2. Integrating Salesforce with AWS Lambda

AWS Lambda is a serverless compute service for processing events. To do this integration, you need a Salesforce account and an AWS account. Once you set up your accounts, you need to create a connection between Salesforce and the AWS API Gateway. You do this by setting up open authentication (OAuth). Then, you need to configure the data flow, and “tell” Salesforce to subscribe to AWS Lambda events. Here’s a step-by-step guide that shows you how to do this integration.

3. Integrating Salesforce with Amazon Athena

Amazon Athena is a serverless interactive query service for performing S3 data analysis with standard SQL. This type of integration is more complex, because you would need to create more than one integration. First, you need to connect Salesforce with S3, then connect Salesforce with Lambda. Once you transfer data from Salesforce to S3, you’ll be able to query it using Athena. Here’s a guide that shows you how to do this.

4. Integrating Salesforce with AWS PrivateLink

AWS PrivateLink creates secure and private connectivity between AWS services, on-premises applications, and Virtual Private Clouds (VPCs). This is a built-in AWS feature that enables integration between AWS and SaaS offerings from AWS Partner Network (APN) Partners, such as Salesforce. In May, Heroku Postgres via PrivateLink, which enables connectivity between private Heroku Postgres databases and AWS VPCs, was made generally available. This connection is easy and fast. You can learn how to create it here.

Conclusion

Secure and simple integration between your development PaaS and your cloud resources is vital for business continuity. You use these APIs to transfer data, listen for data, and establish connectivity between systems, devices, and networks.

Native integration is a major advantage, because it was created especially for the two connected points. Security is typically covered well in these scenarios, and the connection fits the two vendors (or in-house services) in a way that requires little to no configuration. In other cases, like integration #3 in this article, you would need to set up the connection yourself.

Take the time to assess your situation, and find out what kind of integration you need. Experiment with free tiers, make use of community knowledge bases, and keep security concerns in mind as you create your integrations. Data is valuable, and you don’t want just anyone listening on your connections.

Salesforce Winter ’20 Highlights

Our week’s trailblazer is Claudio Marzorati, who will be listing some of his favorite features of the Winter ’20 Salesforce platform release.

Claudio is a Senior Salesforce Developer @ Sintegra s.r.l. (Milan). He has worked with different retailers, which allowed him to broaden his technical background in several areas.
From analysis, to development, to the direct relationship with the customer, nothing is left to chance.
His other passions are running and traveling!


In this article I summarize the most important features introduced in the new release.

General Update

Access the Recycle Bin in Lightning Experience

You no longer have to switch to Salesforce Classic to access the Recycle Bin. You can now view, restore, and permanently delete the items in your Recycle Bin and the org Recycle Bin. You can access the Recycle Bin by selecting it in the App Launcher under All Items, by personalizing your navigation bar, or by adding the Recycle Bin tab for your org in the Lightning App Builder.

View Records by Topic on the Topic Detail Page

If you enable topics in your organization, you can now view records by topic, as in the image below.

You must enable a permission in your profile to use topics, as below. Previously (before Winter ’20) you had to navigate to Setup – Topics for Object Settings to enable, object by object, which fields are candidates for topic detection.

Save Ink and Paper with Printable View for Lists

Printable view for List View is finally available also in Lightning Experience.

Manage Security Contacts for Your Org

Keeping key members of your org in the know about security incidents is important to us. You can now designate your org’s security contacts in Salesforce Help so that if an information security incident impacts your org, your contacts are notified. From Salesforce Help, click Support & Services, then click Manage next to Customer Security Contacts on the My Profile Settings tile. Here you can add, edit, or delete the email addresses of your security contacts.

Mobile App

The new Mobile App arrives the week of October 14, 2019.
Give Your Users Custom Record Pages on Their Phones

Custom Lightning record pages are no longer restricted to desktop. Now you can create record pages tailored to the needs of your mobile users that they see only when viewing the page on a phone. When you create a record page in the Lightning App Builder, you can select a page template that matches the form factor that you’re designing the page for. Preview what the page looks like on different devices using the form factor switcher. When you activate your page, you can choose which form factors to make the record page available on: phone, desktop, or both, depending on which form factors its template supports.

List and Related List Components Are Optimized for the New Salesforce Mobile App

We updated the List View, Related List – Single, Related Lists, and Related List Quick Links components to support mobile navigation and the new Salesforce mobile app. When you place the List View component on a record page, a View More button loads more records in batches, so you can easily get more records or scroll to the information you want. The Related Lists component groups all your related lists in one section and no longer includes News and Twitter. The Related Lists component also uses a View More button for efficient navigation.

File

Set File Sharing to Inherit Record Settings

When attaching files to records, you can have the files inherit the sharing settings of those records. For instance, when a user can edit a record, you want them to be able to edit the files on that record, too. Now you can set the default sharing permissions on files that are attached to records. The preference lets files follow the sharing settings of the record.

For new orgs this option is enabled by default; otherwise it must be activated from Setup – Salesforce Files – General Settings, as below.

With this option you can inherit the sharing settings of a record. For example, a file attached before you enable this option appears as below

Then it becomes

Remove a File from a Record Without Deleting It Everywhere

Sales Cloud

Contacts: Customize Opportunity Contact Roles for Better Tracking and Reporting

Opportunity contact role customization options give you the flexibility to track and attribute revenue to roles, titles, and individuals. With custom fields and page layouts, validation rules, and Apex triggers, you can design an Opportunity Contact Role to match your specific sales and reporting processes. You can capture new data, such as titles and roles, to help your sales reps be more efficient in targeting the right contacts.

Accounts: Customize Account Teams to Better Support Team Selling

Team selling involves complex account relationships. Now you can manage those relationships better in Salesforce. Collect more information by adding custom fields, buttons, and links to account team layouts. Use validation rules, Apex triggers, Process Builder (NEW), and workflow rules (NEW) with account teams to help keep data clean and minimize manual data entry. You can now report on account teams, too.

View Opportunities Owned by Your Team with One Click in Lightning Experience

The new My team’s opportunities list view is based on role hierarchy. Sales managers can use it to easily see all the opportunities owned by their direct and indirect reports without creating a list view.

Price Book Entries: Track Changes with Field History Tracking and Audit Trails

Price book entries now support field audit trails and field history tracking, so now you can easily track changes to price book entry fields.

Develop

Lightning Web Components: Open Source

To develop off-platform Lightning web components, see https://lwc.dev/

Add Lightning Web Components as Custom Tabs
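To expose a Lightning web component as a custom tab, add the lightning__Tab target to the component’s metadata configuration file (its .js-meta.xml), as in this snippet: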
<?xml version="1.0" encoding="UTF-8"?>
<LightningComponentBundle xmlns="http://soap.sforce.com/2006/04/metadata">
    <targets>
        <target>lightning__Tab</target>
    </targets>
</LightningComponentBundle>
Share CSS Style Rules

To share CSS style rules, create a component that contains only a CSS file. Import the style rules from that CSS file into the CSS files of other Lightning web components.
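For example (component names here are made up), a stylesheet-only component called cssLibrary could be consumed like this:

/* cssLibrary/cssLibrary.css — a component that contains only this CSS file */
.red-alert { color: red; }

/* myComponent/myComponent.css — pull in the shared style rules */
@import 'c/cssLibrary';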

DOM API Changes

Attention! In Winter ’20, code can’t use document or document.body to access the shadow tree of a Lightning web component. For example, code in a test can’t call document.querySelector() to select nodes in a Lightning web component’s shadow tree.

To fix see the guide: https://releasenotes.docs.salesforce.com/en-us/winter20/release-notes/rn_lwc_dom_api.htm
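As a quick sketch of the difference (component and selector names are made up), queries now have to go through the component’s own template:

import { LightningElement } from 'lwc';

export default class MyComponent extends LightningElement {
    renderedCallback() {
        // document.querySelector('div') can no longer reach into this
        // component's shadow tree in Winter '20.
        // Query through the component's own template instead:
        const firstDiv = this.template.querySelector('div');
    }
}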

Aura Components in the ui Namespace Are Being Retired

The ui components are scheduled for retirement in all Salesforce orgs in Summer ’21. Use the similar components in the lightning namespace instead. Retiring our legacy components enables us to focus on components that match the latest web standards in performance, accessibility, user experience, and internationalization. See the list of replacements: https://releasenotes.docs.salesforce.com/en-us/winter20/release-notes/rn_aura_ui_deprecate.htm

Callouts Are Excluded from Long-Running Request Limit

Every org has a limit on the number of concurrent long-running Apex requests. This limit counts all requests that run for more than 5 seconds (total execution time). However, HTTP callout processing time is no longer included when calculating the 5-second limit. We pause the timer for the callout and resume it when the callout completes.
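For example, a slow callout like this hedged sketch (the endpoint is a placeholder) no longer pushes the request over the 5-second threshold by itself:

// The time this callout spends waiting (up to 60 seconds here) is paused,
// so it no longer counts toward the 5-second long-running request clock.
HttpRequest req = new HttpRequest();
req.setEndpoint('https://api.example.com/slow-service'); // placeholder endpoint
req.setMethod('GET');
req.setTimeout(60000); // allow the callout up to 60 seconds
HttpResponse res = new Http().send(req);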

Changed LWC

See full list: https://releasenotes.docs.salesforce.com/en-us/winter20/release-notes/rn_lwc_components.htm

New Apex Classes

Formula Class in the System Namespace

The new System.Formula class contains the recalculateFormulas method that updates (recalculates) all formula fields on the input sObjects.
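A minimal sketch of how this could be used (the object and query are illustrative):

// Recalculate formula fields on in-memory records without a save/requery round trip.
// Account is just an example SObject.
List<Account> accounts = [SELECT Id, Name FROM Account LIMIT 10];
Formula.recalculateFormulas(accounts);
// The formula fields on the queried records are now up to date.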

At the moment that’s all.

See you in Spring ’20!

[Salesforce / HowTo] Manage/Workspaces links missing from Salesforce Community

For this new how-to post, welcome Akashdeep Arora, Salesforce Evangelist/Team Lead at HyTechPro. He started his Salesforce journey in 2015. 4X Salesforce Certified Professional, 4X Trailhead Ranger, 5X Trailhead Academy Certified. Founder of the #BeASalesforceChamp campaign.


It happens many times that developers create Communities in Salesforce and somehow are not able to see the Manage or Workspaces link, like below:

Let’s demystify the mystery without any delay.

Before creating communities, it’s mandatory to check the “Enable Communities” checkbox in Communities Settings (search for it in the Quick Find box).

Afterwards, search for All Communities in Quick Find box and you will see something like this:

Click on New Community Button and you will be redirected to choose the template for creating the community.

Choose the template as per your requirement and give it a name.

Then, you would be able to see your community like this:

Now, here is a twist in the story: as you can see under Action, the Workspaces and Builder links are missing.

You might be thinking: what could be the issue?

Well, I’ll help you out.

The main problem is that your profile is not listed as a member of the community. So, you need to add yourself as a Community Member.

But, if you ain’t a member of a community, you can’t access Community Management to update administration settings.

Now, the question would be: how to add yourself as a Community Member?

You need the Network Id of the community URL that you created and the Profile or Permission Set Id.

STEP 1: From Setup, enter All Communities in the Quick Find box, select All Communities, and then right-click the community URL and select Inspect. The data-networkId provides your NetworkId.

This ID should start with “0DB”.

STEP 2: From Setup, enter Profiles in the Quick Find box, then select Profiles. Click on the profile that you want to add. The ProfileId is the last part of the browser URL after the last “/” character in classic or after the %2F character in Lightning Experience (/lightning/setup/Profiles/page?address=%2F00e58000000n15c): this ID should start with “00e”.
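Alternatively, if you prefer a query over URL inspection, a quick SOQL snippet run in the Developer Console returns the same Id (the profile name here is just an example):

// 'System Administrator' is an example; use the profile you want to add.
Profile p = [SELECT Id FROM Profile WHERE Name = 'System Administrator' LIMIT 1];
System.debug(p.Id); // should start with 00e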

STEP 3: Create a .csv file having two columns for NetworkId and ParentId.

N.B. Parent ID represents the Profile Id.

The CSV file should be formatted like this:

"NetworkId","ParentId"
"0DBXXXXXXXXXXXX","00eXXXXXXXXXXXX"

STEP 4: Open Data Loader and select Network Member Group Object. Make sure that you check the “Show all Salesforce objects” checkbox.

Browse the .csv file that you created earlier and map the fields on the Network Member Group object, start the insert operation, and you are all set!
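If you’d rather skip Data Loader altogether, the same record can be created with a few lines of anonymous Apex — a sketch assuming your user is allowed to create Network Member Group records via the API (the IDs are the placeholders from the CSV above):

// Placeholders: replace with your NetworkId (0DB...) and Profile Id (00e...).
NetworkMemberGroup member = new NetworkMemberGroup(
    NetworkId = '0DBXXXXXXXXXXXX',
    ParentId  = '00eXXXXXXXXXXXX'
);
insert member;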

Now, go to “All Communities” in Salesforce and you will be able to see the required links:

Bravo!
You have successfully added yourself as a member of the Community and now, as a member, you are able to access Community Management using Workspaces link.

It doesn’t matter how slowly you go as long as you don’t stop.

#BeASalesforceChamp

How to create an Apex reusable Data Factory Library using Forceea Templates

Today’s post has been written by Nikos Mitrakis, the creator of Forceea, an amazing Data Factory Framework for Salesforce.

Some facts about Nikos:

– Salesforce Developer at Johnson & Johnson EMEA Development Centre (EDC)
– Started his Salesforce journey in 2014
– Has passed 10 certifications, including Certified Application Architect
– Holds a Physics degree
– Leader of Limerick, Ireland Developer Group
– Married since 1994, has a daughter
– Loves watching sci-fi movies and good comedies
– Lives in Limerick, Ireland


Introduction to a familiar situation

Let’s face it with honesty and courage: writing test methods for our Apex code is (very often) boring and time-consuming, usually because of the difficulty regarding the creation of test data. In most implementations we can see a mediocre result, which could be described as a class (e.g. DataFactory) including static methods that generate and insert SObject records. The following code describes this inefficient pattern:

public class DataFactory {
    public static List<Account> createAccounts(Integer numRecords) {
        List<Account> results = new List<Account>();
        for (Integer counter = 1; counter <= numRecords; counter++) {
            Account record = new Account();
            // define fields (example values)
            record.Name = 'Account-' + counter;
            // ... other required fields
            results.add(record);
        }
        return results;
    }
    // other methods
}

Using this logic, we can execute our Data Factory method with:

List<Account> accounts = DataFactory.createAccounts(100);
insert accounts;

to create and insert 100 accounts. As you can see, one issue here is how we generate data for other fields in our test method. But things get worse when we create related SObjects, for example contacts with accounts. Let’s examine a new method createContacts, based on the previous pattern:

public static List<Contact> createContacts(Integer numRecords) {
    List<Account> accounts = createAccounts(10);
    // optionally process accounts and manually add/modify Account fields
    insert accounts;
    List<Contact> results = new List<Contact>();
    for (Integer counter = 1; counter <= numRecords; counter++) {
        Contact record = new Contact();
        // define fields (example values)
        record.LastName = 'Contact-' + counter;
        // get the ID from the accounts list, spreading contacts across accounts
        record.AccountId = accounts[Math.mod(counter, accounts.size())].Id;
        // ... other required fields
        results.add(record);
    }
    return results;
}

When we call the above method from our test method, e.g. with

List<Contact> contacts = DataFactory.createContacts(100);
//  optionally process contacts and manually add/modify Contact fields
insert contacts;

we certainly insert 10 accounts and 100 contacts related to these accounts. But what if we need to modify the generated accounts, or we need to set additional Account fields? This pattern doesn’t allow us to do this. In more complex scenarios, we may have to insert many more SObjects. The final result is a Data Factory class with methods that create test data BUT without the ability to easily modify the created records.

I can finally hear your question: Do you propose a better approach? Is there a more flexible and easier way to do it? And the answer is YES!

Making a Data Factory Library with Forceea Templates

Forceea Data Factory framework is an open source GitHub project, with the following capabilities:

  • creates records for standard or custom objects, for any standard or custom field
  • automatically defines the required fields
  • creates static or random data for fields of any data type: Integer, Currency, Double, Date, Datetime, Time, Boolean, String, TextArea, Percent, Reference, Email, Phone, URL, Base64 (BLOB), Picklist and MultiPicklist
  • creates real random first and last names
  • creates real random addresses with street, zip code, city, region/state and country
  • creates serial data for date, datetime, integer, decimal, currency and percent
  • can copy data from another field of the same record or a lookup record
  • can create the same random data, using a pseudo-random number generator
  • handles record types and field dependencies (dependent picklists)
  • supports record groups for inserting and deleting records
  • validates the definitions based on the field data type
  • provides many methods to get/insert the created records, add/delete field definitions, get the errors, configure the amount of information returned during run-time (debug log) and more
  • includes an extended error messaging system

and will be our main tool to build a powerful and flexible DataFactory class (our Data Factory Library). This class will include static methods, our Templates, which actually will not insert any data at all! What these Templates do is instruct Forceea how to generate the data.

Let’s meet our first Template:

public class DataFactory {
    // returns definitions for: Accounts 
    public static FObject getDefAccounts() {
        FObject result = new FObject('Account');
        result.setDefinition('Name', 'static value(Company)');
        result.setDefinition('Name', 'serial type(number) from(1) step(1) scale(0)');
        result.setDefinition('Phone', 'random type(phone) format("(30) DDD dD-00-DD")');
        result.setDefinition('Industry', 'random type(picklist)');
        result.setDefinition('Site', 'random type(url)');
        return result;
    }
}

Obviously the method getDefAccounts returns an FObject – the class instance for generating data with Forceea. Reading the code you can see that we define accounts with random values for all required fields. So, these are our guidelines so far:

  • Create a DataFactory class
  • Create a master Template for each SObject, with the name getDef<SObjectApiName>s, e.g. getDefCases
  • Use the above pattern for each master Template, defining all common required fields (the fields required by any record type)
  • For the Template field definitions, use definitions which generate
    – random values for picklist fields
    – random values for fields with date/datetime, checkbox, email, phone, currency, percent, address, and text area data types
    – serial values for the Name field (notice how we did it in the getDefAccounts method)

Even though it’s not obvious from the above code, Forceea (by default) will find and insert field definitions for any required fields we haven’t defined, but it’s a best practice to define all these required fields in our master Template.

The setDefinition method sets the expected values for each field, using a descriptive data generation language called Dadela. For example, the definition random type(picklist) except(Hot) for the Rating field generates random values from the picklist field’s values, excluding the value “Hot”.

Now, for every record type of each SObject create a new Template, for example:

// returns definitions for: Accounts with MediumAccount record type
public static FObject getDefMediumAccounts() {
    FObject result = getDefAccounts();
    result.setDefinition('RecordTypeId', 'static value(MediumAccount)');
    result.setDefinition('NumberOfEmployees', 'random type(number) from(10) to(100) scale(-1)');
    result.setDefinition('AnnualRevenue', 'random type(number) from(1000000) to(10000000) scale(3)');
    return result;
}

This new Template builds on the master Template getDefAccounts, defining only the record type and the additional fields which are related to this specific record type (NumberOfEmployees and AnnualRevenue). All other defined fields from the master Template are used as they are, so we don’t duplicate any field definitions. Our additional guideline:

  • Create a record type Template for each SObject’s record type, with the name getDef<RecordTypeDescription><SObjectApiName>s, e.g. getDefServiceCases

This is it! This is what we need to do for SObjects which don’t include a Lookup or Master-detail relationship. But how do we create Templates for those SObjects that do include a relationship? Let’s see how, with our second SObject:

// returns definitions for: Accounts - Contacts
public static Map<String, FObject> getDefContactsAndAccounts() {
    // initialize
    Map<String, FObject> fobjectsByName = new Map<String, FObject>();
    String objName = '';
    // Account
    objName = 'Account';
    FObject objAccount = getDefMediumAccounts(); 
    fobjectsByName.put(objName, objAccount);
    // Contact
    objName = 'Contact';
    FObject objContact = new FObject(objName);
    objContact.setDefinition('FirstName', 'random type(firstname) group(name)');
    objContact.setDefinition('LastName', 'random type(lastname) group(name)');
    objContact.setDefinition('AccountId', 'random lookup(Account) source(forceea)');
    objContact.setDefinition('LeadSource', 'random type(picklist)');
    objContact.setDefinition('Title', 'random type(list) value(Developer, CFO, Account Manager, CEO, Logistics Manager)');
    objContact.setDefinition('Email', 'random type(email)');
    fobjectsByName.put(objName, objContact);

    return fobjectsByName;
 }

I think you’ll agree that this Template is more interesting. Its first lines use the previously created record type Template getDefMediumAccounts to define the Account fields. We could also

  • insert one or more new field definitions using objAccount.setDefinition('FieldApiName', '<Field Definition>') before fobjectsByName.put(..) or
  • modify an existing field definition – for example to define a static value for the (existing) NumberOfEmployees field, we can use
    // delete previous field definitions
    objAccount.deleteFieldDefinitions('NumberOfEmployees');
    // add new definition
    objAccount.setDefinition('NumberOfEmployees', 'static value(100)');

Finally, we insert the FObject for the Account into the fobjectsByName map and we proceed to the field definitions for contacts.

If you noticed the definition objContact.setDefinition('AccountId', 'random lookup(Account) source(forceea)') and asked yourself what source(forceea) is, this is a way to request that the framework get the related account IDs from the previously inserted accounts. There are a lot of lookup field definitions that will certainly need your attention if you start developing with the framework, but for the moment let’s not go any deeper.

In many implementations we have a set of dependent SObjects. Let’s say that in order to create records for the Case, we have to create records for Account and Contact (and perhaps records for 4 more other SObjects) using a Template like getDefCasesAndAccountsContacts. This is a quite complex data factory process, which can be handled by Forceea very smoothly – you just add the following pattern for each required SObject:

objName = '<SObjectApiName>';
FObject objMySObject = new FObject(objName);
objMySObject.setDefinition('<FieldApiName>', '<Field Definition>');
// other required field definitions
fobjectsByName.put(objName, objMySObject);

Our last guidelines:

  • Document the SObjects that are returned by any Template, with the correct order, e.g. // returns definitions for: Accounts - Contacts - Cases
  • Use the format getDef<SObjectName>And<RelatedSObjects> for Templates with related SObjects, e.g. getDefCasesAndAccountsContacts

Finally, insert the following method in your DataFactory class:

public static void insertRecords(Map<String, FObject> fobjects) {
    for (FObject obj : fobjects.values()) {
        obj.insertRecords(true);
    }
}

The test method

After you have created your Templates, let’s see how you can take full advantage of them. We’ll use getDefContactsAndAccounts as an example. In your test method, the first step is to define a map:

Map<String, FObject> fobjectsByName = DataFactory.getDefContactsAndAccounts();

The second step is to modify any SObject definitions, if it’s needed. For our example here, we’ll make things a little more difficult with the following requirements:

  • For the Account: we need to insert 10 records, with a) random values for the Description field and b) any picklist value except “Hot” for the Rating field.
  • For the Contact: we need to insert 100 records, with the random values from 1/1/1960 to 31/12/2000 for the Birthdate field.
  • All other fields will get the default definitions from their Templates.

// initialize
Map<String, FObject> fobjectsByName = DataFactory.getDefContactsAndAccounts();
FObject objAccount = fobjectsByName.get('Account');
FObject objContact = fobjectsByName.get('Contact');
// define number of records
objAccount.records = 10;
objContact.records = 100;
// optionally modify an existing definition
objAccount.deleteFieldDefinitions('Rating');
//  optionally define new field definitions
objAccount.setDefinition('Description', 'random type(text) minlength(10) maxlength(40)');
objAccount.setDefinition('Rating', 'random type(picklist) except(Hot)');
objContact.setDefinition('Birthdate', 'random type(date) from(1960-1-1) to(2000-12-31)');
// insert records
DataFactory.insertRecords(fobjectsByName);

Using the above pattern, it’s easy for everyone to understand what changes have been made in comparison to the getDefContactsAndAccounts Template.

Did you say that we need the inserted contacts for our System.assert? No problem at all! Just use:

List<Contact> contacts = objContact.getInsertedRecords();
OR
List<Contact> contacts = FObject.getInsertedRecords('Contact');

Conclusion

Forceea Templates are easy to implement and they are powerful enough to help you write your Apex test methods faster. Most importantly:

  • your test methods will be more understandable by any other developer, and
  • the new test methods will require less effort to develop

The best way to see if this solution is suitable for you is to start working with it and create a pilot version of your new DataFactory class. If you’re not satisfied with your existing Data Factory (or if you don’t have a Data Factory at all), why don’t you give it a try?

7 Salesforce Developer hacks you didn’t know about

I’ve recently published a post on Mason Frank’s blog, where I wrote about some Salesforce Developer hacks. Here’s a quick summary, and a link to the full article below. I hope you enjoy!


I’m lazy. Most developers are! This is not necessarily a bad thing, and Bill Gates summarizes this concept by saying “I choose a lazy person to do a hard job. Because a lazy person will find an easy way to do it”.

Don’t misunderstand this statement—laziness is not staying on your couch the whole day and watching all of Game of Thrones in one sitting. It’s a different kind of being lazy.

The lazy developer is the one who, in order to avoid doing anything more than once, tries to automate it, or knows exactly where that line of code is stored (they may actually not be able to write it themselves and thus have to Google for it).

That’s exactly what I saw in my own 12 years of work experience: the best developer is not the one who knows exactly which Apex function has which parameters (if you can, well… congratulations!), but the one who quickly knows how to solve a problem and where to look to maximize productivity (and happiness for the project manager or the customer).

Keep reading on Mason Frank blog…

3 undeniable reasons to love Salesforce Blockchain

Let’s start talking about Salesforce Blockchain, with our guest blogger Priscilla Sharon, Salesforce Business Solution Executive for DemandBlue.

DemandBlue is in the business of helping its customers maximize their Salesforce investment through predictable outcomes. As we thrive in an era of cloud-based Infrastructure, Platform and Software services, DemandBlue has pioneered “Service-as-a-Service” through a value-based On Demand Service model that drives bottom-line results. They foster innovation through “Continuous Engagement and On Demand Execution” that offers their customers Speed, Value and Success to achieve their current and future business objectives.


Demystifying Salesforce Blockchain

Salesforce has always been a company that looks ahead to the next big thing in technology, whether it is mobile, social, IoT or Artificial Intelligence. The world’s leading CRM company recently took the wraps off Salesforce Blockchain, its new low-code platform enabling organizations to share verified and distributed data sets across a trusted network of partners and third parties. With the new launch announced at TrailheaDX, its fourth annual developer conference, Salesforce is bringing the combined capabilities of the world’s #1 CRM and a low-code blockchain platform to enable organizations to create blockchain networks, workflows and apps that have the potential to deliver holistic customer experiences.

Leading global brands are already taking advantage of this new platform. Arizona State University uses Salesforce Blockchain to design and create an educational network that allows universities to securely verify and share information. IQVIA, a global leader in advanced analytics, technology solutions and contract research services, has partnered with Salesforce to explore an array of possible blockchain technology initiatives, including regulatory information management and drug label processing. S&P Global Ratings, a key provider of ratings, benchmarks, analytics and data for the global capital and commodity markets, is leveraging the platform to review and approve new business bank accounts with improved agility. And more leading brands are exploring the infinite possibilities of delivering seamless customer experiences using the Salesforce Blockchain platform.

Take a deeper dive into the intricacies and the nuances of the brand-new Salesforce Blockchain platform that packs unique and incredible features, and promises to be the first low-code Blockchain platform for the CRM.

Salesforce Blockchain – The Technology

Salesforce Blockchain is built on Hyperledger Sawtooth, an open source modular platform for designing, deploying, and running distributed ledgers, and it’s customized for Salesforce Lightning, Salesforce’s front-end framework for app development. It consists of three components:

  1. Salesforce Blockchain Builder – a developer toolset for building blockchain applications
  2. Blockchain Connect – integrates blockchain actions with Salesforce apps
  3. Blockchain Engage – enables customers to invite parties to blockchain apps created within Salesforce

Businesses can take advantage of the platform to build and manage blockchain networks, apps and smart contracts using Salesforce’s powerful low-code capabilities. Customers can even create and share a blockchain object using the same process as they already do for any CRM data object in Salesforce without the need for writing code.

We help companies build for the future by making breakthrough technology accessible and easy to use — today we are doing just that with Salesforce Blockchain. Now, companies will be able to create new ecosystems and achieve new levels of interconnectivity through trusted partner networks

Bret Taylor, President and Chief Product Officer of Salesforce at TrailheaDX

3 undeniable Reasons to Love it

Salesforce Blockchain is deeply customized for Salesforce Lightning and is uniquely designed to lower the barrier for creating trusted partner networks. It enables companies to seamlessly bring together authenticated, distributed data and CRM processes. With Salesforce Blockchain integration you can:

  • Build Blockchain Networks with Clicks, not Code—the Salesforce platform is well received by developers for its unique low-code capabilities. With the newly launched Salesforce Blockchain platform, you can easily build and maintain blockchain networks, apps and smart contracts, using just clicks, not code
  • Create Actionable Blockchain Data—Make blockchain data actionable through native integration with Salesforce. Layer complex blockchain data along with existing sales, service and marketing workflows like search queries and process automation. Also, you can now run Einstein-powered artificial intelligence algorithms that integrate blockchain data into sales forecasts, predictions and much more
  • Lower Barrier to Entry for Partners—Engage partners, distributors and intermediaries easily to leverage Salesforce Blockchain. Even more, companies can now pull in APIs, pre-built apps and integrate any existing blockchains with Salesforce. With an intuitive engagement layer, organizations can also easily interact with and add third parties to their blockchain with a few clicks and simple authentication thereby creating trust networks

Salesforce Blockchain is currently available to select design partners and will be generally available in 2020.

Setting up SFDX Continuous Integration using Bitbucket Pipelines with Docker image

Ivano Guerini is a Salesforce Senior Developer at Webresults, part of Engineering Group, since 2015.
He started his career on Salesforce during his university studies and based his final thesis on it.
He’s passionate about technology and development; in his spare time he enjoys developing applications, mainly on Node.js.


In this article, I’m going to walk you through the steps to set up CI with Salesforce DX.

For this, I decided to take advantage of Bitbucket and its integrated tool, Bitbucket Pipelines.

This choice was not made after a comparison between the various version control systems and CI tools, but was driven by some business needs for which we decided to fully embrace cloud solutions, in particular the Atlassian suite, of which Bitbucket is part.

What is Continuous Integration?

In software engineering, continuous integration (often abbreviated to CI) is a practice applied in contexts where software development takes place through a versioning system. It consists of frequently aligning the developers’ work environments with the shared environment.

In particular, it is generally assumed that automatic tests have been prepared that developers can execute immediately before releasing their contributions to the shared environment, so as to ensure that the changes do not introduce errors into the existing software.

Let’s apply this concept to our Salesforce development process using sfdx.

First of all, we have a production org where we want to deploy and maintain the application; then typically we have one or more sandboxes, such as for UAT, integration testing, and development.

With sfdx, we also have the concept of scratch orgs: disposable and preconfigured organizations where we, as developers, can deploy and test our work before pushing it into the deployment process.

In the image below you can see an approach to CI with Salesforce DX. Once a developer has finished a feature, he can push it to the main Developer branch; from this, the CI takes place, creating a scratch org to run automated tests, such as Apex unit tests or even Selenium-like test automation. If there are no errors, the dev can create a pull request, moving forward in the deployment process.

In this article, I’ll show you how to set up all the required tools and as an example, we will only set up an auto-deploy to our Salesforce org on every git push operation.

Toolbox

Let’s start with a brief description of the tools we’re going to use:

  • Git – is a version control system for tracking changes in files and coordinating work on those files across the team. All metadata items, whether modified on the server or locally, are tracked via GIT. This provides us with a version history as well as traceability.
  • Bitbucket – is a cloud-based GIT server from Atlassian used for hosting our repository. It provides a UI to navigate the GIT repository and has many additional features like pull requests. These are used for approving and merging changes.
  • Docker – provides a way to run applications securely, packaged with all its dependencies and libraries. So, we will be using it to create an environment for running sfdx commands.
  • Bitbucket Pipelines – is an add-on for Bitbucket cloud that will allow us to kick off deployments and validations when updates are made to the branches in Bitbucket.

If you have always worked in Salesforce, then it’s quite possible that Docker containers sound alien to you. So what is Docker? In simple terms, Docker can be thought of as a lightweight virtual machine. Docker provides an environment in the cloud where applications can run. Bitbucket Pipelines supports Docker images for running the Continuous Integration scripts. So, instead of installing sfdx on your local system, you specify that it be installed in your Docker image, so that our CI scripts can run.

Create a developer Org and enable the DevHub

We’ve made a brief introduction to what CI is and the tools we’re going to use; now it’s time to get to the heart of it and start configuring our tools, starting from our Salesforce org.

We are going to enable the Dev Hub to be able to work with sfdx, and we are going to set up a connected app that allows us to handle the login process inside our Docker container.

For this article, I created a dedicated developer Org in order to have a clean environment.

We can do this simply by filling out the form on the Salesforce site: https://developer.salesforce.com/signup and completing the registration process.

In this way, we will obtain a new environment on which to perform all the tests we want.

Let’s go immediately to enable the Dev Hub: Setup → Development → Dev Hub, then click on the Enable Dev Hub toggle.

Once enabled, it can’t be disabled, but this is a requirement to be able to work with SFDX.

Now you can install the sfdx CLI tool on your computer.

Create a connected app

Now that we have our new org and the sfdx CLI installed, we can run sfdx commands that make it easy for us to manage the entire application development life cycle from the command line, including creating scripts that facilitate automation.

However, our CI will run in a separate environment where we don’t have direct control, such as over the login process. So we will need a way to manage the authorization process inside the Docker container when our CI automation job runs.

To do this we’ll use the OAuth JSON Web Token (JWT) bearer flow that’s supported in the Salesforce CLI. This OAuth flow gives you the ability to authenticate using the CLI without having to interactively log in. This headless flow is perfect for automated builds and scripting.

Create a Self-Signed SSL Certificate and Private Key

For a CI solution to work, you’ll generate a private key for signing the JWT bearer token payload, and you’ll create a connected app in the Dev Hub org that contains a certificate generated from that private key.

To create an SSL certificate you need a private key and a certificate signing request. You can generate these files using OpenSSL CLI with a few simple commands.

If you use a Unix-based system, you can install the OpenSSL CLI from the official OpenSSL website.

If you use Windows instead, you can download an installer from Shining Light Productions, although there are plenty of alternatives.

We will follow some specific commands to create a certificate for our needs; if you want to better understand how OpenSSL works, you can find a handy guide in this article.


  1. Create a folder on your PC to store the generated files
    mkdir certificates
  2. Generate an RSA private key
    openssl genrsa -des3 -passout pass:<password> -out server.pass.key 2048
  3. Create a key file from the server.pass.key file using the same password from before:
    openssl rsa -passin pass:<password> -in server.pass.key -out server.key
  4. Delete the server.pass.key:
    rm server.pass.key
  5. Request and generate the certificate, when prompted for the challenge password press enter to skip the step:
    openssl req -new -key server.key -out server.csr
  6. Generate the SSL certificate:
    openssl x509 -req -sha256 -days 365 -in server.csr -signkey server.key -out server.crt

The self-signed SSL certificate is generated from the server.key private key and server.csr files.

Create the Connected App

The next step is to create a connected app on Salesforce that includes the certificate we just created.

  1. From Setup, enter App Manager in the Quick Find box, then select App Manager.
  2. Click New Connected App.
  3. Enter the connected app name and your email address:
    • Connected App Name: sfdx ci
    • Contact Email: <your email address>
  4. Select Enable OAuth Settings.
  5. Enter the callback URL: http://localhost:1717/OauthRedirect
  6. Select Use digital signatures.
  7. To upload your server.crt file, click Choose File.
  8. For OAuth scopes, add:
    • Access and manage your data (api)
    • Perform requests on your behalf at any time (refresh_token, offline_access)
    • Provide access to your data via the Web (web)
  9. Click Save.

Edit Policies to avoid authorization step

After you’ve saved your connected app, edit the policies to enable the connected app to circumvent the manual login process.

  1. Click Manage.
  2. Click Edit Policies.
  3. In the OAuth policies section, for Permitted Users select Admin approved users are pre-authorized, then click OK.
  4. Click Save.

Create a Permission Set

Lastly, create a permission set and assign pre-authorized users for this connected app.

  1. From Setup, enter Permission in the Quick Find box, then select Permission Sets.
  2. Click New.
  3. For the Label, enter: sfdx ci
  4. Click Save.
  5. Click sfdx ci | Manage Assignments | Add Assignments.
  6. Select the checkbox next to your Dev Hub username, then click Assign | Done.
  7. Go back to your connected app.
    1. From Setup, enter App Manager in the Quick Find box, then select App Manager.
    2. Next to sfdx ci, click the list item drop-down arrow, then select Manage.
    3. In the Permission Sets section, click Manage Permission Sets.
    4. Select the checkbox next to sfdx ci, then click Save.

Test the JWT Auth Flow

Open your Dev Hub org.

  • If you already authorized the Dev Hub, open it:
    sfdx force:org:open -u DevHub
  • If you haven’t yet logged in to your Dev Hub org:
    sfdx force:auth:web:login -d -a DevHub

Adding the -d flag sets this org as the default Dev Hub; to set an alias for the org, use the -a flag with an argument.

To test the JWT auth flow you’ll use some of the information that we asked you to save previously. We’ll use the consumer key that was generated when you created the connected app (CONSUMER_KEY), the absolute path to the location where you generated your OpenSSL server.key file (JWT_KEY_FILE) and the username for the Dev Hub (HUB_USERNAME).

  1. On the command line, create these three session-based environment variables:
    export CONSUMER_KEY=<connected app consumer key>
    export JWT_KEY_FILE=../certificates/server.key
    export HUB_USERNAME=<your Dev Hub username>


    These environment variables facilitate running the JWT auth command.
  2. Enter the following command as-is on a single line:
    sfdx force:auth:jwt:grant --clientid ${CONSUMER_KEY} --username ${HUB_USERNAME} --jwtkeyfile ${JWT_KEY_FILE} --setdefaultdevhubusername

This command logs in to the Dev Hub using only the consumer key (client ID), the username, and the JWT key file. And best of all, it doesn’t require you to interactively log in, which is important when you want your scripts to run automatically.

Congratulations, you’ve created your connected app and you are able to login using it with the SFDX CLI.

Set up your development environment

In this section we will configure our local environment, creating a remote repository in Bitbucket and linking it to our local sfdx project folder.

If you are already familiar with these steps you can skip and pass directly to the next section.

Create a Git Repository on Bitbucket

If you don’t have a Bitbucket account, you can create a free one by registering at the following link: https://bitbucket.org/account/signup/

Just insert your email and follow the first registration procedure.

Once logged in you will be able to create a new git repository from the plus button on the right menu.

You will be presented with a window like the following; just insert a name for the repository. In my case I’ll name it sfdx-ci, leaving Git selected as the Version Control System.

We’re in, but our repo is totally empty. Bitbucket provides some quick commands to initialize our repo. Select the clone command:

git clone https://username@bitbucket.org/username/sfdx-ci.git

Move to your desktop, open the command line tool, then paste and execute the git clone command. This command will create a folder named like the Bitbucket repository, already linked to it as a remote branch.

Initialize SFDX project

Without moving from our position, execute the sfdx create project command:
sfdx force:project:create -n sfdx-ci

Using the -n parameter with the same name as the folder we just cloned from git.

Try deploy commands

Before we move on to configuring our CI operations, let’s try them in our local environment.

We already created our sfdx project in the previous section, so we can use it to test the deployment flow.

The general sfdx deployment flow into a sandbox or production org is:

  1. Convert from source form to metadata api form
    sfdx force:source:convert -d <target directory>
  2. Use the metadata api to deploy
    sfdx force:mdapi:deploy -d <same directory as step 1> -u <username or alias>

These commands are the same ones we are going to use inside our Bitbucket Pipelines; you can try them in your local environment to see how they work.

Set up Continuous Integration

In previous sections, we talked mostly about common Salesforce project procedures. In the next ones, we go deeper into the CI world, starting with a brief introduction to Docker and Bitbucket Pipelines.

Lastly, we’ll see how to create a Docker image with SFDX CLI installed and how to use it in our pipeline to run sfdx deploy commands.

Docker

Wikipedia defines Docker as

an open-source project that automates the deployment of software applications inside containers by providing an additional layer of abstraction and automation of OS-level virtualization on Linux.

In simpler words, Docker is a tool that allows developers, sys-admins, etc. to easily deploy their applications in a sandbox (called containers) to run on the host operating system i.e. Linux. The key benefit of Docker is that it allows users to package an application with all of its dependencies into a standardized unit for software development.

Docker Terminology

Before we go further, let me clarify some terminology that is used frequently in the Docker ecosystem.

  • Images – The blueprints of our application which form the basis of containers.
  • Containers – Containers offer a logical packaging mechanism in which applications can be abstracted from the environment in which they actually run.
  • Docker Daemon – The background service running on the host that manages building, running, and distributing Docker containers. The daemon is the process, running in the operating system, which clients talk to.
  • Docker Client – The command line tool that allows the user to interact with the daemon.
  • Docker Hub – A registry of Docker images. You can think of the registry as a directory of all available Docker images.
  • Dockerfile – A Dockerfile is a simple text file that contains a list of commands that the Docker client calls while creating an image. It’s a simple way to automate the image creation process. The best part is that the commands you write in a Dockerfile are almost identical to their equivalent Linux commands.
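To see these pieces in action, here is a minimal example, assuming Docker is installed and the daemon is running:

# The Docker client sends both commands to the daemon:
# "pull" downloads the hello-world image from the Docker Hub registry,
# "run" asks the daemon to start a container from that image.
docker pull hello-world
docker run --rm hello-world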

Build our personal Docker Image with SFDX CLI installed

Most Dockerfiles start from a parent image. If you need to completely control the contents of your image, you can create a base image instead. A parent image is the image that your image is based on: it is referenced by the FROM directive in the Dockerfile, and each subsequent declaration in the Dockerfile modifies it.

Starting from a parent image rather than a base image will be our case: we will start from an official Node image.

Create a folder on your machine, create a file in it named Dockerfile, and paste the following code:

FROM node:lts-alpine
RUN apk add --update --no-cache git openssh ca-certificates openssl curl
RUN npm install sfdx-cli --global
RUN sfdx --version
USER node

Let’s explain what this code means, in order:

  1. We start from the official Node.js image in its Alpine Linux variant, which comes with Node.js and NPM preinstalled; the Alpine base also provides the apk package manager used in the next step;
  2. Next, with the apk add command we install some additional utility tools, mainly git and openssl, to handle the sfdx login using certificates;
  3. Then, using npm, we install the SFDX CLI globally;
  4. Just a check of the installed version;
  5. And finally, the USER instruction sets the user name to use when running the image.

Now we have to build our image and publish it to Docker Hub, so it’s ready to use in our Pipelines.

  1. Create an account on Docker Hub.
  2. Download and install Docker Desktop. If on Linux, download Docker Engine – Community.
  3. Log in to Docker Hub with your credentials.
    docker login --username=yourhubusername --password=yourpassword
  4. Build your Docker image (the trailing dot is the build context, i.e. the folder containing the Dockerfile):
    docker build -t <your_username>/sfdxci .
  5. Test your Docker image locally, for example by printing the installed CLI version:
    docker run --rm <your_username>/sfdxci sfdx --version
  6. Push your Docker image to your Docker Hub repository:
    docker push <your_username>/sfdxci

Pushing a Docker image to Docker Hub makes it available for use in Bitbucket Pipelines.

Bitbucket Pipelines

Now that we have a working Docker image with sfdx installed, we can continue by configuring the pipeline, which is the core of our CI procedure.

Bitbucket Pipelines is an integrated CI/CD service, built into Bitbucket. It allows you to automatically build, test and even deploy your code, based on a configuration file in your repository. Essentially, it creates containers in the cloud for you.

Inside these containers, you can run commands (like you might on a local machine) but with all the advantages of a fresh system, custom configured for your needs.

To set up Pipelines you need to create and configure the bitbucket-pipelines.yml file in the root directory of your repository. If you are working with branches, this file must be present in the root directory of each branch in order to be executed.

A bitbucket-pipelines.yml file looks like the following:

image: atlassian/default-image:2
pipelines:
  default:
    - step:
        script:
          - echo "Hello world default"
  branches:
    features/*:
      - step:
          script:
            - echo "Hello world feature branch"

There is a lot you can configure in the bitbucket-pipelines.yml file, but at its most basic the required keywords are:

  • image – the Docker image that will be used to create the Docker container. You can use the default image (atlassian/default-image:latest), but a personal one is preferred to avoid wasting time installing the required tools (e.g. the SFDX CLI) on every run. To specify an image, use image: <your_dockerHub_account/repository_details>:<tag>
  • pipelines – contains all your pipeline definitions.
  • default – contains the steps that run on every push, unless they match one of the other sections.
  • branches – specifies the name of a branch on which to run the defined steps; you can also use a glob pattern (to learn more about glob patterns, refer to the official Bitbucket guide).
  • step – each step starts a new Docker container with a clone of your repository, then runs the contents of your script section.
  • script – a list of cli commands that are executed in sequence.

Besides default and branches, there are more keywords that identify which steps must run, such as pull-requests, but I refer you to the official documentation; we are going to use only these two.

Keep in mind that each step in your pipeline runs in a separate Docker container, and its script runs the commands you provide in that environment, with the repository folder available.

Configure SFDX deployment Pipelines

Before configuring our pipeline, let’s review for a moment the steps needed to deploy to a production org using sfdx cli.

First of all we need to log in to our Salesforce org. To do so, we created a Salesforce Connected App that allows us to log in without any manual operation, simply using the following command:

sfdx force:auth:jwt:grant --clientid <CONSUMER_KEY> --username <SFDC_PROD_USER> --jwtkeyfile keys/server.key --setdefaultdevhubusername --setalias sfdx-ci --instanceurl <SFDC_PROD_URL>

As you can see there are three parameters that we have to set in this command line:

  • CONSUMER_KEY
  • SFDC_PROD_USER
  • SFDC_PROD_URL

Bitbucket offers a way to store variables that can be used in our pipelines, avoiding hard-coded values.

Under Bitbucket repository Settings → Pipelines → Repository Variables create three variables and fill them in with the data at your disposal.

Another parameter required by this command is the server.key file; in this case I simply added it to my repository under the keys folder.

This is not a good practice and I will move it to a more secure position, but for this demonstration it’s enough.
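For instance, one possible approach (just a sketch, with SERVER_KEY_B64 as an assumed secured repository variable holding the base64-encoded key) is to recreate the file during the pipeline run instead of committing it:

# Rebuild keys/server.key from a secured pipeline variable
mkdir -p keys
echo "$SERVER_KEY_B64" | base64 -d > keys/server.key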

Now that you are logged in, you need only two sfdx commands to deploy your metadata: one to convert your project to Metadata API format and one to deploy it to the Salesforce org:
sfdx force:source:convert -d mdapi
sfdx force:mdapi:deploy -d mdapi -u <SFDC_PROD_USER>

As with the login command, we are going to use a Pipeline Variable for the target org username in the -u parameter.

OK, now that we know how to deploy an SFDX project, we can put all this into our pipeline.

Move to the root of our sfdx project, create the bitbucket-pipelines.yml file, and paste the following code (replace the image name with your own Docker image):

image: ivanoguerini/sfdx:latest
pipelines:
  default:
    - step:
        script:
          - echo $SFDC_PROD_URL
          - echo $SFDC_PROD_USER
          - sfdx force:auth:jwt:grant --clientid $CONSUMER_KEY --username $SFDC_PROD_USER --jwtkeyfile keys/server.key --setdefaultdevhubusername --setalias sfdx-ci --instanceurl $SFDC_PROD_URL
          - sfdx force:source:convert -d mdapi
          - sfdx force:mdapi:deploy -d mdapi -u $SFDC_PROD_USER

Commit and push these changes to the git repository.

Test the CI

OK, we have our CI up and running; let’s do a quick test.

In your project, create a new Apex class and put some code in it. You can also scaffold it from the CLI, as sketched below.
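A minimal sketch, assuming the default package directory created by force:project:create (the class name is just an example):

# Scaffold an example Apex class in the default package directory
sfdx force:apex:class:create -n CiSmokeTest -d force-app/main/default/classes

Then commit and push your changes: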

git add .
git commit -am "Test CI"
git push

As we said, the pipeline runs on every push to the remote repository; you can check the running status under the Pipelines menu.

As you know, the mdapi:deploy command is asynchronous, so to check whether there were any errors during the deploy you have to run the mdapi:deploy:report command, specifying the jobId; or, if you prefer, you can check the deploy directly in the Salesforce org, under the Deployment Status section.
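For example (a sketch; <jobId> stands for the id printed by the deploy command):

# Check the outcome of the asynchronous deploy
sfdx force:mdapi:deploy:report -u $SFDC_PROD_USER -i <jobId>

Alternatively, you can pass -w <minutes> to force:mdapi:deploy to make it wait for the deployment result.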

Conclusions

With this article I wanted to give you the knowledge necessary to start configuring CI using Bitbucket Pipelines.

Obviously what I showed you is not enough for a CI process usable in an enterprise project; there is still a lot to do.

Here are some starting points to improve what we have seen:

  1. Store the server.key in a safe place so that it is not directly accessible from your repository.
  2. Manage the CI in the various sandbox environments used.
  3. For the developer branch, consider automating the creation of a scratch org and running Apex unit tests (a sketch follows this list).
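A possible sketch of point 3, assuming an authorized Dev Hub and the default scratch org definition file:

# Create a disposable scratch org, push the source, run the tests, clean up
sfdx force:org:create -f config/project-scratch-def.json -a ci-scratch -s
sfdx force:source:push -u ci-scratch
sfdx force:apex:test:run -u ci-scratch -r human -w 10
sfdx force:org:delete -u ci-scratch -p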

But, I leave this to you.

10 signs you’re an amazing Salesforce Developer

I recently joined other Salesforce influencers in contributing to Mason Frank’s ‘Ask The Experts’ series, where I wrote about my ten best tips to become an amazing Salesforce Developer. Here’s a quick summary below and a link to the full article. I hope you enjoy!

10 signs you’re an amazing Salesforce Developer

“Am I the best Salesforce Developer I can be?”

This is a question all Salesforce Developers should be asking themselves. If you said “Yes”, well… you don’t need to read this post as you may be in the “Olympus” of coders.

If your answer is “No”, welcome my friend, keep reading this post. I have some tips for you, based on my experiences, that may lead you to the right trail.

I’ve always felt like I’ve never achieved anything to the top level, and I guess this drove me to overcome my limits and achieve a lot in my personal and professional life.

If you are in the circle of developers who believe they can improve their skills day after day, you are using a mental process that I call “Continuous Self-Improvement” (CSI, isn’t it cool? I guess I’ve not invented anything, but I love giving names to stuff). I even call it the “Jon Snow syndrome”, because your student mentality means you’re a coder who feels like they “know nothing”.

Keep reading on the Mason Frank blog…

Salesforce Summer ’19 Platform Release: quick highlights

This week’s trailblazer is Claudio Marzorati, who will list some of his favorite Summer ’19 Salesforce platform release features.

Claudio is a Senior Salesforce Developer @ Sintegra s.r.l. (Milan). He has worked with different retailers, which allowed him to broaden his technical background in several areas.
From analysis to development to the direct relationship with the customer, nothing is left to chance.
His other passions are running and traveling!


Summer ’19 has finally arrived, and all its changes are going to be applied in our orgs.

Here I summarize some of the most important features that can impact your org.

Lightning URL parameters have been namespaced

Salesforce has finally released the functionality that forces URL parameters to be namespaced.
So if you add ?foo=bar to the URL, it will get auto-stripped.
But if you add ?c__foo=bar to the URL, it will persist.

Keep Record Context When Switching from Salesforce Classic to Lightning Experience

When you switch from Salesforce Classic to Lightning Experience, you land on the same page in Lightning Experience, if it exists. If the same page doesn’t exist in Lightning Experience, you are redirected to your default landing page, which is determined by the org default or your customizations.

Choose from Two Record View Options

Now you have two record page view default options. Choose between the current view—now called Grouped view—and the new Full view. In Setup, enter Record Page Settings in the Quick Find box, and select Record Page Settings.


Full view (1) displays all details and related lists on the same page. Grouped view (2), the original Lightning Experience record view, focuses on specifics by grouping information across tabs and columns.

Search Picklist Fields in List Views

You don’t have to manually pick through your list views to find the picklist values you’re looking for. List view search now includes picklists in your query results. Dependent picklists and picklists with translated values aren’t searchable.

Continuation

Finally we can use the Continuation pattern from an Aura component or a Lightning web component. The Continuation class in Apex is used to make a long-running request to an external web service and process the response in a callback method. An asynchronous callout made with a continuation doesn’t count toward the Apex limit of 10 synchronous requests that last longer than five seconds. Therefore, you can make more long-running callouts and integrate your component with a complex back-end API. In a Lightning web component we can now import @salesforce/apexContinuation, which provides access to an Apex method that uses a Continuation.

Aura Components

There are a lot of improvements, and below I report the ones I use most in my development.

lightning:recordEditForm

density option

Sets the arrangement style of fields and labels in the form. Accepted values are compact, comfy, and auto. The default is auto, which lets the component dynamically set the density according to the user’s Display Density setting and the width of the form.

onerror event handler changed

You can now return the error details when a required field is missing using event.getParam("output").fieldErrors. To display the error message automatically on the form, include lightning:messages immediately before or after the lightning:inputField components.  

lightning:inputField

reset function added

Resets the form fields to their initial values.

There are a few deprecated components in the force and ui namespaces: force:recordEdit, force:recordView, ui:input (all types), ui:button, ui:menu (all types), ui:output (all types), ui:spinner.

Web Components

Salesforce is spending a lot of time and resources improving these new components. A lot of new functionality has been added, and below I report the most significant.

lightning-input

autocomplete attribute added

Controls autofilling of the field. This attribute is supported for email, search, tel, text, and url input types.

date-style (or time-style) attribute added

The display style of the date when type='date' or type='datetime'. Valid values are short, medium, and long. The default value is medium. The format of each style is specific to the locale. This attribute has no effect on mobile devices.

lightning-input-field

reset function added

Resets the form fields to their initial values.

lightning-record-edit-form

density option

Sets the arrangement style of fields and labels in the form. Accepted values are compact, comfy, and auto. The default is auto, which lets the component dynamically set the density according to the user’s Display Density setting and the width of the form.

onerror event handler changed

You can now return the error details when a required field is missing using event.getParam("output").fieldErrors. To display the error message automatically on the form, include lightning-messages immediately before or after the lightning-input-field components.

Minor UX Improvements

They have changed the UX provided. Some examples for recent records and related lists are reported below.

More comprehensive layout for Recent Records
Related Lists have more impact for users

More details can be found at
https://releasenotes.docs.salesforce.com/en-us/summer19/release-notes/salesforce_release_notes.htm.
