
[Salesforce / Lightning Connect] Lightning Connect Custom Adapters and MongoDB (DML support) pt.2

Do you remember the Lightning Connect Custom Adapters and MongoDB post?

Things have changed since that post, and the platform finally supports all DML statements: insert, update and delete!

To make the magic work, first follow the original post's instructions to set up your Heroku app.

The core changes add support for the new DML methods.

Open the MongoDBDataSourceProvider.cls class:

override global List<DataSource.Capability> getCapabilities() {
 List<DataSource.Capability> capabilities = new List<DataSource.Capability>();
 capabilities.add(DataSource.Capability.ROW_QUERY);
 capabilities.add(DataSource.Capability.SEARCH);
 //new capabilities added
 capabilities.add(DataSource.Capability.ROW_CREATE);
 capabilities.add(DataSource.Capability.ROW_UPDATE);
 capabilities.add(DataSource.Capability.ROW_DELETE);

 return capabilities;
}

The provider now declares three additional capabilities: ROW_CREATE, ROW_UPDATE and ROW_DELETE.

Let’s open the MongoDBDataSourceConnection.cls class and look at the two new methods:

global override List<DataSource.UpsertResult> upsertRows(DataSource.UpsertContext context) {
    List<DataSource.UpsertResult> results = new List<DataSource.UpsertResult>();
    List<Map<String, Object>> rows = context.rows;
    Http h = new Http();
    
    for(Integer i = 0; i < rows.size(); i++){
        Map<String,Object> row = rows[i];
        HttpRequest request = new HttpRequest();
        request.setHeader('Content-Type','application/json');
        request.setTimeout(60000);
        Map<String,Object> invoice = new Map<String,Object>();
        if(String.isBlank((String)row.get('ExternalId'))){
            // no external ID: this is a new record, create it on MongoDB with a POST
            request.setMethod('POST');
            request.setEndpoint(DB_ENDPOINT_NC);
        }else{
            // existing record: update it with a PUT on its MongoDB _id
            request.setMethod('PUT');
            request.setEndpoint(DB_ENDPOINT_NC+'/'+row.get('ExternalId'));
        }
        
        invoice.put('accountid', row.get('Account'));
        invoice.put('contractid', row.get('Contract'));
        invoice.put('created', row.get('CreatedDate'));
        invoice.put('amount', row.get('Amount'));
        invoice.put('description', row.get('Description'));
        
        request.setBody(JSON.serialize(invoice));
        
        HttpResponse response = h.send(request);
        
        List<Object> mList = (List<Object>)JSON.deserializeUntyped(response.getBody());
        Map<String, Object> m = (Map<String, Object>)mList[0];
        if (response.getStatusCode() == 200){
            String objId = String.valueOf(m.get('_id'));
            if(String.isBlank(objId)){
                objId = String.valueOf(row.get('ExternalId'));
            }
            results.add(DataSource.UpsertResult.success(objId));
        } 
        else {
            results.add(DataSource.UpsertResult.failure(
                String.valueOf(row.get('ExternalId')), 'The callout resulted in an error: ' + response.getStatusCode()+' - '+response.getBody()));
        }
    }
    return results;
}

global override List<DataSource.DeleteResult> deleteRows(DataSource.DeleteContext context) {
    List<DataSource.DeleteResult> results = new List<DataSource.DeleteResult>();
    Http h = new Http();
    
    for (String externalId : context.externalIds){
        HttpRequest request = new HttpRequest();
        request.setHeader('Content-Type','application/json');
        request.setTimeout(60000);

        request.setMethod('DELETE');
        request.setEndpoint(DB_ENDPOINT_NC+'/'+externalId);
        
        HttpResponse response = h.send(request);
        if (response.getStatusCode() == 200
            || response.getStatusCode() == 201){
            results.add(DataSource.DeleteResult.success(String.valueOf(externalId)));
        } 
        else {
            results.add(DataSource.DeleteResult.failure(
                String.valueOf(externalId), 'The callout resulted in an error: ' + response.getStatusCode()+' - '+response.getBody()));
        }
    }
    return results;
}

WARNING: this code is not optimized for bulk upsert/delete because it makes a callout for every record.

It’s a proof of concept, so I challenge you to bulkify the class!

How can you insert records into an external object provided by a Lightning Connect adapter?

The Database class has been enhanced with new methods:

  • deleteAsync
  • insertAsync
  • updateAsync

These methods are used to make the calls to the remote system and actually do the work!

Database.insertAsync(new List<MongoDB_Invoice__x>{
    new MongoDB_Invoice__x(Amount__c=1, Description__c ='Async Test 1'),
    new MongoDB_Invoice__x(Amount__c=2, Description__c ='Async Test 2')
});
Database.deleteAsync([Select Id From MongoDB_Invoice__x Where Description__c = 'Async Test 1']);

Every method has an alternative version that accepts a callback class, which allows you to perform further actions after the records are upserted/deleted.

For instance, Database.updateAsync has an optional second parameter of type Database.AsyncSaveCallback that you can implement to run some logic after a specific record is processed (the callback is invoked every time a record is updated).
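
As a minimal sketch (InvoiceSaveCallback and invoicesToUpdate are hypothetical names, and the sketch assumes the standard Database.AsyncSaveCallback contract with its processSave method), such a callback could look like this:

public class InvoiceSaveCallback extends Database.AsyncSaveCallback {
    public override void processSave(Database.SaveResult result) {
        // invoked once per record, when the remote system has processed that record
        System.debug('Record ' + result.getId() + ' saved: ' + result.isSuccess());
    }
}

// usage: pass the callback as the optional second argument
Database.updateAsync(invoicesToUpdate, new InvoiceSaveCallback());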

Every async DML method returns a List of Database.SaveResult or Database.DeleteResult that contains a link to the asynchronous operation: retrieve it by calling Database.getAsyncLocator(result), then pass the value to Database.getAsyncSaveResult(asyncLocator) or Database.getAsyncDeleteResult(asyncLocator) to get the status of the asynchronous operation.
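
As a minimal sketch of how these pieces could fit together (the final debug line simply dumps the returned Database.AsyncSaveResult, which is an assumption about how you might inspect it):

// start an asynchronous insert against the external object
List<Database.SaveResult> results = Database.insertAsync(new List<MongoDB_Invoice__x>{
    new MongoDB_Invoice__x(Amount__c = 3, Description__c = 'Async Test 3')
});

// the locator identifies the pending remote operation for this record
String locator = Database.getAsyncLocator(results[0]);

// later (even from another transaction) check the status of the remote write
Database.AsyncSaveResult asyncResult = Database.getAsyncSaveResult(locator);
System.debug(asyncResult);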

[Salesforce / Lightning Connect] Lightning Connect Custom Adapters and MongoDB

This is a repost of the article I published on Salesforce Developer Blog.

The Summer ’15 Platform Release brought us the awesome Lightning Connect Custom Adapters: you can connect to external data sources and use them as custom Salesforce objects by writing custom Apex code.

After I read Introducing Lightning Connect Custom Adapters by Lawrence McAlpin, I wanted to get my hands dirty and try to implement my own version of the adapter, thinking of a “pseudo-real” use case.

If you are a TL;DR person, here is the code:

  1. Salesforce classes
  2. heroku-crest MongoDB NodeJS proxy

First of all, the DataSource namespace documentation has been published here: this way you can understand all the objects, interfaces and methods we’ll be using in this example.

In my use case we have:

  • A MongoDB server which stores several invoices
  • A NodeJS server, hosted on Heroku, that proxies requests between Salesforce and MongoDB
  • A Custom Adapter implemented in Salesforce

The Data Model

An external system writes invoice records to the MongoDB server, storing data in the invoices table according to the following model (you can simulate this by making a POST request to the Heroku REST service described below):

{
    "_id": "55abc13dad230b03001d7edf",
    "contractid": "C20150001",
    "accountid": "ACC20150001",
    "created": "2013-10-25T20:15:45.851Z",
    "amount": 1000,
    "description": "invoice #1"
}

Where:

  • contractid: a custom external ID for the Contract object
  • accountid: a custom external ID for the Account object
  • _id: the MongoDB ID
  • created: the created date (date/time field)
  • amount: the amount of the invoice (number)
  • description: a generic description

MongoDB REST interface

The next step is to give MongoDB a REST interface.
I’ve modified an old project by Ricard Fredin, adding support for:

  • Basic authentication
  • Environment variables for the MongoDB host name and port
  • The $oid, $date and $regex BSON types
  • The Heroku Button

This is the updated repository for the new Heroku Crest project, with all the details about the changes and a link to the original project.

This project comes with a Heroku Button:


This means you can easily deploy this MongoDB proxy on your personal Heroku account:

You need to set your app name, the host name and the server port.

Let’s populate the DB

The last thing to do is to create at least one database on your MongoDB server with one valid user (with at least read permissions) and one table.

In our example we have a “lighthningconnect” database with an “invoices” table.

To check that your Heroku app works, make a GET request:

GET https://[YOUR_HEROKU_APP_NAME].herokuapp.com/lighthningconnect/invoices
   Headers: 
   Authorization: BASIC [BASE64(username:password)]

You can also use POST/PUT/DELETE methods.

To populate your instance, make a POST request like this:

POST https://[YOUR_HEROKU_APP_NAME].herokuapp.com/lighthningconnect/invoices
   Headers: 
     Authorization: BASIC [BASE64(username:password)]
     Content-Type: application/json

BODY:

{
    "contractid": "C20150001",
    "accountid": "ACC20150001",
    "created": {"$date" : "2014-09-15T20:12:42Z"},
    "amount": 1230,
    "description": "invoice #X"
}

Once the MongoDB REST interface is set up, we create a new Salesforce Named Credential: this way access to the REST server (and thus the MongoDB server) is handled with a simple configuration in the setup interface every time we make a callout to this endpoint:

The Lightning Custom Adapter implementation

The whole code of the Salesforce implementation can be found here.

The implementation is similar to the one in Lawrence McAlpin’s post.

The main modification involves handling the Named Credential and the MongoDB database and table names.

Open the MongoDBDataSourceConnection.cls file and change the first lines:

private static String NC_NAME = 'MongoDB'; //Name of the Named Credential
private static String DB_ENDPOINT_NC = 'callout:'+NC_NAME+'/lighthningconnect/invoices'; //database and table name in your own database

If you want to change the mappings, you can do it easily by modifying the following map:

private static Map<String, String> FIELD_MAP = new Map<String, String>{
    'Id' => '_id',
    'ExternalId' => '_id',
    'DisplayUrl' => '_id',
    'Account' => 'accountid',
    'Contract' => 'contractid',
    'CreatedDate' => 'created',
    'Amount' => 'amount',
    'Description' => 'description'
};

And the sync() method:

override global List<DataSource.Table> sync() {
    List<DataSource.Table> tables = new List<DataSource.Table>();
    List<DataSource.Column> columns = new List<DataSource.Column>();

    // Always declare these two fields.
    columns.add(DataSource.Column.text('ExternalId', 255));
    columns.add(DataSource.Column.url('DisplayUrl'));

    columns.add(DataSource.Column.text('Description', 255));
    columns.add(DataSource.Column.text('Contract', 255));
    columns.add(DataSource.Column.text('Account', 255));
    columns.add(DataSource.Column.number('Amount', 16, 2));
    columns.add(DataSource.Column.get('CreatedDate', 'Created Date', '', true, true, DataSource.DataType.DATETIME_TYPE, 255, 0, null, null));

    tables.add(DataSource.Table.get('MongoDB_Invoice', 'ExternalId', columns));
    return tables;
}

Finally we can analyze how the query is actually performed using the REST proxy we just deployed on Heroku:

private List<Map<String, Object>> execQuery(String soqlQuery) {
    List<Map<String, Object>> rows = new List<Map<String, Object>>();
    HttpRequest request = new HttpRequest();
    request.setEndpoint(DB_ENDPOINT_NC+soqlQuery);
    request.setHeader('Content-Type','application/json');
    request.setMethod('GET');
    request.setTimeout(60000);
    Http h = new Http();
    HttpResponse response = h.send(request);
    if(response.getStatusCode() != 200){
        return new List<Map<String, Object>>();
    }

    List<Object> result = (List<Object>)JSON.deserializeUntyped(response.getBody());
    for(Object obj : result){
        Map<String, Object> jsonObject = (Map<String, Object>)obj;
        Map<String, Object> row = new Map<String, Object>();
        row.put('Amount', jsonObject.get('amount'));
        row.put('Description', jsonObject.get('description'));
        row.put('Contract', jsonObject.get('contractid'));
        row.put('Account', jsonObject.get('accountid'));
        row.put('CreatedDate', parseDateTime(String.valueOf(jsonObject.get('created'))));
        row.put('ExternalId', jsonObject.get('_id'));
        row.put('DisplayUrl', DB_ENDPOINT+'/'+jsonObject.get('_id'));
        rows.add(row);
    }
    return rows;
}

Basically we make a GET HTTP callout to our endpoint (the Named Credential works like a charm to authenticate the user) and parse the result to build the rows used to present the query result.

The getSoqlQuery() method has been modified from Lawrence’s implementation, in order to adhere to the Heroku Crest application requirements (a JSON input query): we support all basic filters and result ordering.
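
As a rough, hypothetical illustration of the idea (the real translation lives in getSoqlQuery() in the linked repository, and the URL parameter name expected by the Heroku Crest proxy is an assumption here), a filter such as Amount > 500 could be turned into a Mongo-style JSON query string along these lines:

// hypothetical sketch: build a Mongo-style filter and append it to the endpoint as a query string
Map<String, Object> filter = new Map<String, Object>{
    'amount' => new Map<String, Object>{ '$gt' => 500 }
};
String soqlQuery = '?query=' + EncodingUtil.urlEncode(JSON.serialize(filter), 'UTF-8');
// execQuery(soqlQuery) then appends this suffix to DB_ENDPOINT_NC and issues the GET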

Configure the Data Source

Once we have uploaded the two  classes, we can configure the External Data Source:

Click on the Validate and Sync button to get the new External Object named MongoDB_Invoice.

Configure the External Object’s lookups

The next step is to link this External Object to actual Account and Contract records referenced by the "contractid" and "accountid" fields on the "invoices" table.

For this reason we will create two new fields on the Account:

and Contract standard objects:

Then we can change the Contract__c and Account__c fields on the MongoDB_Invoice__x external object to be indirect lookups: follow this article, but change both fields' type to Indirect Lookup instead of External Lookup.

This is what you get after changing field types:

This way each MongoDB_Invoice__x record whose "Contract__c" field equals a Contract's MongoDB_Ext_ID__c will be linked to that Contract as if it were a standard lookup field (the same happens for the Account object).

Use the External Object on your app

Next step, create a new custom tab for the MongoDB_Invoice__x object (I’ve configured the “All” list view to show all fields):

You can sort all columns (here the Created Date field):

Click on an External ID value to get all the object's info:

Here is the account:

And its related lists:

And the contract:

With its related lists:

Query External Objects

Finally, here is an example of a complex query:
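
As a hedged sketch (the field API names Amount__c, Description__c and CreatedDate__c are assumptions based on the columns declared in sync() above), a filtered and ordered query against the external object could look like this:

List<MongoDB_Invoice__x> invoices = [
    SELECT ExternalId, DisplayUrl, Amount__c, Description__c, Contract__c, Account__c
    FROM MongoDB_Invoice__x
    WHERE Amount__c > 500 AND Description__c LIKE 'invoice%'
    ORDER BY CreatedDate__c DESC
];
System.debug(invoices);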

You can certainly improve this implementation and add better query handling; speaking as a pure developer, it's been really fun to implement. Can't wait to see DML statements added as well!

Stay tuned for the next platform releases!
