Salesforce on its own is a powerful tool for CRM and related areas of enterprise administration. It can manage your customer relationships, customer information, sales pipelines, account renewals, marketing, and contracts. Salesforce can also integrate with your existing systems, such as an enterprise ERP or e-commerce application.
The biggest question users face is how to integrate other applications with Salesforce. The answer depends on your enterprise's business needs and the volume of data involved. When it comes to MuleSoft integration with Salesforce, you are halfway there if you are already familiar with MuleSoft's Anypoint Studio and the Salesforce Connector. With your Salesforce Connector credentials configured and triggers in place, you can proceed with any of the most common approaches to MuleSoft-SFDC integration, discussed below.
Standard Salesforce MuleSoft integration approaches
A Salesforce Object Query Language (SOQL) query is the simplest way to connect Salesforce with any external database, API, or application through MuleSoft. Once you configure Salesforce authentication within the MuleSoft platform, you can place the Salesforce Connector in your flow with the appropriate SOQL query. Running the query returns a list of the requested Salesforce objects.
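Outside of Anypoint Studio, the same SOQL call can be sketched against the Salesforce REST API. This is a minimal sketch, not the Connector's internals: the instance URL, API version, and query below are illustrative assumptions, and a real call would also need an OAuth bearer token.

```python
from urllib.parse import quote

def build_soql_url(instance_url: str, api_version: str, soql: str) -> str:
    """Build the Salesforce REST query URL for a SOQL statement."""
    return f"{instance_url}/services/data/v{api_version}/query?q={quote(soql)}"

# Hypothetical org instance; the query pulls accounts touched today,
# the kind of small daily result set this method suits.
url = build_soql_url(
    "https://example.my.salesforce.com",
    "58.0",
    "SELECT Id, Name FROM Account WHERE LastModifiedDate = TODAY",
)
print(url)
```

The URL-encoding step matters: SOQL contains spaces, commas, and operators that must be percent-encoded before they can travel in a query string.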
This method works best when you expect only a small number of query results and the results do not need to be real-time. Suppose you want to pull the latest data from Salesforce for a daily report and do not expect a large number of results. In that case, a basic Salesforce query is sufficient, and it is also very simple to implement.
Using bulk Salesforce query
Sometimes you have to use bulk and batch queries when inserting or updating data in an external database. While this requires more code than the basic SOQL method, it is still an ideal choice when a large number of records is involved in the updates but real-time responsiveness is not required.
Setting up bulk query integration involves several steps:
- Creating batch flows: Using a process such as PK Chunking (Primary Key Chunking), you can create your workflow. Chunking is far more efficient because your data is broken into smaller, more manageable pieces. At this stage, Salesforce returns a bulk job ID, which you will need for the remaining queries.
- Creating the batch flow query: Define the specific data to be returned from the batch flow using the job ID obtained in the previous step.
- Checking the job status: Confirm whether the job succeeded or failed before doing anything with the data.
- Consuming the results: Once the job is complete, use the returned data as needed.
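The steps above can be sketched in plain Python. This is a simulation under stated assumptions, not the real Bulk API: the job ID and lifecycle are modeled in memory, and chunking is shown as fixed-size splits over sorted keys, which is the idea behind PK Chunking.

```python
def pk_chunk(record_ids, chunk_size):
    """Split an ordered list of primary keys into fixed-size chunks,
    mirroring what PK Chunking does with Salesforce ID ranges."""
    ordered = sorted(record_ids)
    return [ordered[i:i + chunk_size] for i in range(0, len(ordered), chunk_size)]

def run_bulk_job(record_ids, chunk_size):
    """Simulate the bulk-query lifecycle: create a job, split it into
    chunks, check the status, then consume the results."""
    job_id = "750-simulated-job"              # returned on job creation
    chunks = pk_chunk(record_ids, chunk_size)
    status = "JobComplete" if chunks else "Failed"   # check before reading
    per_chunk = [len(c) for c in chunks] if status == "JobComplete" else []
    return job_id, status, per_chunk

job_id, status, per_chunk = run_bulk_job(
    ["001A", "001C", "001B", "001E", "001D"], chunk_size=2)
print(job_id, status, per_chunk)  # 750-simulated-job JobComplete [2, 2, 1]
```

Sorting before splitting is what keeps each chunk a contiguous key range, so every chunk can be queried independently and in parallel.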
This method is ideal when you have large batches of data that need to be written out daily, such as customer data. Be aware of its restrictions: only a limited amount of data is returned per query (1 GB or less), you get only 15 attempts per 10-minute window, and each query has a processing limit of two minutes. Because the API is asynchronous, as an SF DevOps user you may not get results immediately.
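The 15-attempts-per-10-minutes restriction can be guarded client-side before each query is submitted. A minimal sliding-window sketch, with the limit values taken from the restrictions above (the class and method names are illustrative):

```python
import time
from collections import deque

class QueryAttemptLimiter:
    """Sliding-window guard for the bulk-query limit of
    15 attempts per 10-minute window."""

    def __init__(self, max_attempts=15, window_seconds=600):
        self.max_attempts = max_attempts
        self.window = window_seconds
        self.attempts = deque()

    def try_acquire(self, now=None):
        now = time.monotonic() if now is None else now
        # Drop attempts that have fallen outside the window.
        while self.attempts and now - self.attempts[0] >= self.window:
            self.attempts.popleft()
        if len(self.attempts) >= self.max_attempts:
            return False  # caller should back off instead of querying
        self.attempts.append(now)
        return True

limiter = QueryAttemptLimiter()
allowed = [limiter.try_acquire(now=t) for t in range(16)]  # 16 tries in 16 s
print(allowed.count(True))  # only the first 15 succeed
```

Checking the limiter before submitting avoids burning an attempt on a request Salesforce would reject anyway.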
Near real-time notification triggers
Another relatively new feature of the Salesforce Connector is real-time notification triggers. They are built into the Connector and can check for changes on their own, so there is no need to write custom code for push notifications.
These new real-time notification triggers include:
- On New Object: Triggered when a new record is created, such as a new account, lead, or opportunity.
- On Modified Object: Triggered when a record is updated.
- On Deleted Object: Returns records that have been moved to the recycle bin.
When a new lead is added to Salesforce, for example, the triggered notification propagates the same update to an external database or the destination application.
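The three trigger types can be modeled as a simple dispatcher that keeps an external store in step with Salesforce. This is a sketch only; the function name, change-type labels, and payload shape are assumptions for illustration, not the Connector's actual API.

```python
def sync_to_external_db(change_type, record, external_db):
    """Route one Salesforce change notification to an external store,
    mirroring the On New / On Modified / On Deleted triggers."""
    if change_type in ("new", "modified"):
        external_db[record["Id"]] = record     # insert or overwrite
    elif change_type == "deleted":
        external_db.pop(record["Id"], None)    # record went to the recycle bin
    return external_db

db = {}
sync_to_external_db("new", {"Id": "00Q1", "Name": "New Lead"}, db)
sync_to_external_db("modified", {"Id": "00Q1", "Name": "Renamed Lead"}, db)
print(db["00Q1"]["Name"])  # Renamed Lead
sync_to_external_db("deleted", {"Id": "00Q1"}, db)
print(db)  # {}
```

Keying the external store by the Salesforce record `Id` makes "new" and "modified" the same upsert operation, which keeps the dispatcher small.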
Platform Events
Platform Events come the closest to real-time updates among all these methods, and they are also the most involved. Unlike the previous method, the platform does not simply notify you when something happens; it kicks off a MuleSoft flow that can update databases, APIs, messaging systems, file shares, and more in near real time.
The trade-off for the extra involvement is the speed with which actions can be executed and the ability to define events in a custom way. The event-driven architecture also makes this a scalable solution. Let us explore the steps involved:
- In MuleSoft, subscribe to the Salesforce Connector streaming channel. To listen for incoming Platform Events, you can use long polling. On receiving an event, MuleSoft initiates the designated flow.
- Next, MuleSoft performs the desired operation (create, update, delete, etc.) on the target object. Following best practice, the target database, file share, or API returns a success or failure status.
Once the target operation completes, MuleSoft updates Salesforce with the success or failure status. Using the Connector, you can update the appropriate Salesforce object with a success status and timestamp, or with an error message, error ID, and timestamp. The Salesforce record receives the status update and auto-populates the related fields with the corresponding data. Once the event is handled, the MuleSoft stream resumes listening for other events.
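The event-handling and status-write-back loop above can be sketched as follows. This is a simulation under stated assumptions: the subscription channel is modeled as an in-memory list, and the event field names (`recordId`, `payload`) and error ID are hypothetical.

```python
from datetime import datetime, timezone

def handle_platform_event(event, target_store):
    """Apply one Platform Event to a target system and return the
    status payload that would be written back to Salesforce."""
    try:
        target_store[event["recordId"]] = event["payload"]
        return {"status": "success",
                "timestamp": datetime.now(timezone.utc).isoformat()}
    except KeyError as exc:
        return {"status": "failure", "errorId": "MULE-001", "error": str(exc),
                "timestamp": datetime.now(timezone.utc).isoformat()}

def drain_channel(events, target_store):
    """Simulated subscription loop: process each queued event, record its
    status, then resume listening (here, move to the next event)."""
    return [handle_platform_event(e, target_store) for e in events]

store = {}  # stand-in for a shipping database or e-commerce app
statuses = drain_channel(
    [{"recordId": "001X", "payload": {"ShippingStreet": "1 Main St"}}], store)
print(statuses[0]["status"])  # success
```

Returning a status object with a timestamp from every handler is what lets the flow write success or failure back onto the originating Salesforce record.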
This approach is ideal when Salesforce is the system of record for customer data and an important change, such as an address update, must be shared in real time with other systems like a shipping database or an e-commerce app.
The right MuleSoft partner can help you determine which of the above approaches is ideal for your integration and help organize it well. With the help of a consultant, you can settle on the right approach for SFDC integration and implement the strategy and design best suited to your needs.