
Scalable Salesforce Integrations on a Shoestring Budget – Step-by-Step Guide

AWS Lambda

Create the wrapper application

Open the Eclipse IDE (with the AWS Toolkit for Eclipse installed) and create a new Maven project.

To make sure you have all the required dependencies, adjust your pom.xml to match the one below:

<project xmlns="http://maven.apache.org/POM/4.0.0"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>

	<groupId>com.amazonaws.lambda</groupId>
	<artifactId>cardax.salesforce.account</artifactId>
	<version>1.0.0</version>
	<packaging>jar</packaging>

	<build>
		<plugins>
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-compiler-plugin</artifactId>
				<version>3.6.0</version>
				<configuration>
					<source>1.8</source>
					<target>1.8</target>
					<encoding>UTF-8</encoding>
					<forceJavacCompilerUse>true</forceJavacCompilerUse>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-shade-plugin</artifactId>
				<version>3.0.0</version>
				<executions>
					<execution>
						<phase>package</phase>
						<goals>
							<goal>shade</goal>
						</goals>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>

	<dependencyManagement>
		<dependencies>
			<dependency>
				<groupId>com.amazonaws</groupId>
				<artifactId>aws-java-sdk-bom</artifactId>
				<version>1.11.529</version>
				<type>pom</type>
				<scope>import</scope>
			</dependency>
		</dependencies>
	</dependencyManagement>

	<dependencies>
		<dependency>
			<groupId>junit</groupId>
			<artifactId>junit</artifactId>
			<version>4.12</version>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>com.amazonaws</groupId>
			<artifactId>aws-lambda-java-core</artifactId>
			<version>1.1.0</version>
		</dependency>
		<dependency>
			<groupId>com.amazonaws</groupId>
			<artifactId>aws-lambda-java-events</artifactId>
			<version>2.2.6</version>
		</dependency>
		<dependency>
			<groupId>com.google.code.gson</groupId>
			<artifactId>gson</artifactId>
			<version>2.8.5</version>
		</dependency>
	</dependencies>

</project>

Adjust the Java Build Path to add all relevant external JARs, i.e. the executable JAR produced by Talend and all of its dependencies from the lib folder in the same ZIP file:
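As an alternative to the Eclipse Build Path, you can install the Talend JAR into your local Maven repository (e.g. with mvn install:file) and declare it as a regular dependency; the shade plugin will then bundle it into the deployment JAR. The coordinates below are hypothetical and should be adjusted to match your Talend export:

```xml
<dependency>
	<groupId>local_project</groupId>
	<artifactId>aws_lambda_sfdc_account_df19</artifactId>
	<version>0.2</version>
</dependency>
```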

Here is the main Java class for the Account job; you can easily adapt it for Contact by changing the Talend import and the job class it instantiates:

package com.amazonaws.lambda.cardax.salesforce;

import java.util.LinkedList;
import java.util.Map;
import com.google.gson.Gson;

/* Import Amazon libraries for the relevant event that will trigger the Lambda, 
 * in this case SQS
 */
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;
import com.amazonaws.services.lambda.runtime.events.SQSEvent.SQSMessage;

/* This is the executable JAR generated by Talend, import everything in it */
import local_project.aws_lambda_sfdc_account_df19_0_2.*;


public class SFDCIntegrationAccount implements RequestHandler<SQSEvent, String> {

	@Override
	public String handleRequest(SQSEvent input, Context context) {
		try {
			/*
			 * Set defaults for environment variables
			 */
			Map<String,String> env = System.getenv();
			final String TalendContext = env.get("TalendContext")==null || env.get("TalendContext").isEmpty() ? "DF19" : env.get("TalendContext");
			final String DBSCHEMA = env.get("DBSCHEMA")==null || env.get("DBSCHEMA").isEmpty() ? "public" : env.get("DBSCHEMA");

			/**
			 * Loop through SQS messages received from the Queue
			 */
			for(SQSMessage msg : input.getRecords()){
				final AWS_Lambda_SFDC_Account_DF19 talendJob = new AWS_Lambda_SFDC_Account_DF19();
				context.getLogger().log("*********** MESSAGE FROM SQS ***********");
				String body = msg.getBody();
				body = cleanupBody(body);

				LinkedList<String> argsList = new LinkedList<String>();
				argsList.add("--context="+TalendContext);
				argsList.add("--context_param AWS_SQSmessage=" +body);
				/** You can override Context variables taking them from System Env, 
				 * otherwise the values defined in Talend will be used 
				 */   	        
				if(!DBSCHEMA.isEmpty()) argsList.add("--context_param DBSCHEMA=" +DBSCHEMA);
				String[] args = argsList.toArray(new String[argsList.size()]);

				/* Once we have our arguments we can run the Talend job with a single line of code */    	            	        
				String[][] bufferOutput = talendJob.runJob(args);

				/* The job output can be sent to CloudWatch Log for further analysis */
				Gson gson = new Gson();
				String bufferIn = gson.toJson(bufferOutput);
				context.getLogger().log(bufferIn);
			}
		} catch (Exception e) {
			context.getLogger().log("Error!");
			context.getLogger().log(e.getMessage());
			e.printStackTrace();
		}    	
		return "OK";
	}

	/**
	 * Removes the surrounding quotes, line breaks, and escape characters
	 * from the message body before it is passed to Talend.
	 */
	private String cleanupBody(String body) {
		body = body.substring(1, body.length() - 1); // strip surrounding quotes
		body = body.replace("\n", "");               // real line feeds
		body = body.replace("\r", "");               // real carriage returns
		body = body.replace("\\n", "");              // escaped line feeds
		body = body.replace("\\r", "");              // escaped carriage returns
		body = body.replace("\\", "");               // remaining escape backslashes
		return body;
	}

}
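To sanity-check the body cleanup logic without deploying anything, you can extract it into a small standalone class and run it against a sample message body. The JSON payload below is made up for illustration; the point is that the record arrives from the queue as an escaped JSON string:

```java
public class BodyCleaner {

	/**
	 * Same cleanup as in the handler: strip the surrounding quotes,
	 * drop real and escaped line breaks, then remove the remaining
	 * escape backslashes.
	 */
	public static String cleanup(String body) {
		body = body.substring(1, body.length() - 1); // strip surrounding quotes
		body = body.replace("\n", "").replace("\r", ""); // real line breaks
		body = body.replace("\\n", "").replace("\\r", ""); // escaped line breaks
		body = body.replace("\\", ""); // remaining escape backslashes
		return body;
	}

	public static void main(String[] args) {
		// A made-up escaped payload as it might arrive from the queue
		String raw = "\"{\\\"Name\\\":\\\"Acme\\\"}\"";
		System.out.println(cleanup(raw)); // {"Name":"Acme"}
	}
}
```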

Adjust the Lambda function settings

Your code can now be packaged and uploaded as an AWS Lambda function using the AWS Toolkit for Eclipse. Right-click in your code, select AWS Lambda > Upload function to AWS Lambda…, and follow the instructions:

You will now see your Lambda function in the AWS Console, where you can adjust the allocated memory, environment variables, and other settings.
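The handler above falls back to a default whenever an environment variable is unset or empty. That pattern can be factored into a small helper, sketched here with the same variable names and defaults used in the handler:

```java
public class EnvConfig {

	/** Returns the environment variable's value, or the default if it is unset or empty. */
	public static String envOrDefault(String name, String defaultValue) {
		String value = System.getenv(name);
		return (value == null || value.isEmpty()) ? defaultValue : value;
	}

	public static void main(String[] args) {
		// Same defaults as in the Lambda handler
		String talendContext = envOrDefault("TalendContext", "DF19");
		String dbSchema = envOrDefault("DBSCHEMA", "public");
		System.out.println("TalendContext=" + talendContext + ", DBSCHEMA=" + dbSchema);
	}
}
```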

Add a trigger for the Lambda function

We can now link the Lambda function with its relevant SQS queue by clicking “+ Add Trigger“:

As soon as the Lambda function is added as a trigger to the relevant SQS queue, queued messages will be processed in batches of the size you specify (1 to 10 messages).

Test the integration

Create a record in Salesforce and verify that it is correctly inserted into the database.
In case of issues, look for error messages in CloudWatch Logs.


AWS CloudWatch

As per the AWS description:

CloudWatch collects monitoring and operational data in the form of logs, metrics, and events, and visualizes it using automated dashboards so you can get a unified view of your AWS resources, applications, and services that run in AWS and on-premises. You can correlate your metrics and logs to better understand the health and performance of your resources. You can also create alarms based on metric value thresholds you specify, or that can watch for anomalous metric behavior based on machine learning algorithms. To take action quickly, you can set up automated actions to notify you if an alarm is triggered and automatically start auto scaling, for example, to help reduce mean-time-to-resolution. You can also dive deep and analyze your metrics, logs, and traces, to better understand how to improve application performance.

How Amazon CloudWatch works

Follow the excellent AWS tutorials to get started with CloudWatch: https://aws.amazon.com/cloudwatch/getting-started/


Conclusion

I hope you find this tutorial useful.
Feel free to share your feedback or enhancements to this AWS architecture in the comments below.
I’m not an AWS expert and I’m always looking forward to learning new skills.
For example, I believe there is an easier way to reproduce this configuration using AWS CloudFormation, a service that describes and recreates a whole architecture setup using JSON or YAML templates. If you have experience with that service, please let me know if my assumption is correct. 🙂
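For the curious, here is a rough, untested sketch of what the Lambda-plus-SQS part of this setup might look like as a CloudFormation template. Resource names, the S3 location, and the sizing properties are illustrative only (and the IAM role is omitted for brevity), so this is not a drop-in template:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  AccountQueue:
    Type: AWS::SQS::Queue
  AccountFunction:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: java8
      Handler: com.amazonaws.lambda.cardax.salesforce.SFDCIntegrationAccount::handleRequest
      Code:
        S3Bucket: my-deployment-bucket      # illustrative
        S3Key: cardax-salesforce-account-1.0.0.jar
      Role: !GetAtt AccountFunctionRole.Arn # IAM role definition omitted for brevity
      MemorySize: 512
      Timeout: 60
  AccountQueueTrigger:
    Type: AWS::Lambda::EventSourceMapping
    Properties:
      EventSourceArn: !GetAtt AccountQueue.Arn
      FunctionName: !Ref AccountFunction
      BatchSize: 10
```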

Author:

13x Salesforce Certifications (incl. Application & System Architect) | 3x Advanced Accreditations (Einstein Discovery, Einstein Analytics and Sales Cloud Einstein) | #CertifiedPro on my #JourneyToCTA | #AnalyticsChampion | Dreamforce '19 Speaker
