Assumption & Data Schema

Let’s consider a scenario where a Marketing Automation software provider wants to analyze Email Campaign data. We can assume that we get fresh data every day, which implies one record per Campaign per day. The date element in this example is the Campaign execution date, which means that data for Campaigns executed on 11/4/2014 would come in on 11/5/2014. We can also assume that we have a User identifier that corresponds to the person who initiated the Campaign. For the purposes of this example, this Id lives on the Contact object.

With these assumptions, the data feed has to have the following four columns:

  1. campaign_id (acts as the unique identifier for each Campaign)
  2. execution_date (tells us when the Campaign was run)
  3. emails_sent (tells us how many emails were sent as part of the Campaign)
  4. user_id (tells us who ran the Campaign)

Let’s also assume that the end goal here is to get to the following:

  • Marketing Campaigns run per Account per week
  • Emails sent per Account per week

Create Subject Area

Now that we know what the structure is, we can configure a subject area in the MDA to store this data. This is done using the Tenant Management App, which is available internally; Gainsight Services will configure it during your implementation. Let’s call it Sample_Data_Load.


In this case, we have three dimensions (campaign_id, execution_date, and user_id) and one measure (emails_sent). SFDC Mapping is not required, since we don’t have the Account Id available directly in the data.

Create Account Linkage

Technically, you could just populate this data into the collection without any linkage to SFDC Accounts; you could still build reports on the data and even push them to Gainsight Home. But this falls well short of the full value, and is therefore not the recommended way to use Gainsight.

There are broadly three ways to create the Account linkage:

  1. Provide us the SFDC Account Id directly.
  2. Provide us some other field on the Account record that can uniquely identify the Account, such as an External Id.
  3. Provide us an Id that lives on an object that is related directly to SFDC Accounts. For example, this could be a User Id or Email Id on the Contact object, or something like a Group Id that lives on a custom object that connects back up to Account.

In scenario #1, the Account linkage can be created directly in the previous step. For the other two scenarios, we will have to treat this subject area as a staging area, and then perform lookups to create the Account linkage.

For this example, we are assuming scenario #3 (user_id lives on the Contact object, which then connects to SFDC Account). Following are the steps you can take:

  • Navigate to Administration > Connectors > Data Load Configuration and click +Add New. Then create a new project on top of the Subject Area called Sample_Data_Load.
  • Using the drop-down, check all the fields you want to retain from the staging subject area.
  • Provide SFDC mappings as shown below. The Account Id mapping should correspond to the field that helps create the Account linkage. The Date mapping is used to auto-calculate the Week, Month, Quarter, and Year dimensions, which are useful for pivoting and creating rules.
  • Since user_id has been selected to help us make the Account linkage, we can now chart our path there. SFDC Field Name is the field that contains the value present in user_id, and Account Key is the field on the Contact object that has the SFDC Account Id.
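Conceptually, the lookup that this configuration performs resembles a join from the staging records through Contact to Account. Here is a minimal Python sketch of that resolution; all record values and field names below are made up for illustration:

```python
# Staging records from the Sample_Data_Load subject area (illustrative values).
staging = [
    {"campaign_id": "CAMP-001", "emails_sent": 1500, "user_id": "U-100"},
    {"campaign_id": "CAMP-002", "emails_sent": 2300, "user_id": "U-101"},
]

# Contact records: user_id plays the role of the SFDC Field Name,
# and account_id plays the role of the Account Key.
contacts = [
    {"user_id": "U-100", "account_id": "001A000001AbCdE"},
    {"user_id": "U-101", "account_id": "001A000001FgHiJ"},
]

# Index Contacts by user_id, then resolve each staging row to its Account.
by_user = {c["user_id"]: c["account_id"] for c in contacts}
linked = [dict(row, account_id=by_user[row["user_id"]]) for row in staging]

for row in linked:
    print(row["campaign_id"], "->", row["account_id"])
```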

Compose CSV

Let’s start with four records for now. We can name this file: campaign_data.csv.
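As a sketch of what that file could contain, with the four columns defined earlier (the values here are purely illustrative):

```csv
campaign_id,execution_date,emails_sent,user_id
CAMP-001,11/4/2014,1500,U-100
CAMP-002,11/4/2014,2300,U-101
CAMP-003,11/4/2014,800,U-100
CAMP-004,11/4/2014,4200,U-102
```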


Please note that the date format used here is MM/dd/yyyy, which is one of the following supported formats:

  • yyyy-MM-dd'T'HH:mm:ssZ
  • yyyy-MM-dd:HH-mm-ss
  • MM-dd-yyyy
  • yyyy-MM-dd
  • yyyy-M-d
  • M-d-yyyy
  • MM/dd/yyyy
  • yyyy/MM/dd
  • yyyy/M/d
  • M/d/yyyy
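The patterns above use Java-style date tokens (MM, dd, yyyy). If you want to sanity-check values before composing the CSV, a quick Python equivalent of MM/dd/yyyy is %m/%d/%Y; the helper name below is our own:

```python
from datetime import datetime

def is_valid_execution_date(value: str) -> bool:
    """Return True if value matches the MM/dd/yyyy format (e.g. 11/4/2014)."""
    try:
        datetime.strptime(value, "%m/%d/%Y")
        return True
    except ValueError:
        return False

print(is_valid_execution_date("11/4/2014"))   # True
print(is_valid_execution_date("2014-11-04"))  # False: that is yyyy-MM-dd
```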

Compose JSON

Let’s start with a simple INSERT operation. UPSERT is not recommended, as it negatively affects our performance. Exceptions can, however, be made if the volume and frequency are low. Please discuss and confirm with Gainsight before taking the UPSERT route.

We can name this file: sample.json.
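Purely as an illustration of the kind of information such a metadata file carries (the key names below are our own assumptions, not Gainsight’s confirmed schema; use the exact format provided by Gainsight), an INSERT descriptor could look something like:

```json
{
  "operation": "INSERT",
  "subjectArea": "Sample_Data_Load",
  "fields": ["campaign_id", "execution_date", "emails_sent", "user_id"],
  "dateFormat": "MM/dd/yyyy"
}
```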


If this were an UPSERT operation instead for some reason, here’s how the JSON would look:
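The essential difference is that an UPSERT also has to name the fields that uniquely identify a record; in this example that would be campaign_id plus execution_date, since there is one record per Campaign per day. Again, the key names below are illustrative assumptions, not Gainsight’s confirmed schema:

```json
{
  "operation": "UPSERT",
  "subjectArea": "Sample_Data_Load",
  "fields": ["campaign_id", "execution_date", "emails_sent", "user_id"],
  "keyFields": ["campaign_id", "execution_date"],
  "dateFormat": "MM/dd/yyyy"
}
```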


Compose CURL

The fully composed command would look like this:
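In shape, it is a curl multipart POST that carries the access key, org Id, and login name along with the two files. The endpoint URL and parameter names below are placeholders, not the actual API; use the command provided by Gainsight:

```
curl -X POST "https://<gainsight-endpoint>/dataload" \
  -H "accessKey: <accessKey>" \
  -F "appOrgId=<18-digit SFDC Org Id>" \
  -F "loginName=<SFDC Username>" \
  -F "metadata=@/full/path/to/sample.json" \
  -F "payload=@/full/path/to/campaign_data.csv"
```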


Here’s how you can find the various components:

  • accessKey: This can be generated from Gainsight → Administration → Integrations → Data Load Configuration → Generate Access Key.
  • appOrgId: This is the 18-digit Id of the SFDC Org associated with the MDA tenant to which this data is being pushed. The 15-digit version of this Id can be found under Setup → Company Information → Organization Id.
  • loginName: This is the Username that was used to log into the connected SFDC Org, and using which the Access Key was generated. It can be found under Setup → Personal Information → Username.

Please also note that the full (absolute) file path has to be specified for both the payload CSV and the metadata JSON.

Automate Data Load to MDA

This command can be used in two ways:

  1. Customers own extracting the data and pushing it to MDA. In this case, all the code is created and maintained by the customer. Gainsight can provide guidance, but will not own the data load piece.
  2. Using Informatica Cloud’s post-processing feature. In this case, the customer owns extracting the data and loading it to the SFTP in the agreed-upon format and frequency. Gainsight will own and manage the rest.
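For option #1, the simplest form of automation is a scheduled job that regenerates the CSV and re-runs the command. As a sketch, a crontab entry could look like the following; the script path and log location are hypothetical and up to you:

```
# Run every day at 6:00 AM: extract yesterday's Campaign data and push it to MDA.
0 6 * * * /opt/etl/push_campaign_data.sh >> /var/log/mda_load.log 2>&1
```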

Perform Aggregations via Rules Engine

Before we start aggregating the data further, let’s do a quick recap of what we have so far.

Subject Area 1: Sample_Data_Load


Subject Area 2: Sample_Data_Load_Lookup Day Agg


With the data in the second subject area, we can get to the final weekly metrics in two ways:

  • Use the report builder to perform run-time aggregations.
  • Build aggregation rules in the Rules Engine to push weekly data to Usage Data metrics in SFDC.

In both cases, assuming the week is defined as Sunday through Saturday, a weekly schedule can be set to run every Sunday morning.
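Either way, the roll-up is conceptually a group-by on Account and week. A Python sketch of that aggregation over day-level records with Account linkage (the data values are illustrative, and the week is anchored to Sunday per the Sunday-through-Saturday definition):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Day-aggregated records with Account linkage (illustrative values).
records = [
    {"account_id": "A1", "date": "11/2/2014", "campaigns": 1, "emails_sent": 1500},
    {"account_id": "A1", "date": "11/4/2014", "campaigns": 2, "emails_sent": 3100},
    {"account_id": "A2", "date": "11/9/2014", "campaigns": 1, "emails_sent": 800},
]

def week_start(date_str: str) -> str:
    """Return the Sunday that starts the week containing date_str (MM/dd/yyyy)."""
    d = datetime.strptime(date_str, "%m/%d/%Y")
    sunday = d - timedelta(days=(d.weekday() + 1) % 7)  # Monday=0 ... Sunday=6
    return sunday.strftime("%m/%d/%Y")

# Accumulate Campaigns run and emails sent per Account per week.
weekly = defaultdict(lambda: {"campaigns": 0, "emails_sent": 0})
for r in records:
    key = (r["account_id"], week_start(r["date"]))
    weekly[key]["campaigns"] += r["campaigns"]
    weekly[key]["emails_sent"] += r["emails_sent"]

for (account, week), totals in sorted(weekly.items()):
    print(account, week, totals["campaigns"], totals["emails_sent"])
```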