
Connectors Troubleshooting Guide

IMPORTANT NOTE: Gainsight is upgrading Connectors 2.0 with Horizon Experience. This article applies to tenants that have not yet been upgraded to the Horizon Experience of Connectors 2.0. If you are using Connectors 2.0 with Horizon Experience, you can find the documentation here.

Issue: Raw data and Day agg data do not match

DATA_API: whenever you see this issue, rerun the aggregation for the timeframe in which the mismatch occurs.

Cause: Data may have been loaded at a later date, i.e. after the scheduled aggregation date.

SEGMENT_IO: this can happen when the scheduled aggregation time is not set properly. There should be a minimum 4-hour buffer between the Segment project's day rollover (per the project time zone) and the aggregation schedule time, and Days prior to schedule date should be at least “1”.

Example: if the project's day rolls over at 12:00 am PST, then the aggregation should be scheduled at 4:00 am PST or later.

Mixpanel & Google Analytics: in the aggregation schedule, Days prior to schedule date should be at least “1”.
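The scheduling rule above can be sketched as a simple validity check. This is an illustrative sketch only; `schedule_is_safe` and its parameters are hypothetical names, not part of any Gainsight API.

```python
from datetime import datetime, timedelta

MIN_BUFFER = timedelta(hours=4)  # recommended gap after the project's day rollover

def schedule_is_safe(project_midnight: datetime, agg_time: datetime,
                     days_prior: int) -> bool:
    """Return True if the aggregation time leaves at least a 4-hour buffer
    after the project's day rollover and looks back at least one day."""
    return (agg_time - project_midnight) >= MIN_BUFFER and days_prior >= 1

# Project day rolls over at 12:00 am PST; aggregation scheduled at 4:00 am PST.
midnight = datetime(2023, 1, 2, 0, 0)
print(schedule_is_safe(midnight, datetime(2023, 1, 2, 4, 0), 1))  # True
print(schedule_is_safe(midnight, datetime(2023, 1, 2, 2, 0), 1))  # False: only 2h buffer
```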

Issue: Data is present in Raw data, but not visible in Day agg

For all connectors, the most likely reason a record appears as -999 is that the account is not present in the Account object. In this case, look up the record in day agg using its source account id, then check whether that account exists in the Account object.

Another possibility is that the record is not a Gainsight customer: the record exists in the Account object, but its Customerinfo field is null. In that case, the record is not pulled into the day agg subject areas at all.
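The two checks above can be sketched as a small diagnostic. The table, ids, and the `diagnose` helper are hypothetical illustrations of the logic, not actual Gainsight behavior or APIs.

```python
# Hypothetical account table: source account id -> Customerinfo value.
accounts = {
    "0015500000WxyzAAA": "cust-001",  # a Gainsight customer
    "0015500000AbcdBBB": None,        # in Account object, but Customerinfo is null
}

def diagnose(source_account_id):
    if source_account_id not in accounts:
        return -999               # Account Not Found: shows as -999 in day agg
    if accounts[source_account_id] is None:
        return "not a customer"   # record is skipped, never pulled into day agg
    return "ok"

print(diagnose("0015500000Missing"))  # -999
print(diagnose("0015500000AbcdBBB"))  # not a customer
```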

Issue: All records in Day agg or Flipped display as -999 (Account Not Found)

This happens when either the account mapping is improperly configured or the source data itself is bad.

Direct Lookup:


Here the source field SF_NATIVE_ACCOUNT_ID carries an 18-digit account id that is matched directly against the Account object id.

Indirect Lookup:


Here the source field SF_NATIVE_ACCOUNT_ID matches data in the Case Number column of the Case object (it can be any Salesforce object). The account key should be a field that has a lookup to the Account object; in this case the field name is Account ID.


Here the source field SF_NATIVE_ACCOUNT_ID again matches data in the Case Number column of the Case object, but the account key is the field Business Hours ID (the field name need not be Account ID). Conversely, Admins sometimes select a field named Account ID that does not actually have a lookup to the Account object. Hence the -999 issue.
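The indirect lookup described above can be sketched as a two-step resolution. The tables, ids, and the `resolve` helper are hypothetical; what matters is that the account key field must resolve to an id that actually exists in the Account object, otherwise the record falls back to -999.

```python
# Hypothetical Case rows: Case Number -> field values on the Case object.
cases = {
    "00001001": {"AccountId": "001A", "BusinessHoursId": "01mB"},
}
accounts = {"001A"}  # ids present in the Account object

def resolve(source_id, account_key):
    """Match the source value against Case Number, then follow the chosen
    account key field; return -999 if it does not land on a real Account."""
    case = cases.get(source_id)
    if case is None:
        return -999
    acct = case.get(account_key)
    return acct if acct in accounts else -999

print(resolve("00001001", "AccountId"))        # resolves to "001A"
# Choosing a field that is not a lookup to Account yields -999:
print(resolve("00001001", "BusinessHoursId"))  # -999
```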

A similar issue occurs for users, i.e. -888 or User Not Found.

The user mapping may be improperly configured. For account configuration, any Salesforce object can be selected, but for user configuration the Contact object is a fixed choice; only fields present in the Contact object can be selected to match against source data.


Here the source field PRODUCTFULLNAME carries data that is matched against the Email field on the Contact object.

If you plan to alter the above configuration (the account & user mapping), please raise a support ticket, as this requires changes to be made on the backend.

Issue: Altering the schema, i.e. changing the datatype

For example, let’s say we have created an MDA object named “Dummy” using COM, with two fields: Account id and Date. The Account id field is created with the Number data type, and the Date field is set up with the Date data type.

If you need to change the datatype and you have not yet loaded data, you can edit the data type from Gainsight Data Management and fix the issue. After data is loaded, however, these changes must be performed on the back end by techops, so please raise a support ticket in this case. Do not attempt the change via Data Management, as this will not resolve the issue.

Issue: Random failure in aggregation (Data load API, Segment IO, Mixpanel, Google Analytics)

Sometimes we see that scheduled aggregations fail, but when we rerun them manually, the aggregation succeeds. This may be due to a network issue during a dyno restart, or a database network issue. It is typically intermittent; the scheduled aggregation will run successfully from the next day onwards.

In this case, the question is whether you need to rerun the project alone, or whether other areas are impacted (e.g. rules, Journey Orchestrator, etc.). For example, check whether day agg or Flipped acts as a source in any of the rules. If so, you need to rerun those rules for that particular day alone.

You also need to check usage configuration and if necessary, rerun usage aggregation as well.

S3 common failures

Whenever an S3 job fails, check the S3 execution history for the error message. Then log in to S3 using any client tool, open the error folder, and locate the file tied to the failed job and its failure reason.

Some examples:

  • Only header is present with no data
  • Datatype mismatch between data present in file and schema created in S3 configuration
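Both failure modes above can be reproduced with a small file check. This is an illustrative sketch only; the `schema` dict and `validate` function are hypothetical stand-ins for the schema created in the S3 configuration, not a Gainsight utility.

```python
import csv
import io

# Hypothetical schema as declared in the S3 connector configuration.
schema = {"account_id": str, "usage_count": int}

def validate(file_text):
    """Flag the two common S3 failures: a header-only file, and a
    datatype mismatch between file contents and the configured schema."""
    rows = list(csv.DictReader(io.StringIO(file_text)))
    if not rows:
        return "only header present, no data"
    for lineno, row in enumerate(rows, start=2):
        for col, typ in schema.items():
            try:
                typ(row[col])
            except (ValueError, TypeError):
                return f"datatype mismatch at line {lineno}, column {col!r}"
    return "ok"

print(validate("account_id,usage_count\n"))                  # only header present, no data
print(validate("account_id,usage_count\n001A,not_a_number"))  # datatype mismatch
print(validate("account_id,usage_count\n001A,42"))            # ok
```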

Issue: Events are visible in Day agg but not in Flipped (Segment IO & Mixpanel)

Whenever new events are pushed into Gainsight, they are visible by default in raw data as well as Day agg, provided they are properly formatted.

For them to be visible in Flipped, however, you need to go to the integration and select the events you want displayed in the Flipped measure.

Unless they are selected here, new events will not appear in the Flipped measure.


Issue: Deletion of data in MDA objects

To delete data in MDA objects using COM, go to Data Selection in COM and click the “delete all” option on the right-hand side.

Delete All Option.png

This completely deletes the data from the object permanently.

If you want to delete only a subset of the data, apply a filter and then click the “delete” option.

Delete Option.png

Data can be deleted only at the row level, not at the column level. Deleting the data in a particular column is not possible from COM; this can be done by tech ops only.
