
S3 Account and User Import

This article describes how you configure the S3 integration required to import accounts and users into PX.


Overview

Gainsight PX can import daily updates to account and user data from a pre-defined directory structure in an AWS S3 bucket. The bucket directory structure and files must follow the strict rules defined below.

Prerequisites

  • S3 Bucket Root Folders must be configured as defined in the Bucket Structure section of this article.

  • An empty file with naming conventions as defined in the Process/Data Flow section of this article must be placed in the subdirectory.

  • The file guidelines as defined in the CSV Structure and Data Type sections of this article must be followed.

Configure Account Import

To configure a connection for account import:

  1. Navigate to Administration > Integrations.
  2. Click the Settings icon or Authorize button on the S3 Account Import widget in the Data Integrations section. The AWS S3 Import Settings dialog appears.


  3. Enter the following details to authorize the connection:
    • S3 Bucket Name: The bucket name and any optional sub-folder(s), separated by slashes. This is the name of the bucket that was created in S3. For example, for a bucket ABC123 with a folder import, use ABC123/import.
    • AWS Access Key and AWS Security Token: The credentials for your S3 bucket. Ensure you enter valid credentials to establish the S3 integration.
    • Path to base import folder: The path to the base import folder within the bucket.
    • Date Format: The format of the dates in the imported files.
    • API Key: The API key for the status callback endpoint.
  4. Click Apply.

Configure User Import

To configure a connection for user import:

  1. Navigate to Administration > Integrations.
  2. Click the Settings icon or Authorize button on the S3 User Import widget in the Data Integrations section. The AWS S3 Import Settings dialog appears.


  3. Enter the following details to authorize the connection:
    • S3 Bucket Name: The bucket name and any optional sub-folder(s), separated by slashes. This is the name of the bucket that was created in S3. For example, for a bucket ABC123 with a folder import, use ABC123/import.
    • AWS Access Key and AWS Security Token: The credentials for your S3 bucket. Ensure you enter valid credentials to establish the S3 integration.
    • Path to base import folder: The path to the base import folder within the bucket.
    • Date Format: The format of the dates in the imported files.
    • API Key: The API key for the status callback endpoint.
  4. Click Apply.

S3 Bucket Configurations

Bucket Permissions

You need to provide an AWS Access Key (and Security Token) that grants read/write access to the import folder in the bucket.
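For example, the following is a minimal sketch (not an official Gainsight utility) that verifies a set of credentials can read and write under the import folder, assuming Python with boto3 and placeholder bucket, folder, and credential values:

import boto3

# Placeholder credentials; if you use temporary credentials, also pass
# aws_session_token (the AWS Security Token).
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",
    aws_secret_access_key="...",
)

BUCKET = "ABC123"           # placeholder bucket name
PREFIX = "import/current/"  # placeholder import folder

# Write, read back, and delete a small probe object to confirm access.
probe_key = PREFIX + "px-permission-check.txt"
s3.put_object(Bucket=BUCKET, Key=probe_key, Body=b"probe")
s3.get_object(Bucket=BUCKET, Key=probe_key)
s3.delete_object(Bucket=BUCKET, Key=probe_key)
print("Read/write access to", BUCKET + "/" + PREFIX, "confirmed")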

Bucket Structure

You need to configure the Bucket Structure as defined in the following sections.

Root Folder

The root folder should contain two subfolders called current and processed.
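S3 has no real directories, so one way to make these "folders" visible is to create zero-byte keys that end in a slash. The following is a minimal sketch, assuming boto3 and placeholder bucket and path names:

import boto3

s3 = boto3.client("s3")
BUCKET = "ABC123"  # placeholder bucket name
BASE = "import/"   # placeholder path to the base import folder

# Create the two required subfolder placeholders under the root folder.
for folder in ("current/", "processed/"):
    s3.put_object(Bucket=BUCKET, Key=BASE + folder)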

Process/Data Flow

Schedule: 

The PX import job runs every 15 minutes. If an input file is found, it is imported. If more than one file is found in a folder, the files are imported serially.

Data Flow: 

For each day's data, create a subdirectory under the current folder named with that day's date in the format dt=YYYYMMDD, and place the data file in that subdirectory. When the data file is ready to be processed, place an empty file named "_SUCCESS" into the directory to signal that the write process is complete and PX can begin processing the file. Once the job is complete, the files are moved to the processed folder. For more details, refer to the Validate Import Using Status File section of this article. A sketch of the producer side of this flow follows the examples below.

Input Queue Example:

 ./current 
   /dt=20210228 
      _SUCCESS 
      users.csv.gz

Processed Files Example:

./processed 
   /20210228 
      status.txt
      _SUCCESS 
      users.csv.gz
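The following is a minimal sketch of the producer side of this flow, assuming Python with boto3 and placeholder bucket and path names. It uploads the day's gzipped CSV into current/dt=YYYYMMDD/ and only then writes the empty _SUCCESS marker:

from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "ABC123"  # placeholder bucket name
BASE = "import/"   # placeholder path to the base import folder

# Build the dated input directory, e.g. import/current/dt=20210228/
day_dir = BASE + "current/dt=" + datetime.now(timezone.utc).strftime("%Y%m%d") + "/"

# 1. Upload the data file first.
s3.upload_file("users.csv.gz", BUCKET, day_dir + "users.csv.gz")

# 2. Only then write the empty _SUCCESS marker; PX does not start
#    processing the directory until this file exists.
s3.put_object(Bucket=BUCKET, Key=day_dir + "_SUCCESS", Body=b"")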

CSV Structure

The following is the CSV structure suggested by Gainsight PX:

  • Standard CSV (RFC 4180)
    For more information, refer to the CSV specifications article from the Additional Resources section.
  • Gzipped, with extension “.csv.gz”.
  • Filename prefix (example: “users”) is not significant.
  • One input file per date directory
  • All columns must be defined in the User attributes area of PX; the column headers should be the “apiName”, not the View Name of the attribute.
  • Data must be parseable according to the types defined in PX (see below).
  • Limited to 250,000 rows per file.

Note: The id column is mandatory.
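For example, the following minimal sketch uses only the Python standard library to write a gzipped, RFC 4180-style CSV that follows the rules above; the attribute apiNames other than id are hypothetical, for illustration only:

import csv
import gzip

# Column headers are PX attribute apiNames; the id column is mandatory.
# "email" and "signUpDate" are hypothetical attributes for illustration.
rows = [
    {"id": "u-1001", "email": "ada@example.com", "signUpDate": "20211225"},
    {"id": "u-1002", "email": "grace@example.com", "signUpDate": "20210630"},
]

with gzip.open("users.csv.gz", "wt", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "email", "signUpDate"])
    writer.writeheader()
    # The csv module quotes and escapes fields containing commas, double
    # quotes, or line breaks, as the String data type below requires.
    writer.writerows(rows)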

Data Types

 The following are the data types:

  • String: Fields containing line breaks (CRLF), double quotes, and commas should be enclosed in double quotes. If double-quotes are used to enclose fields, then a double-quote appearing inside a field must be escaped by preceding it with another double quote. 
  • Numeric: Integer or floating-point values 
  • Boolean: “true” or “false” (not case sensitive)
  • Date: Should be in the format specified by the Date Format value on the configuration screen.
    The valid format characters follow the Java date-format pattern conventions.
    Note: The proper abbreviation for month is “MM”, not “mm”

Typical values are: 

  • Date Only

    Format String: yyyyMMdd

    Example Data: 20211225 

  • Date/Time (UTC time-zone)

    Format String: yyyy-MM-dd'T'HH:mm:ssX 

    Example Data: 2021-06-25T13:55:42Z
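The format strings above use Java-style date patterns. As a point of reference, the following sketch produces matching values in Python; the strftime equivalents are an assumption:

from datetime import datetime, timezone

now = datetime.now(timezone.utc)

date_only = now.strftime("%Y%m%d")              # matches yyyyMMdd, e.g. 20211225
date_time = now.strftime("%Y-%m-%dT%H:%M:%SZ")  # matches yyyy-MM-dd'T'HH:mm:ssX in UTC
print(date_only, date_time)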

Validate Import Using Status File 

When an import job finishes, the imported file is moved to the processed folder and a status.txt file is created. The first line contains the result, either “SUCCESS” or “ERROR”. Subsequent lines contain a human-readable message with more details about the results of the import. A sketch of reading this file follows the examples below.

Success Example:

SUCCESS
4155 records were uploaded

Error Example:

ERROR
Parsing Errors:[
       Unknown PX Column: 'flantana_carteret',
       Missing required Column: 'id' 
]
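The following is a minimal sketch, assuming boto3 and placeholder bucket and path names, that fetches a day's status.txt from the processed folder and checks the first line:

import boto3

s3 = boto3.client("s3")
BUCKET = "ABC123"                             # placeholder bucket name
key = "import/processed/20210228/status.txt"  # placeholder processed path

body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read().decode("utf-8")
lines = body.splitlines()

# First line is SUCCESS or ERROR; the rest is a human-readable message.
if lines and lines[0] == "SUCCESS":
    print("Import succeeded:", " ".join(lines[1:]))
else:
    print("Import failed:", " ".join(lines[1:]))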

Validate Import Using Status Endpoint

You can provide an optional HTTP endpoint for receiving notifications from PX when jobs finish (with errors or success). A sketch of a receiver follows the list below.

  • Example Endpoint: https://www.example.com/prd/data-export
  • Method: POST
  • Authentication: x-api-key header with the API key provided by the customer
  • Body: { folder_path: "{path to processed folder}", success: {true|false}, message: "{load result details}" }
  • Response: 200 OK, body: String
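The following is a minimal sketch of such a receiver, assuming Python with Flask (any HTTP framework works) and that the body arrives as JSON. The endpoint path and API key value are placeholders:

from flask import Flask, request

app = Flask(__name__)
EXPECTED_API_KEY = "my-shared-secret"  # placeholder; must match the key in the PX settings

@app.route("/prd/data-export", methods=["POST"])
def px_import_callback():
    # PX authenticates with the x-api-key header.
    if request.headers.get("x-api-key") != EXPECTED_API_KEY:
        return "unauthorized", 401
    payload = request.get_json(force=True)
    print("folder:", payload.get("folder_path"),
          "success:", payload.get("success"),
          "message:", payload.get("message"))
    # PX expects a 200 OK with a string body.
    return "ok", 200

if __name__ == "__main__":
    app.run(port=8080)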

Additional Resources

  • CSV specifications (RFC 4180): https://www.rfc-editor.org/rfc/rfc4180