Extract, Transform, Load (ETL) Tool
ETL Overview
Previously in Fenergo SaaS, clients were restricted to using the Data Migration APIs for bulk creation or updates of pre-verified entities. This method is primarily designed for clients with technical expertise, requiring in-house resources to manage data transfer jobs. As a result, business-focused clients, or those seeking to minimise reliance on their tech teams, faced significant challenges in managing data migrations or data updates.
To address some of these challenges, the new ETL tool provides a more user-friendly alternative. Clients can upload .CSV files containing their data directly through the UI to create or update client records, simplifying the entity transfer process. Within the tool, users can join and filter the uploaded data sources, and the fields from these data sources can then be mapped to the corresponding fields in Fen X. A preview feature allows clients to review transformations and mappings; once satisfied, they can validate the data against the field specifications defined in Fen X before initiating the loading process. Enhanced observability enables users to monitor the progress and success of validation and data loads in real time, eliminating the need for extensive technical involvement.
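As an illustration of the kind of .CSV input the tool consumes, the sketch below checks that a file's header contains a set of required columns before upload. The column names (`EntityName`, `EntityType`, `Jurisdiction`) are hypothetical examples; the actual required fields are determined by your Fen X field specifications and the policy applied to the migration project.

```python
import csv
import io

# Hypothetical required columns for illustration only; the real set
# comes from the Fen X field specifications for your entity type.
REQUIRED_COLUMNS = {"EntityName", "EntityType", "Jurisdiction"}

def missing_columns(csv_text: str) -> list:
    """Return required columns absent from the CSV header, sorted."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader, [])
    return sorted(REQUIRED_COLUMNS - set(header))

sample = "EntityName,EntityType,Jurisdiction\nAcme Ltd,Company,IE\n"
print(missing_columns(sample))  # [] -> all required columns present
```

A pre-check like this can catch structural problems in a data source before the in-tool validation step, where row-level values are checked against the field specifications.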
ETL supports the following primary scenarios:
- System Replacement: When transitioning to the Fenergo SaaS platform by retiring or replacing an existing system, the ETL tool ensures seamless migration of critical entity data from the legacy system to maintain uninterrupted operations.
- Day 1 Readiness: The ETL tool enables the initial seeding of data into the Fenergo SaaS platform, ensuring full operational capabilities from the very first day of deployment.
- Data Alignment and Updates: In cases where pre-verified or mastered data exists in another system, the ETL tool facilitates updating data attributes in Fen X to maintain alignment and consistency.
Until ETL has been enabled for all entity types, the existing Data Migration APIs may be more suitable. You can collaborate with your client partner to determine which solution, the Data Migration APIs or the ETL tool, best fits your needs.
ETL Performance Limits
| Entity Type | Throughput (Load) | Max Volume per ETL Project |
|---|---|---|
| Entities | 70,000 per hour | Up to 1.6 million total records across entities, associations, and products |
| Associations | 70,000 per hour | Up to 500,000 (expected to increase to 1.6 million) |
| Products | 100,000 per hour | Up to 1.6 million total records across entities, associations, and products |
Note: The file size for a dataset containing 1.6 million entities was approximately 200–300 MB. This is an estimate and may vary depending on the structure of your data source and the applied migration policy. We do not anticipate any issues with supporting increased volumes.
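As a rough planning aid, the throughput figures in the table above imply load durations like the following. This is a back-of-the-envelope calculation based on the published sustained rates, not a service-level guarantee:

```python
# Published sustained load rates from the performance limits table.
ENTITY_THROUGHPUT_PER_HOUR = 70_000    # entities and associations
PRODUCT_THROUGHPUT_PER_HOUR = 100_000  # products

def load_hours(records: int, per_hour: int) -> float:
    """Estimated hours to load `records` at a sustained `per_hour` rate."""
    return records / per_hour

# Loading the maximum project volume of 1.6 million records:
print(round(load_hours(1_600_000, ENTITY_THROUGHPUT_PER_HOUR), 1))   # 22.9 hours
print(round(load_hours(1_600_000, PRODUCT_THROUGHPUT_PER_HOUR), 1))  # 16.0 hours
```

Actual durations will vary with data structure and the applied migration policy, so treat these figures as an upper-bound sizing estimate.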
Performance enhancements are in progress, and we are working to increase these limits.
The ETL tool is now available for release on production tenants. Currently, Individual, Other (including Investment and Bank Accounts), Company, Products, Client, and Non-Client entities, as well as Associations, are available for migrating or updating; we are actively working on the inclusion of other entity types.
While the tool itself is capable of supporting significantly higher volumes (up to 1.6 million entities in a single project, as referenced above), tenant-level limits (10,000 of each entity type) are applied to manage load and maintain system stability. If your use case requires higher volumes, please reach out to your client partner or Fenergo support to discuss increasing your tenant's capacity. A message is displayed on preview if you exceed your tenant limit; see the troubleshooting guide.
There is also a limit of one migration project per tenant running Validation or Load at any one time.
When using the ETL functionality, we advise using a specific Policy rather than a broad Jurisdictional one. Avoid large Policies such as Global, which create large data sets with rigid requirements that may never pass validation. Keeping the Policy used in ETL small also benefits end users, as they see only the requirements relevant to their migration job.
Data is persisted directly to verified data; the ETL tool does not write to draft data or initiate any post-processing activities.