Top ETL Developer Skills

Below we've compiled a list of the most important skills for an ETL Developer. We ranked the top skills based on the percentage of ETL Developer resumes they appeared on. For example, 13.8% of ETL Developer resumes contained Data Warehouse as a skill. Let's find out what skills an ETL Developer actually needs in order to be successful in the workplace.

These are the most common skills found on ETL Developer resumes in 2020. Read below to see the full list.

1. Data Warehouse

High Demand
Here's how Data Warehouse is used in ETL Developer jobs:
  • Involved in implementing star schema for the data warehouse using Erwin for Logical/Physical data modeling and Dimensional Data Modeling.
  • Generated Surrogate Keys for composite attributes while loading the data into Data Warehouse using Key Management function.
  • Reviewed business requirements and current data models to develop a framework that supported BI Reporting/ Data warehouse.
  • Coordinated with counterparts on end-to-end design and development of assigned Enterprise Data Warehouse solutions and subject areas.
  • Performed analysis, design, and development activities associated with maintenance, production, and Data Warehouse support.
  • Gathered and analyzed business requirements to design staging database and dimensional model for data warehouse.
  • Implemented the same functional design for a different data warehouse, to replace a legacy database.
  • Consolidated distributed data residing in heterogeneous data sources into the target enterprise Data Warehouse.
  • Created source/target mappings of data from applications into the data warehouse environment.
  • Experienced in developing test plans used in validation of data warehouse solutions.
  • Designed logical and physical models for staging database and data warehouse.
  • Analyzed and designed source code documentation for investment Data Warehouse.
  • Developed the transformation/business logic to load data into data warehouse.
  • Designed and customized data models for data warehouse.
  • Translated business requirements into data warehouse design.
  • Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into Data Warehouse database.
  • Implemented SCD Type 1 and SCD Type 2 methodologies when loading ODS tables, to keep historical data in the data warehouse.
  • Used Data Stage Designer to develop various jobs to extract, transform, integrate and load data into data warehouse database.
  • Converted the business rules into technical specifications for ETL process for populating fact and dimension tables of the data warehouse.
  • Created ETL logic design for data warehouse and data marts in star schema methodology with conformed dimension and fact tables.
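
Many of these bullets revolve around the same core pattern: a star schema in which a central fact table references dimension tables through surrogate keys. A minimal sketch in SQL (Oracle syntax; all table and column names here are hypothetical):

    -- Dimension table keyed by a surrogate key rather than the natural business key.
    CREATE TABLE dim_customer (
        customer_sk    NUMBER        PRIMARY KEY,  -- surrogate key
        customer_id    VARCHAR2(20)  NOT NULL,     -- natural key from the source system
        customer_name  VARCHAR2(100),
        region         VARCHAR2(50)
    );

    -- Fact table at the center of the star, referencing dimensions by surrogate key.
    CREATE TABLE fact_sales (
        customer_sk   NUMBER  NOT NULL REFERENCES dim_customer (customer_sk),
        date_sk       NUMBER  NOT NULL,  -- reference to a date dimension
        sales_amount  NUMBER(12,2),
        units_sold    NUMBER
    );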


2. Informatica

High Demand
Here's how Informatica is used in ETL Developer jobs:
  • Created sessions by extensively using ETL methodology for complete processing of data extraction, transformations and loading using Informatica.
  • Designed, developed, implemented and maintained Informatica PowerCenter and IDQ application for matching and merging process.
  • Developed mappings using the needed transformations in the Informatica tool according to technical specifications.
  • Worked on migrating Informatica mappings between environments for development, testing and production implementation purposes.
  • Developed and documented Data Mappings/Transformations, and Informatica sessions as per the business requirement.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and matching/removal of duplicate data.
  • Created Informatica PowerExchange registrations and data maps, and configured connections, recovery, and tokens.
  • Developed various reusable transformations using the Transformation Developer in the Informatica PowerCenter Designer.
  • Developed mappings/sessions using Informatica Power Center for initial loading and incremental loading.
  • Worked on performance tuning Informatica Mappings to identify and remove processing bottlenecks.
  • Involved in creating mortgage applications and handled an application named Letters using Informatica.
  • Involved in Tuning Informatica Mappings to identify and remove processing bottlenecks.
  • Performed impact analysis and fine-tuned Informatica mappings to implement enhancements.
  • Performed data manipulation using basic functions and Informatica Transformations.
  • Developed ETL mappings, transformations using Informatica Power Developer.
  • Involved in Developing Mappings using Informatica Designer Tool.
  • Developed various mappings and Transformations using Informatica Designer.
  • Key member in defining standards for Informatica implementation.
  • Experienced in designing and developing Informatica IDQ environment.
  • Perform Data Conversion/Data migration using Informatica PowerCenter.
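
Informatica mappings are assembled in the PowerCenter Designer GUI rather than written as code, but the matching and duplicate-removal logic an IDQ process applies can be approximated in plain SQL. A sketch that keeps only the most recently loaded record per natural key (the staging table and its columns are hypothetical):

    -- Remove duplicates, keeping the most recently loaded row per customer_id.
    DELETE FROM stg_customer s
    WHERE EXISTS (
        SELECT 1
        FROM   stg_customer d
        WHERE  d.customer_id = s.customer_id
        AND    d.load_date   > s.load_date
    );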


3. Business Requirements

High Demand
Here's how Business Requirements is used in ETL Developer jobs:
  • Created Technical Design specification documents for data extraction, transformation and loading by interacting with client and understanding the business requirements.
  • Served as team lead, interacting with business analysts to understand the business requirements, and was involved in analyzing requirements to refine transformations.
  • Developed parallel jobs using change capture methodology and various processing stages as per business requirements.
  • Studied and comprehended business requirements through constant interaction with the business, and produced detailed technical specification documents.
  • Analyzed business requirements and created Functional Specifications for Member, Provider, Enrollment, Eligibility and Benefits Services Applications.
  • Interacted with business analysts to gather the business requirements by attending regular meetings with the business community.
  • Worked on understanding business requirements and enhancing the existing data warehouse architecture for a better performance.
  • Analyzed Business Requirements Document and Functional Specification document to develop detailed Test Plan and Test Cases.
  • Developed Aggregations, partitions and calculated members for cube as per business requirements and technical standards.
  • Interacted with the business users on a regular basis to collect business requirements and specifications.
  • Analyzed the business requirement document and created functional requirement document mapping all the business requirements.
  • Worked closely with the Business Analyst to formulate Business Requirements and assimilating the required documentation.
  • Participated in user meetings, gathered business requirements & specifications for the data warehouse design.
  • Coordinated the offshore team, provided training in understanding business requirements, and performed code reviews.
  • Created update strategy and stored procedure transformations to populate targets based on business requirements.
  • Interacted with Business Analyst to understand the business requirements and created Technical Design Document.
  • Interacted with business teams for gathering business requirements and involved in documentation of specifications.
  • Gathered, analyzed business requirements and created mapping documents for the integration process.
  • Created Conversion Mapping Documents and Extract Mapping Documents with respect to Business requirements.
  • Developed High Level Technical Design specification and Low-level specifications based on business requirements.


4. Target Database

High Demand
Here's how Target Database is used in ETL Developer jobs:
  • Performed tuning by eliminating source and target database bottlenecks, mapping bottlenecks and fine-tuned the pipeline logic and transformation settings.
  • Generated data and loaded it into the target database for customer segmentation reporting.
  • Designed and developed ETL Mappings to extract data from Flat files and Oracle to load the data into the target database.
  • Test the source, target databases and the ETL process (Data Extraction, Data Loading and Data Transformations).
  • Understand the Technical Specifications and develop the ETL packages for Extraction Transformation, Cleansing and Loading process into target database.
  • Worked for ETL process of data loading from different sources and data validation process from staging area to Target database.
  • Created new mappings according to business rules to extract data from different sources, transform and load target databases.
  • Created PL/SQL Stored procedure construct to automate the checking of status of target database before loading data into it.
  • Tuned mappings, transformations and recommended tuning options of source/target database to DBA team for obtaining optimum performance.
  • Monitored batches and sessions for weekly and Monthly extracts from various data sources to the target database.
  • Used Version mapping to update the slowly changing dimensions to keep full history to the target database.
  • Scheduled Sequential and Concurrent sessions and batches for loading from source to target database through server manager.
  • Created and ran Sessions to load Target databases, while ensuring clean data loading into the Warehouse.
  • Wrote BTEQ scripts for validation and data integrity checks between source and target databases, and for report generation.
  • Optimized mappings, sessions/tasks, source, and target databases as part of the performance tuning.
  • Extracted data from the Sales department to flat files and loaded the data to the target database.
  • Wrote SQL scripts to extract and compare data from source and target databases for testing purposes.
  • Performed Verification, Validation, and Transformations on the Input data before loading into target database.
  • Created mappings with proper DI transformations between source and target databases with Business Objects Data Services.
  • Involved in performance tuning on the source and target database for querying and data loading.
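
One bullet above describes automating a status check on the target database before loading into it. A minimal PL/SQL sketch of that idea (the load_control table and its columns are assumptions for illustration):

    CREATE OR REPLACE PROCEDURE check_target_status IS
        v_status  load_control.status%TYPE;
    BEGIN
        -- Read the current state of the target from a control table.
        SELECT status INTO v_status
        FROM   load_control
        WHERE  target_name = 'SALES_DW';

        -- Abort the load if the target is not ready to receive data.
        IF v_status <> 'READY' THEN
            RAISE_APPLICATION_ERROR(-20001,
                'Target not ready for load: status = ' || v_status);
        END IF;
    END;
    /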


5. Unix

High Demand
Here's how Unix is used in ETL Developer jobs:
  • Developed audit scripts in UNIX which are helpful to effectively monitor everyday data warehouse activities.
  • Developed application using UNIX operating system including scripting.
  • Worked with UNIX scripts to purge tables and to automate the refresh of materialized views.
  • Created a generic wrapper script framework for running all the ETL mapping graphs, using UNIX and Oracle DB configuration tables.
  • Developed Unix Shell scripts to invoke different ETL processes to automate the whole process for better handling of overall ETL process.
  • Involved in creation of various UNIX scripts which help the ETL scheduling jobs and help in initial validation of various tasks.
  • Used mainframe tools such as NDM Connect Direct and Unix script wrapper to transfer the outbound files using secure EFT process.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the data load processes to target.
  • Modified UNIX shells to implement new landing path for various feeds with new FTP connections and for proper log maintenance.
  • Used UNIX shell scripting, FTP, file management and automation of ETL processes in CA Service desk tools.
  • Worked on writing UNIX scripts for finding out space usage / Collecting CPU Statistics/ Data Archival and Clean-up process.
  • Developed Mappings that extract data to UNIX server and monitored the Daily, Weekly, Monthly and Quarterly Loads.
  • Channelized the different ETL components and programmed jobs to execute in the required sequence using UNIX shell scripting.
  • Used the UNIX operating system and various UNIX commands to filter, search, find, and replace.
  • Created UNIX Scripts for creating the indexes, altering the default table spaces and analyzing the tables.
  • Experience with server log files, editing UNIX scripts, FTP files and checking space on server.
  • File System Set Up - migrated all UNIX scripts from the old environment to new production system.
  • Created UNIX Shell scripts to abort the job upon critical warnings and to transfer files between servers.
  • Developed UNIX scripts and SQL scripts to configure the database tables to perform different kinds of checks.
  • Developed and maintained jobs using UNIX shell scripting to load daily, weekly, monthly data.



6. Lookup

High Demand
Here's how Lookup is used in ETL Developer jobs:
  • Worked towards optimal performance when using Stages like LOOKUP, JOIN, and MERGE.
  • Validated data using pivot tables and VLOOKUP in Excel.
  • Implemented slowly changing dimensions methodology to keep track of historical data using Connected, Unconnected lookup transformations with Update Strategy transformation.
  • Developed mappings using connected/unconnected lookup, router, joiner, expression, filter, sequence generator transformations.
  • Worked extensively on connected, unconnected and dynamic lookup transformations for better performance.
  • Worked extensively on lookup and update strategy transformations to implement SCD Type2.
  • Created connected, unconnected and dynamic lookup transformation for better performance.
  • Performed various update strategies using Lookup and Update Strategy transformations.
  • Created complex transformations using connected / unconnected lookups / procedures.
  • Worked extensively with the connected lookup Transformations using dynamic cache.
  • Implemented Lookup transformation to update already existing target tables.
  • Used Lookup transformations to manipulate the information.
  • Created complex transformations using connected/unconnected Lookups.
  • Developed Parallel jobs using Parallel stages like: Merge, Join, Lookup, Transformer, Funnel, Oracle Enterprise Stage.
  • Used Transformation Developer to create the Joiner, filters, lookups, Expressions and Aggregation transformations that are used in mappings.
  • Used various transformations like Source Qualifier, Joiner, Expression, Aggregate, Lookup, Filter, and Update Strategy etc.
  • Designed and developed complex ETL mappings making use of transformations like router, update strategy, lookup, expression, joiner.
  • Created and deployed SSIS packages using various Transformations such as Fuzzy Lookup, Fuzzy Grouping, Aggregate and Derived Column Transformations.
  • Used various Transformations like Expression, Filter, Joiner and Lookups for better data massaging to migrate clean and consistent data.
  • Developed the various mappings using transformation like source qualifier, joiner, filter, router, Expression and lookup transformations etc.
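
In SQL terms, a connected lookup against a reference table behaves much like a left outer join: every source row passes through, and the looked-up columns come back NULL when no match is found. A sketch with hypothetical names:

    SELECT s.order_id,
           s.customer_id,
           c.customer_name              -- value returned by the lookup
    FROM   stg_orders s
    LEFT OUTER JOIN dim_customer c
           ON c.customer_id = s.customer_id;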


7. Windows XP

High Demand
Here's how Windows XP is used in ETL Developer jobs:
  • Prepared a production monitoring and support handbook. Environment: Informatica PowerCenter 6.1/7.1, Oracle, Windows XP, MS Office Suite.
  • Performed detailed unit/integration testing. Environment: Informatica PowerCenter 8.6.0, Oracle 10g, Windows XP, UNIX.
  • Created unit test cases and captured the test log. Environment: Informatica, Hyperion 11.1.1.3, Oracle, Windows XP, UNIX.
  • Created unit test cases and captured the test log. Environment: Cognos 8.4, Oracle, Windows XP, UNIX.
  • Managed and coordinated weekly/monthly enhancement releases. Tools: Informatica PowerCenter 8.6.1, MySQL, PuTTY, Windows XP.
  • Created unit test cases and captured the test log. Environment: Informatica 8.6/Cognos, Oracle, Windows XP, UNIX.
  • Project location: Bangalore, Karnataka, India. Hardware: Dell. Operating system: Windows XP, Linux.


8. SQL

High Demand
Here's how SQL is used in ETL Developer jobs:
  • Incorporated SQL tuning recommendations for data retrieval by using indexing strategy and using hints.
  • Encapsulated frequently used SQL Statements into stored procedures thereby reducing the query execution time.
  • Automated job sequence and monitored/audited SQL tasks.
  • Created and Designed SSAS Cubes, Data Source and Data Source Views Using SQL Server Analysis Services 2008 (SSAS).
  • Troubleshot and debugged Excel macros that run SQL queries on Oracle to check for validations before the loads.
  • Developed various T-SQL stored procedures, functions, views and adding/changing tables for data extraction, transformation, load and reporting.
  • Involved in knowledge transfer and training to production support team Environment: MS SQL Server 2000, MS DTS and Visual Basic
  • Developed and maintained T-SQL stored procedures, functions, triggers and scripts, which were integral to the client's accounts.
  • Created mappings for various sources like Oracle, SQL server, flat files to load and integrate data to warehouse.
  • Created full upgrade process Acclaim version 4.19 to version 4.34 for Lake County, Florida Recording Office 2014- SQL 2008R2.
  • Performed unit testing and system testing by creating automated test scripts in SQL and manual test cases for all modules.
  • Involved in SQL Tuning by creation of indexes, rebuilding Indexes using Explain Plan, SQL Trace and TKPROF Tools.
  • Developed SQL Stored Procedures, Functions, Packages, Cursors, Triggers and Records to implement business rules into application.
  • Used SQL Override and reduced the number of transformations used in the mappings and improved the performance of the sessions.
  • Performed tuning of SQL queries for speedy extraction of data, and troubleshot long-running sessions and fixed the issues.
  • Migrated and Converted Crystal Reports into SQL Server Reporting Services (SSRS) by using MS SQL Server Reporting Services.
  • Analyzed the source fields and created SQL queries for field to field validation by referring source to target mapping document.
  • Used legacy systems, Oracle, and SQL Server sources to extract the data and to load the data.
  • Performed T-SQL tuning and optimized long-running report queries in SQL Server 2005.
  • Experienced in deploying packages to both SQL Server and the file system, and in scheduling packages using SQL Server Agent.
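
Two recurring themes above are indexing strategy and encapsulating frequent statements in stored procedures. A brief T-SQL sketch of both (object names are hypothetical):

    -- Index the columns used in frequent lookups so the optimizer can seek
    -- rather than scan the whole table.
    CREATE INDEX ix_orders_customer ON orders (customer_id, order_date);

    -- Wrap a frequently executed query in a stored procedure.
    CREATE PROCEDURE dbo.get_customer_orders
        @customer_id INT
    AS
    BEGIN
        SELECT order_id, order_date, total_amount
        FROM   orders
        WHERE  customer_id = @customer_id
        ORDER  BY order_date DESC;
    END;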


9. PL/SQL

High Demand
Here's how Pl/Sql is used in ETL Developer jobs:
  • Designed and developed PL/SQL functions/ stored procedures/ cursors/ triggers/ packages.
  • Developed PL/SQL stored procedures used by front-end and middle-tier developers.
  • Developed PL/SQL procedures and functions to facilitate specific requirements.
  • Developed Complex PL/SQL objects like Stored Procedures, Functions, and Packages in oracle for implementing/building logic for complex business requirements.
  • Involved in periodic Performance Tuning and Unit Testing of the PL/SQL code by using appropriate joins, indexes, and partitions.
  • Constructed Oracle PL/SQL code to write Stored Procedures and Packages using cursors, constructs and loops for data transformation and manipulation.
  • Developed DDL's, DML's, PL/SQL stored procedures, indexes for ETL operations on Oracle and SQL Server databases.
  • Design and Development of data validation, load processes, test cases using PL/SQL, Stored Procedures, Functions, Triggers.
  • Created various Oracle database SQL, PL/SQL objects like Indexes, stored procedures, views and functions for Data Import/Export.
  • Optimized SQL statements and PL/SQL blocks, and created and modified triggers and stored procedures to enhance the performance of the project.
  • Implemented PL/SQL triggers and other data base objects on Spectrum Justice System (SJS) for capturing changed data.
  • Coded database Triggers, Packages, Functions and Procedures using PL/SQL and maintained the scripts for various data feeds.
  • Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
  • Worked on change requests for enhancing the existing Oracle Pl/SQL application and configured batch jobs using UNIX shell scripting.
  • Utilized PL/SQL Functions & Procedures for the ETL process and for Data Validations between source and target tables.
  • Created PL/SQL programs like procedures, function, packages, and cursors to extract data from Target System.
  • Analyze, design and develop stored procedures, functions, packages, materialized views and triggers using PL/SQL.
  • Wrote/Modified SQL, PL/SQL, stored procedures & triggers, cursors for implementing business rules and transformations.
  • Developed, tested Stored Procedures, Functions and packages in PL/SQL for Data ETL Performed Data conversions.
  • Created SQL scripts to create tables and create triggers and PL/SQL scripts to insert or update tables.
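
A representative shape for the procedures described above is a cursor loop that applies a transformation and loads a target table. A minimal Oracle PL/SQL sketch (table and column names are illustrative only):

    CREATE OR REPLACE PROCEDURE load_monthly_summary (p_month IN DATE) IS
        CURSOR c_sales IS
            SELECT customer_id, SUM(amount) AS total_amount
            FROM   stg_sales
            WHERE  TRUNC(sale_date, 'MM') = TRUNC(p_month, 'MM')
            GROUP  BY customer_id;
    BEGIN
        FOR r IN c_sales LOOP
            -- Apply the business rule and load one summary row per customer.
            INSERT INTO monthly_summary (customer_id, sale_month, total_amount)
            VALUES (r.customer_id, TRUNC(p_month, 'MM'), r.total_amount);
        END LOOP;
        COMMIT;
    END;
    /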


10. Toad

High Demand
Here's how Toad is used in ETL Developer jobs:
  • Used Toad extensively for query optimization.
  • Experience in extracting data from a wide range of sources such as flat files, oracle and query processing using TOAD.
  • Used PL/SQL Developer and TOAD to analyze and test the loaded data of source as well as target RDBMS databases.
  • Created and reviewed scripts to create new tables, views, queries for new enhancement in the application using TOAD.
  • Developed the code as per requirements and used Oracle and Toad to fine tune the performance of queries.
  • Conducted Performance tuning of application by modifying the SQL statements and using Explain Plan and TOAD Software.
  • Designed tables, indexes and constraints using TOAD and loaded data into the database using SQL*Loader.
  • Developed PL/SQL Views and Materialized Views with fast and Incremental Refresh Rates using Oracle TOAD.
  • Used Toad for creating PL/SQL (trigger, sequence, stored procedure) objects.
  • Used TOAD to run SQL queries and validate the data in warehouse and mart.
  • Used TOAD 8.0 to analyze the execution of SQL queries for performance tuning purposes.
  • Used TOAD to develop and debug oracle SQL functions, procedures and packages.
  • Worked extensively with Toad, PLSQL Developer and MS SQL Server Management Studio.
  • Worked on Toad to write Oracle SQL and PL/SQL statements and scripts.
  • Utilized Quest tool (Toad 8.0) for database monitoring and tuning.
  • Validate the required data at the database level by using tool Toad.
  • Used Toad and MS SQL Server Management Studio to analyze the Oracle database.
  • Used TOAD to manage FTP file transfers to and from source systems.
  • Used Toad for data quality verification and unit testing the data.
  • Have written Complex SQL Queries using TOAD to validate the loads.
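
Several bullets mention tuning with Explain Plan from inside TOAD; underneath, the tool issues standard Oracle statements such as these (the query being analyzed is hypothetical):

    -- Ask the optimizer for its plan for a candidate query...
    EXPLAIN PLAN FOR
    SELECT customer_id, COUNT(*)
    FROM   orders
    WHERE  order_date >= DATE '2020-01-01'
    GROUP  BY customer_id;

    -- ...then display it; TOAD presents the same information graphically.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);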


11. Source Qualifier

High Demand
Here's how Source Qualifier is used in ETL Developer jobs:
  • Performed various transformations like Joiner transformation, aggregate transformation, source qualifier transformation, Filter transformation, Expression transformation.
  • Created simple and complex Source Qualifier queries for ETL jobs and test queries for data validation.
  • Used most of the transformations such as the Source Qualifiers and Filter Transformation as per the Business requirement.
  • Optimized and Tuned SQL queries used in the source qualifier of certain mappings to eliminate Full Table scans.
  • Created Mappings for Initial load in Power Center Designer using the transformations Expression, Router and Source Qualifier.
  • Implemented different Performance tuning methods such as filtering the data right at the source qualifier, implemented partitioning.
  • Configured Source Qualifier transformation to read data from the ODS as per the business requirement on weekly basis.
  • Used Source Qualifier Transformation extensively to filter data at Source level rather than at Transformation level.
  • Enhanced the performance of mapping by implementing pass through partition at the source qualifier transformation.
  • Used SQL Override in Source qualifier to customize SQL and filter data according to requirement.
  • Created partitions, SQL overrides in the Source Qualifier, and session partitions for improving performance.
  • Developed mappings using different transformations like expression, joiner, source qualifier, filter.
  • Developed SQL Overrides for the Source Qualifiers Based on the Business Requirements.
  • Used SQL Override in Sorter, Filter & in Source Qualifier Transformation.
  • Implemented complex ETL logic using SQL overrides in the source Qualifier.
  • Implemented SAP BW extraction logic using Application Source qualifier.
  • Used SQL Override function in Source Qualifier Transformation.
  • Developed SQL join queries in Source Qualifier.
  • Worked on simple and complex SQL queries and joins to build overrides in the Source Qualifier.
  • Involved in preparing stored procedures and complex SQL to filter data at the source qualifier level itself.
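
A SQL override replaces the Source Qualifier's default generated query so that joining and filtering happen in the source database instead of in downstream transformations. A hypothetical override for a weekly incremental read (Oracle syntax):

    -- The default query would select the connected ports as-is; this override
    -- adds a join and a filter so only the needed rows leave the source.
    SELECT o.order_id,
           o.customer_id,
           o.order_date,
           c.region
    FROM   orders o,
           customers c
    WHERE  c.customer_id = o.customer_id
    AND    o.order_date >= TRUNC(SYSDATE) - 7;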


12. Aggregator

High Demand
Here's how Aggregator is used in ETL Developer jobs:
  • Used ODBC, TRANSFORMER, SEQUENTIAL FILE, AGGREGATOR, and HASH FILE stages for designing the jobs in DataStage Designer.
  • Configured incremental aggregator transformation functions to improve the performance of data loading.
  • Created and used complex aggregator transformation in various mappings.
  • Used extensively Aggregator/Sort/Merge stages for data processing.
  • Worked on different transformations like aggregator, expression, lookup, filter, router, joiner, union, and normalizer.
  • Created mappings using various transformations like Aggregator, Joiner, Filter, Expression, Router, Look up and Update Strategy.
  • Used aggregator, router, joiner, stored procedures, look-up, filter, source qualifier and update strategy transformations extensively.
  • Developed DataStage Server jobs extensively using various stages like Transformer, Hash File, Aggregator, Sequential stage, ORAOCI etc.
  • Used aggregate functions like Avg, Min, Max, First, Last, and Count in the Aggregator Transformation.
  • Used different types of Stages such as Sequential File, Sort Aggregator, Transformer and ODBC to develop different jobs.
  • Tested various jobs using DataStage stages like change capture, joins, Aggregators, filters, Sequential files.
  • Used various transformations Aggregator, Expressions, Filters, Sorter, Sequence Generator, Joiner, and Router.
  • Developed various jobs using stages Hashed file, Sequential file, XML, Aggregator, Pivot and Sort.
  • Developed jobs to aggregate the source data using Aggregator and handled duplicate records using Remove Duplicate stage.
  • Developed various jobs using Dynamic RDBM, Universe, Hashed file, Aggregator, Sequential file stages.
  • Developed various jobs using DataStage stages like ODBC, Hashed file, Aggregator, Sequential file.
  • Used Sorter and Aggregator transformations in combination for performance tuning of aggregations used in the mappings.
  • Developed mappings using transformations such as the Source qualifier, Aggregator, Expression and SQL transformation.
  • Used the Expression and Aggregator transformations to summarize the volume of stocks traded in a Day/Month/Year.
  • Created various Transformations like Joiner, Aggregator, Expression, Filter, and Update Strategy.
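
An Aggregator's group-by ports and aggregate expressions map directly onto a SQL GROUP BY. Sketching the stock-volume example above with a hypothetical trades table:

    SELECT TRUNC(trade_date)  AS trade_day,
           COUNT(*)           AS trade_count,
           SUM(volume)        AS total_volume,
           MIN(price)         AS low_price,
           MAX(price)         AS high_price,
           AVG(price)         AS avg_price
    FROM   stock_trades
    GROUP  BY TRUNC(trade_date);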


13. Update Strategy

High Demand
Here's how Update Strategy is used in ETL Developer jobs:
  • Worked extensively with Update Strategy transformation for implementing inserts and updates.
  • Designed and developed complex Aggregate, Join, Router, look up and Update strategy transformation based on the business rules.
  • Created various other Transformations like Aggregate, Expression, Filter, Update Strategy, Stored Procedures, and Router etc.
  • Used Update Strategy DD_INSERT, DD_UPDATE to insert and update data for implementing the Slowly Changing Dimension Logic.
  • Used the Update Strategy Transformation to update the slowly changing dimension tables; Type 2 updates were used most often.
  • Used Update Strategy to insert, delete, update and reject the items based on the requirement.
  • Created various transformations such as Update Strategy, Look Up, Joiner, Filter and Router Transformations.
  • Configured the mappings to handle the updates to preserve the existing records using Update Strategy Transformation.
  • Developed and implemented Update strategy plan for the Data Warehouse project to implement slowly growing dimension.
  • Used various ETL Transformations such as Look-Up, Joiner, Aggregate, Filter and Update Strategy.
  • Worked with Update strategy transformation using functions like DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.
  • Designed and developed Expression, Lookup, Filter, Router, and Update Strategy transformations.
  • Used update strategy to effectively migrate slowly changing data from source system to target Database.
  • Worked with the Joiner, Router and Update Strategy Transformation to implement data synchronization.
  • Used Update strategy and Target load plans to load data into Type-2 /Type1 Dimensions.
  • Created IDQ transformations using Expression, Decision, Exception, and update strategy transformations.
  • Used update strategy transformation to effectively migrate data from source to target.
  • Used the update strategy to effectively load data from source to target.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation.
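
The DD_INSERT/DD_UPDATE flagging these bullets describe is roughly what a SQL MERGE does in one statement. A sketch of the Type 1 (overwrite) case with hypothetical tables; Type 2 would add effective-date and current-flag handling:

    MERGE INTO dim_customer t
    USING stg_customer s
       ON (t.customer_id = s.customer_id)
    WHEN MATCHED THEN                      -- the DD_UPDATE path
        UPDATE SET t.customer_name = s.customer_name,
                   t.region        = s.region
    WHEN NOT MATCHED THEN                  -- the DD_INSERT path
        INSERT (customer_id, customer_name, region)
        VALUES (s.customer_id, s.customer_name, s.region);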


14. Test Cases

High Demand
Here's how Test Cases is used in ETL Developer jobs:
  • Performed unit testing and reconciliation, thereby documenting relevant test cases and defect log documents.
  • Analyzed productivity and quality of test cases to determine efficiency.
  • Created test cases, implementation plan and post-implementation validation plan.
  • Involved in the preparation of Test cases and Test results using Mercury Quality Center for Component Integration Testing and Unit Testing.
  • Created test cases (functional, unit, and regression test cases) that were reusable for future ETL developments and deployments.
  • Prepared test cases and performed unit testing to confirm the functionality of this initiative and that no balance calculation was impacted.
  • Created the unit test cases (UTC) and tested mapping to assure that data is loaded as per mapping design.
  • Coordinated with testing team to review the testing strategy and test cases and also ensured maximum number of scenarios are covered.
  • Created documents that have the detail explanation of the mappings, test cases and the expected results and the actual results.
  • Developed unit and system test cases, using System Procedures to check data consistency with adherence to the data model defined.
  • Worked with the Quality Assurance team to build the test cases to perform unit, Integration, functional and performance Testing.
  • Involved in Testing and Test Plan, Test Cases and Test Scenarios Preparation and Process Improvement for the ETL developments.
  • Designed unit test cases for each ETL map and Involved in Integration testing and system testing phases of the project.
  • Have written Test cases and expected results to help the testing team to validate the mapping in the Test environment.
  • Prepared test cases and involved in unit testing of mappings, system testing, Unit Testing and User Acceptance Testing.
  • Test the code and prepare Unit test cases, update the result and make the code ready for migration.
  • Prepared Test Case and Test Data and executed the test cases during Unit, System and User Acceptance Testing.
  • Developed test cases based on transformation rules related to extracting and loading of data Between Source and Target Systems.
  • Involved in Unit Testing and created various test cases and reviewed and fixed issues in the ETL code.
  • Executed manual test cases by using positive and negative data inputs and verified actual results with expected results.
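
ETL test cases often boil down to reconciliation queries between source and target. Two common checks, sketched with hypothetical table names (Oracle syntax):

    -- Row-count reconciliation: the totals should match after the load.
    SELECT (SELECT COUNT(*) FROM src_orders) AS source_rows,
           (SELECT COUNT(*) FROM tgt_orders) AS target_rows
    FROM   dual;

    -- Content check: rows present in the source but missing from the target.
    SELECT order_id, customer_id FROM src_orders
    MINUS
    SELECT order_id, customer_id FROM tgt_orders;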


15. DB2

Average Demand
Here's how DB2 is used in ETL Developer jobs:
  • Loaded consolidated data using DB2 loaders.
  • Developed jobs using Join, Merge, XML Input, Transformer, Funnel, DB2 Connector, DB2 Enterprise, Copy, and Filter stages.
  • Designed and developed approaches to acquire data from new sources like Mainframe (DB2), and AS400 (DB2).
  • Design, testing/documentation, and test plan review. Environment: AIX, DataStage, SQL, Base SAS, DB2 UDB, Oracle.
  • Involved in the migration of data from user system to DB2 system through ETL process performing both Initial and incremental loads.
  • Extracted and loaded data from/to various sources like DB2 database, Oracle Database, Flat Files and loaded to target warehouse.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the DB2 database.
  • Developed the ETL logic to load data from various heterogeneous systems such as DB2, Flat files, Oracle 9i.
  • Extracted data from various sources like DB2, Oracle, XML and Flat Files and loaded to target tables.
  • Involved working on IBM DB2 and Oracle databases, designing and creating tables, relations, constraints, indexes.
  • Transformed the data using Ab Initio v2.13 on AIX and loaded the data into DB2 tables on AIX and MVS.
  • Extracted data from fixed width flat files to load into the DB2 stage, then to the data mart.
  • Extracted the data from various heterogeneous sources like DB2, SQL Server and Flat files to the target database.
  • Extracted data from Flat files, Oracle, DB2 and SQL server sources to build an Operation Data Source.
  • Involved in handling and selecting heterogeneous data source like Oracle, DB2, SQL Server and Flat files.
  • Extracted data from different sources such as Oracle, DB2, XML, flat files using Power Center.
  • Extracted the Data items from different sources like DB2 and Flat/VSAM files and loaded them into Oracle target.
  • Extracted data from different sources like Oracle, SQL Server, DB2 and Flat files loaded into DWH.
  • Created mappings to pull the data from Sources like Mainframes and DB2 tables by following the mapping standards.
  • Involved in extraction, cleansing and loading of data into Oracle database from flat files and DB2 tables.


16. Teradata

Average Demand
Here's how Teradata is used in ETL Developer jobs:
  • Used BTEQ sessions to access the Teradata database.
  • Performed application-level DBA activities, creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Involved in the ongoing delivery of migrating client mini-data warehouses or functional data marts from oracle environment to Teradata.
  • Extracted data from Oracle database transformed and loaded into Teradata database according to the specifications.
  • Developed and tested various stored procedure as part of process automation in Teradata.
  • Worked with Teradata extracting data and loading into different database.
  • Loaded data to Staging area using Teradata parallel transporter.
  • Coded Teradata functions using macros and stored procedures.
  • Involved in developing Teradata load utility scripts.
  • Worked on Teradata optimization and performance tuning.
  • Developed loading scripts using Teradata utilities.
  • Created Teradata Loader connections in Informatica.
  • Generated reports using Teradata BTEQ.
  • Developed Teradata TPT utility scripts to move the data from PRD environment to the lower environments for the purpose of testing.
  • Used SQL to query the databases and do as much crunching as possible in Teradata, using very complicated SQL.
  • Designed various mappings for extracting data from various heterogeneous sources involving Flat files, Oracle, SQL Server and Teradata.
  • Project involved designing and developing ETL jobs to load data from Oracle / MS SQL to DB2 database and Teradata.
  • Used SQL Assistant (Query man) front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
  • Used ETL to standardize data from various sources and load into Data Stage area, which was in Teradata.
  • Designed Complex Teradata SQL's to pull the data from source systems and to populate data into target tables.
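
BTEQ scripts like the ones mentioned above mix SQL with dot commands for session control and error handling. A minimal sketch (the logon parameters and table are placeholders):

    .LOGON tdprod/etl_user,password

    -- Validate the day's load before downstream jobs run.
    SELECT load_date, COUNT(*) AS row_count
    FROM   edw.daily_sales
    WHERE  load_date = CURRENT_DATE
    GROUP  BY load_date;

    -- Exit with a non-zero return code if the validation query failed.
    .IF ERRORCODE <> 0 THEN .QUIT 8

    .LOGOFF
    .QUIT 0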


17. Mapplet Designer

Average Demand
Here's how Mapplet Designer is used in ETL Developer jobs:
  • Worked on Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet Designer.
  • Used various transformations, Mapplet Designer, Transformation Developer.
  • Worked on Power Center client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.
  • Worked extensively with Designer tools like Source Analyzer, Transformation Developer, Mapping and Mapplet Designers.
  • Used mapping and mapplet designer to generate different mappings for different loads.
  • Created Reusable Transformations and Mapplets in the designer using transformation developer and mapplet designer according to the business requirements.
  • Developed mappings/Reusable Objects/Transformations/mapplets by using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter 8.6.1.
  • Created Reusable transformations and Mapplets using transformation developer and Mapplet Designer throughout project life cycle.
  • Created complex reusable transformations and Mapplets in Transformation Developer, and Mapplet Designer, respectively.
  • Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations.
  • Developed reusable transformations and mapplets using transformation developer and mapplet designer tool.
  • Developed Mapplets using Mapplet designer.
  • Created new mapping design / schema using various Informatica Designer tools such as Source Analyzer, Mapplet Designer and Mapping Designer.
  • Worked on Informatica tools like Source Analyzer, Target Analyzer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Worked with Informatica PowerCenter client tools like Source Analyzer, Warehouse Designer, Mapping designer, Mapplet designer etc.
  • Used Informatica PowerCenter 8.6x client tools - Source Analyzer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Developed mappings/Reusable Objects/Transformations/mapplets by using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter 8.6.1.
  • Developed mappings/Transformations/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica PowerCenter.
  • Created and edited mapplets using the Informatica PowerMart Mapplet Designer.


18. Repository

Average Demand
Here's how Repository is used in ETL Developer jobs:
  • Configured repositories and security domains, folders, user groups, and security levels to manage repositories using Repository Manager.
  • Build a centralized Distribution data repository enabling data driven insights to guide and optimize our future supply chain.
  • Develop and maintain technical documentation on WellPoint Knowledge Repository; compile documentation for design changes or creation criteria.
  • Developed the metadata repository, configuring metadata for the Presentation Layer, Business Model Layer, and Physical Data Model.
  • Report Manager Application is a central repository built to consolidate all reporting platforms for the National Financial.
  • Migrated mappings from Development Repository to Test Repository and also from Test Repository to Production Repository.
  • Developed critical company-wide Operational Data Store for uniform, source system insensitive corporate data repository.
  • Created and managed the local repositories and permissions using Repository Manager in Oracle Database.
  • Performed Repository object migration from Development to testing and testing to production Environments.
  • Created and managed the global and local repositories and permissions using Repository Manager.
  • Maintained the Development, Test and Production Repositories using Repository server administration console.
  • Created a sandbox and edited sandbox parameters according to the repository; extensive exposure to EME.
  • Involved in migration of mappings and sessions from development repository to production repository.
  • Created new repositories and new folders within the repositories using Repository Manager.
  • Created global repository, Groups, Users assigned privileges using repository manager.
  • Created objects in the repository by merging and importing them into MicroStrategy repositories.
  • Involved in code migration from development to higher environments using Repository Manager.
  • Configured Repository Manager, created folders and managed objects in repository manager.
  • Created repository created and maintained users, domains and database connections.
  • Maintained Development, Test and Production mapping migration using Repository Manager.


19. Source Systems

Average Demand
Here's how Source Systems is used in ETL Developer jobs:
  • Worked on data mapping from source systems to Data Warehouse and to external interfaces per business requirements and technical specifications.
  • Involved in analysis of source systems, business requirements and identification of business rules and creating low-level specifications.
  • Provide staging solutions for Data Validation techniques and Cleansing applications for different source systems.
  • Prepared specification documents for different mappings between Source Systems and Data Warehouse.
  • Developed data Mappings between source systems and warehouse components using Mapping Designer.
  • Reviewed multiple data source systems and recommended data acquisition and transformation strategy.
  • Developed data mappings between source systems and warehouse components for the BACARDI application.
  • Analyzed variety of source systems and designed target model accordingly.
  • Worked closely with data population source systems and warehouse components.
  • Created report requirements documentation and physical mapping from source systems.
  • Developed data Models/Mappings between source systems and warehouse components.
  • Reviewed source systems and proposed data acquisition strategy.
  • Identified data quality issues and provided solutions to remove junk data from source systems to satisfy the DW and KPI needs.
  • Developed ETL interfaces to integrate master data from Oracle, SQL Server, and Flat file source systems into Oracle database.
  • Involved in data Extraction and Transformation from different source systems and Loading the same data to the database using DataStage tool.
  • Worked on PowerExchange to create data maps, pull the data from source systems, and transfer it into the staging area.
  • Lead ETL Developer on several projects that extracted data from heterogeneous source systems, transforming and finally loading into data marts.
  • Analyze data sources from multiple source systems and create technical design and data mapping documents for ETL processes into the DW.
  • Designed the system components for the extract/transform or conversion of data from source systems to the target application for 48 interfaces.
  • Source systems usually include data related to patients and patient visits, such as diagnoses, procedures, referrals, and treatments.


20. Sequence Generator

Average Demand
Here's how Sequence Generator is used in ETL Developer jobs:
  • Developed stored procedures for generic values instead of using the sequence generator transformations.
  • Used Joiner Transformations to extract data from multiple sourced and Sequence Generator Transformation to create unique primary key values.
  • Created transformations like Aggregate, Expression, Filter, Sequence Generator, Joiner, and Stored procedure transformations.
  • Used the Sequence Generator transformation in generating the Surrogate Keys.
  • Generated sequence numbers using Informatica logic without using the sequence generator.
  • Created Primary keys using Sequence Generator and loaded dimension from flat file using source analyzer and Expression builder in Informatica.
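
On the database side, the same surrogate-key job is done by a sequence. A hypothetical Oracle equivalent of a Sequence Generator transformation:

    CREATE SEQUENCE customer_sk_seq START WITH 1 INCREMENT BY 1 CACHE 100;

    -- Assign the next surrogate key to each row loaded into the dimension.
    INSERT INTO dim_customer (customer_sk, customer_id, customer_name)
    SELECT customer_sk_seq.NEXTVAL, customer_id, customer_name
    FROM   stg_customer;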


21. XML

Average Demand
Here's how XML is used in ETL Developer jobs:
  • Managed Package Configurations in XML Files to efficiently Promote Unit Tested Packages to Production Environment from Development Environment.
  • Created XML Package configuration files to update database connectivity information.
  • Created XML package configurations and error handling using event handlers for OnError, OnWarning, and OnTaskFailed event types.
  • Extracted data from multiple sources, which include relational sources, flat files and XML files into target Oracle 8i/9i database.
  • Experience in extracting the data from SQL Server and using XML and present it to the end user in XML format.
  • Developed FTP scripts to move files across different servers, XML files and Excel files to target Oracle Data Warehouse.
  • Loaded data from XML, CSV files using derived columns, Data conversion, Pivot transformation by creating SSIS package.
  • Involved in pulling data from XML files, flat Files, SQL Server into Data warehouse and then Data Mart.
  • Developed various graphs which include extracting various XML files, CSV, flat files and loading it into the database.
  • Used heterogeneous sources Oracle, XML Files, Flat Files as source and imported stored procedures from Oracle for transformations.
  • Extracted data from different data sources such as Flat files, XML, Excel, oracle 10g and SQL Server.
  • Worked with external vendor to make sure XML file is loaded into the Legal Exchange system properly without any issues.
  • Developed mappings to read data from various sources including XML, Flat files, SQL server and oracle tables.
  • Imported metadata from different sources such as Relational Databases, XML Sources and impromptu Catalogs into Frame Work Manager.
  • Edited and deployed SSIS 2008 R2 packages' native XML code as a fast-track approach to production support.
  • Extracted XML data from different sources such as messaging system TIBCO, files and databases using XML Parser Transformation.
  • Populated data from Relational Sources, flat files, XML, CSV to staging tables then load into target.
  • Experience in analyzing Business specification documents, developing test plans defining test cases, developing and XML test scripts.
  • Worked with the flat files in both the direct and indirect methods and also worked with XML files.
  • Designed and developed SSIS Packages to download data from variety of sources like XML, Excel and OLEDB.
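
When the target is Oracle, XML sources like those above can be shredded into relational rows with XMLTABLE. A sketch in which the staging table, its XMLTYPE column, and the document structure are all assumptions:

    SELECT x.order_id, x.amount
    FROM   xml_staging s,
           XMLTABLE('/orders/order'
                    PASSING s.xml_doc
                    COLUMNS order_id NUMBER       PATH '@id',
                            amount   NUMBER(12,2) PATH 'amount') x;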


22. Schema

Average Demand
Here's how Schema is used in ETL Developer jobs:
  • Created and implemented target definitions based on Star schema design and developed aggregate tables to accommodate storage of summarized data.
  • Worked in creating the consolidated script for reporting schema creation, table relationships establishment and indexing for performance improvements.
  • Involved in Data Warehouse landing zone schema Design and provided Data granularity to accommodate time phased CDC requirements.
  • Participated in design of staging databases and Data Warehouse/data marts using Star Schema/Snowflake schema in data modeling.
  • Participated in system requirements and design specifications and also involved in the development of Star Schema.
  • Managed star schema/snowflake data warehousing projects for medium and large corporations in diverse industries.
  • Designed a Data warehouse using a Star schema methodology.
  • Contributed significant design inputs for dimensional db schema design.
  • Developed data marts extensively using Star schema.
  • Developed Star schema as per business requirements.
  • Created the Multi-Dimensional data model (STAR Schema) and enhanced it to meet the growing requirements due to corporate mergers.
  • Prepared the DDL's for the staging/work tables and coordinated with DBA for creating the development environment schema and data models.
  • Develop Logical and Physical data models that capture current state/future state data elements and data flows using Erwin / Star Schema.
  • Involved in Dimensional Modeling using Star schema and Snowflake schema for faster, more effective query processing and Business Analysis Requirements.
  • Implemented various database schema objects such as indexes, packages, procedures, functions, triggers using SQL and PL/SQL.
  • Involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the data model.
  • Involved in identifying the sources for various dimensions and facts for different data marts according to star schema design pattern.
  • Designed and implemented schema in more than five databases for various different kinds of storage as per the business requirement.
  • Implemented SCD 2 for the dimension tables in star schema to save the history as required by The business requirements.
  • Designed the logical schema and physical schema creation for Data Mart and integrated the legacy system data into Data Mart.


23. Design Documents

Average Demand
Here's how Design Documents is used in ETL Developer jobs:
  • Prepared implementation and user training documents, software requirement specification verification, software design documents, technical specifications, and application development documentation.
  • Experience in preparing design specification documents based on functional requirements and also involved in the preparation of technical design documents.
  • Prepared various design documents like technical design documents, mapping specification documents, target to source mapping document.
  • Gathered business requirements and prepared technical design documents, target to source mapping document, mapping specification document.
  • Communicated with business users directly, discussed business requirements and prepared high level design documents for offshore team.
  • Documented business requirements, Coordinated with offshore team to prepare flawless ETL specifications, Technical design documents.
  • Analyzed and understanding the functional requirements and converted them to Technical requirements using mapping and design documents.
  • Created Technical design specification documents based on the functional design documents and the physical data model.
  • Created Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
  • Prepared functional and technical documents including user narratives, technical design documents and other support documents.
  • Provided documentation about database / data warehouse structures and Updated functional specification and technical design documents.
  • Understood requirements and prepared/reviewed high-level and low-level design documents and technical specification documents.
  • Created High level/detailed level design documents and also involved in creating ETL functional and technical specification.
  • Created various documents including high-level design documents, mapping documents and knowledge transfer documents.
  • Involved in preparing the Technical Analysis Design documents based on the Functional Specification documents.
  • Developed design documents, technical specifications for mappings and sessions based on user requirements.
  • Designed the technical specification documents and Functional design documents as per the business requirements.
  • Participated in gathering the user requirements and involved in writing technical design documents.
  • Designed technical specifications, mapping documents, design documents and transformation rules.
  • Involved in various documentation like Application Information documents, and Design documents.


24. Different Transformations

Average Demand
Here's how Different Transformations is used in ETL Developer jobs:
  • Created different transformations according to the business and technical requirements.
  • Developed mappings for loading the staging tables from text files using different transformations.
  • Created mappings with different transformations, mapping parameters and variables.
  • Worked on different transformations, slowly changing dimensions.
  • Used Informatica Designer to create mappings using different transformations to populate data to a data warehouse.
  • Developed Slowly Changing Dimension mappings of Type II and created different transformations for loading the data into targets.
  • Used Informatica Designer to create mappings using different transformations to move data to Data marts and Data Warehouse.
  • Simplified data flows by removing unneeded objects and created new batch jobs using different transformations.
  • Used mapplets and different transformations to meet the complex logic.
  • Developed Informatica mappings, sessions and workflows using different transformations as per the requirement.
  • Involved in developing mappings using different transformations, and created sessions and workflows in PowerCenter Workflow Manager.


25. SSIS

Average Demand
Here's how Ssis is used in ETL Developer jobs:
  • Analyzed software requirement documents and business requirement and assisted in high level data modeling using Erwin data modeling tool.
  • Configured Connection Manager files for SSIS packages to dynamically execute on Quality Analysis server and Production server.
  • Experience in deployment of SSIS packages into different environment using configurations for deployments to different environments.
  • Assisted with the implementation/upgrade of AutoSys and JAWS, troubleshooting errors, and documentation/procedures.
  • Provided technical assistance by responding to inquiries regarding errors or questions with programs/interface.
  • Assisted in troubleshooting and performance enhancements to existing processes.
  • Assisted in Data Modeling and Dimensional Data Modeling.
  • Developed, Maintained, and supported stored procedures, triggers, SSIS and DTS packages in an extremely high volume environment.
  • Created SSIS packages to extract data from OLTP to OLAP systems and Scheduled Jobs to call the packages and Stored Procedures.
  • Experience in configuring package logging, error logging and event handling to redirect error rows and fix the errors in SSIS.
  • Developed SQL Queries for pulling large volume of records from source system database using stored procedures and ETL processes using SSIS.
  • Assisted the data analysts in identifying the changes needed in the system as per the requirements of the Marketing groups.
  • Created SSIS package to load data from XML Files and SQL Server 2008 R2 using Derived Columns & Condition Split.
  • Created Logging, Break Points, Check Points, and used Error Handler in SSIS and automating the ETL process.
  • Designed and developed SSIS Packages to import and export data from MS Excel, SQL Server 2008/2005 and Flat files.
  • Used Visual SourceSafe on a regular basis for version control of SSIS packages as well as various T-SQL objects.
  • Worked with various SSIS tasks, including Transform Data and Execute SQL, and created SQL jobs to schedule SSIS packages.
  • Used SSIS to populate data from various data sources, creating packages for different data loading operations for application.
  • Configured SSIS Send Mail Task on multiple Control Flow tasks to send email during failure using SMTP Connection manager.
  • Identified various transformations in SSIS and their use at various situations in the project and prepared documents and packages.


26. Fact Tables

Average Demand
Here's how Fact Tables is used in ETL Developer jobs:
  • Populated various fact tables by deriving the necessary calculations and joins on the warehouse.
  • Implemented a process to establish Referential Integrity between related dimension/fact tables.
  • Involved in performance tuning on the mapping level when loading millions of records in the Fact tables on a daily basis.
  • Created Data Warehouse schema to deploy fact and dimension table and defined Referenced relationships with the measure groups and fact tables.
  • Created cubes for Dimension and fact tables for future and current reporting using SQL server Analysis Services (SSAS).
  • Created dimension and fact tables for customer per product, per profile, per region, per branch, time.
  • Developed business models conforming to functional Mapping, defined logical and complex joins for the Dimension & Fact tables.
  • Worked with Data Modeler, BI Developers to create dimensions and fact tables in the database using Star Schema.
  • Designed Indexing strategy for Fact Tables, Disk usage statistics and automated email notification to Tech Ops team.
  • Created a source-to-target mapping matrix for loading the data into dimensions and fact tables using SAS.
  • Handled slowly changing dimensions (type 1, 2 and 3) and populated the fact tables.
  • Implemented slowly changing dimension type 2 for each dimension and fact tables in the data warehouse.
  • Designed and developed ETL mappings for Type 1 & Type 2 dimensions and fact tables.
  • Performed unit testing / integration testing on data loaded into the dimension and fact tables.
  • Implemented slowly changing dimensions Type1, Type2 approach for loading the Dimensions and Fact tables.
  • Involved in logical and physical database design, Identified Fact Tables, Transaction Tables.
  • Dimension Data Modeling, Star Schema Design, Design of Dimension and Fact Tables.
  • Created the necessary indexes and partitions on large fact tables to get better ETL performance.
  • Developed mappings to load Slowly Changing Dimensions Type-1 and Type-2, Fact tables.
  • Identified the facts and dimensions and designed the relevant dim and fact tables.

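Several of the bullets above come down to the same core step: resolving each incoming natural key to a dimension surrogate key before the fact row is written. The sketch below shows that lookup step in Python; the tables, keys, and the -1 "unknown member" convention are illustrative assumptions, not taken from the text above.

```python
# Minimal fact-load sketch: resolve each natural key to a dimension
# surrogate key before writing the fact row. All names are invented.

customer_dim = {"C100": 1, "C200": 2}       # natural key -> surrogate key
date_dim = {"2020-01-15": 20200115}         # calendar date -> date key

UNKNOWN_KEY = -1  # a common convention for unmatched / late-arriving members

def build_fact_row(sale):
    return {
        "customer_key": customer_dim.get(sale["customer_id"], UNKNOWN_KEY),
        "date_key": date_dim.get(sale["sale_date"], UNKNOWN_KEY),
        "amount": sale["amount"],           # the additive measure
    }

sales = [
    {"customer_id": "C100", "sale_date": "2020-01-15", "amount": 42.50},
    {"customer_id": "C999", "sale_date": "2020-01-15", "amount": 10.00},
]
for fact_row in map(build_fact_row, sales):
    print(fact_row)
```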

27. Business Logic

average Demand
Here's how Business Logic is used in ETL Developer jobs:
  • Developed user defined Routines and transformations to implement business logic and Shell scripts to automate file manipulation and data loading procedures.
  • Prepared ETL design specification documents with information on implementation of business logic for Member Records, Pharmacy and Medical Claims.
  • Created Oracle Stored Procedures to implement complex business logic for improved performance and called it from Stored Procedure Transformation.
  • Designed and developed complex mappings that involved Slowly Changing Dimensions, Error handling, Business logic implementation.
  • Design and creation of detailed Technical Mapping document with information on implementation of business logic.
  • Developed Stored Procedures and Functions to implement necessary business logic for interface and reports.
  • Reviewed Functional and Technical documents and gathered the Business logic requirement for code development.
  • Prepared ETL design specification documents with information on implementation of business logic.
  • Converted complex business logic into SQL stored procedures and user-defined functions (see the sketch after this list).
  • Created stored procedures and packages to effectively handle complex business logic.
  • Developed user defined Routines and Transformations for implementing complex business logic.
  • Performed in-depth analysis and documentation of business logic and requirements.
  • Developed packages for implementing business logic through procedures and functions.
  • Designed the transformation logic around the underlying business rules.
  • Modified existing ETL jobs to process new data types; including adjusting columns, derivations, and business logic as needed.
  • Involved in the creation of shortcuts for source, targets and in the creation of reusable transformations according to business logic.
  • Worked on Field analyzing like identifying the business logic for each field, Participated in team meetings and proposed ETL Strategy.
  • Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.
  • Involved in knowledge transfer to peers, documenting all the business logic and ensuring maintenance is easy for them.
  • Modified the existing mappings and mapplets based on user change requests to implement new business logic and improve session performance.

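A recurring theme above is moving business rules out of scattered mappings and into one reusable procedure or function, so every load applies the same definition. A minimal sketch of that idea; the rule itself (a claim-review threshold) is invented for illustration.

```python
# Sketch: keep each business rule in one named, testable function so the
# ETL mappings and any stored procedures share a single definition.
# The rule below is hypothetical, not from the original text.

def is_high_value_claim(claim_amount: float, member_tier: str) -> bool:
    """Hypothetical rule: flag claims above a tier-dependent threshold."""
    threshold = 5000.0 if member_tier == "standard" else 10000.0
    return claim_amount >= threshold

# Applied during the transform step:
claims = [{"id": 1, "amount": 6200.0, "tier": "standard"}]
for c in claims:
    c["manual_review"] = is_high_value_claim(c["amount"], c["tier"])
print(claims)
```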

28. AutoSys

average Demand
Here's how Autosys is used in ETL Developer jobs:
  • Participated in complete process of functionality testing of all versions of AutoSys applications.
  • Designed Batch Processing system for applications using AutoSys.
  • Created Shell scripts with PMCMD command and scheduled using AutoSys for nightly batch processing.
  • Created and maintained AutoSys jobs for scheduling on Windows and UNIX machines (an illustrative JIL definition follows this list).
  • Defined and scheduled the load jobs using AutoSys and WLA scheduler.
  • Collaborated with operations personnel to create batch cycles in AutoSys.
  • Called Visual Basic for Excel code from AutoSys jobs.
  • Involved in job scheduling using the AutoSys scheduling tool.
  • Created AutoSys jobs to schedule the SHELL scripts.
  • Scheduled sessions and work flows using AutoSys.
  • Worked on scheduling operations using AutoSys.
  • Used the AutoSys GUI tool to execute, abort, and force-start batch jobs.
  • Scheduled jobs using Sequencers and Autosys job scheduler utility based on the requirements and monitored the development and test environments closely.
  • Scheduled jobs using Autosys scheduler utility based on the requirements and monitored the production processes closely for any possible errors.
  • Scheduled jobs using AutoSys and published periodic test status reports to the stakeholders.
  • Implemented AutoSys jobs for daily, weekly, and monthly data loads.
  • Created Schedule dependencies using Autosys scheduling tool.
  • Involved in the migration of the AutoSys version from 4.5 to R11.3.5 and creating AutoSys scripts to schedule jobs for workflows.
  • Created various Autosys entries for scheduling various data cleansing scripts fixing the Bugs in the Mappings, Sessions and Parameter files.
  • Used Autosys for scheduling various data cleansing scripts and loading processes; maintained the batch processes using UNIX Scripts.

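AutoSys jobs like the ones above are defined in JIL (Job Information Language) text and loaded with the jil utility (for example, `jil < load_sales.jil`). Below is a minimal, illustrative command-job definition: the job names, machine, owner, script path, and start time are all placeholders, and the attribute set reflects common JIL usage, so it should be verified against your AutoSys version.

```python
# Writes an illustrative AutoSys JIL definition. Every value below is a
# placeholder; the condition line makes the load wait for the extract job.

LOAD_SALES_JIL = """\
insert_job: load_sales_daily
job_type: c
command: /apps/etl/bin/load_sales.sh
machine: etlhost01
owner: etluser
start_times: "02:00"
condition: s(extract_sales_daily)
description: "nightly sales load; runs only after the extract succeeds"
"""

with open("load_sales.jil", "w") as f:
    f.write(LOAD_SALES_JIL)
print("wrote load_sales.jil")
```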

29. QA

average Demand
Here's how QA is used in ETL Developer jobs:
  • Migrated development mappings to QA and Production environment.
  • Worked closely with QA, Business, and Architects to resolve defects quickly and meet deadlines.
  • Coordinated with the QA team and Operations (production) team while the code was being migrated across the servers.
  • Tested code in DEV, QA, and Production and addressed issues at all levels.
  • Involved in complete SDLC including analysis, design, development, implementation, QA and maintenance of various software applications.
  • Performed unit testing, system integration testing, and QA testing for entire process from source file to EDW.
  • Created, executed and documented unit test plans and recommended QA test plans for ETL and data integration processes.
  • Helped out in migration of the mappings, Session and other objects from Development to QA and to Production.
  • Assisted in migration of mappings, sessions, and common objects from development to QA and to production.
  • Supported the code I developed while it was in QA and after it moved to Production.
  • Work along with QA testers to troubleshoot tickets in a full SDLC (Software Development Life Cycle).
  • Involved in creation of Environment Document which provides instructions to Implement/Test the Project in QA and Production environments.
  • Documented and presented the supporting documents and Install Guide to QA and Production team for the components developed.
  • Involved in test plan creation and a full round of no-harm testing and QA.
  • Generated detailed QA queries to test data integrity for the launch of a new cloud-based reporting system (a minimal reconciliation sketch follows this list).
  • Deployed mappings in various environments from DEV to QA and finally to PROD using deployment groups and Export/Import.
  • Prepared detail documentation for the developed code for QA to be used as guide for future migration work.
  • Mentored the Data Quality team on the testing process and QA approach used to validate source-to-target data flows.
  • Create source to target mapping for the QA team as per the transformation logic defined by the business.
  • Performed unit testing and QA testing and created documents for the code to work in efficient manner.

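The "QA queries to test data integrity" bullet usually boils down to reconciliation checks between source and target after a load. A minimal sketch of one such check, using sqlite3 in place of the real databases and invented table names:

```python
# Minimal data-reconciliation check of the kind QA runs after a load:
# compare row counts (and a simple aggregate) between source and target.
# sqlite3 stands in for the real source/target databases; names invented.

import sqlite3

def table_profile(conn, table):
    cur = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    return cur.fetchone()  # (row_count, amount_total)

src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
src.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 5.5)])
tgt.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 5.5)])

assert table_profile(src, "sales") == table_profile(tgt, "sales"), "mismatch"
print("source and target reconcile")
```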

30. User Acceptance

low Demand
Here's how User Acceptance is used in ETL Developer jobs:
  • Coordinated with developers for preparation of unit-testing, User Acceptance Testing and Post Implementation Validation Documents.
  • Used Agile-testing methodology for achieving deadlines in User Acceptance Testing.
  • Supported System/ UAT testing and documenting user acceptance criteria.
  • Participated in unit testing and coordinated User Acceptance Testing.
  • Provided 24x7 production /Integration/User Acceptance/Performance tests Support.
  • Participated in user acceptance testing with business.
  • Formulated QA reports for black box testing application including Functional testing, Regression, Integration and User Acceptance testing.
  • Performed system, unit, and User Acceptance Testing (UAT) of BO systems with database validation.
  • Assisted the creation/validation of End to End test scenarios and also data loading for User Acceptance Testing scenarios.
  • Involved in analyzing user requirements, Coding, Preparing testing specs, database designing and User acceptance training.
  • Worked on functional testing, integration/system testing, regression testing, and user acceptance testing.
  • Develop test scripts, Execute test cycles and Facilitate User Acceptance Testing by preparing use cases.
  • Performed smoke, Functional and GUI Testing to ensure that the user acceptance criteria are met.
  • Supported on Various test cycles Integration, System, User acceptance tests and Data Warehouse testing.
  • Assist Business Users in User Acceptance Testing and fix the bugs identified during the UAT.
  • Interacted with end users and line managers to gather business requirements and conducted user acceptance.
  • Developed test case for unit, System integration and User Acceptance testing of ETL Process.
  • Perform Unit testing and get the user acceptance on the quality of data.
  • Involved in unit testing to test the processed data and user acceptance testing.
  • Performed unit-level testing and SIT and interacted with the business during user acceptance testing.


31. Complex Mappings

low Demand
Here's how Complex Mappings is used in ETL Developer jobs:
  • Developed complex mappings for data integration based on business requirement and logic.
  • Developed Transformation logic and designed various complex Mappings in the Designer.
  • Developed complex mappings for VSAM file extract and normalizing the legacy data for further data cleansing and logical processing for migration.
  • Designed and developed Complex mappings like Slowly Changing Dimensions Type 2 (Time Stamping) to maintain full history of transactions.
  • Developed complex mappings with a slowly changing dimensions (type2: version) to keep track of current and historical data.
  • Developed complex mappings using a range of transformations on the extracted data according to the Transformation Business Rules and user requirements.
  • Helped in developing complex mappings such as Slowly Changing Dimensions (SCD) Type-II Time stamping in the Mapping Designer.
  • Developed complex mappings involving high-volume data loads, including loading data into Oracle partitioned tables (Oracle 8i).
  • Worked on complex mappings and always guided the team when stuck and ensured timely delivery of the ETL components.
  • Developed several complex mappings to migrate millions of data from legacy SQL server system to Oracle Data Warehouse.
  • Worked extensively on various active & passive transformations and created complex mappings and tuned them for better performance.
  • Implemented complex mappings using DWH techniques: Type 1, Type 2 & Type 3 slowly changing dimensions (SCD).
  • Developed complex mappings using corresponding Source, Targets and Transformations, which were optimized for maximum performance.
  • Developed complex mappings from scratch and was responsible in tuning the coded mappings that were previously implemented.
  • Designed and developed complex mappings using various transformations in Designer to load data from Oracle DB.
  • Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the Mapping Designer.
  • Developed several complex Mappings and successfully implemented SCD Type1/Type 2 to keep the history changes.
  • Designed and developed Complex mappings like Slowly Changing Dimensions Type 1, Type 2.
  • Designed and developed complex mappings using Mapping Designer to load data from various sources.
  • Used Debugger in Mappings to debug the Complex mappings and fixed the Errors.


32. Source Data

low Demand
Here's how Source Data is used in ETL Developer jobs:
  • Worked on source data discovery and data profiling to determine data quality and availability (a small profiling sketch follows this list).
  • Validate source data populated into the integrated and/or dimensional repositories.
  • Source data file structures/layout analysis and creating mapping documents.
  • Analyzed requirement specification documents and understood the source data.
  • Identified sources/targets and analyzed source data for dimensional modeling.
  • Worked on Analysis of the source database & target databases for the root cause analysis of issues and fixing the bugs.
  • Worked with Facets 4.48, 4.51 and different EDI transaction files to understand the source structure and the source data pattern.
  • Used SSIS and T-SQL stored procedures to transfer data from source databases to staging area and finally transfer into data warehouse.
  • Created and modified COBOL copybooks to connect to source data using PowerExchange Navigator; monitored the ETL jobs and fixed bugs.
  • Aided in change tracking implementation for new ETL design to only extract new or updated data from SQL server source databases.
  • Developed complex ETL processes to calculate various monthly and YTD values based on the source data using business financial calculations.
  • Developed UNIX K-Shell scripts to validate source data, load tables like Staging, Transaction, Aggregate and Snapshot tables.
  • Performed extensive Data Profiling on the source data by loading the data sample into Database using Database Load Utilities.
  • Developed UNIX scripts to create file lists, transfer and archive source data files for backup and audit purposes.
  • Involved in creation of mapping for extraction of source data from different OLTP applications and loading data into target.
  • Performed detailed Analysis on source data and provided data profiling results on the outbound files from the OLTP system.
  • Work with the Business Data Analysts to help define approaches and needs for moving source data to the EDW.
  • Constructed various different types of Connection Manager between source data files and destination positions in Data Warehouse Staging Area.
  • Involved in analyzing the source data coming from different Data sources such as Oracle, Flat files and Mainframes.
  • Involved in analyzing different modules of facets system and EDI interfaces to understand the source system and source data.

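Source-data discovery, mentioned in the first bullet, typically starts with per-column statistics: null counts, distinct counts, and value ranges. A small sketch with invented sample rows:

```python
# Tiny column profiler of the kind used for source-data discovery.
# The sample rows are invented; real profiling runs over full extracts.

from collections import defaultdict

def profile(rows):
    stats = defaultdict(lambda: {"nulls": 0, "values": set()})
    for row in rows:
        for col, val in row.items():
            if val in (None, ""):
                stats[col]["nulls"] += 1
            else:
                stats[col]["values"].add(val)
    return {
        col: {
            "nulls": s["nulls"],
            "distinct": len(s["values"]),
            "min": min(s["values"]) if s["values"] else None,
            "max": max(s["values"]) if s["values"] else None,
        }
        for col, s in stats.items()
    }

rows = [{"id": "1", "state": "TX"}, {"id": "2", "state": ""}, {"id": "3", "state": "CA"}]
print(profile(rows))
```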

33. DataStage

low Demand
Here's how Datastage is used in ETL Developer jobs:
  • Involved with project analysis to understand the business requirement specification and implemented the ETL jobs using DataStage Server and Parallel Extender.
  • Used DataStage Manager to import metadata into the repository and to import and export jobs between projects.
  • Designed DataStage jobs using QualityStage 7.5 for the data cleansing and data standardization process.
  • Performed regular backups of the jobs developed using DataStage Manager Export/Import utility.
  • Developed control jobs for each financial instrument using DataStage Designer.
  • Designed and developed DataStage jobs to ensure easy restartability.
  • Assisted Systems Administrator in DataStage installation and maintenance.
  • Exported and imported DataStage components using DataStage Manager.
  • Developed DataStage Parallel and Sequence Jobs.
  • Developed Shell scripts for event automation to run, FTP, archive and upload XML output data generated from DataStage jobs.
  • Used DataStage Designer extensively for developing various jobs to extract, transform and generate XML output data from Oracle source data.
  • Designed and developed purge process for the staging area fact and dimension tables using combination of DataStage jobs and UNIX scripts.
  • Worked as an ETL application developer using PL/SQL and the DataStage ETL tool, and provided continuing support for the existing system.
  • Involved in Design, Development and unit testing of DataStage jobs and migrating the code from one environment to the other.
  • Developed several complex ETL jobs for Historical data loads and ongoing data loads using various active and passive stages of DataStage.
  • Created Parallel jobs in DataStage 8.1 using parallel extender and load data into warehouse and mart tables using architecture guidelines.
  • Populated Data Marts at different levels of granularity for the inside customers using DataStage, SQL scripts and stored procedures.
  • Used Hash files and flat files creation methodology in DataStage Designer to get the best performance in data loading process.
  • Created jobs in DataStage to import data from heterogeneous data sources like Oracle 9i, Text files and SQL Server.
  • Worked with DataStage Director for setting up production job cycles for daily, weekly monthly loads with proper dependencies.


34. UAT

low Demand
Here's how UAT is used in ETL Developer jobs:
  • Evaluated database performance and performed maintenance duties such as tuning, backup, restoration and disaster recovery whenever needed.
  • Created Debugging Mappings to examine any performance issues in existing mappings by analyzing the data flow and evaluating transformations.
  • Used database objects like sequence generators and stored procedures to handle complex logical situations.
  • Evaluated all functional requirements and map documents and perform troubleshoot on all development processes.
  • Debugged bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Evaluate an organization for data warehouse maturity and business architecture alignment.
  • Examined and evaluated data structures in source and destination databases.
  • Used to solve production incident problems in time-constrained situations.
  • Evaluated and documented existing processes for improvements.
  • Tested all the mappings and sessions in Development, UAT environments and also migrated into Production environment after everything went successful.
  • Tested the code developed by other team members to ensure the quality of code before promoting to System test or UAT.
  • Migrated the code to SIT and UAT using Components Tool Migration (CLM) and PROD with change ticket using HPSM.
  • Evaluated the effect of optimization processes on the time taken by different batch runs through time series analysis and DOE.
  • Performed Unit Testing, System Testing and User Acceptances Testing (UAT) for Universe, Reports and Security Management.
  • Designed and developed data Validation controls to evaluate the quality of the ETL process using UNIX scripts and ETL Mappings.
  • Work with the UAT team to see if the graph produces the correct enriched data and provide UAT testing support.
  • Created UAT reports out of the data loaded by the procedures and analyzed data consistency to meet user demands.
  • Performed Unit Testing and assisted QA team in Quality Assurance Testing, Load Testing and UAT, Performance estimation testing.
  • Worked on the Optimization and restructuring of stored procedures for the application by evaluating indexing and partitioning large tables.
  • Defined the test strategy, created unit test plans and supported UAT and regression testing using HP Quality Center.


35. Debugger

low Demand
Here's how Debugger is used in ETL Developer jobs:
  • Performed Data Transformation Consistency check using the Debugger.
  • Used debugger to test the mapping and fixed the bugs and identified the bottlenecks in all levels to tune the performance.
  • Developed Unit Test Cases and used debugger to ensure successful execution of the data loading processes and fixing production defects.
  • Used debugger to effectively test the mapping by observing the data movement in order to troubleshoot the transformation errors.
  • Used debugger to test the data flow and fix the mappings, Created and Monitored Batches and Sessions.
  • Used Debugger wizard to analyze the flow of data on row by row basis from transformation to transformation.
  • Used Debugger by making use of Breakpoints to monitor data movement, identified and fixed the bugs.
  • Used the Debugger to run Debug sessions setting Breakpoints across instances to verify the accuracy of data.
  • Used the Debugger in debugging some critical mappings to check the data flow from instance to instance.
  • Used debugger to analyze the data flow between source and target to fix the data issues.
  • Used Debugger to validate transformations by creating break points to analyze, and monitor Data flow.
  • Used debugger to test the data flow between source and target to fix the invalid mappings.
  • Used Debugger wizard to remove bottlenecks at source, transformation, and target for optimum loads.
  • Created and ran Debugger sessions, setting breakpoints for better analysis of mappings.
  • Checked Sessions and error logs to troubleshoot problems and also used debugger for complex problem troubleshooting.
  • Used debugger to test mapping and fixed bugs in DEV in following change procedures and validation.
  • Used Debugger utility and made appropriate changes to generate the required results for various tickets.
  • Worked extensively with debugger to debug errors identified in data and to fix them.
  • Created partitions for parallel processing of data and used Debugger to troubleshoot the mappings.
  • Configured and ran the Debugger from within the Mapping Designer to troubleshoot predefined mapping.


36. SSRS

low Demand
Here's how Ssrs is used in ETL Developer jobs:
  • Created SSRS reports illustrating the completeness of conversion development for the customer.
  • Created various technical documentation, Business requirement gathering, and analyzed data, developed and built SSIS Packages and SSRS reports.
  • Designed, deployed, and maintained various SSRS reports in SQL Server 2008 using Report Manager and the report server.
  • Created data sets for various reports for physicians and clinics which were eventually converted into SSRS and Power Pivot Reports.
  • Transferred data from sources like MS Excel, MS Access, SQL Server using SSIS and created reports using SSRS.
  • Involved in designing, developing, debugging and testing of reports in SQL Server 2005 Reporting Services (SSRS).
  • Created Reports in SSRS with different type of properties like Chart controls, filters, Interactive sorting and SQL parameters.
  • Generated and consolidated monthly reports and budget analysis to drive forecast production plan with SQL report service (SSRS).
  • Created Technical Specs to convert current state (in COBOL) reports to future state Reports (in SSRS).
  • Worked on SSRS reports using Report Parameters, Drop-Down Parameters, Multi-Valued Parameters debugging Parameter Issues Matrix Reports and Charts.
  • Created SSRS Data Model projects using Microsoft Visual Studio and using Report Builder for report server to create reports.
  • Involved in development and implementation of SSIS, and SSRS application solutions for various business units across the organization.
  • Designed and created Report templates, bar graphs and pie charts based on the financial data using SSRS 2012.
  • Utilized SSRS and Excel files to create reports including profit/loss analysis by vendor and explore business opportunities areas.
  • Used breaks, calculations, sorts, filters and sections to enhance data presentation in reports using SSRS.
  • Migrated all Legacy system (Quest application) Crystal Reports to New SSRS Reports by implementing same functionality.
  • Migrated data from Oracle to SQL Server data warehouse and to generate the reports using SSRS for different.
  • Designed and implemented stylish report layouts, and related MS SQL Server Reports and Dashboards using SSRS 2008.
  • Developed reports with SSRS, Excel services, and Power BI; deployed them on SharePoint server.
  • Created subscription for the windows file share and implemented three layer securities for the reports in SSRS.


37. Worklets

low Demand
Here's how Worklets is used in ETL Developer jobs:
  • Created Worklets to run several sessions sequentially and concurrently.
  • Created and scheduled Worklets, configured email notifications.
  • Created reusable transformations, worklets, and made use of the shared folder concept using shortcuts wherever possible to avoid redundancy.
  • Set up batches of Worklets and sessions to schedule the loads at the required frequency.
  • Created worklets when a group of sessions need to run multiple times.
  • Involved in designing worklets and reusable tasks according to the business needs.
  • Used parameter files, reusable sessions and worklets to create generic code.
  • Automated worklet and session schedules using UNIX shell scripts.
  • Created worklets to control the execution of various sessions.
  • Created reusable Worklets involving many tasks.
  • Copied/exported/imported the mappings, sessions, worklets, and workflows from the development to the test repository and promoted them to production.
  • Identified reusable functionality, developed re-usable Transformation, Mapplets, Sessions and Worklets.
  • Migrated mappings/sessions/worklets/workflows from Development to Testing and from Testing to Production.
  • Have used various Power center Transformations and tools, Design Power Center Mappings, Workflows, Worklets based on the requirements.
  • Worked in fixing poorly designed mappings, workflows, worklets, sessions, and target data loads for better performance.
  • Involved in creation of Mapplets, Worklets, Reusable transformations, shared folder, and shortcuts as per the requirements.
  • Worked on different tasks in workflows like sessions, event raise, event wait, decision, email, command, and worklets.
  • Created Work Flows with Command Tasks, Worklets, Decision, Event Wait and Monitored sessions by using workflow monitor.
  • Developed huge workflows with worklets, event waits, assignments, conditional flows, and email and command tasks.
  • Designed and developed an integrated Workflow with Worklets for incremental data load capture for all Line of Business.


38. Warehouse Designer

low Demand
Here's how Warehouse Designer is used in ETL Developer jobs:
  • Imported and Created Target Definitions using Warehouse Designer.
  • Created/imported the target tables using Warehouse Designer.
  • Imported Source and target definition from the database, using Source Analyzer and Warehouse Designer.
  • Worked extensively on Source Analyzer, Mapping Designer, and Warehouse Designer.
  • Assisted warehouse designer while designing the models.
  • Worked on Informatica Power Center tools-Source analyzer, Warehouse Designer, Mapping Designer and Transformation developer.
  • Imported various source and target definitions using Informatica Source Analyzer and Warehouse Designer.
  • Created different target definitions using warehouse designer of Informatica Power center.
  • Used Informatica Source Analyzer, Mapping Designer, Transformation Developer and Warehouse Designer for Extraction, Transformation and Loading.
  • Worked on Informatica Power Center 7.1 tool - Source Analyzer, Warehouse designer, Mapping Designer.
  • Worked with Informatica PowerMart client tools like Source Analyzer, Warehouse Designer and Mapping designer.
  • Used the Informatica Designer, Source Analyzer, Warehouse Designer and Mapping Designer.
  • Created the warehouse tables using Informatica Power Mart Warehouse designer.
  • Used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, and Informatica Workflow Manager.
  • Worked in Informatica 8.1 - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet and Transformations.
  • Worked on Informatica tool -Source Analyzer, Warehouse designer, Mapping Designer & Mapplets Designer.
  • Designed and created mappings and mapplets using Informatica Source Analyzer, Warehouse Designer, Mapplet Designer, and Transformation Designer.
  • Worked on Informatica tools Source Analyzer, Warehouse designer, Mapping Designer and Workflow Manager and workflow monitor.


39. EDW

low Demand
Here's how EDW is used in ETL Developer jobs:
  • Developed strategy and implemented for deployment of data integration into EDW.
  • Implemented and customized RapidDecision EDW Marts in near-real-time.
  • Project description: Aviation Services EDW, a project to create an Enterprise Data Warehouse (EDW) from different data sources.
  • Handled critical job failures in the production environment and Involved in fixing the data issues in the production EDW database.
  • Design and Implement ETL processes for History load and Incremental loads of EDW, Customer and Transaction level data warehouse.
  • Involved in the design, development and implementation of the Enterprise Data Warehousing (EDW) process and DataMart.
  • Developed data warehouse solutions for JD Edwards EnterpriseOne, JD Edwards World, PeopleSoft, and Oracle systems.
  • Involved in design, development and implementation of the Enterprise Data Warehouse (EDW) and Data Mart.
  • Involved in designing, developing, implementing, and loading inbound files from major vendors into the EDW database.
  • Loaded customer information from CIS System (Customer Information System) into EDW as per the reporting needs.
  • Worked with several Data Warehousing projects to load data into EDW, data marts and Operational Data Stores.
  • Used Erwin Data Modeler 7.2 and worked with data modelers to validate and maintain models for the EDW.
  • Created Solution Review Documents and EDW Deployment Guides for effectively moving the CR's to System Testing.
  • Create new DataStage jobs, reusable containers and Job Sequences to replicate the data integration for EDW.
  • Worked on analyzing the data and fill the gaps from between old EDW and new built EDW.
  • Worked as a Senior ETL Developer to implement the EDW for the Provider & Patient information reporting.
  • Coordinate with cross functional teams for EDW and ensure successful completion of End to End System Testing.
  • Develop and test extraction, transformation, and load (ETL) processes for the EDW.
  • Project profile: IT Environment Setup for EDW (industry: Manufacturing).
  • Project description: AG Edwards is a USA-based brokerage firm that deals in capital markets.


40. Dimension Tables

low Demand
Here's how Dimension Tables is used in ETL Developer jobs:
  • Applied enhancements to Fact tables and Dimension Tables, Control status tables that were needed as part of data loading requirements.
  • Developed the tables with Type I and Type II slowly changing dimension tables from several mainframe flat files and tables.
  • Created OLAP cubes on top of the data warehouse basing various fact and dimension tables for analysis purpose using SSAS.
  • Defined Primary key-Foreign key relationships, complex joins bridge tables between dimension tables and Fact tables in the physical layer.
  • Created new columns in existing fact and dimension tables and created altogether new dimension and fact tables.
  • Created mappings to load Slowly Changing Dimension tables based on the amount of historical dimension data wanted to keep.
  • Implemented slowly changing dimensions (Type1 and Type2) to maintain current information and history information in dimension tables.
  • Worked on customizing the Activity and Revenue fact groups by adding new columns to the fact and dimension tables.
  • Built efficient ETL packages for processing fact and dimension tables with complex transformations and Type 1 and Type 2 changes.
  • Recognized the fact and dimension tables in the OLAP database and created cubes using MS SQL Server Analysis Services.
  • Involved in Dimensional modeling by identifying the fact and dimension tables based on the user & reporting requirements.
  • Created staging and dimension tables in SQL Server database using SQL Query Analyzer and SQL Server Enterprise Manager.
  • Implemented Type I and Type II slowly changing Dimension to maintain all historical information in Dimension Tables.
  • Created staging tables to do validations against data before loading data into original fact and dimension tables.
  • Worked on populating the EDW / Data Marts, multiple fact and dimension tables with star methodology.
  • Developed ETL to Load Data in staging tables and used SCD Techniques for Loading of Dimension tables.
  • Imported the custom fact and dimension tables and configured it in the 3 layers of the RPD.
  • Implemented slowly changing dimensions SCD Type 1 to update current information and maintain history in dimension tables.
  • Implemented logic for Slowly Changing Dimensions to handle Incremental Load for Dimension Tables and Fact Tables.
  • Used Type 1 SCD and Type 2 SCD mappings to update slowly Changing Dimension Tables.


41. SCD

low Demand
Here's how SCD is used in ETL Developer jobs:
  • Performed SCD type2 mapping implementation.
  • Performed analysis and coding to maintain deletions and SCD history (environment: Oracle 11g, Oracle Warehouse Builder, PL/SQL).
  • Used CDC (Change Data Capture) Stage to Capture the New records and updated records and implemented SCD type 2.
  • Created Slowly Changing Dimension (SCD) Type 2 mappings for developing the dimensions to maintain complete historical data (a minimal sketch follows this list).
  • Worked on loading techniques like SCD's and incremental loading for extraction, transformation and loading of the data.
  • Worked extensively on complex standard, non-reusable Mappings and successfully implemented SCD Type1/Type 2 to keep the history changes.
  • Developed JOBS to load the data into the Warehouse environment using the slowly Changing Dimension (SCD) techniques.
  • Implemented slowly changing dimensions on customers table using SCD stage in Data stage 8.7 on IBM Information server 8.7.
  • Implemented mapping for slowly changing dimensions (SCD) to maintain current data as well as historical data.
  • Implemented Slowly Changing dimension (SCD) methodology for accessing the full history of accounts and transaction information.
  • Developed SSIS Packages for Snapshot, Incremental, Historical Changes (SCD Type 2) Data Loads.
  • Implemented SCD type1 and SCD type 2 logic at various stages in loading data into final tables.
  • Implemented slowly changing dimension (SCD) Type 1 and Type 2 mappings for changed data capture.
  • Created mappings to keep historical data (SCD type 2) and load control for ETL extract.
  • Develop slowly changing dimensions SCD Type 1, Type 2 and CDC to maintain history of transactions.
  • Worked with Slowly Changing Dimensions (SCD) Type1, Type2, and Type3 for Data Loads.
  • Experienced in building Incremental/Full load ETL process, Slowly Changing Dimensional (SCD-I, SCD-II) process.
  • Developed slowly changed dimensions (SCD) Type 2 for loading data into Dimensions and Facts.
  • Created and maintained surrogate keys on the master tables to handle SCD type 2 changes effectively.
  • Based on the history requirement, both SCD Type 1 and Type 2 were used.

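Since SCD Type 2 appears in almost every bullet above, here is the core of it as a minimal sketch: when a tracked attribute changes, the current dimension row is closed out (effective-to date set, current flag cleared) and a new version is inserted with a fresh surrogate key. Column names, dates, and the single tracked attribute are illustrative.

```python
# Minimal SCD Type 2 sketch: expire the current row and insert a new
# version when a tracked attribute changes. All names are invented.

from datetime import date

dim = [  # existing dimension rows
    {"sk": 1, "cust_id": "C100", "city": "Austin",
     "eff_from": date(2019, 1, 1), "eff_to": None, "is_current": True},
]
next_sk = 2

def apply_scd2(incoming, load_date):
    global next_sk
    current = next(r for r in dim
                   if r["cust_id"] == incoming["cust_id"] and r["is_current"])
    if current["city"] != incoming["city"]:      # tracked attribute changed
        current["eff_to"] = load_date            # close out the old version
        current["is_current"] = False
        dim.append({"sk": next_sk, "cust_id": incoming["cust_id"],
                    "city": incoming["city"], "eff_from": load_date,
                    "eff_to": None, "is_current": True})
        next_sk += 1

apply_scd2({"cust_id": "C100", "city": "Dallas"}, date(2020, 6, 1))
for row in dim:
    print(row)
```

A Type 1 update, by contrast, would simply overwrite the city in place and keep no history.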

42. Technical Specifications

low Demand
Here's how Technical Specifications is used in ETL Developer jobs:
  • Interviewed business users individually and as a team to facilitate requirements gathering, and created functional and technical specifications for the same.
  • Created requirement specifications documents, user interface guides, functional specification documents, ETL technical specifications document and test cases.
  • Involved in requirement gathering, analysis and designing technical specifications for the data migration according to the Business requirement.
  • Analyzed the functional specifications provided by the data architect and created technical specifications documents for all the mappings.
  • Document the business process by identifying the requirements and prepared excellent documentation on business requirements and Technical Specifications.
  • Evaluated business requirements and prepared design and technical specifications within the boundaries of the enterprise architecture and standards.
  • Interacted with Development Teams and Managers to analyze business needs and involved in developing technical specifications.
  • Involved in analysis, requirements gathering, function/technical specifications, development, deploying and testing.
  • Worked closely with business analysts to gather functional specifications and turn them into technical specifications.
  • Developed and Implemented different transformation techniques for identifying slowly changing Dimensions as per technical specifications.
  • Documented the detailed technical specifications and translated them into programmed application ETL code modules.
  • Designed the Mapping Technical Specifications and Design Document on the basis of Functional Requirements.
  • Interacted with Business Analyst to gather requirements and translated them into ETL technical specifications.
  • Skilled in writing technical specification documents, translating user requirements to technical specifications.
  • Gathered requirements directly from end users and converted them in to technical specifications.
  • Involved in gathering of business scope and technical requirements and created technical specifications.
  • Converted functional specifications to technical specifications (design of mapping documents).
  • Translated the business requirements to technical specifications based on the functional specs.
  • Developed functional/technical specifications such as business scope document utilizing agile methodology.
  • Analyze & translate functional specifications & change requests into technical specifications.


43. BI

low Demand
Here's how BI is used in ETL Developer jobs:
  • Involved, Conducted and participated in process improvement discussions, recommending possible outcomes focusing on production application stability and enhancements.
  • Position Netting Engine has capabilities to perform daily netting operations at a higher level of granularity.
  • Build common exception handling process to enable verification and validation capabilities and provide necessary audit.
  • Designed and developed Business Objects reports for a Generic Billing system data for analysis.
  • Provided 24*7 production support to ensure upon data refreshes and reports accessibility.
  • Analyzed the technical, organizational, and economic feasibility of the project.
  • Reviewed specifications for feasibility of customer list pull criteria and commit date.
  • Developed job sequencers and implemented restartability using various checkpoints.
  • Optimized Query Performance, Session Performance and Reliability.
  • Participated in system design and feasibility analysis discussions.
  • Monitored locking issues and high availability issues.
  • Implemented linked universe concept for best maintainability.
  • Communicated data availability with users and management.
  • Implemented conditional formatting using OBI Publisher.
  • Conducted Impact and feasibility analysis.
  • Provide dynamic subscriber traffic network for business support to over 150 million customers to ensure high availability time to minimize downtime.
  • State Farm Insurance companies have consistently received high ratings for financial strength and claims paying ability from various independent rating agencies.
  • Reduced daily application downtime from 5+ hours to 3 seconds with individual designed high availability staging process (HASP) solution.
  • Participated in data modeling (logical and physical) for designing the Data Marts for Finance, Sales, Billing.
  • Used Hive queries and Hive DDL to transform data format and combine data from several sources into one data file.


44. Data Analysis

low Demand
Here's how Data Analysis is used in ETL Developer jobs:
  • Performed initial data analysis on service requests from business clients and provided a detailed analysis report using Excel Pivot tables.
  • Worked on data analysis understanding source systems thoroughly, designing transformation maps and performing change control management and impact analysis.
  • Provided Knowledge transfer to external organizations on basic data analysis skills, providing visibility to primary business performance.
  • Performed data analysis and informational systems design in developing new enhancements to existing systems and new projects.
  • Translated business requirements into data analysis, data sources and reporting solutions for different types of consumers.
  • Performed data analysis, assessed data quality, performed data cleansing and developed benchmarks and reporting.
  • Developed design specifications for data integration to populate data warehouse databases and support data analysis.
  • Performed data analysis, data mapping, data quality & design logical/physical data models.
  • Conducted complex data analysis and data mining in order to enhance company-wide decision making.
  • Worked closely with Subject Matter Experts on the Requirement analysis, Source/Target data analysis.
  • Implemented and maintained data mining, statistical and visualization algorithms needed for data analysis.
  • Performed detailed data analysis and reported for FDA quarterly submissions and semi-annual reports.
  • Worked on project specification document, participated in data analysis meeting etc.
  • Used T-SQL for extensive data analysis and actively participated in Database Modeling.
  • Performed Data analysis and prepared the Physical database based on the requirements.
  • Performed multidimensional data analysis and reports using the cubes to produce reports.
  • Interacted directly with business for data analysis and implementing the business requirements.
  • Translated business requirements into technical specification for reporting and Data analysis.
  • Perform data analysis and consolidating the extracted data for API development.
  • Uncovered several data issues to remedy through rigorous data analysis.


45. Parameter Files

low Demand
Here's how Parameter Files is used in ETL Developer jobs:
  • Worked extensively with session parameters, Mapping Parameters, Mapping Variables and Parameter files for Incremental Loading.
  • Created flexible mappings/sessions using parameters, variables and heavily using parameter files.
  • Developed mappings to dynamically generate parameter files used by other mappings.
  • Defined parameters, variables and parameter files for flexible mappings/sessions.
  • Generated parameter files dynamically with data from a database table (a generation sketch follows this list).
  • Created mapping to dynamically generate parameter files.
  • Involved in creating UNIX shell scripts for scheduling the jobs using PMCMD and generating the Parameter Files, Manipulate/Archive Source Files.
  • Used mapping variables to reference and record values in mappings, and passed parameter files in sessions to reassign variable values.
  • Improved session run times by partitioning the sessions.
  • Created Shell Scripts to generate the parameter files, automatic backup old log files, and to automate the loading process.
  • Created UNIX shell script to generate parameter files and check the statuses of dependent loads based on a status table.
  • Developed UNIX shell scripts to create parameter files, rename files, compress files and for scheduling periodic load processes.
  • Used parameter files to store multiple DB connections for the sources and to share arguments between the different transformations.
  • Designed automation of bad data reporting process to source system teams and incremental loads using Shell scripting and parameter files.
  • Used Mapping parameters and Variables in conjunction with ETL functions and also with Parameter files to produce desired results.
  • Employed variable functions (like SetMaxVariable) to manipulate mapping variables and parameter files to initialize mapping parameters.
  • Involved in parameter files creation for all the staging mapping being developed for the different ETL objects.
  • Created Dynamic parameter files and changed Session parameters, mapping parameters, and variables at run time.
  • Involved in Developing UNIX shell Scripts to generate parameter files and executed oracle procedures as batch jobs.
  • Used Parameter files to define values for parameter and variable used in the mappings and sessions.

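Several bullets describe generating parameter files dynamically. The sketch below writes one in the PowerCenter style, where a section header names the folder, workflow, and session the values apply to. The folder, workflow, session, and parameter names are invented, and the exact header format should be checked against your PowerCenter version.

```python
# Sketch: generate an Informatica-style parameter file at run time.
# All names below are placeholders, not from the original text.

from datetime import date

params = {
    "$$LOAD_DATE": date.today().isoformat(),   # mapping parameter
    "$DBConnection_Src": "ODS_CONN",           # session-level connections
    "$DBConnection_Tgt": "EDW_CONN",
}

with open("wf_daily_load.param", "w") as f:
    # The header scopes the values to one session of one workflow.
    f.write("[FIN_DW.WF:wf_daily_load.ST:s_m_load_sales]\n")
    for name, value in params.items():
        f.write(f"{name}={value}\n")
print("wrote wf_daily_load.param")
```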

46. BTEQ

low Demand
Here's how Bteq is used in ETL Developer jobs:
  • Created a Generic BTEQ script to load the data into various target tables from flat files using control table mechanism.
  • Created BTEQ scripts to transform data, FastExport scripts to export data and MicroStrategy (Web) reports.
  • Involved in writing scripts for loading data to target data warehouse for BTEQ, FastLoad and MultiLoad.
  • Developed BTEQ and Fast Export scripts to extract data from warehouse for downstream applications and user reports.
  • Updated numerous BTEQ/SQL scripts, making appropriate DDL changes and completed unit and system test.
  • Provided the BTEQ scripts to validate the Edge Model and to generate Serial Key.
  • Developed BTEQ scripts to transform from Stage to 3rd NF and then to aggregate.
  • Used Fast Load, BTEQ and Fast Export Utilities to populate fact/dim tables.
  • Created BTEQ scripts for initial load of data, error and audit logs.
  • Prepared BTEQ scripts to load data from Preserve area to Staging area.
  • Created and validated BTEQ/MLOAD/FLOAD scripts to load the data into target tables.
  • Developed BTEQ scripts which load the data from Landing to Stage area.
  • Used BTEQ scripting for executing SQL as part of post success commands.
  • Created BTEQ scripts to extract data from warehouse for downstream applications.
  • Automated these extract using BTEQ and Unix Shell Scripting.
  • Involved in writing BTEQ scripts to transform the data.
  • Develop and optimize BTEQ scripts and SQL codes.
  • Developed UNIX wrapper scripts that internally call BTEQ (a minimal wrapper sketch follows this list).
  • Involved in writing BTEQ scripts.
  • Used the Teradata utilities BTEQ, FastExport, FastLoad, MultiLoad, and BulkLoad extensively to build various summaries and feeds.

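The wrapper scripts mentioned above usually feed a script to the Teradata bteq client on standard input. A hedged sketch of that wrapper in Python: it assumes the bteq client is installed and on the PATH, the logon string, databases, and SQL are placeholders, and the .IF/.QUIT error handling follows common BTEQ scripting practice.

```python
# Sketch of a wrapper that pipes a script into the Teradata bteq client,
# mirroring the UNIX wrapper scripts described above. All values invented.

import subprocess

BTEQ_SCRIPT = """\
.LOGON tdprod/etl_user,etl_password;
INSERT INTO stage_db.sales_stg
SELECT * FROM landing_db.sales_raw;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
"""

result = subprocess.run(["bteq"], input=BTEQ_SCRIPT, text=True,
                        capture_output=True)
print(result.stdout)
if result.returncode != 0:
    raise SystemExit(f"bteq failed with return code {result.returncode}")
```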

47. Normalizer

low Demand
Here's how Normalizer is used in ETL Developer jobs:
  • Used the Normalizer transformation to split a single source row into multiple target rows, eliminating redundancy and inconsistent dependencies (a short sketch follows this list).
  • Involved in data normalization and access of legacy systems using Normalizer transformation.
  • Used Normalizer transformation to generate normalized data.
  • Designed the mappings with Normalizer transformation.
  • Used Normalizer transformation to identify the nested records within the COBOL source and display them accordingly.
  • Used Normalizer transformation to normalize the source files.
  • Developed mappings with Normalizer transformation for COBOL Sources.
  • Experience using various transformations: Normalizer, Router, Filter, SAP/RFC, Salesforce Lookup.

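What the Normalizer does can be stated in a few lines: it pivots repeating columns on one source row (a COBOL OCCURS group, or the quarterly amount columns below) into multiple target rows. A sketch with invented column names:

```python
# Plain-Python equivalent of a Normalizer transformation: one denormalized
# source row with repeating columns becomes several normalized target rows.

def normalize(row, repeating_cols, key_cols):
    for i, col in enumerate(repeating_cols, start=1):
        out = {k: row[k] for k in key_cols}
        out["occurrence"] = i      # plays the role of the generated key/occurrence id
        out["amount"] = row[col]
        yield out

src_row = {"acct": "A1", "q1_amt": 100, "q2_amt": 110, "q3_amt": 95, "q4_amt": 120}
for r in normalize(src_row, ["q1_amt", "q2_amt", "q3_amt", "q4_amt"], ["acct"]):
    print(r)
```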

48. OLTP

low Demand
Here's how Oltp is used in ETL Developer jobs:
  • Developed normalized Logical and Physical database models to design OLTP system for insurance applications.
  • Developed and implemented ETL jobs that facilitated OLTP processing for systems integration.
  • Performed tuning of SQL queries and Stored Procedure for speedy extraction of data to resolve and troubleshoot issues in OLTP environment.
  • Project also included the migration of existing OLTP server data from SQL Server 2005 to SQL Server 2012 utilizing side-by-side migration.
  • Worked on the ETL project to extract data from OLTP to the staging DB and load data into the Enterprise Data Warehouse (an incremental-extract sketch follows this list).
  • Designed an advanced ETL architecture for the overall data transfer between the OLTP to OLAP with the help of SSIS.
  • Created mappings to move from Various Systems (CRM, OLTP) to Transaction History database (Reporting Database).
  • Developed and deployed ETL packages using SSIS from OLTP to stage in the data mart to perform different transformations.
  • Involved in the design and development of data migration from a legacy system using Oracle Loader and import/export tools for the OLTP system.
  • Identified dimensions and facts to be included in the target data mart through analysis of a multitude of OLTP data sources.
  • Studied the existing OLTP systems (3NF models) and created facts and dimensions in the data mart.
  • Developed mapping for extraction, transformation and loading of data from OLTP to staging using various Transformations.
  • Designed ETL (Extraction Transformation Loading) architecture for overall data transfer from the OLTP to OLAP.
  • Bank's OLTP System is functioning for day-to-day transaction as well as for generating performance reports.
  • Involved in extraction of data from the different flat files and data from Oracle OLTP Systems.
  • Identified issues in OLTP application and reported to the Business analysts for verification and further analysis.
  • Performed Unit testing of each of the interfaces for different scenarios in OSCAR (OLTP).
  • Created and deployed triggers to track DDL operations at server and database levels on OLTP servers.
  • Implemented OLAP to OLTP conversions, Data Staging, Data Integration, Data Cleaning tasks.
  • Created triggers on master data tables in OLTP to keep track of insertion and updates.

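The change-tracking and incremental-extract bullets above share one pattern: persist a watermark (typically a last-modified timestamp) and pull only rows past it on each run. A minimal sketch with sqlite3 standing in for the OLTP source; the table and columns are invented.

```python
# Watermark-based incremental extract from an OLTP table: the pattern
# behind "only extract new or updated data". Names are invented.

import sqlite3

oltp = sqlite3.connect(":memory:")
oltp.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
oltp.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "2020-01-01"), (2, "2020-02-01"), (3, "2020-03-01")])

last_watermark = "2020-01-15"   # persisted from the previous run
rows = oltp.execute(
    "SELECT id, updated_at FROM orders WHERE updated_at > ? ORDER BY updated_at",
    (last_watermark,),
).fetchall()

print("changed rows:", rows)
new_watermark = rows[-1][1] if rows else last_watermark
print("next watermark:", new_watermark)   # saved for the next cycle
```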

49. ODS

low Demand
Here's how ODS is used in ETL Developer jobs:
  • Established relation managing methods, automatic execution of tasks, accessible information transferring, and logical decision making.
  • Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement.
  • Recommended new and modified reporting methods and procedures to improve report content and accuracy.
  • Designed and implemented data verification and testing methods for the entire Data Warehouse.
  • Involved in debugging mappings, recovering sessions and developing error-handling methods.
  • Formulate methods to perform Positive and Negative testing against requirements.
  • Used Partition methods and collecting methods for implementing parallel processing.
  • Used different data transformation methods for data transformations.
  • Worked with business users and developers to develop the model and documentation for the new projects like marketing and ODS building.
  • Audit, Balance & Control that ensure both technical and business validity, integrity and consistency of data in the ODS.
  • Project also involved adding new maps to get the data from HCM to ODS layer and from ODS to DW layer.
  • L.L. Bean is an American private manufacturer of shoes and sporting goods, which privately maintains its own stores and its orders.
  • Implemented appropriate Error handling logic, data load methods and capturing invalid data from the source system for further data cleanup.
  • Documented each and every aspect of BODS including the setup, upgrade, scripts and got great appreciation from the client.
  • Work with other team members to assist any BODS related design and development technical solutions based on business customer requirements.
  • Analyzed, developed strategies and approaches to import and transfer data between source, staging, and ODS/Data Warehouse destinations.
  • Used Oracle's EXPLAIN PLAN optimizer methods like Rule-based or Cost-based to analyze the execution of complex SQL queries.
  • Used SCD (type1, type2 and type3) to Load present and Historical Data to ODS, EDW.
  • Developed Mappings that extract data from ODS to Data mart and monitored the Daily, Weekly and Monthly Loads.
  • Deployed different partitioning methods like Hash by field, Entire, Modulus, and Range for bulk data loading.


50. Database

low Demand
Here's how Database is used in ETL Developer jobs:
  • Created and monitored Database maintenance plans for checking database integrity, data optimization, rebuilding indexes and updating statistics.
  • Promoted database objects from test/development to production servers by coordinating and communicating with production schedules within development team.
  • Performed Database performance tuning, performance monitoring and optimization utilizing Oracle Hints, Explain plans and Table partitioning.
  • Performed testing and implementing of database backup procedures, restore procedures, disaster recovery procedures and contingency plans.
  • Used Erwin, normalization, dimension modeling, and Enterprise Manager for logical and physical database development.
  • Designed database structures for effective data extraction, validation, run detail notification and error logging.
  • Performed database administration tasks which include Installation, Configuration, Maintenance, Monitoring and Troubleshooting.
  • Experienced on GoldenGate integration tool in handling transaction with heterogeneous databases and platforms.
  • Involved in Designing and developing universes for generating reports from warehouse databases.
  • Analyze the production applications and databases periodically for any process improvements.
  • Developed Database Triggers to enforce security also used ref cursors.
  • Review database statistics to recommend and implement performance improvements.
  • Designed, developed and maintained security of relational databases.
  • Involved in database tuning using normalization and application tuning.
  • Developed mappings and loaded data into relational database.
  • Performed query tuning while optimizing database performance.
  • Performed Windows patching on all database environments.
  • Created database triggers for Data Security.
  • Prepare prototyping and database design.
  • Created Database objects including Tables, Triggers, Views, Stored Procedures, Data Modeling, Indexes in SQL Server Database.


The 20 Most Common Skills for an ETL Developer (top eight shown)

Data Warehouse: 18.2%
Informatica: 16.5%
Business Requirements: 7.5%
Target Database: 7%
Unix: 5.9%
Lookup: 4.7%
Windows XP: 4.6%
SQL: 4.4%

Typical Skill-Sets Required For An ETL Developer

Rank  Skill                       Percentage of Resumes
1     Data Warehouse              13.8%
2     Informatica                 12.6%
3     Business Requirements       5.7%
4     Target Database             5.3%
5     Unix                        4.5%
6     Lookup                      3.6%
7     Windows XP                  3.5%
8     SQL                         3.3%
9     PL/SQL                      3.2%
10    Toad                        2.4%
11    Source Qualifier            2.3%
12    Aggregator                  2.3%
13    Update Strategy             2.1%
14    Test Cases                  2%
15    DB2                         1.7%
16    Teradata                    1.7%
17    Mapplet Designer            1.6%
18    Repository                  1.6%
19    Source Systems              1.5%
20    Sequence Generator          1.5%
21    XML                         1.4%
22    Schema                      1.3%
23    Design Documents            1.3%
24    Different Transformations   1.2%
25    SSIS                        1.2%
26    Fact Tables                 1%
27    Business Logic              1%
28    AutoSys                     0.9%
29    QA                          0.9%
30    User Acceptance             0.9%
31    Complex Mappings            0.9%
32    Source Data                 0.9%
33    DataStage                   0.8%
34    UAT                         0.8%
35    Debugger                    0.8%
36    SSRS                        0.8%
37    Worklets                    0.7%
38    Warehouse Designer          0.6%
39    EDW                         0.6%
40    Dimension Tables            0.6%
41    SCD                         0.6%
42    Technical Specifications    0.6%
43    BI                          0.6%
44    Data Analysis               0.5%
45    Parameter Files             0.5%
46    BTEQ                        0.5%
47    Normalizer                  0.5%
48    OLTP                        0.5%
49    ODS                         0.5%
50    Database                    0.5%
