Top 50 Senior ETL Developer Skills

Below we've compiled a list of the most important skills for a Senior ETL Developer. We ranked the top skills based on the percentage of Senior ETL Developer resumes they appeared on. For example, 15.1% of Senior ETL Developer resumes contained Informatica as a skill. Let's find out what skills a Senior ETL Developer actually needs in order to be successful in the workplace.

These are the most important skills for a Senior ETL Developer:

1. Informatica

High Demand
Here's how Informatica is used in Senior ETL Developer jobs:
  • Involved with the Data Steward Team in designing, documenting and configuring Informatica Data Director to support management of MDM data.
  • Managed enhancements to Informatica objects and coordinated them with every release.
  • Developed ETL mappings, transformations using Informatica PowerCenter.
  • Worked extensively with Informatica version control.
  • Worked on fixing invalid Mappings, testing of Stored Procedures and Functions, and Integration Testing of Informatica Sessions.
  • Involved in the development and testing of individual data marts, Informatica mappings and update processes.
  • Installed Informatica, Oracle and all other required software on newly created jump servers.
  • Designed ETL process using Informatica Tool to load from Sources to Targets through data Transformations.
  • Involved in writing UNIX shell scripts for Informatica ETL tool to run the Sessions.
  • Provided detailed migration steps to the Informatica admin for Test and Production deployment.
  • Designed the ETL process and customized templates around Informatica and Oracle.
  • Involved in upgrade of Informatica from 9.1 to 9.5.
  • Developed Informatica SCD type-I, Type-II mappings.
  • Developed sessions using Informatica workflow Manager.
  • Worked on Informatica tools -Source Analyzer, warehouse designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Used Informatica Cloud's Data Replication & Data Synchronization apps to get data from Salesforce to Oracle.
  • Started conversion process initially with Informatica and then added Talend ETL tool later on.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
  • Designed and developed Informatica mappings in Informatica 9.5.1 environment, workflows to load data into Oracle ODS.
  • Experience working with Informatica Analyst, Informatica Developer tools, Informatica Power center and Power exchange.


2. Data Warehouse

High Demand
Here's how Data Warehouse is used in Senior ETL Developer jobs:
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Ensured acceptable performance of the data warehouse processes by monitoring, researching and identifying the root causes of bottlenecks.
  • Provided a generic design for implementing tables whose content was managed in part by end-users of the Data Warehouse.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
  • Designed and customized data models for Data warehouse supporting data from multiple sources on real time.
  • Built ODS and EDW for data warehouse needs of UAM for managing Facets and ACO data.
  • Created a data warehouse and was involved in loading different types of sources to the target Oracle database.
  • Document the process flow and logic to move data from staging to data warehouse.
  • Involved in data profiling and testing data warehouse data and Monitoring the ETL Processes.
  • Worked to load different flat files and relational data to oracle data warehouse.
  • Managed all development and support efforts for the Data Integration/Data Warehouse team.
  • Provided support for the production department in handling the data warehouse.
  • Worked on ORACLE database for loading data into data warehouse.
  • Worked on ETL process of populating the data warehouse.
  • Created data model using ERwin 7.1 to source data for the Staging area for Data Warehouse for Retail customers for Vanguard.
  • Used Informatica PowerCenter 8.6 for extraction, transformation and load (ETL) of data in the data warehouse.
  • Used Informatica designer to create complex mappings using different transformations to move source data to a Data Warehouse.
  • Involved in creating informatica mapping to populate staging tables and data warehouse tables from Mainframe sources.
  • Developed Informatica Mappings, Re-usable Transformations, and Mapplets for data load to data warehouse and database (oracle).
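Many of the loads described above come down to moving rows from a staging area into warehouse tables while resolving surrogate keys. As a rough illustration, and assuming hypothetical stg_sales, dim_customer and fact_sales tables, such a staging-to-warehouse load can be sketched in SQL as:

    -- Illustrative only: table and column names are placeholders, not from any specific project above.
    INSERT INTO fact_sales (cust_key, sale_dt, sale_amt)
    SELECT d.cust_key,                        -- surrogate key resolved from the dimension
           s.sale_dt,
           s.sale_amt
    FROM   stg_sales s
    JOIN   dim_customer d ON d.cust_id = s.cust_id
    WHERE  s.sale_dt >= TRUNC(SYSDATE) - 1;   -- incremental window for a daily load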


3. Business Requirements

High Demand
Here's how Business Requirements is used in Senior ETL Developer jobs:
  • Studied the software requirements specifications and gathered the business requirements, analyzed them from functional perspective.
  • Interacted with various business people and gathered the business requirements and translated them into technical specifications.
  • Performed analysis of Business requirements and interpreted transformation rules for all target data objects.
  • Designed and implemented different validation mappings meeting the business requirements and data quality.
  • Created unit testing scripts and representative unit testing data based on business requirements.
  • Modified existing mappings for enhancements of new business requirements.
  • Developed ETL specification based on business requirements.
  • Worked with business owners by scheduling several summit sessions to gather business requirements and rules to start the conversion project.
  • Worked with SCPM business user to collect the business requirements and converted them to ETL specifications.
  • Interacted with Business Analysts to understand the business requirements and types of reports being generated.
  • Modified existing mappings for enhancements of new business requirements through PCR's and DMR's.
  • Worked with source system owners to identify data source based on business requirements.
  • Applied Slowly Changing Dimensions Type I and Type II on business requirements.
  • Involved in discussion of user and business requirements with business team.
  • Explain business requirements in terms of technology to the developers.
  • Involved in gathering the business requirements from Business Analyst.
  • Interacted with business users at various levels and converted their business requirements into technical specifications.
  • Created sessions, workflows for the mapping to run daily based on the business requirements.
  • Developed ETL programs using Informatica Power center 9.6.1/9.5.1 to implement the business requirements.
  • Develop mappings and workflows as per business requirements using Power Center 9.5.1.


4. PL/SQL

High Demand
Here's how Pl/Sql is used in Senior ETL Developer jobs:
  • Created several Procedures, Functions, Triggers and Packages to implement the functionality in PL/SQL.
  • Worked on different projects using Oracle forms, PL/SQL, and WebFocus Reporting.
  • Created procedures, functions and database triggers to enforce Business rules using PL/SQL.
  • Implemented PL/SQL scripts in accordance with the necessary Business rules and procedures.
  • Worked with PL/SQL Joins, Stored procedures and functions whenever needed.
  • Worked extensively on SQL, PL/SQL, and UNIX shell scripting.
  • Coded PL/SQL stored procedures and successfully used them in the mappings.
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.
  • Developed Oracle PL/SQL Package, procedure, function and trigger.
  • Developed several PL/SQL procedures, functions to provide data consistency.
  • Developed PL/SQL and UNIX Shell Scripts using VI editor.
  • Created PL/SQL Procedures and Packages.
  • Created SQL blocks and PL/SQL routines.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance as part of pre- and post-session management.
  • Developed Informatica mappings and also tuned them for better performance with PL/SQL Procedures/Functions to build business rules to load data.
  • Converted ETL code in Perl to PL/SQL; provided effective support in delivering process and product change improvement solutions.
  • Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.
  • Involved in writing PL/SQL scripts for rollback when a session fails in workflow.
  • Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
  • Converted existing PL/SQL Packages to ETL Mappings using Informatica Power Center.
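Several bullets above mention PL/SQL procedures that the ETL tool calls before a load, for example to truncate a staging or target table at run time. A minimal sketch of that kind of helper, with a placeholder table name, might be:

    -- Hypothetical pre-load procedure: empties a staging table before the session writes to it.
    CREATE OR REPLACE PROCEDURE purge_stg_orders AS
    BEGIN
      EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_orders';
    END purge_stg_orders;
    /

A session would then invoke it as a pre-session stored procedure call, or manually with EXEC purge_stg_orders in SQL*Plus.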


5. Unix

High Demand
Here's how Unix is used in Senior ETL Developer jobs:
  • Coordinated any required Unix related scripts in support of the ETL deployment working with the ETL Admin.
  • Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.
  • Created UNIX Shell scripts to automate the process of generating and consuming the flat files.
  • Developed Unix scripts to automate different tasks involved as part of loading process.
  • Developed UNIX Shell scripts to implement the ETL load and integration management process.
  • Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts.
  • Automated Batch and Session schedule using PMCMD and Unix Shell scripts.
  • Generated UNIX shell scripts for automating daily load processes.
  • Created UNIX Shell Scripting for automation of ETL process.
  • Automated the entire processes using UNIX shell scripts.
  • Worked on modification of UNIX shell scripts.
  • Automated the process through UNIX Shell scripting.
  • Loaded data from files to Netezza tables (stage area) using NZLOAD utility & to HDFS files using UNIX scripts.
  • Scheduled various daily and monthly ETL loads using Control-M; involved in writing UNIX shell scripts to run and schedule batch jobs.
  • Worked on Unix OS and wrote shell scripts to start workflows and archive files.
  • Worked on Parameterize of all variables, connections at all levels in UNIX.
  • Involved in writing UNIX Shell scripts to invoke the Workflows at scheduled timings.
  • Handled UNIX operating system tasks by generating Pre and Post-Session UNIX Shell scripts.
  • Created various UNIX shell scripts for job automation of data loads.
  • Designed and developed Informatica mappings, workflows and UNIX shell scripts.


6. Target Database

High Demand
Here's how Target Database is used in Senior ETL Developer jobs:
  • Configure Database and ODBC connectivity to various source/target databases.
  • Improved XML file creation and parsing performance by introducing properly tuned indexes before the data was populated into the target database.
  • Used version mapping to update the slowly changing dimensions to keep full history to the target database.
  • Source databases were SAP and PeopleSoft; target databases were flat files and Oracle 9i.
  • Performed Verification, Validation, and Transformations on the Input data before loading into target database.
  • Extracted data from Sales department to flat files and load the data to the target database.
  • Optimized mappings, sessions/tasks, source, and target databases as a part of performance tuning.
  • Developed and used the SQL Queries to validate the data in both source and target databases.
  • Worked in all phases of Data Integration from heterogeneous sources, legacy systems to Target Database.
  • Created target definition in oracle which was the target database.
  • Maintain the target database in the production and testing environments.
  • Configured Informatica Server to generate control and data files to load data into target database using SQL Loader utility.
  • Worked on different environments with different source and target databases like Teradata, DB2, and SQL server.
  • Executed Pre and Post session commands on Source and Target database using Shell Scripting.
  • Created ETL/Talend jobs both design and code to process data to target databases.
  • Created and Monitored Workflows/Sessions using SSIS Workflow Manager/Monitor to load data from different Sources to target Database.
  • Validated the mappings, sessions and workflows, and generated and loaded the data into the target database.
  • Worked on Teradata SQL Assistant for querying the source and target databases to validate the BTEQ scripts.
  • Created Informatica sessions in workflow manager to load the data from staging to Target database.
  • Created and scheduled workflows using Workflow Manager to load the data into the Target Database.
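One common way to validate a load between source and target databases, as several bullets describe, is to compare row counts across a database link. A hedged example, where src_link and the orders table are assumptions for illustration:

    -- Counts should match after the load; src_link is a hypothetical database link to the source system.
    SELECT (SELECT COUNT(*) FROM orders@src_link) AS source_rows,
           (SELECT COUNT(*) FROM orders)          AS target_rows
    FROM   dual;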


7. Lookup

High Demand
Here's how Lookup is used in Senior ETL Developer jobs:
  • Worked on writing Scripts, Functions and used Lookup_Ext functions to reference the reference table data.
  • Designed and developed new mappings using Connected, Unconnected Lookups and Update strategy transformations.
  • Developed SQL overrides in Source Qualifier and Lookup transformations according to business requirements.
  • Created Connected, Unconnected and Dynamic Lookup transformations for better performance.
  • Worked extensively with dynamic cache with the connected lookup Transformations.
  • Implemented slowly changing dimension using dynamic lookup.
  • Created source, target, lookups, transformation, session, batches and defined schedules for those batches and sessions.
  • Created connected and unconnected Lookup transformations to look up the data from the source and target tables.
  • Populated Company and Location Cross reference Table/Hash files for Data Lookups and to update Company Master Table.
  • Developed override SQL statements in Source Qualifier and Lookups to suit extraction of desired data.
  • Implemented Aggregate, Filter, Join, Expression, Lookup and Update Strategy transformations.
  • Worked with SQL Override in the Source Qualifier and Lookup transformation.
  • Created source & target lookup transformations, session and batches.
  • Loaded data into load, staging and lookup tables.
  • Used sorter transformation and newly changed dynamic lookup.
  • Worked with Connected and Unconnected Lookups.
  • Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier Transformation using the Informatica designer.
  • Worked on Informatica DVO tool as Trail Version to compared data for ETL Testing using Lookups, filters.
  • Interact with system analysts to understand the requirements Developed various kinds of mappings to load landing , lookup and reference tables.
  • Worked with various lookups: connected lookup, unconnected lookup, static cache, and dynamic cache lookups.
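Lookup transformations of the kind listed above frequently carry a SQL override so that the lookup cache holds only the rows actually needed, for instance the current version of each dimension record. A minimal sketch of such an override, with placeholder names:

    -- Restrict the lookup cache to active rows instead of caching the full dimension table.
    SELECT cust_key,
           cust_id,
           cust_name
    FROM   dim_customer
    WHERE  current_flag = 'Y'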


8. SQL

High Demand
Here's how SQL is used in Senior ETL Developer jobs:
  • Involved in Performance Tuning of mappings, SQL overrides and Stored Procedures and also by introducing Indexes and Caching.
  • Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
  • Developed the control files to load various sales data into the system via SQL*Loader.
  • Worked on Database level tuning and SQL Query tuning for the Data warehouse.
  • Converted old data from Flat files to Oracle database making use of SQL*Loader.
  • Involved in writing SQL scripts, stored procedures and functions and debugging them.
  • Used relational SQL wherever possible to minimize the data transfer over the network.
  • Used SQL to query Databases for Performing various validations and mapping activities.
  • Worked on SQL stored procedures, functions and packages in Oracle.
  • Integrated external functionality using web services; developed mapping parameters and variables to support SQL overrides.
  • Worked on the mapping specification documents and provided complex SQL queries to the QA team, which helped in testing data.
  • Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
  • Optimized performance by tuning the Informatica ETL code as well as SQL.
  • Created packages to migrate data from Oracle to SQL Server.
  • Experienced in loading data between Netezza tables using NZ_SQL utility.
  • Created SQL scripts to remove redundant data from base tables.
  • Created customized SQL code to enhance the process.
  • Used components like run program and run sql components to run UNIX and SQL commands in Ab-Initio and Pentaho.
  • Performed Unit testing by generating SQL scripts based on the pre-defined test plans and moved Data into QA.
  • Worked on performance tuning of Informatica workflows and SQL queries.
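For the SQL tuning work described above, a typical first step is to inspect the execution plan of a slow statement. In Oracle this can be done with EXPLAIN PLAN; the query below is only a placeholder:

    EXPLAIN PLAN FOR
      SELECT o.order_id, c.cust_name
      FROM   orders o JOIN customers c ON c.cust_id = o.cust_id
      WHERE  o.order_dt >= DATE '2016-01-01';

    -- Display the plan just captured (look for full table scans and join order problems).
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);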


9. Source Qualifier

High Demand
Here's how Source Qualifier is used in Senior ETL Developer jobs:
  • Designed SQL overrides in source qualifier according to business requirements.
  • Experience in Source Qualifier for SQL override, performing joins, Filtering data based on columns, and distinct option.
  • Created mappings for initial load in Power Center Designer using the transformations Expression, Router and Source Qualifier.
  • Implemented different Performance tuning methods such as filtering the data right at the Source Qualifier, implemented Partitioning.
  • Optimized and Tuned SQL queries used in the source qualifier of certain mappings to eliminate Full Table scan.
  • Involved in Fine tuning SQL overrides in Source Qualifier and Look-up SQL overrides for performance Enhancements.
  • Used database links to connect to remote database and used them in source qualifier join queries.
  • Created partitions, SQL overrides in the source qualifier, and session partitions for improving performance.
  • Worked with DBA for partitioning and creating indexes on tables used in source qualifier queries.
  • Worked on XML Source Qualifier to extract the data from XML files.
  • Involved in tuning the mappings, sessions and the Source Qualifier query.
  • Tuned SQL, PL/SQL Queries in Source qualifier Transformation for better performance.
  • Developed SQL overrides in Source Qualifier/Lookup according to business requirements.
  • Used Source Qualifier & Lookup SQL overrides, Persistent caches, Incremental Aggregation etc for better performance.
  • Created pre-SQL and post-SQL queries in Source Qualifier transformation to improve the performance of data extraction.
  • Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
  • Created mapping documents explaining the logic (Source Qualifier SQL, joins, etc.).
  • Used SQL overrides in Lookups and source filters in the Source Qualifier; used parameter files to initialize the sessions and session variables.
  • Worked on frequently used transformations such as the Source qualifier, Aggregator, Lookup, Filter, Sequence and Router.
  • Used various transformations like Source Qualifier, Expression, filter, Aggregator, Lookup and SQL stored procedures.
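A Source Qualifier SQL override of the kind referenced above usually joins related source tables and filters rows in the database so that less data reaches the mapping. A sketch with invented table names:

    -- Join and filter at the source so only posted, recent orders are read into the mapping.
    SELECT o.order_id,
           o.order_dt,
           o.order_amt,
           c.cust_id,
           c.region_cd
    FROM   orders o
    JOIN   customers c ON c.cust_id = o.cust_id
    WHERE  o.status = 'POSTED'
      AND  o.order_dt >= DATE '2016-01-01'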


10. Toad

High Demand
Here's how Toad is used in Senior ETL Developer jobs:
  • Programmed PL/SQL code to implement business rule through Procedures, Triggers, Functions and Packages using SQL*PLUS and Toad editors.
  • Created scripts to create new tables, views, queries for new enhancement in the application using TOAD.
  • Worked on SQL tools like TOAD, SQL Developer to run SQL queries to validate the data.
  • Experience in coding using SQL, TOAD, SQL*Plus, PL/SQL procedures/functions, triggers and exceptions.
  • Used TOAD to run PL-SQL queries and validate the data in warehouse and mart.
  • Validated the test results by executing the queries using the Toad software.
  • Accessed the data, creation of stored procedures and functions using TOAD.
  • Analyzed and Debugged the procedures and packages using TOAD and SQL Developer.
  • Used Toad for data quality verification and unit testing the data.
  • Used Toad to write queries and interact with Oracle database.
  • Used TOAD and SQL navigator extensively.
  • Used TOAD to develop oracle PL/SQL, DDL's, and Stored Procedures.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Validated the data by writing the SQL queries in the TOAD.
  • Created IDQ rules to profile, cleanse, standardize & match and merge the data Used TOAD to evaluate SQL execution.
  • Used SQL tools like TOAD in order to validate the data against various databases like Oracle.
  • Used TOAD for Oracle in creating and executing SQL queries for testing the ETL process.
  • Profiled Data using Informatica Developer and SQL Queries in Toad.
  • Power Center 9.1, Power Center IDQ, Oracle 10g, SQL Server, Flat files, UNIX, Autosys, JIRA, TOAD.
  • Performed unit testing and tuned the Informatica mappings for better performance; used TOAD to run SQL queries.


11. Aggregator

High Demand
Here's how Aggregator is used in Senior ETL Developer jobs:
  • Configured incremental aggregator transformation functions to improve the performance of data loading.
  • Worked with Data and Index caches for better throughput when working with rank, Joiner, sorter and aggregator transformations.
  • Worked extensively with complex mappings using expressions, aggregators, filters and procedures to develop and feed Data Marts.
  • Worked with different transformations like Stored Procedure, Filter, Expression, Aggregator and Joiner.
  • Worked extensively with mappings using Expressions, Aggregators, Filters, Lookup, Joiner, Update Strategy and Stored Procedure transformations.
  • Used the mapping operators like sorter, Lookup, aggregator, expression, joiner, filter in the creation of mappings.
  • Used lookup transformation (Static as well as Dynamic), aggregator transformations, Update Strategy, Filter and Router Transformations.
  • Worked with Index Cache and Data Cache in cache using transformation like Rank, Lookup, Joiner, and Aggregator Transformations.
  • Worked with static and dynamic caches for the better throughput of sessions containing Rank, Lookup, Joiner and aggregator transformations.
  • Used transformations like router, filter, joiner, lookup, stored procedure, source qualifier, aggregator and update strategy.
  • Created complex mappings using Unconnected Lookup, Sorter, and Aggregator and Router transformations for populating target tables in efficient manner.
  • Used Router, Filter, Sorter, Sequence generator, aggregator, lookup transformation, expression transformation extensively.
  • Worked with Lookup, Aggregator, Expression, Router, Filter, Update Strategy, Stored Procedure transformations.
  • Designed and developed Aggregator, Lookup, Joins, and Merge transformations according to business rules.
  • Developed various Datastage jobs using ODBC, Hashed file, Aggregator and Sequential file stages.
  • Worked on transformations like Source qualifier, aggregator, Joiner, Lookup, Update strategy.
  • Worked with mappings expressions, aggregators, filters, lookups and stored procedure mappings.
  • Worked with aggregator, lookup, filter, router and update strategy transformations.
  • Developed complex Informatica Mappings coding with transformations like lookup, router, aggregator, expression, update strategy, joiner etc.
  • Created complex Mappings using union, xml, lookup, router, transaction control, aggregator, sorter and sql transformations.
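The Aggregator transformations mentioned above compute grouped summaries; the same logic expressed directly in SQL (again with placeholder names) is a plain GROUP BY, which is also what gets pushed to the database when aggregation is done at the source:

    -- Equivalent of an Aggregator with a group-by port on region_id.
    SELECT region_id,
           COUNT(*)       AS order_cnt,
           SUM(order_amt) AS total_amt,
           MAX(order_dt)  AS last_order_dt
    FROM   orders
    GROUP BY region_id;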


12. Update Strategy

High Demand
Here's how Update Strategy is used in Senior ETL Developer jobs:
  • Created update strategy and stored procedure transformations to populate targets based on business requirements.
  • Worked extensively with update strategy transformation for implementing inserts and updates.
  • Designed and developed the logic for handling slowly changing dimension table's load by flagging the record using update strategy.
  • Used Update Strategy DD_INSERT, DD_UPDATE to insert and update data for implementing the Slowly Changing Dimension Logic.
  • Configured the mappings to handle the updates to preserve the existing records using Update Strategy Transformation.
  • Worked on transformation Source Qualifier, Expression, Router, Filter and Update Strategy.
  • Used the update strategy to effectively migrate data from source to target.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation.
  • Developed various transformations like Source Qualifier, Update Strategy, Lookup transformation, Expressions for loading the data into target table.
  • Designed Complex Mappings, used lookup (Connected and Unconnected), Update Strategy and Filter transformations for loading data.
  • Performed data manipulations using various Informatica Transformations like Aggregate, Filter, Update Strategy, and Sequence Generator etc.
  • Used transformations like lookup, update strategy, expression, filter, router, aggregate, sequence generator.
  • Used lookup stage with reference to Oracle tables for insert/update strategy and updating of slowly changing dimensions.
  • Implemented SCD Type 2 Slowly Changing Dimension Mappings using Lookup and Update Strategy transformations.
  • Worked with SCD tables using Lookup and Update Strategy transformations.
  • Worked on various transformations like Lookup, Aggregator, Expression, Router, Filter, Update Strategy, and Sequence Generator.
  • Utilized lookup and update strategy transformations to lookup values from different tables and update slowly changing dimensions.
  • Used various transformations like Source Qualifier, Joiner, Lookup, sql , router, Filter, Expression and Update Strategy.
  • Created complex Informatica mappings using transformations like lookup, sorter, aggregator, router, joiner, update strategy etc.
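The DD_UPDATE/DD_INSERT flags used in the SCD Type 2 loads above implement "expire the current row, then insert the new version" logic. Sketched in plain SQL with invented names (the :cust_id and :cust_name binds stand for values on the incoming record, and dim_customer_seq is an assumed surrogate-key sequence):

    -- Close out the current version of a changed customer row...
    UPDATE dim_customer
    SET    eff_end_dt   = SYSDATE,
           current_flag = 'N'
    WHERE  cust_id = :cust_id
      AND  current_flag = 'Y';

    -- ...then insert the new version as the current record.
    INSERT INTO dim_customer (cust_key, cust_id, cust_name, eff_start_dt, eff_end_dt, current_flag)
    VALUES (dim_customer_seq.NEXTVAL, :cust_id, :cust_name, SYSDATE, NULL, 'Y');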


13. Windows XP

High Demand
Here's how Windows XP is used in Senior ETL Developer jobs:
  • Experience in usage of Operating Systems such as UNIX, WINDOWS XP/NT/2000/98/95.


14. Parameter Files

High Demand
Here's how Parameter Files is used in Senior ETL Developer jobs:
  • Worked with mapping parameters, variables and parameter files and designed the ETL to create parameter files to make it dynamic.
  • Experience working with UNIX Shell Scripts for automatically running sessions, aborting sessions and creating parameter files.
  • Used Parameter files to define values for parameter and variable used in the mappings and sessions.
  • Used session parameters and parameter files to reuse sessions for different relational sources or targets.
  • Developed various sessions, batches containing parameter files, indicator files and multiple source files.
  • Used Mapping Parameters and Variables, Dynamic parameter files, Shortcuts, Reusable Transformations.
  • Used Parameter files to reuse the mapping with different criteria to decrease the maintenance.
  • Set up the environment with Parameter files and Audit Balance Control mechanism.
  • Used Parameter Files defining variable values for mapping parameters and variables.
  • Used UNIX to create Parameter files and for real time applications.
  • Created Unix script to load flat file and parameter files.
  • Created Parameter files and parameterized all jobs.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Created Parameter files and validation scripts, Created Reusable and Non-Reusable command task in Workflow Manager.
  • Uploaded/modified Customer Parameter files in UNIX to use variables for Workflow sessions.
  • Created parameter files to Dev, Test and Prod environments.
  • Created and maintained parameter files for workflows in UNIX.
  • Migrated and tested Informatica Folders, Parameter files, Shell Scripts from Informatica PowerCenter 9.1.0 to 9.6.1 and 10.1.1.
  • Parameterized using mapping, session parameters, and workflow variables by defining them in parameter files.
  • Coded Netezza 'nzsql' scripts, UNIX shell scripts to handle parameter files.
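The parameter files mentioned above are plain-text files that assign values to mapping parameters and connection/session variables per workflow or session, so the same mapping can be reused across environments. A small, purely illustrative sketch of the general layout (the folder, workflow, session and parameter names here are invented):

    [SALES_DW.WF:wf_daily_load.ST:s_m_load_orders]
    $DBConnection_Source=ORA_SRC_PROD
    $DBConnection_Target=ORA_DW_PROD
    $$LOAD_DATE=2016-01-01
    $$REGION_CD=EAST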


15. Test Cases

Average Demand
Here's how Test Cases is used in Senior ETL Developer jobs:
  • Worked with the Quality Assurance team to build the test cases to perform unit, Integration, functional and performance Testing.
  • Developed unit and system test cases, using System Procedures to check data consistency with adherence to the data model defined.
  • Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort.
  • Created test cases for Unit test, System Integration test, regression test and UAT to check the data quality.
  • Analyzed and executed the test cases for various phases of testing - integration, regression and user acceptance testing,
  • Created effective Test Cases and did Unit and Integration Testing to ensure the successful execution of data loading process.
  • Managed writing test cases and test scenarios from requirement for newly added features and executing test scripts.
  • Participated in testing in preparation of test plans and test cases to ensure the requirements are testable.
  • Provided support and quality validation through test cases for all stages of Unit and Integration testing.
  • Involved in creating Run books, Migration docs, Unit test plans and test cases.
  • Completed the Unit Testing (UT) and generated the Unit Test Cases.
  • Created Run books, Migration instructions, Unit testing plans and test cases.
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Created test cases for unit testing and functional testing.
  • Prepared Unit Test cases and Integration Test cases.
  • Created Test Cases and Test Data.
  • Worked on test cases and use cases creating Unit Test Plans and unit Testing on the ETL Packages and Workflows.
  • Created and executed unit test cases for various scenarios for all the mappings, workflows and scripts.
  • Created check lists, Procedure documentations and Test cases for Informatica installations and migration.
  • Reviewed Informatica mappings and test cases before delivering to Client.
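Many of the ETL unit and integration test cases above reduce to SQL checks that the target holds exactly the data the source supplied. A typical hedged example, using placeholder staging and fact tables, compares the two sets in both directions:

    -- Rows present in the source extract but missing from the target (should return no rows)...
    SELECT cust_id, order_id, order_amt FROM stg_orders
    MINUS
    SELECT cust_id, order_id, order_amt FROM fact_orders;

    -- ...and rows in the target that never came from the source.
    SELECT cust_id, order_id, order_amt FROM fact_orders
    MINUS
    SELECT cust_id, order_id, order_amt FROM stg_orders;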


16. Mapplet Designer

Average Demand
Here's how Mapplet Designer is used in Senior ETL Developer jobs:
  • Worked on Source Analyzer, Data warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer and Repository Manager.
  • Used Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer & Transformation Developer for various data extractions.
  • Designed mappings by using the Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
  • Worked on Power Center Designer client tools like Source Analyzer, Target Analyzer, Mapping Designer and Mapplet Designer.
  • Worked extensively with Designer tools like Source Analyzer, Transformation Developer, Mapping and Mapplet Designers.
  • Used Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer.
  • Used mapping and mapplet designer to generate different mappings for different loads.
  • Created Reusable Transformations and Mapplets in the designer using transformation developer and Mapplet designer according to the business requirements.
  • Created reusable transformations and mapplets in the designer using transformation developer and mapplet designer tools.
  • Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer, Mapping & Mapplet Designer and Transformation Designer.
  • Based on the business requirements Reusable transformations are created in transformation developer and Mapplets in the Mapplet designer.
  • Worked on Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, Workflow Manager and Workflow Monitor.
  • Created reusable mapplets using Mapplet Designer which involved date conversions.
  • Created Mapplets with the help of Mapplet Designer.
  • Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 9.1.
  • Developed mappings/Reusable Objects/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica PowerCenter 8.6.1/8.1.1.
  • Worked on source analyzer, Target Designer, Mapping and Mapplet Designer, Workflow manager and Workflow Monitor.
  • Created Mapplets with the help of Mapplet Designer and using these Mapplets in various mappings.
  • Created mapplets using Mapplet Designer and used those Mapplets for reusable business process in development.
  • Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and Mapplet designer in Informatica PowerCenter 10.1.1, 9.6.1 ,9.1.0.


17. Repository

Average Demand
Here's how Repository is used in Senior ETL Developer jobs:
  • Migrated repository objects, services and scripts from development environment to production environment.
  • Maintain Development, Test and Production mapping migration Using Repository Manager.
  • Worked with export and import utilities in Repository manager.
  • Design and implementation of a Metadata Repository.
  • Worked with DataStage Manager to import/export metadata, jobs, and routines from repository and also created data elements.
  • Utilized Repository Manager to create Repository, User Groups and managed users by setting up Privileges.
  • Created the repository manager, users, user groups and their access profiles.
  • Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.
  • Created repositories, folders, and assigned permissions to users using Repository Manager.
  • Implemented an Operational Data Store (ODS), which merges multiple loan origination systems into a single foundational repository.
  • Tested and migrated code over Dev, QA and Production of Repository, Dashboard Reports.
  • Created source table definitions in the Informatica repository by studying the data sources.
  • Used Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Repository Server Administration Console.
  • Copied/Exported/Imported the mappings/sessions/ worklets /workflows from development to Test Repository and promoted to Production.
  • Worked with Power Center Client Tools like Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Worked with Workflow Manager, Workflow Monitor, Designer and Repository Manager, Source Analyzer and Warehouse Designer.
  • Used Informatica workflow manager, monitor, and repository manager to execute and monitor workflows and assign user privileges.
  • Worked on Informatica PowerCenter tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Worked Extensively on Informatica modules -Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Worked exclusively on Informatica Power Center client tools - Repository Manager, Designer, Workflow Manager & Workflow Monitor.


18. XML

Average Demand
Here's how XML is used in Senior ETL Developer jobs:
  • Worked extensively on different sources COBOL (CopyBooks), Oracle, SQL Server, Flat file, XML.
  • Developed complex transformations like 'transaction control', Java, XML parser, calling Sync services through WSDL.
  • Collect and link metadata from diverse sources, including relational databases Oracle, XML and flat files.
  • Transform the Policy data into XML and send to rating engine for Premium validations using XML generator.
  • Developed Ab-Initio graphs to process XML files and load data into IQ database tables.
  • Extracted/loaded data from/into diverse source/target systems like Oracle, XML and Flat Files.
  • Involved in XML parsing and loading XML files into the database.
  • Extracted data from XML files and loaded it into relational Databases.
  • Worked with XSD and XML files generation through ETL process.
  • Used MS Visual SourceSafe to save mappings as XML files.
  • Used XML sources to load data into relational tables.
  • Used XML spy tool to validate the input source XML files.
  • Exported reports into PDF, CSV, and XML formats.
  • Extract current and historical data load from legacy system and load into policy admin system using Informatica PowerCenter XML generator.
  • Experience with different types of sources and targets Flat files, Relational Tables, VSAM Files, XML files etc.
  • Based on the payments present in the XML there will be a backend file generated for each payment type.
  • Created Mappings to get the data loaded from flat files, xml files during the Stage Process.
  • Worked with XML sources as well as midstream parsers for creating mappings based on the XML specifications.
  • Used Informatica B2B Data Exchange to handle structured data like XML.
  • Designed source-to-target mappings from Excel/flat files and XML files to Oracle using Informatica Power Center.
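Where XML sources are parsed and loaded into relational tables, as described above, the shredding step can also be expressed in SQL. A minimal Oracle XMLTABLE sketch, in which the XML structure, xml_stage table and column names are assumptions for illustration:

    -- xml_stage is a hypothetical staging table holding one XMLTYPE document per row.
    SELECT x.policy_id,
           x.premium
    FROM   xml_stage s,
           XMLTABLE('/Policies/Policy'
                    PASSING s.xml_doc
                    COLUMNS policy_id VARCHAR2(20) PATH 'PolicyId',
                            premium   NUMBER       PATH 'Premium') x;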


19. Teradata

Average Demand
Here's how Teradata is used in Senior ETL Developer jobs:
  • Resolved chronic integration/data migration issues with Teradata database.
  • Lead the code development based on the Design, using various Teradata utilities, UNIX Shell scripting.
  • Extracted data from Heterogeneous source systems like Oracle, Teradata, SQL Server and flat files.
  • Worked extensively with Teradata utilities (MLOAD, TPUMP and FAST LOAD) to load data.
  • Involved in migration project to migrate data from data warehouses on Oracle/DB2 to Teradata.
  • Worked on all major databases like Teradata, Oracle, DB2 and SQL Server.
  • Worked with heterogeneous sources like Flat files, Oracle, Teradata.
  • Worked on Teradata TPT Loader to load the target tables.
  • Involved in the Teradata upgrade process from TD 12 to TD 14.
  • Worked on Teradata database and developed Views, tables.
  • Used BTEQ, FLOAD, MLOAD, TPT Teradata utilities to export and load data.
  • Used Teradata Administrator and Teradata Manager Tools for monitoring and control the systems.
  • Loaded data from various data sources into Teradata production and development warehouse using BTEQ, FastExport, multi load and FastLoad.
  • Extracted data from source systems to a staging database running on Teradata using utilities like MultiLoad and Fast Load.
  • Involved in the Informatica, Teradata and Oracle upgrade process and tested the environment during the upgrade.
  • Provide technical support to ETL applications on Informatica, UNIX and Teradata.
  • Used Sqoop to export data from HDFS to Teradata database.
  • Worked on BTEQ and multiload scripts to load Teradata tables.
  • Developed many Informatica mappings to replace DB2 targets with Teradata tables using Informatica 9.1.
  • Used Teradata utilities fast load, multi load, tpump to load data.
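The Teradata work above centers on staging tables loaded with FastLoad/MultiLoad/TPump and queried through BTEQ. A hedged sketch of the kind of Teradata DDL involved, with invented table and column names:

    -- MULTISET allows duplicate rows (a SET table would reject them); PRIMARY INDEX drives data distribution.
    CREATE MULTISET TABLE stg_claims
    ( claim_id   INTEGER
    , member_id  INTEGER
    , claim_amt  DECIMAL(12,2)
    , load_dt    DATE
    )
    PRIMARY INDEX (claim_id);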


20. DB2

Average Demand
Here's how DB2 is used in Senior ETL Developer jobs:
  • Developed mappings to extract data from different sources like DB2 and XML files to be loaded into the Data Mart.
  • Extracted data from different sources like Oracle, SQL Server, DB2 and Flat files loaded into DWH.
  • Created ETL processes to extract data from Mainframe DB2 tables for loading into various oracle staging tables.
  • Involved working on IBM DB2 database, designing and creating tables, relations, constraints, indexes.
  • Used Power Exchange utility to read DB2 files in order to load customer table.
  • Loaded the aggregated data onto DB2 for reporting on the dashboard.
  • Export and import table definitions using DB2 plug-ins for various purposes.
  • Migrated data from flat files to DB2 using DB2 load utilities.
  • Extracted data from relational databases DB2, Oracle and Flat Files.
  • Experienced in writing Queries against DB2 on AS400 Environment.
  • Extracted data from several source systems like Oracle, DB2, flat files, XML files, etc.
  • Worked on Flat Files and XML, DB2, Oracle as sources.
  • Worked on SQL Server, Db2 sources and FLAT files.
  • Designed the ETL processes using Informatica to load data from DB2, SQL Server and Flat files to the Target Database.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, DB2, Sybase, MS Access and Flat Files.
  • Worked on data integration of different sources like Teradata, Oracle, SQL Server, DB2 UDB and flat files.
  • Loaded data from multiple sources into staging using Informatica - DB2, Flat files, XML.
  • Created Informatica mappings for data migration from DB2 to Oracle.
  • Extracted data from different source systems - Oracle, DB2, My Sql, Flat Files and XML Files.
  • Worked with Informatica power centre 9.5 to extract the data from IBM Mainframes DB2 sources into Teradata.


21. Sequence Generator

Average Demand
Here's how Sequence Generator is used in Senior ETL Developer jobs:
  • Used database objects like Sequence generators and Stored Procedures for accomplishing the Complex logical situations.
  • Developed stored procedures for generic values instead of using the sequence generator transformations.
  • Used Sequence generator transformation for Surrogate keys acting as PK in the dimension table.
  • Used various transformations like Filter, Expression, Sequence Generator, Update strategy.
  • Created synonyms for copies of time dimensions, used the sequence generator.
  • Used Sequence Generator Transformation to create surrogate keys for dimension tables.
  • Created stored procedures to use Oracle-generated sequence numbers in mappings instead of using the Informatica Sequence Generator.
  • Developed mappings using connected/unconnected lookup, router, joiner, expression, filter, sequence generator transformations.
  • Created complex mappings using unconnected/connected Lookups, Aggregate, Router, Expression, Update strategy, Sequence generator transformations etc.
  • Source Qualifier, Joiner transformation, Update Strategy, Expressions, Aggregator, Sequence Generator.
  • Worked with Lookup Dynamic caches and Sequence Generator cache.
  • Worked extensively with various passive transformations in Informatica PowerCenter like Expression Transformation, Sequence Generator Transformation and Lookup Transformation.
  • Source Qualifier, Joiner transformation, Update Strategy, Lookup transformation, Rank transformation, Expressions, Aggregator, and Sequence Generator.
  • Used most of the transformations such as Expression, Lookup, Joiner, Router, Filter, Aggregator and Sequence Generators.
  • Used various transformations like Aggregator, Expression, Lookup, Rank, Update Strategy, Stored procedure and Sequence Generator.
  • Created various Informatica jobs using transformations such as Update Strategy, Sorter, Aggregator, Router and Sequence Generator etc.
  • Worked extensively on transformations like Lookups, Aggregator, Update Strategy, Stored Procedure, Sequence generator, Joiner transformations.
  • Used various transformations like Unconnected /Connected Lookup, Aggregator, Expression Joiner, Sequence Generator, Router etc.
  • Used transformations like Connected and Unconnected lookups, Aggregator, Expression, Update, Router and Sequence generator.
  • Filter, Expression, Joiner, Router, Aggregator, Sequence Generator and Lookup.
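A few of the bullets above replace the Informatica Sequence Generator with a database sequence as the source of surrogate keys. A minimal Oracle sketch of that approach, with illustrative names:

    -- Database-side surrogate key source, used instead of a Sequence Generator transformation.
    CREATE SEQUENCE dim_product_seq START WITH 1 INCREMENT BY 1 CACHE 1000;

    -- New dimension rows pick up the next key as they are inserted from staging.
    INSERT INTO dim_product (prod_key, prod_cd, prod_name)
    SELECT dim_product_seq.NEXTVAL, s.prod_cd, s.prod_name
    FROM   stg_product s;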


22. Source Systems

Average Demand
Here's how Source Systems is used in Senior ETL Developer jobs:
  • Developed data Mappings between source systems and warehouse components using Mapping Designer.
  • Created landing tables, base tables, staging tables according to the data model and number of source systems.
  • Designed the Data Warehousing ETL procedures for extracting the data from all source systems to the target system.
  • Walked through the Logical and Physical Data Models of all source systems for data quality analysis.
  • Design and Develop Recon Process for static and transaction data from Upstream Source systems.
  • Worked with PowerCenter team to load data from external source systems to MDM hub.
  • Defined the Trust scores for the source systems as per understanding the business process.
  • Prepared new Control-M cycles according to each source systems in PAR.
  • Interacted with DBA and source systems and resolved issues.
  • Worked with different LOBs like Checking Account, Savings Account, ATM, Wire Transfer being extracted from different source systems.
  • Designed Informatica ETL jobs for extracting data from heterogeneous source systems, transforming and finally loading into the data marts.
  • Coordinated with the team and performed code reviews; interacted with the data modeler, DBA and source systems and resolved the issues.
  • Designed and developed Informatica Mappings to load data from Source systems to Customer ODS and then to Enterprise Data Warehouse.
  • Designed and developed an end-to-end ETL process from various source systems to the staging area, and from staging to the data marts.
  • Developed mappings using Informatica Power Center Designer to transform and load the data from source systems to target database.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Profiled the data using Informatica Data Explorer (IDE) to standardize from different Source systems.
  • Developed data Mappings, Transformations between source systems and warehouse Performed Type1 and Type2 mappings.
  • Worked with Business Analyst to Identify the Source Systems for the cognos 10.1 reporting needs.
  • Used Informatica Power Center to load data from various source systems to Data warehouse.


23. Different Transformations

Average Demand
Here's how Different Transformations is used in Senior ETL Developer jobs:
  • Used different transformations for Extraction/Transformation, data cleansing and loading data into staging areas and Presentation Tables.
  • Developed New and Maintained existing ETL Process using different transformations.
  • Used SSIS to create complex mappings using different transformations to load data to server.
  • Created different transformations for loading the data into target database using e.g.
  • Created different transformations for loading the data into Oracle database.
  • Created mappings with different transformations, mapping parameters and variables.
  • Created Informatica source-to-target mapping using different transformations to implement business rules to fulfill the data integration requirements.
  • Develop Informatica mappings using different transformations as per requirements for loading data into Data warehouse.
  • Based on business requirements, worked on different transformations and implemented complex business logics.
  • Used Informatica Designer to create, Load, Update and delete mappings using different transformations to move data to application database.
  • Created different transformations such as Joiner Transformations, Look-up Transformations, Rank Transformations, Expressions, Aggregators and Sequence Generator.
  • Created the SSIS packages using different transformations Data conversion derived Column, Lookup union All, Slowly Changing Dimension.
  • Worked on different transformations like aggregator, expression, look up, filter, router, joiner etc.
  • Created different transformations for applying the key business rules and functionalities on the source data.
  • Developed Informatica mappings using different transformations to load SCD1 and SCD2 target tables.
  • Used Different transformations like Aggregator, Stored Procedure and Sorter.
  • Implemented lookups and different transformations in the mappings.
  • Used Informatica Designer to create mappings using different transformations to move data to a Data Warehouse from SQL server and Sybase.
  • Used different transformations to develop Mappings such as Joiner, Aggregator, Router, and Normalizer & Expression.
  • Designed various mappings and mapplets using different transformations such as key generator, labeller, parser, standardizer and lookup.


24. Autosys

Average Demand
Here's how Autosys is used in Senior ETL Developer jobs:
  • Used AutoSys, an automated job control system, for scheduling, monitoring and reporting.
  • Defined and scheduled the load jobs using AutoSys and WLA scheduler.
  • Created, Scheduled and Monitored Batches and Sessions using Power Center Server Manager/AutoSys/Crontab.
  • Used Autosys scheduler for scheduling and running the jobs as per the business.
  • Developed shell scripts for running batch jobs and scheduling them using Autosys.
  • Scheduled various daily and monthly ETL loads using Autosys.
  • Implemented autosys job for weekly/monthly load of data.
  • Developed Autosys jobs for scheduling the Jobs.
  • Used Autosys for job scheduling.
  • Scheduled the tasks using Autosys.
  • Involved in migrating INFA code from one repository to another and automating the ETL jobs using Autosys JIL scripts.
  • Scheduled the workflows using Autosys for daily loads and monthly loads and extensively worked on UNIX.
  • Modified/created Autosys Jil scripts for scheduling the Data Stage jobs with new parameters.
  • Create graphs/plans to parse Ab-initio log files and autosys log files.
  • Used Autosys as Job Scheduling tool to schedule Informatica jobs.
  • Used Autosys for scheduling the workflows.
  • Used Autosys scheduler to schedule and run the Informatica workflows on a daily/weekly/monthly basis.
  • Used Autosys to schedule Informatica, SQL script and shell script jobs.
  • Created the Teradata SET/MULTISET tables as per the data model/requirements specifications; worked on the Autosys scheduling tool for scheduling ETL jobs.
  • Used the Autosys scheduling tool to schedule the Informatica workflows and wrote Autosys box and command jobs.


25. Complex Mappings

Average Demand
Here's how Complex Mappings is used in Senior ETL Developer jobs:
  • Developed Complex Mappings for Data Integration based on Business Requirement and Logic.
  • Developed complex mappings involved extensive use of transformations.
  • Applied slowly changing dimensions (SCD) in various complex Mappings to load data from source to target.
  • Developed Transformation logic and designed various complex Mappings in the Designer for data load and data cleansing.
  • Involved in reviewing and tuning the complex mappings to increase the overall performance of the mapping.
  • Worked with all kinds of transformations and created complex mappings and worked with slowly changing dimensions.
  • Debugged to optimize the complex mappings and figure out the bottlenecks in the mapping.
  • Created the Complex mappings, done the performance tuning for the developed mappings.
  • Created complex mappings to load the data into Dimension tables and Fact tables.
  • Created complex mappings to load the data and provide support.
  • Created complex mappings and also tuned for better performance.
  • Developed complex mappings and involved in Performance Tuning both in informatica and database level.
  • Designed and developed many complex mappings and re-usable mapplets using various transformations i.e.
  • Designed and developed several complex mappings using various transformations.
  • Developed complex mappings/sessions using Informatica Power Center for data loading.
  • Used Informatica Designer to create complex mappings, transformations, and source and target tables.
  • Developed complex mappings in Informatica to load the data from various sources using different transformations.
  • Developed complex Mappings&Mapplets in Informatica to load the data using different transformations.
  • Optimized complex mappings, lookups and procedures.
  • Developed complex mappings in Informatica using various transformations, Mapping Parameters, Mapping Variables and Mapplets.


26. Schema

Average Demand
Here's how Schema is used in Senior ETL Developer jobs:
  • Received legacy data from TCS (stored in Intermediate Schema), which we verified and validated in an iterative process.
  • Created data modeling for staging area as well as Data warehouse (Star Schema and Snow Flake Schema) database.
  • Involved in analyzing scope of application, defining relationship within & between groups of data, star schema, etc.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema's using Erwin/TOAD.
  • Generated Submission, Account and Policy Change XML files out of the data loaded in the Intermediate Schema.
  • Involved in creating ETL model (snowflake schema), normalizing and documenting.
  • Used Erwin to design the target data model for the target Dimensional Schema.
  • Involved in creating Logical and Physical Data Models and creating Star Schema models.
  • Involved in developing star schema model for target database using ERWIN Data modeling.
  • Implemented logical and physical data modeling with STAR schema techniques in Data Mart.
  • Implemented Star schema, Slowly changing dimensions Type 2 with delta incremental loads.
  • Modeled the Data Warehousing Data marts using Star Join Schema.
  • Created Schema objects like Indexes, Views and Sequences.
  • Followed Star Schema to design dimension and fact tables.
  • Developed simple & complex mappings using Informatica to load Dimension&Fact tables as per STAR Schema techniques.
  • Enhanced and Fixed data lineages by hunting for missing relations among various lineage tables in schemas.
  • Designed multi dimensional Star schema, Generated the database scripts, E-R diagrams using ERWIN.
  • Participated in the detailed requirement analysis for the design of data marts and star schemas.
  • Design and build ETL processes (Star Schemas) and ETL for relational reporting.
  • Involved in working with different schemas and Dimension and Fact Tables.
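The star schemas described above pair a central fact table with foreign keys to the surrounding dimension tables. A compact, purely illustrative DDL sketch (tables and columns are invented):

    CREATE TABLE dim_date     (date_key NUMBER PRIMARY KEY, cal_date DATE);
    CREATE TABLE dim_customer (cust_key NUMBER PRIMARY KEY, cust_id VARCHAR2(20), cust_name VARCHAR2(100));

    -- Fact table: one row per sale, keyed by the surrounding dimensions.
    CREATE TABLE fact_sales
    ( date_key NUMBER       REFERENCES dim_date(date_key),
      cust_key NUMBER       REFERENCES dim_customer(cust_key),
      sale_amt NUMBER(12,2)
    );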


27. Debugger

Average Demand
Here's how Debugger is used in Senior ETL Developer jobs:
  • Involved in Unit testing of mappings, scripts and used Debugger regularly to track and find errors.
  • Created and used Debugger sessions to debug sessions and created breakpoints for better analysis of mappings.
  • Used mapping debugger for data and error conditions to get trouble shooting information.
  • Used designer debugger to test the data flow and fix the mappings.
  • Used debugger and breakpoints to view transformations output and debug mappings.
  • Created the break points in the designer and worked with debugger.
  • Used debugger to test the mapping and fixed the bugs.
  • Used debugger extensively to identify the bottlenecks in the mappings.
  • Used Debugger wizard to troubleshoot data and error conditions.
  • Used the debugger to debug the valid mappings.
  • Used Debugger to troubleshoot the mappings.
  • Debugged mappings to gain troubleshooting information about data and error conditions using Informatica Debugger.
  • Fixed invalid mappings using Informatica Debugger.
  • Configured and ran the Debugger from within the Mapping Designer to troubleshoot the mapping before the normal run of the workflow.
  • Checked Sessions and error logs to troubleshoot problems and also used debugger for complex Problem troubleshooting.
  • Used Informatica Debugger to debug the data in the transformations used in the ETL process.
  • Used the Debugger to validate the transformations by creating break-points and analyzing the debug monitor.
  • Experienced in using the Informatica Debugger to test mappings and fix bugs.
  • Worked on session logs, Informatica Debugger, and performance logs for error handling when workflows and sessions failed.
  • Created Stored Procedures, Packages and Functions; used the Informatica Debugger to test the data flow and fix the mappings.


28. QA

average Demand
Here's how QA is used in Senior ETL Developer jobs:
  • Worked closely with the Business Analysts and the QA team for validation and verification of the development.
  • Supported migration of ETL code from development to QA and then from QA to production.
  • Supported QA Testing in fixing the bugs and also helped to resolve various data issues.
  • Ensured testing data is available for unit and acceptance testing within development and QA environments.
  • Frequent communication with the client, business, QA team on technical and functional queries.
  • Conducted code reviews developed by my team mates before moving the code into QA.
  • Enhanced code and fixed bugs raised by the QA/BA team using agile methodology.
  • Tested the target data against source system tables by writing some QA Procedures.
  • Developed Perl script to automate Data discrepancies between Production and QA environments.
  • Supported and worked with QA to test the stories in QA environment.
  • Support QA testing fixing defects and executing the ETL code developed.
  • Worked on Critical Bugs assigned by QA team through TFS.
  • Worked with QA team for better understanding of code development.
  • Performed Unit testing and moved the data into QA.
  • Supported during QA/UAT/PROD deployments and bug fixes.
  • Perform Informatica code migrations from DEV to QA, UAT and PROD repositories and monitoring all the environments.
  • Designed Unit test document after the Informatica development and verified results before moving it to QA.
  • Involved in migrating the Data mapping documents and workflows from Development environment to QA environment.
  • Worked with Informatica Admin team to migrate codes to QA and PROD environment.
  • Configured Dev, QA, and Prod repositories.
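
Several excerpts above mention QA procedures that test target data against source tables. A minimal sketch of such a reconciliation check, comparing row counts and a control total, is shown below; SQLite stands in for the real source and warehouse databases, and the src_orders/tgt_orders tables are hypothetical.

# Minimal reconciliation sketch for QA checks: compare row counts and a
# control total between a "source" and a "target" table.

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])

def totals(table):
    """Row count and control total for one table (table name is trusted here)."""
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    return cur.fetchone()

src_count, src_sum = totals("src_orders")
tgt_count, tgt_sum = totals("tgt_orders")

assert src_count == tgt_count, "row counts do not balance"
assert abs(src_sum - tgt_sum) < 0.01, "control totals do not balance"
print("reconciliation passed:", src_count, "rows,", src_sum, "total")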


29. Design Documents

average Demand
Here's how Design Documents is used in Senior ETL Developer jobs:
  • Created Technical design specification documents based on the functional design documents and the physical data model.
  • Developed detailed design documents and Business Requirements and Operational requirements documentations for Non-Retail Data Mart.
  • Prepared the required application design documents based on functionality required.
  • Created various design documents and understanding documents for business partners.
  • Prepared high level mapping design documents from requirements.
  • Created high level design documents, technical specifications, coding, unit testing and resolved the defects using Quality Center 10.
  • Involved in the analysis of source to target mapping provided by data analysts and prepared function and technical design documents.
  • Developed the code based on Technical Design documents and confirmed it covers the Goals from Functional related Document.
  • Involved in design review meetings to finalize the Data Model and the design documents necessary for the development.
  • Assisted Team Lead to prepare Mapping Design Documents and ETL Designs, Mapping Templates, etc.
  • Developed design documents and unit tested for the process developed for Data Vault project.
  • Involved in Requirements gathering and preparing High level and low level technical design documents.
  • Prepared detail design documents which contain job designs and functionality of each job.
  • Created Technical mapping design documents and field-field level mapping spread sheet documents.
  • Developed system test scripts, unit test scripts, detail design documents.
  • Designed data mapping documents and detail design documents for ETL Flow.
  • Prepared and review the Technical design documents.
  • Prepared ETL Design documents and developed ETL mappings in Informatica to build several dimension tables with slowly changing dimensions.
  • Reviewed Informatica ETL design documents and working closely with development team to ensure correct standards are followed.
  • Prepared High Level and Low Level Design Documents; worked with Teradata sources/targets.


30. UAT

low Demand
Here's how UAT is used in Senior ETL Developer jobs:
  • Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Involved in migration of ETL Objects and Database changes from DEV to SIT and UAT environments and preparing Release Notes.
  • Perform the post deployment (QA/UAT/prod) health checks and monitor for any issues during the initial job execution.
  • Involved in System Integration testing(SIT) and User Acceptance testing (UAT).
  • Create UAT test document to deploy packages on ITG (QA) and production.
  • Worked closely with QA team to discuss the UAT defects and responsible for maintaining.
  • Address QA team and UAT user issues and facilitate towards QA/UAT sign off.
  • Involved in On Call Support for UAT Issues during office and non-office hours.
  • Prepared UTC document and support the team during QA and UAT phase.
  • Provided support during QA/UAT testing by working with multiple groups.
  • Supported the Production, UAT, QA deployments.
  • Participate in deployment, system testing, UAT.
  • Used migration, redesign and Evaluation approaches.
  • Involved testing and UAT support activities.
  • Worked closely with UAT team.
  • Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
  • Performed Unit Testing and helped SIT/UAT team to setup the Data for Testing.
  • Evaluate Ideal Compression Techniques available in Hadoop to optimally store the data.
  • Developed validation and balancing SQL scripts for system and UAT testing.
  • Performed performance evaluation of the ETLs for the full load cycle.


31. Fact Tables

low Demand
Here's how Fact Tables is used in Senior ETL Developer jobs:
  • Developed various staging tables in the database to load the intermediate tables before loading to fact tables as part of WH.
  • Utilized FACT tables and granular dimensions in designing and data modeling of data warehouse and data marts in star schema methodology.
  • Create multiple fact tables and dimension tables and create calculated columns using named query and named calculation property.
  • Used SCD's (Slowly Changing Dimension) strategies for Incremental loading of Dimension and Fact tables.
  • Analyzed and Created Fact tables, Aggregate tables and Dimension Tables for storing the historical data.
  • Modeled the Data Warehousing Data marts using Star Schema, Built Dimension Tables and Fact Tables.
  • Developed BTEQ's for loading the data from staging area to the final Dim/Fact tables.
  • Experience developing complex transformations, surrogate keys, routines, dimension tables and fact tables.
  • Created Type 1, Type 2, Junk and Hybrid Dimensions along with Fact tables.
  • Involved in the design of a STAR schema to load Dimension and Fact tables.
  • Developed plan to load the dimensional and fact tables into the Data Mart.
  • Developed mappings to load dimension and fact tables into SALES data mart.
  • Created slowly changing dimensions & Fact tables to meet the business requirements.
  • Worked with Huge data sets to load Fact Tables.
  • Created DataStage jobs for loading dimension and fact tables.
  • Created multi-way aggregate fact tables when queries on granular data produce slow response time and large unwanted result set.
  • Designed Optimized Load Processes to load FACT tables which will cater to Base load and subsequent intraday runs.
  • Developed and Tested Informatica mappings to load data into various dimensions and fact tables from various source systems.
  • Configured several Informatica Mappings to populate the data from Oracle into dimensions and fact tables.
  • Used the Star Schema approach in the design of the dimension and fact tables.
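
One excerpt above describes building multi-way aggregate fact tables when queries on granular data are too slow. The sketch below illustrates the rollup step with hypothetical in-memory detail rows: the granular sales fact is summarized to the (date, product) grain so summary queries touch far fewer rows.

# Sketch of building an aggregate fact table: roll a granular sales fact
# up to (date, product). The rows are hypothetical; a real load would
# read the detail fact from the warehouse.

from collections import defaultdict

fact_sales_detail = [
    {"date": "2015-01-01", "product_key": 1, "qty": 2, "amount": 20.0},
    {"date": "2015-01-01", "product_key": 1, "qty": 1, "amount": 10.0},
    {"date": "2015-01-01", "product_key": 2, "qty": 5, "amount": 55.0},
]

agg = defaultdict(lambda: {"qty": 0, "amount": 0.0})
for row in fact_sales_detail:
    key = (row["date"], row["product_key"])
    agg[key]["qty"] += row["qty"]
    agg[key]["amount"] += row["amount"]

fact_sales_daily = [
    {"date": d, "product_key": p, **measures} for (d, p), measures in agg.items()
]
print(fact_sales_daily)   # one summarized row per (date, product)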


32. Source Data

low Demand
Here's how Source Data is used in Senior ETL Developer jobs:
  • Designed and Implemented mapping to maintain history in warehouse and to capture the daily changes in the source data.
  • Worked closely with the BA and Data Warehouse Architect to understand the source data and need of the Warehouse.
  • Used Debugger to track the path of the source data and also to check the errors in mapping.
  • Analyzed the source data, made decisions on appropriate extraction, transformation, and loading strategies.
  • Develop, Enhance, Tune and Maintain ETL processes to load external source data into Star Schema
  • Validate the data in warehouse and data marts after loading process balancing with source data.
  • Analyzed source data and formulated the transformations to achieve the customer requested reports.
  • Created and modified COBOL copybook to connect source data using Power Exchange Navigator.
  • Extracted data from SalesForce as one of the source databases.
  • Source data was extracted from Oracle, SQL Server, flat files, COBOL sources and XML sources.
  • Extracted source data from Oracle, Flat files, XML files using Informatica, and loaded into target Database.
  • Configured the sessions using workflow manager to have multiple partitions on source data and to improve performance.
  • Designed the change data capture logic in the Informatica mappings by incremental extraction of source data.
  • Used Address Validator transformation to validate source data with the reference data for standardized addresses.
  • Used Informatica Power Exchange tool to read source data and generate feeds to external systems.
  • Analyzed the source data coming from different sources and worked on developing ETL mappings/mapplets.
  • Replicated data from source database to staging by using Informatica power exchange.
  • Experience in using Normalizer transformation for normalizing the XML source data.
  • Worked on Informatica Data Quality to profile source data as per the business requirements using data quality rules in Informatica.
  • Used Salesforce's query language, SOQL (Salesforce Object Query Language), to query their source data for testing purposes.
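
Incremental extraction of source data, mentioned in several excerpts above, is commonly driven by a high-water mark on a last-modified timestamp. The sketch below shows the idea against a hypothetical src_customer table in SQLite; a production job would persist the watermark in a control table and account for late-arriving changes.

# Sketch of incremental source extraction using a high-water mark: only
# rows modified after the last successful extract are pulled.

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customer (id INTEGER, name TEXT, updated_at TEXT)")
cur.executemany(
    "INSERT INTO src_customer VALUES (?, ?, ?)",
    [(1, "Acme", "2015-03-01 10:00:00"),
     (2, "Globex", "2015-03-02 09:30:00"),
     (3, "Initech", "2015-03-03 14:15:00")],
)

last_extract = "2015-03-01 23:59:59"  # watermark from the previous run

cur.execute(
    "SELECT id, name, updated_at FROM src_customer WHERE updated_at > ? ORDER BY updated_at",
    (last_extract,),
)
delta = cur.fetchall()
print(delta)                       # only rows changed since the watermark
new_watermark = delta[-1][2] if delta else last_extract
print("new watermark:", new_watermark)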


33. Worklets

low Demand
Here's how Worklets is used in Senior ETL Developer jobs:
  • Created workflows, worklets, tasks and reusable tasks to run the mappings to load the data Into target systems.
  • Developed reusable transformations, mapplets, sessions, and worklets to make code very modular and reused it as required.
  • Modified the existing worklets and workflows to accommodate for these new sessions and mappings.
  • Created sessions and workflows, worklets, and carried out test loads.
  • Created workflows, worklets, sessions and partitions for better performance.
  • Created session tasks, worklets and workflows to execute the mappings.
  • Created Workflows / Worklets/ Sessions, shell scripts and Performance tuning.
  • Developed various worklets as part of workflow design.
  • Involved in Creating tasks, worklets, workflows.
  • Worked on Parameters, Worklets and Mapplets.
  • Developed and documented mappings/transformations, Informatica sessions and Informatica Worklets.
  • Developed Mapplets, Worklets and Reusable Transformations for reusability.
  • Created Workflows, Worklets and Tasks to schedule the loads at required frequency using Informatica scheduling tool.
  • Used mapping parameters and variables Creating sessions, worklets and workflows for carrying out test loads.
  • Involved in fixing invalid Mappings, Testing of Informatica Sessions, Worklets and Workflows.
  • Used the Workflow manager to create workflows and tasks, and created Worklets.
  • Created Sessions, command task, reusable worklets and workflows in Workflow Manager.
  • Created reusable sessions and worklets using Informatica Powercenter workflow.
  • Created sessions, reusable worklets and workflows in workflow manager and Scheduled workflows and sessions at specified time interval.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.


34. User Acceptance

low Demand
Here's how User Acceptance is used in Senior ETL Developer jobs:
  • Moved mappings from development environment to test and quality environment to perform integrated testing and User acceptance testing.
  • Performed Unit Testing, Integration Testing and User Acceptance Testing to pro-actively identify data discrepancies and inaccuracies.
  • Involved in analysis, design, development, integration, performance and user acceptance testing and Production implementation of the projects.
  • Involved in Unit testing, User Acceptance Testing and System Testing to verify accuracy and completeness of ETL process.
  • Prepared test cases and involved in unit testing of mappings, system testing and user acceptance testing.
  • Worked closely on the major defects in QA and UAT Environment for Integration and User acceptance testing.
  • Created test cases and test plans for user acceptance testing and system testing based on functional specifications.
  • Involved Working on Functional testing, Integration/ System testing, Regression testing, And User Acceptance Testing.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Provided reliable, timely support of integration, performance and user acceptance testing processes.
  • Provided extensive support in UAT (user acceptance testing) and deployment of mappings.
  • Performed unit testing, system integration testing, and supported user acceptance testing.
  • Managed quality control, scope verification, change control and user acceptance.
  • Prepared unit test cases and user acceptance test(UAT) cases.
  • Assisted and resolved issues in User Acceptance testing.
  • Involved in User Acceptance Testing with end users.
  • Conducted unit, performance and user acceptance testing.
  • Performed Unit Testing, System Integration Testing and User Acceptance Testing; involved in ongoing production support and process improvements.
  • Involved in Unit, User Acceptance testing, System Integration testing of Informatica Sessions, and the Target Data.
  • Conducted customer onsite database User Acceptance Testing (UAT) and product delivery.


35. Normalizer

low Demand
Here's how Normalizer is used in Senior ETL Developer jobs:
  • Used Normalizer Transformation to split a single source row into multiple target rows by eliminating redundancy and inconsistent dependencies.
  • Worked extensively on Dynamic Look-up caching, Normalizer Transformation, SQL Transformation among others to deliver complex mappings.
  • Used Normalizer transformation extensively to normalize the data from the Oracle environment.
  • Created oracle packages to overcome normalizer limitation by dynamic transposition of rows.
  • Created and used the Normalizer Transformation to normalize the flat files in the source data.
  • Created mappings with the Normalizer transformation to log errors to the error table in the Oracle DB.
  • Used complex transformations like normalizer, XML parser and Stored procedure.
  • Worked on normalizer transformation for normalizing the XML source data.
  • Worked with normalizer transformation for COBOL (VSAM) files.
  • Handled Error logging in the ETL by using Normalizer transformation.
  • Worked with Power Exchange to upload the data from Cobol Files into SQL Server by extensive use of Normalizer transformation.
  • Use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, and Sequence generator transformations.
  • Design and development of complex ETL mappings making use of Connected/Unconnected Lookups, Normalizer, Stored Procedures transformations.
  • Used Normalizer Transformation for Cobol (VSAM) sources.
  • Extracted source data from COBOL files using Informatica tools and created mapping using transformation mainly Normalizer, Lookup, Expression.
  • Used Router, Normalizer, Aggregator, Expression, lookup, update strategy, router, and rank transformation.
  • Created complex mappings using Joiner, Normalizer, Aggregator and Dynamic & Persistent Caches in Lookup transformations.
  • Used various Informatica Transformations like Expression, Lookups, and Normalizer transformations.
  • Used various transformations like unconnected lookup, connected lookup, aggregator, rank, joiner and normalizer.
  • Designed and implemented Informatica jobs using various transformations, including Joiner, Sorter, Lookup, Aggregator, Normalizer, etc.
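
A Normalizer transformation pivots a single source row with repeating columns into multiple target rows. The plain-Python sketch below mimics that behavior for a hypothetical record carrying four quarterly sales columns.

# Sketch of what a Normalizer transformation does: pivot repeating
# columns on one source row into multiple target rows.

source_rows = [
    {"account": "A100", "q1_sales": 100, "q2_sales": 150, "q3_sales": 90, "q4_sales": 120},
]

normalized = []
for row in source_rows:
    for quarter in ("q1", "q2", "q3", "q4"):
        normalized.append({
            "account": row["account"],
            "quarter": quarter.upper(),
            "sales": row[f"{quarter}_sales"],
        })

for out_row in normalized:
    print(out_row)   # one output row per quarter, eliminating the repeating group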


36. Business Logic

low Demand
Here's how Business Logic is used in Senior ETL Developer jobs:
  • Involved in developing packages for implementing business logic through procedures and functions.
  • Extracted data from flat files and the Oracle databases and applied business logic to load them in the central Oracle database.
  • Source data from relational tables and flat files is transformed in the stage layer by applying the business logic.
  • Created sequential/concurrent Sessions/Batches for the data loading process and used pre- and post-session SQL scripts to meet business logic.
  • Created new mappings and enhancements to the old mappings according to changes or additions to the Business logic.
  • Developed PL/SQL procedures for processing business logic in the database and used them as a Stored Procedure Transformation.
  • Analyzed the requirements with BA's and framed the business logic for the ETL process.
  • Worked closely with business owners to translate existing business logic and needs to valuable requirements.
  • Used stored procedures to fulfill the iterative process and to solve complex business logic.
  • Used shared containers for multiple jobs, which have the same business logic.
  • Review the Master Data Workbooks for better understanding of business logic.
  • Implemented complex business logic using complex SQL queries.
  • Created the BTEQ Scripts to process the business logic from the Landing Zone to the Common Staging Area (CSA).
  • Developed transformation logic based on the business logic and designed various complex mappings and Mapplets using the PowerCenter Designer.
  • Designed and developed solutions for very complex business logic.
  • Developed Mappings & Workflows as per Business logic, quality and coding standards prescribed for the module.
  • Created shared container to incorporate business logic and used in multi job.
  • Created reusable mapplets/transformations embedding business logic; involved in design review, code review, and test review, and gave valuable suggestions.
  • Analyzed the requirements and framed the business logic for the ETL Process using Talend 5.3 and Pentaho 4.5.
  • Interacted with the requirements team and designers to get a basic understanding of the business logic.


37. SCD

low Demand
Here's how SCD is used in Senior ETL Developer jobs:
  • Implemented Slowly Changing Dimension (SCD) Type 2 logic through mappings to populate Enterprise Warehouse (EW) Dimensional Tables.
  • Implemented slowly changing dimensions on customers table using SCD stage in Data stage 8.7 on IBM Information server 8.7.
  • Used the Slowly Changing Dimensions (SCD type 2) to update the data in the target dimension tables.
  • Used Type 1 SCD and Type 2 SCD mappings to update slowly changing Dimension tables.
  • Develop common slowly changing dimension ETL code SCD type 1 and SCD type 2.
  • Developed Slowly Changing Dimension for Type 1 SCD and worked on ETL ODI applications.
  • Developed Slowly Changing Dimensions Mapping for Type 1 SCD and Type 2 SCD.
  • Worked on SCD type -2, truncate and incremental load.
  • Worked with SCD stage for implementing slowly changing dimensions.
  • Implemented SCD techniques like Type1, Type2 and Type3.
  • Developed complex strategies to implement type 2 SCD.
  • Implemented SCD type 1 for all interfaces.
  • Provided technical leadership for SCDW group.
  • Implemented both SCD's (Slowly Changing Dimensions) TYPE-1 and TYPE-2.
  • Worked on SCD Type 1, Type 2 & Type 3 conversions.
  • Developed mappings for Type 1, Type 2 & Type 3 Slowly Changing Dimension (SCD) using Informatica Power Center.
  • Implemented Slowly Changing Dimensions (SCD) for some of the tables as per user requirement.
  • Implemented Slowly Changing Dimension (SCD) - Type 2 logic using Informatica mapping.
  • Used SCD2 to populate the data in a generic way.
  • Implemented Slowly Changing Dimensions (SCDs, Both Type 1 & 2).
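
The Slowly Changing Dimension Type 2 pattern referenced throughout these excerpts end-dates the current version of a dimension row and inserts a new current version whenever a tracked attribute changes. The sketch below shows that logic for a hypothetical customer dimension; the effective date, end date, and current-flag columns follow the usual SCD2 convention.

# Sketch of a Slowly Changing Dimension Type 2 update: when a tracked
# attribute changes, the current row is end-dated and a new current row
# is inserted. Dates and columns are hypothetical.

from datetime import date

customer_dim = [
    {"cust_key": 1, "cust_id": "C01", "city": "Austin",
     "eff_date": date(2014, 1, 1), "end_date": None, "current_flag": "Y"},
]

def apply_scd2(dim, cust_id, new_city, load_date):
    """Expire the current row if the city changed, then insert the new version."""
    current = next((r for r in dim if r["cust_id"] == cust_id and r["current_flag"] == "Y"), None)
    if current and current["city"] == new_city:
        return  # no change, nothing to do
    if current:
        current["end_date"] = load_date
        current["current_flag"] = "N"
    dim.append({
        "cust_key": len(dim) + 1, "cust_id": cust_id, "city": new_city,
        "eff_date": load_date, "end_date": None, "current_flag": "Y",
    })

apply_scd2(customer_dim, "C01", "Dallas", date(2015, 6, 1))
for row in customer_dim:
    print(row)   # old Austin row is end-dated, new Dallas row is current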


38. Bteq

low Demand
Here's how Bteq is used in Senior ETL Developer jobs:
  • Coded complex BTEQ scripts to populate data mart tables from EDW to cater specific reporting needs.
  • Created BTEQ scripts to extract data from EDW to the Business reporting layer.
  • Developed various complex BTEQ Scripts to handle data mechanisms.
  • Created BTEQ scripts to incorporate the transformation rules.
  • Created the batch jobs using BTEQ scripts.
  • Performed Development using Teradata utilities like BTEQ, Fast Load, MultiLoad and TPT to populate the data into BI DW.
  • Worked on loading of data from several flat files sources to Staging using Teradata TPUMP, MLOAD, FLOAD and BTEQ.
  • Used BTEQ, Unix Perl scripting to trigger the cron jobs having high impact against the production database during non-prime time.
  • Developed the Teradata BTEQ's to load data into Incremental/Staging tables and then move data from staging into Base tables.
  • Created a BTEQ script for pre population of the work tables prior to the main load process.
  • Worked on Teradata utilities BTEQ, MLOAD, FLOAD and TPUMP to load the staging area.
  • Developed MLOAD & BTEQ scripts to handle the population of data into Teradata tables.
  • Designed UNIX shell scripts to automate the BTEQ scripts for loading into Teradata database.
  • Loaded data into Teradata by using BTEQ batch scripts.
  • Worked on MLoad, BTeq, Fast Export and Fast Load to load the feed data in data warehouse.
  • Loaded data into Teradata using Data Stage, Fast Load, BTEQ, Fast Export, Multi Load.
  • Worked as an Informatica Developer as well as a Teradata Developer to build BTEQ scripts and FLOAD Scripts.
  • Created Teradata Bteq scripts to load the foundation tables.
  • Worked with Teradata Database in writing BTEQ queries and Loading Utilities using Multiload, Fastload, FastExport.
  • Worked with different Teradata utilities like bteq, multi load, fast load and fast export.
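
BTEQ scripts typically wrap SQL that moves data from a staging table into a base table, adding logon, error checking, and transaction handling around an INSERT ... SELECT. The sketch below expresses just that staging-to-base statement from Python, with SQLite standing in for Teradata so the example is self-contained; table and column names are hypothetical.

# Staging-to-base load sketch, the kind of step a BTEQ script usually
# wraps. SQLite stands in for Teradata purely so the example runs
# anywhere.

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_sales (sale_id INTEGER, amount TEXT, sale_date TEXT)")
cur.execute("CREATE TABLE sales_base (sale_id INTEGER, amount REAL, sale_date TEXT)")
cur.executemany(
    "INSERT INTO stg_sales VALUES (?, ?, ?)",
    [(1, "10.50", "2015-01-01"), (2, "7.25", "2015-01-02")],
)

# Insert/select with a simple transformation (cast the amount to a number),
# mirroring the INSERT ... SELECT a BTEQ script would issue.
cur.execute(
    """
    INSERT INTO sales_base (sale_id, amount, sale_date)
    SELECT sale_id, CAST(amount AS REAL), sale_date
    FROM stg_sales
    """
)
conn.commit()

cur.execute("SELECT * FROM sales_base")
print(cur.fetchall())   # [(1, 10.5, '2015-01-01'), (2, 7.25, '2015-01-02')]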


39. EDW

low Demand
Here's how EDW is used in Senior ETL Developer jobs:
  • Designed and developed validation scripts based on business rules to check the Quality of data loaded into EDW.
  • Developed strategies for Incremental data extractions as well data migration to load into the Oracle (EDW).
  • Involved in the design, development and implementation of the Data Warehousing (EDW) process.
  • Worked on the daily requirement to insert and update the data into the EDW.
  • Involved in building a process to load new LOB of claims into EDW.
  • Used Redwood to design the job chains and scheduling as per requirements.
  • Developed Error/Exception handling mechanism to enhance the data quality loaded to EDW.
  • Sourced DB2 data to staging table before loading to EDW.
  • Authored complex data-testing SQL queries to validate EDW data.
  • Developed EDW mappings based on the mapping document.
  • Applied Transformation logic in various complexities in transforming and transferring the data into downstream Teradata EDW.
  • Transform the Oracle PL/SQL packages logic into the ETL design for the EDW Teradata.
  • Designed ETL Architecture using Informatica for implementing various phases of EDW build.
  • Develop ELT scripts to integrate data in a Netezza based EDW.
  • Implemented the conventional load from EDWSTG -> EDW -> EDM schemas.
  • Gathered requirement to develop mappings to load the formulary data coming from external vendor into client's Teradata EDW.
  • Project Details: AEDW handles all the Teradata development involved with all the processes revolving around the data warehouse.
  • Involved in System Documentation of Dataflow and methodology for EDW and weblogs.
  • Involved in Analyzing/ building Teradata EDW using Teradata ETL utilities and Informatica.
  • Load data files coming from external vendors onto Teradata EDW using mload and fload utilities.


40. Email

low Demand
Here's how Email is used in Senior ETL Developer jobs:
  • Used Web Service Provider Writer to send Flat file target as attachments and also for sending email from within a mapping.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
  • Created reusable tasks like decision task, command and email tasks and sent success, failure notifications.
  • Configured the session so that Power Center Server sends an Email when the session completes or fails.
  • Developed rules using IDQ to standardize the SSN, email phone number validations.
  • Worked with re-usable sessions, decision task, control task and Email tasks.
  • Implemented sending of Post-Session Email once data is loaded.
  • Used various tasks like session, email, command, Event wait, Event raise, control etc.
  • Created shell script to check whether the Informatica service up/down and generate email to test group if it is down.
  • Used tasks like event wait, event raise, email, command and pre/post SQL to load data.
  • Used different types of tasks such as Email, Command and Event Wait tasks in the Workflow Manager.
  • Developed workflow tasks like Email, Event wait, Event Raise, Timer, Command and Decision.
  • Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Scheduled workflows and sent automatic email messages that post the statistics of the data load.
  • Developed Workflows, and developed tasks involving Event-Waits, Conditional flows, Email and Command.
  • Worked with mapping parameters and variables, session parameters, pmcmd commands, email tasks.
  • Created shell scripts to run workflows, back up files and email event etc.
  • Prepared Sybase/Teradata stored procedure to execute DDOEE email system based on condition.
  • Used workflow manager to create workflows, sessions, and also used various tasks like command, email.
  • Used Workflow Manager for creating workflows, worklets, email and command tasks.
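
Post-session Email tasks like those described above usually report the workflow name, run status, and row counts to a support distribution list. The sketch below builds such a message in Python; the addresses, SMTP host, and statistics are placeholders, so the actual send is left commented out.

# Sketch of a post-load notification, similar in spirit to an Email task
# fired when a session succeeds or fails. All names here are placeholders.

from email.message import EmailMessage

def build_notification(workflow, status, rows_loaded):
    """Build a notification message summarizing one workflow run."""
    msg = EmailMessage()
    msg["From"] = "etl-monitor@example.com"          # placeholder sender
    msg["To"] = "dw-support@example.com"             # placeholder distribution list
    msg["Subject"] = f"{workflow} finished with status {status}"
    msg.set_content(f"Workflow {workflow}: status={status}, rows loaded={rows_loaded}")
    return msg

msg = build_notification("wf_load_sales_fact", "SUCCEEDED", 125_000)
print(msg["Subject"])

# To actually send, an SMTP relay would be needed, e.g.:
# import smtplib
# with smtplib.SMTP("smtp.example.com") as server:
#     server.send_message(msg)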


41. CDC

low Demand
Here's how CDC is used in Senior ETL Developer jobs:
  • Worked on different level of the project to ensure the data quality and a better throughput through CDC process.
  • Worked with Power Exchange for CDC techniques for relational database sources on UNIX, and Windows operating systems.
  • Designed and Optimized Power Center CDC and Load Mappings to load the data in slowly changing dimension.
  • Used the Power Exchange Change Data Capture (CDC) option to capture modified records.
  • Implemented CDC for critical Sales Force fields as required by the end user.
  • Worked on CDC using Power Exchange 9.5.1 to implement SCD type 1.
  • Created a Technical Design Document for the CDC alternative.
  • Implemented update strategies, incremental loads, CDC maintenance.
  • Used CDC for moving data from Source to Target.
  • Retrieved data from the Oracle CDC Staging tables.
  • Resolved issues related to CDC logic within workflows which feeds data into warehouses, enabling gathering of more accurate/efficient data.
  • Worked with OWB and ODI ETL and Data Integration tools to Pre-built modules for CDC, bulk loading etc.
  • Developed a CDC alternative to extract batches from source side and push batch data into cif files.
  • Worked with the CDC option, which captures database changes and forwards them to PowerCenter for further processing.
  • Implemented Full Pushdown optimization for CDC Sessions.
  • Implemented CDC using Informatica Power Exchange.
  • Implemented CDC using Informatica variables.
  • Phase 1: CDC for GECDW (Pipeline Analytics).
  • Created Registration, DataMap, configured Real-Time mapping and workflows for real-time data processing using CDC option of Informatica PowerExchange.
  • Used the DataStage CDC stage to capture incremental data; wrote Hive UDFs.
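
Change data capture in these excerpts is mostly log-based (PowerExchange reading database change records), but a compare-based approach is a common fallback when logs are unavailable. The sketch below detects inserts and updates by comparing row hashes against a previously captured snapshot; the customer rows and columns are hypothetical.

# Sketch of change data capture by row-hash comparison: rows whose hash
# differs from the previous snapshot are treated as inserts or updates.

import hashlib

def row_hash(row):
    """Stable hash of the non-key columns of a row."""
    payload = "|".join(str(row[col]) for col in sorted(row) if col != "id")
    return hashlib.md5(payload.encode()).hexdigest()

previous_snapshot = {1: row_hash({"id": 1, "name": "Acme", "tier": "Gold"}),
                     2: row_hash({"id": 2, "name": "Globex", "tier": "Silver"})}

current_rows = [
    {"id": 1, "name": "Acme", "tier": "Gold"},       # unchanged
    {"id": 2, "name": "Globex", "tier": "Gold"},     # updated
    {"id": 3, "name": "Initech", "tier": "Bronze"},  # new
]

changes = []
for row in current_rows:
    h = row_hash(row)
    if row["id"] not in previous_snapshot:
        changes.append(("INSERT", row))
    elif previous_snapshot[row["id"]] != h:
        changes.append(("UPDATE", row))

print(changes)   # captures the updated Globex row and the new Initech row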


42. Warehouse Designer

low Demand
Here's how Warehouse Designer is used in Senior ETL Developer jobs:
  • Worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Repository Manager.
  • Imported and Created Source Definitions using Source Analyzer and Target Definitions using Warehouse Designer.
  • Worked Efficiently with Designer tools including Source Analyzer, Warehouse designer, Mapping.
  • Assisted warehouse designer while designing the models.
  • Worked extensively on Informatica Power Center using Source Analyzer, Mapping designer, Warehouse Designer.
  • Created Source and Target Definitions in the repository using Informatica Source Analyzer and Warehouse Designer.
  • Created different target definitions using warehouse designer of Informatica Power center.
  • Worked on Informatica Power Center 8.6.1 tool - Source Analyzer, warehouse designer, Mapping Designer & Mapplets, and Transformations.
  • Worked on Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and workflow manager to develop new mappings.
  • Used Informatica Source Analyzer, Mapping Designer, Transformation Developer and Warehouse Designer for Extraction, Transformation and Loading.
  • Used the Power center tools - Informatica Designer, Source Analyzer, Warehouse Designer and Mapping Designer.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas.
  • Worked on Informatica tool Source Analyzer, Warehouse Designer, Mapping Designer.
  • Well acquainted with Informatica Designer Components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet and Mapping Designer.
  • Worked on Informatica Tool Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Workflow Monitor and Repository Manager.


43. Technical Specifications

low Demand
Here's how Technical Specifications is used in Senior ETL Developer jobs:
  • Created requirement specifications documents, user interface guides, and functional specification documents, ETL technical specifications document and test case.
  • Involved in gathering of business scope and technical requirements and created technical specifications.
  • Translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Involved in Analysis, Requirements Gathering and documenting Functional & Technical specifications.
  • Documented business requirements, functional and technical specifications and test requirements.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Translated the requirements into functional and technical specifications.
  • Involved in creating Technical specifications and mapping documents.
  • Design and Technical specifications Documentation.
  • Create documents for Design reviews, ETL Technical specifications, Unit test plans, Migration checklists and Schedule plans.
  • Helped & Led Offshore development team in realizing designed ETL technical specifications.
  • Developed Technical Specifications of the ETL process flow.
  • Reviewed the code and technical specifications.
  • Worked on understanding and creating ETL Designs, Technical specifications, Mappings/Transformations and Informatica sessions documents confining to the business rules.
  • Developed technical specifications of the ETL process flow; designed the Source-Target mappings and was involved in designing the Selection Criteria document.
  • Developed technical specifications for constructing SQL-based reports; developed data load modules using Camel/Fuse and Java/J2EE technologies.
  • Analyzed business requirements, technical specifications, source repositories and physical data models for ETL mappings.
  • Created technical specifications using Informatica Analyst tool and excel spread sheets.
  • Prepared technical specifications to develop Informatica ETL mappings to load data into various tables confirming to the business rules.
  • Prepared the technical specifications using use cases and maintained versioning & deployment groups' functionality.


44. Data Analysis

low Demand
Here's how Data Analysis is used in Senior ETL Developer jobs:
  • Developed error handling process to collect rejects for data analysis.
  • Performed Data Analysis & Source-To-Target Mapping.
  • Have worked on data analysis to find the data duplication and existed data pattern using a data profiling tool, IDE.
  • Experienced in database design, data analysis, development, SQL performance tuning, data warehousing ETL process and data conversions.
  • Involved extensively in Data Analysis, Data Mapping, Data modeling, Data Transformation, Data Loading and Testing.
  • Worked on Data Extraction, Data Transformations, Data Profiling, Data Loading, Data Conversions and Data Analysis.
  • Performed data analysis, data matching, data validations and performance tuning of mappings to ensure bug free code.
  • Collaborate and work with business analysts and data analysts to support their data warehousing and data analysis needs.
  • Worked closely with client in understanding the Business requirements, data analysis and deliver the client expectation.
  • Experience on data analysis (field validation) for source, staging & DW.
  • Involved in extensive data mining and data analysis prior to initiating the design.
  • Performed data analysis and data profiling using SQL on various sources systems.
  • Performed data analysis on the source data coming from legacy systems.
  • Performed source data analysis & profiling to assist data architects.
  • Performed Source System Data analysis as per the Business Requirement.
  • Performed Data Analysis tasks like Profiling, Validating and Cleansing data using Informatica Analyst and developed data quality mappings using developer.
  • Used the DVO tool of Informatica to validate data analysis.
  • Involved in data analysis and created reports using SQLs.
  • Worked on Informatica Power Center and Informatica PowerExchange for metadata analysis.
  • Analyzed the key functionalities of the institute and performed Data Analysis and abstracted the transactional nature of data.
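
Source-data profiling of the kind described above usually starts with per-column null counts, distinct counts, and value ranges. The sketch below computes those basic statistics over a few hypothetical customer records; a real profile would run against the source tables and include pattern and referential checks as well.

# Sketch of lightweight source-data profiling: per-column null counts,
# distinct counts, and min/max, the kind of checks typically done before
# designing mappings.

records = [
    {"cust_id": "C01", "state": "TX", "balance": 150.0},
    {"cust_id": "C02", "state": None, "balance": 320.5},
    {"cust_id": "C03", "state": "TX", "balance": None},
]

def profile(rows):
    """Return basic profiling statistics for each column of the input rows."""
    report = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
            "min": min(non_null) if non_null else None,
            "max": max(non_null) if non_null else None,
        }
    return report

for col, stats in profile(records).items():
    print(col, stats)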


45. Data Capture

low Demand
Here's how Data Capture is used in Senior ETL Developer jobs:
  • Created Change Data Capture mappings to extract the data from tables and later used for HUB, SATTELITE AND LINK mappings.
  • Used changed data capture type mappings to load slowly changing Type 1, 2 and 3 dimensions.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
  • Implemented the Change Data Capture extensively to send only the delta records to the target systems.
  • Performed change data capture (CDC) process using SCD-I and SCD-II implementations.
  • Implemented update strategies, incremental loads, Data capture and Incremental Aggregation.
  • Created different SCD-II mappings using changed data capture (CDC) processes.
  • Implemented Change Data Capture (CDC) for handling delta loads.
  • Worked on Power Exchange for change data capture (CDC).
  • Implemented Change Data Capture (CDC) using sales force.com.
  • Designed and implemented change data capture and web services.
  • Change Data Capture can do using the Power Exchange.
  • Worked on all the SCD like Type 1, 2 and type 2 variant and Change Data Capture.
  • Used Informatica's Change Data Capture (CDC) feature to identify, capture, and deliver changes made to data sources.
  • Implemented change data capture (CDC) using Informatica PowerExchange to update tables in the oracle database.
  • Implemented Change Data Capture technology in Talend in order to load deltas to a Data Warehouse.
  • Implemented real-time Change Data Capture (CDC) using Informatica PowerExchange.
  • Well versed with Informatica Partitioning, Change data capture (CDC).
  • Implemented slowly changing dimension Type 1 andType 2 for Change data capture.


46. DEV

low Demand
Here's how DEV is used in Senior ETL Developer jobs:
  • Used metadata manager to import, export and validate data from development environment to testing environment.
  • Provided guidance to developers in professional and technical issues with successful results.
  • Developed DOS script programs to facilitate automated file movements.
  • Designed and developed process to handle high volumes of data and high volumes of data loading in a given load window.
  • Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development.
  • Prepared High level estimates, system integration support and worked in code migration activities moving the code from DEV/QA/PROD.
  • Developed PL/SQL procedures/Packages for loading the data from stage table to Facts/Dimensions with complex transform logic.
  • Designed and developed the web interfaces and services for an enterprise system.
  • Developed advanced Oracle stored procedures and handled SQL performance tuning.
  • Lead the team both development team & support team.
  • Developed Informatica mappings, enabling the extract, transform and loading large volumes of data into target tables.
  • Developed graphs to import metadata from external source like informatica and scripted jobs to run extractors on MHUB.
  • Developed and executed Talend jobs one module using DI for Rebate systems.
  • Developed joblets that are reused in different processes in the flow.
  • Design and Develop Informatica mappings based on Source to target mapping.
  • Developed many ETL packages in SSIS and Talend.
  • Developed mappings using Informatica Cloud to extract data using salesforce connector at real time and load it into SQL Server database.
  • Developed jobs to perform Address Standardization where customer addresses are standardized and loaded into HBase table which involves Talend jobs.
  • Designed and Developed Informatica mappings serving as interfaces between ISchwab and IDW.
  • Design and Develop Informatica sessions and workflows.


47. Production Environment

low Demand
Here's how Production Environment is used in Senior ETL Developer jobs:
  • Deployed packages, folders and Reports from a development environment to Testing environment and Production environment.
  • Deployed complex packages from development to production environment using proper package configuration.
  • Created a Migration strategy to migrate from development to QA/UAT/Production environment.
  • Involved in code migration from development to QA/UAT/Production environment.
  • Extended Assistance in Production environment during Project deployment phase.
  • Migrated development mappings to QA and Production environment.
  • Prepare Deployment, Roll-back and Communication Plan and ensure smooth deployment on Production Environment.
  • Used CA workload automation scheduling tool to configure/schedule the jobs in production environment.
  • Attended to Production support issues and Involved in supporting jobs in production environment.
  • Assisted in migrating jobs across Development, QA, Production environments.
  • Migrated development mappings into SIT, UAT and production environments.
  • Migrated jobs from development to QA to Production environments.
  • Worked with Shortcuts across Shared and Non-Shared Folders; migrated the code into the UTE and production environments.
  • Created mappings, sessions and workflows in development environment and later moved them to testing and production environments.
  • Migrated mappings, sessions, and workflows from development to testing and then to Production environments.
  • Worked with Informatica Administrator to setup project folders in development, test and production environments.
  • Performed Configuration Management to Migrate Informatica mappings/sessions/workflows from Development to Test to production environment.
  • Migrated Informatica mappings/sessions /workflows from Development to Test and to production environment.
  • Provide support for 680 Informatica workflows/mappings which are running into Production environment at client location.
  • Developed, tested, and promoted all code changes to the QA and Production environments.


48. ODS

low Demand
Here's how ODS is used in Senior ETL Developer jobs:
  • Developed control balance check both at Mapping level and tables level by various methods to improve the quality of the data.
  • Worked on US Address & Global Address cleaning, Match transformations to clean the data by using Quality transformations in BODI/BODS.
  • Involved in tuning the database for better performance by analyzing the table, adding Hints and by Query Tuning methods.
  • Redesigned the existing data mart ODS, in order to increase the performance and reduce the maintenance cost.
  • Involved in Data Extraction, Staging, Transformation, Pre-Loading, and Loading into ODS, Data mart.
  • Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.
  • Enhanced the Job Performance by using proper Partitioning methods and analyzing the resources utilized using Job Monitor.
  • Design Flexible ETL framework with atomic jobs for each of Staging, ODS and Fact loads.
  • Worked on ODS, Data mart and also involved in designing the ETL Uses Cases.
  • Experience in PRODUCTION SUPPORT & Supporting Application during Warranty Periods of various ETL Systems.
  • Designed and developed efficient Error Handling methods and implemented throughout the mappings.
  • Involved in debugging the failed mappings and developing error-handling methods.
  • Project: ODS INTAKE /CCDR (CLINICAL CLAIMS DATA Repository) and prod support.
  • Implemented pushdown optimization on BODS Jobs to provide maximum efficiency and performance.
  • Worked on SPD, PD Data Marts on ODS (Operational Data Store) and DDS (Definitional Data Store).
  • Designed mappings to extract data from AS/400 data image maps and developed ETL mappings using Informatica 9.5.1 to load ODS tables.
  • Identified the database tables for defining the queries and flow through ODS -ETL and defined datasets for report generation.
  • Involved in PROFILING the source system through BODS Profiler.
  • Involved in Pushdown Optimization of BODS Jobs.
  • Used basic Insert else update, Upsert (Salesforce local) strategies for stage to ODS mappings.


49. BI

low Demand
Here's how BI is used in Senior ETL Developer jobs:
  • Conducted thorough technical interviews of candidates for BI developers resulting in successful hires.
  • Used parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities.
  • Involved in implementing Oracle BI/Database Performance Tuning Techniques.
  • Coordinated with Infrastructure Production Control Teams and BI/ETL Team to support activities that resolved data quality issues and deployed quick fixes.
  • Created and modified repositories Physical Layer, Business Model and Mapping Layer and Presentation Layer using Oracle BI Administration Tool.
  • Project: Working on various BI implementations DWH Like General Ledger, Letter of Credit Implementation of New ETL Design.
  • Performed problem assessment, resolution and documentation, and, performance and quality validation on new and existing BI environments.
  • Developed detailed ETL implementation design based on technical specification for BI effort within the ETL design standards and guidelines.
  • Project will deliver the capability to analyze and report on this data using Business Objects.
  • Designed, Developed and Tested Data security and Dashboard security in OBIEE.
  • Created Oracle BI Answers Requests, Oracle BI Interactive Dashboard Pages.
  • Involved in the OBIEE RPD creation for all the business markets.
  • Have built performance-tuned systems with maximum optimization, reusability and ease of operation.
  • Developed/modified the PL/SQL Procedures and Functions to enhance the reusability of the code to be used later in various applications.
  • Implemented parallelism in loads by partitioning workflows using Pipeline, Round-Robin, Hash, Key Range and Pass-through partitions.
  • Worked on handling performance issues, Troubleshooting of Informatica Mappings, evaluating current logic for tuning possibilities.
  • Developed reusable mapplets to convert confidential HR data hash values into data with visibility.
  • Experience in ETL systems to enable the reusability of similar logic across the board.
  • Worked with the Cognos developers to create customized BI reports to meet the user requirements using Cognos Query Studio.
  • Model the Metadata schemas in Abinitio MHUB to establish links between data elements on the MHUB.


50. Ssis

low Demand
Here's how Ssis is used in Senior ETL Developer jobs:
  • Assisted in designing Logical/Physical Data Models, forward/reverse engineering using Erwin 4.0.
  • Developed ETL process using SSIS with Various Control Flow, Data Flow tasks and Store Procedures for Work Order Validation process.
  • Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.
  • Prepared Implementation plans that assist Database Administrators to deploy the code from Testing to Production environments.
  • Created SSIS Packages for data migration and used Visual C# scripts to validate file structures.
  • Worked on DATA MIGRATION using power center and SSIS (SQL Server Integration System).
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Assisted Architects to create Process Flow Diagrams, Data Flow Diagrams and High Level Documents.
  • Involved in writing the Test Cases and also assisted the users in performing UAT.
  • Assisted in performance testing, data quality assessment, support & product deployments.
  • Assisted Mapping team to transform the business requirements into ETL specific mapping rules.
  • Used SSIS Import/Export wizard to move packages from oracle data source to target.
  • Created and executed comprehensive unit test plans and assist with integrated system test.
  • Mentor the team members by providing assistance with the functional environment/Technical aspects.
  • Assisted change management reviews and impact analysis with the Business team.
  • Assisted Testing team in creating test plan and test cases.
  • Assisted data modeler in designing Conceptual, Logical and Physical data models making use of ERwin for relational OLTP systems.
  • Reworked our current dynamic SSIS package to better source from iSeries.
  • Create packages using multithreading enabling the parallel processing property of SSIS.
  • Assisted the Project lead in identifying key deliverables and defining timelines from an Enterprise perspective.


20 Most Common Skills For A Senior ETL Developer

Informatica 20.6%
Data Warehouse 8.9%
Business Requirements 7.0%
Pl/Sql 6.8%
Unix 6.0%
Target Database 5.9%
Lookup 4.9%
SQL 3.9%
Source Qualifier 3.7%
Toad 3.7%
Aggregator 3.5%
Update Strategy 3.4%
Windows XP 3.0%
Parameter Files 2.9%
Test Cases 2.8%
Mapplet Designer 2.7%
Repository 2.6%
XML 2.6%
Teradata 2.6%
DB2 2.6%

Typical Skill-Sets Required For A Senior ETL Developer

Rank Skill Percentage of resumes
1 Informatica 15.1%
2 Data Warehouse 6.5%
3 Business Requirements 5.2%
4 Pl/Sql 5.0%
5 Unix 4.4%
6 Target Database 4.3%
7 Lookup 3.6%
8 SQL 2.9%
9 Source Qualifier 2.7%
10 Toad 2.7%
11 Aggregator 2.5%
12 Update Strategy 2.5%
13 Windows XP 2.2%
14 Parameter Files 2.1%
15 Test Cases 2.0%
16 Mapplet Designer 2.0%
17 Repository 1.9%
18 XML 1.9%
19 Teradata 1.9%
20 DB2 1.9%
21 Sequence Generator 1.7%
22 Source Systems 1.7%
23 Different Transformations 1.4%
24 Autosys 1.4%
25 Complex Mappings 1.2%
26 Schema 1.1%
27 Debugger 1.1%
28 QA 1.1%
29 Design Documents 1.0%
30 UAT 0.9%
31 Fact Tables 0.9%
32 Source Data 0.9%
33 Worklets 0.9%
34 User Acceptance 0.8%
35 Normalizer 0.8%
36 Business Logic 0.8%
37 SCD 0.8%
38 Bteq 0.7%
39 EDW 0.7%
40 Email 0.7%
41 CDC 0.6%
42 Warehouse Designer 0.6%
43 Technical Specifications 0.6%
44 Data Analysis 0.6%
45 Data Capture 0.6%
46 DEV 0.6%
47 Production Environment 0.6%
48 ODS 0.5%
49 BI 0.5%
50 Ssis 0.5%
