
Data Sql

Location:
Atlanta, Georgia, United States
Salary:
120000
Posted:
February 13, 2018


Resume:

Qualifications Profile

Around * years' experience in Ab Initio. Quality-focused, solution-driven professional with extensive knowledge of the project life cycle, including design, development, implementation, and testing of applications of various sizes using ETL and BI tools, as well as performance tuning.

Senior ETL/Ab Initio Developer with 6 years' experience in ETL, data mapping, transformation, and loading from source to target databases in complex, high-volume environments; well-versed in the full development life cycle through cutover.

Extensive experience with EME for version control, impact analysis and dependency analysis.

Hands-on experience with Ab Initio BRE. Created sandboxes and edited sandbox parameters according to the repository; extensive exposure to EME.

Experience working on the ATLAS DATA LAKE program.

Experience working on the ingestion process and HDFS partitions. Good knowledge of reconciliation concepts, test harness, clean-sandbox testing, and fixing dependency analysis across sandboxes.

Experience working with Metadata Hub; expertise in fixing dataset lineage.

Worked on continuous flows and Conduct>It. Good understanding of newer Ab Initio features such as component folding, Parameter Definition Language (PDL), continuous flows, queues, and publisher/subscriber components.

Experience with creating and deploying UNIX shell scripts in a production environment.

Experience with configuration of ODBC drivers and file transfer protocols NDM, Secure Shell, and SFTP.

Knowledge of TWS (Tivoli Workload Scheduler) for job scheduling.

Proficient with MS Project, PowerPoint, Excel, and Visio.

Deploying and maintaining production job schedules with AutoSys.

Extensively created and used Teradata SET tables, MULTISET tables, global temporary tables, volatile tables, and temporary tables (see the SQL sketch at the end of this profile).

Experience in Express>It template design, development, configuration, and deployment.

Experience with Teradata SQL, Teradata ANSI SQL, and Teradata Tools & Utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ, and Queryman).

Experience designing, developing, and testing ETL solutions in advanced Ab Initio.

Excellent knowledge of Ab Initio PDL, meta programming, vector programming, web services, and stored procedures.

Experience working with Ab Initio, DB2/Netezza/Oracle databases, and UNIX/Windows operating systems.

Experience working with heterogeneous source systems such as Oracle 10g, DB2, MS SQL Server, flat files, and legacy systems. Visionary planner, adept at creating strategic initiatives, with a proven ability to diagnose, troubleshoot, and resolve application- and data-related problems with solutions that consistently meet corporate objectives for business and technology performance.

Experience with Agile methodology.

Architects, designs, modifies, develops, and implements ETL solutions with Ab Initio, UNIX, and DB2.

Effective leader and team player, with superior decision-making and problem-solving techniques. Able to interface with management, clients, and people of diverse socio-cultural backgrounds. Known for a strong sense of pride in handling duties, along with unparalleled commitment to getting the job done and achieving high results.
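
Illustrative Teradata SQL sketch of the table types listed in this profile (database, table, and column names are hypothetical, not taken from any engagement):

    -- SET table: duplicate rows are not allowed
    CREATE SET TABLE stg.customer_set
    ( cust_id   INTEGER NOT NULL,
      cust_name VARCHAR(100)
    ) PRIMARY INDEX (cust_id);

    -- MULTISET table: duplicate rows are allowed
    CREATE MULTISET TABLE stg.customer_multiset
    ( cust_id   INTEGER NOT NULL,
      cust_name VARCHAR(100)
    ) PRIMARY INDEX (cust_id);

    -- Volatile table: private to the session, dropped at logoff
    CREATE VOLATILE TABLE vt_customer AS
    ( SELECT cust_id, cust_name FROM stg.customer_multiset )
    WITH DATA
    ON COMMIT PRESERVE ROWS;

    -- Global temporary table: permanent definition, per-session contents
    CREATE GLOBAL TEMPORARY TABLE gt_customer_load
    ( cust_id INTEGER NOT NULL,
      load_dt DATE
    ) ON COMMIT PRESERVE ROWS;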

Areas of Expertise

Performance Tuning

Data Transfer and Data Migration

Data Processing & Development

Query Optimization

Technical and User Documentation

Data Warehousing and ETL Tools

Technical Acumen

Primary Tools:

Ab Initio (Co>Op 3.0.3.9/2.15/2.14/2.13/2.10, GDE 3.0.4/1.15/1.14/1.13/1.10), Application Configuration Environment (ACE 0.12.3), BRE, Teradata SQL, Teradata Tools and Utilities, Oracle 10g/9i, MS SQL Server 6.5/7.0/2000, DB2, Netezza

Languages:

Teradata SQL

Teradata Utilities:

BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, Teradata Manager

Databases:

Teradata 13/12/V2R6, Oracle 10g/9i

Operating Systems:

Windows 95/98/NT/2000/XP, UNIX, Linux, NCR MP-RAS UNIX

Scheduling and tracking tools:

Control-M, ClearQuest

Education

Bachelor of Engineering in Electrical Engineering

Andhra University, Visakhapatnam, India

Training

Big Data/Hadoop – Hadoop, HDFS, Hive, Pig, HBase, Flume, Sqoop

Teradata SQL Basic and Advanced

Teradata Physical Design and Implementation

ETL – Ab Initio, Informatica 8.6.1, Netezza, Oracle

Mainframe DB2

Professional Experience

SUNTRUST BANK, ATLANTA, GEORGIA

Ab Initio Developer MAY 2017 – JAN 2018

Environment: Ab Initio (GDE 3.0.6, Co-Operating System 3.1.5), Teradata 15/13.10, UNIX shell scripting, Oracle 10g/9i, UNIX, Windows XP/2000

Notable Contributions:

Extensively used the Ab Initio ETL tool to design and implement extract, transform, and load processes. Used various Ab Initio components effectively to develop and maintain the database.

Gathered business requirements through extensive interaction with users and reporting teams, and assisted in developing the low-level design documents.

Worked on the ATLAS DATA LAKE program.

Worked on the ingestion process and HDFS partitions, including reconciliation, the test harness, clean-sandbox testing, and fixing dependency analysis across sandboxes.

Worked with Metadata Hub and fixed dataset lineage.

Used WinSQL to analyze production issues with DB2 database tables; performed further data analysis, resolved the issues, and automated fixes as needed for enhancements to the current requirements.

Involved in gathering business requirements, logical modelling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.

Used TWS (Tivoli Workload Scheduler) for job scheduling.

Defined the schema, staging tables, and landing-zone tables; configured base objects, foreign-key relationships, and complex joins; and built efficient views.

Wrote scripts for extraction, transformation, and loading of data from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad, and TPump.

Performed query optimization using explain plans, collected statistics, and primary and secondary indexes. Used volatile tables and derived queries to break complex queries into simpler ones (see the SQL sketch at the end of this section). Streamlined the migration process for Teradata scripts and shell scripts on the UNIX box.

Created several Sandboxes from scratch for the new project implementations along with sandbox and project parameters.

Created several packages to set up and share global variables, types, and transforms, which were used extensively across many Ab Initio graphs.

Converted user-defined functions and complex business logic of an existing application process into Ab Initio graphs using components such as Reformat, Join, Transform, Sort, and Partition to facilitate the subsequent loading process.

Extensively used partitioning components (Broadcast, Partition by Key, Partition by Range, Partition by Round-robin) and departition components (Concatenate, Gather, Merge) in Ab Initio.

Implemented application configuration using Ab Initio BRE (Business Rule Engine); also created trigger files and set up thresholds.

Implemented transform components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup, and Scan, and created the appropriate XFRs and DMLs.

Worked as a data analyst to configure the mappings portion of the iFramework (ACE), based on the new Ab Initio framework UI product responsible for filtering, validating, and transforming data (including use of the cross-reference translation services).

Responsible for deploying Ab Initio graphs and running them through the Co-Operating System's mp shell command language, and for automating the ETL process through scheduling.

Improved the performance of Ab Initio graphs using techniques such as lookups (instead of joins), in-memory joins, and rollups.

Implemented a phasing and checkpoint approach in the ETL process to prevent data loss and maintain uninterrupted data flow in the event of process failures.

Worked with business users and business analysts on requirements gathering and business analysis.

Developed ETL code based on business requirements using various Ab Initio components, making use of statements and variables in the components to create complex data transformations.

Extracted data from source tables, transformed it based on user requirements, and loaded it to the target server.

Used Ab Initio functions such as is_valid, is_error, is_defined, and string_* functions for data cleansing. Developed Ab Initio graphs for data validation using validate components.

Evaluated graphs and made recommendations to improve performance by minimizing the number of components in a graph, tuning the max-core value, using Lookup components instead of joins for small tables and flat files, filtering data at the beginning of the graph, etc.

Wrote SQL scripts to extract data from the database and for testing purposes.

Interacted with the source team and the business to validate the data.

Involved in transferring the processed files from the mainframe to the target system.

Supported the code after production deployment.

Familiar with Agile software methodologies (Scrum).
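
A minimal Teradata SQL sketch of the volatile-table and statistics approach described in this section (EDW table and column names are hypothetical):

    -- Stage the heavily filtered part of a complex query in a volatile table
    CREATE VOLATILE TABLE vt_recent_txn AS
    ( SELECT acct_id, txn_dt, txn_amt
      FROM   edw.transactions
      WHERE  txn_dt >= DATE '2017-01-01'
    ) WITH DATA
    PRIMARY INDEX (acct_id)
    ON COMMIT PRESERVE ROWS;

    -- Collect statistics so the optimizer can pick a better join plan
    COLLECT STATISTICS ON vt_recent_txn COLUMN (acct_id);

    -- Review the plan before running the simplified join
    EXPLAIN
    SELECT a.acct_id, a.acct_type, SUM(t.txn_amt) AS total_amt
    FROM   edw.accounts a
    JOIN   vt_recent_txn t
      ON   a.acct_id = t.acct_id
    GROUP BY a.acct_id, a.acct_type;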

WELLS FARGO, SIOUX FALLS, SOUTH DAKOTA

Ab Initio Developer OCT 2015 – APR 2017

Environment: Ab Initio (GDE 3.2.18, Co-Operating System 3.1.5), Teradata 15/13.10, UNIX shell scripting, Oracle 10g/9i, UNIX, Windows XP/2000

Notable Contributions:

Extensively used the Ab Initio ETL tool to design and implement extract, transform, and load processes. Used various Ab Initio components effectively to develop and maintain the database.

Gathered business requirements through extensive interaction with users and reporting teams, and assisted in developing the low-level design documents.

Used WinSQL to analyze production issues with DB2 database tables; performed further data analysis, resolved the issues, and automated fixes as needed for enhancements to the current requirements.

Involved in gathering business requirements, logical modelling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.

Used TWS (Tivoli Workload Scheduler) for job scheduling.

Defined the schema, staging tables, and landing-zone tables; configured base objects, foreign-key relationships, and complex joins; and built efficient views.

Wrote scripts for extraction, transformation, and loading of data from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad, and TPump.

Performed query optimization using explain plans, collected statistics, and primary and secondary indexes. Used volatile tables and derived queries to break complex queries into simpler ones. Streamlined the migration process for Teradata scripts and shell scripts on the UNIX box.

Created several Sandboxes from scratch for the new project implementations along with sandbox and project parameters.

Created several packages to set up and share global variables, types, and transforms, which were used extensively across many Ab Initio graphs.

Converted user-defined functions and complex business logic of an existing application process into Ab Initio graphs using components such as Reformat, Join, Transform, Sort, and Partition to facilitate the subsequent loading process.

Extensively used partitioning components (Broadcast, Partition by Key, Partition by Range, Partition by Round-robin) and departition components (Concatenate, Gather, Merge) in Ab Initio.

Implemented application configuration using Ab Initio BRE (Business Rule Engine); also created trigger files and set up thresholds.

Implemented transform components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup, and Scan, and created the appropriate XFRs and DMLs.

Worked as a data analyst to configure the mappings portion of the iFramework (ACE), based on the new Ab Initio framework UI product responsible for filtering, validating, and transforming data (including use of the cross-reference translation services).

Responsible for deploying Ab Initio graphs and running them through the Co-Operating System's mp shell command language, and for automating the ETL process through scheduling.

Improved the performance of Ab Initio graphs using techniques such as lookups (instead of joins), in-memory joins, and rollups.

Implemented a phasing and checkpoint approach in the ETL process to prevent data loss and maintain uninterrupted data flow in the event of process failures.

Worked with business users and business analysts on requirements gathering and business analysis.

Developed ETL code based on business requirements using various Ab Initio components, making use of statements and variables in the components to create complex data transformations.

Extracted data from source tables, transformed it based on user requirements, and loaded it to the target server.

Used Ab Initio functions such as is_valid, is_error, is_defined, and string_* functions for data cleansing. Developed Ab Initio graphs for data validation using validate components.

Evaluated graphs and made recommendations to improve performance by minimizing the number of components in a graph, tuning the max-core value, using Lookup components instead of joins for small tables and flat files, filtering data at the beginning of the graph, etc.

Wrote SQL scripts to extract data from the database and for testing purposes (see the validation sketch at the end of this section).

Interacted with the source team and the business to validate the data.

Involved in transferring the processed files from the mainframe to the target system.

Supported the code after production deployment.

Familiar with Agile software methodologies (Scrum).
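
A minimal Teradata SQL sketch of the kind of test query used to validate a load (schema, table, and column names are hypothetical):

    -- Reconcile row counts between the staging table and the warehouse target
    SELECT CAST('staging' AS VARCHAR(10)) AS layer, COUNT(*) AS row_cnt
    FROM   stg.daily_loans
    UNION ALL
    SELECT CAST('target' AS VARCHAR(10)) AS layer, COUNT(*) AS row_cnt
    FROM   edw.loans
    WHERE  load_dt = CURRENT_DATE;

    -- Keys present in staging but missing from the target
    SELECT s.loan_id
    FROM   stg.daily_loans s
    LEFT JOIN edw.loans t
      ON   s.loan_id = t.loan_id
    WHERE  t.loan_id IS NULL;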

United Health Care, Schaumburg, IL

Ab Initio Developer JAN 2014 – DEC 2014

Environment: Ab Initio (GDE 3.1.16, Co-Operating System 3.16.10), Teradata 14.10, UNIX shell scripting, Windows NT/2000, COBOL, JCL, DB2, File-Aid, Teradata V2R6, UNIX IBM AIX 5.1, QMF.

Notable Contributions:

Developed several partition-based Ab Initio graphs for a high-volume data warehouse.

Involved in all phases of the system development life cycle, analysis, and data modeling.

Extensively used the Enterprise Meta Environment (EME) for version control.

Extensive exposure to generic graphs for data cleansing, data validation, and data transformation.

Created sandboxes and edited sandbox parameters according to the repository; extensive exposure to EME.

Extracted data from the DB2 database on the mainframe and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities. Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.

Architected and developed FastLoad and MultiLoad scripts; developed macros and stored procedures to extract data, and BTEQ scripts that take a date range from the database to drive the extract (see the BTEQ sketch at the end of this section).

Created JCL scripts to call and execute BTEQ, FastExport, FastLoad, and MultiLoad scripts.

Developed Teradata BTEQ scripts to implement the business logic and exported data using Teradata FastExport.

Wrote highly complex SQL to pull data from the Teradata EDW and create ad hoc reports for key business personnel within the organization.

Created data models for information systems by applying formal data modeling techniques.

Strong expertise in physical modeling, including the use of primary, secondary, partitioned primary (PPI), and join indexes.

Designed fact and dimension tables for star and snowflake schemas using the ERwin tool and used them for building reports.

Performed reverse engineering of physical data models from databases and SQL scripts.

Provided database implementation and database administrative support for custom application development efforts.

Performed performance tuning and optimization of database configuration and application SQL using explain plans and statistics collection based on UPI, NUPI, USI, and NUSI.

Used air commands to perform dependency analysis for all Ab Initio objects.

Involved in Ab Initio design and configuration: ETL, data mapping, transformation, and loading in a complex, high-volume environment, with data processing at the terabyte level.

Extensively used partition components and developed graphs using Write Multiple Files, Read Multiple Files, Filter by Expression, Run Program, Join, Sort, and Reformat.

Followed the best design principles, efficiency guidelines and naming standards in designing the graphs.

Used TWS (Tivoli Workload Scheduler) for job scheduling.

Developed shell scripts for archiving, data-loading procedures, and validation.

Involved in writing unit test scripts, support documents, and the implementation plan.

Tuned graphs by creating lookup files and adjusting in-memory sort and max-core parameters to maximize use of cache memory and enhance performance.

Implemented a 6-way multifile system composed of individual files on different nodes, partitioned and stored in distributed directories (using multidirectories).

Used database query optimization and I/O tuning techniques for performance enhancements.

Responsible for cleansing the data from source systems using Ab Initio components such as Reformat and Filter by Expression.

Worked with an offshore team to build the project; worked closely with users on requirements gathering and provided the offshore team with detailed requirements documents.

Used subgraphs to increase graph clarity and to impose reusable business restrictions.

Capable of designing solutions around Ab Initio, with advanced skills in high performance and parallelism in Ab Initio.
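
A minimal BTEQ sketch of the date-range-driven extract described in this section (logon details, table, column, and file names are placeholders, not actual values):

    .LOGON tdprod/etl_user,password

    .EXPORT REPORT FILE = /data/extracts/claims_extract.txt

    -- Drive the extract window from a control table
    SELECT c.claim_id, c.member_id, c.claim_amt, c.service_dt
    FROM   edw.claims c
    JOIN   edw.extract_control x
      ON   c.service_dt BETWEEN x.start_dt AND x.end_dt
    ORDER BY c.claim_id;

    .EXPORT RESET
    .LOGOFF
    .QUIT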

Capital One Financial, Richmond, Virginia

Ab Initio/Teradata Consultant JAN 2012 – AUG 2013

Environment: Ab Initio (GDE 3.0.6, Co-Operating System 3.1.5), Teradata 13.10, UNIX shell scripting, Windows NT/2000, COBOL, JCL, DB2, File-Aid, Teradata V2R6, UNIX IBM AIX 5.1, QMF.

Notable Contributions:

Developed several partition-based Ab Initio graphs for a high-volume data warehouse.

Involved in all phases of the system development life cycle, analysis, and data modeling.

Extensively used the Enterprise Meta Environment (EME) for version control.

Extensive exposure to generic graphs for data cleansing, data validation, and data transformation.

Created sandboxes and edited sandbox parameters according to the repository; extensive exposure to EME.

Extracted data from the DB2 database on the mainframe and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities. Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.

Architected and developed FastLoad and MultiLoad scripts; developed macros and stored procedures to extract data, and BTEQ scripts that take a date range from the database to drive the extract.

Created JCL scripts to call and execute BTEQ, FastExport, FastLoad, and MultiLoad scripts.

Developed Teradata BTEQ scripts to implement the business logic and exported data using Teradata FastExport.

Wrote highly complex SQL to pull data from the Teradata EDW and create ad hoc reports for key business personnel within the organization.

Created data models for information systems by applying formal data modeling techniques.

Strong expertise in physical modeling, including the use of primary, secondary, partitioned primary (PPI), and join indexes.

Designed fact and dimension tables for star and snowflake schemas using the ERwin tool and used them for building reports (see the DDL sketch at the end of this section).

Performed reverse engineering of physical data models from databases and SQL scripts.

Provided database implementation and database administrative support for custom application development efforts.

Performed performance tuning and optimization of database configuration and application SQL using explain plans and statistics collection based on UPI, NUPI, USI, and NUSI.

Used air commands to perform dependency analysis for all Ab Initio objects.

Involved in Ab Initio design and configuration: ETL, data mapping, transformation, and loading in a complex, high-volume environment, with data processing at the terabyte level.

Extensively used partition components and developed graphs using Write Multiple Files, Read Multiple Files, Filter by Expression, Run Program, Join, Sort, and Reformat.

Followed the best design principles, efficiency guidelines and naming standards in designing the graphs.

Used TWS (Tivoli Workload Scheduler) for job scheduling.

Developed shell scripts for archiving, data-loading procedures, and validation.

Involved in writing unit test scripts, support documents, and the implementation plan.

Tuned graphs by creating lookup files and adjusting in-memory sort and max-core parameters to maximize use of cache memory and enhance performance.

Implemented a 6-way multifile system composed of individual files on different nodes, partitioned and stored in distributed directories (using multidirectories).

Used database query optimization and I/O tuning techniques for performance enhancements.

Responsible for cleansing the data from source systems using Ab Initio components such as Reformat and Filter by Expression.

Worked with an offshore team to build the project; worked closely with users on requirements gathering and provided the offshore team with detailed requirements documents.

Used subgraphs to increase graph clarity and to impose reusable business restrictions.

Capable of designing solutions around Ab Initio, with advanced skills in high performance and parallelism in Ab Initio.
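
An illustrative Teradata DDL sketch of the fact and dimension design with a partitioned primary index described in this section (all object names and date ranges are hypothetical):

    -- Dimension table keyed on a surrogate key
    CREATE SET TABLE edw.dim_account
    ( account_key  INTEGER NOT NULL,
      account_no   VARCHAR(20),
      product_type VARCHAR(30),
      open_dt      DATE
    ) UNIQUE PRIMARY INDEX (account_key);

    -- Fact table with a partitioned primary index (PPI) on transaction date
    CREATE MULTISET TABLE edw.fact_transaction
    ( account_key INTEGER NOT NULL,
      txn_dt      DATE NOT NULL,
      txn_amt     DECIMAL(18,2)
    ) PRIMARY INDEX (account_key)
      PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2012-01-01'
                                   AND     DATE '2013-12-31'
                                   EACH INTERVAL '1' MONTH);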

Bank of America, Dallas, TX

Ab Initio Developer MAY 2011 – DEC 2011

Environment: Ab Initio (GDE 1.15, Co-Operating System 2.15), UNIX shell scripting, Oracle 10g, Windows NT/2000, COBOL, JCL, DB2, File-Aid, BMC, QMF, TSO SIM tool, Web star, Mercury Test Director

Notable Contributions:

Worked on Ab Initio graphs that transfer data from various sources such as DB2, legacy systems, flat files, and CSV files to Oracle and flat-file targets.

Worked on multiple projects with data transfer from different sources and targets.

Used Tivoli as the scheduling tool and worked closely with the scheduling team on automating quarterly jobs as well as daily jobs.

Automated a series of quarterly jobs and improved efficiency by reducing the total time taken, running jobs in parallel after checking dependencies (such as sources and targets, and completion of prior phases).

Widely used transform components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup, and Scan, and created the appropriate XFRs and DMLs.

Extensively used partitioning components (Broadcast, Partition by Key, Partition by Range, Partition by Round-robin) and departition components (Concatenate, Gather, Merge).

Worked with the DBAs on long-running SQL jobs; checked active sessions and took corrective actions such as purging inactive sessions and adding query hints so the jobs could complete.

In the event of a failed job, explored the particular graph and went through the different phases, components, and their properties to resolve the issue.

Coordinated with the offshore team during the daily jobs and took over running and pending jobs.

Manually executed jobs when the scheduling tool (Tivoli) broke down.

To improve the performance of long-running queries, added hints to increase the degree of parallelism and optimize the queries.

Created new config (.cfg) files and modified existing .cfg files for the master scripts, depending on the phases executed by the current job.

Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence levels, join types, and the order in which the tables are joined.

Collected multi-column statistics on all non-indexed columns used during join operations and on all columns used in residual conditions (see the SQL sketch at the end of this section).

Used Remedy as the defect-tracking tool.
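
A minimal sketch, written in Teradata SQL for consistency with the rest of this resume, of the EXPLAIN review and multi-column statistics collection described above (table and column names are hypothetical):

    -- Review the plan for unnecessary product joins and low-confidence steps
    EXPLAIN
    SELECT o.order_id, c.cust_name, o.order_amt
    FROM   sales.orders o
    JOIN   sales.customers c
      ON   o.cust_no     = c.cust_no
      AND  o.cust_region = c.cust_region;

    -- Multi-column statistics on the non-indexed join columns
    COLLECT STATISTICS ON sales.orders    COLUMN (cust_no, cust_region);
    COLLECT STATISTICS ON sales.customers COLUMN (cust_no, cust_region);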


