
Data Manager

Location:
Jersey City, NJ
Salary:
110K
Posted:
December 16, 2020


Resume:

Lakshmi Pallavi Achanta adiqr5@r.postjobfree.com

+1-276-***-****

Professional Overview

9+ years of experience in Information Technology with a strong background in analyzing, designing, developing, testing, and implementing data warehousing applications in verticals such as healthcare and retail, as well as on the AWS platform. Hands-on experience building cloud data warehouses such as AWS S3/Redshift.

Experience in defining project scope, requirements gathering from business users and system design (functional and technical specifications).

Experience in Data Modeling, Star/Snowflake schema modeling, Fact and Dimension Tables, Physical and Logical data modeling using Erwin Data modeling tool.

Strong understanding of data warehousing methodologies, including the Bill Inmon and Ralph Kimball approaches.

Hands-on experience with reporting tools such as OBIEE, Cognos, and Business Objects.

Extensive experience in using Informatica Power Center 9.6/9.1/8.6.1/8.5/8.1/7.x/6.x, Power Exchange, Power Mart 5.x/4.x, IDQ (Informatica Data Quality), and Informatica MDM (Master Data Management).

Extensively used Informatica client tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager, and server tools such as Informatica Server and Repository Server.

Extensive experience in debugging mappings. Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.

Experience in designing and developing complex mappings using transformations such as Connected and Unconnected Lookup, Source Qualifier, Joiner, Expression, Filter, Sorter, Aggregator, Router, Update Strategy, Stored Procedure, Sequence Generator, and reusable transformations.

Proficient in integrating various data sources, including multiple relational databases (Oracle 11g/10g/9i, MS SQL Server, DB2), COBOL files, XML files, and flat files (fixed width, delimited), into the staging area of an ODS, Data Warehouse, or Data Mart.

Extensively worked on PL/SQL packages, procedures, cursors, functions, and triggers, and on the creation of schema objects such as tables, indexes, constraints, sequences, synonyms, triggers, views, and inline views.

Implemented Slowly Changing Dimensions – Type I & II in different mappings as per the requirements.

Experience in implementing update strategies, incremental loads, incremental aggregation, and Change Data Capture.

Experience in identifying performance bottlenecks and tuning of Informatica sources, targets, mappings, Transformations, and sessions for better performance.

Excellent skills in using versioning control in Informatica.

Excellent skills in working with UNIX shell scripts to load files into the Informatica source file directory and to securely encrypt and transfer files using SFTP. Worked with pre-session and post-session UNIX scripts for automation of ETL jobs using AutoSys, Appworx, and CONTROL-M schedulers. Involved in migration/conversion of ETL processes from development to QA and from QA to production environments.

Experience in providing 24/7 Production Support.

Extensive knowledge of Software Development Life Cycle (SDLC), having thorough understanding of various phases like Requirements, Analysis/Design, Development and Testing.

Experience in unit testing in Informatica Power Center.

Organized, flexible and a quick learner with the ability to multi-task and work independently or in a team environment.

Technical Skills

Data Warehousing

Informatica Power Center 10.1/9.1/8.6.1/8.5/8.1, Power exchange, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, OLTP, OLAP, AWS, S3.

Dimensional Data Modeling

Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, ERWIN 4.5.

Databases

Oracle 11g/10g/9i/8i, Redshift, Microsoft SQL Server, IBM DB2.

Programming GUI

C, C++, SQL, PL/SQL, Java, Python, SQL*Plus, XML.

Shell Scripting

Unix Shell Scripting, Korn shell scripting.

Other Tools

SQL*Plus, TOAD, PL/SQL Developer and PuTTY.

Operating Systems

Windows 95/98, Windows NT/2000/XP Professional, MS-DOS, UNIX, Sun Solaris and HP-UX/AIX.

Education

Master’s in Software Engineering, Stratford University VA, USA.

Professional Experience

NYC Mayor’s Office of Contract Services, NY Dec’18 – Present

AWS ETL Developer/Sr Informatica Developer

Responsibilities

Work closely with the MOCS application services and operations teams on ETL development efforts, including analysis and design of integration solutions, data and reporting needs of internal and external stakeholders, and enhancements related to applications and services.

Work with external tables in Redshift to load files from S3.
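
A minimal sketch of this pattern, assuming a psycopg2 connection to the cluster and an external schema already registered with CREATE EXTERNAL SCHEMA; the cluster endpoint, schema, table, and S3 path below are hypothetical placeholders:

```python
# Illustrative sketch: expose S3 files to Redshift as an external (Spectrum) table,
# then copy the rows into an internal staging table. All names are hypothetical.
import psycopg2

ddl = """
CREATE EXTERNAL TABLE spectrum_stage.vendor_contracts (
    contract_id   VARCHAR(32),
    vendor_name   VARCHAR(256),
    amount        DECIMAL(18,2),
    updated_at    TIMESTAMP
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION 's3://example-bucket/landing/vendor_contracts/';
"""

load_sql = """
INSERT INTO stage.vendor_contracts
SELECT contract_id, vendor_name, amount, updated_at
FROM spectrum_stage.vendor_contracts;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dw", user="etl_user", password="***",
)
conn.autocommit = True  # CREATE EXTERNAL TABLE cannot run inside a transaction block
with conn.cursor() as cur:
    cur.execute(ddl)       # register the S3 files as an external table
    cur.execute(load_sql)  # copy the rows into the internal staging table
conn.close()
```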

Hands-on experience using AWS services such as VPC subnets, EC2, S3, CloudFront, SNS, SQS, RDS, IAM, CloudWatch, and CloudFormation, focusing on high availability.

Work with the Matillion ETL tool in AWS to create SCD Type I and SCD Type II mappings to load data into Redshift.

Hands-on experience with Amazon EventBridge, AWS Step Functions, and Elasticsearch.

Hands-on experience working on MPP databases like Redshift.

Performance tuning and support of ETL, Database jobs.

Work with AWS CodeCommit for source control and versioning.

Work with UNIX scripts to encrypt files and transfer them to the S3 bucket.

Work on daily jobs to move files from a remote host to the S3 environment.
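
A minimal sketch of such a job, assuming paramiko for the SFTP pull and boto3 for the S3 upload; hostnames, credentials, bucket, and paths are hypothetical placeholders:

```python
# Illustrative sketch: pull a daily file from a remote host over SFTP
# and stage it in an S3 landing prefix. All names are hypothetical.
import boto3
import paramiko

REMOTE_HOST = "sftp.example.org"
REMOTE_PATH = "/outbound/contracts_daily.csv"
LOCAL_PATH = "/tmp/contracts_daily.csv"
BUCKET = "example-bucket"
S3_KEY = "landing/contracts/contracts_daily.csv"

# Download the file from the remote host over SFTP
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(REMOTE_HOST, username="etl_user", password="***")
sftp = ssh.open_sftp()
sftp.get(REMOTE_PATH, LOCAL_PATH)
sftp.close()
ssh.close()

# Upload the file to the S3 landing prefix
s3 = boto3.client("s3")
s3.upload_file(LOCAL_PATH, BUCKET, S3_KEY)
```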

Design stage jobs to pick up data files from S3 into the Redshift staging layer.

Design transformation jobs from the stage tables to the Redshift target tables for Type I and Type II loads, as per requirements.
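
The Type I and Type II patterns behind these jobs can be sketched in Redshift SQL as below. This is an illustration only: dimension, stage, and column names are hypothetical, and the statements would be executed in order through a Python DB-API cursor (the production jobs were built in Matillion).

```python
# Type I: overwrite changed attributes in place on the current dimension rows.
scd1_sql = """
UPDATE dw.dim_vendor
SET vendor_name = s.vendor_name,
    vendor_city = s.vendor_city
FROM stage.vendor s
WHERE dw.dim_vendor.vendor_id = s.vendor_id
  AND (dw.dim_vendor.vendor_name <> s.vendor_name
       OR dw.dim_vendor.vendor_city <> s.vendor_city);
"""

# Type II, step 1: close out the current version of any changed row.
scd2_expire_sql = """
UPDATE dw.dim_vendor
SET current_flag = 'N',
    effective_end_dt = CURRENT_DATE
FROM stage.vendor s
WHERE dw.dim_vendor.vendor_id = s.vendor_id
  AND dw.dim_vendor.current_flag = 'Y'
  AND dw.dim_vendor.vendor_name <> s.vendor_name;
"""

# Type II, step 2: insert a new current version for new and changed keys
# (any stage row that no longer has a current dimension row).
scd2_insert_sql = """
INSERT INTO dw.dim_vendor
    (vendor_id, vendor_name, current_flag, effective_start_dt, effective_end_dt)
SELECT s.vendor_id, s.vendor_name, 'Y', CURRENT_DATE, DATE '9999-12-31'
FROM stage.vendor s
LEFT JOIN dw.dim_vendor t
       ON t.vendor_id = s.vendor_id AND t.current_flag = 'Y'
WHERE t.vendor_id IS NULL;
"""
```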

Write Python scripts to start and stop EC2 instances in AWS.
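
A minimal sketch of such a script using boto3; the region and instance IDs are hypothetical placeholders:

```python
# Illustrative sketch: start/stop EC2 instances around a batch window with boto3.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
INSTANCE_IDS = ["i-0123456789abcdef0"]  # hypothetical instance IDs

def start_instances():
    """Start the ETL worker instances before the nightly load."""
    ec2.start_instances(InstanceIds=INSTANCE_IDS)

def stop_instances():
    """Stop the instances once the load has finished, to save cost."""
    ec2.stop_instances(InstanceIds=INSTANCE_IDS)

if __name__ == "__main__":
    start_instances()
```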

Make API calls from Matillion to download and parse files from DocuSign.

Work with the Informatica ETL tool to develop SCD Type I and SCD Type II mappings.

Work on Informatica Cloud in AWS for a POC to build source-to-target mappings.

Develop Informatica ETL code to load data in DWH tables.

Testing of Informatica ETL jobs developed.

Tuning of Informatica ETL jobs.

Environment

Informatica Power Center 10.1, Matillion, Oracle 12, TOAD for Oracle 12, Erwin 10, UNIX Shell Scripting, AWS, Informatica Cloud, Redshift, S3, SQL Workbench, Python, PuTTY, Cognos, cron for scheduling.

Prudential Financial, NJ Jun’18 – Nov’18

Sr. Informatica Developer

Prudential Financial provides insurance, investment management, and other financial products and services to both retail and institutional customers throughout the United States and in over 30 other countries.

Responsibilities

Involved in gathering, analyzing and documenting business requirements and functional requirements and data specifications from users and transformed them into technical specifications.

Extracted data from various sources and loaded it into tables.

Based on the requirements, used various transformations like Source Qualifier, Expression, Filter, B2B, Router, Update strategy, Sorter, Lookup, Aggregator and Joiner in the mapping.

Implemented incremental loads, Change Data Capture, and incremental aggregation.
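
The pattern behind these incremental loads can be sketched as a watermark-driven extract. In the project this logic lived inside Informatica mappings and mapping variables against Oracle; the sketch below uses psycopg2 against a PostgreSQL-compatible database purely for illustration, and all table and column names are hypothetical:

```python
# Illustrative sketch of a watermark-driven incremental extract (change data capture).
import psycopg2

conn = psycopg2.connect(host="db.example.org", port=5432,
                        dbname="dw", user="etl_user", password="***")

with conn, conn.cursor() as cur:
    # 1. Read the high-water mark recorded by the previous run.
    cur.execute("SELECT last_extract_ts FROM etl.load_control WHERE job_name = %s",
                ("policy_load",))
    (last_ts,) = cur.fetchone()

    # 2. Pull only the rows changed since that watermark.
    cur.execute("""
        SELECT policy_id, holder_name, premium, updated_at
        FROM src.policies
        WHERE updated_at > %s
    """, (last_ts,))
    changed_rows = cur.fetchall()

    # 3. Advance the watermark so the next run starts where this one ended.
    cur.execute("""
        UPDATE etl.load_control
        SET last_extract_ts = (SELECT MAX(updated_at) FROM src.policies)
        WHERE job_name = %s
    """, ("policy_load",))
```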

Extensively wrote complex SQL queries to extract data from sources and load it into the target tables.

Identified performance bottlenecks and was involved in performance tuning of sources, targets, mappings, transformations, and sessions to optimize session performance.

Developed SCD Type I mappings.

Developed UNIX shell scripts to transfer files, archive files.

Developed UNIX shell scripts to validate header, trailer, and validate source files before extracting.
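
The production checks were UNIX shell scripts; a Python equivalent of the idea is sketched below. The file layout assumed here (an 'H|' header line and a 'T|<record_count>' trailer line) is hypothetical:

```python
# Illustrative sketch: validate header, trailer, and record count before a load.
import sys

def validate_file(path: str) -> bool:
    with open(path) as fh:
        lines = [line.rstrip("\n") for line in fh]

    if not lines or not lines[0].startswith("H|"):
        print("Missing or malformed header record")
        return False
    if not lines[-1].startswith("T|"):
        print("Missing or malformed trailer record")
        return False

    expected = int(lines[-1].split("|")[1])   # record count carried in the trailer
    actual = len(lines) - 2                   # exclude header and trailer lines
    if expected != actual:
        print(f"Trailer count {expected} does not match detail count {actual}")
        return False
    return True

if __name__ == "__main__":
    sys.exit(0 if validate_file(sys.argv[1]) else 1)
```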

Scheduled Informatica workflows using AutoSys.

Practiced Agile methodology while implementing projects.

Environment

Informatica Power Center 10.1, Oracle 12, TOAD for Oracle 12, Erwin 10, UNIX Shell Scripting, PuTTY, PL/SQL, Cognos, AutoSys scheduling tool.

U.S. Cellular, Chicago Aug’17 – May’18

Sr. Informatica Developer

U.S. Cellular is the fifth-largest wireless telecommunications network in the United States.

Responsibilities

Extracted data from various sources like flat files, XML files, and Oracle, and loaded it into tables.

Worked on Informatica 10.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.

Based on the requirements, used various transformations like Source Qualifier, Expression, Filter, B2B, Router, Update strategy, Sorter, Lookup, Aggregator and Joiner in the mapping.

Created complex mappings using the Mapping designer, respective workflows and worklets using the Workflow manager.

Troubleshot the mappings using the Debugger and improved data loading efficiency using SQL overrides and Lookup SQL overrides.

Developed SCD Type I and SCD Type II mappings using MD5 logic.
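
In the mappings this was done with Informatica's MD5() expression function to detect changed rows with a single comparison; the Python sketch below only illustrates the idea, and the column names and stored hash are hypothetical:

```python
# Illustrative sketch of MD5-based change detection for SCD loads.
import hashlib

def row_md5(row: dict) -> str:
    """Hash the tracked columns so one comparison detects any attribute change."""
    tracked = ["plan_name", "rate_plan", "status"]  # hypothetical tracked columns
    concatenated = "|".join(str(row.get(col, "")) for col in tracked)
    return hashlib.md5(concatenated.encode("utf-8")).hexdigest()

incoming = {"customer_id": 42, "plan_name": "Unlimited", "rate_plan": "A1", "status": "ACTIVE"}
existing_hash = "d41d8cd98f00b204e9800998ecf8427e"  # hash stored on the current dimension row

if row_md5(incoming) != existing_hash:
    # Type I: update in place; Type II: expire the current row and insert a new version.
    print("Change detected - route row to the SCD update branch")
```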

Implemented incremental loads, Change Data Capture, and incremental aggregation.

Created UNIX shell scripts to encrypt and decrypt the files, move the files, archive the files and split the files.

Developed UNIX shell scripts and SQL to get data from Oracle tables.

Created stored procedures, functions and triggers to load data into summary tables.

Extensively wrote complex SQL queries to extract data from sources and load it into the target tables.

Implemented parallelism in loads by partitioning workflows using Key Range partitioning.

Worked on MDM manual maintenance to add, delete, update or merge data.

Experience in MDM implementation, including data profiling, data migration, and pre-landing processing.

Created Informatica Data Quality services (Data Integration Service, Analyst Service, and Content Management Service) and gained experience using the Informatica Developer and Analyst tools.

Experience in Informatica administration, creating Repository Services and Integration Services, and hands-on experience with the Admin Console and command-line utilities in Informatica.

Practiced agile methodology while strategizing and implementing solutions

Scheduled Informatica workflows using the TWS scheduling tool.

Environment

Informatica Power Center 10.1, Oracle 12, TOAD for Oracle 12, Erwin 10, UNIX Shell Scripting, IDQ, PuTTY, PL/SQL, MDM 10.1, Cognos, TWS scheduling tool.

Hyatt Hotels Corporation, Chicago Sep’16 – July’17

Sr. Informatica Developer

Hyatt Hotels Corporation is an American multinational owner, operator, and franchiser of hotels, resorts, and vacation properties.

Responsibilities

Involved in gathering, analyzing and documenting business requirements and functional requirements and data specifications from users and transformed them into technical specifications.

Extracted data from various sources like flat files, Informix, XML files and loaded into Enterprise data warehouse.

Worked on Informatica 9.6.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.

Based on the requirements, used various transformations like Source Qualifier, Expression, Filter, B2B, Router, Update strategy, Sorter, Lookup, Aggregator and Joiner in the mapping.

Created complex mappings using the Mapping designer, respective workflows and worklets using the Workflow manager.

Troubleshot the mappings using the Debugger and improved data loading efficiency using SQL overrides and Lookup SQL overrides.

Developed SCD Type I and SCD Type II mappings using MD5 logic.

Implemented incremental loads, Change Data Capture, and incremental aggregation.

Identified performance bottlenecks and was involved in performance tuning of sources, targets, mappings, transformations, and sessions to optimize session performance.

Created UNIX shell scripts to load files into the Informatica source file directory and to encrypt, decrypt, and transfer files through SFTP from remote locations.
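
A Python sketch of the encrypt-and-transfer step is shown below for illustration (the production version was a UNIX shell script). It assumes gpg is installed with the partner's public key imported, and uses paramiko for SFTP; hostnames, key IDs, and paths are hypothetical placeholders:

```python
# Illustrative sketch: GPG-encrypt an extract and push it to a remote host over SFTP.
import subprocess
import paramiko

SRC = "/data/outbound/claims_extract.txt"
ENCRYPTED = SRC + ".gpg"

# Encrypt the file for the partner's public key.
subprocess.run(
    ["gpg", "--batch", "--yes", "--recipient", "partner@example.org",
     "--output", ENCRYPTED, "--encrypt", SRC],
    check=True,
)

# Transfer the encrypted file over SFTP.
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("sftp.partner.example.org", username="etl_user", password="***")
sftp = ssh.open_sftp()
sftp.put(ENCRYPTED, "/inbound/claims_extract.txt.gpg")
sftp.close()
ssh.close()
```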

Developed UNIX shell scripts and SQL to get data from Oracle tables.

Parameterized Informatica connections

Used SQL tools like DB Visualizer, SQL Developer to run SQL queries and validate the data in warehouse.

Implemented parallelism in loads by partitioning workflows using Key Range partitioning.

Worked on MDM manual maintenance to add, delete, update or merge data.

Experience in MDM implementation, including data profiling, data migration, and pre-landing processing.

Created Informatica Data Quality services (Data Integration Service, Analyst Service, and Content Management Service) and gained experience using the Informatica Developer and Analyst tools.

Hands on experience on Informatica cloud services.

Experience in Informatica administration, creating Repository Services and Integration Services, and hands-on experience with the Admin Console and command-line utilities in Informatica.

Practiced agile methodology while strategizing and implementing solutions and Involved in Code reviews.

Scheduled Informatica workflows using the AutoSys scheduling tool.

Environment

Informatica Power Center 9.6.1, DB2, SQL Developer for DB2, DB Visualizer, Informix, Flat Files, Erwin 9, MS Visio, LINUX, UNIX Shell Scripting, IDQ, PUTTY, PL/SQL, MDM 9.7.1, Cognos, Autosys scheduling tool.

DaVita Village Health, IL Jun’12 – July’16

Sr. Informatica Developer

Village Health is a dedicated team of specially trained nurses and professionals providing integrated care management to patients with kidney disease throughout the US. They provide services that help improve patients' lives by working with them to prevent complications, reduce the number of avoidable hospitalizations, and improve overall health.

Responsibilities

Extracted data from various sources like flat files, Netezza, XML files, Oracle and loaded into Enterprise data warehouse.

Worked on Informatica 9.x client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.

Based on the requirements, used various transformations like Source Qualifier, Expression, Filter, B2B, Router, Update strategy, Sorter, Lookup, Aggregator and Joiner in the mapping.

Created complex mappings using the Mapping designer, respective workflows and worklets using the Workflow manager.

Troubleshot the mappings using the Debugger and improved data loading efficiency using SQL overrides and Lookup SQL overrides.

Developed SCD Type I and SCD Type II mappings using MD5 logic.

Implemented incremental loads, Change Data Capture, and incremental aggregation.

Identified performance bottlenecks and was involved in performance tuning of sources, targets, mappings, transformations, and sessions to optimize session performance.

Created UNIX Shell scripts to load the files into Informatica source file directory.

Built Informatica code to load patient data into a text file and created UNIX shell scripts to securely encrypt and transfer the files to a remote location through SFTP.

Parameterized Informatica connections, built an Informatica framework using PL/SQL procedures, and incorporated them into the mappings.

Created Email tasks, Event Wait tasks, and touch files to ensure FACT table loads run only after the dimension tables are loaded.

Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.

Responsible for migration of the mappings and sessions from development repository to production repository and provided 24/7 production support.

Practiced agile methodology while strategizing and implementing solutions

Involved in Code reviews.

Involved in upgrade of Informatica.

Environment

Informatica Power Center 9.6/9.1/8.6.1, Oracle 11g, Toad 10.6 for Oracle, Flat Files, Netezza, Erwin 9, MS Visio, Windows 7, UNIX Shell Scripting, IDQ, PuTTY, PL/SQL, SQL, MDM, Cognos, Active Batch scheduling tool.

Dunkin Brands Inc., MA Mar’11 – May’12

Sr. Informatica Developer

Dunkin' Donuts is the largest coffee and baked goods restaurant chain in the world, with loyal customers in 31 countries, and Baskin-Robbins is one of the largest ice cream specialty store chains. Data is extracted from various sources, cleansed, and stored in the data warehouse using an ETL tool, and reporting on the data warehouse is done using a BI tool.

Environment

Informatica Power Center 9.1.0/8.6.1, Oracle 10g, Toad 10.6 for Oracle, Flat Files, XML Files, Erwin 7.3, MS Visio, Windows 2000, UNIX AIX, Shell Scripting, IDQ, PL/SQL, SQL, OBIEE, Appworx Scheduling Tool.

References: Available on request.


