Teja Anne
Email – *********@*****.***
Ph – 716-***-****
(SAFe Agile Certified)
PROFESSIONAL SUMMARY:
6+ years of IT experience in the Software Development Life Cycle (SDLC), including requirements gathering, design, implementation and testing.
3+ years of technical and functional experience in production support of data warehouses, implementing ETL (Extract, Transform and Load) using Informatica PowerCenter 10.2/9.6/9.5/8.6/7.1.3/6.2.
Solid understanding of ETL design principles and good practical knowledge of performing ETL design processes through Informatica.
Extensively worked on PowerCenter Mapping Designer, Mapplet Designer, Transformation Developer, Warehouse Designer, Workflow Manager, Repository Manager and Workflow Monitor.
Well acquainted with performance tuning of sources, targets, mappings and sessions to overcome bottlenecks in mappings.
Sound Understanding of Data warehousing concepts and Dimensional modeling (Star schema and Snowflake schema)
Strong analytical and conceptual skills in database design and implementation of RDBMS concepts.
Experienced in Oracle database development using PL/SQL, Stored Procedures, Triggers, Functions and Packages.
Good working knowledge of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
Experience in UNIX shell scripting for file validations, file downloads and workflow executions (see the sketch at the end of this summary).
Development experience on Windows NT/95/98/2000/XP and UNIX platforms.
Excellent communication and interpersonal skills.
Good understanding of Star and Snowflake Schema, Dimensional Modeling, Relational Data Modeling and Slowly Changing Dimensions.
Expertise in OLTP/OLAP system study, analysis and E-R modeling, and in developing database schemas such as star and snowflake schemas.
Resourceful, creative problem-solver with proven aptitude to analyze and translate complex customer requirements and business problems and design/implement innovative custom solutions.
Exceptional problem-solving and sound decision-making capabilities; recognized by associates for data quality, alternative solutions, and confident, accurate decisions.
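For illustration, a minimal sketch of the kind of file-validation wrapper referenced above; the paths, folder, workflow name and Integration Service details are hypothetical placeholders, not actual client objects:

    #!/bin/sh
    # Hypothetical pre-load validation: confirm the inbound file exists and is
    # non-empty before triggering the Informatica workflow with pmcmd.
    SRC_FILE=/data/inbound/daily_feed.dat          # hypothetical path
    if [ ! -s "$SRC_FILE" ]; then
        echo "`date`: $SRC_FILE is missing or empty; aborting load" >&2
        exit 1
    fi
    # The service, domain, folder and workflow names below are placeholders.
    pmcmd startworkflow -sv INT_SVC -d INFA_DOMAIN -u "$INFA_USER" -p "$INFA_PWD" \
        -f DW_FOLDER -wait wf_daily_load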
TECHNICAL SKILLS:
Informatica Tools: Informatica PowerCenter 10.2/9.6/9.5/8.6/7.1.3/6.2
Designing Tools: Adobe Dreamweaver, Sublime Text, Adobe Photoshop, Adobe Illustrator
Languages: SQL, PL/SQL
Operating Environment: UNIX (Solaris 9.0), AIX 5.2, MS Windows 95/98/NT/2000/XP
Databases: Oracle 11g, SQL, PL/SQL, Hive, HBase, HDFS, MS Access
Operating Systems: Windows, Linux, Ubuntu
Graphics: MS Visio 2007
Scripting: UNIX shell scripting
EDUCATION:
Bachelor of Engineering in Computer Science, JNTUK, India – Apr 2014
Master of Science in Computer Science, Northwestern Polytechnic University – Dec 2016
Master of Science, University of the Cumberlands, Kentucky – Present
PROFESSIONAL EXPERIENCE:
CVS Health, Scottsdale, AZ Jan 2018 - Present
Informatica Developer/ Production Support
Responsibilities:
Worked in production support on Tier-1 applications such as Secure Chat, extracting and loading data into all supporting databases through Informatica processes.
Worked extensively on performing the ETL into the Level 1 as-is staging, Level 2 staging and DW load layers based on the business rules and transformation specifications.
Worked extensively on the most recent version of Informatica (10.2.0), created mappings using complex transformations, used shell scripting heavily to automate jobs, and worked mainly in the Designer, Monitor and Repository clients.
Worked extensively on all databases holding large volumes of data and created backups of the data in each database.
Identified the root causes of Informatica job failures, solved complex problems and reran the processes to avoid customer-related issues.
Extensive experience with data stores such as Oracle, HBase, Cassandra, Hive and HDFS.
Extensively used Transformation Language functions in the mappings to produce the desired results.
Used session logs, workflow logs and debugger to debug the session and analyze the problem associated with the mappings and generic scripts.
Worked extensively in the Informatica 10.2 Workflow Monitor, Designer, Repository Manager and Workflow Manager.
Analyzed requirements from the users and created and reviewed the specifications for the ETL.
Designed the incremental-load strategy and created reusable transformations, Mapplets, mappings, sessions and workflows.
Used session parameters and mapping variables/parameters, and created parameter files to allow flexible workflow runs driven by changing variable values (a minimal example follows this list).
Created complex ETL mappings to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, connected/unconnected and dynamic Lookups, Filter, Sequence Generator, Router and Update Strategy.
Extensively worked on writing complex SQL queries, creating complex mappings and designing workflows per the requirements.
Experience working with backend database technologies such as Hive, Cassandra and SQL, writing complex queries based on the requirements.
Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
Worked with complex mapping using transformations such as Expression, Router, Lookup, Filter, Joiner, SQ, Stored Procedures and Aggregator.
Tuned existing mappings to reduce the total ETL process time.
Analyzed, inspected and laid the framework for the Claims Process Engine, a process designed and customized by external vendors for the client.
Strong working experience with Oracle and with all transformation types in PowerCenter 10.2.
Worked extensively on processing structured and unstructured data.
Used Informatica B2B Data Transformation, which supports XML-based transformations and mappings of unstructured data.
Resolved the anomalies arising in the file processing across individual stages of data flow.
Designed, Developed and unit tested the ETL process for loading the Medicaid/Medicare records across STAGE, LOAD and ACCESS schemas.
Accomplished the Provider Directory load (a five-step sequential process) using Informatica for the states of AZ and RI.
Extended the functionalities of existing ETL process of Medicaid for Medicare.
Extensively worked on the transformations like Source Qualifier, Filter, Joiner, Aggregator, Expression and Lookup.
Maintained documents for Design reviews, Engineering Reviews, ETL Technical specifications, Unit test plans, Migration checklists and Schedule plans.
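A minimal sketch of the parameter-file pattern mentioned in this list; the folder, workflow and parameter names are hypothetical, not the actual client objects:

    # Sample parameter file, e.g. /infa/params/wf_dw_load.parm (hypothetical)
    [DW_FOLDER.WF:wf_dw_load]
    $$LOAD_DATE=2018-01-15
    $$SRC_SYSTEM=SECURE_CHAT

    # Run the workflow against that parameter file and wait for completion.
    pmcmd startworkflow -sv INT_SVC -d INFA_DOMAIN -u "$INFA_USER" -p "$INFA_PWD" \
        -f DW_FOLDER -paramfile /infa/params/wf_dw_load.parm -wait wf_dw_load

Changing a run's behavior then only requires editing the parameter file, not the mapping or workflow itself.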
Environment: Informatica PowerCenter 10.2.0, Oracle 12c, Windows NT, flat files (fixed width/delimited), MS Excel, UNIX shell scripting, Hadoop, Hive, HBase, HDFS
Farm Journal Media, Chicago, IL Mar 2016 – Jan 2018
Informatica Developer/Production Support
Responsibilities:
Analyzed and gathered the Product/Performance attributes available in EDW for FSP application team to build their interface for the source.
Interacted extensively with the business analysts to understand the nature of the data to be transformed and loaded to EDM to generate three reports: FSP Analytics, POG & Section Productivity, and Space Productivity.
Designed, Developed and Tested the ETL for data transition from FSP source to EDWSTG, EDW and finally to EDM which was used for generating the reports.
Implemented Generic load process which was a metadata driven procedure designed by CCS to accomplish the time phased data load from EDWSTG to EDW.
Implemented the conventional load from EDWSTG -> EDW -> EDM schemas.
Performed the calculations for sales, assessed space and other related metrics to the lowest level of granularity to be loaded to the Fact table in EDM based on which the weekly/monthly reports were generated.
Extensively worked on the transformations like Source Qualifier, Filter, Joiner, Aggregator, Expression and Lookup.
Used session and workflow logs to debug the session and analyze the problem associated with the mappings and generic scripts.
Implemented AutoSys jobs for the weekly/monthly loads of data.
Worked extensively on the Informatica tools: Designer, Workflow Monitor and Workflow Manager.
Worked extensively on various Teradata load utilities like FastLoad, MLoad & TPump to populate data into Teradata tables.
Worked on all the Transformations like Lookup, Aggregator, Expression, Router, Filter, Update Strategy, Stored procedure, Sequence Generator.
Debugged and validated the mappings and was involved in code reviews.
Created and scheduled sessions and batch processes to run on demand, on schedule, or only once using the Informatica Server Manager.
Worked on BTEQ scripts to populate data from stage to target (sketched after this list).
Loaded the load-ready files into Teradata tables using Teradata ODBC, FastLoad and MultiLoad connections; the load date and time were also captured in the Teradata table.
Extensively used various performance-tuning techniques (e.g., partitioning) to improve session performance.
Extensively worked on correcting and reloading rejected files.
Used the command-line program pmcmd to communicate with the Informatica Server.
Completed documentation in relation to detailed work plans, mapping documents and high-level data models.
Configured and used FTP by the Informatica Server to access source files.
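A minimal sketch of a stage-to-target BTEQ load of the kind referenced in this list, wrapped in a shell script; the TDPID, credentials and table names are hypothetical:

    #!/bin/sh
    # Hypothetical BTEQ stage-to-target load with basic error handling.
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_password;
    INSERT INTO edw.sales_fact (store_id, item_id, sale_dt, sale_amt, load_ts)
    SELECT store_id, item_id, sale_dt, sale_amt, CURRENT_TIMESTAMP
    FROM   stg.sales_stage;
    .IF ERRORCODE <> 0 THEN .QUIT 1;
    .LOGOFF;
    .QUIT 0;
    EOF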
Environment: Informatica PowerCenter 7.1.3/7.1/6.1, DB2, Oracle 9i, Windows NT/XP, UNIX (AIX 5.2), HP-UX 10.20, flat files (delimited), Teradata V2R5/V2R6, FastLoad, MultiLoad
Mindtree, Hyderabad, India July 2014 – June 2015
Informatica Developer
Responsibilities:
Analyzed the functional specs provided by the data architect and created technical spec documents for all the mappings.
Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin.
Converted the data mart from logical design to physical design; defined data types, constraints and indexes; generated the schema in the database; created automated scripts; and defined storage parameters for the objects in the database.
Defined various facts and dimensions in the data mart, including factless facts and aggregate and summary facts.
Reviewed source systems and proposed a data acquisition strategy.
Designed and customized data models for a Data Mart supporting data from multiple sources in real time.
Designed and developed Informatica mappings to load data from source systems to the ODS and then to the Data Mart.
Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup and Router transformations to populate the target tables efficiently.
Created Mapplets and used them in different mappings.
Provided knowledge transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
Performance tuning of the Informatica mappings using components such as parameter files, variables and dynamic cache.
Performance tuning using round-robin, hash auto-key and key-range partitioning.
Used shell scripts for automating the execution of maps, as sketched below.
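A minimal sketch of that kind of automation; the service, folder and workflow names are hypothetical:

    #!/bin/sh
    # Hypothetical wrapper: run the load workflows in sequence via pmcmd and
    # stop at the first failure.
    for WF in wf_stg_load wf_ods_load wf_mart_load; do    # hypothetical names
        pmcmd startworkflow -sv INT_SVC -d INFA_DOMAIN -u "$INFA_USER" -p "$INFA_PWD" \
            -f DM_FOLDER -wait "$WF"
        if [ $? -ne 0 ]; then
            echo "`date`: $WF failed; stopping the run" >&2
            exit 1
        fi
    done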
Environment: Oracle 9i, SQL Plus, PL/SQL, Windows 2000