
Geetha Navuluri

Phone: 972-***-****

Email: ac63xb@r.postjobfree.com

Summary:

8 years of IT experience delivering Business Intelligence and Data Warehouse solutions across the complete Software Development Life Cycle (SDLC), including user interaction, business analysis/modeling, design, development, integration, planning, testing, and documentation for data warehouse applications covering ETL processing and reporting.

Professional Summary:

Extensive experience with Data Extraction, Transformation, and Loading (ETL) from disparate data sources such as relational databases (Oracle, SQL Server, and DB2), Excel, flat files, and PeopleSoft applications.

Experience with SQL Server, Oracle, SSIS, SSRS, SSAS, and Informatica.

Experience in dimensional data modeling with Star and Snowflake schemas.

Experience in T-SQL for developing stored procedures, functions, subqueries, joins, views, and indexes, and for creating and updating table scripts.
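
For illustration, a minimal T-SQL sketch of this kind of stored procedure work (the procedure, table, and column names are hypothetical, not from any specific project):

    -- Hypothetical example: monthly loan counts by status for a report dataset
    CREATE PROCEDURE dbo.usp_GetLoanStatusCounts
        @ReportMonth DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT ls.StatusName,
               COUNT(*) AS LoanCount
        FROM dbo.Loan AS l
        JOIN dbo.LoanStatus AS ls
            ON ls.StatusId = l.StatusId
        WHERE l.ReportMonth = @ReportMonth
        GROUP BY ls.StatusName;
    END;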

Strong Experience in Developing, Configuring and Deploying SSIS Packages.

Created packages to extract and transform data from various source systems and load it into different data marts.

Experience using Control Flow tasks such as For Loop Container, Foreach Loop Container, Sequence Container, Execute SQL Task, Script Task, Execute Process Task, and Data Flow Task in designing packages.

Hands-on experience performing various SSIS data transformations such as Derived Column, Data Conversion, Lookup, Aggregate, and Conditional Split.

Created event handlers and error handlers for error management.

Experience interacting with business users to analyze business processes, gather requirements, transform requirements into screens, perform ETL, and document and roll out the deliverables.

Expertise in developing SSRS reports using SQL.

Created partitions, collected statistics on tables, and optimized query performance.
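
As a sketch only (object names are hypothetical, and equivalent work was also done against other engines), the partitioning and statistics maintenance described above looks roughly like this in T-SQL:

    -- Hypothetical example: monthly range partitioning and a statistics refresh
    CREATE PARTITION FUNCTION pfMonthly (DATE)
        AS RANGE RIGHT FOR VALUES ('2018-01-01', '2018-02-01', '2018-03-01');

    CREATE PARTITION SCHEME psMonthly
        AS PARTITION pfMonthly ALL TO ([PRIMARY]);

    -- Recollect optimizer statistics on a large fact table
    UPDATE STATISTICS dbo.FactLoan WITH FULLSCAN;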

Experience in developing drill-down, drill-through, ad-hoc, and parameterized reports and dashboards.

Experience configuring and maintaining Report Manager and Report Server to deploy and schedule reports.

Involved in analyzing, designing, building, and testing OLAP cubes with SSAS, and in adding calculations using MDX.

Experience in designing and configuring Data Source Views, defining relations in the DSV, Named Calculations, Dimensions, Measure Groups, and relationships in cubes, and in deploying Analysis Services databases.

Extensive Knowledge in creating Partitions, Aggregations, Calculated Measures and developing reports using MDX.

Knowledge of designing and developing SSAS Tabular Models.

Experience in deploying and browsing cubes with multiple fact measures and multiple dimension hierarchies in SSAS.

Trained in Power BI and Tableau reporting tools.

Hands-on experience in creating mappings using Informatica 8.x.

Experience in taking repository backups, table backups, etc.

Performed all Informatica administration functions such as user access accounts, roles, setting standards and associated privileges and folders.

Experience in Informatica mapping specification documentation and in tuning mappings to increase performance; proficient in creating and scheduling workflows and in automating the ETL process.

Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator, and Stored Procedure, and on developing mappings and mapplets.

Experience in performance-tuning mappings and in identifying and resolving bottlenecks at various levels (sources, targets, mappings, sessions) in Informatica and at the database level.

Created Deployment Group to migrate the code.

Effective communication and interpersonal skills, with strong analytical and problem-solving abilities.

TECHNICAL SKILLS

ETL Tools: SSIS 2012/2008, Informatica PowerCenter 8.x, 9.0.1

Reporting Tools: SSRS 2012/2008, OBIEE, Power BI, Tableau

Analytical Tools: SSAS 2012/2008

Languages: SQL, PL/SQL (Oracle), .NET

Scripting Languages: Unix shell

Databases: SQL Server 2012/2008, MS Access, Oracle 11g

Packages: Microsoft Office 2010/2007/2003

Operating Systems: Windows 08/07/03/XP

Work Experience:

Project #1: Mr. Cooper, TX June 2017 – Till Date

Role: BI Developer

Environment: SQL Server 2008, SSIS 2008, SSRS 2008, Windows server 2010, C#.net

Description:

NSM was founded in 1994 and is headquartered in the Dallas, Texas area. Nationstar Holdings consists of Nationstar Mortgage, which provides servicing and originations for homeowners throughout the United States, and Xome, which provides technology and data-enhanced solutions to the real estate market and to companies engaged in the origination and/or servicing of mortgage loans. NSM is one of the largest mortgage servicers in the United States, with a servicing portfolio of approximately $500 billion and more than 3 million customers.

Responsibilities:

Interacting with clients to understand business processes and requirements and implementing data reports.

Working closely with subject matter experts/users to ensure monthly resolution of data issues.

Researching data issues in various departments such as Bankruptcy, Foreclosure, Escrow, and Mortgage Insurance, and providing solutions to fix them.

Managing multiple projects involving acquisitions, vendor status improvements, and financial reporting.

Translating business concepts into firm, explicit requirements through sessions with business front-line personnel as well as front-line and one-up management.

Manually extracting, transforming, and loading data from disparate sources to perform data analysis using SQL, VLOOKUP, and Excel pivot tables.

Writing stored procedures and ad-hoc queries as required to provide data reports.

Automation of data loads using integration services.

Creating exception reports to analyze data gaps.

Extracting data from various sources, transforming it per the business requirements, and then loading it into the target dynamically using SSIS packages.

Sending job status notifications automatically to the users.

Automatically converting Excel files to binary format once the data has been loaded, using C# code.

Performing unit testing and regression testing to make sure existing functionality does not break.

Created jobs and implemented them to call the SSIS packages on demand.
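
A minimal sketch of such an on-demand job, assuming a file-system package and hypothetical job and package names (the actual jobs and paths differ):

    -- Hypothetical example: SQL Server Agent job that runs an SSIS package on demand
    EXEC msdb.dbo.sp_add_job
         @job_name = N'Load_LoanMod_OnDemand';

    EXEC msdb.dbo.sp_add_jobstep
         @job_name  = N'Load_LoanMod_OnDemand',
         @step_name = N'Run SSIS package',
         @subsystem = N'SSIS',
         @command   = N'/FILE "D:\Packages\LoadLoanMod.dtsx" /REPORTING E';

    EXEC msdb.dbo.sp_add_jobserver
         @job_name = N'Load_LoanMod_OnDemand';

    -- Start the job when needed
    EXEC msdb.dbo.sp_start_job N'Load_LoanMod_OnDemand';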

Implementing best practices to increase the performance of the SSIS packages while loading data into the database.

Created Complex Summary reports such as Drill Through and delivered them in Excel formats.

Created daily, weekly and monthly Loan Modification reports.

Working with large amounts of data, with emphasis on extensive data mining, data aggregation, and reconciliation.

Established best practices and procedures for database administration.

Analyzing data and writing complex SQL and T-SQL programs.

Creating indexes on tables for better query performance.
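
For example, a covering nonclustered index of the kind this involves (table and column names are hypothetical):

    -- Hypothetical example: covering index for a frequent report query
    CREATE NONCLUSTERED INDEX IX_Loan_StatusId_ReportMonth
        ON dbo.Loan (StatusId, ReportMonth)
        INCLUDE (UnpaidPrincipalBalance);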

Using SharePoint to track newly created ad-hoc team reporting requests.

Project #2: CalWIN, CA Mar 2016 – May 2017

Role: BI Developer

Environment: SQL Server 2012, SSIS 2012, SSRS 2012, SSAS 2012 and Windows server 2008

Description:

The CalWIN system is a large-scale, client/server-based, fully integrated, online, interactive, automated system that determines client eligibility and employability, calculates and issues benefits, exchanges information with several external systems, and provides management support. The CalWIN system was developed for the Welfare Client Data System (WCDS) Consortium, which consists of eighteen (18) California counties. Together, these counties administer services for over forty percent (40%) of the State of California's public assistance caseload. CalWIN provides functionality to support the CalWORKs, General Assistance/General Relief, Food Stamps, Refugee Cash Aid, Foster Care, and Medi-Cal assistance programs. Over 25,000 end users and more than a million clients will eventually benefit from CalWIN's automated eligibility determination, benefit calculation, and case management. Users of the system are presented with more than 2,000 graphical user interface web pages to access a wide range of functionality, and clients have access to online, web-based policy manuals.

Responsibilities:

Interacting with the client to understand business processes, gather business requirements, and implement packages and reports.

Used different transformations available in SSIS to load data into Data Warehouse and for different computations.

Used Control Flow Tasks like Sequence Container, For Each Loop Container, Execute SQL Task, Send Mail Task and Data Flow Task.

Created jobs and implemented them to call the SSIS packages on demand.

Implemented unit testing.

Developed and maintained technical documentation from source to target load for ETL development process.

Conducting weekly/daily status calls within the team and with the offshore team to meet project deadlines.

Implementing best practices to increase the performance of the SSIS packages while loading data into the database.

Implementing the incremental loading logic for the ETL packages.
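
The incremental logic itself lived in the SSIS packages; as a rough T-SQL sketch of the equivalent upsert pattern (table and column names are hypothetical):

    -- Hypothetical sketch of an incremental (upsert) load from staging to target
    MERGE dbo.DimClient AS tgt
    USING stg.Client AS src
        ON tgt.ClientId = src.ClientId
    WHEN MATCHED AND tgt.RowHash <> src.RowHash THEN
        UPDATE SET tgt.FirstName = src.FirstName,
                   tgt.LastName  = src.LastName,
                   tgt.RowHash   = src.RowHash
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (ClientId, FirstName, LastName, RowHash)
        VALUES (src.ClientId, src.FirstName, src.LastName, src.RowHash);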

Created Complex Summary reports such as Drill Down, Drill Through and Matrix reports using SSRS and delivered them in Excel and PDF formats.

Designed and developed static and dynamic charts and sub-reports using SSRS.

Developed SSAS multidimensional cubes by designing and configuring dimensions, fact tables, and measure groups.

Created dynamic partitions on the cube for better performance.

Testing reports for data gaps.

Project #3: CBRE, Dallas June 2013-Feb 2016

Role: ETL, Report Developer

Environment: SQL Server 2012, SSIS 2012, SSRS 2012, Windows server 2008

Description:

CBRE is the world's premier, full-service real estate services company. Operating globally, the firm holds a leadership position in virtually all of the world's key business centers. After a successful BI implementation for the NAM region, CBRE recently expanded the BI application to the APAC region and is currently expanding it to EMEA. The scope of the project is to implement the phase 3 BI application enhancement items and provide support.

Responsibilities:

Created SSIS packages to extract the data from OLTP databases and load it into the target.

Extracted data from various sources like SQL Server, CSV, and Excel files.

Created packages using various transformations such as Slowly Changing Dimension, Lookup, Aggregate, Derived Column, Conditional Split, Fuzzy Lookup, Multicast, and Data Conversion.

Automated the process of running packages by creating Jobs and Schedules using SQL Server Agent.

Created complex summary reports such as Drill Down, Drill Through, and Matrix reports using SSRS and delivered them in Excel and PDF formats.

Created various calculations and visualizations and deployed the reports to the UAT and Production environments.

Automated report delivery by using subscriptions in SSRS.

Generated multi-parameterized reports using SSRS 2008, allowing users to make selections before running reports to make them more user-friendly.

Creating Dashboards and KPIs.

Developed custom Ad Hoc analysis reports.

Involved in the creation/review of functional requirement specifications and supporting documents for business systems.

Reviewed and tested packages and fixed bugs (if any) using SQL Server 2012 Business Intelligence Development Studio.

Project #4: Cardinal Health Care Jan 2012 – May 2013

Role: ETL Developer

Environment: SQL Server 2008, SSIS 2008, Windows server 2008

Description:

Headquartered in Dublin, Ohio, Cardinal Health, Inc. (NYSE: CAH) is a $91 billion, global company serving the health care industry with products and services that help hospitals, physician offices and pharmacies reduce costs, improve safety, productivity and profitability, and deliver better care to patients. With a focus on making supply chains more efficient, Cardinal Health manufactures medical and surgical products and is one of the largest distributors of pharmaceuticals and medical supplies worldwide. Ranked No. 18 on the Fortune 500, Cardinal Health employs more than 30,000 people across the world. Cardinal Health currently uses Informatica 8 to load its Enterprise Data Warehouse.

Responsibilities:

Implemented the SSIS packages as per requirement using technical specification and data mapping analysis documents for performing all ETL operations.

Implemented the lookup framework in all packages using the Lookup transformation.

Implemented package deployment and package scheduling through jobs in SQL Server, as well as file system deployment.

Implemented the connection strings dynamically using variables through expressions in all packages.

Implemented the Package Configurations (Package Deployment Model, Expressions, Variables, Connection Strings)

Implemented logging using the SSIS log provider for SQL Server.
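
With that provider, SSIS writes events to a logging table (dbo.sysssislog on SQL Server 2008); a query along these lines, with a hypothetical package name, is typical for reviewing a run:

    -- Review errors and warnings logged by the SSIS SQL Server log provider
    SELECT starttime,
           source,      -- package or task name
           event,       -- e.g. OnError, OnWarning
           message
    FROM dbo.sysssislog
    WHERE event IN ('OnError', 'OnWarning')
      AND source = 'LoadCardinalEDW'   -- hypothetical package name
    ORDER BY starttime DESC;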

Created various documents (Technical Specification, Mapping Analysis, and Deployment).

Developed unit test cases for testing the SSIS packages, ensuring minimal bugs slipped to the next level.

Implemented the Error Handling using Event handlers.

Query Optimization and testing data.

Project #5: AmeriQuest Mortgage Sept 2010 – Dec 2011

Role: ETL Developer

Environment: Informatica 9.1, Oracle 10g, Unix, OBIEE and Toad

Description:

The Enterprise Data Warehouse of Ameriquest Mortgage was built by integrating financial information across various OLTP systems. EDW marts were designed to extract data from EDW staging for end users and managerial ad-hoc analysis. The Enterprise Data Warehouse is mainly used for customer status, index rates, loan allocation status, involved parties in loan origination, and total loans funded for each branch.

Responsibilities:

Extensively used Informatica PowerCenter 9.1 to extract data from various sources and load it into staging and target databases.

Extracted data from PeopleSoft, Oracle, and flat files and loaded it into the target Oracle database.

Involved in the complete SDLC, collecting and documenting requirements. Prepared technical designs/specifications for data extraction, transformation, and loading.

Extensively designed, developed, and tested Informatica mappings to extract data from external flat files and Oracle tables.

Extensively used Mapping Variables, Mapping Parameters, Workflow variables and Session Parameters.

Extensively used the Expression, Router, Filter, Lookup (connected/unconnected), Update Strategy, and Aggregator transformations.

Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
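
The Type 2 handling was implemented in the Informatica mappings themselves (Lookup plus Update Strategy); as a plain SQL sketch of the same pattern, with hypothetical table, column, and sequence names, the logic closes out the current row and inserts a new version:

    -- Hypothetical sketch of Type 2 SCD logic: expire the current row, add a new one
    UPDATE dim_customer
       SET eff_end_date = SYSDATE,
           current_flag = 'N'
     WHERE customer_id  = :p_customer_id
       AND current_flag = 'Y';

    INSERT INTO dim_customer
        (customer_sk, customer_id, customer_name, eff_start_date, eff_end_date, current_flag)
    VALUES
        (dim_customer_seq.NEXTVAL, :p_customer_id, :p_customer_name, SYSDATE, NULL, 'Y');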

Extensively implemented Informatica pass-through partitioning to improve the performance of the mappings.

Performed Informatica code migration from development to testing and testing to production systems.

Wrote Shell Scripts for event automation and scheduling.

Created UNIX scripts to automate activities such as starting, stopping, and aborting Informatica workflows using the pmcmd command.

Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.

Worked closely with the reporting team to ensure the correct data is presented in the reports.

Migrated workflows, mappings, and repository objects from development to QA to production.

Used various Informatica error-handling techniques to debug failed sessions.

Implemented standards for naming Conventions, Mapping Documents, Technical Documents, and Migration form.

Academic Background:

Master of Computer Applications from Osmania University in 2010

WORK STATUS

H-1B visa


