
Data Developer

Location: Leesburg, VA
Posted: March 03, 2021

Resume:

Pragnya Guntakandla

913-***-****

Email: adkmtr@r.postjobfree.com

Data Warehouse / ETL Specialist

Summary: 7+ years of strong experience in analysis, design, development, implementation, testing, and support of data warehousing and data integration solutions using IBM DataStage 11.5/9.1/8.5/8.1/7.5x2/7.x, Informatica 10.1/9.x/8.x, and BODS 4.2 (Business Objects Data Services).

Extensively used DataStage client components (Designer, Director, and Administrator) in data warehouse development. Worked on data migration and data conversion in diverse environments including DB2, Oracle, SQL Server, and complex flat files.

Extensive experience with Informatica PowerCenter 10.1, 9.x, 8.x, and 7.x hosted on UNIX, Linux, and Windows platforms, and Informatica Cloud hosted on UNIX.

Expertise in OLTP/OLAP database design and development of database schemas such as star schema and snowflake schema, used in relational, dimensional, and multidimensional modeling. Implemented SCDs (Slowly Changing Dimensions) and CDC (Change Data Capture).
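
For illustration, a Type-2 change is typically applied in two set-based steps: expire the current dimension row, then insert the new version. A minimal SQL sketch, with hypothetical table and column names (dim_customer, stg_customer):

    -- Step 1: close out current rows whose tracked attributes changed.
    UPDATE dim_customer d
    SET    d.eff_end_date = CURRENT_DATE,
           d.is_current   = 'N'
    WHERE  d.is_current = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert a new current version for changed and brand-new keys.
    INSERT INTO dim_customer
           (customer_id, address, segment, eff_start_date, eff_end_date, is_current)
    SELECT s.customer_id, s.address, s.segment,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.is_current = 'Y');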

Good understanding of the Ralph Kimball and Bill Inmon methodologies.

Involved in full life cycle development (Waterfall and Agile) of data warehouses on UNIX platforms for the retail, financial, and insurance industries.

Proficient in designing and developing complex jobs and standard containers using various transformation logic with the Transformer, Filter, Join, Lookup, Merge, Sort, Remove Duplicates, and Aggregator stages.

Expertise in creating reusable components such as shared and local containers, as well as custom transformations, functions, and routines.

Maintained strong relationships with business analysts and business users to identify information per business requirements.

Worked on various transformations such as Expression, Joiner, Lookup, Sorter, Filter, and Aggregator, as well as mapplets and reusable transformations, using Informatica PowerCenter Designer.

Hands-on experience with Informatica PowerExchange and Informatica IDQ.

Expertise in creating databases, users, tables, triggers, views, functions, packages, joins, and hash indexes in Oracle and Teradata.

Worked extensively with Informatica Cloud to pull data from cloud applications.

Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities; flexible in work schedule; good communication skills.

Extensively worked with different connectors (AWS S3, Workday) to retrieve data using ICS (Informatica Cloud Services).

Good knowledge of CI/CD deployment processes.

Created buckets, folders, and users, and assigned policies to users in AWS S3.

Team player; motivated; able to grasp things quickly, with analytical and problem-solving skills.

Strong technical, oral, and written communication skills.

TECHNICAL SKILLS:

ETL Tools: DataStage 7.x/8.x/9.x/11.x, Informatica 10.1/9.x/8.x/7.x, SSIS, Azure Data Factory, BODS 4.2, Talend

Operating Systems: Windows 98/2000/2003/XP/NT/7, UNIX, MS-DOS

Connectors: AWS, Workday

Version Control Tools: SVN, TFS

RDBMS: SQL Server, Teradata, DB2, Oracle 11g/10g/9i/8i/8.0, Sybase

Data Modeling: Erwin 4.1.4/3.5.2

Database Tools: SQL Developer, Toad 4.0.3, MS Visio, MS Access

Reporting Tools: Tableau, OBIEE 10.x/9.x

Scripting: UNIX Shell Scripting

Web Technologies: HTML, JavaScript

Languages: SQL, PL/SQL, C, C++, XML, Python

Education:

Bachelor of Technology in Electronics & Communications from JNTU, Hyderabad, India.

Master’s in Computer Science from Sacred Heart University, Fairfield, CT.

Certificate of Appreciation:

TRIRIGA China Property Management Implementation

TRIRIGA China Facilities Management Implementation

Certificates:

Certified SQL Developer (W3Schools)

PROFESSIONAL EXPERIENCE:

CITI (Creative Information Technology, Inc.)

Healthcare Sep 2019 – Present

Falls Church, VA

ETL Developer

Responsibilities:

Working on healthcare projects with multiple customers and multiple systems.

DRIS (Data Retention and Interoperability Solution) archives patient data from legacy systems.

Involved in writing SQL scripts to extract data from legacy systems and load it into the DRIS archive application.
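
As a minimal sketch of such an extract (the schema, table, and column names here, e.g. legacy.encounters and stg.dris_encounters, are hypothetical placeholders, not the actual project objects):

    -- Pull one component (encounters) from the legacy schema into DRIS staging,
    -- restricted to an incremental window via a bind parameter.
    INSERT INTO stg.dris_encounters
           (patient_id, encounter_id, admit_dt, discharge_dt, facility_cd)
    SELECT e.patient_id, e.encounter_id, e.admit_dt, e.discharge_dt, e.facility_cd
    FROM   legacy.encounters e
    WHERE  e.admit_dt >= :extract_from_dt;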

Worked with business users and business analysts on requirements gathering, business analysis, testing, and project coordination.

Migrated source data from different databases and CSV files to SQL Server.

Analyzed customer data in the healthcare application to identify components such as patient demographics, encounters, medications, procedures, flowsheets, immunizations, vitals, results, guarantor, and insurance details.

Created extraction scripts for these components and loaded the data into staging.

Customers validate the data in the DRIS testing site, and BAs perform validations in the staging site.

Worked with clinical (e.g., demographics, encounters, insurance), accounting (accounts, insurance, claims), and provider data.

Created SQL Server packages to extract data from different source systems and load it into different target systems.

Created SSIS packages to migrate data from flat files to SQL Server and from Oracle to SQL Server.
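
The flat-file load an SSIS package performs is equivalent to a T-SQL bulk load; a minimal sketch, assuming a placeholder file path and a hypothetical staging table:

    -- Bulk-load a delimited flat file into a SQL Server staging table.
    BULK INSERT dbo.stg_members
    FROM 'D:\feeds\members_20210301.csv'
    WITH (FIRSTROW = 2,           -- skip the header row
          FIELDTERMINATOR = ',',
          ROWTERMINATOR = '\n',
          TABLOCK);               -- table lock for faster, minimally logged loads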

Extensively used stage variables for source validation and reject capture, and used job parameters to automate jobs.

Loads are performed in the staging and test environments; once sign-off is received from customers, large-scale loads into production are run on the scheduled date.

Involved in production deployments; created or reorganized indexes for faster performance.
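
A minimal T-SQL sketch of that kind of index work (object names are hypothetical):

    -- Covering index for a common lookup pattern.
    CREATE NONCLUSTERED INDEX ix_encounters_patient_dt
        ON dbo.dris_encounters (patient_id, admit_dt)
        INCLUDE (facility_cd);

    -- Rebuild after large loads to remove fragmentation.
    ALTER INDEX ix_encounters_patient_dt ON dbo.dris_encounters REBUILD;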

Worked closely with the DevOps team on weekly deployments.

By comparing against the customer’s application, we analyze the data, extract it from the legacy systems, and load it into DRIS.

Migrated SQL Server to the Microsoft Azure cloud.

Migrated the data using Azure Database Migration Service (DMS).

Environment: SQL Server, Oracle 11g, SSIS, Azure Data Factory, DRIS, TFS, healthcare applications (Wellsoft, Horizon Clinical, OTTR, and Cerner), FileZilla, Azure.

Stanford University, Palo Alto, CA. July 2018 – Aug 2019

Informatica Developer

Project Description: ADAPT (Alumni and Development Applications Platform Transition)

ADAPT is a multi-year project that will migrate Stanford’s 20-year-old postgrads database and all its associated systems and processes to the Salesforce platform. As part of this, the Informatica tool is used for data integration.

Responsibilities:

Designed technical and functional specs based on requirements provided by the business and finalized the data model.

Involved in writing the mapping specification document (STTM).

Created ETL documents (ETL Standards, Migration Manual, and Unit Testing) for the development.

Created complex mappings using Connected and Unconnected Lookup, Aggregator, Update Strategy, Stored Procedure, and Router transformations to populate target tables efficiently.

Used Parameter files to specify DB Connection parameters for sources.

Designed and developed complex Informatica mappings, including SCD Type-I and Type-II.

Involved in writing date-based range partitioning on fact tables.
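
A minimal Oracle sketch of date-based range partitioning (the fact table and its columns are hypothetical):

    CREATE TABLE fact_gift (
        gift_id   NUMBER,
        donor_id  NUMBER,
        gift_amt  NUMBER(12,2),
        gift_dt   DATE
    )
    PARTITION BY RANGE (gift_dt) (
        PARTITION p2017 VALUES LESS THAN (DATE '2018-01-01'),
        PARTITION p2018 VALUES LESS THAN (DATE '2019-01-01'),
        PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );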

Involved in writing stored procedures to split strings and to truncate partitioned tables.
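
For the partition-truncate side, a hedged PL/SQL sketch (the procedure and parameter names are illustrative; string splitting would typically use REGEXP_SUBSTR in a similar helper):

    CREATE OR REPLACE PROCEDURE truncate_partition (
        p_table     IN VARCHAR2,
        p_partition IN VARCHAR2
    ) AS
    BEGIN
        -- DDL must be issued dynamically from PL/SQL.
        EXECUTE IMMEDIATE 'ALTER TABLE ' || p_table ||
                          ' TRUNCATE PARTITION ' || p_partition;
    END truncate_partition;
    /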

Extensively worked on performance tuning of mappings and ETL procedures at both the mapping and session level. Developed, tested, and debugged mappings in Informatica.

Used the BigID tool to maintain PII and SSA information as part of data privacy/data security.

Developed workflow tasks such as reusable Email, Event Wait, Timer, Command, and Decision tasks.

Provided support and batch monitoring for the weekly refresh.

Environment: Informatica PowerCenter 10, UNIX, BigID, SQL Developer, Sybase, Salesforce, OBIEE, Oracle 11g

Wal-Mart, Inc., Bentonville, Arkansas

Sr. DataStage Developer Dec 2015 – June 2018

Project: TRIRIGA

Global rollout of the TRIRIGA system to manage facility management, construction management, and property management. TRIRIGA is integrated with other legacy and SAP systems, including SRM, ECC, and MDG. As part of this, the DataStage tool is used for data integration.

Responsibilities:

Worked closely with business analysts on requirements gathering, reviewing business rules, analyzing systems, and identifying data sources.

Worked with the project and business teams to understand the business processes involved in the solution.

Created technical design documents by analyzing functional specification documents that are required for ETL development.

Extracted the data from DB2 database into flat files, performed required transformations and loaded them into DB2 Staging Tables.

Developed parallel jobs using various stages such as Sequential File, Data Set, Lookup, Filter, Funnel, Copy, Surrogate Key Generator, Column Generator, Sort, Transformer, and Change Data Capture.

Extensively used stage variables for source validation and reject capture, and job parameters for job automation.

Developed job sequencers with proper job dependencies, job control stages, and triggers.

Used DataStage Director and its run-time engine to monitor running jobs.

Involved in all steps of development and deployment processes.

As part of TRIRIGA, provided SAP ECC and JDE extracts to all teams using BODS Designer, scheduling the jobs weekly/bi-weekly via the Data Services Management Console.

Environment: IBM DataStage 11.5/9.1, BODS (Data Services Designer) 4.2, Netezza, Python, MySQL Server, FileZilla, UNIX shell scripts, flat files.

Marsh & McLennan Companies, Hoobken NJ

Informatica Developer Jan 2014 to Dec 2015

Project: Marsh & McLennan Companies, Inc. is a global professional services firm, headquartered in New York City, with businesses in insurance brokerage, risk management, reinsurance services, talent management, investment advisory, and management consulting. This project involves building a new HR EDW from the legacy PeopleSoft system on Oracle 11g using Informatica PowerCenter, and pulling PeopleSoft production data to the stage layer using Informatica Cloud Services. The new EDW data is used to build reports such as headcount, hires, and terms.

Responsibilities:

Designed technical and functional specs based on requirements provided by the business and finalized the data model.

Involved in writing the mapping specification document (STTM).

Created ETL documents (ETL Standards, Migration Manual, and Unit Testing) for the development.

Created complex mappings using Connected and Unconnected Lookup, Aggregator, Update Strategy, Stored Procedure, and Router transformations to populate target tables efficiently.

Used Parameter files to specify DB Connection parameters for sources.

Worked on the installation and setup of ETL (Informatica Cloud) applications on Linux servers.

Designed and developed complex Informatica mappings, including SCD Type-I and Type-II.

Involved in writing date-based range partitioning on fact tables.

Involved in writing stored procedures to split strings and to truncate partitioned tables.

Extensively worked on performance tuning of mappings and ETL procedures at both the mapping and session level. Developed, tested, and debugged mappings in Informatica.

Developed workflow tasks such as reusable Email, Event Wait, Timer, Command, and Decision tasks.

Extensively worked in Informatica Cloud using DRTs (Data Replication Tasks) and DSTs (Data Synchronization Tasks).

Provided support and batch monitoring for the weekly refresh.

Environment: Informatica PowerCenter 9.1, Informatica Cloud, UNIX, TOAD, SQL Developer, PeopleSoft, Taleo, Citrix, OBIEE, Oracle 11g

HSBC Bank, India June 2012 – Nov 2013

DataStage Developer

Project: Digital Insight

DI deals with profile information and events generated by users and customers of personal internet banking as well as corporate banking. It enables the bank to deploy new features that improve services, and provides the bank with information about the applications customers access and about customer satisfaction. The main aim is to capture this data in the data warehouse and make it available in reports for business users.

Responsibilities:

Worked closely with business analysts and business users to understand requirements and build technical specifications.

Involved in all business meetings to understand the existing logic, member information, and agency data, and to arrive at the best IT solution.

Responsible for creating source-to-target (STT) mappings.

Involved in day-to-day production support activities.

Worked on defects raised by the concerned business teams across various entities.

Developed and supported the extraction, transformation, and load (ETL) process for a data warehouse, extracting from various data sources and loading the target tables using DataStage Designer.

Worked on various stages such as Transformer, Join, Lookup, Sort, Filter, Change Capture, Change Apply, and QualityStage.

Developed parallel jobs using various development/debug stages (Peek, Row Generator, Column Generator, Sample) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort, Merge, Funnel, Remove Duplicates).

Designed job sequencer to run multiple jobs with dependency and email notifications.

Involved in unit testing, SIT, and UAT; worked with users on data validation.

Extensively worked on improving job performance by minimizing unnecessary transforms.

Prepared documentation for unit, integration, and end-to-end testing.

Optimized and tuned DataStage jobs for better performance and efficiency.

Responded to customer needs; self-starter and customer-service oriented.

Worked within the team to make appropriate, effective decisions related to project responsibilities, and to initiate and follow up on assigned tasks without supervision.

Provided support and guidance by creating release, deployment, and operations guide documents.

Involved in Performance tuning of complex queries.

Developed SQL scripts to support the functionality of various modules.
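
One common tuning pattern from such work, shown as a hedged sketch with hypothetical tables: replacing a per-row correlated subquery with a single set-based aggregation.

    -- Before: the subquery runs once per customer row.
    SELECT c.customer_id,
           (SELECT MAX(t.txn_dt) FROM txn t
            WHERE  t.customer_id = c.customer_id) AS last_txn_dt
    FROM   customer c;

    -- After: aggregate once, then join back to the driving table.
    SELECT c.customer_id, t.last_txn_dt
    FROM   customer c
    LEFT JOIN (SELECT customer_id, MAX(txn_dt) AS last_txn_dt
               FROM   txn
               GROUP BY customer_id) t
           ON t.customer_id = c.customer_id;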

Created Control-M jobs for scheduling.

Environment: IBM InfoSphere DataStage 8.x (Designer, Director), Oracle 10g, Teradata 13, sequential files, UNIX shell scripting


