
Data Manager

Location:
Cumming, GA
Posted:
January 20, 2021


Sushma Kondapalli

adjkru@r.postjobfree.com

937-***-****

Professional Summary:

·Over 10 years of experience in ETL and business intelligence, with strong proficiency in requirements gathering and data modeling, including the design and support of applications in OLTP, data warehousing, and OLAP.

·Extensive involvement in all phases of the SDLC, including functional requirements gathering, data-model/integration architecture, development, and testing.

·Redesigned existing ETL/DW processes for better performance and proper dependency management.

·Strong experience with project methodologies including Agile Scrum and Waterfall.

·Understood project scope; identified activities, tasks, task-level estimates, schedules, dependencies, and risks, and provided inputs to the Module Lead for review.

·Expert in the architecture and design of ETL solutions using Informatica.

·Experience in Architecting and building Data Warehouse systems and Business Intelligence systems including ETL using Pentaho BI Suite (Pentaho Data Integration Designer / Kettle, Pentaho Report Designer, Pentaho Schema Workbench, Pentaho Design Studio, Pentaho Enterprise Console, Pentaho BI Server, Pentaho Meta Data, Pentaho Analysis View, Pentaho Analyzer & Mondrian OLAP).

·Good experience designing and developing slowly changing dimensions to maintain transactional and historical data.

·Extensive database experience and highly skilled in SQL across Oracle, MS SQL Server, and MS Access, as well as flat-file sources.

·Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.

·Experience in working with Business Intelligence and Enterprise Data Warehouse (EDW) platforms including Pentaho, OBIEE, Amazon Redshift, and Azure Data Warehouse.

·Proficient in implementing business rules, profiles, scorecards, mappings, and mapplets using the Analyst and Developer tools for data cleansing, standardization, enhancement, and reporting.

·Experience in Designing Databases (Oracle, SQL Server), SQL, Database Programming (PL/SQL) and Performance Tuning

·Expertise in managing Repository Metadata with experience in standardizing practices on metadata databases for good Repository performance with OBIEE

·Experience building data marts, data structures, data storages, data warehouses, data archives, and data analysis

·Experience in sourcing data from DB2, SQL Server, and Oracle databases, as well as Salesforce.

·Documented business rules, functional and technical designs, test cases, and user guides.

·Demonstrated excellence in communication, data gathering (including medical claims), analysis, reporting, and process improvement.

·Excellent technical and communication skills and ability to work effectively and efficiently in teams

·Managed the offshore project development team, documented and communicated analytic requirements, and performed quality-assurance checks of offshore deliverables.

·Have worked within the framework of common Project Management structures and practices including project planning, scope control and customer relationship management.

·Worked in software development methodologies like Waterfall and Agile.

·Actively engaged in building robust ETL solutions using Informatica Cloud (IICS), following best practices.

·Worked on customer segmentation using unsupervised learning (clustering); performed exploratory data analysis and data visualization using Python.

·Created file-watcher jobs to set up dependencies between Cloud and PowerCenter jobs.

·Strong analytical and problem-solving skills; a quick learner of new technologies and concepts.

·Knowledge of Hadoop, HBase, Hive, Pig, Impala, Spark, and Kafka.

·Knowledge on monitoring, metrics and logging systems on AWS.

·Exposure to PeopleSoft’s GL, AP, and AR modules, as well as the EPM product line, which includes the Financial Services Industry (FSI).

·Ability to learn new tools, concepts and environments.

·Ability to explain technical concepts and communicate with users and system administrators at all levels.

Technical Skills:

·Performance Tuning : SQL query tuning and database partitioning

·Reporting Tools : Oracle BI EE 11g/10g, BI Apps 7.9

·ETL : Informatica 10.1/9.5.1/9.1/8.6, DAC 11g, Pentaho 8.1

·Operating Systems : Win NT/2K/XP, Windows 2008/2003 Server, UNIX/LINUX, MAC

·Languages : SQL, PL/SQL, Python, C++, HTML, Visual C#

·RDBMS : Oracle 11g/10g, Oracle EBS, SQL Server 2005/2008

·Office Applications : MS Office Suite, MS Visio

Professional Experience:

Randstad, Atlanta, GA Jul 2019-Present

Data Engineer

Responsibilities:

·Worked with the Analytics team and data architects to identify business requirements; developed design documents for the ETL flow, analyzed the database architecture, and created various complex Pentaho transformations and jobs using PDI Spoon.

·Involved in building a data warehouse whose primary data comes from external vendors and corporate sources, including source systems such as Oracle 10g, flat files, Microsoft Excel files, and XML files.

·Designed the data warehouse, including star-schema design, DW capacity planning, and MySQL performance tuning. Implemented Orders using a star schema and the Orders business domain using Pentaho Metadata.

·Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, Load) processes using Pentaho PDI.

·Created a staging-based data warehouse load implemented entirely in Pentaho Kettle.

·Participated in the design of staging databases and data warehouses/data marts using star and snowflake schemas in data modeling.

·Developed complex database objects like Stored Procedures, Functions, Packages using SQL and PL/SQL.

·Effectively made use of Table Functions, Indexes, Table Partitioning, Materialized Views and table spaces.

·Extensively used bulk collection in PL/SQL objects for improving the performance.

·Published cubes and reports onto Pentaho repository and refreshed Pentaho BI repository after uploading each object to be available for central use.

·Loaded data to BigQuery (Avro-format data) through ETL.

·Monitor dataflows in Google Cloud Platform using BigQuery.

·Used MDX queries to generate custom ad hoc reports, adding calculated measures to the reports.

·Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming and loading data into data warehouse.

·Configured versions of SQL queries using GitHub.

·Responsible for maintaining all Development, QA, Staging and Production PostgreSQL databases within the organization.

·Used Python to develop a variety of models for analytic purpose.

·Managed server security, created new logins and users, and changed user roles.

·Involved in creating logical and physical models of the database using Erwin and Visio.

·Developed stored procedures, triggers, and views, and added/changed tables for data load, transformation, and extraction.

·Involved in optimizing code and improving efficiency in databases by re-indexing.

·Worked on different data formats such as JSON, XML and performed machine learning algorithms in Python.

·Worked on Infocenter Tickets and Ivanti Tickets.

·Built a POC on data extraction, aggregation, and consolidation of Adobe data within AWS Glue.

·Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.

·Created pipelines to move data from on-premise servers to Azure Data Lake.

·Involved in training on Looker, Hadoop, HDFS, MapReduce, and Spark.

·Assisted in migrating from On-Premises Hadoop Services to cloud based Data Analytics using AWS.

·Defined and deployed monitoring, metrics, and logging systems on AWS.

·Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems.

·Performed data analytics on the data lake platform using Python.

·Extracted raw data from flat files into staging tables using Informatica Cloud (IICS).

·Experience in building pipelines using Azure Data Factory and moving the data into Azure Data Lake Store.

·Provided 24/7 production support for the production database and for code deployed into the production environment.

Environment: Pentaho BI Server, Pentaho Data Integration (PDI/Kettle), Oracle 10g, PL/SQL, Toad, SQL Loader, Oracle SQL Developer, OBIEE, Oracle Enterprise Manager, Informatica PowerCenter 10.1, Informatica Cloud (IICS), 12c

Deloitte, Atlanta, GA July 2016- June 2019

Data Engineer

Responsibilities:

·Developed reports for Integrated Eligibility System project for the state of Georgia.

·Interacted with business representatives for gathering the Reports/ Dashboards requirements and to define business and functional specifications.

·Coordinated with different teams during development/design.

·Created and reviewed monthly/quarterly claims information and summarized it for client reports.

·Reviewed and analyzed carrier summaries to confirm renewals were in line with claims history.

·Assisted with various analyses relating to Medicaid engagements, including analysis of medical claims data.

·Strong Knowledge of all phases of Software Development Life Cycle (SDLC) such as Requirement Analysis, Design, Development, Testing, UAT, Implementation and Post Production support.

·Created batch architecture documents, design documents, and data maps for ETL rules that scale to high-volume application data.

·Developed Informatica PowerCenter batch architecture to extract, transform, and load data from sources such as Oracle, SQL Server, and flat files.

·Designed ETL Informatica transformations, including Source Qualifier, connected/unconnected Lookup, Filter, Expression, Router, Joiner, Rank, Aggregator, Sorter, Sequence Generator, Normalizer, SQL transformation, and Stored Procedure, and created complex mappings.

·Manage ETL specifications and supporting documentation

·Coordinated the work of others to develop, test, install, and modify programs and to establish physical database parameters.

·Developed code using Informatica best practices with job performance in mind.

·Maintained and enhanced ETL code for loading transactional data into the system

·Converted PL/SQL procedures to Informatica mappings, while creating database-level procedures for optimum mapping performance.

·Debugged and resolved issues in mapping and workflow development; monitored and administered jobs; handled deployment, repository, and globalization concepts; and interfaced the product with databases such as Oracle and SQL Server, web services, and other applications.

·Maintained warehouse metadata, naming standards and warehouse standards for future application development

·Developed Informatica mappings to load data into flat files, stored as reports in the portal.

·Developed highly complex RTF, Excel templates to be used in BI publisher reports and modified them according to the desired layout.

·Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data warehouse tables.

·Participated in all Business Intelligence activities related to data warehouse ETL and report development methodology.

·Created Materialized Views for extracting the data from remote database.

·Knowledgeable about migrating OBIEE objects (Catalog and RPD) between dev/test/prod environments.

·Participated in session meetings to understand data points and identify source systems and target system paths.

·Developed RPD using Fact, Dimensions, and Physical, Alias and Logical tables.

·Migrated RPD and maintained upgraded versions.

·Involved in QA and unit testing.

·Worked on production sanity checks and resolved user issues through Remedy tickets.

·Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion

·Worked with Batch Team to schedule Monthly, Quarterly, Weekly, Daily Reports

·Worked on federal reports, reconciling them monthly and weekly.

·Responsible for Report and query performance tuning.

·Knowledge of Informatica MDM.

·Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.

·Responsible for the Development and QA PostgreSQL databases within the organization.

·Used a JSON schema to define table and column mappings from S3 data to Redshift.

·Worked with the Autosys scheduling tool, depending on the client’s preference, to schedule tasks.

·Used Oracle SQL queries that create/alter/delete tables and to extract the necessary data.

·Extracted raw data from flat files into staging tables using Informatica Cloud (IICS).

·Migrated ETL workflows, mappings, and sessions to different environments.

Environment: OBIEE 11g, Oracle 11g, SQL Developer, Toad 12, Informatica 9.5.1, Informatica Cloud (IICS), Informatica 10.1

Georgia Department of Human Services, Atlanta, GA July 2013-June 2016

Data Warehouse Consultant

Responsibilities:

·Developed reports for the Grant Management, Department of Family and Children Services, and Aging Services projects for the state of Georgia.

·Designed star schemas, fact and dimension tables, and conceptual, logical, and physical data models using the Embarcadero tool.

·Interacted with business representatives for gathering the Reports / Dashboards requirements and to define business and functional specifications.

·Prepared Detail Design Document of the complete implementation of Business modules.

·Designed OLAP analysis, architecture, development, implementation, reporting, administration, and migration of various applications in a client/server environment using OBIEE on Windows and UNIX.

·Created an MUD (Multi-User Development) environment by utilizing projects in the OBIEE RPD.

·Designed Schema/Diagrams using Fact, Dimensions, and Physical, Alias and Logical tables in RPD.

·Used Catalog Manager to migrate the web catalog between different instances.

·Worked on various RTF and Excel templates in BI Publisher.

·Performed unit testing of reports in Development, UAT and Production

·Involved in Performance tuning of queries.

·Manipulated Stored Procedures, Triggers, Views, Functions and Packages using TOAD

·Assisted in upgrades of the database and applications by providing the necessary patch-set information.

·Created an Alert using Agent in OBIEE and Scheduled the Report to deliver it to clients directly through emails without logging in to the URL.

·Attended JAD and White Boarding Sessions with Users and Business Analysts to keep business synchronized with the progress.

·Developed a number of Informatica Mappings and Transformations to load data from relational and flat file sources into the data warehouse.

·Extensively used Informatica Client tools – Source Analyzer, Warehouse Designer, Mapping Designer, Informatica Repository Manager and Informatica Workflow Manager

·Developed and modified triggers, packages, functions and stored procedures for data conversions and PL/SQL procedures to create database objects dynamically based on user inputs

·Loaded the data in incremental fashion by using batches & sessions on Informatica Server Manager.

·Experience with setting up and configuring Informatica and CDC on client/server platforms.

·Designing mapping templates to specify high-level approach.

·Worked with Workflow Manager to check sessions and validate workflows.

·Developed the Common Processes (Mapplets) for the data validation and Exception Handling.

·Worked on scheduling mappings through DAC (Data warehouse Administrator Console).

·Designed, set up, and executed execution plans for customized and pre-built mappings through DAC.

·Trained users and created ad hoc reports.

·Created POC (proof-of-concept) dashboards for the leadership group with the team and business users.

·Worked on fixing issues on Pre-built mappings for financial and project Analytics.

·Worked closely with internal audit teams to translate financial and operational audit objectives into analytic requirements.

·Developed executive KPI reports and Dashboard using OBIEE.

·Involved in continuous enhancements and fixing of Production Problems. Generated server side PL/SQL scripts for data manipulation and validation.

·Wrapped the sessions with pre-session and post-session UNIX shell scripts. Created and scheduled batches for the daily load of the data.

·Knowledge in PeopleSoft Financials (Accounts Receivable, Billing, General Ledger, Accounts Payable), Grant Analytics and HR Analytics.

Environment: OBIEE 11g, Informatica 9.1.4, CDC Power Exchange 9.1.5, Oracle 11g, SQL Developer, Embarcadero, Toad 12, DAC 11g, Informatica 9.5

Emory University, Atlanta, GA March 2013 – July 2013

Developer

Responsibilities:

·Developed reports for Labor and Budget Transaction project.

·Worked closely with Business Analysts in understanding the Analytics requirements and created the business model for generating reports as per the user requirements.

·Conducted functional-requirements reviews and walkthroughs with the BI Architect and Designers.

·Analyzed, designed, and developed Informatica mappings to load data from relational and flat-file sources into the data warehouse.

·Exposure to PeopleSoft’s GL, AP, and AR modules, as well as the EPM product line, which includes the Financial Services Industry (FSI).

·Designed and developed custom Dashboards on Labor Transaction and Budget Transaction report.

·Created Wireframes and Web-Mockups to give look and feel of UI to end users.

·Prepared Detail Design Document of the complete implementation of Business modules

·Trained users and created ad hoc reports.

·Prepared training documents for end users to build ad hoc reports and analyses.

·Gained experience with the ODI tool by creating solutions, load scenarios, procedures, and executions.

·Ran the data load daily through ODI.

·Involved in Code Migration to Production and restart of BI Server.

·SQL and PL/SQL programming for ODI and Oracle.

·Identified the required Facts, Dimensions, Levels / Hierarchies of Dimension.

·Optimized the SQL queries for improved performance.

·Developed reports and analysis on Asset Management Transaction, Work Orders and Meters.

·Prepared technical document for the customizations done.

·Involved in Unit Testing of Reports in UAT and Production.

Environment: OBIEE 11.1.1.6, PeopleSoft EPM 9.1, Data Stage 8.5, Informatica 10.1, SQL Developer

Toyota, Erlanger, KY August 2012 – February 2013

Datawarehouse Consultant

Responsibilities:

·Developed reports for the Supply Chain, Spend Analysis, Procurement, and Vendor Contacts projects.

·Used simple-to-complex analytical SQL queries against the source databases to identify and verify the authenticity of the data before importing it to build the OBI repository.

·Involved in laying out a strategy with senior management and technical team for building the metadata that presents the business model to support the business requirements

·Created POCs (proofs of concept) for the repositories and business functionalities with the architect team and business users.

·Designed and implemented database indexes and partitions to improve application and dashboard performance.

·Developed Informatica Mappings, Transformations to load data from relational and flat file sources into the data warehouse.

·Developed various mappings using Mapping Designer, and worked with Workflow Manager to check sessions and validate workflows.

·Created dashboard pages, including a new interactive dashboard page built from Answers in the presentation catalog.

·Performed unit testing and created test plans for reports built in Oracle Answers.

·Identified causes of performance issues and helped resolve them.

·Developed ad hoc reports in BI Publisher and implemented report parameters and lists of values (SQL query based).

·Developed highly complex RTF templates to be used in BI publisher reports and modified them according to the desired layout.

·Wrote many XML scripts for runtime calculations in RTF templates.

·Prepared Detail Design Document of the complete implementation of Business modules.

·Extensively used ETL to load data from multiple sources to Staging area (Oracle) using Informatica Power Center

·Loaded the data in incremental fashion by using batches & sessions on Informatica Server Manager.

·Resolved open issues with the production system

Environment: OBIEE 11g, Informatica 9.0.1, Oracle 11g, Oracle EBS R12, SQL Developer, SQL Server 2008

Syscom Technologies, Chantilly, VA February 2012 – July 2012

Developer

Responsibilities:

·Gathered user requirements and created the business requirements documents and tested the generated reports.

·Experience in working with different data sources like Oracle, SQL Server, XML, Flat files

·Experience in creating Metadata Objects (Subject Area, Table, Column) and Web Catalog Objects (Dashboards, Pages, and Reports) using OBIEE.

·Imported sources and targets, created mappings, and developed transformations using PowerCenter Designer.

·Experienced in designing star schemas and in logical dimensional data-modeling methodologies.

·Experienced in Data warehouse Administration Console (DAC) to Configure, Monitor and Schedule ETL routines of Full and Incremental ETL loads of the Oracle Business Analytics Warehouse.

·Managed security privileges for each subject area and dashboards according to the requirements.

·Developed simple & complex mappings using Informatica to load Dimension & Fact tables as per STAR schema techniques.

·Developed user friendly Dashboards by appropriately including guided navigational links, prompts, and drilldowns.

·Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, and Sorter and Sequence Generator transformations.

·Knowledge of QlikView sheet objects, scripting, and application optimization.

Environment: OBIEE 11g/10g, Informatica 8.6.1, Oracle 11g/10g, PeopleSoft Financials, Oracle EBS R12, BI Apps 7.9.5, DAC 10.1.3.4.1, SQL Developer

Hilmot Corporation, Waukesha, WI July 2011 – January 2012

Design Engineer

Responsibilities:

·Designed, simulated, and built energy-efficient conveyor systems.

·Assisted in the design of user-interface and business-application prototypes.

·Experience in all phases of the System life Cycle including project definition, analysis, design, coding, testing, implementation and Production support.

·Worked with QA team to design test plan for user Acceptance Testing (UAT)

·Involved in the regression testing and solved problems to meet the requirement.

·Gathered all requirements for the design of PLC ladder programming using a 3G card (PLC) to access different hardware components.

·Designed and implemented controls for automated assembly hardware in C# using Visual Studio.

·Participated in session meetings to understand data points and identify source systems and target system paths.

·Actively involved in deployment and post release support.

·Experience in working with different data sources like Oracle, SQL Server, XML, Flat files.

·Involved in QA and unit testing.

Environment: Microsoft Visio, Microsoft Word, Microsoft Excel, Win XP/2008, Visual Studio C#, SQL Server, Oracle

University of Dayton, Dayton, OH May 2009 – June 2011

Business Research Group Specialist

Responsibilities:

·Experience as a Research and Marketing Analyst for NCR products.

·Gathered information on the survey and generated questionnaire through communication and interaction with clients.

·Regularly coordinated with support team for resolution of the issues.

·Developed understanding of customer markets, segments and suggested research solutions following strategic plans of action.

·Performed market research and data analysis duties and supported in strategic planning of the company by providing research information.

·Delivered insights regarding the factors that impact business success.

·Defined and documented customer business functions.

·Developed weekly forecasting report.

·Performed customer segment research

·Devised focus-group discussions and conducted surveys.

·Collaborated with marketing departments for business improvement.

·Presented ‘Feedback loop and decision making reports’ to BRG director & group.

Environment: Microsoft Visio, Microsoft Word, Microsoft Excel, Win XP/2008.

Education:

·Bachelor of Technology in Electronics and Communication Engineering, India - 2008

·Master’s in Electrical and Computer Engineering, Dayton, OH - 2011
