
ETL, BI, Data warehousing

Location:
Naperville, IL
Posted:
April 12, 2018


Resume:

Swapna Chinthalacheruvu

*********@*****.***

309-***-****

Location: IL

SUMMARY:

Highly motivated and innovative software engineer with 12 years of experience in information technology. Expert in data warehousing, ETL, database administration, Big Data, and business intelligence technologies.

OBJECTIVE:

Seeking a challenging architect or senior developer position in ETL, data warehousing, business intelligence, and Big Data.

TECHNICAL SKILLS:

ETL, Data Warehousing: Informatica PowerCenter 10.1.1/10.1/9.6 (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager), Microsoft SQL Server Integration Services (SSIS), Amazon Redshift/Hadoop HDFS/Salesforce (SFDC) connectors, Apex Data Loader.

Business Intelligence: Cognos 10.x/8.x (Framework Manager, Query Studio, Report Studio, Analysis Studio, Metric Studio, Metric Designer, Event Studio, Transformer 8.4, Cognos Connection), Microsoft SQL Server Reporting Services (SSRS), Microsoft SQL Server Analysis Services (SSAS), QlikView, SAP BusinessObjects 4.x, Universes.

Big Data: Hadoop (Hortonworks), MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Flume, Splunk.

Databases: DB2, Oracle 11g, SQL Server, MySQL, PostgreSQL, Teradata.

Database Tools: TOAD, Query Analyzer, Data Studio, WinSQL, pgAdmin, T-SQL.

Data Modeling: Erwin, logical and physical data modeling, star schema and snowflake schema modeling.

Programming: Linux shell scripting, PL/SQL; very good knowledge of Perl scripting.

Tools: BMC Control-M scheduler, IBM Tivoli scheduler, HP Service Manager, C#, UltraEdit, PuTTY, WinSCP, FileZilla.

EDUCATION:

Bachelor of Science in Engineering (four-year degree), Osmania University, Hyderabad, India.

PROFESSIONAL EXPERIENCE:

Adtalem Global Education Inc. (formerly DeVry Education Group Inc.) – Sriven Systems 10/2017 to Present

Naperville, IL

ETL & BI Lead Developer

Working on the Technical Independence project to integrate Chamberlain University data from new source systems into a new data warehouse for Chamberlain.

Gathered requirements in meetings with users, data analysts, and business analysts to understand the new source and target systems, documented them, and then split the work among the developers.

Queried repository metadata tables in the repository database to identify the existing components that populate Chamberlain data, and moved them into corresponding folders to make the necessary changes.

Documented the data warehouse and reporting objects and prepared an ETL mapping specification document covering the requirements and the business logic for the transformations.

Extracted student information from the Banner source system, as well as HR and financial data from Salesforce, which generates files through a set of data loader jobs for the ETL to pull, and landed the data in staging tables.

Worked with Informatica Cloud to integrate Salesforce and load data from Salesforce into an Oracle database.

Extensively worked on complex mappings, used several transformations (Stored Procedure, Sequence Generator, Update Strategy, Lookup, Aggregator, etc.), designed mappings for Type I, II, and III slowly changing dimension changes, and loaded the data into the Chamberlain Data Warehouse.
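
To illustrate the Type II pattern above, a minimal SQL sketch of the expire-and-insert approach, assuming a hypothetical DIM_STUDENT dimension and STG_STUDENT staging table (all table, column, and sequence names here are illustrative, not the actual warehouse schema):

-- Expire the current row for any student whose tracked attributes changed
-- (DIM_STUDENT, STG_STUDENT, and all columns are hypothetical names).
UPDATE dim_student d
   SET d.eff_end_dt = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_student s
                WHERE s.student_id = d.student_id
                  AND (s.program <> d.program OR s.campus <> d.campus));

-- Insert a new current version for changed students and first-time students.
INSERT INTO dim_student
       (student_key, student_id, program, campus, eff_start_dt, eff_end_dt, current_flag)
SELECT dim_student_seq.NEXTVAL, s.student_id, s.program, s.campus,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_student s
 WHERE NOT EXISTS (SELECT 1 FROM dim_student d
                    WHERE d.student_id = s.student_id AND d.current_flag = 'Y');

In PowerCenter the same logic is typically expressed with a Lookup on the dimension plus an Update Strategy transformation rather than hand-written SQL.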

Used mapplets and reusable transformations to avoid redundant transformation logic and improve modularity.

Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.

Created the database layer and business layer, defining appropriate relationships between query subjects in Cognos Framework Manager.

Involved in designing DMR models over relational data using Framework Manager and deployed packages to Cognos Connection.

Created custom prompted reports in Report Studio and burst reports to different user groups.

Developed list, crosstab, drill-through, master-detail, chart, and complex reports involving multiple prompts in Report Studio.

Worked with cascading prompts, conditional formatting, and conditional blocking.

Scheduled Reports using Cognos Connection.

Environment: Informatica 10.1.1 PowerCenter Client (Repository Manager, Designer, Workflow Manager, Workflow Monitor), SQL Server 2012, SSMS, T-SQL, Oracle 11g (SQL Developer), Salesforce (SFDC), Banner application, AppWorx scheduling tool, Unix scripting, ServiceNow, SharePoint, HDFS, Cognos 10.2 (Framework Manager, Cognos Connection, Query Studio, Report Studio, Analysis Studio, Access Manager), Cognos Transformer.

Centene Corp/Health Net Healthcare Insurance – Sriven Systems 05/2016 to 09/2017

Naperville, IL

ETL & BI Lead Developer

●Analyzed the existing data with business analysts for requirement gathering at the initial stage of the data warehouse project.

●Involved in various HIPAA claims validation and verification processes to understand the source data.

●Assigned and prioritized tasks for team members and assisted with analysis, design, and troubleshooting.

●Used Erwin with a forward engineering approach to design and develop normalized logical and physical dimensional models for the reporting system, used reverse engineering for the existing model, and developed star schema and snowflake schema designs.

●Extensively worked on Informatica IDQ to apply business rules for data validation and to standardize the data, using IDQ transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, Standardizer, and other significant transformations.

●Extracted data from Oracle, SQL Server, DB2, and flat-file sources using Informatica Designer to load into a data warehouse repository; designed complex mappings using transformations (Rank, SQL Transformation, XML Parser, XML Generator, Union, Joiner, Aggregator, Lookup, Router, Filter, Update Strategy, Normalizer, etc.); and created repositories and established users, groups, and their privileges.

●Created workflows using various tasks such as Session, Event-Raise, Event-Wait, Decision, Email, Command, Worklet, and Assignment.

●Created Slowly Changing Dimension (SCD) Type 2 mappings to load dimensional data while maintaining historical data.

●Optimized mappings by identifying and creating reusable components, using indexes, partitioning at the database and Informatica levels, optimizing queries, and reducing unnecessary database resource utilization.

●Experience in the healthcare domain (Medicare, Medicaid, and insurance) and in compliance with HIPAA regulations and requirements.

●Worked extensively with HIPAA 4010 and HIPAA 5010 transactions as source data, including 837, 834, 835, 270, 276, 277, and more.

●Involved in analyzing ICD-9 and ICD-10 codes for data mapping in both directions (ICD-9 to ICD-10 and ICD-10 to ICD-9) at the source and target levels.

●Worked extensively on performance tuning of sources, targets, mappings, transformations, and sessions, implementing techniques such as partitioning and pushdown optimization and identifying performance bottlenecks.

●Strong T-SQL development skills; wrote complex queries involving multiple tables and maintained stored procedures and triggers.

●Wrote UNIX shell scripts extensively for scheduling and pre-/post-session management.

●Scheduled ETL workflows and created dependencies between them using the BMC Control-M scheduling tool.

●Designed Hive tables to load data to and from external files.

●Imported data from MySQL databases into Hive using Sqoop.

●Developed, validated, and maintained HiveQL queries (a minimal sketch follows this list).

●Ran reports and analyzed data with Hive and Pig queries.

●Designed and implemented metadata models in Framework Manager and PowerPlay cubes in Transformer, and published the packages according to the business requirements.

●Designed and implemented new dashboards using Cognos Workspace.

●Developed list reports, crosstab reports, charts, drill-through reports, and master-detail reports using Report Studio.

●Created value prompts, date and time prompts, text box prompts, and cascading prompts; hid and showed objects using conditional blocks; and implemented report bursting to distribute reports effectively.

●Created various types of reports in Report Studio using dimensional (cube and DMR) and relational data sources.
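
As a minimal illustration of the Hive work in the bullets above, a sketch assuming a hypothetical claims feed and Hive 1.2+ (the table name claims_ext, its columns, and the HDFS path are illustrative only):

-- External table over delimited files landed in HDFS (names/path hypothetical);
-- the same table could equally be populated by the Sqoop import mentioned above.
CREATE EXTERNAL TABLE claims_ext (
    claim_id     STRING,
    member_id    STRING,
    claim_amount DECIMAL(12,2),
    service_dt   DATE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/staging/claims';

-- Monthly claim volume and spend: the kind of aggregate fed to reporting.
SELECT date_format(service_dt, 'yyyy-MM') AS claim_month,
       COUNT(*)          AS claim_cnt,
       SUM(claim_amount) AS total_amount
  FROM claims_ext
 GROUP BY date_format(service_dt, 'yyyy-MM');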

Environment: Informatica 9.6 and 10.1.1 PowerCenter Client (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica Data Quality (IDQ) 9.5.1/9.6.1, SQL Server 2005 & 2008, SSMS, T-SQL, DB2, Lotus Notes, Oracle 11g (PL/SQL Developer), Control-M, Unix scripting, HP Service Manager, Hadoop Connector, HDFS, Hive, Cognos 10.2 (Framework Manager, Cognos Connection, Query Studio, Report Studio, Analysis Studio, Metric Studio, Event Studio, Access Manager), Transformer.


State Farm Insurance – Sriven Systems 04/2014 to 05/2016

Bloomington, IL

ETL and BI Lead Developer.

State Farm Insurance – TEKsystems/Populus Group 5/2010 to 03/2014

Bloomington, IL

ETL and BI Lead Developer.

Informatica Responsibilities:

●Worked on multiple projects with different business units such as Auto, Claims, Bank, and Mutual Funds.

●Worked with business analysts for requirement gathering and business analysis.

●Experience in critical production support ETL projects.

●Coordinated with the data modeler to design the various schemas.

●Created business rules in Informatica Developer (IDQ) and imported them into Informatica PowerCenter to load standardized, well-formatted data into staging tables.

●Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accurately check the source data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.

●Used IDQ's standardized plans for address and name cleanup.

●Extensively worked on data extraction, transformation, and loading from various source and target data stores (SQL Server, DB2, Oracle, PostgreSQL, flat files, etc.).

●Created mappings and mapplets to load data from source systems into the data warehouse.

●Implemented Type I and Type II slowly changing dimension (SCD) tables.

●Created and ran Linux/UNIX shell scripts for all pre-/post-session ETL jobs.

●Worked extensively with the debugger to identify bottlenecks and fine-tune mappings, workflows, and sessions.

●Created workflows using Workflow Manager for different tasks, such as sending email notifications, timers that trigger when an event occurs, and sessions to run a mapping.

●Worked with session logs and workflow logs for error handling and troubleshooting in the DEV/QA/PRD environments.

●Provided support for the developed jobs; resolved open incident and problem tickets concerning business partners' issues with ETL-loaded data, and closed the tickets with proper solutions.

●Strong T-SQL experience creating complex SQL scripts to validate the data flow from source to target (a sketch follows this list).

●Part of the team's work was researching data integration on Amazon Redshift.

●Part of the project involved bringing Salesforce data into the in-house target data warehouse using the Informatica Salesforce connector.

●Used the Apex Data Loader to bulk import, export, insert, update, and delete Salesforce records; good knowledge of SFDC.

●Developed MapReduce programs to parse raw data, populate staging tables, and store the refined data in partitioned tables in the EDW.

●Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
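
A minimal sketch of the source-to-target validation style referenced in the T-SQL bullet above, assuming hypothetical STG_POLICY and DW_POLICY tables (table and column names are illustrative only; EXCEPT requires SQL Server 2005 or later):

-- Row-count reconciliation between hypothetical staging and target tables.
SELECT 'stg_policy' AS table_name, COUNT(*) AS row_cnt FROM stg_policy
UNION ALL
SELECT 'dw_policy', COUNT(*) FROM dw_policy;

-- Rows present in staging but missing or different in the target.
SELECT policy_id, policy_type, premium_amt FROM stg_policy
EXCEPT
SELECT policy_id, policy_type, premium_amt FROM dw_policy;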

Environment: Informatica 9.6 PowerCenter Client (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica Data Quality (IDQ) 9.5.1/9.6.1, SQL Server 2005 & 2008, T-SQL, DB2, PostgreSQL (pgAdmin), Lotus Notes, Oracle 10g/11g, Control-M, Unix scripting, HP Service Manager, Splunk, Hadoop Connector, Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Amazon Redshift, Salesforce Connector.

Cognos Reporting Responsibilities:

●Extensively used Cognos Framework Manager to build models with query subjects, query items and namespaces.

●Created OLAP cubes using Transformer and published them for faster report response times.

●Created monthly/weekly/daily Cognos cubes depending on the requirement and used Control-M jobs to refresh the cubes.

●Published packages to Cognos Connection.

●Migrated Cognos 7.4 cubes (used both analytically and in reports), reports, and Framework Manager models to Cognos 8.4; migrated the 8.4 reports, cubes, and Framework Manager models to 10.1.1; and migrated from 10.1.1 to 10.2.1. Tested the reports, cubes, and models on test servers, then published them to the production servers for users. Also provided support for the converted cubes.

●Created management dashboards from a cube for better performance, consolidating information and data elements from many individual reports. These dashboards show how external and internal resources have been utilized and allocated, and the time reported under particular projects.

Environment: Cognos 10.2 (Framework Manager, Cognos Connection, Query Studio, Report Studio, Analysis Studio, Metric Studio, Event Studio, Access Manager), Transformer, SQL Server 2008 (SQL Server Management Studio 10.0), HP Service Manager.

QlikView Reporting Responsibilities:

●Extracted information from several data sources to create data views that can be filtered by multiple data points. Developed executive dashboards in QlikView, delivered through the web, that enable measuring business performance with analytical capabilities.

●Dashboards served to track the summary of the performance and current health of the ICP Program.

●Created sub-apps, dashboards, and reports for this application, providing metrics on risks and issues of all the projects in the ICP Program based on the business user requirements.

●Implemented multiple tiers of QVD architecture depending on the complexity of the application; the final application reload duration was 4 hours.

●Worked on QlikView Server and Publisher to manage user access, schedule document refreshes, and mount completed applications.

●Created PDF push reports and set up email distribution on QlikView Publisher 9.

●Reviewed QlikView error logs and provided production support.

●Set up tasks using QEMC and created an architecture for the interdependent applications to execute in the proper order.

●Created scheduled jobs for QVD extracts and report reloads.

●Created and scheduled weekly and monthly QlikView reports for email distribution.

Environment: QlikView 11, web services, Lotus Notes, Excel, Primavera, SharePoint, DB2, MS SQL Server 2012.

Johnson & Johnson - Medcomps Inc. 7/2009 - 4/2010

Piscataway, NJ

Role: Informatica and Cognos Lead Developer

●Prepared all the required documents for the ETL work and Cognos objects, such as technical design documents and unit test plan documents.

●Developed complex mappings with large SQL override statements, used transformations according to data-load requirements, and created corresponding workflows.

●Developed PL/SQL procedures, functions, packages, and triggers to facilitate specific requirements (a sketch follows this list).

●Coded Linux shell scripts for Control-M to kick off the workflows, and created many parameter files on the Informatica servers.

●Tested the Informatica and Linux components in the development environment, promoted them to other test environments for further testing, and then deployed them to production successfully.

●Created Models and Packages in Framework Manager from Oracle 9i.

●Created reports using Report Studio and ad hoc queries using Query Studio.

●Expertise in developing standard reports, list reports, crosstab reports, charts, multi-level drill-through reports, and master-detail reports using Report Studio.

●Tested and improved report performance upon report completion, validating calculations and data.
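
A minimal PL/SQL sketch of the kind of load-support procedure described in the bullet above; the procedure, table, and column names are hypothetical, not the client's actual objects:

-- Purge a finished batch from a hypothetical staging table and audit the action.
CREATE OR REPLACE PROCEDURE purge_stage_orders (p_batch_id IN NUMBER) AS
BEGIN
    -- Remove rows from the hypothetical staging table for the finished batch.
    DELETE FROM stg_orders
     WHERE batch_id = p_batch_id;

    -- Record the purge in a hypothetical audit table for traceability.
    INSERT INTO etl_audit (batch_id, action, run_dt)
    VALUES (p_batch_id, 'PURGE_STAGE_ORDERS', SYSDATE);

    COMMIT;
END purge_stage_orders;
/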

Environment: Informatica 8.6.1 PowerCenter Client (Repository Manager, Designer, Workflow Manager, Workflow Monitor), SQL Server 2008, DB2, Teradata, Lotus Notes, Oracle 10g/11g, Control-M, Unix scripting, HP Service Manager, Cognos 8.4 (Report Studio, Query Studio, Analysis Studio, Cognos Connection, Framework Manager), testing tools (IBM Rational manual testing tools).

State Farm Insurance Company - Medcomps Inc 6/2008 - 6/2009

Bloomington, IL

Role: ETL and Cognos Developer

●Member of the warehouse design team; assisted in creating fact and dimension tables based on specifications provided by managers.

●Extracted data from an Oracle database and spreadsheets, staged the data in a single place, and applied business logic to load it into the central Oracle database.

●Used Informatica PowerCenter 8.6 for extraction, transformation, and loading (ETL) of data in the data warehouse.

●Prepared various mappings to load data into different stages, such as landing, staging, and target tables.

●Developed workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored the results using Workflow Monitor.

●Created various tasks like Session, Command, Timer and Event wait.

●Built models and published packages to Cognos Connection using Framework Manager.

●Expertise in developing standard reports, list reports, crosstab reports, charts, multi-level drill-through reports, and master-detail reports using Report Studio.

Environment: Informatica 8.1 PowerCenter Client (Repository Manager, Designer, Workflow Manager, Workflow Monitor), DB2, Lotus Notes, Oracle 10g/11g, SQL Server, Control-M, Unix scripting, HP Service Manager, IBM Cognos 8.4 (Framework Manager, Report Studio, Query Studio, Analysis Studio, Metric Studio, Cognos Connection, Access Manager).

National Westminster Bank, UK 9/2005 - 11/2006

Westminster Bridge, UK

Role: Cognos and ETL Developer

●Developed reports such as drill-through reports, sub-reports, and list reports using Cognos Impromptu.

●Created catalogs from multiple databases with the help of HotFiles (.ims).

●Enhanced the existing reports and created new reports as per the user requirements.

●Customized the reports by adding calculations, conditions and functions.

●Created List Reports, Cross Tab Reports and Drill through reports.

●Created Filters, Conditions and Prompts in Catalog.

●Worked extensively on the Oracle database; troubleshot SQL and performance issues.

●Created dimensional maps, Transformer models, PowerCubes, and drill-throughs.

●Automated tasks with Cognos Scheduler using macros, Cognos Script Editor, and Cognos Script Dialog Editor.

Environment: Cognos Impromptu, Cognos PowerPlay, Cognos Transformer, Cognos Impromptu Web Reports, Cognos PowerPlay Enterprise Server, Access Manager, Cognos Upfront, SQL*Loader, Oracle 8i/9i, Windows NT/2000.

Best Medical Center, Malaysia - KCR eVision Pvt Ltd, India. 7/2004 - 8/2005

Bangalore, India

Role: Senior Developer.

●Involved in a software development project, MTT (Medical Tools Testing), for the Best Medical Center. The purpose of this project was to develop a software system to test the accuracy of various surgical tools and detect defects that occur due to prolonged usage.

Environment: CAD/CAM/CAE, RDBMS & GUI: C, C++, Oracle 8i

Cagney Builders, Malaysia - KCR eVision Pvt Ltd, India. 8/2003 - 6/2004

Bangalore, India

Role: Software Developer

Involved in a tower design project to create structural assemblies, process plans, and detail drawings using CAD/CAM software. Also responsible for coordination with the client team in Malaysia and for quality assurance of the design drafts.

Environment: CAD/CAM, Oracle 7.3

The Parisutham Pvt. Ltd, India - KCR eVision Pvt Ltd, India 2/2003 to 6/2003

Bangalore, India

Role: Programmer Analyst.

The HOTRES project is complete front-office management software for a hotel. It has a reservation module with automatic rent posting that facilitates twenty-four-hour checkout. The Tour Manager module is a customized travel agent, money invoicing, and accounting module.

Environment: Visual Basic 6.0, Oracle 7.0


