Data / Business Analyst

Location:
Milwaukee, Wisconsin, United States
Salary:
110,000
Posted:
January 09, 2017

SAILENDRA KORITALA

acx6tr@r.postjobfree.com

Cell: 410-***-****

Summary:

12+ years of experience in the design, development, and testing of software applications.

Big Data Analyst, Data Architect, Sr. Data Analyst, and Data Modeler with hands-on experience in ETL tools such as Informatica, Data Services, and SSIS.

Worked with Business Intelligence reporting tools such as OBIEE, Tableau, and Discoverer, with the Erwin modeling tool, and with Oracle applications such as Siebel CRM, Oracle EBS, and PeopleSoft.

Extensively worked on Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.

Strong experience with Workflow Manager tools: Task Developer, Workflow Designer, and Worklet Designer.

Good experience in understanding and creating source and target definitions, mappings, transformations, and sessions, and in scheduling workflows.

Expert knowledge of all Informatica transformations, data quality, cleansing, and enrichment.

Experienced in installing and upgrading Informatica and in creating multiple repositories, users, and privileges using Informatica Repository Manager.

Involved in Performance tuning of Informatica mappings and sessions.

Good experience in test case design, test planning, test execution, defect management and risk analysis.

Knowledge of data warehouse architecture and of designing star and snowflake schemas.

Expert in design and management tools such as Erwin and Toad.

Good experience writing UNIX shell scripts and scheduling with Autosys.

Worked in different environments, including data migration and data integration.

Experience integrating various data sources such as Oracle, SQL Server 2000, DB2 UDB, and MS Access into a staging area. Advanced skills in administration, configuration, and report generation using Analytics.

Technical skills:

ETL : Talend 6.0, Informatica 9.1 / 8.6 / 7.1.3

Big Data : Hadoop, Hive, Squirrel, HBase, Spark, Pig, Python, Perl, Scala, Java

CRM : Siebel 8.1.1.10 / 8.1.1 (EIM), Scripting, Assignment Manager, Workflows, EAI

Analytics : OBIEE 11.1.1.7 / 10.1.3, Actuate Reports, Tableau

Scripting : Pig, Hive, eScript, VBScript, shell scripting, batch scripting

Middleware : MQSeries, eLink, TIBCO

GUI : Visual Basic 6.0 and Developer 2000

Databases : Oracle 7.x/8i/9i, DB2, SQL Server 7.0, and MS Access 97/2000

Languages : Pig, Python, SQL, PL/SQL, Perl, C, C++, Java

Certifications:

Hadoop Foundations – IBM Open Badge

Big Data Foundations – IBM Open Badge

Education:

Master of Computer Applications (MCA)

Professional Experience:

Johnson Controls, WI Feb’16 – Present

Principal Consultant (Role: Big Data Analyst and Lead Developer)

The goal of GRDM is to design, develop, and implement a Building Efficiency (BE) Global Data Reference Model.

The BE Global Data Reference Model is designed from high-level BE business requirements collected during the scoping effort, which interviewed representative stakeholders from across BE. Additionally, this engagement addresses the data required for the initial implementation of the CBRE and SFDC reporting projects.

Conducting interviews with business users and gathering reporting requirements such as secured revenue/cost, executed revenue/cost, cash collections, and tax payments.

Assessing existing BE Global Data Reference Model architectures.

Conducting interviews with source application owners to understand each source application and its functionality.

Identifying the different entities and relationships in each source application.

Transforming current source entities into the GRDM data model and creating ERDs between source entities.

Running simple and complex SQL queries against the source database to find relationships between tables and to determine the cardinality and current grain of the source data, as illustrated below.
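
A minimal example of such checks, using hypothetical order tables (the actual source schemas differ):

    -- Grain check: if this returns rows, (order_id, line_no) is not
    -- the unique grain of the source table.
    SELECT order_id, line_no, COUNT(*) AS dup_cnt
    FROM order_lines
    GROUP BY order_id, line_no
    HAVING COUNT(*) > 1;

    -- Cardinality check: orphaned child rows mean the parent-child
    -- relationship is not enforced in the source.
    SELECT COUNT(*) AS orphan_cnt
    FROM order_lines ol
    LEFT JOIN orders o ON o.order_id = ol.order_id
    WHERE o.order_id IS NULL;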

Identifying source tables, assisting the ingestion team in bringing the data into Hadoop (the data lake), and creating Hive tables over the source data in Hadoop, for example:
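
A sketch of a Hive external table over ingested files (the column list, delimiter, and HDFS path are assumed, not the project's actual values):

    -- Expose raw ingested files in the data lake as a Hive table.
    CREATE EXTERNAL TABLE src_orders (
      order_id    BIGINT,
      customer_id BIGINT,
      order_date  STRING,
      amount      DECIMAL(18,2)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION '/data/lake/raw/orders';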

After ingestion, profiling the source data in Hadoop using Hive SQL (see the example below) and assisting the data modeler in making the changes needed to fit the global data warehouse.
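
A typical profiling query of this kind, run against the hypothetical src_orders table above:

    -- Row count, key uniqueness, null rate, and date range in one pass.
    SELECT COUNT(*)                 AS row_cnt,
           COUNT(DISTINCT order_id) AS distinct_keys,
           SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_customer_ids,
           MIN(order_date)          AS min_order_date,
           MAX(order_date)          AS max_order_date
    FROM src_orders;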

Creating source-to-target mapping documents for the stage and data warehouse (GRDM) layers on a per-target-table basis, and modeling history and snapshot data loads.

Assisting the data modeler in building the global data warehouse model from multiple sources at the required lowest grain, with the appropriate cardinality, history, snapshots, and aggregates.

Coding business logic, SQL, lookups, and reference tables in Talend, and assisting ETL developers with the same.

Obtaining unique IDs for customers, legal entities, and addresses from MDM, and validating data in stage tables and final target tables.

Creating sample data for reference tables such as currency conversions, line of business, vertical market, and global conformity.

Validating data in the data warehouse against the correct grains, business scenarios, transaction dates, surrogate keys, history, snapshots, and aggregations; two representative checks are sketched below.
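
Two representative checks, written against assumed warehouse table names (fact_revenue, dim_customer):

    -- Surrogate keys in the fact table must be unique.
    SELECT revenue_sk, COUNT(*) AS dup_cnt
    FROM fact_revenue
    GROUP BY revenue_sk
    HAVING COUNT(*) > 1;

    -- Every fact row must resolve to a customer dimension row.
    SELECT COUNT(*) AS unresolved_rows
    FROM fact_revenue f
    LEFT JOIN dim_customer c ON c.customer_sk = f.customer_sk
    WHERE c.customer_sk IS NULL;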

End-to-end responsibility, from bringing source data into Hadoop through loading the warehouse, including validation, verification, and UAT.

Environment: Talend 6.0, Oracle 12c, Hadoop, Hive, Putty, Windows XP, SQL Developer, FTP, JIRA

Regal Beloit, WI Aug’15 – Jan’16

BI Consultant (Data Architect)

Responsible for DAC monitoring and daily activities.

Worked on QAD North America go-live and support activities.

ETL design and development for build-and-hold logic on orders and for the global employee headcount.

Worked on write-back changes in the RPD and Answers for the parent account name for NAICS customers and the sold-to customer strategy.

Extensive analysis, reporting, and comparison of BI reports; end-to-end data validation between Oracle EBS and BI reports.

Coming from an ETL technical background, after source data analysis I can carry out end-to-end (source-to-target mapping) ETL design and coding in tools such as Informatica, including scheduling.

Experience in source data analysis and profiling driven by the customer's business perspective, building ETL mappings for business scenarios, debugging and troubleshooting existing ETL mappings, and providing solutions for current data gaps.

Analyzed and identified the gap between NAICS and combined sales, fixed the ETL code, and compared field-level reports.

Analyzed different source reports and reconciled sales numbers against OBIEE reports.

Performance improvement in ETL Code and SQL scripts.

Working on Client requests (JIRAs) for new report fields and functionality.

Installed OBIEE 12c in development environments and tested reports and functionality.

Followed up with the Oracle support team on OBIEE 12c hurdles and issues.

Environment: Informatica 9.0.1, DAC 11.1.1.6., OBIEE 11.1.1.6, Oracle EBS, Windows XP, TOAD, FTP, JIRA

Johnson Controls, WI Nov’14 – Aug’15

BI Consultant (Data Architect / Migration for SFDC)

As this was a previous client, from day one I jumped into DAC monitoring, BI troubleshooting, BI enhancement tickets, and Taxware development.

Responsible for receiving the data feed from the Taxware cloud and incorporating tax information into the BI application.

Compared tax numbers between Taxware and BI, and between BI and the Nxgen application.

Troubleshot and provided solutions for missing data feeds during Taxware outages.

As part of the SFDC data migration, worked on data cleansing and loading for multiple entities using Informatica, SQL, and stored procedures.

Environment: Informatica 9.1.0, DAC 11.1.1.6., OBIEE 11.1.1.7, Siebel 8.1, Oracle 11g, Windows XP, EIM, TOAD, FTP, Netdrive

GE Healthcare, WI Jan’14 – Nov’14

Data Conversion Lead/Specialist (Informatica and EIM)

The project included extensive ETL scripting with Informatica, SQL, and PL/SQL, Siebel EIM data loads as part of the data conversion, and LoadRunner for performance testing.

Acted in a lead role on the healthcare data conversion project and served as onsite coordinator.

Received legacy data, analyzed it against business requirements, and educated legacy owners on how the new system works with legacy data so they could deliver the required data sets for conversion.

Applied data cleansing and profiling techniques to obtain accurate data from legacy systems.

Used multi-domain master data management (MDM) to create a single source of master data for customers, streamline reimbursements, and maximize return on data.

Data profiling using IDQ and bulk loads using IDD.

Enriched data by appending to existing records to improve quality and by adding new data for richer context.

Prepared data conversion documents for the code build, reviewed them with product owners, and passed them to developers.

Reviewed the code build against current business logic and performance requirements, and validated data loads.

Set up timelines for code build, testing, documentation, etc.

Provided data extracts for product owners to review and analyze, and to supply new inputs or data sets to load.

Created virtual users in LoadRunner to test the entitlement process.

Ran back-end data fixes as part of production support and performance work.

Prepared SQL and PL/SQL scripts for analysis, daily reports, data conversion, and performance fixes.

Validated enterprise IDs for legal entities, accounts, and addresses against the MDM system.

Worked on design documents, change controls, test scripts, validations, etc.

Extensive production support for go-live and post-go-live activities and for field support engineers.

Johnson Controls, WI Dec’10 – Dec’13

Sr. Data Analyst (ETL and OBIEE)

Configured OBIEE 11g agents for email delivery, shared locations, and conditional agents.

Direct OLTP reports.

Upgrade responsibilities, including Informatica 8.1.1 to 9.1.0, DAC 7.9.5 to 11.1.1.6, OBIEE 10g to 11g, and scheduler migration.

Used SQL, PL/SQL, and Informatica to populate custom AR aging buckets in OBIEE; the bucket logic is sketched below.
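
A hedged sketch of the bucket logic (the invoice table and column names are assumed; the production logic lived in Informatica and PL/SQL):

    -- Assign each open invoice to an aging bucket by days past due.
    SELECT invoice_id,
           invoice_amount,
           CASE
             WHEN SYSDATE - due_date <= 30 THEN '0-30 days'
             WHEN SYSDATE - due_date <= 60 THEN '31-60 days'
             WHEN SYSDATE - due_date <= 90 THEN '61-90 days'
             ELSE 'Over 90 days'
           END AS aging_bucket
    FROM ar_open_invoices;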

Weekly RPD release activities.

Pre-release and post-production responsibilities for Informatica, DAC, and OBIEE.

Developed role-based dashboards for trend analysis in Answers.

RPD changes for OBIEE reports.

Sr. Conversion Lead (ETL and EIM)

Performance tuning of MTSA and Informatica jobs.

Prepared functional and technical design documents for the employee and Taxware data loads.

Informatica code changes for business requirements such as the National Service loads.

Data loads for National Service, including accounts, assets, agreements, line items, activity templates, SDP, subledger, etc., using SQL, PL/SQL, Informatica, and Siebel EIM.

Frequent data loads for activity templates, vendor merge, subledger setup, account priority, discount matrix, etc.

Nightly ETL (Analytics Refresh) monitoring

Troubleshot data errors, proposed solutions, ran data updates, etc.

Developed and modified batch scripts for the daily employee load.

Developed reports for business users, SOX reports, etc.

Environment: Informatica 9.1.0, DAC 11.1.1.6., OBIEE 11.1.1.7, Siebel 8.1.1.10, Oracle 11g, Windows XP, EIM, TOAD, FTP, Netdrive

EDS (HP Company), VA Feb’10 – Nov’10

Data Analyst (ETL and EIM)

Performance tuning of MTSA and Informatica jobs.

Translated business requirements into Informatica mappings.

Created mappings to load slowly changing dimension tables based on how much historical dimension data was to be kept; a minimal sketch of the pattern follows.
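
A minimal SQL sketch of the Type 2 pattern behind such mappings (dim_customer, stg_customer, and the tracked column are assumed names; the real loads were built as Informatica mappings):

    -- Step 1: expire the current row when a tracked attribute changed.
    UPDATE dim_customer d
    SET d.effective_to = TRUNC(SYSDATE) - 1,
        d.current_flag = 'N'
    WHERE d.current_flag = 'Y'
      AND EXISTS (SELECT 1 FROM stg_customer s
                  WHERE s.customer_id = d.customer_id
                    AND s.address <> d.address);

    -- Step 2: insert a new current version for changed and new customers.
    INSERT INTO dim_customer
      (customer_sk, customer_id, address, effective_from, effective_to, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM stg_customer s
    WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                      WHERE d.customer_id = s.customer_id
                        AND d.current_flag = 'Y');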

Tested the Informatica mappings by creating test plans and test data, testing the maps, configuring sessions, and monitoring session streams.

Developed incremental and updatable loads through mappings.

Designed reusable transformations and shortcuts shared across different mappings.

Created and edited multiple sessions and added them to batch components. Loaded products, accounts, notes, billing, and services using Informatica.

Developed stored procedures with nested cursors, bulk loads, and DB hints to increase performance, along the lines of the sketch below.
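
A hedged sketch of the bulk-load pattern in PL/SQL (stg_accounts and tgt_accounts are assumed, structurally identical tables):

    -- Fetch in 10,000-row batches with BULK COLLECT and insert with
    -- FORALL to minimize SQL/PL-SQL context switches.
    DECLARE
      CURSOR c_src IS SELECT * FROM stg_accounts;
      TYPE t_rows IS TABLE OF stg_accounts%ROWTYPE;
      l_rows t_rows;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO l_rows LIMIT 10000;
        EXIT WHEN l_rows.COUNT = 0;
        FORALL i IN 1 .. l_rows.COUNT
          INSERT INTO tgt_accounts VALUES l_rows(i);
        COMMIT;
      END LOOP;
      CLOSE c_src;
    END;
    /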

Dealt with large data volumes, for example 54 million records.

Worked on production issues, troubleshooting, and data recovery.

Loaded products and services using Informatica.

As part of MTSA release v4.84:

Developed the Cost of Attendance on-demand job from MTSA using Informatica and EIM.

Developed the end-to-end Cost of Attendance-to-MTSA daily job using Informatica.

Deployment activities, Post deployment tasks etc.

Environment: Informatica 8.6, Siebel 8.1.1, Oracle 10g, Windows XP, OBIEE, Load Runner, EIM, TOAD, Linux.

Abbott Labs, IL Sept'09 – Jan’10

Sr. Data Analyst

Ran data loads for assignment rules, IRL, contact loads, accounts, activities, asset/account segmentation, TAB cancel, etc., using EIM.

Worked in DAC to schedule and monitor the SDE and SIL mappings in Informatica that move data from OLTP to OLAP.

Configured the Physical Layer: imported tables and added or changed physical joins and keys.

Configured the Logical Layer: created logical tables, dimensions, and columns; built logical joins and keys; set levels to support drill-down and aggregation; and created dimensional hierarchies.

Processed sales and part orders manually by running Informatica workflows and EIM on a daily basis.

Wrote a business service to update the primary warranty for assets and scheduled the business service from the RCR component.

Applied troubleshooting techniques to identify data that was not migrated into Navigator.

Tested the Informatica mappings by creating test plans and test data, testing the maps, configuring sessions, and monitoring session streams.

Developed incremental and updatable loads through mappings.

Designed reusable transformations and shortcuts shared across different mappings.

Created and edited multiple sessions and added them to batch components.

Debugged and resolved loading failures by verifying the log files.

Fine-tuned SQL and generated reports for downstream systems.

Developed batch scripts for FTP, for starting EIM jobs, and for running statistics, and attached them as pre-session tasks to Informatica workflows.

Monitored the Autosys schedule to check the daily loads of all entities for the different downstream systems.

Involved in data mapping and data modeling.

Database design and installation; made changes to the data model based on client requirements.

Extended the data model to accommodate the client's requirements.

Order management: handled ADMS, OMAR, FedEx SO, and PO order processing.

Prepared the design document and participated in release activities.

Environment: Informatica 8.5.1 / 7.1.4, OBIEE, Siebel eMedical, Oracle 9i, Windows XP, .Net, Assignment Manager, Load Runner, EIM, TOAD, HP-UX.

Merck, PA Sept'08 – July’09

Sr. Data Analyst

ETL & EIM

Involved in functional and technical design and documentation for user requirements.

Involved in discussions with the Client Services team to gather and analyze the business requirements.

Understood the business requirements, analyzed them, and translated them into application and operational requirements.

Understood the architecture and was involved in the data conversion from FACTS to Account Management for the integration project, preparing FACTS-to-AM mappings using the data model, with end-to-end responsibility for the ETL and EIM tasks.

Coordinated with offshore team members on the technical design and checked code on a daily basis.

Duties included performance tuning, data quality, and troubleshooting per best practices.

Coordinated the offshore team on a daily basis, providing technical solutions for the ETL, EIM, and scripting components necessary to implement the solution and meet deadlines.

Optimized SQL queries to increase performance.

Responsible for the end-to-end automation workflow covering source file transfers from FACTS to Informatica, ETL loads, and EIM jobs.

Applied troubleshooting techniques to identify data that was not migrated into Account Management.

Created extension columns per the client's requirements.

Developed batch scripts for FTP, for starting EIM jobs, and for running statistics, and attached them as pre-session tasks to Informatica workflows.

Loaded data for accounts and contact affiliations, activities, files, etc.

Tuned and optimized SQL queries. Created test scripts for all entities in the data migration.

Environment: Informatica 8.0 Power Center, Oracle 10g, Windows XP, .Net, Assignment Manager, EIM, HP-UX 10x, MS Visio, TOAD

Perot Systems, VA: DOE Oct’07 – Aug’08

ETL

Migrated data from different subsystems into IPM (Integrated Partner Management) staging and EIM tables using Informatica.

Applied complex transformation logic to filter source data based on binary codes and address logic, handle history data, maintain parent-child relationships, etc.

Developed Profile reports to analyze source data.

Optimizing SQL queries to increase Performance.

Wrote parameter files, creating a single workflow per subsystem.

Promoted code from DEV to TEST, and loaded and verified data in the TEST environment.

Created shell scripts to substitute the user and schema information into the SQL queries.

Involved in designing loads to the different stages using complex mappings.

Automated the conditional execution of workflows in a specific sequence using shell scripts, and worked in production support.

Modified the pre-session and post-session scripts.

Involved in moving from development to production environment.

Responsible for setting up the test environment.

Involved in unit testing and system integration testing.

Tested all the workflows completely and was involved in configuring PowerExchange with a mainframe source.

Debugged and resolved loading failures by verifying the log files.

Used the Debugger for error handling and to test the data flow.

Tuned mappings for optimum performance, using various tuning principles to reduce session load times.

Used the Workflow Manager to create, validate, and run workflows and sessions and to schedule them to run at specified times.

Environment: Siebel 7.8, Informatica 8.0, EIM, .Net Framework, Assignment Manager, Business Services, Workflow Process Batch Manager and Mercury Tool.

Johnson & Johnson, NJ Aug’06 – Oct’07

Tier 3 Data Team

Responsibilities:

As a Data Team member involved in development, support and maintenance activities.

Loaded contacts, accounts, and products using EIM. Automated the OBI compensation flag update.

Performed professional (contact) and account merges based on business requests.

Loaded professional-account affiliations on a regular basis; loaded account-account affiliations; loaded new opportunities and performed bulk updates on opportunities.

Created and edited mappings in Informatica and ran the ETL batches, scheduling the batches to run at the required intervals.

Loaded data from the legacy customer master tables into base tables using Informatica ETL and EIM. Supported all data-related issues and maintained data consistency.

Worked on Territory renumbering.

Created SQL and PL/SQL blocks for cleansing and updating the required data, along the lines of the example below.
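
A small example of this kind of cleansing block (table and column names are hypothetical):

    -- Normalize name casing and strip stray whitespace in staged contacts.
    BEGIN
      UPDATE stg_contacts
      SET first_name = INITCAP(TRIM(first_name)),
          last_name  = INITCAP(TRIM(last_name))
      WHERE first_name <> INITCAP(TRIM(first_name))
         OR last_name  <> INITCAP(TRIM(last_name));
      COMMIT;
    END;
    /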

Worked on change request based on business requirement.

Responsible for setting up the test environment.

Developed various complex Informatica mappings.

Created mappings to load slowly changing dimension tables based on how much historical dimension data was to be kept.

Tested the Informatica mappings by creating test plans and test data, testing the maps, configuring sessions, and monitoring session streams.

Extensively used router, aggregator, lookup, update strategy, joiner, expression, and sequence generator transformations to extract data in compliance with the business logic.

Debugged and resolved loading failures by verifying the log files.

Used the Debugger for error handling and to test the data flow.

Used the Workflow Manager to create, validate, and run workflows and sessions and to schedule them to run at specified times.

Tuned mappings for optimum performance, using various tuning principles to reduce load times for sessions and workflows.

Worked in production support.

Environment: Informatica 7.1, Oracle 9i, Windows XP, SQL Loader, EIM, Business Services, JMS/HTTP, Smart Scripts, Workflow Process Batch Manager and Testing.

Mahindra-British Telecom Ltd., Mumbai, India May’04 – June’06

Electronic Customer Ordering, BT, UK

Responsibilities:

Modified and configured the vanilla Siebel application using Siebel Tools.

Customized and modified various object definitions using Siebel Tools.

Developed the user interface by mapping and designing applets into views and views into screens using the Applet and View Designers.

Designed the Business Object Layer and the User Interface Layer.

Used Siebel Tools to configure business objects, business components, picklists, MVGs, applets, screens, views, links, and joins per the requirements.

Environment: Siebel Sales 6.0/7.0.3, Siebel Tools, Workflows, EIM, Assignment manager, eScript, Windows 2000, Oracle


