
Mahesh Chandra

ETL Developer

Email: ****************@*****.*** Phone: +1-302-***-****

Professional Summary

6+ years of programming experience as an Oracle PL/SQL/ETL Developer in Analysis, Design and Implementation of Business Applications using the Oracle Relational Database Management System (RDBMS).

Involved in all phases of the SDLC (Software Development Life Cycle), from analysis and design through development, testing, implementation, and maintenance, with timely delivery against aggressive deadlines.

Proficient in using the TOAD application for creating Oracle packages, stored procedures, functions, and triggers, retrieving Explain Plan output to enhance the performance of SQL queries, and comparing table data between development and production schemas.

Experience with PowerBuilder and the Informatica PowerCenter ETL tool.

Experience with data flow diagrams, data dictionaries, database normalization theory and techniques, and entity-relationship modeling and design techniques.

Effectively made use of table functions, indexes, table partitioning, collections, analytic functions, materialized views, query rewrite, and transportable tablespaces.

Strong experience in data warehouse concepts and ETL.

Tuned Informatica mappings and sessions for optimum performance.

Developed mapping specifications using various PowerCenter transformations such as Filter, Router, Aggregator, Joiner, Lookup, and Source Qualifier.

Performed performance tuning at both the mapping level and the database level to improve the data flow from source to target.

Performed Unit testing and maintained test logs and test cases for all the mappings.

Maintained warehouse metadata, naming standards and warehouse standards for future application development.

Parsed high-level design specifications into simple ETL code in line with mapping standards.

Created tables, views, constraints, indexes, and sequences.

Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.

Developed materialized views for data replication in distributed environments.
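
For illustration, a minimal sketch of the kind of materialized view used for replication, assuming a hypothetical orders table and a remote_db database link; a materialized view log on the master table is required for fast refresh:

  -- On the master site: log changes so the view can fast-refresh
  CREATE MATERIALIZED VIEW LOG ON orders WITH PRIMARY KEY;

  -- On the replica site: refresh hourly over the database link
  CREATE MATERIALIZED VIEW orders_mv
    REFRESH FAST
    START WITH SYSDATE NEXT SYSDATE + 1/24
  AS
    SELECT order_id, customer_id, order_total
    FROM orders@remote_db;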

Excellent technical and analytical skills with clear understanding of design goals of ER modeling for OLTP and dimension modeling for OLAP.

Experience in Oracle supplied packages, Dynamic SQL, Records and PL/SQL Tables.

Loaded data into Oracle tables using SQL*Loader.

Partitioned large tables using the range-partitioning technique.
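
A minimal sketch of the range-partitioning technique; the table, column, and boundaries are hypothetical:

  CREATE TABLE sales_fact (
    sale_id   NUMBER,
    sale_date DATE,
    amount    NUMBER(12,2)
  )
  PARTITION BY RANGE (sale_date) (
    PARTITION p2017 VALUES LESS THAN (TO_DATE('01-JAN-2018','DD-MON-YYYY')),
    PARTITION p2018 VALUES LESS THAN (TO_DATE('01-JAN-2019','DD-MON-YYYY')),
    PARTITION pmax  VALUES LESS THAN (MAXVALUE)
  );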

Experience with Oracle Supplied Packages such as DBMS_SQL, DBMS_JOB and UTL_FILE.

Created packages and procedures to automatically drop and recreate indexes on tables.
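
A minimal sketch of that pattern using dynamic SQL; the index and table names are hypothetical, and production code would typically loop over USER_INDEXES rather than hard-coding names:

  CREATE OR REPLACE PROCEDURE rebuild_stage_index IS
  BEGIN
    -- Drop the index so a bulk load runs without index maintenance
    EXECUTE IMMEDIATE 'DROP INDEX stage_sales_ix';
    -- ... bulk load happens here ...
    -- Recreate the index after loading completes
    EXECUTE IMMEDIATE
      'CREATE INDEX stage_sales_ix ON stage_sales (sale_date)';
  END rebuild_stage_index;
  /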

Worked extensively on Ref Cursor, External Tables and Collections.

Expertise in Dynamic SQL, Collections and Exception handling.

Experience in SQL performance tuning using Cost-Based Optimization (CBO).

Good knowledge of key Oracle performance related features such as Query Optimizer, Execution Plans and Indexes.

Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS.
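
A minimal sketch of that workflow, with a hypothetical query and index hint:

  EXPLAIN PLAN FOR
    SELECT /*+ INDEX(e emp_dept_ix) */ e.ename, e.sal
    FROM   emp e
    WHERE  e.deptno = 10;

  -- Display the plan the optimizer chose
  SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);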

Experience in ETL techniques, analysis, and reporting, including hands-on experience with reporting tools such as Cognos.

Experience in agile methodology.

Good knowledge of the finance domain, including alternative investments, accounts receivable/payable, and accounting.

Excellent communication skills, fast and enthusiastic learner, excellent work ethic, and a good team player. Strong management, administrative, analytical, and leadership skills.

Technical Skills:

Database Technology : Oracle Database 12c, 11g, 10g, 9i; MS Access; Netezza

Database Management System : Database design, development, and maintenance

Database Tools : Oracle Forms & Reports, TOAD, SQL*Loader, PL/SQL Developer, Oracle SQL Developer, SQL*Plus, vi editor, PuTTY, ERWIN, Oracle 9i/10g/11g, FoxPro, Teradata SQL Assistant, TKPROF, EXPLAIN PLAN, Export, Import, Oracle Warehouse Builder (OWB), Oracle Data Integrator (ODI), Oracle Enterprise Manager (OEM)

Scripting : Shell (Bash), XML, SQL

Operating Systems : Linux (Red Hat), UNIX, AIX, MS-DOS, Windows 9x/NT/2000/2003/XP/Vista

Programming Languages : Python (versions 3 and 2)

ETL Technologies : Informatica PowerCenter Repository Manager, PowerCenter Designer, PowerCenter Workflow Manager, PowerCenter Workflow Monitor

Education : Master's in Computer Science, Wilmington University, Delaware

PROFESSIONAL EXPERIENCE

Client: American Express, Phoenix, AZ Sep-2018 to Current

Role: ETL Developer/ Engineer I

Project – Business Intelligence and Accounts Receivable Data Warehouse (BI&DW)

Responsibilities:

Working with upstream and downstream business teams to gather requirements for future projects.

Designing the data flows in ETL and the database across various Informatica transformations and different databases.

Developing the ETL spec documents using data modeling tools such as ERWIN Data Modeler for the Oracle database.

Created Dimension tables and Fact tables across the Schemas in Development, QA and Prod Environments.

Developing ETL Informatica mappings, mapplets, sessions, and workflows per the ETL specification document.

Reviewing and validating the code developed by other ETL resources and publishing it in the Stack repository.

Providing and delivering the data required by BI teams such as the Oracle OBIEE, IBM Cognos, and MicroStrategy teams.

Scheduling jobs in scheduler tools such as Automic, ActiveBatch, and Control-M per the business requirements.

Creating SFTP profile setups to open an SFTP site for various upstream and downstream teams to deliver and receive files securely.

Using the PowerExchange tool to verify and validate the content of mainframe files before initiating the batch process.

Maintaining and creating Disaster Recovery (DR) plans to secure and regenerate the data warehouse data.

Organizing and maintaining the databases for the lower and upper environments and the corresponding UNIX servers.

Responding to and resolving incidents raised by other teams to the BIDW/ARDW team in a timely manner.

Implemented file validation, session control, batch load, and audit control mechanisms by creating Informatica mappings, and developed a well-organized architecture across the lower and upper environments.

Responding to questions and ad hoc requests from third-party audit companies by providing sample data via complex SQL/PL/SQL queries and stored procedures.

Creating UNIX shell scripts to generate parameter files dynamically and invoke the workflows.

Environment: IBM Netezza database, Informatica PowerCenter 10, MicroStrategy reporting tool, UNIX, PowerExchange, Citrix Receiver, Aginity IDE, WinSCP, FileZilla, Oracle 10g/11g, SQL Developer, PL/SQL Developer, TOAD, IBM Cognos, Oracle APEX, ActiveBatch job scheduler, UNIX shell scripting (Bourne and Korn), GitHub, SVN, CVS migration tool, Informatica PowerCenter Repository Manager, PowerCenter Designer, PowerCenter Workflow Manager, PowerCenter Workflow Monitor

Client: Mediacom, Chester, NY Nov-2016 to Sep-2018

Role: Oracle PL-SQL Developer / ETL Developer

Project – Data Warehouse Operations

Responsibilities:

Responsible for developing jobs in the ActiveBatch scheduler, where Mediacom data warehouse operational jobs (monthly, weekly, and bi-weekly, both inbound and outbound) are scheduled in a timely manner.

Managing a Smartsheet to log issues from end users, vendors, and internal departments and to track issue status.

Developing different types of job processes and scheduling them in ActiveBatch.

Monitoring jobs in ActiveBatch, identifying issues, and logging them in the issue tracker (Smartsheet).

Troubleshooting and debugging issues identified by other developers.

Fixing issues by developing PL/SQL code.

Writing SQL queries to send data to Cognos to generate reports for other Mediacom departments.

Scheduling database jobs via the DBMS scheduler (DBMS_SCHEDULER).
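
A minimal sketch of such a job definition with DBMS_SCHEDULER; the job name, procedure, and schedule are hypothetical:

  BEGIN
    DBMS_SCHEDULER.CREATE_JOB(
      job_name        => 'NIGHTLY_LOAD_JOB',
      job_type        => 'STORED_PROCEDURE',
      job_action      => 'DW_PKG.RUN_NIGHTLY_LOAD',
      repeat_interval => 'FREQ=DAILY; BYHOUR=2',  -- run every day at 2 AM
      enabled         => TRUE);
  END;
  /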

Handling daily errors found in ActiveBatch and logging issues in the issue tracker to share error status with the relevant managers.

Troubleshooting and debugging identified errors and resolving them by modifying the respective PL/SQL code in stored procedures and packages.

Validating reports generated from Cognos and Oracle APEX before they are sent out to Mediacom vendors.

Differentiating residential and commercial Mediacom plans and mapping their codes to the respective reports by writing complex PL/SQL code.

Analyzing the performance of ActiveBatch jobs by monitoring average job completion time and implementing special category tags to identify jobs in ActiveBatch.

Deploying code to the production database and maintaining release notes for each deployment.

Managing SVN and Git repositories to preserve the code and maintain its history.

Managing inbound jobs from various Mediacom vendors into the data warehouse to load Mediacom tables with daily source file data.

Exporting data from Mediacom tables to flat files and other types of reports developed with the Informatica tool.

Extracting data using mapping transformations and scheduling the workflows in ActiveBatch to run at specified times per business requirements.

Converted SQL*Loader scripts and stored procedures into Informatica mappings using various transformations.

Experience using Expression, Lookup, Filter, SQL, Aggregator, Router, Union, and Rank transformations in mapping development for both inbound and outbound jobs in Informatica.

Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.

Created mappings using the Designer, extracted data from various sources, and transformed data according to the requirements.

Involved in extracting data from flat files and relational databases into the staging area.

Migrated mappings, sessions, and workflows from Development to Test and then to the UAT environment.

Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.

Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.

Created sessions, extracted data from various sources, transformed data according to the requirements, and loaded it into the data warehouse.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.

Environment: Oracle 10g/11g, SQL Developer, PL/SQL Developer, TOAD, IBM Cognos, Oracle APEX, ActiveBatch job scheduler, UNIX shell scripting (Bourne and Korn), GitHub, SVN, CVS migration tool, Informatica PowerCenter Repository Manager, PowerCenter Designer, PowerCenter Workflow Manager, PowerCenter Workflow Monitor

Client: Sherwin-Williams, Cleveland, OH Sep-2014 to Oct-2016

Role: Software Developer/IT Analyst

Project – Cost Center Nucleus (CCN: Store Drafts, Banking, Processors, Field Payroll)

Responsibilities:

Responsible for migrating data from Legacy Mainframe systems to Oracle for CCN Application.

Worked on troubleshooting existing CCN Application issues.

Responsible for developing UNIX shell scripts to support data conversion, and scheduled the shell scripts as cron jobs to run on a daily and monthly basis.

Responsible for project implementation of single-dimensional business functions.

Involved in application development, business analysis, and system administration.

Analyzing and implementing functionality into the existing systems per business requirements with minimal supervision.

Created batch files to automate SQL script files and scheduled them using cron.

Worked on developing Crystal Reports (an SAP business intelligence tool) through Eclipse to generate banking reports to send to the mainframe.

Used UTL_FILE and the import/export tools to load data from .txt and .xls files into Oracle.

Worked on REF cursors and bulk inserts to retrieve data for Crystal Reports.
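
A minimal sketch of a ref cursor combined with a bulk fetch, as one might hand a result set to a reporting layer; the table and column are hypothetical:

  DECLARE
    TYPE t_amounts IS TABLE OF bank_drafts.amount%TYPE;
    l_amounts t_amounts;
    l_rc      SYS_REFCURSOR;
  BEGIN
    OPEN l_rc FOR
      SELECT amount FROM bank_drafts WHERE draft_date = TRUNC(SYSDATE);
    FETCH l_rc BULK COLLECT INTO l_amounts;  -- one round trip instead of row-by-row
    CLOSE l_rc;
  END;
  /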

Worked on error handling using system-defined and user-defined exceptions declared with PRAGMA EXCEPTION_INIT.

Developed PL/SQL stored procedures, functions, packages, views and materialized views, implicit and explicit cursors, triggers, enforced integrity constraints.

Store draft data for cost centers located in the USA, Canada, and Mexico was handled efficiently by running init-load files and batch load jobs.

Stored procedures were written to send the .xls and .csv files to the mainframe using Oracle's built-in UTL_FILE package.
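
A minimal sketch of such a UTL_FILE write; DATA_DIR is a hypothetical Oracle directory object that must already exist:

  DECLARE
    l_file UTL_FILE.FILE_TYPE;
  BEGIN
    l_file := UTL_FILE.FOPEN('DATA_DIR', 'store_drafts.csv', 'w');
    FOR r IN (SELECT draft_id, amount FROM store_drafts) LOOP
      UTL_FILE.PUT_LINE(l_file, r.draft_id || ',' || r.amount);
    END LOOP;
    UTL_FILE.FCLOSE(l_file);
  END;
  /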

Automated jobs; job status is reported every half hour by emailing the user groups via the UTL_SMTP package.

The field payroll domain was handled by running payrolls every Thursday for cost centers belonging to the USA and every Friday for cost centers belonging to Canada.

The member maintenance window for the banking domain was developed with a Java application front end and an Oracle back end.

Created external tables using the ORACLE_LOADER driver to load data into the Oracle DB from the IBM mainframe.
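
A minimal sketch of such an external table; the directory object, file name, and record layout are hypothetical:

  CREATE TABLE cost_center_ext (
    cc_code NUMBER,
    cc_name VARCHAR2(60)
  )
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (
      RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY ','
    )
    LOCATION ('cost_centers.dat')
  );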

Developing XML data based on the provided XSD document to pass from one application to another.

Building XML data from the provided character data to pass from the database to the UI for further processing by the Java application.
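
A minimal sketch of building XML from relational data with Oracle's SQL/XML functions; the element, table, and bind names are hypothetical:

  SELECT XMLELEMENT("member",
           XMLELEMENT("id",   m.member_id),
           XMLELEMENT("name", m.member_name)
         ).getClobVal() AS member_xml
  FROM   members m
  WHERE  m.member_id = :p_member_id;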

Stored procedures and packages were written to implement the business rules for the front-end application.

Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.

Developed several reusable transformations and mapplets that were used in other mappings.

Responsible for designing and developing of mappings, mapplets, sessions and work flows for loading the data from source to target database using Informatica Power Center and tuned mappings for improving performance.

Developed mappings using various transformations like update strategy, lookup, stored procedure, router, joiner, sequence generator and expression transformation.

Used Informatica Power Center Workflow Manager to create sessions, batches to run with the logic embedded in the mappings.

Tuned mappings and SQL queries for better performance and efficiency.

Performed Unit testing and validated the data.

Created and ran workflows using the Workflow Manager in Informatica. Maintained source definitions, transformation rules, and target definitions using the Informatica Repository Manager.

Environment: Oracle 10g/11g, SQL Developer, PL/SQL Developer, TOAD, Eclipse with Crystal Reports, UNIX shell scripting (Bourne and Korn), GitHub, SVN, CVS migration tool

Client: United Health Group (UHG) Oct-2012 to Jun-2014

Title: IT Analyst

Project – Optum Operations

Responsibilities:

Gathered Business Requirements from the client and translated the business detail into technical design and specification.

Analyzed the data extracted from the different source systems based on the business rules.

Created databases, tablespaces, schema objects, and stored procedures.

Developed various back-end application programs such as tables, views, functions, triggers, procedures, and packages using SQL and PL/SQL.

Responsible for creating functions to be used for calculations. Wrote database triggers to monitor the data migration.
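
A minimal sketch of such a monitoring trigger; the table names are hypothetical:

  CREATE OR REPLACE TRIGGER trg_claims_audit
  AFTER INSERT OR UPDATE ON claims
  FOR EACH ROW
  DECLARE
    l_action CHAR(1);
  BEGIN
    -- INSERTING/UPDATING predicates tell us which DML fired the trigger
    l_action := CASE WHEN INSERTING THEN 'I' ELSE 'U' END;
    INSERT INTO claims_audit (claim_id, action, changed_on)
    VALUES (:NEW.claim_id, l_action, SYSDATE);
  END;
  /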

Developed algorithm and PL/SQL code for efficient retrieval and manipulation of complex data sets using PL/SQL packages.

Wrote ANSI SQL and PL/SQL procedures and functions while migrating data from the legacy source systems to the target system (Oracle database).

Extracted and transformed source data from different databases such as Oracle, SQL Server, and DB2, and from flat files, into Oracle.

Responsible for troubleshooting, debugging, problem solving and tuning for improving performance of the backend application programs.

Loaded flat-file data into database tables by creating multiple SQL*Loader control scripts.

Created and modified database objects like tables, views, Indexes, Synonyms, Sequences and Constraints.

Implemented business logic using stored procedures to increase performance.

Used the plan table, Explain Plan, and TKPROF to tune SQL statements by creating indexes.

Generated session-level traces and used the TKPROF utility to produce reports from the trace files.

Monitored and published database response times and uptime for the different database applications supported.

Created many executable programs running on UNIX operating system and wrote UNIX Shell Scripts for automating the process.

Implemented business logic using ANSI SQL functions and stored procedures with composite data types.

Used exception-handling methods along with RAISE_APPLICATION_ERROR to associate custom names with application-defined exceptions.
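
A minimal sketch of that pattern; the exception name, error code, and message are hypothetical (user-defined codes must fall in the -20000 to -20999 range):

  DECLARE
    e_bad_member EXCEPTION;
    PRAGMA EXCEPTION_INIT(e_bad_member, -20001);  -- bind the name to the code
  BEGIN
    RAISE_APPLICATION_ERROR(-20001, 'Member record failed validation');
  EXCEPTION
    WHEN e_bad_member THEN
      DBMS_OUTPUT.PUT_LINE(SQLERRM);  -- handled under its own name
  END;
  /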

Enhanced the existing UNIX scripts to run batch processes and Oracle background processes using jobs, achieving better performance and throughput.

Managed worksheets and workbooks, using Oracle Discoverer to hide the complexity of underlying database structures such as OLAP cubes, tables, columns, and joins.

Responsible for data migration using the Oracle tools expdp, impdp, and RMAN.

Involved in creating target database users and modules using Oracle Warehouse Builder (OWB).

Created stored procedures and functions in PL/SQL to increase the performance of Forms on the web server.

Used TOAD and Erwin for database design and business process flow.

Environment: Oracle 9i/10g/11g, PL/SQL, SQL*Plus, SQL Developer, SQL*Loader, TOAD, OWB, HTML, ANSI SQL, XML, Windows 2000/2003/XP, FTP, Developer 2000, Export, Import, ETL, Erwin, RMAN, UNIX Sun Solaris (5.10), UNIX shell scripts.


