
Data Sql Server

Location:
Charlotte, NC
Posted:
February 12, 2018

Contact this candidate

Resume:

484-***-****

ac4f8t@r.postjobfree.com

MANIKANTH

PROFESSIONAL SUMMARY

• Experienced in the analysis, design, development, and implementation of Data Warehousing, Data Mart, and Data Integration solutions using ETL and OLAP tools such as Informatica PowerCenter and SSIS packages in the Financial, Healthcare, and Telecom domains.

• Involved in all phases of the SDLC/Data Warehouse life cycle, including business reporting requirements gathering, source system analysis, logical/physical data modeling, ETL design and development, code deployment, and production support.

• Designed and developed Business Intelligence solutions using Informatica PowerCenter and SQL Server Management Studio with SSIS (SQL Server Integration Services).

• Expertise in working with ETL tools like Informatica Power Center Client tools - Designer, Repository Manager, Workflow Manager, Workflow Monitor, and SQL Server SSIS Packages.

• Developed and deployed various mappings using transformations such as Expression, Filter, Router, Joiner, Aggregator, Sorter, Lookup, Update Strategy, XML Generator, and XML Parser.

• Proficiency in Oracle, Teradata, Teradata Utilities, BTEQ, TPT, SQL, PL/SQL, and use of Teradata SQL Assistant, MS Access, MS Excel on UNIX and Windows platforms.

• Populated data warehouse tables using SQL*Loader, TPump, MultiLoad, FastLoad, Oracle packages, stored procedures, stored functions, and cursors.

• Experienced in writing Informatica mapping specification documentation; proficient in creating and scheduling workflows and automating ETL processes with scheduling tools such as Autosys.

• Performed performance tuning, identifying and resolving bottlenecks at various levels of Business Intelligence applications.

• Optimized database querying and data manipulation using SQL and PL/SQL in Oracle and other databases, as well as against flat files.

• Integrated data from various sources such as Oracle, SQL Server, Teradata, XML files, and flat files into the staging area.

• Implemented Slowly Changing Dimensions (Type I and Type II) to maintain historical data.

• Skilled in writing UNIX shell scripts and automating ETL processes with shell scripting.

• Applied the Mapping Tuning Techniques such as Pipeline Partitioning to speed up data processing.
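The Type 2 SCD pattern listed above is commonly expressed as an expire-then-insert pair of statements. The sketch below uses hypothetical table, column, and sequence names (dim_customer, stg_customer, dim_customer_seq) purely for illustration; it is not taken from any project described here.

```sql
-- Hypothetical Type 2 SCD load (Oracle syntax):
-- step 1 expires the current row when a tracked attribute changes,
-- step 2 inserts the new version as the active row.
UPDATE dim_customer d
SET    d.curr_flag  = 'N',
       d.eff_end_dt = TRUNC(SYSDATE) - 1
WHERE  d.curr_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.cust_id = d.cust_id
               AND    s.address <> d.address);

INSERT INTO dim_customer
       (cust_key, cust_id, address, eff_start_dt, eff_end_dt, curr_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.address,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE  d.cust_id   = s.cust_id
                   AND    d.curr_flag = 'Y'
                   AND    d.address   = s.address);
```

A Type 1 load would instead overwrite the changed attribute in place with a single UPDATE, keeping no history.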

WORK EXPERIENCE

Application/ETL Developer Charlotte NC

TEK Systems (Regulatory Reporting Team at Bank of America) August 2017 – Present

• Collaborated with the business team to gather requirements for Centralized Regulatory Reporting schedules in the form of stories as part of project planning, and consolidated them into the documentation needed by the ETL developers.

• Coordinated with the core development team, business analysts, and DBAs to architect the data model and to design and implement the schemas according to business needs.

• Proposed designs for various Reporting Schedules of Bank of America and implemented necessary ETL mappings.

• Reconciled the Excel mapping documents for implementation after analyzing the business specification documents.

• Negotiated with the business team and business analysts to ensure each specification received was an appropriate work item for the scheduled delivery.

• Built various extract, staging-to-lookup, and staging-to-target workflows for different reporting schedules with Informatica PowerCenter 10.1 based on the business specifications.

• Computed and maintained the aggregated data in the target schemas required by the AxiomSL reporting tool.

• Proposed the use of partitions and sub-partitions on the schema tables for quick data retrieval and access.

• Used pre- and post-session SQL in workflow sessions to keep the reporting data well maintained.

• Used VersionOne to track individual workloads and responsibilities across the scheduled Agile sprints.

• Customized Autosys JIL scripts to schedule data load activities and other Informatica workflows according to Service Level Agreements (SLAs).

• Performed unit and integration testing of the developed workflows before they were promoted to higher environments.

• Packaged Informatica workflows for migration to higher environments (QA, UAT, and PROD) on schedule.

• Investigated and fixed issues and defects in previously developed jobs using session logs and workflow logs.
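An Autosys JIL definition of the kind referred to above might look like the following fragment. The job, machine, owner, and script names are illustrative placeholders, not actual Bank of America jobs.

```
/* Hypothetical Autosys CMD job: runs a workflow launcher script
   after an upstream file-watcher job succeeds */
insert_job: wf_stg_load_cmd   job_type: CMD
machine: etl_host_dev
owner: infa_batch
command: /opt/scripts/run_workflow.sh FIN_RPT wf_stg_load
condition: s(src_file_watcher)
start_times: "02:00"
alarm_if_fail: 1
std_out_file: /var/log/autosys/wf_stg_load.out
std_err_file: /var/log/autosys/wf_stg_load.err
```

The `condition` attribute is what chains jobs into an SLA-driven schedule: the command job only fires once its dependency has reached SUCCESS.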

ETL/Informatica Developer Charlotte NC

Modis Inc. (Financial Stability Board Team at Bank of America) August 2016 – August 2017

• Gathered business requirements from the FSB Business Team, analyzed and transformed them to the mapping documents.

• Proposed the design of, and developed, the reconciliation module for Bank of America's balance sheets, general ledger, and regulatory reporting data.

• Developed ETL (Extract, Transform, and Load) mappings to extract data representing Bank of America's balance-sheet positions from multiple source systems, such as Oracle, SQL Server, flat files, COBOL files, and Teradata, and loaded it into Oracle.

• Developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic.

• Performed debugging of Informatica mappings, testing of stored procedures and functions, and performance and unit testing of Informatica sessions, batches, and target data.

• Migrated code from Dev to QA, Dev to UAT, and UAT to Prod environments, and wrote Team-Based Development technical documents for the smooth handover of the project.

• Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.

• Extensively used SQL*Loader to load data from flat files into Oracle database tables.

• Simplified data extraction from various sources such as Oracle 12c/11g/10g, DB2, and complex flat files.

• Reduced the number of Informatica sessions and workflows by combining them into single units based on derivation dependencies.

• Performed Tuning of mappings in Informatica to reduce the load on the server.

• Built several new table structures and modified existing tables to fit the existing data model.

• Used mapplets, reusable transformations, and source and target definitions in mappings with Informatica 9.6.1.

• Scheduled workflows using the Autosys job scheduler, pmcmd, pmrep, and UNIX shell scripts.
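Workflow launches of the kind scheduled above are typically wrapped in a small shell script that writes a session parameter file and then calls pmcmd. The sketch below is illustrative only: the integration service, domain, folder, and workflow names are hypothetical, and pmcmd is invoked only if the Informatica client is actually on the PATH.

```shell
#!/bin/sh
# Illustrative wrapper: build a parameter file, then start the workflow
# with pmcmd. All Informatica object names are placeholders.

LOAD_DATE=$(date +%Y%m%d)
PARAM_FILE="/tmp/wf_stg_load_${LOAD_DATE}.par"

# Parameter file the session reads at start-up ($$ names are mapping parameters)
cat > "$PARAM_FILE" <<EOF
[Global]
\$\$LOAD_DATE=$LOAD_DATE
\$\$SRC_DIR=/data/inbound
EOF

CMD="pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV -f FIN_RPT -paramfile $PARAM_FILE -wait wf_stg_load"

# Only call pmcmd when the Informatica client is installed; otherwise dry-run
if command -v pmcmd >/dev/null 2>&1; then
    $CMD
else
    echo "DRY RUN: $CMD"
fi
```

The `-wait` flag makes pmcmd block until the workflow finishes, so the script's exit status can drive downstream Autosys dependencies.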

Functional Analyst/ETL Developer Baltimore MD

Central Collection Unit, Department of Budget and Management June 2014 – August 2016

• Assisted developers with business analysis and requirements gathering from high-end users and stakeholders (MVA, the Department of Education, and 400 other state agencies).

• Oversaw the setup of server and database connections in different environments.

• Analyzed and created Facts, Dimension tables, Look-up, and Referential Tables in DEV environment.

• Developed mapping documents and test scripts from the requirements needed to validate the workflows.

• Worked with DBAs on the creation of tables, indexes, triggers, stored procedures, and partitions for the identified data model.

• Created various ETL mappings using different transformations and business logic in Actian DataConnect.

• Extracted and loaded large amounts of test data from XML files, Flat files, Oracle, Netezza and DB2 using Actian.

• Introduced reusable transformations, sessions, mapplets and worklets to simplify ETL processes among different applications.

• Explained the use of Variables and Parameters in the mappings to pass the values between mappings and sessions.

• Performed impact analysis using Metadata Manager before making the changes to the existing mappings.

• Involved in debugging mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Sessions, Batches, and Target Data.

• Worked on data integrity check, validation, exceptions and exception reprocessing for a full refresh and incremental load.

• Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.

• Supported the production environment, resolving issues and completing routine processes.

• Coordinated with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.

• Engaged with the reporting team on business improvements using the BusinessObjects BI interface.

ETL/SQL Developer Baltimore MD

Ciena February 2013 – May 2014

• Actively analyzed Mapping documents and Design process for various Sources and Targets in coordination with Business Analysts and Data Modelers.

• Designed mappings/workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Sessions in the workflow manager.

• Developed mappings to load data to the staging tables and then to Dimensions and Facts.

• Used transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, and Update Strategy to implement business logic in mappings.

• Used Metadata Manager to manage the metadata associated with the ETL processes.

• Configured static and dynamic memory caches to improve the throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.

• Generated FTP scripts and extracted unstructured data from various formats, including XML and flat files, using Perl/shell scripts.

• Implemented Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.

• Used the Teradata utilities FastLoad, MultiLoad, and TPT to load data.

• Scheduled loading of data into Teradata from legacy systems and flat files using complex scripts.

• Responsible for monitoring all the sessions that are running, scheduled, completed, and failed.

• Developed, executed Test Plans and Test Cases to validate and check Referential integrity of data extracts before loading it to Data Warehouse.

• Extensively worked on SQL tuning process to increase the source qualifier throughput by analyzing the queries with explain plan, creating new indexes, partitions, and Materialized views.

• Involved in jobs scheduling, monitoring and production support in a 24/7 environment.

• Monitored repository & assigned grants to users to access the repository.
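Bulk loads with the Teradata utilities named above are driven by short scripts. The FastLoad fragment below is a hedged sketch: the TDPID, credentials, database, table, and file names are placeholders, not details from the engagement described here.

```
/* Illustrative FastLoad script: bulk-load a pipe-delimited flat file
   into an empty staging table. All names are placeholders. */
LOGON tdprod/etl_user,password;
DATABASE stg_db;
BEGIN LOADING stg_db.stg_orders
      ERRORFILES stg_db.stg_orders_e1, stg_db.stg_orders_e2;
SET RECORD VARTEXT "|";
DEFINE order_id  (VARCHAR(10)),
       cust_id   (VARCHAR(10)),
       order_amt (VARCHAR(15))
FILE = /data/inbound/orders.dat;
INSERT INTO stg_db.stg_orders (order_id, cust_id, order_amt)
VALUES (:order_id, :cust_id, :order_amt);
END LOADING;
LOGOFF;
```

FastLoad requires an empty target table and two error tables; for incremental loads into populated tables, MultiLoad or a TPT UPDATE operator is used instead.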

System Engineer - ACE GVPM TATA TELECOM Hyderabad, India

Tata Consultancy Services June 2010 – December 2012

• Involved in analyzing the client requirements and developing rules sheet for the application.

• Interpreted the Business requirements, Technical Specification, Test plans and Test Case scenarios.

• Integrated rule configurations to Java code in IBM BPM V 8.0 application.

• Worked on SFDC to create orders that could be invoked through the BPM application and processed using MetaSolv.

• Actively involved in implementing commercial and technical vetting scenarios for each order through the BPM application.

• Fixed and tracked implementation issues using Bugzilla (issue tracker).

• Developed SQL Server SSIS ETL packages to merge and transform Oracle 11g data into a data mart.

• Automated numerous accounting processes utilizing SQL Server SSRS reports, SQL Server Stored Procedures, SQL Server SSIS Packages.

• Experienced in extracting, transforming, and loading (ETL) data from Excel, flat files, and Oracle to MS SQL Server using DTS and SSIS.

• Created views and stored procedures in T-SQL to support reports.

• Experienced with MDM, SoapUI (XML), Salesforce, and Liferay applications.

• Conducted knowledge-transfer sessions for junior team members.
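Report-facing views and stored procedures of the kind described above can be sketched in T-SQL. The object and column names (dbo.orders, vw_monthly_revenue, usp_monthly_revenue) are hypothetical illustrations, not objects from the project.

```sql
-- Hypothetical T-SQL view and stored procedure backing an SSRS report
CREATE VIEW dbo.vw_monthly_revenue AS
SELECT  YEAR(order_dt)  AS order_year,
        MONTH(order_dt) AS order_month,
        SUM(order_amt)  AS total_revenue
FROM    dbo.orders
GROUP BY YEAR(order_dt), MONTH(order_dt);
GO

CREATE PROCEDURE dbo.usp_monthly_revenue
    @year INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT order_year, order_month, total_revenue
    FROM   dbo.vw_monthly_revenue
    WHERE  order_year = @year
    ORDER BY order_month;
END;
GO
```

Putting the aggregation in a view and parameterizing only the filter in the procedure keeps the report query simple and lets the same aggregate be reused by other reports.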

TECHNICAL SKILLS

Programming: C, C#, JAVA, PL/SQL, .NET, UNIX, PHP.

Web Technologies: HTML, CSS, JavaScript, XML, WordPress.

IDEs: SQL Developer, Visual Studio, SQL Management Studio, Toad for Oracle, SQL Assistant.

Tools: Informatica 8.x/9.x/10.x, SSIS, SSRS, Tableau, SAS Enterprise, Autosys.

Databases: Oracle, MySQL, Microsoft SQL Server, Teradata.

Certifications: Oracle SQL Fundamentals I, Java SE 6 Programmer Certified.

EDUCATION

Bachelor of Technology in Computer Science at GITAM University, India.


