Lead Informatica Developer, Unix, Teradata, Oracle, Autosys

Location:
Redmond, WA
Salary:
$110K per annum
Posted:
April 01, 2020

Contact this candidate

Resume:

Swaroop Kumar Nookala

adckx2@r.postjobfree.com

206-***-****

Seattle, WA

Senior ETL Developer with 9+ years of experience in Data Warehousing and Analytics. Looking for a role in a challenging, fast-paced environment that makes the best use of my technical and analytical skills, excellent interpersonal skills and strong work ethic.

Professional Summary

Over 9 years of experience in Business Intelligence, Data Modeling and Data Warehousing across multiple domains, in roles including Project Lead, Lead ETL (Informatica) Developer, Teradata/SQL Developer and Informatica Admin.

4 years of leadership experience in architectural design, technical decision-making and solution estimation. Architected and developed solutions for highly scalable, highly available systems.

Proficient in understanding business processes/requirements and translating them into technical requirements.

Expertise in communicating with cross-functional teams and across organizational levels to drive a shared vision and to foster a culture of excellence.

Strong presentation skills, with the ability to present very technical concepts to non-technical teams.

Participated in technology leadership meetings to provide feedback and recommendations to other leaders.

Expertise in scoping projects and creating agile release plans.

Spearheaded initiatives such as organizing knowledge sessions and creating knowledge repositories, reference checklists and templates to share knowledge and mentor junior developers.

Professional experience using advanced SQL on various platforms, such as Oracle, Teradata and SQL Server, for data mining and report generation.

Solid background in using Teradata utilities such as MultiLoad, BTEQ and FastExport to load, import and export data.

Exposure to the entire process of Software Development Life Cycle implementation.

Extensive expertise in creation and automation of workflows using Autosys.

Managed onsite-offshore model and coordinated with SMEs and project managers.

Provided technical oversight and guidance during client engagement execution.

Worked with the Production Support team during month-end cycles to troubleshoot failures, provide workarounds and perform RCA for the bugs identified during the batch processing.

Worked with the Production Support Team during the Production Release cycles and provided support for the release.

Proficient in troubleshooting and fixing performance bottlenecks.

Supported Informatica admin activities in all environments and migrated code from one environment to another.
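The BTEQ work mentioned above is typically driven from a UNIX shell wrapper that generates a script and submits it to the utility. A minimal sketch follows; the host alias, user, password placeholder and table names are hypothetical, not taken from any real project:

```shell
#!/bin/sh
# Sketch of a UNIX wrapper that generates a BTEQ script for a
# stage-to-fact load. Host, credentials and table names are
# hypothetical placeholders.
SCRIPT_FILE="/tmp/load_sales.bteq"

cat > "$SCRIPT_FILE" <<'EOF'
.LOGON tdprod/etl_user,xxxxxxxx;
.SET ERROROUT STDOUT;

INSERT INTO dw.sales_fact (sale_id, sale_dt, amount)
SELECT sale_id, sale_dt, amount
FROM   stg.sales_stage;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# In a real environment the generated script would be submitted with:
#   bteq < "$SCRIPT_FILE" > /tmp/load_sales.log 2>&1
echo "Generated $SCRIPT_FILE"
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets a scheduler such as AutoSys detect a failed load from the wrapper's exit code.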

Education

Master's in Product Design Engineering from Anna University, India — 2008-2010

Bachelor of Technology in Industrial and Production Engineering from Sri Venkateswara University, India — 2003-2007

Technical Proficiencies and Skills

ETL: Informatica Developer & Admin

Programming: Unix shell scripting, SQL, PL/SQL

Reporting: OBIEE

Cloud Technologies: Amazon AWS

Databases: Oracle, Teradata, SQL Server

Job scheduling: AutoSys, CA Workload Automation

Modeling Tools: MS Visio 2010/2013, Erwin

Version control: SVN

Project Tracking: Jira

Ticketing tools: ServiceNow

Certifications

AWS Certified Solutions Architect - Associate

BNFS Capital Markets certification

Trainings

Attended training on Banking & Capital Market Fundamentals

Attended training in Power BI and Amazon AWS

Awards

Awarded the ‘Brillian Star Award’ by Brillio for the successful implementation of the TFM module.

Awarded the ‘Super Team Award’ by Brillio in recognition of contributions to the Ameren Aspire Team.

Awarded the ‘Client Winner Award’ by Nomura for exceptional client engagement.

Project Experience

The details of the various assignments that I have handled are listed below:

Company : Brillio, Bangalore, India.

Project : Ameren – EIM (Enterprise Information Management)

Customer : Ameren Corporation.

Role : Data Specialist

Period : Jul 2016 – present

Overview:

Ameren Corporation is a Fortune 500 company that trades on the New York Stock Exchange under the symbol AEE. It is the parent company of Ameren Illinois, based in Collinsville, Ill., and Ameren Missouri in St. Louis. Ameren Transmission Company, also based in St. Louis, designs and builds regional transmission projects.

Under EIM, I dealt with different project modules listed below:

1. Aspire Program: The goal of this project was to replace the work management systems for the MO & IL substation and transmission groups. This included eliminating the work management functionality in EMPRV, which was replaced by Maximo.

Replacing EMPRV's work modules was technically very complex because those modules integrate with many other systems within Ameren (e.g. TRIS, Oracle, Power Plan). However, the project was aided by technology developed in the Click Mobile Touch project, including integrations already built into PeopleSoft and Epoch.

2. CDW: The goal of this project was to build the semantic layer accommodating data from multiple sources, used to develop both pre-defined and ad hoc reports. The project also implemented role-based security, providing the ability to limit user access to segment-specific data, and enabled the retirement of the Microsoft SQL Server-based data marts for the Customer Data Warehouse (CDW) and Weather.

3. Trade Floor & Marketing: Trade Floor & Marketing (TFM) is part of Enterprise Information Management (EIM). TFM consisted of migrating SSIS packages to Informatica and replacing the SQL Server database with Teradata. Informatica was used to integrate data from various systems and load it into Teradata; OBIEE was used for reporting.

Roles & Responsibilities:

Periodically met with the client team; gathered, documented and analyzed business requirements. Developed detailed design documents and mapping documents.

Scoped the project and created an agile release plan

Created data models using Erwin

Developed Data Standardization guidelines to ensure uniform enterprise naming convention

Developed Code review checklists to ensure consistency across the codebase and to reduce technical debt

Designed and developed various metrics to support ad hoc reports.

Implemented various techniques to improve performance at the Informatica level, including pushdown optimization and the Teradata Parallel Transporter (TPT).

Wrote multiple BTEQ scripts to load data from stage tables to fact tables.

Worked effectively in an onsite-offshore model; delivered two modules by leading an offshore team of more than six developers and coordinating with onshore members.

Designed the project's dataflow layout in AutoSys so that all jobs run smoothly.

Involved in and hosted numerous review meetings with project teams and developers to report, demonstrate, prioritize and suggest resolution of issues discovered during testing.

Designed and developed reusable mapplets for each rule.

Supported Informatica admin activities in all environments and migrated code from one environment to another.

Used Informatica PowerCenter tools such as Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor and Repository Manager.

Worked on Slowly Changing Dimensions.

Extensive experience in debugging, troubleshooting, monitoring and performance tuning of SQL queries.

Provided production support for Informatica PowerCenter and UNIX/Teradata.

Prepared test plan documents, test cases and test scripts for Development & Migration projects and tested accordingly.

Automated Informatica administrative tasks such as purging older object versions and older session/workflow logs.

Developed and enhanced custom UNIX scripts for source file validations.
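The source-file validation work above can be sketched as a small POSIX shell function. The feed name and the trailer layout (`T|<count>` as the last record) are assumptions for illustration only:

```shell
#!/bin/sh
# Minimal source-file validation sketch: checks that a feed file exists,
# is non-empty, and that the data row count matches its trailer record.
# The file name and "T|<count>" trailer layout are hypothetical.

validate_feed() {
    feed="$1"
    [ -s "$feed" ] || { echo "FAIL: $feed missing or empty"; return 1; }

    expected=$(tail -1 "$feed" | cut -d'|' -f2)   # trailer: T|<count>
    actual=$(( $(wc -l < "$feed") - 1 ))          # data rows, excl. trailer

    if [ "$actual" -eq "$expected" ]; then
        echo "PASS: $feed ($actual rows)"
    else
        echo "FAIL: $feed expected $expected rows, found $actual"
        return 1
    fi
}

# Demo with a synthetic two-row feed plus trailer record
printf '1|2020-01-01|100\n2|2020-01-02|250\nT|2\n' > /tmp/sales_feed.dat
validate_feed /tmp/sales_feed.dat
# prints: PASS: /tmp/sales_feed.dat (2 rows)
```

A wrapper like this returns a non-zero exit code on failure, so an upstream scheduler can halt the batch before any load step runs against a bad file.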

Solution Environment:

Teradata, SQL Server, Sybase; Informatica: 9.5.1, 10.1.1; AutoSys; UNIX

Company : Syntel, Mumbai, India.

Project : Nomura - Global Finance Architecture

Customer : Nomura Securities Co. Ltd.

Role : Project Lead

Period : Dec 2010 – Jul 2016

Overview of GFA Project:

Global Finance Architecture (GFA) is a global program to provide Nomura’s Finance department with a scalable software infrastructure to support global finance business processes. Finance Rules Engine (FRE) is a centralized accounting rules engine that converts trading, settlement and other business events into accounting entries defined by global controllers for GFA’s ledger environment. Informatica was used as the integration platform for performing validations and eventually sending journal-line-level data to PeopleSoft and then to OBIEE for reporting.

Under GFA, I dealt with different project modules:

1. NDM: The goal of this project was to build a reporting solution based on the Trading Sub Ledger (TSL) instance of PeopleSoft to address the key risks and disruptors identified in the legacy solution. One of the key focuses was to deliver a data model for tag key-value-pair attributes, as well as other reference data extracted from upstream systems.

Nomura uses Qubix for its ERP and data management needs; the reporting solution delivered by this initiative had to integrate with Qubix and act as a best-practice reference model for other analytical and reporting applications at Nomura, following Kimball’s data warehousing methodology.

2. JT Back: The Journal Translator (JT) work stream is part of the GFA program and maps journals generated by FRE for delivery to TSL. The purpose of the journal translator is to update each journal’s nominal with the corresponding Segknow nominal mapping so that TSL journals can be sent for the products in scope.

TSL nominal mappings are not in a 1-to-1 relationship with Segknow; a combination of Account, Sub-account, Product, Book, CCY, etc. on TSL is required to map to a single account in Segknow. To reconcile TSL and Segknow, we needed to capture both the mapping information from JT and the logic used in the ledger reconciliation tool.

Overview of MIS Project:

The project objective was to provide a robust, scalable reporting platform for GL data coming from the AEJ / Japan / US / EMEA regions. The Essbase cube, a management reporting tool, was used by Finance to report the Firm’s Management, Segment and Financial results. The project aimed at consolidating Profit & Loss and Headcount data from the five regional general ledgers to enable global reporting by business, using the consolidated book and nominal structures, a.k.a. the GBS Book and GBS Nominal structures respectively.

Under MIS, I dealt with different project modules:

1. MIS_BAEJ Migration to MISGlobal Cube: The goal of this project was to reverse engineer the existing flow of financial data from the AEJ offices to AEJ Dodge (and associated systems) and eventually migrate it to EMEA Dodge (and associated systems). After the local AEJ corporate decision to move Execution Services out to Instinet, the Finance & Technology teams reviewed options for the Operations & Finance technology platforms hosting the residual business on legal entities within Hong Kong, Singapore and the regional offices grouped under AEJ, looking for platforms that were more cost-effective to run given the reduced transaction volume.

2. Automated Monthly Instinet Feed: In the legacy process, an MIS Finance user received the Instinet P&L TB data once a month by email. The user manually transformed the Instinet data into valid Essbase fact data, which was entered into the MGC_Inp cube using Excel SmartView’s submit-data function. MGC_Inp data was then loaded to MISGlobal and rebuilt through an AutoSys batch.

In the new process developed in this module, we automated the Instinet data load from the US Instinet system to the Nomura monthly Essbase MISGlobal cube.

Roles & Responsibilities:

Analyzed the specifications provided by the clients and mapped them to technical requirements.

Worked with the Architecture and Informatica Admin teams to set up the required databases and Informatica environments.

Developed and enhanced multiple custom UNIX scripts.

Created AutoSys scripts for job scheduling; designed the batch for one project.

Created functional specifications and testing documentation.

Coordinated and worked on Monthly releases to migrate code from multiple environments into Production.

Performed unit testing, identified gaps between the functional requirements and the mappings, and fixed bugs where needed. Worked closely with the business to provide demos and supported the POC mappings during UAT.

Documented the known issues and roadblocks for development of the actual mappings after the POC mappings had been approved.

Supported the batch in SIT, UAT and regression testing, as well as during production parallel runs and the post-go-live warranty period.

Implemented a global parameter file to simplify the use of parameters across environments.

Created Folders/Connection Strings using Informatica PowerCenter tools

Performed the code reviews
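The global parameter file approach described above is commonly paired with a small wrapper that resolves the environment before the workflow run. A sketch follows; the directory, parameter names, and the commented pmcmd invocation are all hypothetical placeholders:

```shell
#!/bin/sh
# Sketch: resolve an environment-specific Informatica parameter file
# before a workflow run. Paths, parameter names, and the example pmcmd
# call are hypothetical, not taken from any real project.
ENV_NAME="${1:-DEV}"              # DEV / SIT / UAT / PROD
PARAM_DIR="/tmp/infa_params"      # would be a shared path in practice

mkdir -p "$PARAM_DIR"
PARAM_FILE="$PARAM_DIR/global_${ENV_NAME}.par"

# One global file per environment keeps the mappings identical across
# DEV/SIT/UAT/PROD; only the parameter file changes at run time.
cat > "$PARAM_FILE" <<EOF
[Global]
\$\$ENV_NAME=$ENV_NAME
\$\$SRC_DB=ORA_${ENV_NAME}
\$\$TGT_DB=TD_${ENV_NAME}
EOF

echo "Using parameter file: $PARAM_FILE"
# A real run would then pass the file to the workflow, e.g.:
#   pmcmd startworkflow -sv INT_SVC -d DOMAIN -f FOLDER \
#         -paramfile "$PARAM_FILE" wf_load_gl
```

Because the environment name is the only input, the same wrapper migrates unchanged from one environment to another, which is the point of centralizing parameters this way.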

Solution Environment:

DBArtisan, Oracle (10g, 11g), SQL Server; Informatica: 8.6, 9.1.6; AutoSys; UNIX


