
Data Warehousing SQL Server

Location:
McKinney, TX
Posted:
February 14, 2024

Resume:

Satya R Kakumanu Email: ad3l6l@r.postjobfree.com

Ph: 703-***-****

Senior Technology Architect

Career Objective and Summary:

Seeking a position that utilizes my skills and abilities in an organization offering professional growth, where I can be resourceful, innovative, and flexible while contributing to the organization's growth.

Professional Summary:

●19+ years of experience in Data Warehousing, with expertise in Informatica Power Center, IICS, Teradata, Oracle 11g, and UNIX shell scripting.

●Experienced in designing and developing efficient ETL solutions to load large volumes of data to/from flat files, XML files, and relational databases such as Teradata, Oracle, and SQL Server.

●Proficient in gathering business requirements and translating them into corresponding technical requirements and strategies.

●Extensive working experience in the analysis, design, development, testing, and implementation phases of various data warehousing applications. Worked in both Waterfall and Agile methodologies, including Scrum.

●Sound knowledge of data warehousing concepts such as OLTP vs. OLAP, dimensional data modeling, E-R modeling, slowly changing dimensions, and database normalization/de-normalization.

●Developed many complex mappings using various transformations such as Expression, Filter, Lookup, Joiner, Router, Aggregator, Stored Procedure, Normalizer, Transaction Control, Custom, HTTP, and XML.

●Good knowledge of Informatica ETL architecture (nodes, domains, and services), with working experience in the Admin Console.

●Well-versed in the Informatica ETL performance tuning process: identifying bottlenecks, analyzing thread statistics, optimizing components, and using parallel partitions.

●Good knowledge of Teradata architecture; proficient in creating database objects such as tables, indexes, views, triggers, macros, and stored procedures (a BTEQ DDL sketch appears after this list).

●Excellent experience with Informatica 9/10 administration, ETL cloud, ETL performance tuning, Oracle Real Application Clusters on 18c/12c/11gR2 and 10gR2 with ASM, and Oracle 9i on heterogeneous operating systems including Windows and Red Hat Linux.

●Extensive hands-on experience in data extraction through SSIS and in reporting with MDX queries through SSAS.

●Extensive hands-on experience in reporting using SSRS, OBIEE, Cognos, and Tableau.

●Experience in query optimization in Teradata, Oracle, and SQL Server.

●Experience in front-end applications using MS .NET and mobile applications using Xcode and Objective-C.

●Excellent technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.

●Good knowledge of key Oracle performance-related features such as the query optimizer, execution plans, and indexes.

●Experience with data flow diagrams, data dictionaries, database normalization techniques, and entity-relationship modeling and design techniques.

●Expertise in client-server application administration using Oracle 18c/12c/11g/10g/9i/8i, SQL*Plus, TOAD, etc.

●Experience in Informatica 10.4.1 administration activities and in migrating Informatica from 10.1.1 to 10.4.1/10.5.x.

●Experience in data migration projects with a key emphasis on performance, using techniques such as query optimization, SQL Advisory, etc.

●Experience with ER diagrams, dimensional data modeling, logical/physical design, and star-schema and snowflake modeling using tools such as Erwin.

●Experience in writing efficient shell scripts to invoke SQL scripts, scheduled via crontab (a wrapper-script sketch appears after this list).

●Excellent communication, interpersonal, and analytical skills, and a strong ability to perform as part of a team.
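
As an illustration of the Teradata object creation noted above, here is a minimal sketch run as a BTEQ heredoc from a shell script; the TDPID, credentials, database, and object names are hypothetical placeholders.

#!/bin/sh
# create_sales_objects.sh -- hypothetical DDL wrapper (logon and names are placeholders)
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;

/* Multiset staging table with an explicit primary index */
CREATE MULTISET TABLE sales_db.orders_stg (
    order_id   INTEGER NOT NULL,
    cust_id    INTEGER,
    order_amt  DECIMAL(12,2)
) PRIMARY INDEX (order_id);

/* View restricting the staging table to positive amounts */
CREATE VIEW sales_db.v_orders_stg AS
    SELECT order_id, cust_id, order_amt
    FROM sales_db.orders_stg
    WHERE order_amt > 0;

/* Macro returning one customer's orders */
CREATE MACRO sales_db.get_cust_orders (p_cust_id INTEGER) AS (
    SELECT order_id, order_amt
    FROM sales_db.orders_stg
    WHERE cust_id = :p_cust_id;
);

.LOGOFF;
EOF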
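
And a minimal sketch of the shell-plus-crontab pattern referenced above; the paths, script name, and e-mail address are hypothetical.

#!/bin/sh
# run_daily_load.sh -- invokes a SQL script via SQL*Plus and logs the output
LOG=/var/log/etl/daily_load_$(date +%Y%m%d).log
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" @/opt/etl/sql/daily_load.sql > "$LOG" 2>&1
if [ $? -ne 0 ]; then
    echo "daily_load failed; see $LOG" | mailx -s "ETL FAILURE" oncall@example.com
fi

# crontab entry (installed with crontab -e) to run daily at 02:30:
# 30 2 * * * /opt/etl/bin/run_daily_load.sh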

Technical Skills:

ETL Tools

Informatica Power Center 10.x/9.x/8.x

DQ Tools

Informatica Data Quality 9.x

Databases

Teradata 15, Oracle 12c, SQL Server 2012

Database Tools

Teradata SQL Assistant, SQL Developer, TOAD, Snowflake

Reporting Tools

SSRS, OBIEE, Cognos, and Tableau

Cloud Technologies

OCI, GCP

Data Modelling Tools

Erwin 7.x

Languages

C#, HTML, XML, SQL, T-SQL, PL/SQL, Shell Scripting

Operating Systems

Windows, UNIX

EDUCATION:

Master of Computer Applications (2002)

Work Experience:

CLIENT: Verizon, Dallas, USA Oct 2009 – Present

Project: Global Pipeline

Built an enterprise pipeline tool to include pending network and non-network orders (installs, changes, disconnects), leveraging data from SAS, PMRA, Cloud, and Professional and Security Services. A new past-due calculation incorporates international pending information from the SAS reporting group.

Extended Pipeline Throughput reporting to include legacy US domestic staging orders and international/US domestic VRD staging orders (PMRA staging), and added the GVP Cat field.

Contributed to the Global Ops 2018 target of reducing the past-due rate from 10% to 5% and achieving revenue realization, supporting the VES Global Ops objective of installing 13 months of revenue in 12 months by 2018.

Allowed Sales, Operations, and Service to manage delivery of ~$30M at any given time, prioritizing work activity based on order value.

Enterprise Dashboard: For VCES data related to small and medium business, created a dashboard application for iPad using Xcode and Objective-C.

The dashboard application offered user-friendly ad hoc reporting across pre-sales and post-sales data, included provisioning delays, and helped identify customers likely to migrate to VRD.

Project: Intelligence Dashboard

Created OBIEE reports for users from VCES, VES, and Wholesale data; one application handling this type of data is PMRA (Performance Metric Reporting Analysis). Used KPIs to identify high-value customers for VRD migration.

Project: Medallia and SPA

Created ETL processes and reports for customer survey, sales, and marketing data across VCES, VES, VBIS, VPS, Wholesale, and all segments, including federal government data.

Project: DSS

DSS/VESBI is part of the wholesale and national market side of the parent organization, Verizon Wireless. Ordering is done for the bandwidth of undersea cables by telecommunication organizations around the world. As its name implies, DSS provides significant support for the decision-making process: it plays a vital role in transforming data into useful information and storing it, which eases decision making. Summary-level and drill-down reporting is produced at regular intervals for senior executives.

Responsibilities:

●Involved in gathering and reviewing business requirements, and in producing specifications, design documents, data models, and the data warehouse design.

●Extracted data from different sources such as MVS data sets, flat files (pipe-delimited or fixed-length), Excel spreadsheets, and databases.

●Responsible for the definition, development, and testing of the processes/programs necessary to extract data from operational databases, transform and cleanse it, and load it into the data warehouse using Informatica Power Center.

●Performed Informatica/Teradata administration, maintaining around 36 nodes and 17 servers.

●Provided roles and user access for Informatica and Teradata applications.

●Created users, user groups, and their access profiles in Repository Manager.

●Created complex mappings in Power Center Designer using Expression, Filter, Sequence Generator, Update Strategy, Joiner, and Stored Procedure transformations.

●Created connected and unconnected Lookup transformations to look up data from the source and target tables.

●Modeled and populated business rules into the repository via mappings for metadata management.

●Involved in data quality, profiling, validation, reference checks, and exception handling using Informatica Data Quality.

●Served as the Informatica point of contact for system issues and was involved in migrations from older to newer versions.

●Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data (a FastLoad sketch appears after this list).

●Performed query optimization with the help of explain plans, collected statistics, and primary and secondary indexes; used volatile tables and derived queries to break complex queries into simpler ones (a BTEQ tuning sketch appears after this list). Streamlined the migration process for Oracle scripts and shell scripts on the UNIX box.

●Worked on the Informatica Power Center tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

●Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

●Worked on different Workflow tasks such as sessions, event-raise, event-wait, e-mail, command, and worklets; scheduled workflows, created sessions, and configured workflows to extract data from various sources, transform it, and load it into the enterprise data warehouse (a pmcmd launch sketch appears after this list).

●Interacted with the source team and business to validate the data.

●Involved in transferring processed files from the mainframe to the target system.

●Wrote SQL scripts to extract data from the database and for testing purposes.

●Provided 24/7 production support to end users.

●Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, including DML and DDL.

●Used Informatica Power Center 10.0.1 to extract, transform, and load data into the Teradata data warehouse from various sources such as Oracle and flat files.

●Involved in analyzing different modules of the Facets system and EDI interfaces to understand the source system and source data.

●Developed and deployed ETL job workflows with reliable error/exception handling and rollback within the MuleSoft framework.

●Created tables, views, secure views, and user-defined functions in the Snowflake cloud data warehouse (a SnowSQL sketch appears after this list).

●Hands-on experience with Snowflake utilities such as SnowSQL and Snowpipe.

●Strong knowledge of migrating other databases to Snowflake.

●Optimized and fine-tuned queries; worked on performance tuning of big data workloads.

●Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.

●Provided on-call support during releases of the product from lower-level to production environments.

●Used Agile methodology for iterative testing.

●Worked with the Control-M scheduling tool for job scheduling.

●Involved in unit testing and user acceptance testing to check that data extracted from different source systems loaded into the target according to user requirements.

●Managed all development and support efforts for the Data Integration/Data Warehouse team.

●Identified bugs and issues and provided solutions for potential problems based on knowledge of the application.
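
To illustrate the FastLoad usage described in this list, here is a minimal sketch of a pipe-delimited file load into an empty staging table; the TDPID, credentials, file path, and object names are hypothetical.

#!/bin/sh
# load_orders.sh -- hypothetical FastLoad wrapper for a pipe-delimited extract
fastload <<'EOF'
.LOGON tdprod/etl_user,etl_password;
BEGIN LOADING stage_db.orders_stg
    ERRORFILES stage_db.orders_err1, stage_db.orders_err2;
SET RECORD VARTEXT "|";
/* With VARTEXT input, every DEFINE field is VARCHAR */
DEFINE order_id  (VARCHAR(18)),
       order_dt  (VARCHAR(10)),
       order_amt (VARCHAR(14))
    FILE = /data/in/orders.dat;
INSERT INTO stage_db.orders_stg (order_id, order_dt, order_amt)
VALUES (:order_id, :order_dt, :order_amt);
END LOADING;
.LOGOFF;
EOF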
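
Similarly, a minimal BTEQ sketch of the volatile-table and collect-statistics tuning pattern mentioned above; all table and column names are hypothetical.

#!/bin/sh
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;

/* Break a complex query into a simpler step via a volatile table */
CREATE VOLATILE TABLE vt_open_orders AS (
    SELECT order_id, cust_id, order_amt
    FROM ods.orders
    WHERE order_status = 'OPEN'
) WITH DATA PRIMARY INDEX (order_id) ON COMMIT PRESERVE ROWS;

/* Refresh optimizer statistics on the filter and join columns */
COLLECT STATISTICS ON ods.orders COLUMN (order_status);
COLLECT STATISTICS ON ods.orders COLUMN (cust_id);

/* Inspect the plan before running the real query */
EXPLAIN
SELECT c.cust_name, SUM(o.order_amt)
FROM vt_open_orders o
JOIN dim_db.customer c ON c.cust_id = o.cust_id
GROUP BY 1;

.LOGOFF;
EOF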
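
A minimal sketch of launching one of these workflows from the command line with pmcmd, as a scheduler such as Control-M would; the service, domain, folder, and workflow names are hypothetical.

#!/bin/sh
# start_wf_orders.sh -- hypothetical wrapper a scheduler can call
pmcmd startworkflow \
    -sv INT_SVC_PROD \
    -d DOM_PROD \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f FOLDER_ORDERS \
    -wait wf_load_orders
# a non-zero exit status signals workflow failure to the scheduler
exit $?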
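
Finally, a minimal SnowSQL sketch of the Snowflake object creation mentioned above; the connection name, database, stage, and file format are hypothetical.

#!/bin/sh
# snowsql -c reads a named connection from ~/.snowsql/config
snowsql -c etl_conn -q "
CREATE TABLE analytics.public.orders (
    order_id  NUMBER,
    order_dt  DATE,
    order_amt NUMBER(12,2)
);

CREATE SECURE VIEW analytics.public.v_orders AS
    SELECT order_id, order_dt, order_amt
    FROM analytics.public.orders
    WHERE order_amt > 0;

-- Snowpipe: auto-ingest files landing in an external stage
CREATE PIPE analytics.public.orders_pipe AUTO_INGEST = TRUE AS
COPY INTO analytics.public.orders
FROM @analytics.public.orders_stage
FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = '|');
"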

Environment: Informatica Power Center 10.x/9.x/8.x, Oracle 11g/19c, Teradata 13/14/15, SQL Server 2012, MS .NET, Shell Scripting, Xcode, Objective-C, OBIEE, Cognos, SSIS, SSRS, SSAS

CLIENT: LG&E, Louisville, KY, USA Jan 2008 – Oct 2009

Roles and Responsibilities:

●Involved in gathering and reviewing business requirements, and in producing specifications, design documents, data models, and the data warehouse design.

●Used C#.NET for Windows applications, creating reusable libraries.

●Created SSRS reports for user-friendly dashboard applications.

●Used SSIS for ETL transformations.

●Used MDX queries through SSAS.

●Investigated failed jobs and wrote SQL to debug data load issues in production.

●Wrote SQL scripts to extract data from the database and for testing purposes.

●Interacted with the source team and business to validate the data.

●Involved in transferring processed files from the mainframe to the target system.

●Executed test scripts, including prerequisites, detailed instructions, and anticipated results.

●Executed back-end data-driven test cases.

●Identified bugs and issues and provided solutions for potential problems based on knowledge of the application.

●Provided 24/7 production support.

Environment: Oracle 11g, SQL Server 2008, MS .NET, Shell Scripting, UNIX, SSIS, SSRS, SSAS

CLIENT: Innovant VISA, Portland, ME, USA Nov 2006 – Jan 2008

Roles and Responsibilities:

●Involved in gathering and reviewing business requirements, and in producing specifications, design documents, data models, and the data warehouse design.

●Used C#.NET for Windows applications, creating reusable libraries.

●Created SSRS reports for user-friendly dashboard applications.

●Used LexisNexis libraries for the credit card points and approval process.

●Built Windows applications for customer verification and validation through LexisNexis.

●Calculated credit card points based on spending and handled the approval process for customers.

●Used SSIS for ETL transformations.

●Investigated failed jobs and wrote SQL to debug data load issues in production.

●Wrote SQL scripts to extract data from the database and for testing purposes.

●Interacted with the source team and business to validate the data.

●Involved in transferring processed files from the mainframe to the target system.

●Executed test scripts, including prerequisites, detailed instructions, and anticipated results.

●Executed back-end data-driven test cases.

●Identified bugs and issues and provided solutions for potential problems based on knowledge of the application.

●Provided 24/7 production support.

Environment: Oracle 11g, SQL Server 2008, MS .NET, Shell Scripting, UNIX, SSIS, SSRS, LexisNexis

IGATE India

CLIENT: State Farm, USA Nov 2002 – Oct 2006

Roles and Responsibilities:

●Used C#.NET for web applications, creating classes.

●Created SSRS reports for user-friendly dashboard applications.

●Used SSIS for ETL transformations.

●Investigated failed jobs and wrote SQL to debug data load issues in production.

●Wrote SQL scripts to extract data from the database and for testing purposes.

●Interacted with the source team and business to validate the data.

●Executed test scripts, including prerequisites, detailed instructions, and anticipated results.

●Executed back-end data-driven test cases.

●Identified bugs and issues and provided solutions for potential problems based on knowledge of the application.

●Provided support after code migration to post-production.

Environment: SQL Server 2008, MS .NET, Shell Scripting, UNIX, SSIS, SSRS.


