Senior Developer MS SQL

Location:
Trenton, NJ, 08629
Posted:
October 12, 2023

Gopikrishna Potineni Email: ad0b3h@r.postjobfree.com

Contact: +1-661*******

PROFESSIONAL SUMMARY

12+ years of IT experience in the analysis, design, and development of DWH and Business Intelligence applications using Informatica PowerCenter, IICS, Tableau, Teradata, Snowflake, SnowSQL, Oracle, MS SQL Server, Azure, AWS, and PySpark.

Worked at the client location in Munich, Germany, as an onsite coordinator.

Worked in various roles including Support Engineer, Developer, Senior Developer, Team Lead, Technical Lead, and Consultant.

Experience working on Development, Support, Migration, and Product Development projects.

Have experience in gathering customer requirements.

Good experience in analyzing existing systems and providing estimates for tasks.

Have experience in defining and reviewing the templates and artifacts developed by the team.

Have experience in identifying and bridging the gaps between business and technical design documents.

Have experience in designing and developing complete BI and ETL strategy.

Experience with Agile methodologies and tools such as Jira, Azure DevOps, Confluence, HP ALM, and Xray.

Expertise in code reviews and preparing unit test cases.

Experience preparing solution design documents, technical design documents, data mapping documents, unit test documents, deployment documents, and release documents.

Developed DWH applications using Informatica PowerCenter and IICS.

4.5 years of experience in IICS (Informatica Intelligent Cloud Services).

9 years of experience in Informatica PowerCenter.

Designed, developed, and implemented ETL processes using Informatica PowerCenter and IICS.

Created mappings, mapplets, mapping tasks/sessions, and task flows/workflows using different transformations and tasks in IICS.

Extracted data from different sources such as Salesforce, Oracle, SQL Server, flat files, and XML files; loaded the data into Oracle, Snowflake, Teradata, and SQL Server; and generated output files.

Worked with transformations such as Expression, Lookup, Filter, Aggregator, Source Qualifier, Joiner, Stored Procedure, Router, SQL, Sequence, Normalizer, Union, XML, Sorter, and Rank, as per requirements.

Experience in performance tuning; implemented partitioning and pushdown optimization.

Experience implementing complex mappings and task flows; worked on SCD Type 1 and Type 2 mappings.
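
The SCD Type 2 pattern mentioned above can be sketched in plain SQL. This is a minimal illustration using SQLite from Python; the table and column names are hypothetical, and a real Informatica implementation would typically use a mapping or a database MERGE rather than hand-written statements:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical dimension table with SCD Type 2 tracking columns.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_from    TEXT,
        eff_to      TEXT,
        is_current  INTEGER
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Trenton', '2023-01-01', '9999-12-31', 1)")

def apply_scd2(cur, customer_id, new_city, load_date):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    cur.execute("""
        UPDATE dim_customer
        SET eff_to = ?, is_current = 0
        WHERE customer_id = ? AND is_current = 1 AND city <> ?
    """, (load_date, customer_id, new_city))
    if cur.rowcount:  # only add a new version when something actually changed
        cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
                    (customer_id, new_city, load_date))

apply_scd2(cur, 1, 'Princeton', '2023-06-01')   # city changed -> old row expired, new row added
apply_scd2(cur, 1, 'Princeton', '2023-07-01')   # no change -> no new row

rows = cur.execute("SELECT city, is_current FROM dim_customer ORDER BY eff_from").fetchall()
print(rows)  # [('Trenton', 0), ('Princeton', 1)]
```

History is preserved (the Trenton row survives with is_current = 0), which is exactly what distinguishes Type 2 from the overwrite-in-place Type 1 pattern.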

Experience with the pmcmd command in PowerCenter and the RunAJob utility in IICS.

Implemented full and incremental data loads.
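
The incremental pattern can be sketched as a watermark lookup: load only source rows newer than the last successful run, then advance the watermark. A hypothetical Python/SQLite sketch (table names invented; in Informatica this is usually done with mapping variables or a last-run-time parameter):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, updated_at TEXT);
    CREATE TABLE tgt_orders (order_id INTEGER, updated_at TEXT);
    CREATE TABLE etl_watermark (job TEXT PRIMARY KEY, last_run TEXT);
    INSERT INTO src_orders VALUES (1, '2023-09-01'), (2, '2023-09-15'), (3, '2023-10-01');
    INSERT INTO etl_watermark VALUES ('orders', '2023-09-10');
""")

def incremental_load(conn, job):
    """Load only source rows newer than the stored watermark, then advance it."""
    (wm,) = conn.execute("SELECT last_run FROM etl_watermark WHERE job = ?", (job,)).fetchone()
    conn.execute("""
        INSERT INTO tgt_orders
        SELECT order_id, updated_at FROM src_orders WHERE updated_at > ?
    """, (wm,))
    conn.execute("""
        UPDATE etl_watermark
        SET last_run = (SELECT MAX(updated_at) FROM src_orders)
        WHERE job = ?
    """, (job,))

incremental_load(conn, 'orders')
loaded = [r[0] for r in conn.execute("SELECT order_id FROM tgt_orders ORDER BY order_id")]
print(loaded)  # [2, 3] -- only rows changed after the last watermark
```

A full load is the degenerate case: no watermark filter, truncate-and-reload the target.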

Experience in data profiling and data validations.

Migrated workflows from one environment to another and migrated IICS objects using Azure DevOps.

Good experience with Application Integration; integrated IDQ, MDM, and DI objects using Application Integration.

Extracted files from and copied files to AWS and Azure storage.

Experience in UNIX and shell scripting; created and modified shell scripts.

Hands-on experience with Azure ADF, PySpark, and Databricks; knowledge of AWS Glue and ready to work on these tools.

Developed data visualization worksheets and dashboards using features such as data blending, calculations, filters, actions, parameters, maps, extracts, context filters, groups, sets, aggregate measures, table calculations, and LOD expressions.

Created Tableau dashboards using text tables, highlight tables, stacked bars, horizontal bars, pie charts, geographical maps, side-by-side bars, dual-axis (combination) charts, and continuous and discrete lines.

Good experience in dashboard design with large data volumes from various data sources such as Snowflake, SQL Server, Oracle, and files.

Good experience in creating interactive dashboards using different action types (Hover, Select, Menu).

Good experience in Tableau Administration (Tableau Server).

Experience reviewing the source queries of SAP BO reports and modifying BO reports.

Excellent experience in Oracle, SQL Server, Snowflake, and Teradata.

Created stored procedures and functions with exception handling, temp tables, and audit logging; implemented DWH applications using Oracle, Teradata, and Snowflake.

Experience creating views and materialized views on top of fact and dimension tables using different joins and functions.

Experience in data profiling and creating database objects such as tables, views, materialized views, and indexes.

Experience with analytical (window) functions.
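
As an example of analytical functions in DWH work, deduplicating to the latest record per key with ROW_NUMBER() is a common step before loading a dimension. A SQLite sketch with hypothetical table names (the Oracle/Snowflake/Teradata syntax is essentially identical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_customer (customer_id INTEGER, city TEXT, updated_at TEXT);
    INSERT INTO stg_customer VALUES
        (1, 'Trenton',   '2023-01-01'),
        (1, 'Princeton', '2023-06-01'),
        (2, 'Troy',      '2023-03-01');
""")

# ROW_NUMBER() ranks each customer's rows newest-first; keeping rn = 1
# retains only the most recent version of each key.
latest = conn.execute("""
    SELECT customer_id, city FROM (
        SELECT customer_id, city,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY updated_at DESC) AS rn
        FROM stg_customer
    )
    WHERE rn = 1
    ORDER BY customer_id
""").fetchall()
print(latest)  # [(1, 'Princeton'), (2, 'Troy')]
```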

Good experience in performance tuning: checking query execution plans and applying indexes and hints.
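
The plan-checking workflow can be illustrated even in SQLite, where EXPLAIN QUERY PLAN stands in for Oracle's EXPLAIN PLAN or SQL Server's execution plans (table and index names are invented for the sketch):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (sale_id INTEGER, region TEXT, amount REAL)")

def plan(conn, sql):
    # EXPLAIN QUERY PLAN rows carry the access-path description in the last column.
    return [row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT SUM(amount) FROM fact_sales WHERE region = 'NJ'"
before = plan(conn, query)   # full table scan of fact_sales
conn.execute("CREATE INDEX ix_region ON fact_sales(region)")
after = plan(conn, query)    # the plan now searches via ix_region
print(before, after)
```

The same before/after comparison is how an index or hint is validated in any engine: rerun the plan, confirm the scan became an index search.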

Experience partitioning and clustering tables and gathering table statistics.

Experience with DML operations such as INSERT, UPDATE, DELETE, and MERGE.
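
MERGE ("update when the key exists, insert when it doesn't") is not available in every engine; the sketch below uses SQLite's INSERT ... ON CONFLICT as an analogue of the Oracle/Snowflake/Teradata MERGE pattern, with a hypothetical target table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tgt_product (product_id INTEGER PRIMARY KEY, price REAL);
    INSERT INTO tgt_product VALUES (1, 10.0);
""")

# Upsert: mirrors MERGE's WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT.
conn.executemany("""
    INSERT INTO tgt_product (product_id, price) VALUES (?, ?)
    ON CONFLICT(product_id) DO UPDATE SET price = excluded.price
""", [(1, 12.5), (2, 7.0)])

rows = conn.execute("SELECT product_id, price FROM tgt_product ORDER BY product_id").fetchall()
print(rows)  # [(1, 12.5), (2, 7.0)] -- row 1 updated, row 2 inserted
```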

Worked on Teradata migration project.

Experience in data migration and in comparing data between databases using the DVO tool and queries.

Achievements/Contributions:

Consistently achieved a high level of customer satisfaction by exhibiting a positive attitude, building trust, ensuring commitments are met, and exceeding expectations.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter, IICS, ADF, Databricks, AWS Glue

Reporting Tools: Tableau, SAP BO, Cognos, Ramco DecisionWorks

Databases: Oracle, Snowflake, MS SQL Server, Teradata

Scripting: Shell scripting, Python, PySpark, SQL, SnowSQL

Modeling Tools: Erwin, ER/Studio

Cloud: AWS (S3, EC2), Azure

EDUCATION DETAILS:

Bachelor of Technology from Acharya Nagarjuna University- May 2009

PROFESSIONAL EXPERIENCE:

Kelly Services, Troy, Michigan

ROLE: Data Analyst

DURATION: Jul 2023 - Present

RESPONSIBILITIES:

Working with business analysts, DB architects, project managers, and stakeholders to understand the business requirements.

Performing impact analysis and data analysis, providing story points, development, unit testing, test data preparation, code review, QA support, and providing KT (knowledge transfer) to other team members.

Preparing the technical design documents and mapping documents.

Preparing dashboards using Tableau to analyze the business.

Applying joins and performing table-level, fact-level, and dimension-level calculations.

Creating complex queries and views for reporting purposes.

Doing the performance tuning for queries and dashboards.

Creating mappings using Source, Target, Expression, Aggregator, Filter, Lookup, Joiner, Union, Normalizer, Router, SQL, and Web Service transformations, as per the business requirements.

Created the App connections, Service Connections, Process Objects, and processes to integrate the applications.

Creating SCD Type 1 and Type 2 mappings.

Created task flows using Data Task, Assignment, Notification, Parallel Paths, Sub-taskflow, and Decision steps.

Worked on parameter files and input and in-out parameters.

Created complex SQL queries for ETL implementation and testing, using joins, subqueries, analytical functions, and aggregate functions.

Creating database objects such as tables and views built over multiple tables.

Working in the Data Integration, Administrator, and Monitor services.

Deploying database objects from one environment to another environment using Azure Devops.

Attending Scrum calls, sprint planning, and retrospective meetings.

Technologies: Tableau, IICS, Snowflake, MS SQL Server, SnowSQL, Python, Azure, Salesforce.

Wolters Kluwer, Coppell, TX

ROLE: Technical Lead/Lead Data Engineer

DURATION: Jun 2021- Jul 2023

RESPONSIBILITIES:

Working with business analysts, project managers, and stakeholders to understand the business requirements.

Created worksheets and dashboards using Tableau.

Performing reverse engineering in Tableau and Informatica to identify issues raised by end users.

Migrated the python code into Informatica.

Created user stories and service requests as per requirements and executed the service requests.

Checking the code and data after deploying the code into higher environments and providing confirmation.

Performing impact analysis and data analysis, providing story points, ETL pipeline development, unit testing, test data preparation, QA support, and providing KT to other team members.

Prepared technical design, mapping, and unit test case documents.

Worked as a technical lead, providing solutions and inputs to team members for complex tasks.

Creating mapping tasks, mapplets, and task flows using different transformations and tasks.

Created APIs using REST and SOAP URLs.

Created App connections, Service Connections, Process Objects and processes to integrate the applications.

Integrated the MDM, DQ and DI applications.

Worked on web service transformation.

Working in Data integration and monitoring services.

Extracting data from flat files, Salesforce, and Oracle and SQL Server databases, and loading the data into a Snowflake database.

Deploying database objects from one environment to another environment.

Created complex SQL queries; worked on analytical functions and views; created stored procedures and functions.

Attending Scrum calls, sprint planning, and retrospective meetings.

Running standup meetings with the development team and managing dev deliverables.

Technologies: IICS, Snowflake, SnowSQL, Tableau, Informatica PowerCenter, Oracle, Salesforce, Workbench, Jira, Python, shell scripting, AWS.

BMW, Munich, Germany & BFIL (Bharat Financial Inclusion Limited) India

ROLE: ETL Developer/Report Developer/Team Lead

DURATION: Sep 2014- Jun 2021

RESPONSIBILITIES:

Requirement gathering, ETL object backtracking, data analysis, providing estimates, ETL development, unit testing, test data preparation, INT deployment, UAT support, providing KT to other team members, production deployment support, and documentation.

Worked closely with clients to solve business issues.

Worked as Team Lead and onsite coordinator.

Conducted daily team meetings to track open issues and open points; handled resource management and work allocation.

Provided guidance to team members whenever they were stuck.

Experience with SAP transformations, web services, and pods.

Created and modified mappings, mapplets, sessions, workflows, and worklets using different transformations and tasks.

Created mappings, mapping tasks, and task flows using different transformations and tasks.

Performed reverse engineering to understand the code and identify data issues and failures.

Created and modified UNIX scripts and scheduled workflows using crontab.

Created stored procedures and complex queries using analytical functions, joins etc.

Created and modified the shell scripts.

Created/Modified tableau worksheets and dashboards.

Created interactive dashboards with bar charts, maps, dual-axis charts, scatter plots, Gantt charts, tree maps, line graphs, bubble charts, and pie charts, using filters, calculations, and data blending.

Publishing the dashboards and supporting end users.

Worked on Informatica migration and database migration projects; tested the Informatica applications after migration.

Created and executed test cases in HP ALM and prepared test result documentation.

Technologies: Informatica PowerCenter, IICS, Oracle, Tableau, shell scripting, Teradata, HP ALM, SVN, Xray, Jira, Confluence.

DBS Bank, India.

ROLE: Analyst Programmer

DURATION: May 2012- Sep 2014

RESPONSIBILITIES:

Worked as ETL and Report Developer.

Created and modified mappings, mapplets, sessions, and workflows using Informatica PowerCenter based on requirements, and monitored the workflows.

Created test cases for unit testing and deployed workflows from one environment to another.

Performed reverse engineering to understand the code and identify data issues.

Created complex queries and PL/SQL procedures as per requirements, for both development and testing.

Created reports for RBI using the Ramco DecisionWorks tool.

Technologies: Informatica PowerCenter, Oracle, MS SQL Server, Ramco DecisionWorks, SVN.

BCBS, Chicago, IL

ROLE: ETL Developer

DURATION: Aug 2011- Apr 2012

RESPONSIBILITIES:

Extracted data from Oracle and flat files and loaded it into Oracle using Informatica PowerCenter.

Developed mappings using different transformations and created workflows.

Good experience with parameter files, mapping parameters, mapping variables, and workflow and worklet variables.

Modified the existing mappings as per change request.

Analyzed stored procedures, tables, and views for enhancements and modified existing queries and stored procedures.

Performed unit testing of workflows and data validations and created unit test case documents.

Technologies: Informatica Power Center, Oracle, SVN.


