Post Job Free

Software Engineer Data

Location:
Atlanta, GA
Salary:
$100k per annum
Posted:
March 27, 2020

Contact this candidate

Resume:

RISHI SHRIVASTAVA

Atlanta GA ***** adcgzb@r.postjobfree.com 715-***-**** https://www.linkedin.com/in/rishi-shrivastava-7aa8aa121

SUMMARY

Experienced Team Lead / Sr. Software Engineer / Asst. Manager in software engineering, business intelligence, the data warehouse development lifecycle, data modelling, and UI & full-stack development. Well versed in SDLC and Agile methodologies. 9.5 years of IT industry experience. Currently working as a Tech Lead / Sr. Software Engineer at a renowned medical insurance company in Atlanta, USA.

EDUCATION

B.E. (Honors), GPA 3.3/4.0, RGTU, India: May 2006 - June 2010

TECHNICAL EXPERTISE

Tools

Informatica, MS SQL Server, Teradata SQL Assistant, Oracle SQL Developer, Toad, OBIEE, ControlM

Other Areas of Expertise

Python, Data Modelling, Star/Snowflake Schema, Data Mapping, Data Warehouse Development, Data Analysis, OBIEE Reporting, IDQ, Data Standardization, Error and Audit Controls, Matching Logic Concepts, DB-Level Performance Tuning Concepts

Databases/RDBMS

MS SQL Server (T-SQL), Oracle, Teradata, DB2

Scripting

UNIX shells/scripts, PowerShell

Operating Systems, Databases

Windows, Android, Linux, macOS; SQL Server, Oracle, MySQL

Bug-Tracking Tools

JIRA (Scrum Master), Remedy

PROFESSIONAL EXPERIENCE

Accenture Nov 2016 - Present

Anthem Insurance - Sr. Software Engineer/Tech. Lead Atlanta, Georgia, USA

Team Lead/Sr. Software Engineer: Earned the confidence of business stakeholders quickly. Wear multiple hats: Sr. Software Engineer, Tech Lead, Asst. Manager, and Scrum Master.

Daily time split: Sr. ETL Developer (80%), Tech Lead/Asst. Manager (15%), Scrum Master (5%)

Extensive experience in SDLC using the Agile methodology.

Expertise in Data Analytics

Awarded Distinctive Achievement by the account during the final evaluations completed by the Talent Lead.

Effective mentorship, teamwork, and coordination with multiple teams under strict deadlines, with risk management.

Requirement gathering from Data Architects, client teams, and business users, and development of mapping documents.

Hands-on technical work including code development, testing, and performance improvement for the end-to-end data flow.

Single-handedly deployed 300+ jobs to the Informatica scheduler (Control-M). Highly appreciated by the client, as the same project had failed when another vendor was given this task.

Good exposure to Python.

Working on multiple source databases (Teradata, SQL Server, Oracle) as well as flat files.

Expertise in data warehouse development: building new ETL applications and maintaining and improving the performance of existing applications in production.

End-to-end data analysis against the current business logic, suggesting code-level changes so that data and reports match business expectations. Involved in bi-weekly meetings with the Architect and business team on this work.

IDQ implementations in the code for data standardization, extending to Address Doctor / address reference files.

Implemented dynamic error-handling logic for the daily feeds and put audit steps in place to backtrack and analyze trends, which is very useful for code-level changes and improvements. Implemented triggers on critical feeds.
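The error-handling and audit pattern described here can be sketched roughly as follows. This is a Python stand-in for the Informatica-level logic, not the actual implementation; all names (`process_feed`, the audit keys) are illustrative assumptions.

```python
# Hypothetical sketch: route bad rows to an error table and keep audit
# counters so load trends can be analyzed over time.

def process_feed(rows, transform):
    """Apply `transform` to each row; capture failures with their reason
    and tally read/loaded/rejected counts for auditing."""
    loaded, errors = [], []
    audit = {"read": 0, "loaded": 0, "rejected": 0}
    for row in rows:
        audit["read"] += 1
        try:
            loaded.append(transform(row))
            audit["loaded"] += 1
        except (KeyError, ValueError) as exc:
            # Keep the offending row plus the reason so it can be
            # backtracked and sent back to the source team.
            errors.append({"row": row, "error": str(exc)})
            audit["rejected"] += 1
    return loaded, errors, audit
```

Persisting the audit counters per run is what makes trend analysis possible: a sudden jump in `rejected` points at an upstream feed change before it hurts reporting.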

Worked in detail on de-duplication and matching logic for the incoming feed to find the true updates before introducing data into the system, reducing unnecessary pressure on the DB and on the loads. Only 10% of the records sent by the source team were true updates, so this cut data-processing time substantially. Highly appreciated by the client team.
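A common way to implement this kind of "true update" detection is to hash the non-key attributes of each row and compare against the current target. The sketch below is an illustrative Python version of that idea; the helper names and key columns are assumptions, not the project's actual code.

```python
import hashlib

def row_hash(row, key_cols):
    """Hash the non-key attributes so changes can be detected cheaply."""
    payload = "|".join(str(v) for k, v in sorted(row.items()) if k not in key_cols)
    return hashlib.md5(payload.encode()).hexdigest()

def true_updates(incoming, existing, key_cols=("id",)):
    """Keep only new rows, or rows whose attributes actually changed."""
    current = {tuple(r[k] for k in key_cols): row_hash(r, key_cols) for r in existing}
    out = []
    for row in incoming:
        key = tuple(row[k] for k in key_cols)
        if current.get(key) != row_hash(row, key_cols):
            out.append(row)  # new key, or attributes changed: a true update
    return out
```

If only ~10% of incoming rows survive this filter, the downstream load processes a tenth of the volume, which is where the processing-time win comes from.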

Working on performance-improvement methodologies including: table partitioning at the DB level; Informatica session-level parallel partitioning using approaches such as hash, round-robin, and pass-through partitioning; applying indexes at the table level; and using parallel hints to optimize query run time.
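The difference between hash and round-robin partitioning can be shown with a small sketch. This is a generic Python illustration of the two strategies, not Informatica's implementation; function names are hypothetical.

```python
def hash_partition(rows, n, key):
    """Rows with the same key value always land in the same partition,
    which keeps related rows together (useful for aggregation/joins)."""
    parts = [[] for _ in range(n)]
    for row in rows:
        parts[hash(row[key]) % n].append(row)
    return parts

def round_robin_partition(rows, n):
    """Distribute rows evenly regardless of content, which balances
    the workload when no grouping is needed."""
    parts = [[] for _ in range(n)]
    for i, row in enumerate(rows):
        parts[i % n].append(row)
    return parts
```

Pass-through partitioning, by contrast, simply keeps whatever partitioning the rows already have from the previous stage, avoiding a redistribution cost.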

Defined data-retention timelines to make sure the DB is not impacted by stale or unused data.

Monthly audits of existing code for performance bottlenecks, with code fixes implemented in the minimum time possible (the next sprint).

Good exposure to the SDLC methodology. Design and development of complex Informatica mappings/code, stored procedures, functions, tables, and views in SQL Server and Teradata.

Data modelling for logical and physical data models. Worked with Data Architects on table definitions and granularity levels for the business's reporting requirements.

Optimized Informatica code, stored procedures, and scripts to remove bottlenecks.

Working on the OBIEE reporting tool for the business and sales teams to target potential prospects and customers.

Code level changes at OBIEE RPD level as per Business requirements.

Working on performance improvement for the OBIEE reports by applying indexes on dimensions in the star schema, so the business receives reports in the least time possible.

Deriving and understanding the data model, and the scope for scalability and performance tuning.

Working with the business on estimations and prioritizing current CRs by criticality and the team's bandwidth.

Guidance and sessions within the team to overcome functional or technical roadblocks.

Working on the Agile process as Scrum Master: allocating tasks among team members based on sprint planning and team bandwidth, which also depends on dynamic changes in task priorities driven by the business; I keep in constant touch with the client team for this.

Perform multiple Agile ceremonies: daily stand-ups; backlog grooming with stakeholders, which clarifies the expected order of deliverables; and PI planning sessions with the client team to decide future work per business expectations.

In-depth understanding of data warehousing concepts, OLAP, and SCDs (types 1, 2, and 3). Good exposure to OBIEE and Siebel as well.

Good exposure to scripting with PowerShell and Unix shell.

Applied the SCD type 2 approach on the current project to keep track of historical changes, which is used in the OBIEE reporting.
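The SCD type 2 mechanics (expire the current version of a changed row, append the new version with an open end date) can be sketched as follows. This is an illustrative Python model of the technique, not the project's Informatica/SQL code; the column names (`start`, `end`, `current`) are assumptions.

```python
from datetime import date

def apply_scd2(history, incoming, key="id", today=None):
    """Type-2 slowly changing dimension: expire the current version of a
    changed row and append the new version with an open end date."""
    today = today or date.today()

    def attrs(r):
        # Compare only the tracked attributes, not the key or SCD columns.
        return {k: v for k, v in r.items() if k not in (key, "start", "end", "current")}

    for new in incoming:
        live = [h for h in history if h[key] == new[key] and h["current"]]
        if live and attrs(live[0]) == attrs(new):
            continue  # no attribute change: nothing to do
        for h in live:  # expire the old version
            h["end"], h["current"] = today, False
        history.append({**new, "start": today, "end": None, "current": True})
    return history
```

Because expired rows are kept with their date ranges, OBIEE reports can reconstruct the state of a dimension as of any point in time, which type 1 (overwrite) cannot do.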

Peer design and code reviews and extensive documentation of standards, best practices, and ETL procedures.

Involved in managerial activities: resource allocation, budgeting and forecasting of dollar values, team grooming, and cross-training so that individual team members are self-reliant when a new issue falls outside their comfort zone. Weekly brown-bag sessions.

Performing managerial activities helps me understand the vision of the client and stakeholders, which gives me an edge in guiding the work down the best path, so that deliverables stay in sync with expectations in the long run and there are no last-minute surprises. It also builds confidence with the client team, because the steps are aligned with the client's expectations from the start.

Accenture Oct 2014 – Nov 2016

SENTRY Insurance (Stevens Point) - Sr. Software Engineer / Tech. Lead Stevens Point, Wisconsin, USA

Daily time split: Sr. ETL Developer (90%), Tech Lead (10%)

Developed Technical Specifications of the ETL process flow.

Brainstorming sessions with Data Architects and the Hortica IT team (Sentry acquired Hortica) on the data mappings between the Sentry and Hortica platforms.

Worked on Data Flow diagrams and technical diagrams for the Data Integration flow.

Deeply involved in data conversion to bring Hortica data in sync with Sentry data without affecting reporting for the business and stakeholders.

Data Integration for Hortica policy data into Sentry framework

Implemented data standardization using IDQ, and data audits using multiple checks at the mapping level, sending rejected records back to the source team and re-processing the corrected data with correct flag values for reporting and analysis by the business team.

The code design/logic was built in such a way that, if Sentry acquires another company, minimal rework/modification is required, in line with business processes and standards. Introduced business lookup tables in the code to bifurcate the data feeds from multiple companies.

Applied parameters, variables, and decision tasks at the Informatica level to branch policy and casualty data into their respective dimensions.

Development work using Informatica as the primary ETL tool, pulling data from multiple databases (Teradata, Oracle). Flat files were also a source, for which dynamic flat-file handling concepts were applied.

Data modelling for logical and physical data models. Worked with Data Architects on table definitions and granularity levels for the business's reporting requirements.

Worked on Performance tuning for the code by evaluating the Bottlenecks.

Applied Informatica-based parallel partitioning at the session level to achieve 3x throughput, helping the business get reports well ahead of time.

Worked on Unix scripting.

Implemented Tivoli as the Informatica scheduler.

Implemented triggers for de-duplication audit purposes.

DB-level partitioning and application of indexes and parallel hints at the query level to decrease query cost.

Data Validation and testing using debugger to make sure that the Data flows as per the Business logic.

Single-handedly worked on a complex data conversion and integration project.

Highly appreciated by the client for delivering the complex and difficult data integration project within the deadlines.

Sprint Planning, understanding the new features and requirements, coordinating with the Business and BA for requirements/ clarification.

Ad-hoc/weekly design reviews with Data Architects, client teams, and business users, ensuring that development and testing automation of the new and complex requirements track the release calendar.

Conducted peer design and code reviews and extensive documentation of standards, best practices, and ETL procedures

Built efficient code for processing fact and dimension tables with complex transformations and SCD type 1 and type 2 changes

Effective team coordination; proactive in taking responsibility and initiative.

Tata Consultancy Services (TCS) May 2010 - Sep 2014

British Petroleum/ Finance Business Warehouse – Software Engineer Mumbai, India

Handled multiple critical modules: understanding new features and requirements, coordinating with the client and BA for requirements clarification, development, unit test cases, creation and testing of new requirements based on functional specifications, and taking part in peer reviews.

Developed mapping documents; designed and developed complex Informatica mappings/code, stored procedures, functions, tables, and views.

Developed an understanding of business requirements based on functional criticalities, and identified development and test scenarios based on Software Requirement Specifications.

Developed code; wrote, managed, and executed scripts. Interacted with stakeholders: client, BAs, dev team, and business users.

Developed Informatica complex mappings

Implemented audit checks and controls for the data processing.

Worked on Unix scripting to process huge input files, which drastically lowered the total processing time.

Worked on Change Controls to modify the existing code as per the Business requirement. Applied multiple performance improvement methodologies in the Informatica code and at the Database level.

Worked on stored procedures called by Informatica mappings.

Created triggers for daily feeds that required data analysis for code improvement and health checks.

Implemented error handling at the Informatica level to capture erroneous data in real time.

Parallel processing of the data feed using pass-through partitioning.

