SURYA ALEKHYA GOTETI
************.******@*****.***
ETL Full Stack Engineer
Professional Summary:
Astute, results-driven IT professional with 13+ years of progressive experience spanning ETL DevOps engineering, SQL, Teradata, and SQL Server database administration. ETL full stack activities include development, testing, business analysis, and deployment.
Adept at working closely with multiple development and test teams to provide process design, management, and support for source code control, code compilation, change management, and production release management
Deft at driving continuous improvement by focusing on increased automation, continuous integration, and continuous testing principles
Proficient in all phases of the STLC (Software Testing Life Cycle), including requirement gathering and design per ETL requirements for data warehousing and business intelligence
Sound knowledge of SQL and DW/BI report testing in the supply chain and healthcare domains
Efficient in streamlining and coordinating configuration, build, release, deployment, process, and environment management across all of the clients’ applications
Skilled in DevOps processes for building and deploying systems, and in working with batch teams to schedule and monitor weekly batch jobs
Demonstrated excellence as a JIRA administrator, customizing dashboards to each team’s requirements
Expert in writing SQL queries against multiple databases such as Oracle and SQL Server
Extensive knowledge of Informatica tuning and SQL tuning.
Experience in integrating various data sources into a staging area.
Performed unit, system, and integration testing, supported users during UAT, and prepared test reports in different phases of the project.
Extensive experience with Data Extraction, Transformation, and Loading (ETL) from Multiple Sources.
Designed and developed complex mappings, Mapplets, tasks, and workflows, and tuned them.
Experience in Debugging and Performance tuning of targets, sources, mappings and sessions in Informatica.
Good understanding of the Data modeling concepts like Star-Schema Modeling, Snowflake Schema Modeling.
Experience in optimizing mappings and implementing complex business rules by creating reusable transformations such as Mapplets.
Extensively used the Slowly Changing Dimension (SCD) technique in business applications (see the sketch after this list).
Experience in performance tuning of ETL processes using pushdown optimization and other techniques; reduced execution time for large data volumes in a company merger project.
Conducted Unit tests, Integration tests and Customer Acceptance tests.
Working experience in Agile and Waterfall methodologies.
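For illustration, a minimal SCD Type 2 sketch of the technique referenced above; the dim_customer table, its columns, and the sqlite3 backend are hypothetical stand-ins, not taken from any specific client project.

```python
# Minimal SCD Type 2 sketch using Python's built-in sqlite3 so it is
# self-contained; dim_customer and its columns are hypothetical.
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,  -- business key
        city        TEXT,     -- tracked attribute
        eff_start   TEXT,     -- effective start date
        eff_end     TEXT,     -- effective end date (NULL = current row)
        is_current  INTEGER
    );
    INSERT INTO dim_customer VALUES (101, 'Chicago', '2020-01-01', NULL, 1);
""")

def apply_scd2(conn, customer_id, new_city, load_date):
    """Expire the current row if the tracked attribute changed,
    then insert a new current row (classic SCD Type 2)."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if row and row[0] == new_city:
        return  # attribute unchanged: nothing to do
    if row:
        conn.execute(
            "UPDATE dim_customer SET eff_end = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (load_date, customer_id),
        )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, load_date),
    )

apply_scd2(conn, 101, "Dallas", str(date(2021, 6, 1)))
conn.commit()
for r in conn.execute("SELECT * FROM dim_customer ORDER BY eff_start"):
    print(r)  # one expired history row plus the new current row
```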
Education:
Bachelor of Technology (Information Technology), Andhra University, India (2009)
Technical Skills:
Scheduling Tools: CAWA, TWS, Airflow
ETL Methodologies: Data Warehousing, OLAP, Agile, Software Development Life Cycle (SDLC)
ETL Tools: DataStage 11.5, Informatica PowerCenter 9.x/8.x/7.x/6.x, Oracle Data Integrator
Scripting Languages: Perl, Python
Networking Tools: WinSCP, PuTTY
Reporting Tools: IBM Cognos Reporting Tool
Databases: Teradata, MS SQL Server 2005/2008, Oracle 8i/9i/10g/11g
Operating Systems: Windows 2000/2003/XP/Vista/7, UNIX
Work Experience:
Client: Optum UHG (UnitedHealth Group) Apr 2018-Jan 2023
Role: ETL Full Stack Engineer
Project: Unified Data Warehouse
Team Size: 20+
Responsibilities:
Utilizing collaboration tools such as Jira with Scrum/Agile methodologies; hands-on experience using Jenkins as a continuous integration and continuous deployment (CI/CD) tool
Collating all ServiceNow tickets and delegating them to the team by priority
Addressing and resolving critical defects by working on development tasks
Drafting ETL test scripts based on the requirements and maintaining all the Business Scenario scripts to provide added value to the business
Executing Data Integration Process and identifying Business Test Scenarios to test the functionalities
Creating test data in SIT and UAT environments; involved in end-to-end deployments for the last 3 years
Providing production support on a bi-monthly basis and making sure that all the batches are executing as per the schedule with no flaws.
Experienced in Waterfall, Agile/Scrum Development.
Extensively used ETL methodology to perform data migration, data profiling, extraction, transformation, and loading with Talend, and designed data conversions from a wide variety of source systems such as SQL Server, Oracle, and DB2, as well as non-relational sources such as XML, flat files, and mainframe files.
Experience in the development and design of ETL (Extract, Transform, Load) methodology for supporting data transformations and processing
Strong knowledge in Data Warehousing concepts, hands on experience using Teradata
Well versed in creating dashboard queries that make data availability easy for users to understand
Created a controls framework tool using Python and SQL that summarizes which validations have been performed on the data and gives complete information on data quality (a minimal sketch follows this list)
Proactively scheduled jobs with the right dependencies and no overlaps between schedules
Thorough knowledge of the framework; well versed in scheduling batches for full loads versus incremental loads
Projected data volumes from the requirements, helping the data model and database teams plan table space
Participated in query performance tuning
Delivered business show-and-tell presentations to report status on data loads
Mapping sheet updates and mapping review with Product Owner and Technical System Analysts
Collaborated with upstream and downstream teams
Provided code suggestions to cross-functional teams as required
Solution walkthrough to Product Owners
Requirement collaboration with Data Model Team
Mapping and Requirement handover to Development team
Functional and Technical Test Results review
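A minimal sketch of the kind of controls framework described in this list; the staging table, control rules, and pass/fail convention are illustrative assumptions rather than the actual project implementation.

```python
# Illustrative controls-framework sketch: each control is a SQL check that
# must return zero rows to pass; results are collected into a summary.
# Table and rule names are hypothetical, not from the actual project.
import sqlite3

CONTROLS = {
    "null_member_id":   "SELECT 1 FROM stg_claims WHERE member_id IS NULL",
    "negative_amount":  "SELECT 1 FROM stg_claims WHERE paid_amount < 0",
    "duplicate_claims": """SELECT claim_id FROM stg_claims
                           GROUP BY claim_id HAVING COUNT(*) > 1""",
}

def run_controls(conn):
    """Run every control query; a control passes when it returns no rows."""
    results = {}
    for name, sql in CONTROLS.items():
        failed_rows = conn.execute(sql).fetchall()
        results[name] = {"passed": not failed_rows, "failures": len(failed_rows)}
    return results

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stg_claims (claim_id INT, member_id INT, paid_amount REAL)")
    conn.executemany("INSERT INTO stg_claims VALUES (?, ?, ?)",
                     [(1, 10, 99.5), (2, None, 20.0), (2, 11, -5.0)])
    for control, outcome in run_controls(conn).items():
        status = "PASS" if outcome["passed"] else f"FAIL ({outcome['failures']} rows)"
        print(f"{control}: {status}")
```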
Environment: Jenkins, Git, GitHub, Jira, Teradata SQL, PuTTY, DataStage 11.3, Rally, ServiceNow, Airflow, TWS
Client: EPAM Systems HAVI Global Solutions Jan 2014-Aug 2017
Onsite Loc: Downers Grove, Illinois
Role: Software Engineer
Team Size: 8
Responsibilities:
Participated in Batch Monitoring, Daily Data load Test, fixing the production issues, back-end testing and functional & regression testing
Implemented deployments at the client location per client requirements
Drafted functional test scenarios & test cases and prepared test data for test cases
Communicated with clients regularly regarding deployment status and other issues, and logged defects in SharePoint
Monitored batches at the client site during deployments, per client requirements
Sent daily status emails on batch delays, if any
Coordinated with external teams on source files
Data acquisition from source
Data Integration testing at the intermediate layers
Involved in UAT, SIT, and PROD checkouts
Involved in production fixes under stringent timelines
Introduced knowledge acquisition documentation for new teammates, enabling smooth onboarding
Created process flow diagrams for easy understanding of the framework
Was part of deployment prep activities and prioritization
Was involved in end-to-end job scheduling
Introduced a batch monitoring playbook with clear instructions on workarounds for handover (a sketch of such a check follows this list)
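As an illustration of the sort of check such a playbook might codify, here is a minimal batch-delay sketch; the job names and SLA times are invented examples, not actual CAWA schedules.

```python
# Illustrative batch-delay check of the kind a monitoring playbook might
# codify: flag any job that has not finished within its SLA window.
# Job names and SLA times are made-up examples.
from datetime import datetime, time

# Expected completion time (SLA) per batch job.
SLA = {
    "load_daily_sales":  time(6, 0),   # must finish by 06:00
    "refresh_inventory": time(7, 30),  # must finish by 07:30
}

def delayed_jobs(finish_times, now):
    """Return jobs that are past their SLA and have not reported a finish."""
    late = []
    for job, sla in SLA.items():
        finished = finish_times.get(job)
        if finished is None and now.time() > sla:
            late.append(job)
    return late

# Example: the inventory refresh has not reported a finish time by 08:00.
finish_times = {"load_daily_sales": datetime(2016, 3, 1, 5, 42)}
print(delayed_jobs(finish_times, datetime(2016, 3, 1, 8, 0)))
# -> ['refresh_inventory']
```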
Environment: JIRA, CAWA, Informatica, ServiceNow, Oracle 11g
Client: Outworks IBM May 2013-Dec 2013
Location: Hyderabad, India
Role: Software Engineer
Project: ANV Insurance
Team Size: 10
Responsibilities:
Took part in development activities, updated mapping documents per release, and maintained the requirements traceability matrix
Implemented extensive data integration testing in the SQL environment using SQL scripts and validated the results with SQL
Performed extensive report testing, validating the results with SQL, along with end-to-end report validation
Logged and resolved issues in SharePoint and tracked all defects to closure
Coordinated with source teams on daily files
Involved in file transfer activity to external stakeholders via ECG Connect
Involved in file validation
Involved in file-to-stage loads (see the reconciliation sketch after this list)
Involved in transformation testing
Involved in incremental testing
Involved in full load data validation
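A minimal sketch of the kind of file-to-stage reconciliation used in this validation work; the file name, staging table, and key column are hypothetical placeholders.

```python
# Illustrative file-to-stage reconciliation: compare the row count and a
# key-column checksum between the source flat file and the staging table.
# claims.csv, stg_claims, and claim_id are hypothetical names.
import csv
import sqlite3

def file_stats(path, key_column):
    """Row count and key-column sum from the source flat file."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return len(rows), sum(int(r[key_column]) for r in rows)

def stage_stats(conn, table, key_column):
    """Row count and key-column sum from the staging table."""
    return tuple(conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({key_column}), 0) FROM {table}"
    ).fetchone())

def reconcile(path, conn, table, key_column):
    """PASS when file and stage agree on both count and checksum."""
    src = file_stats(path, key_column)
    tgt = stage_stats(conn, table, key_column)
    return "PASS" if src == tgt else f"FAIL: file={src} stage={tgt}"

# Example usage: reconcile("claims.csv", conn, "stg_claims", "claim_id")
```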
Environment: JIRA, CAWA, Informatica, ServiceNow, Oracle 11g
Client: Cognizant Apr 2012-Mar 2013
Location: Hyderabad, India
Role: Programmer / Analyst
Project: Wellpoint
Team Size: 40
Responsibilities:
Developed system test scenarios based on requirements & design documents
Performed extensive data integration testing in the Oracle environment using SQL scripts, validated the results with SQL, and administered extensive report testing, again validating the results with SQL
Developed re-usable scripts
Involved in data mart validation
Coordinated with source teams on daily files
Logged and resolved issues in SharePoint and tracked all defects to closure
Environment: Teradata, Informatica, PuTTY
Client: Cognizant Schneider Apr 2010-Mar 2012
Location: Hyderabad, India
Role: Programmer / Analyst
Project: Schneider
Team Size: 25
Responsibilities:
Assessed requirements and met the delivery timelines for each requirement
Communicated with the client on behalf of the business team
Developed transformations as per the mapping requirements
Involved in defect triaging
Involved in bi-weekly client calls on development status and triage
Status handover to internal teams
Environment: Teradata, Oracle Data Integrator, PuTTY
Client: SVL Infotech Private Limited Jun 2009-Mar 2010
Location: Chennai, India
Role: Junior Programmer
Project: Human Resource Management System
Team Size: 5
Responsibilities:
Participated in the HRMS Web Application testing
Involved in team meetings on development requirements
Involved in traceability matrix creation and maintenance
Involved in maintaining the unit testing defect tracker
Involved in Unit testing
Involved in migration testing
Executed test scripts and logged and resolved defects in SharePoint
Environment: SQL Server, ServiceNow