Sita Lakshmi M
adwi15@r.postjobfree.com
PROFESSIONAL SUMMARY
8+ years of experience in Information Technology spanning Data Migration, Data Integration, Data Analytics, ETL Solution Design, Development, Testing, Implementation and Job Scheduling
Extensively worked on data extraction, cleansing, integration and loading from different data sources (CSV files, Oracle and DB2 databases) using IBM DataStage (Designer, Director, Manager and Administrator), versions 8.x to 11.x
Developed various reusable components in DataStage, such as an automated data-comparison framework and automated conversion of MQ messages into XML
Work experience with Snowflake and the cloud ETL tools dbt and SnapLogic
Strong hands-on knowledge of Unix and SQL commands
Good querying knowledge of SnowSQL
Good working knowledge of databases such as DB2 and Oracle 10g
Experience with scheduling tools like Control-M to run, monitor, debug and test applications in development and to gather performance statistics
Experience writing UNIX shell scripts for file watchers, file validation, ETL process automation, and notifying external applications once the staging load and the table loads in regional data marts are complete
Good knowledge of Python scripting
Designed and developed jobs with varied transformation logic using stages such as Transformer, Lookup, Filter, Aggregator, Join, Merge, Funnel, Sort and Change Capture
Designed job sequences using stages such as Job Activity, Execute Command, Exception Handler, Nested Condition, Notification Activity, Routine Activity, Sequencer, Start/End Loop Activity, Terminator Activity and User Variables Activity
Comprehensive understanding and implementation experience of RDBMS and data warehousing concepts: Data Warehouses, Data Marts, OLAP/OLTP, Star and Snowflake schemas, Fact and Dimension tables, 2NF/3NF, and SCD Type-1 and Type-2 changes
Hands-on experience handling AWS S3 buckets to extract data from and load into Snowflake for tables created and loaded on Snowflake
Involved in all phases of the SDLC: Analysis, Coding, Testing, Debugging and Deployment
Performed ELT operations using the cloud tool dbt, creating multiple incremental and Type-2 models as well as hooks and macros
Experience creating, executing and debugging data extract and load pipelines, and performing Aggregations, Joins, Pivot, CDC, SCD Type-2, Unique, Remove Duplicates, Sorting, NULL Handling, Date Conversions and Trim functions on JSON and CSV data using SnapLogic Designer, Studio and Dashboard
Team player: personable, self-motivated, flexible and willing to learn
Experience in Agile Methodologies
Experience in Test Case Design, Test Tool Usage, Test Execution, and Defect Management
Excellent problem-solving skills applied to business- and process-related problems across the organization
Good knowledge of Object-Oriented Programming, Collections and C# concepts
Experience in application development using .NET MVC and web technologies such as Ajax, jQuery and HTML
Worked with IDEs such as Microsoft Visual Studio and SQL Server Management Studio
Experience delivering project code builds to test environments
Planned and implemented service improvement and transformation plans for better utilization of project resources
Good domain knowledge of Banking and Financial Services, enabling better understanding of business requirements and alignment of processes for seamless, smooth delivery
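The automated data-comparison framework mentioned above can be illustrated with a minimal Python sketch; the function and field names here are hypothetical, not the production framework:

```python
# Minimal sketch of an automated source-vs-target comparison, assuming
# both extracts are available as lists of dicts keyed by a business key.
# All names are illustrative.

def compare_datasets(source_rows, target_rows, key):
    """Return keys missing from target, extra in target, and mismatched."""
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    missing = sorted(set(src) - set(tgt))        # in source, not in target
    extra = sorted(set(tgt) - set(src))          # in target, not in source
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return missing, extra, mismatched
```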
TECHNICAL SKILLS
ETL Tools
DataStage 8.5/9.1/11.5/11.7, SnapLogic, dbt (learning)
Databases
Oracle 10g/11g, DB2, SQL Server, Snowflake
Languages
SQL, Unix Shell Scripting, Python,
MVC, .NET, HTML, CSS, C#
Tools
Control-M, Jira, qTest,
SQL Developer, Visual Studio
Operating Systems
Windows, Linux
Deployment Tools
GIT and UCD
CERTIFICATIONS
SnapLogic Certified Enterprise Automation Professional
PROFESSIONAL EXPERIENCE
Client: ConocoPhillips
Location: Houston, TX
Role: Sr. ETL Developer
Jan 2023 – Present
Roles and Responsibilities:
Understanding the business rules and sourcing the data from multiple source systems into Snowflake using SnapLogic
Interacted with users and supported multiple projects for Analytics team
Extracted raw data from flat files (*.csv) and JSON files coming from the HR application into Snowflake staging tables using SnapLogic
Refreshed mappings for any changes or additions to CRM source attributes
Created and developed mappings to load data from Snowflake staging tables into EDW data mart tables based on the source-to-staging mapping design document
Implemented SCD Type-2 pipelines using Scheduled Tasks to update existing records and load new records into Snowflake target tables
Followed ETL standards: audit activity, job control tables and session validations
Followed the practice of routing error logs to error files using reusable error pipelines
Designed and developed pipelines with varied transformation logic using snaps such as Mapper, Join, Filter, Aggregate, Unique, Sort and Change Data Capture
Used shared folders to create reusable components (source, target, email tasks)
Developed triggered and scheduled tasks to run reusable pipelines daily
Developed reusable pipelines in SnapLogic Designer using the Pipeline Execute snap
Used SQL queries to validate the data after loading
Performed development, unit testing and system testing, and deployed to the production environment
Prepared Test Cases for testing interfaces which were used by QA team.
Developed Workflows with various Tasks using parameters
Involved in performance tuning by identifying bottlenecks at targets, sources, mappings and sessions, leading to better session performance
Monitored and reported issues for daily, weekly and monthly processes
Resolved issues on a priority basis and reported them to management
Environment: SnapLogic 3.7/4.x, Flat Files, Oracle 11g, Snowflake, SQL, SQL Developer, Windows 10, Unix, GIT, UCD, Python, Putty, WinSCP
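The SCD Type-2 loads above follow the standard close-and-insert pattern. A minimal Python sketch using sqlite3 in place of Snowflake; table and column names are hypothetical, not the actual DDL:

```python
import sqlite3

# Sketch of the SCD Type-2 pattern: expire the current version of a
# changed record, then insert the new version. Names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_emp (
    emp_id INT, dept TEXT, eff_from TEXT, eff_to TEXT, is_current INT)""")
conn.execute(
    "INSERT INTO dim_emp VALUES (101, 'Sales', '2023-01-01', '9999-12-31', 1)")

def apply_scd2(conn, emp_id, new_dept, load_date):
    cur = conn.execute(
        "SELECT dept FROM dim_emp WHERE emp_id = ? AND is_current = 1",
        (emp_id,))
    row = cur.fetchone()
    if row and row[0] != new_dept:
        # Close out the current version...
        conn.execute(
            "UPDATE dim_emp SET eff_to = ?, is_current = 0 "
            "WHERE emp_id = ? AND is_current = 1", (load_date, emp_id))
        # ...and open a new one with an open-ended effective date.
        conn.execute(
            "INSERT INTO dim_emp VALUES (?, ?, ?, '9999-12-31', 1)",
            (emp_id, new_dept, load_date))

apply_scd2(conn, 101, "Finance", "2023-06-01")
```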
Client: IBM
Location: Indianapolis, IN
Role: Sr. ETL Developer
Sep 2022 – Jan 2023
Roles and Responsibilities:
Participated in discussions with Project Manager, Business Analysts and Team members on different Business Requirement issues
Worked on the existing documents and developed the required technical specification as per the business needs
Worked on extracting data from flat files (JSON and CSV) using SnapLogic Designer pipelines and loading it into Snowflake staging tables
Designed ETL jobs to extract data from staging and apply transformations and ETL logic
Performed Aggregations, Joins, Pivot, CDC, SCD Type-2, Unique, Remove Duplicates, Sorting, NULL Handling, Date Conversions and Trim functions on JSON and CSV data using SnapLogic Designer, Studio and Dashboard
Mapped data items from source systems to the target system
Validated all metrics well in advance of migrating into the production environment
Used the SnapLogic Studio and Dashboard to view pipeline run-time execution, debug components and monitor the resulting executable versions
Managed SnapLogic servers, pipelines and scheduled/triggered tasks
Wrote complex SQL queries involving multi-table joins, and generated queries to check data consistency and to update tables per business requirements
Created critical transformations and mappers wherever necessary
Created Snaplex transformations to convert data from flat files to tables
Built data quality rules and generated dashboards using services
Involved in query optimization and performance tuning
Responsible for migrating ETL code to higher environments by exporting and importing pipelines
Facilitated testing and data validation
Interacted with end users to gather requirements and test final output
Environment: SnapLogic 3.7/4.x, Flat Files, JSON, Oracle 11g, Snowflake, SQL, SQL Developer, Windows 10, Python, Putty, WinSCP
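The row-level cleanups listed above (trim, NULL handling, date conversion, duplicate removal) can be sketched in Python; the field names and source date format are assumptions for illustration:

```python
from datetime import datetime

# Sketch of cleanups applied to parsed CSV/JSON rows: trim strings,
# handle NULLs, convert dates to ISO format, drop duplicate keys.
# Field names and the "%m/%d/%Y" source format are assumptions.

def clean_rows(rows):
    seen, out = set(), []
    for row in rows:
        name = (row.get("name") or "").strip()            # trim + NULL handling
        raw = row.get("hire_date")
        hire = (datetime.strptime(raw, "%m/%d/%Y").strftime("%Y-%m-%d")
                if raw else None)                          # date conversion
        key = (row["id"], name)
        if key in seen:                                    # remove duplicates
            continue
        seen.add(key)
        out.append({"id": row["id"], "name": name, "hire_date": hire})
    return out
```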
Client: Exelon
Location: India
Role: ETL Developer
Apr 2021– July 2022
Roles and Responsibilities:
Worked on designing and integrating on-premises Oracle and SQL Server data warehouse data into Snowflake
Built the logical and physical data model for Oracle as required
Worked primarily on Oracle databases, with some work on the Snowflake database
Defined virtual warehouse sizing in Snowflake for different types of workloads
Implemented source data analysis and business rules for designing and developing DataStage jobs using IBM InfoSphere 11.5/11.3
Designed parallel jobs using stages such as Join, Remove Duplicates, FTP, Filter, Dataset, Lookup File Set, Modify, Transformer and Funnel
Expertise in InfoSphere Change Data Capture (CDC) 11.3 and DataStage Designer, including auditing data in live environments
Performed data cleansing before loading into the data warehouse
Defined roles and privileges required to access different database objects
Gathered information from different data warehouse systems and loaded it into an Oracle 12c data warehouse using IBM DataStage Designer tools
Used various source databases (DB2, MS SQL Server) and sequential files to read and transform data into target Snowflake tables, applying business-rule transformations
Analysed data discrepancies through error files and log files for further data processing and cleansing
Knowledgeable in relational database logical design, physical design and RDBMS performance tuning
Used Unix commands to format and restructure intermediate files before loading into the target
Environment: Datastage 11.5/11.7, Control-M, Flat Files, Oracle 11g, Snowflake, SQL, SQL Developer, Windows 10, Unix, GIT, UCD, Python, Putty, WinScp
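The kind of intermediate-file restructuring done above with Unix commands (cut/awk-style field reordering) can be sketched in Python; the input layout and delimiters are assumptions:

```python
# Sketch of restructuring a delimited intermediate file before the
# target load. Assumed layout: input lines are "id|name|amount",
# reordered and re-delimited to "name,id,amount".

def restructure_line(line, in_sep="|", out_sep=","):
    fields = line.rstrip("\n").split(in_sep)
    reordered = [fields[1], fields[0], fields[2]]  # name first, then id, amount
    return out_sep.join(reordered)
```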
Client: Microsoft
Location: India
Role: ETL Developer
Aug 2019 – Apr 2021
Roles and Responsibilities:
Third-party risk management is the process of identifying, managing, monitoring and mitigating the risks presented to Microsoft Audit by products and services provided by third parties
Documented the process flow and maintained records of trends from daily due-diligence status reports
Created message requests daily for posting messages using the message queue dashboard
Tracked incidents for issues received from customers
Analysed business problems and provided solutions
Participated with peers and represented the business on project teams
Worked closely with the team on application delivery and ensured the application is securely accessible
Identified application issues by debugging logs and resolved them based on priority
Responsible for keeping the application secure per corporate standards
Generated XML files from messages arriving on message queues using DataStage and Unix
Participated in design and code review meetings (local/global); performed unit testing, integration testing and UAT/SIT support; handled code check-in, check-out, merge and build management as needed; and reported project/task progress to the Program Manager
Participated in knowledge transfer/acquisition sessions with the client for various applications and maintained the documented records
Environment: Datastage 9.1/11.3, Control-M, Flat Files, Oracle 11g, Netezza, SQL Developer, Aginity Workbench, Windows 8.x, Unix, Putty, WinScp
Client: GC-Dental
Location: India
Role: ETL Developer
Apr 2017 – Aug 2019
Roles and Responsibilities:
Designed and developed DataStage jobs for transformations, combining data with Join and Lookup stages per interface requirements
Developed job streams (sequences) for regular processing and reprocessing of corrected data while creating target data for downstream business users
Created shell scripts to automate the scheduling of DataStage jobs
Developed jobs to load data into SQL Server staging tables and DB2 data marts from ODS tables
Modified and enhanced existing DataStage jobs and UNIX shell scripts to make them scalable, robust, more portable across environments, and automated
Performed development, unit testing and system testing, and deployed to the production environment
Understood business requirements shared by the customer and industry-standard best practices followed in the project
Extensively involved in team review meetings and conferences with remote teams
Participated in requirements gathering and created source-to-target mappings for development
Created DataStage jobs to replace existing PL/SQL code wherever required to reduce maintenance
Performed unit testing on developed jobs to ensure they meet requirements
Used Control-M to schedule jobs based on their in-conditions (prerequisites) and out-conditions (dependencies)
Environment: Datastage 9.x, Control-M, Flat Files, SQL Server, DB2, SQL Developer, Aginity Workbench, Windows 8.x, Unix, Putty, WinScp
Client: Venkateswara Coir
Location: India
Role: ETL Developer
Feb 2016 – Apr 2017
Roles and Responsibilities:
Used Control-M to schedule jobs based on their in-conditions (prerequisites) and out-conditions (dependencies)
Performed development, unit testing and system testing, and deployed to the production environment
Understood business requirements shared by the customer and industry-standard best practices followed in the project
Involved in unit testing the created dashboard and resolving data load issues
Designed SQL queries along with the config and schema files used to load the tables
Provided timely, crisp status updates on client calls and highlighted issues during impediment calls
Troubleshot and fixed data issues, schema issues and query performance issues, and fine-tuned queries on the Oracle database
Managed the team, performed team member evaluations, and guided and mentored team members
Environment: Datastage 9.1/11.3, Control-M, Flat Files, Oracle 11g, Netezza, SQL Developer, SQL Squirrel, Windows 8.x, Unix, Putty, WinScp
Client: Dairy Day Ice Cream
Location: India
Role: Software Developer
Jan 2015 – Feb 2016
Roles and Responsibilities:
Involved in developing views and controllers using MVC
Implemented service classes and DAO classes
Developed the presentation (view) layer using Telerik controls and C#
Provided timely, crisp status updates on client calls and highlighted issues during impediment calls
Involved in unit testing to resolve issues
Involved in developing cascading dropdown fields and MVC validation on views
Handled data-heavy operations using Ajax calls
Environment: ASP.NET, C#, MVC, Visual Studio, SVN, Telerik, Windows 7.x, Unix, Putty, WinScp, Eclipse
EDUCATION
Bachelor of Technology in Electronics and Communication Engineering, Ongole
Intermediate from M.S.R Junior College, Kanigiri