Harini M
Sr. Oracle Database Developer
****.********@*****.***
SUMMARY:
8+ years of experience in MS SQL Server and Oracle PL/SQL, with expertise in MS SQL Server 2016/2014/2012/2008, SSIS, SSRS, SSAS, and Oracle 12c/11g/10g.
Experience in SQL development, database design, and performance tuning across Oracle, SQL Server, and DB2 platforms.
Proficient in implementing all the phases of the Software Development Life Cycle (SDLC) from understanding requirement specifications to writing technical specifications, and performing software design, coding, implementation, testing, database design, RDBMS, and data warehousing.
Extensive experience working in Test Driven Development (TDD) and Agile development environments.
Thorough experience building workflow solutions and Extract Transform and Load (ETL) solutions for data warehousing using SQL Server Integration Services (SSIS).
Parsed Organizations, Orders, Objectives, Adjustments, and other data in various formats (text files, Excel spreadsheets, JSON, XML, log files) into SQL Server databases using SQL Server Integration Services (SSIS) to overcome transformation constraints.
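As an illustration of the kind of file-to-table load described above, here is a minimal Python sketch using the standard-library sqlite3 module as a stand-in for a SQL Server staging database; the JSON feed, table, and column names are hypothetical:

```python
import json
import sqlite3

# Hypothetical sample feed: in practice this came from SSIS sources
# (flat files, Excel, JSON, XML); a small JSON string stands in here.
orders_json = '[{"order_id": 1, "amount": 250.0}, {"order_id": 2, "amount": 99.5}]'

conn = sqlite3.connect(":memory:")  # stand-in for a SQL Server staging database
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")

# Parse the feed and bulk-insert it into the staging table
rows = [(o["order_id"], o["amount"]) for o in json.loads(orders_json)]
conn.executemany("INSERT INTO stg_orders (order_id, amount) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
print(count)  # 2
```

The same pattern (parse, shape into tuples, bulk insert into staging) generalizes to XML or flat-file sources.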
Skilled in high-level design of SSIS packages for integrating data over OLE DB connections from heterogeneous sources (Excel, Oracle, Teradata, flat files) using SSIS transformations such as Data Conversion, Conditional Split, Bulk Insert, Merge, and Union All.
Involved in ETL development using Oracle tools, performing ETL testing for data validation.
Managed user roles, privileges, and schema for secure and efficient database access.
Extensive experience in developing a wide range of reports/visualizations using SSRS, and in modeling and optimizing SSAS cubes with different storage modes, aggregation designs, and processing techniques.
Developed advanced PL/SQL packages, procedures, triggers, functions, indexes, and collections to implement business logic using SQL Navigator.
Generated server-side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.
Experience in the field of Web Application Development including Client/Server with proficiency in HTML, CSS, JavaScript, jQuery, XML, C#, SQL Server, ASP.NET, Web Services and MVC.
Involved in creating operational reports in SSRS 2008/2008R2/2012, including Drill-Down, Drill-Through, Dashboard, Sub-Report, and Matrix reports for various domains.
Expertise in writing T-SQL queries, dynamic queries, sub-queries, and complex joins for building complex Stored Procedures, Triggers, User-Defined Functions, Views, and Cursors.
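A sketch of the dynamic-query pattern mentioned above, using Python's sqlite3 as a stand-in for SQL Server: identifiers come from a whitelist and values are always bound, which is the safe way to assemble dynamic SQL. The table, columns, and data here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                 [("Ann", "IT", 90000), ("Bob", "HR", 60000), ("Cy", "IT", 75000)])

ALLOWED_COLUMNS = {"name", "dept", "salary"}  # identifier whitelist guards against injection

def dynamic_filter(conn, column, value):
    """Build the WHERE clause dynamically: identifier from a whitelist, value bound."""
    if column not in ALLOWED_COLUMNS:
        raise ValueError(f"unknown column: {column}")
    sql = f"SELECT name FROM employees WHERE {column} = ? ORDER BY name"
    return [r[0] for r in conn.execute(sql, (value,))]

print(dynamic_filter(conn, "dept", "IT"))  # ['Ann', 'Cy']
```

In T-SQL the equivalent would be `sp_executesql` with typed parameters; only the identifier portion is interpolated, never the values.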
Experienced in designing and implementing logical and physical data models on Oracle, SQL Server, and DB2 platforms for data warehouses and data marts.
Extracted, transformed, and loaded millions of data rows using ODI 11g, including its installation, configuration, and administration. Strong analysis and design experience in Oracle, SQL Server, DB2, and UNIX.
Proficient in database optimization and in developing stored procedures, triggers, cursors, joins, and views with SQL on databases such as SQL Server, MySQL, and Oracle 10g.
Implemented scalable and resilient APIs using ASP.NET Core Web API and integrated with Salesforce REST API for data synchronization.
Knowledge of data access and delivery methods, analytics applications and tool portfolio, data sourcing and integration methods. Understanding of data warehouse, ETL, data cleansing, and architecture.
Strong knowledge of ETL processes using SQL, T-SQL, and SQL Loader.
Experienced supporting Very Large Databases (VLDB) and very large tables with millions of rows and troubleshooting the problems.
Expertise in Azure infrastructure management (Azure Web Roles, Worker Roles, SQL Azure, Azure Storage).
Implemented database normalization, configured Transactional and Snapshot replication, and performed log shipping as disaster recovery solutions.
Installed, configured, and administered Microsoft SQL Server on Windows Server environments.
Good working knowledge of integration tools such as SnapLogic and Oracle Warehouse Builder.
Provided technical leadership by mentoring junior developers, conducting code reviews, and enforcing best practices in SQL development and ETL design.
Integrated Exadata monitoring into the existing IT monitoring framework, and provided and implemented performance tuning recommendations for all components of the Exadata machines.
Performed software and firmware maintenance of the Exadata storage servers in the Database Machine, including routine health checks, patching, and upgrades per recommendations from Oracle Support.
Experienced in performance tuning, query optimization, client/server connectivity and Database Consistency Checks using DBCC utilities.
Expertise in using configuration management (version control) tools such as SVN, Git, Bitbucket, and Team Foundation Server (TFS).
Experienced in interacting with clients/users to gather user requirements, and excellent written and verbal communication skills.
Good communication, analytical, interpersonal skills, and ability to perform as part of a team as well as an individual under pressure.
TECHNICAL SKILLS:
Databases & PL/SQL Development
Relational Databases: Oracle (8i–19c), MySQL, SQL Server (2005/2008), DB2/UDB, PostgreSQL, SAP Tables, MS Access
PL/SQL Expertise: Advanced SQL, Stored Procedures, Functions, Packages, Triggers, Exception Handling, Bulk Processing, Dynamic SQL, Collections, Ref Cursors, Pipelined Functions
Query Performance & Optimization: Query Optimization, Indexing, Partitioning, Explain Plan, Optimizer Hints, Materialized Views, AWR, ADDM, TKPROF
ETL & Data Integration
ETL Tools & Processes: Informatica PowerCenter, Oracle Data Integrator (ODI), IBM Infosphere DataStage, SSIS, SQL*Loader
Data Pipelines & Warehousing: Star & Snowflake Schema Modeling, Erwin, Sybase PowerDesigner, OWB, Pandas, NumPy, PySpark, DataFrame manipulation
Automation & Scheduling: Unix Shell Scripting, Control-M, AutoSys
Reporting & Business Intelligence
BI & Reporting Tools: Power BI (DAX, M Query, Data Modeling), BI Publisher, SSRS, Crystal Reports, Tableau, Oracle APEX, Oracle Reports 6i/9i/10g
Enterprise Reporting & Visualization: Workflow Builder, XML Publisher, SAP Business Objects
Cloud & DevOps
Cloud Databases & Services: AWS RDS (Oracle), AWS Lambda, Azure SQL Database, Azure Synapse, Snowflake, GCP BigQuery, GCP Dataflow, Cloud-Based ETL Pipelines
DevOps & CI/CD: Jenkins, Git, Subversion (SVN), ClearCase, TFS
Programming & Scripting
Languages & Technologies: PL/SQL, SQL, Python, Unix Shell Scripting, Java, C/C++/C#, VB.NET, XML, JSON, Angular
Middleware & Integration: Oracle Forms/Reports, Oracle E-Business Suite (EBS), Oracle Fusion Middleware, Apache Airflow
Tools & Platforms
Development & Administration Tools: TOAD, SQL*Plus, SQL Developer, SQL*Loader, Oracle Enterprise Manager (OEM), PyCharm, Oracle Recovery Manager (RMAN), HP PPM, HP ALM/QC
Security & Performance Monitoring: AWR, ADDM, Oracle Data Guard, Database Security, Encryption
PROFESSIONAL EXPERIENCE:
Client: CenterLight Healthcare, Ridgewood, NY Dec 2022 – Present
Role: Sr. SQL Server/Oracle Database Developer
Designing and developing efficient ETL (Extract, Transform and Load) solutions using SQL Server Integration Services.
Building SSIS (SQL Server Integration Services) packages involving ETL process, extracting data from legacy systems using various formats such as flat files, Excel files, XML files, JSON files and loading it into SQL server.
Transformed complex business logic into Database design and maintaining it by using SQL objects like Stored Procedures, User Defined Functions, and Views.
Parsing simple and complex XML, JSON source files to SQL server.
Worked in two-week Sprints, provided story points/estimates to the Scrum Master, and delivered on time.
Optimized code and improved database efficiency through re-indexing, updating statistics, recompiling stored procedures, and other maintenance tasks.
Created Web APIs/REST APIs exposing business data as JSON/XML objects.
Developed SSIS packages for migrating SQL Server and Oracle databases to a star-schema data warehouse; each relational source was loaded into its own staging database before migration to the warehouse.
Optimized SQL and PL/SQL performance using tools like EXPLAIN PLAN, SQL*TRACE, TKPROF, AUTOTRACE, and Oracle Hints. Improved query performance through indexing, partitioning, and parallel processing.
Troubleshot and resolved Oracle performance issues, including data integrity concerns and query bottlenecks
Performed design/code reviews to ensure that designs and code adhere to customer standards and industry best practices.
Responsible to tune ETL procedures and schemas to optimize load and query Performance.
Worked with the architecture group to develop ETL metadata strategies and Informatica object-reuse policies; developed reusable Informatica Mapplets and Transformations.
Developed ETL technical specs, Visio for ETL process flow and ETL load plan, ETL execution plan, Test cases, Test scripts etc. Used Informatica Workflow Monitor to monitor and control jobs.
Migrated data from legacy systems (e.g., SQL Server) to Oracle Data Warehouse using custom procedures and functions.
Involved in production support activities with Installation and Configuration of Informatica Power Center 8.6.
Analyzed, Designed and Implemented the ETL architecture and generated OLAP reports.
Configured SSIS packages using package configuration wizard to allow packages run on different environments.
Designed and implemented logical and physical data models on Oracle, SQL Server, and DB2 for the data warehouse and data marts, preparing DDL and DML scripts and streamlining the process for changes.
Configured database performance monitoring using Oracle AWR and SQL Server Profiler and DMVs, identifying bottlenecks and optimizing database workloads.
Designed star and snowflake schemas for optimized data warehouse and OLAP cube performance.
Leveraged Python libraries such as pandas, NumPy, and scikit-learn within SSIS environments to perform advanced data manipulation, statistical analysis, and machine learning tasks.
Used T-SQL to develop DDL, DML and dynamic SQL daily.
Used Azure DevOps (ADO) for CI/CD pipelines, automating database deployments and ensuring version control.
Performed maintenance duties like performance tuning and optimization of queries, functions, and stored procedures. Used Teradata database with OLEDB source in SSIS and Informatica.
Developed ETL processes using Oracle tools (SQL*LOADER, PL/SQL, Informatica), and performed ETL testing for data validation.
Coded UNIX shell scripts to execute Oracle Custom Packages, Procedures, and OWB ETL mappings on a scheduled basis.
Configured high availability for Azure SQL databases using Always On Availability Groups and Azure SQL Elastic Pools.
Created various Crystal reports, involving cross tabs, sub-reports, charts, and graphs.
Coded OWB ETL mappings for data movement and resolved QA Trackers.
Collaborated with stakeholders to maintain a metadata repository and performed metadata validation, reconciliation, and error handling in ETL processes. Unit tested and tuned SQLs and ETL Code for better performance.
Migrated data from Oracle to Snowflake, reducing data processing time by 30% and ensuring seamless integration.
Designed high level ETL architecture for overall data transfer from the OLTP to OLAP with the help of SSIS.
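The OLTP-to-OLAP transfer described above can be sketched as a minimal star-schema load. This is an illustrative sketch only, with sqlite3 standing in for the warehouse and hypothetical table names and source rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Minimal star schema: one dimension, one fact table
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT UNIQUE)")
cur.execute("CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER, amount REAL)")

# OLTP-style source rows (hypothetical)
source = [("Widget", 2, 20.0), ("Gadget", 1, 15.0), ("Widget", 3, 30.0)]

for name, qty, amount in source:
    # Dimension lookup-or-insert: generates a surrogate key per product
    cur.execute("INSERT OR IGNORE INTO dim_product (product_name) VALUES (?)", (name,))
    key = cur.execute("SELECT product_key FROM dim_product WHERE product_name = ?",
                      (name,)).fetchone()[0]
    cur.execute("INSERT INTO fact_sales VALUES (?, ?, ?)", (key, qty, amount))

# OLAP-style aggregate over the star schema
report = cur.execute("""
    SELECT d.product_name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.product_name ORDER BY d.product_name
""").fetchall()
print(report)  # [('Gadget', 15.0), ('Widget', 50.0)]
```

In SSIS the lookup-or-insert step corresponds to a Lookup transformation with an insert path for unmatched dimension rows.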
Created various Documents such as Source-To-Target Data mapping Document, Unit Test Cases and Data Migration Document.
Responsible for development, support, and maintenance of ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.
Created tables, indexes, constraints, views, materialized views, DML, and stored procedures on SQL Server in Dev environments, then tested and promoted the scripts to higher environments as part of PSR work (user stories).
Extensively used SSIS transformations such as Lookup, Derived column, Data conversion, Aggregate, Conditional split, SQL task, Script task, Send Mail task, etc.
Used Pull Request to Create/ Review/Merge scripts in Git repository on Azure Repos.
Developed and optimized T-SQL queries and stored procedures in Azure Synapse SQL Pools to extract insights from large volumes of structured and semi-structured data, leveraging advanced analytics capabilities such as window functions and in-memory processing.
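The window-function analytics mentioned above follow the same pattern in any SQL dialect. A small sketch using sqlite3 (SQLite 3.25+ as bundled with modern Python), with hypothetical sales data, showing the common top-row-per-group idiom:

```python
import sqlite3  # SQLite 3.25+ (bundled with modern Python) supports window functions

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, rep TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("East", "Ann", 500.0), ("East", "Bob", 300.0),
    ("West", "Cy", 400.0), ("West", "Dee", 700.0),
])

# Top seller per region via ROW_NUMBER(), the same pattern used in T-SQL
top = conn.execute("""
    SELECT region, rep FROM (
        SELECT region, rep,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    ) WHERE rn = 1 ORDER BY region
""").fetchall()
print(top)  # [('East', 'Ann'), ('West', 'Dee')]
```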
Utilized Azure Data Factory's monitoring and logging features to track pipeline execution, monitor data quality, and troubleshoot performance issues, ensuring reliable and robust data integration solutions.
Used Execution Plan, SQL Profiler and Database Engine Tuning Advisor to optimize queries and enhance the performance of databases.
Environment: SQL Server 2018, Sybase ASE, Visual Studio 2008/2010/2012/2015/2017, C#, MVC, XML, SSIS, T-SQL, DAX, Informatica, MDX, REST API, PowerShell, Python, JSON, TFS, IIS, Azure, Azure Data Lake, Synapse, Business Intelligence Development Studio, OLE DB, Git, Git Bash, SVN, Bitbucket.
Client: Fiserv, Parsippany, NJ July 2021 - Nov 2022
Role: Sr. SQL Server/Oracle Database Developer
Created Mappings using Designer, Extracted the data from the flat files and other RDBMS databases into staging area and loaded onto Data Warehouse.
Developed Mappings by using Aggregator, SQL Overrides by using Lookups, source filter by using Source Qualifiers and Data Flow Management into multiple targets using Router Transformation.
Generated tabular, matrix, drill-down, drill-through, summary, sub-, detailed, parameterized, and linked reports using Tableau.
Performance-tuned stored procedures and SQL queries using SQL Profiler and the Index Tuning Wizard. Migrated data from text, CSV, and Excel files to SQL Server. Analyzed existing functionality and database design.
Created indexes on the tables for faster retrieval of the data to enhance database performance.
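The effect of such an index can be confirmed from the optimizer's plan. A minimal sqlite3 sketch (table and index names are hypothetical); in SQL Server the equivalent check is the execution plan showing an Index Seek:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, float(i)) for i in range(1000)])

# Index on the column used in the WHERE clause
conn.execute("CREATE INDEX ix_orders_customer ON orders (customer_id)")

# Confirm the optimizer uses the index instead of a full table scan
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()
# Last column is the plan detail; it mentions the index,
# e.g. 'SEARCH orders USING INDEX ix_orders_customer (customer_id=?)'
print(plan[0][-1])
```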
Created (or edited existing) Informatica mappings and sessions to accomplish data migrations from various sources such as Oracle, SQL Server 2000, DB2 and flat files and loaded into Data Warehouse.
Optimized various Oracle/Sybase/DB2 maintenance and application related, shell/SQL jobs and reports.
Automated job scheduling using Control-M, Autosys, and Informatica Scheduler for efficient task execution.
Built efficient SSIS packages for processing fact and dimension tables with complex transformations, and implemented error thresholds using precedence constraints and variables in SSIS packages, with business rules as reference.
Worked on Data Marts, data warehousing, Operational Data Stores (ODS), OLAP, and dimensional data modeling (Star Schema, Snowflake Schema, fact and dimension tables) using MS Analysis Services.
Developed and optimized ETL processes using Oracle tools (SQL*Loader, PL/SQL) and Informatica PowerCenter for data extraction, transformation, and loading.
Integrated AutoSys with Oracle databases, ensuring efficient execution of stored procedures, SQL scripts, and ETL workflows.
Worked on data migration from a DB2 legacy system to Oracle 11g and automated reconciliation between the legacy system and the new data warehouse.
Involved in automating SSIS Packages using SQL Server Agent Jobs. Created an ETL summary report which consists of summary information for all the loading activities done each day and month.
Designed and developed various SSIS (ETL) packages to extract and transform data, and scheduled SSIS packages. Created packages using control flow tasks such as the Data Flow Task and Execute SQL Task, and created SSIS packages/projects in Visual Studio to move data from source to destination programmatically.
Worked on various technologies such as Oracle, Access DB, Excel/Flat files, using them as sources and targets in ODI.
Collaborated with development teams to integrate Power BI capabilities into existing applications, expanding the reach of data analytics across the organization.
Developed custom applications with embedded Power BI reports, providing tailored solutions for specific business needs and enhancing user experience.
Identified opportunities for optimization in data storage and query performance, leading to a 20% reduction in resource consumption and associated costs.
Developed and documented certified processes/procedures for all potential IT downstream support groups to maintain and upgrade the Exadata Database Machine.
Integrated Exadata monitoring and escalation into existing IT monitoring and actionable-events solutions.
Administered DB2 Database on Linux, UNIX, and Windows, including audit configuration, SQL tuning, and security reviews. Created source, target, and process models to load data with DB2 Data warehouse Center.
Extracted and transformed source data from databases such as Oracle, SQL Server, and DB2, and from flat files, into Oracle.
Supported the loan origination domain: built data processes for evaluating borrower creditworthiness during the loan application workflow and for supplying underwriters with the data needed to assess lending risk.
Designed and optimized data models for a complex dataset, resulting in a more streamlined and efficient reporting process.
Configured Siebel tables, Business Components, Integration Objects, web services and workflows to fetch data through SQL queries and transfer the information from Siebel to PeopleSoft application through SOA middleware.
Designed, developed, and implemented backend Redshift tables and ETL jobs in SnapLogic and Informatica to transform data from varied source systems.
Performed unit and integration testing of SnapLogic pipelines and performance improvements; created tasks for scheduling the ETL pipelines in SnapLogic.
Integrated Azure SQL Database with Azure Blob Storage and Azure Redis Cache for efficient data storage and retrieval.
Configured Siebel Links, Joins, Multi-Value Groups, Static and Dynamic Drilldowns, Toggle Applets, Views, Integration Objects, Web Services, Data Mapping, Workflows, tables, and Jobs to meet client requirements.
Created Stored Procedures, User-Defined Functions, Views, and T-SQL scripts for complex business logic.
Involved in developing a test framework using Python.
Developed ASP.NET web applications and .NET components hosted on Windows 2000.
Excellent knowledge of Delegates, Assemblies, User Controls, and Custom Controls in C#/.NET.
Worked on .NET security features such as Authentication & Authorization, Forms-based Authentication, Authorizing Users, Roles, and User Account Impersonation.
Environment: Tableau Desktop 8/9/10/2018/2019/2020, Tableau Server 8/9/10/2018/2019/2020, MS SQL Server 2014/2012, Microsoft Visual Studio 2012, XHTML, XML, MS SQL Query Analyzer, MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), Power BI, ASP.NET, C#, T-SQL.
Client: CNA Insurance, Chicago, IL May 2020 – June 2021
Role: SQL Server/Oracle Database Developer
Used transformations like data formatting, check points, error handling, email notifications in SSIS packages.
Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming, and loading data into data warehouse.
Created SSIS Packages using SSIS Designer for export heterogeneous data from OLE DB Source, Excel Spreadsheet to SQL Server 2008/2012.
Developed business intelligence solutions using SQL server data tools 2015 versions and load data to SQL & Azure Cloud databases.
Provided guidance for the team in developing optimized, high quality code deliverables, performing code reviews and unit test plan reviews, and conducting independent testing.
Followed the Software Development Life Cycle (SDLC) including Requirement Gathering, Design, Coding and Unit Testing.
Integrated SQL Server Integration Services (SSIS) packages with Azure Data Factory pipelines using the SSIS Integration Runtime, enabling seamless migration and execution of existing ETL workflows in the cloud.
Troubleshot production bugs and provided RCAs.
Involved in modifying existing Informatica packages and executing them through batch files.
Troubleshot and fixed performance issues in SQL code.
Monitored and troubleshot failed SSIS packages and optimized SSIS package performance to meet the expected data-loading timeframes.
Designed and created data extracts and reporting applications in Power BI, Tableau, and other visualization tools.
Integrated SQL Server Integration Services (SSIS) packages with Azure Synapse pipelines using Synapse Spark to orchestrate ETL workflows, enabling seamless data movement and transformation across on-premises and cloud data sources.
Designed Snaplogic ETL/ELT template for different source to snowflake and SQL Server data loading.
Migrated SSIS package into Snaplogic pipeline for data cleaning, lookup, and ETL solution.
Conducted performance tuning and query optimization. Provided Onsite and Offshore coordination.
Wrote and managed T-SQL, Stored Procedures, Functions, Triggers and SSIS Packages to meet the business requirements.
Involved in creating tags and database related release activities and fixing the deployment failures.
Worked on incident tickets related to database issues in production.
Granted the application and ETL teams read and write access to database tables in the dev environment and promoted the grants to higher environments.
Performed T-SQL and Workflows performance tuning and optimization of queries for reports that take longer execution time.
Responsible for strategic planning and execution, performance, deliverables quality and schedule of data migration work stream.
Led the Migration project, converted package deployment model to project deployment model.
Responsible for testing, fixing the bugs and troubleshooting the technical problems.
Environment: Visual Studio 2015, C#, MVC, SQL Server 2014/2016, XML, SSIS, SSRS, Power BI, Python, Tableau, Informatica, T-SQL, DAX, MDX, PowerShell, Azure, TFS, IIS, SVN, GIT.
Client: Intuit, Bangalore, India Oct 2017 – July 2019
Role: SQL Server Developer
Created various Tabular, cross tab, Matrix and Ad-hoc reports using SSRS.
Designed and implemented complex SSIS package to migrate data from multiple data sources for data analysis.
Performed report design and coding for standard tabular reports, including drill-down and drill-through functionality and graphical presentations such as charts and dashboard-type metrics.
Generated drill-down and drill-through reports in SSRS with drop-down menu options, data sorting, and subtotals.
Performed ETL process by pulling large volumes of data using SSIS, BCP into the staging database from various data sources like Access, Excel.
Scheduled Jobs for executing the stored SSIS packages which were developed to update the database.
Created Error and Performance reports on SSIS Packages, Jobs, Stored procedures, and Triggers.
Created alerts, notifications and emails for system errors, insufficient resources, fatal database errors and hardware errors. Used BCP command and TSQL command to bulk copy the data from Text files to SQL Server and vice versa.
Developed Informatica jobs to import data from legacy mainframes into CRM Siebel staging tables.
Developed Siebel EIM jobs to load data into Siebel CRM.
Worked on setting up agents on Exadata X6 machines for monitoring.
Served as a primary technical resource for SQL tuning in Exadata production and performance-testing environments, using statistics, hints, SQL tuning sets, etc.
Designed and optimized data warehouse structures, including fact and dimension tables, using star and snowflake schemas.
Provided post-production support for bug fixes and enhancements related to the Informatica and Siebel EIM data loads.
Worked on Visual Studio 2017 / .Net 4.6 Framework to implement Business Logic.
Worked on .NET security features such as Authentication & Authorization, Forms-based Authentication, Authorizing Users, Roles, and User Account Impersonation.
Created Stored Procedures, T-SQL, Triggers, Views, and Functions for the database Application.
Defined the report layout and identified Datasets for the report generation. Built drill down reports, and parameterized reports using SSRS. Extensively used Report Builder and Report Manager in SSRS.
Wrote complex SQL, PL/SQL functions, procedures, and packages, and created Oracle objects: tables, materialized views, triggers, synonyms, user-defined data types, nested tables, collections, and dynamic SQL.
Environment: MS SQL Server 2008 R2, SQL Server Integration Services (SSIS), SSAS, SSRS, OLTP, Power BI, Business Intelligence Development Studio, SQL Server Profiler, SQL Server Management Studio, Excel, TFS
Client: Intergraph, India July 2014 – Sep 2017
Role: SQL Developer
Developed and maintained complex ETL (Extract, Transform, Load) processes using SQL Server Integration Services (SSIS) for seamless integration with Snowflake.
Worked on Visual Studio / .NET Core to implement business logic.
Created Web APIs/REST APIs exposing business data as JSON objects for the admin portal.
Worked mainly with Python scripting to code predictive models and perform data wrangling. Monitored and analyzed daily/monthly/quarterly/yearly sales reports to understand product and market trends.
Created and optimized complex T-SQL queries and stored procedures to encapsulate reporting logic.
Managed the end-to-end middleware functionality and data loads in Salesforce Data Loader for the Veeva CRM application.
Expertise in writing complex DAX functions in Power BI and Power Pivot.
Designed and documented Architecture of Power BI POC.
Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and formatted the results into report.
Designed and developed data validation and load processes and test cases using Oracle PL/SQL Stored Procedures, Functions, and Triggers.
Used OAuth authentication in ASP.NET MVC web pages to authenticate against user credentials.
Developed SQL/SSIS-based data integration solutions to synchronize Salesforce CRM data with on-premises databases, ensuring seamless data exchange and consistency between Salesforce and internal systems.
Generated SQL and PL/SQL scripts to install, create, and drop database objects, including tables, views, primary keys, indexes, constraints, packages, sequences, grants, and synonyms.
Used different transformations like looping, data formatting, check points, error handling, email notification, etc. in SSIS packages.
Worked on query optimization, index tuning, caching, and buffer tuning to improve performance of databases.
Worked on data modeling, creating star and snowflake schemas.
Performed T-SQL, SSIS (SSDT) Package and Workflows performance tuning and optimization of queries for reports that take longer execution time.
Expertise in performance tuning of Data Flow Task in SSIS Packages.
Created and maintained database objects like complex Stored Procedures, Triggers, Cursors, and Tables, Views and SQL Joins using SQL Server.
Gathered requirements and designed and developed Informatica ETL loads for the OLAP data warehouse and Siebel EIM.
Performed Siebel EIM configuration and data mapping.
Participated in deploying the application on the IIS server.
Conducted regular system audits and updates to maintain optimal performance and compliance with industry standards.
Troubleshot SSIS packages and fixed issues in the development environment.
Used Team Foundation Server for Source Code Control, project related document sharing and team collaboration.
Environment: Visual Studio 2015, .NET CORE, MS SQL Server, Oracle 11g, SSRS, PL/SQL, T-SQL, TFS, REST API, SQL Server Data tools (SSDT), Salesforce, Python, C#, JSON, ASP.NET, MVC, SSIS.