Post Job Free

Engineer II SQL Server

Location:
Cumming, GA
Posted:
January 08, 2025

Contact this candidate

Resume:

Sai Dutt Madisetty

801-***-****

Backend Engineer II – Data Architect

PROFESSIONAL SUMMARY

●Over twenty-one years of IT experience in Design, Development, Customization, Integration and Implementation of client/server and E-Business applications using Oracle 7.1/8.0/8i/9i/10g/11g (PL/SQL Packages, Procedures, Functions and Triggers), Oracle 8i Replication, Pro*C, Developer/2000 (Forms 4.5/5/6i/9i/10g, Reports 2.5/5/6i/9i/10g, Graphics 2.5/6i), SQL Server 2008, and PostgreSQL 9.5.

●Good functional knowledge on domains like Oracle Financial Applications 11i, r12 (AP, AR, GL), Oracle Adverse Event Reporting System 4.5, HealthCare Information System, Finance Management System, Material Management System, and Oracle ATG.

●Good exposure to GAP Analysis, Identifying the Design & Functional inconsistencies, Functional Integration, and Data Integration.

●Experience in designing and implementing data pipelines using AWS services such as AWS Glue, AWS Data Pipeline, and AWS Step Functions for ETL (Extract, Transform, Load) processes.

●Experience in managing appropriate data storage solutions such as Amazon S3, Amazon RDS, Amazon Redshift, or Amazon DynamoDB based on data requirements and access patterns.

●Experience in using EKS, S3, Glacier, FSx, RDS, DynamoDB, Aurora, DocumentDB, Redshift, EMR, QuickSight, Kinesis, and Athena.

●Experience in integrating data from various sources, including databases, APIs, and third-party services, using services like AWS Lambda or AWS Glue.

●Experience in setting up monitoring and logging for data workflows using AWS CloudWatch and AWS CloudTrail to track performance and troubleshoot issues.

●Experience in using GCP Compute Engine, Cloud Functions, GKE, Cloud Storage, Cloud SQL, Firestore, BigQuery, Dataflow, and Dataproc.

●Experience in Data ingestion from source systems like Oracle, SQL Server using Azure data factory, and building data pipelines.

●Experience creating Notebook using Azure Databricks, Delta Lake, Spark SQL, Spark Structured Streaming, and Direct Streaming using PySpark.

●Experience in creating Kafka producers, consumers, and topics to stream data from OLTP systems to an analytics server, making applications real time instead of batch-driven.

●Experience working with the Parquet, Avro, Delta, and ORC formats, and good understanding of Spark internals and distributed computing.

●Experience in developing data transformation and other analytical applications in Spark and Spark-SQL using Python programming language (PySpark).

●Implemented data migration and data cleansing using Python and shell scripting, along with database design, data analysis, development, and data conversions.

●Created and managed database objects including tables, indexes, tablespaces, views, users, roles, privileges, and materialized views to meet Module requirements.

●Expertise in performance tuning and SQL optimization using Explain Plan and TKPROF.

●Good Exposure to Informatica, ETL scripts, Data warehousing concepts, Star schema, Data modeling, and Cognos Reporting Tool.

●Analyzed system requirements and created Replicated objects using Materialized views.

●Experience in production support, troubleshooting of database applications, and user training.

●Experience in developing interfaces for data transfer/data loading using PL/SQL programs, SQL*Loader, and Unix shell scripting.

●Expertise in developing and customizing SQL server procedures, functions, and SSIS Packages.

●Expertise in developing and customizing PL/SQL procedures, functions, packages, triggers, partitioned tables, object tables, XML tables, nested tables, forms, and reports using Forms 4.5/6i/9i/10g, Reports 2.5/6i/9i/10g, and Pro*C.

●Extensively used Oracle-supplied packages DBMS_SQL, DBMS_LOB, UTL_FILE, DBMS_JOB, and UTL_HTTP.

●Expertise in developing and deploying complex matrix and graphic reports on web front presentation layer using Report Builder 10g.

●Expertise in tuning, debugging, problem analysis, and giving the best solutions.


●Strong interpersonal skills, self-motivated with the ability to lead and drive teams.

●Experience in leading teams of four to six people; worked on multiple projects, providing support and enhancements per module requirements.
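The Kafka bullet above (streaming OLTP changes to an analytics server) rests on a message encoding for change events. A minimal sketch of that encoding in Python, with illustrative names (`encode_event`, the `orders-changes` topic) that are not from any project listed here; the actual producer call is shown only as a comment since it needs a running broker:

```python
import json

def encode_event(table, op, row):
    """Serialize an OLTP change (table, operation, row dict) as a JSON Kafka value."""
    return json.dumps({"table": table, "op": op, "row": row}, sort_keys=True).encode("utf-8")

def decode_event(value):
    """Deserialize a Kafka message value back into a change dict."""
    return json.loads(value.decode("utf-8"))

# With a kafka-python-style client and a broker available, the producer side
# would look roughly like:
# producer = KafkaProducer(bootstrap_servers="localhost:9092")
# producer.send("orders-changes", encode_event("orders", "INSERT", {"id": 1, "total": 9.99}))

msg = encode_event("orders", "INSERT", {"id": 1, "total": 9.99})
event = decode_event(msg)
print(event["table"], event["op"])  # orders INSERT
```

Keeping the value a self-describing JSON document is one common choice; schema-registry formats like Avro (also listed above) serve the same role with stronger typing.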

TECHNICAL SKILLS

Cloud Technology:

Azure, AWS and GCP.

Message Streaming:

Kafka, GCP Pub/Sub, AWS Kinesis

Databases:

Oracle 7.1, Oracle 8i, Oracle 9i, 10g, 11g, SQL Server 7.0/2008, PostgreSQL 9.5 (ora2pg, pgAdmin, pgAdmin debugger), Hive, Delta Lake.

Big Data:

Hadoop, MapReduce, Spark, HDFS, HBase, Pig, Hive, Sqoop, Kafka, Hue, Cloudera, Hortonworks, Oozie, and Airflow.

Tools:

Developer Suite (Forms 4.5/5.0/6i/10g, Reports 2.5/3.0/6i/10g, Graphics 2.5/6i), Visual Basic 6.0, FrontPage, Crystal Reports, TOAD, Visual InterDev, Oracle 10g Warehouse Builder, Oracle Discoverer 10g, SQL Server Integration Services (SSIS), Azure Data Factory, Azure Databricks.

Languages:

SQL, PL/SQL, C++, Java, JavaScript, JSP, Unix shell script, Python, Scala.

ERP:

MEDICOM 2.1, IBA eHIS (formerly MEDICOM 4.1), Oracle Financials 11i, Oracle Commerce, Oracle Adverse Event Reporting System 4.5.

WORK EXPERIENCE

Chicos FAS Inc.

September 2013 – Present

Backend Engineer II – Data Architect

●Involved in Software Analysis, design, development, implementations, and technical support

●Gathered requirements and business rules and documented them in Jira and Confluence.

●Involved in coding and testing of migration of SQL server to Oracle.

●As a data architect, provided solutions to the current system moving from on-premises to Cloud platform.

●Collaborated with cross-functional teams, stakeholders, vendors and management for the smooth functioning of the enterprise data system.

●Developed data models and was responsible for data acquisition, access analysis, and design.

●Involved in data integration with existing warehouse structures and continuous improvements to system functionality.

●Involved in reviewing business requests for data and data usage.

●Developed and maintained all standards in accordance with data structures and models.

●Developed and modified SSIS Packages to integrate CRM application with ATG eCommerce application.

●Designed and implemented data pipelines using AWS services such as AWS Glue, AWS Data Pipeline, and AWS Step Functions for ETL (Extract, Transform, Load) processes.

●Managed data storage solutions such as Amazon S3, Amazon RDS, Amazon Redshift, and Amazon DynamoDB based on data requirements and access patterns.

●Used EKS, S3, Glacier, FSx, RDS, DynamoDB, Aurora, DocumentDB, Redshift, EMR, QuickSight, Kinesis, and Athena.

●Integrated data from various sources, including databases, APIs, and third-party services, using AWS Lambda and AWS Glue.

●Set up monitoring and logging for data workflows using AWS CloudWatch and AWS CloudTrail to track performance and troubleshoot issues.

●Used GCP Compute Engine, Cloud Functions, GKE, Cloud Storage, Cloud SQL, Firestore, BigQuery, Dataflow, and Dataproc.

●Created data pipelines using Azure data factory and Notebook using Azure Databricks.

●Used Delta Lake, Spark SQL, Spark Structured Streaming, and direct streaming with PySpark.

●Created Kafka producers, consumers, and topics to stream data from OLTP system to analytics server to make the application real time instead of batch processing.

●Involved in interaction for collecting specifications, design, development, technical reviews, coding, testing, gap analysis, and implementation of new projects.

●Developed logical and physical design and created database objects including tables, indexes, views, and materialized views.

●Manually converted SQL server procedures, functions and SSIS packages to PL/SQL procedures, functions, and packages.

●Using EXPLAIN PLAN and TKPROF utilities, assessed efficiency of the modules.

●Developed Unix shell scripts and SQL*Loader scripts for loading data.

●Analyzed system design and created Replicated objects.

●Extensively used advanced features of PL/SQL like collections, varray, and Dynamic SQL.

●Improved database performance using appropriate hints, rewrote poorly performing queries, and optimized packages and bulk operation features.

●Developed and customized SQL Server and PL/SQL packages, procedures, functions, and triggers.

●Automated application using Scheduler Package in Oracle Database 11g.

●Involved in integration between Oracle e-Commerce and Manhattan.

●Used Bulk Collections for better performance and easy retrieval of data by reducing context switching between SQL and PL/SQL engines.

●Extensively used version control tools such as Git, Subversion, and Bitbucket.

●Provided support in troubleshooting production database application and reporting problems.

●Environment: Oracle 11g, SQL Server 2008, SSIS, Oracle Commerce (ATG), Spark Structured Streaming, Spark Direct Streaming, Spark SQL, Spark Core, Scala, PySpark, Kafka, Delta Lake, Hive, Azure Data Factory, Azure Databricks, shell script, Python, GCP, BigQuery, Cloud Functions, Cloud Storage, Pub/Sub.
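The bulk-collection bullet above rests on one idea: fetch rows in batches so the engine switches context once per batch instead of once per row (in Oracle, `BULK COLLECT ... LIMIT`). This sqlite3 sketch illustrates the batching pattern only; it is a Python analogue, not the PL/SQL code itself, and the `orders` table is invented for illustration:

```python
import sqlite3

# In-memory stand-in for an OLTP table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO orders (total) VALUES (?)",
                 [(float(i),) for i in range(1000)])

cur = conn.execute("SELECT id, total FROM orders")
batches = 0
rows_seen = 0
while True:
    batch = cur.fetchmany(200)   # analogous to BULK COLLECT ... LIMIT 200
    if not batch:
        break
    batches += 1
    rows_seen += len(batch)

print(batches, rows_seen)  # 5 1000
```

The batch size trades memory for round trips, which is the same tuning knob the `LIMIT` clause exposes in PL/SQL.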

RapidIT

August 2011 – August 2013

Sr. Analyst Application Developer

●Involved in Software Analysis, design, development, implementations, and technical support.

●Involved in coding and testing of Asset Data Services.

●Involved in business user interaction for collecting specifications, design, development, technical reviews, coding, testing, gap analysis, and implementation of new release for Asset Data Services.

●Analyzed requirements of Workflow for Asset Data Services.

●Developed logical and physical design and created database objects including tables, indexes, views, and materialized views.

●Involved in implementation, Acceptance testing, and customization per enhanced requirements.

●Using EXPLAIN PLAN and TKPROF utilities, assessed efficiency of the modules.

●Developed Unix shell scripts and SQL*Loader scripts for loading data.

●Analyzed system design and created Replicated objects.

●Extensively used advanced features of PL/SQL like collections, varray, and Dynamic SQL.

●Improved database performance using appropriate hints, rewrote poorly performing queries, and optimized packages using materialized views, partitioning, indexing, and bulk operation features.

●Developed and customized PL/SQL packages, procedures, functions, and triggers.

●Automated application using Scheduler Package in Oracle Database 11g.

●Prepared documentation and user manuals, trained users, and was involved in validation and user acceptance testing.

●Provided support in troubleshooting production database application and reported problems.

●Environment: Oracle 11g on UNIX, Oracle 10g Forms and Reports, Informatica, Java, JSP, JavaScript.
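Several bullets above describe assessing module efficiency with EXPLAIN PLAN and TKPROF. Oracle's tooling can't be reproduced here, but SQLite's `EXPLAIN QUERY PLAN` illustrates the same workflow of checking a query's access path before and after adding an index; the `assets` table and index name below are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE assets (id INTEGER PRIMARY KEY, owner TEXT, value REAL)")

def access_path(sql):
    """Return the planner's one-line description of how a query will run."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)  # last column holds the detail text

# Before the index: the planner has no choice but a full-table scan.
before = access_path("SELECT * FROM assets WHERE owner = 'acme'")

conn.execute("CREATE INDEX idx_assets_owner ON assets(owner)")

# After the index: the planner switches to an index search on owner.
after = access_path("SELECT * FROM assets WHERE owner = 'acme'")

print(before)
print(after)
```

The Oracle workflow is analogous in spirit: run EXPLAIN PLAN, confirm the expected access path (index range scan vs. full scan), then verify actual run-time behavior with TKPROF traces.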

CVS Caremark (May 2009 – July 2011)

Mutex Systems Inc. (April 2007 – April 2009)

RxAmerica (August 2008 – April 2009)

Sr. Analyst Application Developer

●Involved in Software Analysis, design, development, implementations, and technical support.

●Involved in coding and testing of Explanation of Benefits module.

●Involved in business user interaction for collecting specifications, design, development, technical reviews, coding, testing, gap analysis, and implementation of Explanation of Benefits.

●Designed the module so that EOBs would be generated even if the database did not support the Spanish language.

●Analyzed requirements of Workflow for Explanation of Benefits.

●Developed logical and physical design and created database objects including tables, XML tables, indexes, views, and materialized views.

●Used ora2pg for migration from Oracle to PostgreSQL, troubleshot and fixed conversion issues.

●Manually converted Oracle packages to PL/pgSQL functions and procedures that could not migrate with ora2pg.

●Used pgAdmin 4 for development.

●Involved in implementation, Acceptance testing, and customization per enhanced requirements.

●Used EXPLAIN PLAN and TKPROF utilities to assess efficiency of the modules.

●Developed Unix shell scripts and SQL*Loader scripts for loading data.

●Involved in enhancement of Replicated Database design.

●Analyzed system design and created Replicated objects.

●Improved database performance using appropriate hints, rewrote poorly performing queries, and optimized packages using materialized views, partitioning, indexing, and bulk operation features.

●Developed and customized PL/SQL packages, procedures, functions, triggers, forms, and reports using Forms 10g and Reports 10g.

●Automated application using Scheduler Package in Oracle Database 10g.

●Developed and deployed ad-hoc reports, interactive reports, and graphics on the web front presentation layer using SQR to help senior management.

●Prepared documentation and user manuals, trained users, and was involved in validation and user acceptance testing.

●Provided support in troubleshooting production database application; reported problems.

●Environment: Oracle 11g on UNIX, SQL Server 2008, PostgreSQL 9.4 (ora2pg, pgAdmin, pgAdmin debugger), SSIS, Informatica, Java, JSP, JavaScript.
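The data-loading bullets above mention SQL*Loader scripts, which consume delimited feed files against a control-file column mapping. SQL*Loader itself can't run here, but the core idea (parse a delimited feed, bulk-insert it into a table) can be sketched with the stdlib; the `claims` feed and columns below are invented for illustration:

```python
import csv
import io
import sqlite3

# Stand-in for the kind of delimited feed a SQL*Loader control file would describe.
feed = io.StringIO("claim_id,member,amount\n101,A. Diaz,12.50\n102,B. Lee,40.00\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, member TEXT, amount REAL)")

# DictReader maps header names to values, like a control file's column list;
# executemany does the bulk insert in one pass.
reader = csv.DictReader(feed)
conn.executemany(
    "INSERT INTO claims (claim_id, member, amount) VALUES (:claim_id, :member, :amount)",
    reader,
)

count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM claims").fetchone()
print(count, total)  # 2 52.5
```

In practice the Unix shell scripts mentioned above would wrap such a load: fetch the feed, invoke the loader, then check row counts and reject files.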

Schering

July 2007- July 2008

Sr. Oracle Application Developer

●Responsible for mentoring a team of three developers and two Business analysts.

●Documented user requirements; responsible for updating and maintaining project documentation per the SDLC.

●Involved in technical reviews of functional and design specifications.

●Good Exposure to Oracle Clintrial Database.

●Developed and customized PL/SQL packages, procedures, functions, triggers, forms, and reports using Reports 10g for the CARES Project.

●Improved database performance using appropriate hints, rewrote poorly performing queries, and optimized packages using materialized views, partitioning, indexing, and bulk operation features.

●Monitored and analyzed performance of the database using Automatic Workload Repository.

●Designed and created materialized views with partitioning for better performance for queries and reports.

●Developed logical and physical design and created database objects including tables, XML tables, indexes, views, and materialized views.

●Created XML Queries to generate output in XML Format.

●Created tables with CLOB columns to store the generated XML output.

●Extensively used the Oracle XML APIs, UTL_HTTP, and DBMS_LOB.

●Automated application using Scheduler Package in Oracle Database 10g.

●Developed SQL *Loader scripts for loading data.

●Developed and deployed interactive reports and graphics on the web front presentation layer using Oracle Report Builder 10g to help senior management.

●Generated plain text reports, graphics, Excel, and PDF output.

●Designed, developed, and deployed interactive multi-tab, multi-stack, master-detail Oracle Forms for application using Oracle Forms 10g.

●Provided support in troubleshooting production database applications and reporting problems.

●Prepared documentation and user manuals and trained users; involved in validation and user acceptance testing.

●Environment: Oracle 10g on UNIX, Oracle Reports 10g, Oracle Forms 10g, Oracle Discoverer 10g, and TOAD.
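The XML bullets above describe generating XML output from query results and storing it in CLOB columns. The Oracle side used the XML APIs and DBMS_LOB; this stdlib sketch only shows the rows-to-XML-document step, with rows and element names invented for illustration:

```python
import xml.etree.ElementTree as ET

# Illustrative rows; the real data came from the project's clinical tables.
rows = [
    {"id": "1", "drug": "X", "event": "headache"},
    {"id": "2", "drug": "Y", "event": "nausea"},
]

root = ET.Element("cases")
for r in rows:
    case = ET.SubElement(root, "case", id=r["id"])
    ET.SubElement(case, "drug").text = r["drug"]
    ET.SubElement(case, "event").text = r["event"]

# The serialized document string is what would be written into a CLOB column.
doc = ET.tostring(root, encoding="unicode")
print(doc)
```

Serializing to a single document string before the insert mirrors the CLOB approach: the database stores one self-contained XML payload per row rather than shredded columns.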

IBA Health Limited (aka Medicom Solutions)

Description: MEDICOM is a healthcare product implemented for the clients mentioned in the Clients section. It is an integrated solution with Billing, EMR, Inpatient, Outpatient, Medical Records, and Finance modules.

October 2002 – February 2007

Product Development Lead

Clients: Lahad Datu Hospital (Malaysia), American Hospital (Dubai), Gauteng Province Hospital (South Africa), Siriraj Hospital (Thailand)

●Involved in development, testing of Blood Donor Management System module, and enhancements of Laboratory management system module for Malaysia Project using Oracle 6i Forms Developer.

●Involved in user interaction for collecting specifications, design, development, coding, testing, gap analysis, implementation of Clinician Access, and Risk Management module for American Hospital (Dubai).

●Analyzed requirements of Workflow for Clinician Access and Risk Management mechanism.

●Involved in implementation, acceptance testing, and on-site customization per the enhanced requirements.

●Used EXPLAIN PLAN and TKPROF utilities to assess efficiency of the modules.

●Involved in enhancement of Replicated Database design.

●Analyzed system design and created Replicated objects.

●Used PVCS version control management software.

●Extensively used the Oracle XML APIs, UTL_HTTP, and DBMS_LOB.

●Developed and customized PL/SQL packages, procedures, functions, triggers, forms, and reports using Forms 4.5/6i, Reports 2.5/6i, and Pro*C for the Laboratory, Nursing Management, Inventory, Risk Management, Patient Registration, Appointment Management, Inpatient Management, Outpatient Management, Accident & Emergency, Order Entry, Medical Stores, Purchase Order, and MIS modules.

●Environment: Oracle 8i on UNIX for Gauteng Province Hospital (South Africa); Oracle 9i on UNIX for American Hospital (Dubai) and Lahad Datu Hospital (Malaysia); Oracle Financial Applications 11i (AP, AR, GL); Developer/2000 (Forms 4.5/6i, Reports 2.5/6i, Pro*C) on Windows 2000/95.

Electronics Corporation of India Limited

January 1999 – July 2002

System Analyst

●Involved in Software Analysis, design, development, and implementation.

●Analyzed requirements of the Administration, Finance, and Inventory mechanism.

●Designed and developed database structure to meet Module requirements.

●Interacted with personnel from various departments regarding analysis of the Finance and Inventory Management requirements to be captured in the application.

●Involved in coding of Stored Procedures, functions, and packages.

●Developed and customized forms and reports using Forms 5 and Reports 3 for Administration, Finance, and Inventory modules.

●Environment: Oracle 8i on Windows NT, Developer/2000 (Forms 5.0, Reports 3.0) on Windows 95.

EDUCATION/CERTIFICATIONS

Bachelor of Science, Electronics, Osmania University June 1997

Oracle PL/SQL Developer Certified Associate
