Sai Dutt Madisetty
**********@*****.***
GCP Solution Architect
EDUCATION
Bachelor of Science, Electronics, Osmania University, June 1997
CERTIFICATIONS
Oracle PL/SQL Developer Certified Associate
PROFESSIONAL SUMMARY
●Over twenty-one years of IT experience in Design, Development, Customization, Integration and Implementation of client/server and E-Business applications using Oracle 7.1/8.0/8i/9i/10g/11g (PL/SQL Packages, Procedures, Functions and Triggers), Oracle 8i Replication, Pro*C, Developer/2000 (Forms 4.5/5/6i/9i/10g, Reports 2.5/5/6i/9i/10g, Graphics 2.5/6i), SQL Server 2008, and PostgreSQL 9.5.
●Good functional knowledge of domains such as Oracle Financial Applications 11i and R12 (AP, AR, GL), Oracle Adverse Event Reporting System 4.5, Healthcare Information System, Finance Management System, Material Management System, and Oracle ATG.
●Good exposure to gap analysis, identifying design and functional inconsistencies, functional integration, and data integration.
●Experience setting up monitoring and logging for data workflows using AWS CloudWatch and AWS CloudTrail to track performance and troubleshoot issues.
●Experience in using GCP Compute Engine, Cloud Functions, GKE, Cloud Storage, Cloud SQL, Firestore, BigQuery, Dataflow, and Dataproc.
●Experience in data ingestion from source systems such as Oracle and SQL Server using Azure Data Factory, and in building data pipelines.
●Experience designing and implementing data pipelines using AWS services such as AWS Glue, AWS Data Pipeline, and AWS Step Functions for ETL (Extract, Transform, Load) processes.
●Experience in managing appropriate data storage solutions such as Amazon S3, Amazon RDS, Amazon Redshift, or Amazon DynamoDB based on data requirements and access patterns.
●Experience in using EKS, S3, Glacier, FSx, RDS, DynamoDB, Aurora, DocumentDB, Redshift, EMR, QuickSight, Kinesis, and Athena.
●Experience in integrating data from various sources, including databases, APIs, and third-party services, using services like AWS Lambda or AWS Glue.
●Experience creating notebooks using Azure Databricks, Snowflake, Python, Power BI, Delta Lake, Spark SQL, Spark Structured Streaming, and Direct Streaming using PySpark.
●Successfully implemented Unity Catalog in Databricks to provide a unified governance solution across multiple data assets, enhancing data discoverability and compliance.
●Proficient in using Apache Spark, Spark Core, Spark SQL, and Spark MLlib for distributed data processing, SQL querying, and machine learning model development.
●Developed and maintained robust data transformation workflows using dbt, leveraging modular SQL for efficient data modeling.
●Implemented dbt tests to ensure data quality and integrity.
●Integrated dbt with cloud data warehouses (e.g., Snowflake) to streamline ETL processes.
●Created comprehensive documentation of dbt models and transformations using dbt docs, improving data lineage visibility and self-service capabilities for analysts.
●Designed and implemented ETL processes using Snowflake's data loading features, ensuring seamless integration with various data sources.
●Leveraged Snowflake to design and implement scalable data warehousing solutions, improving data retrieval speeds.
●Developed robust ETL pipelines using Snowpipe and AWS services to automate data ingestion from various sources.
●Conducted performance tuning and optimization of SQL queries, resulting in a 30% reduction in execution time.
●Collaborated with cross-functional teams to gather requirements and translate them into data models, enhancing data accessibility.
●Implemented security best practices within Snowflake, managing user roles and permissions to ensure compliance with data governance policies.
●Experience in creating Kafka producers, consumers, and topics to stream data from OLTP systems to an analytics server, enabling real-time rather than batch processing (a minimal PySpark sketch appears at the end of this summary).
●Experience working with the Parquet, Avro, Delta, and ORC formats, and good understanding of Spark internals and distributed computing.
●Experience in developing data transformation and other analytical applications in Spark and Spark-SQL using Python programming language (PySpark).
●Implemented data migration and data cleansing using Python and Shell scripting, database design, data analysis, development, and data conversions.
●Created and managed database objects including tables, indexes, tablespaces, views, users, roles, privileges, and materialized views to meet module requirements.
●Expertise in performance tuning and SQL optimization using Explain Plan and TKPROF.
●Good exposure to Informatica, ETL scripts, data warehousing concepts, star schema, data modeling, and the Cognos reporting tool.
●Analyzed system requirements and created replicated objects using materialized views.
●Experience in production support, troubleshooting of database applications, and user training.
●Experience in developing interfaces for data transfer/data loading using PL/SQL programs, SQL*Loader, and Unix shell scripting.
●Expertise in developing and customizing SQL Server procedures, functions, and SSIS packages.
●Expertise in developing and customizing PL/SQL procedures, functions, packages, triggers, partition tables, object tables, XML tables, nested tables, forms, and reports using Forms 4.5/6i/9i/10g, Reports 2.5/6i/9i/10g, and Pro*C.
●Extensively used the Oracle-supplied packages DBMS_SQL, DBMS_LOB, UTL_FILE, DBMS_JOB, and UTL_HTTP.
●Expertise in developing and deploying complex matrix and graphic reports on web front presentation layer using Report Builder 10g.
●Expertise in tuning, debugging, problem analysis, and delivering effective solutions.
●Led a team of six and provided production support, troubleshooting of database applications, and user training.
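As a minimal sketch of the Kafka-to-analytics streaming pattern described above (assuming PySpark with the Kafka and Delta Lake connectors; the broker address, topic name, schema, and paths are hypothetical placeholders):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("oltp-to-analytics-stream").getOrCreate()

# Hypothetical schema for the JSON change events produced by the OLTP system.
order_schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("order_ts", TimestampType()),
])

# Read the stream from a Kafka topic (broker and topic are placeholders).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "oltp.orders")
       .option("startingOffsets", "latest")
       .load())

# Kafka values arrive as bytes; cast to string and parse the JSON payload.
orders = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), order_schema).alias("o"))
          .select("o.*"))

# Append the parsed events to a Delta table for downstream analytics.
query = (orders.writeStream
         .format("delta")
         .option("checkpointLocation", "/chk/orders")  # placeholder path
         .outputMode("append")
         .start("/delta/orders"))                      # placeholder path

query.awaitTermination()
```

The checkpoint location is what lets the stream resume from its last committed offsets after a restart, which is the property that makes this pattern a safe replacement for batch loads.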
TECHNICAL SKILLS
Cloud Technology:
Azure, AWS and GCP.
Message Streaming:
Kafka, GCP Pub/Sub, AWS Kinesis
Databases:
Oracle 7.1/8i/9i/10g/11g, SQL Server 7.0/2008, PostgreSQL 9.5 (ora2pg, pgAdmin, pgAdmin debugger), Hive, Delta Lake.
Big Data:
Hadoop, MapReduce, Spark, HDFS, HBase, Pig, Hive, Sqoop, Kafka, Hue, Cloudera, Hortonworks, Oozie, and Airflow.
Tools:
Developer Suite (Forms 4.5/5.0/6i/10g, Reports 2.5/3.0/6i/10g, Graphics 2.5/6i), Visual Basic 6.0, FrontPage, Crystal Reports, TOAD, Visual InterDev, Oracle 10g Warehouse Builder, Oracle Discoverer 10g, SSIS, Azure Data Factory, Azure Databricks, Snowflake, Python, Power BI.
Languages:
SQL, PL/SQL, C++, Java, JavaScript, JSP, Unix Shell Script, Python, Scala.
ERP:
MEDICOM 2.1, IBA eHIS (formerly MEDICOM 4.1), Oracle Financials 11i, Oracle Commerce, Oracle Adverse Event Reporting System 4.5.
WORK EXPERIENCE
Chicos FAS Inc. September 2013 – Present
Fort Myers, FL
GCP Solution Architect
●Involved in Software Analysis, design, development, implementations, and technical support
●Involved in gathering requirements and business rules and documenting them in Jira and Confluence.
●Involved in coding and testing of the SQL Server to Oracle migration.
●As a data architect, provided solutions to the current system moving from on-premises to Cloud platform.
●Collaborated with cross-functional teams, stakeholders, vendors and management for the smooth functioning of the enterprise data system.
●Developed data models and was responsible for data acquisition, access analysis, and design.
●Involved in data integration with existing warehouse structures and continuous improvements to system functionality.
●Involved in reviewing business requests for data and data usage.
●Developed and maintained all standards in accordance with data structures and models.
●Developed and modified SSIS Packages to integrate CRM application with ATG eCommerce application.
●Used GCP Compute Engine, Cloud Functions, GKE, Cloud Storage, Cloud SQL, Firestore, BigQuery, Dataflow, and Dataproc.
●Led the migration of legacy databases to MongoDB on GCP, utilizing tools like MongoDB Atlas and GCP's data transfer services to ensure seamless transitions with minimal downtime.
●Optimized MongoDB performance on GCP by implementing indexing strategies, query optimization, and proper schema design, resulting in improved query response times (see the indexing sketch at the end of this role).
●Developed and implemented robust backup and disaster recovery strategies for MongoDB databases on GCP, ensuring data integrity and availability in case of failures.
●Configured role-based access control (RBAC) and integrated GCP Identity and Access Management (IAM) for secure access to MongoDB databases, enhancing overall data security.
●Designed and implemented effective data models in MongoDB to support application requirements, ensuring efficient data retrieval and storage optimization.
●Designed and implemented scalable data architectures on Google Cloud Platform (GCP) to support large-scale data processing and analytics, improving data accessibility and performance.
●Developed data pipelines using Google Cloud Dataflow and Apache Beam, enabling real-time data processing and transformation for analytics and reporting (see the Beam sketch at the end of this role).
●Architected data storage solutions utilizing Google BigQuery for analytics, ensuring efficient data retrieval and minimizing query costs through optimal schema design.
●Implemented data governance frameworks on GCP, ensuring compliance with data privacy regulations and enhancing data quality through automated validation processes.
●Collaborated with cross-functional teams to gather data requirements, translating business needs into technical specifications for GCP data solutions.
●Utilized Google Cloud Pub/Sub for building event-driven architectures, enabling real-time data ingestion and processing for various applications.
●Conducted performance tuning and optimization of GCP data services, achieving significant improvements in query performance and reducing operational costs.
●Leveraged Oracle Fusion Financials APIs to facilitate seamless data synchronization between Oracle ERP Cloud and external systems.
●Led the successful migration of on-premises data warehouses to Google BigQuery, optimizing data storage and query performance while reducing operational costs.
●Designed and implemented scalable data pipelines using Google Cloud Dataflow and Apache Beam, enabling real-time data processing and analytics across multiple data sources.
●Established a robust data lake architecture on Google Cloud Storage, facilitating the storage and management of structured and unstructured data for improved accessibility and analysis.
●Integrated various GCP services (e.g., BigQuery, Pub/Sub, Cloud Functions) to create seamless data workflows, enhancing data availability and reducing latency for analytics.
●Developed efficient ETL/ELT processes using Google Cloud Composer and Cloud Data Fusion, streamlining data ingestion and transformation for analytical workloads.
●Implemented data governance frameworks and security protocols on GCP, ensuring compliance with industry standards and protecting sensitive information.
●Collaborated with business stakeholders to define data analytics strategies, leveraging GCP tools to drive insights and support data-driven decision-making.
●Integrated machine learning models into data architecture using Google AI Platform, enabling predictive analytics and enhancing business intelligence capabilities.
●Developed comprehensive documentation of data architecture designs and best practices, promoting knowledge sharing and standardization across teams.
●Provided training and mentorship to team members on GCP data architecture principles and tools, fostering a culture of continuous learning and innovation.
●Developed and maintained scalable LookML models, views, and dashboards to enable self-service analytics across the organization.
●Collaborated with stakeholders to gather requirements and translate business needs into efficient LookML solutions.
●Optimized LookML code for performance, ensuring fast and reliable data delivery.
●Implemented reusable LookML components such as measures, dimensions, and sets to promote consistency and reduce development time.
●Managed version control and documentation of LookML projects to facilitate collaboration and knowledge sharing.
●Established dashboard standards and best practices to ensure consistency, usability, and compliance across the organization.
●Led the creation and enforcement of data governance policies for dashboard development and maintenance.
●Managed user access controls and permissions to safeguard sensitive data and ensure appropriate data sharing.
●Conducted regular audits of dashboards for accuracy, relevance, and performance optimization.
●Collaborated with data owners and stakeholders to prioritize dashboard updates and enhancements.
●Worked with Snowflake, Python, Power BI, Delta Lake, Spark SQL, Spark Structured Streaming, and Direct Streaming using PySpark.
●Proficient in using HDFS, MapReduce, Hive, Apache Spark, Spark Core, Spark SQL, and Spark MLlib for distributed data processing, SQL querying, and machine learning model development.
●Developed and maintained robust data transformation workflows using dbt, leveraging modular SQL for efficient data modeling.
●Implemented dbt tests to ensure data quality and integrity.
●Integrated dbt with cloud data warehouses (e.g., Snowflake) to streamline ETL processes.
●Created comprehensive documentation of dbt models and transformations using dbt docs, improving data lineage visibility and self-service capabilities for analysts.
●Designed and implemented ETL processes using Snowflake's data loading features, ensuring seamless integration with various data sources.
●Designed and implemented scalable data warehousing solutions on Snowflake, optimizing for performance, cost, and security.
●Developed and maintained data models, schemas, and pipelines to support complex analytics and reporting requirements.
●Led the migration of existing on-premises or cloud data platforms to Snowflake, ensuring minimal downtime and data integrity.
●Architected data integration workflows using Snowflake’s Snowpipe, SQL, and ETL/ELT tools such as Informatica, Talend, or Apache NiFi.
●Implemented best practices for Snowflake security, including role-based access control, data masking, and encryption.
●Optimized query performance through clustering, partitioning, and materialized views within Snowflake.
●Managed Snowflake account and user provisioning, monitoring, and cost optimization strategies.
●Collaborated with data engineers and data scientists to design data pipelines, ensuring data quality and consistency.
●Implemented data governance, lineage, and auditing within Snowflake environment to ensure compliance and transparency.
●Provided technical leadership and best practices for Snowflake architecture, scalability, and performance tuning.
●Conducted training sessions and created documentation for team members on Snowflake features and architecture.
●Monitored and analyzed Snowflake usage metrics to optimize costs and resource allocation.
●Supported multi-cloud strategies by integrating Snowflake with GCP cloud services.
●Conducted performance tuning and optimization of SQL queries, resulting in a 30% reduction in execution time.
●Collaborated with cross-functional teams to gather requirements and translate them into data models, enhancing data accessibility.
●Created Kafka producers, consumers, and topics to stream data from the OLTP system to the analytics server, making the application real time rather than batch driven.
●Interacted with stakeholders to collect specifications and was involved in design, development, technical reviews, coding, testing, gap analysis, and implementation of new projects.
●Developed logical and physical design and created database objects including tables, indexes, views, and materialized views.
●Manually converted SQL Server procedures, functions, and SSIS packages to PL/SQL procedures, functions, and packages.
●Assessed the efficiency of the modules using the EXPLAIN PLAN and TKPROF utilities.
●Developed Unix shell scripts and SQL*Loader scripts for loading data.
●Analyzed system design and created Replicated objects.
●Extensively used advanced PL/SQL features such as collections, VARRAYs, and dynamic SQL.
●Improved database performance using appropriate hints, rewrote poorly performing queries, and optimized packages using bulk operation features.
●Developed and customized SQL Server and PL/SQL packages, procedures, functions, and triggers.
●Automated application jobs using the Scheduler package in Oracle Database 11g.
●Involved in integration between Oracle e-Commerce and Manhattan.
●Used Bulk Collections for better performance and easy retrieval of data by reducing context switching between SQL and PL/SQL engines.
●Extensively used version control tools such as Git, Subversion, and Bitbucket.
●Led a team of six and provided support in troubleshooting production database application and reporting problems.
Environment: Oracle 11g, SQL Server 2008, SSIS, Oracle Commerce (ATG), Spark Structured Streaming, Spark Direct Streaming, Spark SQL, Spark Core, Scala, PySpark, Kafka, Delta Lake, Hive, Azure Data Factory, Azure Databricks, Shell Script, Python, GCP, BigQuery, Cloud Functions, Cloud Storage, Pub/Sub.
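A minimal pymongo sketch of the compound-indexing strategy referenced in this role; the Atlas URI, database, collection, and field names are hypothetical:

```python
from pymongo import ASCENDING, DESCENDING, MongoClient

# Hypothetical MongoDB Atlas connection string.
client = MongoClient("mongodb+srv://user:***@cluster0.example.mongodb.net")
orders = client["retail"]["orders"]

# Compound index matching the dominant query pattern: filter by customer,
# then sort by order date.
orders.create_index(
    [("customer_id", ASCENDING), ("order_date", DESCENDING)],
    name="customer_date_idx",
)

# The index serves both the equality filter and the sort below.
for doc in (orders.find({"customer_id": "C-1001"})
                  .sort("order_date", DESCENDING)
                  .limit(10)):
    print(doc["_id"], doc.get("order_date"))
```

An index whose key order mirrors the query's filter-then-sort shape avoids in-memory sorts, which is typically where query response-time gains come from.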
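A minimal Apache Beam (Python SDK) sketch of the Pub/Sub-to-BigQuery pattern described in this role; the project, subscription, table, and schema are hypothetical:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming mode is required for the Pub/Sub source.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (p
     | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
           subscription="projects/my-project/subscriptions/orders-sub")
     | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
     | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
           "my-project:analytics.orders",
           schema="order_id:STRING,amount:FLOAT,order_ts:TIMESTAMP",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
```

Running the same pipeline on Dataflow only requires passing the DataflowRunner and project/region options; the pipeline code itself is unchanged.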
RapidIT August 2011 – August 2013
Alpharetta, GA
Sr. Analyst Application Developer
●Involved in Software Analysis, design, development, implementations, and technical support.
●Involved in coding and testing of Asset Data Services.
●Involved in business user interaction for collecting specifications, design, development, technical reviews, coding, testing, gap analysis, and implementation of new release for Asset Data Services.
●Analyzed requirements of Workflow for Asset Data Services.
●Developed logical and physical designs and created database objects including tables, indexes, views, and materialized views.
●Developed dynamic web applications using React and JavaScript, enhancing user experience and improving application performance through efficient state management and component architecture.
●Implemented responsive UI components with React, utilizing CSS frameworks (e.g., Bootstrap, Material-UI) to ensure cross-browser compatibility and seamless mobile experiences.
●Optimized application performance by employing React best practices, such as code splitting, lazy loading, and memoization, resulting in a 25% reduction in load times.
●Integrated RESTful APIs and GraphQL with React applications to fetch and manipulate data, enhancing the interactivity and responsiveness of the user interface.
●Involved in implementation, Acceptance testing, and customization per enhanced requirements.
●Assessed the efficiency of the modules using the EXPLAIN PLAN and TKPROF utilities.
●Developed Unix shell scripts and SQL*Loader scripts for loading data.
●Analyzed system design and created Replicated objects.
●Extensively used advanced PL/SQL features such as collections, VARRAYs, and dynamic SQL.
●Improved database performance using appropriate hints, rewrote poorly performing queries, and optimized packages using materialized views, partitioning, indexing techniques, and bulk operation features.
●Developed and customized PL/SQL packages, procedures, functions, and triggers.
●Automated application jobs using the Scheduler package in Oracle Database 11g.
●Prepared documentation and user manuals, trained users, and was involved in validation and user acceptance testing.
●Provided support in troubleshooting production database application and reported problems.
Environment: Oracle 11g on UNIX, Oracle Forms and Reports 10g, Informatica, Java, JSP, JavaScript.
CVS Caremark August 2008 – July 2011
Salt Lake City, UT
Sr. Analyst Application Developer
●Involved in Software Analysis, design, development, implementations, and technical support.
●Involved in coding and testing of Explanation of Benefits module.
●Involved in business user interaction for collecting specifications, design, development, technical reviews, coding, testing, gap analysis, and implementation of Explanation of Benefits.
●Designed the module so that EOBs could be generated even when the database did not support the Spanish language.
●Analyzed requirements of Workflow for Explanation of Benefits.
●Developed logical and physical designs and created database objects including tables, XML tables, indexes, views, and materialized views.
●Used ora2pg for the migration from Oracle to PostgreSQL; troubleshot and fixed conversion issues (a post-migration validation sketch appears at the end of this role).
●Manually converted the Oracle packages that ora2pg could not migrate into PL/pgSQL functions and procedures.
●Used pgAdmin 4 for development.
●Involved in implementation, Acceptance testing, and customization per enhanced requirements.
●Used EXPLAIN PLAN and TKPROF utilities to assess efficiency of the modules.
●Developed Unix shell scripts and SQL*Loader scripts for loading data.
●Involved in enhancement of Replicated Database design.
●Analyzed system design and created Replicated objects.
●Improved database performance using appropriate hints, rewrote poorly performing queries, and optimized packages using materialized views, partitioning, indexing techniques, and bulk operation features.
●Developed and customized PL/SQL packages, procedures, functions, triggers, forms, and reports using Forms 10g and Reports 10g.
●Automated application jobs using the Scheduler package in Oracle Database 10g.
●Developed and deployed ad-hoc reports, interactive reports, and graphics on the web front presentation layer using SQR to help senior management.
●Prepared documentation and user manuals, trained users, and was involved in validation and user acceptance testing.
●Provided support in troubleshooting production database application; reported problems.
Environment: Oracle 11g on UNIX, SQL Server 2008, PostgreSQL 9.4 (ora2pg, pgAdmin, pgAdmin debugger), SSIS, Informatica, Java, JSP, JavaScript.
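A minimal sketch of one way to validate a migration like this one: comparing per-table row counts between the source Oracle and target PostgreSQL databases. The python-oracledb and psycopg2 drivers, connection details, and table names are assumptions for illustration, not necessarily the tooling used on the project:

```python
import oracledb   # pip install oracledb
import psycopg2   # pip install psycopg2-binary

# Hypothetical table list and connection details.
TABLES = ["eob_header", "eob_detail"]

ora = oracledb.connect(user="app", password="***", dsn="orahost/ORCL")
pg = psycopg2.connect(host="pghost", dbname="eob", user="app", password="***")

try:
    for table in TABLES:
        with ora.cursor() as oc:
            oc.execute(f"SELECT COUNT(*) FROM {table}")
            ora_count = oc.fetchone()[0]
        with pg.cursor() as pc:
            pc.execute(f"SELECT COUNT(*) FROM {table}")
            pg_count = pc.fetchone()[0]
        status = "OK" if ora_count == pg_count else "MISMATCH"
        print(f"{table}: oracle={ora_count} postgres={pg_count} {status}")
finally:
    ora.close()
    pg.close()
```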
Mutex Systems Inc
Client: Schering-Plough July 2007- July 2008
Springfield, NJ
Sr. Oracle Application Developer
●Responsible for mentoring a team of three developers and two Business analysts.
●Documented user requirements; responsible for updating and maintaining project documentation per the SDLC.
●Involved in technical reviews of functional and design specifications.
●Gained good exposure to the Oracle Clintrial database.
●Developed and customized PL/SQL packages, procedures, functions, triggers, forms, and reports using Forms 10g and Reports 10g for the CARES project.
●Improved database performance using appropriate hints, rewrote poorly performing queries, and optimized packages using materialized views, partitioning, indexing techniques, and bulk operation features.
●Monitored and analyzed performance of the database using Automatic Workload Repository.
●Designed and created materialized views with partitioning for better performance for queries and reports.
●Developed logical and physical designs and created database objects including tables, XML tables, indexes, views, and materialized views.
●Created XML queries to generate output in XML format.
●Created tables with CLOB columns to store the generated XML output (a Python sketch of this XML-to-CLOB pattern appears at the end of this role).
●Extensively used the Oracle XML API, UTL_HTTP, and DBMS_LOB.
●Automated application jobs using the Scheduler package in Oracle Database 10g.
●Developed SQL*Loader scripts for loading data.
●Developed and deployed interactive reports and graphics on the web front presentation layer using Oracle Report Builder 10g to help senior management.
●Generated plain text reports, graphics, Excel, and PDF output.
●Designed, developed, and deployed interactive multi-tab, multi-stack, master-detail Oracle Forms for application using Oracle Forms 10g.
●Provided support in troubleshooting production database applications and reporting problems.
●Prepared documentation and user manuals and trained users; involved in validation and user acceptance testing.
Environment: Oracle 10g on UNIX, Oracle Reports 10g, Oracle Forms 10g, Oracle Discoverer 10g, and TOAD.
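A minimal Python sketch of the XML-to-CLOB pattern from this role (the project itself used PL/SQL with the Oracle XML API and DBMS_LOB); the driver, connection details, table, and element names are hypothetical:

```python
import xml.etree.ElementTree as ET

import oracledb  # pip install oracledb

# Hypothetical connection details.
conn = oracledb.connect(user="cares", password="***", dsn="orahost/ORCL")

# Build a small XML document in memory.
root = ET.Element("caseReport", id="AE-1001")
ET.SubElement(root, "drug").text = "DrugX"
ET.SubElement(root, "event").text = "Headache"
xml_text = ET.tostring(root, encoding="unicode")

# Insert the XML into a CLOB column; python-oracledb binds str to CLOB.
with conn.cursor() as cur:
    cur.execute(
        "INSERT INTO case_report_xml (case_id, xml_doc) VALUES (:1, :2)",
        ["AE-1001", xml_text],
    )
conn.commit()
conn.close()
```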
IBA Health Limited (aka Medicom Solutions)
Description: MEDICOM is a healthcare product implemented for the clients listed in the clients section below. It is an integrated solution comprising Billing, EMR, Inpatient, Outpatient, Medical Records, and Finance modules.
Product Development Lead October 2002 – February 2007
Muscat, Oman
Clients: Lahad Datu Hospital (Malaysia), American Hospital (Dubai), Gauteng Province Hospital (South Africa), Siriraj Hospital (Thailand)
●Involved in development and testing of the Blood Donor Management System module and enhancements to the Laboratory Management System module for the Malaysia project using Oracle Forms Developer 6i.
●Involved in user interaction for collecting specifications, design, development, coding, testing, gap analysis, and implementation of the Clinician Access and Risk Management modules for American Hospital (Dubai).
●Analyzed requirements of Workflow for Clinician Access and Risk Management mechanism.
●Involved in implementation, acceptance testing, and on-site customization per the enhanced requirements.
●Used EXPLAIN PLAN and TKPROF utilities to assess efficiency of the modules.
●Involved in enhancement of Replicated Database design.
●Analyzed system design and created Replicated objects.
●Used PVCS version control management software.
●Extensively used the Oracle XML API, UTL_HTTP, and DBMS_LOB.
●Developed and customized PL/SQL packages, procedures, functions, triggers, forms, and reports using Forms 4.5/6i, Reports 2.5/6i, and Pro*C for the Laboratory, Nursing Management, Inventory, Risk Management, Patient Registration, Appointment Management, Inpatient Management, Outpatient Management, Accident & Emergency, Order Entry, Medical Stores, Purchase Order, and MIS modules.
Environment: Oracle 8i on UNIX for Gauteng Province Hospital (South Africa); Oracle 9i on UNIX for American Hospital (Dubai) and Lahad Datu Hospital (Malaysia); Oracle Financial Applications 11i (AP, AR, GL); Developer/2000 (Forms 4.5/6i, Reports 2.5/6i, Pro*C) on Windows 2000/95.
Electronics Corporation of India Limited January 1999 – July 2002
Hyderabad, India
System Analyst
●Involved in Software Analysis, design, development, and implementation.
●Analyzed requirements of the Administration, Finance, and Inventory mechanism.
●Designed and developed the database structure to meet module requirements.
●Interacted with staff across departments regarding analysis of the Finance and Inventory Management requirements to be captured in the application.
●Involved in coding of Stored Procedures, functions, and packages.
●Developed and customized forms and reports using Forms 5 and Reports 3 for Administration, Finance, and Inventory modules.
Environment: Oracle 8i on Windows NT, Developer/2000 (Forms 5.0, Reports 3.0) on Windows 95.