
Data Architect Governance

Location:
Chicago, IL
Posted:
June 02, 2025


Resume:

NAGESWARA RAO PAMUJULA +1-817-***-****

********.***@*****.***

LinkedIn: nageswara-rao-pamujula-5b205a127

Professional Summary

Results-driven Data Architect with 17.6 years of extensive IT experience and expertise in GCP data architecture and engineering. Proven ability to design, build, and optimize scalable, high-performance data platforms and applications using GCP services such as GCS, Pub/Sub, BigQuery, Dataflow, Composer (Airflow), and Dataproc.

Strong background in ETL/ELT pipelines, data modelling, and data governance. Experienced in working with cross-functional teams to translate business needs into technical solutions, ensuring data availability, security, and compliance.

Extensive experience in building Data Lakes, Data Warehouses, and implementing analytics solutions on GCP.

Skilled in data modelling, building ELT/ETL data pipelines in GCP, and integrating multi-cloud and on-premise systems.

Expert in leveraging message queue technologies such as Pub/Sub.

Cloud Data Migration & Modernization – Experienced in migrating legacy on-premises systems to Google Cloud, ensuring minimal downtime and high data integrity (Hadoop to GCP migration).

Performance Tuning & Optimization – Expertise in fine-tuning data pipelines, SQL queries, and data models to enhance efficiency and reduce processing time.

Agile, Scrum & DevOps Practices – Proficient in agile methodologies and implementing CI/CD pipelines to ensure faster delivery and operational efficiency.

Experienced in designing Teradata systems for scale, performance, and integration.

Strong background in SQL optimization, performance tuning, and advanced programming with PySpark, Spark SQL, Scala, and Python.

Experience in the Hadoop ecosystem, including HDFS, Hive, Sqoop, YARN, Oozie, and Spark, with a focus on data processing, serialization (AVRO, Parquet), and performance tuning.

Deep knowledge of partitioning, bucketing, clustering and incremental ingestion techniques to optimize performance and storage.

Strong experience designing and delivering large-scale business applications, including data migration, integration, conversion, and testing.

Experience implementing and supporting IAM tools and processes.

Set up and manage access to cloud resources using accounts, users, and groups.

Adept at processing structured and semi-structured data (JSON, XML, AVRO), and working with various file formats across platforms.

Experience presenting technical architecture and design proposals to both internal stakeholders and clients.

Active mentor and technical lead, supporting junior engineers and fostering a collaborative, growth-oriented environment.

Disaster Recovery & High Availability – Designed and implemented backup and disaster recovery strategies to ensure high availability and business continuity.

Committed to ensuring standards and best practices are maintained across GCP solutions.

Technical Skills

Cloud Platforms: GCP (GCS, Pub/Sub, BigQuery, Dataflow, Airflow/Composer, Dataproc), AWS S3, Snowflake

Hadoop Platform: Hadoop, HDFS, Sqoop, Hive, Yarn, Hue, Oozie

Databases: IBM DB2, SQL Server, Oracle, IMS DB/DC

Programming & Scripting: Scala, Spark, Spark SQL, PySpark, Python, SQL, COBOL, JCL, PL/1, Easytrieve

Data Tools: OPC, Endevor, File-Aid, File-Aid/DB2, Xpeditor, SPUFI, Smart Test, ServiceNow (SNOW), HPQC, Autosys, ETL (DataStage), JIRA, Confluence, VSAM, GDG, MQ Series, Teradata

Data Governance Tools: IAM, Data Lake Security

Version Control & CI/CD: Git

Methodologies: Agile, Scrum, Waterfall

Reporting Tools: SAP BusinessObjects, TIBCO Spotfire

Operating Systems & Platforms: MVS, Windows, Linux, Cloudera 5.12, Hortonworks 2.6.5

Professional Experience

GCP Data Architect

Jan 2021 – Present

Renault-Nissan – Atos Global IT Solutions & Services Pvt Ltd

Designed and implemented end-to-end GCP-based data solutions to support data warehousing, advanced analytics, and real-time reporting.

As Lead GCP Data Architect/Engineer, I architected and developed scalable, high-performance data solutions on Google Cloud Platform (GCP), overseeing the complete data lifecycle from ingestion to analytics and reporting.

Designed and implemented robust ETL/ELT pipelines using GCP services such as GCS, Pub/Sub, Dataflow, Airflow/Composer, Dataproc, and BigQuery.
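
For illustration, a minimal sketch of this kind of Composer/Airflow load step; the DAG, bucket, dataset, and table names below are hypothetical placeholders, not taken from the actual project:

```python
# Illustrative sketch only: bucket, dataset, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="gcs_to_bigquery_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's Parquet files landed in GCS into a BigQuery table.
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-landing-bucket",  # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.parquet"],
        destination_project_dataset_table="example_project.analytics.orders",
        source_format="PARQUET",
        write_disposition="WRITE_APPEND",
    )
```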

Built data governance frameworks using Google Cloud IAM, implementing data classification, lineage, and cataloguing for risk, compliance, and customer data domains.

Enforced data security standards and compliance by implementing encryption at rest and in transit, tokenization, and secure key management using IAM.

Transformed and processed large volumes of semi-structured data (JSON, XML, Avro, Parquet), enabling structured analytics for business consumption.

Optimized performance of SQL queries, stored procedures, and ETL processes by applying best practices in indexing, partitioning, and query tuning.

Troubleshot data pipeline issues and optimized job performance by conducting root cause analysis and applying best practices for debugging and performance tuning.

Led cloud data integration and migration from remote systems to GCP Storage, deploying pipelines using CI/CD practices with Git for efficient and automated delivery.

Created and managed partitioned BigQuery tables and child table architectures, optimizing complex SQL queries for performance and cost-efficiency.
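
As an illustrative sketch of the partitioned-table pattern described above (project, dataset, table, and column names are hypothetical), a day-partitioned, clustered BigQuery table can be created with the Python client:

```python
# Illustrative sketch only: project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("order_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("event_date", "DATE"),
]

table = bigquery.Table("example-project.analytics.orders", schema=schema)
# Partition by date and cluster by order_id so queries prune scans and cut cost.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
table.clustering_fields = ["order_id"]

client.create_table(table, exists_ok=True)
```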

Monitored and troubleshot data pipeline issues, optimizing for performance and cost-efficiency.

Tuned Spark jobs for memory and execution performance, significantly reducing compute costs and improving runtime for large-scale batch and streaming jobs.
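
A minimal sketch of the kind of Spark session tuning involved; the memory, core, and partition values below are placeholders for illustration, not the settings used on the project:

```python
# Illustrative sketch only: the memory and partition values are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("batch-transform")
    # Right-size executor memory/cores and shuffle parallelism for the data volume.
    .config("spark.executor.memory", "8g")
    .config("spark.executor.cores", "4")
    .config("spark.sql.shuffle.partitions", "400")
    # Let adaptive query execution coalesce small shuffle partitions and handle skew.
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)
```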

Worked closely with product owners and business users to gather requirements, participate in Agile ceremonies, and drive successful delivery across sprints.

Delivered 1,700+ user stories totaling 20,000+ story points, achieving a customer satisfaction score of 4.5/5 while maintaining consistent quality and timelines.

Designed Teradata systems for scale, performance, and integration.

Led code reviews, implemented change management processes, and upheld best practices in data engineering for team-wide consistency.

Conducted team capacity forecasting and functional point analysis, ensuring optimal resource planning and sprint throughput.

Built Spotfire dashboards integrated with BigQuery and other systems, delivering timely, actionable insights to stakeholders across business functions.

Disaster Recovery & High Availability – Designed and implemented backup and disaster recovery strategies to ensure high availability and business continuity.

Contributed to team growth through knowledge sharing, innovation, and maintaining comprehensive documentation for future scalability and onboarding.

GCP Data Engineer

Jan 2017 – Dec 2020

Renault-Nissan – Atos Global IT Solutions & Services Pvt Ltd

As a GCP Data Engineer, I designed and optimized high-performance data engineering solutions on Google Cloud Platform (GCP), focusing on scalability, cost-efficiency, and business value delivery.

Developed and optimized large-scale Spark applications using RDDs, DataFrames, and Spark SQL to enable efficient data processing and transformation.
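
A brief illustrative sketch of this DataFrame/Spark SQL style of processing, assuming hypothetical GCS paths and column names:

```python
# Illustrative sketch only: paths, columns, and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-transform").getOrCreate()

# Read semi-structured JSON, derive typed columns, and aggregate with Spark SQL.
orders = spark.read.json("gs://example-landing-bucket/orders/*.json")

orders_clean = (
    orders
    .withColumn("order_date", F.to_date("order_timestamp"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)
orders_clean.createOrReplaceTempView("orders")

daily_revenue = spark.sql("""
    SELECT order_date, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
""")

# Write partitioned Parquet for downstream loading into BigQuery.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-curated-bucket/daily_revenue/"
)
```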

Tuned Spark performance and memory usage, significantly reducing execution time and optimizing resource consumption for large datasets.

Engineered robust cloud data integration pipelines, loading transformed data into BigQuery for analytics and reporting needs.

Created complex JSON outputs for structured storage and simplified downstream processing to meet evolving client requirements.

Achieved multi-million-euro savings through Free Trade Agreement (FTA) optimization, improving operational margins by 3.5% YoY over three consecutive years.

Conducted comprehensive data profiling, quality assurance, and production support, ensuring reliable, clean, and actionable data for business users.

Played a key role in Agile delivery by creating and estimating user stories, participating in sprint planning, conducting peer reviews, and delivering on-time, high-quality deliverables.

Contributed to team growth through knowledge sharing, innovation, and maintaining comprehensive documentation for future scalability and onboarding.

Data Engineer

Jan 2015 – Dec 2016

Renault-Nissan – Atos India Pvt Ltd

As a Data Engineer specializing in Hadoop technologies, I led the end-to-end design and development of enterprise-grade data solutions, translating complex business processes into scalable data warehouse (DWH) applications.

Led end-to-end development of a Hadoop-based Data Warehouse (DWH) application integrating SAP order and payment data, enabling enterprise-level data analytics and reporting.

Collaborated with business stakeholders to analyse requirements, estimate effort, and create Functional Specification Documents (FSD) and high-level design documentation.

Managed Agile delivery by creating user stories in JIRA, participating in sprint planning, and ensuring effective change management throughout development cycles.

Conducted impact analysis, designed data workflows, and led coding, unit test plan (UTP) and unit test result (UTR) preparation, and production implementation.

Defined and reviewed acceptance test plans, ensuring alignment with business requirements and end-user expectations.

Led code reviews, test reviews, and quality assurance efforts to uphold standards and reduce defects.

Provided technical mentorship to junior developers and new joiners, fostering knowledge sharing and onboarding support.

Delivered real-time demos to customers upon sprint completion, facilitating feedback loops and enhancing customer engagement.

Maintained production support for mission-critical systems, quickly resolving issues to ensure high availability and reliability.

Generated and shared project metrics, daily/weekly status reports, and key insights with project stakeholders and executive management.

Supported cross-functional collaboration, engaging closely with SAP, QA, and business teams to align deliverables and drive results.

System Analyst

Dec 2009 – Dec 2014

Renault-Nissan – Atos India Pvt Ltd

Developed and maintained complex applications using COBOL, JCL, IMS and DB2 to support critical business operations.

Translated business and functional requirements into scalable and maintainable technical specifications and high-level design documents.

Performed impact analysis, coding, unit testing, and integration testing for system enhancements and production defect fixes.

Used ChangeMan and Endevor for source code management, version control, and production deployments across development, test, and live environments.

Created and executed unit, system, and integration test cases, ensuring high-quality and bug-free deliverables.

Tuned DB2 SQL queries and optimized batch jobs for performance and cost efficiency.

Provided L2/L3 production support, handling incident resolution, root cause analysis, and preventive measures to avoid recurrence.

Software Engineer

Nov 2007 – Dec 2009

Nissan North America Inc (NNA) – Tech Mahindra

As a mainframe developer, I was responsible for individual deliverables in mainframe technology.

Analyzed client requirements and enhancements.

Prepared analysis and design documents.

Performed coding and code reviews, and prepared DSRs, UTPs, and test data.

Conducted unit testing and prepared unit test results (UTRs).

Prepared acceptance test plans based on requirements and performed acceptance testing.

Supported SIT and UAT testing.

Performed post-deployment validations.

Training and Certifications

GCP - Cloud Digital Leader

GCP - Data Engineer

DP-900 – Microsoft Azure Data Fundamentals

Professional Qualification

Master of Computer Science from S. V. University, Tirupati (A.P), India in 2003.

Achievements

Led the successful migration of the Renault Customs application from an on-premises Hortonworks Data Platform (HDP) to the Google Cloud Platform (GCP), delivering a complex cloud transformation project seamlessly.

Maintained zero SLA violations and ensured zero post-production defects across multiple high-impact projects, demonstrating a strong commitment to reliability and quality.

Consistently recognized for excellence with multiple awards, including “First Time Right,” “SPOT,” and “Best Performance” awards from Atos and Renault.

Honored with the prestigious “Performance of Excellence” award from Tech Mahindra and Nissan, acknowledging outstanding delivery and technical leadership.


