NAGENDRA BABU PASAM
*****************@*****.***
PROFESSIONAL SUMMARY
• I have around 6 years of IT experience with a strong foundation in Data Modeling (Relational, Dimensional, Star, and Snowflake schemas), Data Analysis, and Data Warehousing. Proficient in tools such as Erwin and IBM InfoSphere DataStage (11.7/11.5).
• Experience in Enterprise Data Warehousing and Relational and Dimensional data modeling, creating logical and physical database designs and ER diagrams using modeling tools such as Erwin (r9.6, r9.7, 2020 R1), ER/Studio, and PowerDesigner 16.5.
• I have 2 years of experience with Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage.
• I have 4+ years of experience in Data Modeling, including work with Erwin and ER Studio on both OLAP and OLTP projects.
• Experience creating Hive data models for big data environments.
• Worked on Data Analysis, Data Migration, Data Validation, Data Cleansing, Data Verification, identification of data mismatches, Data Import, and Data Export.
• Experience with Data Lake implementations and data ingestion.
• Proven experience on projects involving data integration, data warehousing, data cleansing, data mapping, and data conversion activities.
• Created, documented, and maintained logical and physical database models, including comprehensive data model documentation.
• Good knowledge of database programming, tuning, and query optimization in SQL Server 2012/2008, Oracle 12c/11g/10g/9i, DB2, Teradata 15, and NoSQL databases.
• Experience developing Spark applications using Spark SQL in Databricks to extract, transform, and aggregate data from multiple file formats, uncovering insights into customer usage patterns.
• Experience creating and modifying DDL appropriate to the target database type across various databases.
• Responsible for creating and testing DDL/DML in a testing environment before handing off to the DBA team.
• Experienced in data loading using SQL scripts and SQL Server Integration Services (SSIS) packages.
• Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
• Extensive experience (2+ years) in building and managing data warehouses using Snowflake, including hands-on experience with DBT to create models on Snowflake tables to support analytics.
• 2 years of experience with Python scripting for data manipulation and analysis.
• Developed and maintained mapping spreadsheets for ETL processes, detailing source-to-target mappings, physical naming standards, data types, volumetric and domain definitions, and corporate metadata.
• Experienced in implementing dashboarding, data visualization, analytics, solving business problems, and managing reports across the organization using Tableau and Power BI.
• Good exposure to ETL tools such as Informatica.
• Experienced in designing and developing data warehouses, data marts, and ODS.
• Experience working in agile and waterfall methodologies.
Technical Skills:
ETL Tools: DBT, Azure Databricks, Informatica, SSIS, DataStage
Data Modeling Tools: Erwin, ER/Studio
Databases: SQL Server, MySQL, Oracle (various versions), Snowflake, Teradata
OLAP Tools: Microsoft Analysis Services, Business Objects
NoSQL: MongoDB, Azure Cosmos DB
Data Visualization Tools: Power BI, Tableau, SSRS
Cloud Technologies: Azure Databricks, Azure Data Lake Storage, Snowflake
Web Technologies: HTML, XML, XSD
Programming/Scripting: Python, PowerShell, Spark, R
Other Tools: MS Office Suite (Word, Excel, Project, Outlook), JIRA
Certifications:
• Microsoft Azure Data Engineer Associate
• SnowPro Advanced: Data Engineer
Educational Experience
Master’s in IT: Franklin University, USA Aug-2022 to Dec-2023
Bachelor’s: KITS, JNTUK, India July-2014 to May-2018
Worked closely with business analysts to gather requirements and produce Technical Design Documents. Skilled in understanding business applications, data flows, and client interactions. Committed to continuous learning and leveraging technology to solve complex challenges.
PROFESSIONAL EXPERIENCE
Client: GM Financial December 2023 - Present
Role: Data Modeler
Responsibilities:
• Working in a Kanban Agile environment.
• Worked with Business owners to gather the data warehouse requirements.
• Analyzed database requirements in detail with the project stakeholders by conducting development sessions.
• Extracted, transformed, and loaded data from source systems into Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics); ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed it in Azure Databricks. Created pipelines in ADF for data extraction, transformation, and loading from various sources.
• Design, build, and deploy high-quality, performant, integrated, and documented ETL solutions following established standards.
• Prepared and designed ETL load strategies to support the staging, warehouse, and data mart layers.
• Created a dimensional model for the reporting system by identifying the required dimensions and facts using Erwin r7.1.
• Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
• Work closely with the DBA to deploy the DDL in Stash (Bitbucket).
• Developed normalized logical and physical database models to design an OLTP system for financial applications.
• Considered factors such as maintaining referential integrity, denormalizing the logical design where appropriate, the type of data being processed, execution and processing rules, and ensuring that column data types are supported by the target DBMS.
• Developed conceptual, logical, and physical data models using Erwin and ER/Studio; created, maintained, and published logical and physical data models to the Erwin Model Mart and ER/Studio repository.
• Involved in documenting data mapping and ETL specifications for the enterprise data warehouse; designed and developed conceptual models using Workbench and ER/Studio per requirements and demonstrated them to the business.
• Worked with Database Administrators, Business Analysts and Content Developers to conduct design reviews and validate the developed models.
• Responsible for naming conventions for entities and columns.
• Created Azure Databricks notebooks and used Azure Data Lake Storage for the ETL process.
• Attend daily standups to provide updates on development progress.
• Created stories on the JIRA board based on the features discussed during PI planning.
• Design and implement data models suited to the application's requirements. Write and optimize queries and database access code.
• Developed data pipelines to load flat files, CSV, and JSON files (about 5 TB daily) from a variety of sources into the data lake.
• Used Altova XMLSpy 2021 for designing, editing, and debugging enterprise-class applications involving JSON, XML, XML Schema, XSLT, XQuery, SOAP, WSDL, and web services.
• Started learning Google Cloud Platform to support a new requirement.
• Good knowledge of MDM Hub components such as the Informatica MDM Hub Console and master data modeling.
• Trained in all MDM areas: landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, queries, groups, and packages; currently supporting the Integrity MDM platform to develop match/merge models and reference data management.
• Integrate NoSQL databases with front-end and back-end systems. Work with APIs or libraries provided by the NoSQL database.
Environment: SQL Server, Oracle, Erwin Data Modeler, ER/Studio, ETL, Teradata Administrator, Snowflake, JIRA, Azure Databricks, Azure Data Warehouse, Bitbucket, MDM Hub Console, MS Visio.
Client: Promarg Consultants Pvt. Ltd. Feb 2018 – August 2022
Role: Data Modeler
Responsibilities:
• Worked in Agile methodology.
• Practical understanding of Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
• Expertise in identifying and analyzing the business needs of the end-users and building the project plan to translate the functional requirements into technical tasks that guide the execution of the project.
• Prepared ETL standards and naming conventions, and wrote ETL flow documentation for the staging and data mart layers.
• Developed a Conceptual model using Erwin based on requirements analysis.
• Created dimensional models for the reporting system by identifying required dimensions and facts using Erwin r7.1.
• Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
• Identified, formulated and documented detailed business rules and Use Cases based on requirements analysis.
• Facilitated development, testing and maintenance of quality guidelines and procedures along with necessary documentation.
• Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
• Experience with Snowflake Multi-Cluster Warehouse.
• Hands-on experience creating DBT models on Snowflake to support the analysis team's reporting.
• Created several DDL and DML statements in DBT models and ran them against Snowflake.
• Achieved a 20% improvement in data transformation efficiency by implementing optimized DBT code for Snowflake.
• Created enterprise Hive models in the publish layer based on reporting use cases.
• Assisted Python developers in moving data between data layers by applying the necessary DQ rules and business-requirement rules in the big data environment.
• Used DBT lineage to identify source-to-target mappings for all tables.
• Designed and developed ETL/ELT transformations in DBT.
• Performed ad hoc reporting, data analysis, and troubleshooting for various business groups, including ad hoc queries for data mining using different data mining tools.
• Defined best practices for Tableau report development; prepared dashboards using aggregations and parameters in Tableau.
• Used SnowSQL and data pipelines to load data into Snowflake.
• Complete understanding of the Agile process.
• Attended daily standups to provide updates on development.
• Created several stories on the JIRA board by splitting features.
Environment: SQL Server, Teradata, Erwin Data Modeler, ER/Studio, DBT, ETL/ELT, SnowSQL, Snowflake, JIRA, DDL, Azure Data Warehouse, Bitbucket, Tableau, Power BI.