Ravi Sarvasiddi
**************@*****.***
US Citizen
Professional Summary
Results-oriented Lead Data Analyst / Business Analyst / Product Owner with over 20 years of success aligning business needs with technology solutions to drive performance and efficiency. Skilled in stakeholder engagement, business process improvement, and turning complex information into clear, actionable insights that support organizational goals and measurable impact. Experienced in managing large-scale datasets, performing data quality assessments, and supporting enterprise-wide initiatives, including cloud data migrations, with deep proficiency in AWS. Proficient in Agile practices, including PI planning, sprint planning, capacity planning, backlog refinement, and retrospectives, with a focus on team collaboration and continuous process improvement. Recognized for driving business value through clear communication, analytical thinking, and strategic problem-solving.
Core Strengths: Strong Business Acumen, In-depth Technical Expertise, Strategic Thinking, Adaptability, Leadership, Conflict Management, Prioritization, Multitasking, and Mindful Decision-Making.
Technical Skills
Programming Languages & Scripting: Python, Java, JavaScript, SQL, PL/SQL, Unix (Shell), XML, JSON, Angular
Cloud & Infrastructure (AWS Focused): AWS Lambda, AWS IAM, AWS Glue, ECS, Fargate, EC2, EMR, S3, AWS CloudWatch, Redshift, PostgreSQL, SQS/SNS, AWS CloudFormation, MWAA, SageMaker, Bedrock, Textract, Lex, Kendra, Augmented AI, Comprehend, Transcribe, Rekognition, Jupyter Notebook, Snowflake, Azure Virtual Machine, Azure Functions, Azure Blob, Azure Files, Azure Data Lake, Azure Batch, Azure VNet, Azure AD, Azure Monitor
Data Integration & ETL: Informatica, AWS Glue, ETSS Bridge, ER Studio, ERWIN, Toad, DBeaver, Oracle SQL Developer, Collibra
DevOps & CI/CD: GitHub, GitLab, Bitbucket, Jenkins, UrbanCode, SonarQube, Nexus, Flyway, Twistlock
Monitoring & Incident Management: Dynatrace, Splunk, Sentinus, Moogsoft, xMatters
BI, Analytics & Reporting: Tableau, Power BI, Visio
Software Development Tools & Frameworks: Visual Studio, PyCharm, Toad, DBeaver, FLEX, NGINX
Agile & Project Management: Scrum, Kanban, Jira, Confluence, SharePoint, Rally
Virtualization & App Streaming: AppStream
Education & Certifications
• Master’s in Computer Applications
• AWS Certified Solutions Architect
• AWS Certified Cloud Practitioner
Employment History
Fannie Mae Business Analyst / Product Lead / Analyst Jan 2007 – Aug 2008; Jun 2009 – Mar 2025
Domain: Single Family, Multifamily, FHFA Regulatory Reporting, Loan Acquisitions, Loan Modification, Loan Servicing, Investor Reporting, Property, Party, Securities (MBS, MEGA, REMIC), Home Affordable Modification
• Held various roles and responsibilities throughout a long tenure with the company, demonstrating flexibility and adaptability across multiple functions.
• Built a predictive analytics use case in Amazon SageMaker, applying classification and regression methods to score loan approvals, significantly reducing loan application review time (a sketch of the approach follows this list).
• Collaborated with cross-functional stakeholders to gather, document, and analyze business requirements related to data management, enhancing data accuracy and supporting downstream reporting and analytics.
• Translated business needs into clear, actionable functional and technical specifications to guide solution design and development.
• Served as a liaison between business and development teams, ensuring clear communication and alignment on project goals and requirements.
• Partnered with solution architects and developers to ensure technical specifications aligned with business objectives and system capabilities.
• Collaborated with the Enterprise Data Governance (EDG) team to define and standardize new data attributes, ensuring consistency and alignment with governance policies.
• Prioritized and managed the product backlog, including features and user stories, to align with business priorities.
• Applied strong programming skills in SQL, PL/SQL, and Python (pandas, polars, openpyxl) across AWS Lambda, AWS Glue, PostgreSQL, Oracle, and Redshift.
• Participated in quarterly Program Increment (PI) planning and led sprint planning, capacity planning, and backlog refinement sessions.
• Defined feature-level acceptance criteria and reviewed story-level acceptance criteria to ensure alignment with business expectations.
• Collaborated closely with internal and external application teams to ensure alignment on change request (CR) commitments, timelines, and deliverables.
• Facilitated Scrum of Scrums meetings to monitor progress, review impediments, and ensure alignment and timely delivery across multiple Agile teams.
• Created multiple Jira dashboards and reports for teams to review and track deliveries, incidents, and overall project health.
• Actively participated in sprint demos to capture business feedback and help prioritize upcoming requests based on value and urgency.
• Conducted retrospectives to gather team feedback on what went well and to discuss process improvements.
• Streamlined workflows using Scrum and Kanban, significantly boosting productivity and improving team efficiency.
• Defined and implemented business and technical controls, working closely with technology teams to ensure compliance and risk mitigation.
• Collaborated with data modelers to govern broad data flow diagrams, business and technical processes, and documentation supporting enterprise data initiatives.
• Conducted training sessions and provided ongoing support to business users on newly implemented applications.
• Ensured the proper handling and protection of non-public information (NPI), aligning with privacy and security regulations.
• Designed and implemented application features ensuring compliance with SOX requirements, including access controls, audit logging, and data integrity.
• Enforced data governance policies, improving data integrity, security, and compliance within cloud-based environments.
• Planned and performed disaster recovery (failover and failback) testing for live-live and live-standby configurations.
• Delivered timely and accurate regulatory reports to FHFA, ensuring full compliance with industry and government standards.
• Coordinated with development teams and stakeholders during critical situations to implement Emergency Bug Fixes (EBFs), mitigating potential business disruptions.
• Identified improvements and optimizations as needed, streamlining business processes, maintaining high quality standards, and removing bottlenecks for the team.
• Led the cloud migration (focused on Amazon Web Services) of on-premises applications, enhancing scalability, performance, and reliability.
• Provided technical support and database troubleshooting for complex issues related to SQL queries, database design, PL/SQL development and system performance.
• Extensively worked on PL/SQL packages, procedures, and functions to handle business logic and data processing tasks, including hardening dynamic SQL against SQL injection.
• Developed and implemented complex SQL queries to retrieve, manipulate, and analyze large datasets, improving reporting accuracy and efficiency.
• Contributed to the Party Mastering (MDM) initiative, consolidating and standardizing party data including lenders, borrowers, and servicers.
• Led efforts in Property Mastering, focusing on address standardization and establishing a single source of truth across systems.
• Implemented and maintained Master Data Management (MDM) solutions to ensure the accuracy, consistency, and integrity of party and property-related data for regulatory compliance and operational reporting.
• Led software engineering teams to modernize mortgage processing solutions, increasing operational efficiency by 40% through automation and scalable design patterns.
• Mentored T-shaped teams, pairing deep expertise with cross-functional collaboration to accelerate delivery and innovation.
• Led AWS cost optimization initiatives, collaborating with infrastructure and development teams to identify and eliminate redundant resources, reduce service idle time, and apply right-sizing strategies, resulting in substantial savings.
• Designed and implemented EMR-based data processing solutions within Enterprise Data Lake to efficiently handle large-scale batch data loads, improving performance and scalability.
• Directed a cross-functional data engineering team, overseeing end-to-end solution design, development, deployment, and maintenance.
• Collaborated with cross-functional teams to implement cloud-native solutions in AWS, driving a 25% reduction in infrastructure costs.
• Designed and developed ETL mappings using Informatica PowerCenter to extract data from sources such as Oracle, Postgres, and flat files and load it into data warehouse systems.
• Optimized existing ETL workflows and mappings to improve performance by 30%, reducing data load times.
• Implemented complex transformations including Joiner, Lookup, Aggregator, Update Strategy, and Router to meet business logic requirements.
• Created and scheduled workflows using Workflow Manager, handling dependencies and failure recovery.
• Designed reusable Mapplets and Worklets to ensure consistency and reduce development time.
• Managed production support for critical financial applications, ensuring high availability, rapid incident resolution, and thorough root-cause analysis.
• Implemented data integration pipelines using AWS Glue and Lambda, reducing data processing time (see the pipeline sketch after this list).
• Conducted performance tuning of SQL queries and Redshift clusters, reducing query processing time through data model refinements and distribution/sort key strategies (a tuning sketch follows this list).
• Implemented automated shutdown and startup of AWS ECS and RDS instances to reduce idle time (a minimal automation sketch follows this list).
• Managed AWS resources such as EC2 instances, RDS databases (Oracle/Postgres), S3 buckets, and Redshift clusters to support scalable cloud architectures.
• Well experienced in data modeling, data mapping, data profiling, data analytics, data mining, data integration, data cleansing, data loading, data transformation, data migration, and data quality/validation.
• Deep knowledge of the SDLC (requirements analysis, design, development, maintenance, and production support).
• Designed scalable database schemas and optimized performance through indexing, query optimization, and data modeling in Postgres and Oracle databases.
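A minimal sketch of the loan-approval scoring approach referenced above: the production model was trained in Amazon SageMaker; this local stand-in uses scikit-learn, and the file name, feature columns, and label are hypothetical.

    # Local stand-in for the SageMaker loan-approval classifier.
    # loans.csv and the column names are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("loans.csv")
    X = df[["credit_score", "dti_ratio", "ltv_ratio", "loan_amount"]]
    y = df["approved"]  # 1 = approved, 0 = declined

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # High-confidence scores let routine applications skip manual review.
    print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))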
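A minimal sketch of the Glue/Lambda pipeline pattern referenced above: an S3-triggered Lambda that starts a Glue ETL job for a newly landed file. The job name and argument keys are hypothetical.

    # Hypothetical S3-triggered Lambda that starts a Glue ETL job.
    import boto3

    glue = boto3.client("glue")

    def lambda_handler(event, context):
        # Extract the bucket and key of the newly landed object from the S3 event.
        record = event["Records"][0]["s3"]
        run = glue.start_job_run(
            JobName="loan-data-load",  # hypothetical job name
            Arguments={
                "--source_bucket": record["bucket"]["name"],
                "--source_key": record["object"]["key"],
            },
        )
        return {"JobRunId": run["JobRunId"]}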
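A minimal sketch of the Redshift tuning approach referenced above: rebuild a large fact table with a distribution key on the join column and a sort key on the common filter column. Table, column, and connection details are hypothetical.

    # Hypothetical Redshift tuning step executed via psycopg2.
    import psycopg2

    ddl = """
    CREATE TABLE loan_events_tuned
    DISTKEY (loan_id)      -- collocate rows joined on loan_id
    SORTKEY (event_date)   -- enable range-restricted scans on event_date
    AS SELECT * FROM loan_events;
    """

    with psycopg2.connect(host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
                          dbname="analytics", user="etl_user", password="...") as conn:
        with conn.cursor() as cur:
            cur.execute(ddl)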
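A minimal sketch of the idle-time automation referenced above: a scheduled Lambda that scales an ECS service to zero and stops an RDS instance outside business hours. Cluster, service, and instance names are hypothetical.

    # Hypothetical scheduled (EventBridge) Lambda to idle non-prod resources overnight.
    import boto3

    ecs = boto3.client("ecs")
    rds = boto3.client("rds")

    def lambda_handler(event, context):
        # A mirror-image morning job restores desiredCount and starts the database.
        ecs.update_service(cluster="dev-cluster", service="loan-api", desiredCount=0)
        rds.stop_db_instance(DBInstanceIdentifier="dev-postgres")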
Clients: Environmental Protection Agency / Chicago Board Options Exchange / Verisign - Software Engineer Mar 2006 – Dec 2006; Sep 2008 – May 2009
Domain: Endangered Species, Domain Verification, Stock Options
• Designed and implemented business intelligence solutions using Oracle Reports.
• Designed and maintained database objects using ERwin.
• Collaborated with global teams to ensure system availability and compliance with regulatory standards.
• Developed applications using Java and Oracle PL/SQL stored procedures, functions, triggers, packages, and scripts.
• Provided production support, addressing development issues quickly.
• Created and optimized SQL queries for performance and scalability.
• Wrote complex queries for reporting.
• Identified and fixed issues in existing PL/SQL code.
• Analyzed slow-performing queries and rewrote or indexed them as needed.
• Analyzed execution plans (EXPLAIN PLAN, TKPROF) and Oracle optimizer behavior to tune code (a plan-capture sketch follows this list).
• Worked on views and materialized views to improve query execution performance.
• Retrieved and analyzed large, complex data sets through fine-tuned queries.
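A minimal sketch of the plan-capture workflow referenced above: run EXPLAIN PLAN for a candidate statement, then read the formatted plan back through DBMS_XPLAN. The connect string, table, and query are hypothetical.

    # Hypothetical Oracle tuning session via cx_Oracle.
    import cx_Oracle

    conn = cx_Oracle.connect("report_user", "secret", "dbhost/ORCLPDB1")
    cur = conn.cursor()

    # Populate PLAN_TABLE without executing the statement.
    cur.execute("""
        EXPLAIN PLAN FOR
        SELECT symbol, SUM(volume)
        FROM option_trades
        WHERE trade_date >= DATE '2006-01-01'
        GROUP BY symbol
    """)

    # DBMS_XPLAN.DISPLAY returns the formatted plan, one line per row.
    for (line,) in cur.execute("SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY())"):
        print(line)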
Time Warner Cable - Software Developer Feb 2005 – Mar 2006
Domain: Media
• Reviewed Business Requirements and Functional Specifications.
• Extensively involved in writing database stored procedures, packages, and triggers.
• Designed and developed ad hoc reports using SQL*Plus and Oracle.
• Performed SQL tuning and trained developers on efficient SQL and PL/SQL coding.
• Improved query performance by rewriting problematic queries.
• Tuned SQL query performance using EXPLAIN PLAN.
• Wrote various UNIX shell scripts to automate jobs (a sketch of the pattern follows this list).
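The job automation above was written in UNIX shell; below is a minimal Python stand-in for the same pattern: run a SQL*Plus report script non-interactively, log the outcome, and propagate the exit code so the scheduler can flag failures. The connect string, script path, and log path are hypothetical.

    # Python stand-in for a shell job wrapper around a nightly SQL*Plus report.
    import datetime
    import subprocess
    import sys

    result = subprocess.run(
        ["sqlplus", "-s", "report_user/secret@MEDIA",  # hypothetical connect string
         "@/opt/reports/nightly_adhoc.sql"],           # hypothetical report script
        capture_output=True, text=True,
    )

    stamp = datetime.datetime.now().isoformat()
    with open("/var/log/nightly_report.log", "a") as log:
        log.write(f"{stamp} rc={result.returncode}\n{result.stdout}\n")

    sys.exit(result.returncode)  # non-zero lets cron/monitoring flag the failure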