Srivani
Email: ******************@*****.***
Mobile: +1-774-***-****
Data Analyst
PROFESSIONAL SUMMARY:
Over 4 years of experience in data analysis, applying analytical thinking and attention to detail to solve complex problems and deliver innovative solutions in Agile/Scrum teams.
Strong communication and presentation skills, conveying complex technical information to technical and non-technical audiences in both technical and business language.
Skilled at connecting the dots across applications to understand the end-to-end view, communicating effectively across the organization and influencing team success.
Expertise in writing and analyzing complex queries and stored procedures in PL/SQL, with hands-on experience on Oracle Exadata and Oracle 10g and above.
Proficient in the Microsoft Office suite for data analysis and reporting; an effective team player who works well with minimal supervision.
Adept at identifying priorities and managing multiple projects simultaneously, estimating effort and financials, and proactively asking questions when assistance is needed.
Optimized SQL-based monthly reporting jobs, improving their efficiency and accuracy.
TECHNICAL SKILLS:
Programming - Python, SQL, R, PL/SQL
Data Visualization - Power BI, Tableau, Looker, Excel
Cloud Platforms - Google Cloud Platform (BigQuery, Dataflow, Cloud Storage, Vertex AI), Azure
ETL Tools - Azure Data Factory, SSIS
Databases - SQL Server, PostgreSQL, MySQL, Oracle Exadata
Data Wrangling - Pandas, NumPy, OpenRefine
Version Control - Git
Others - Jira, Confluence, REST APIs, Airflow (Cloud Composer), Excel Power Query, Microsoft Office
PROFESSIONAL EXPERIENCE:
Northern Trust Corp Nov 2024 – Present
Data Analyst
Responsibilities:
Led the design of automated Power BI dashboards for enterprise KPIs, connecting SQL Server and BigQuery sources to improve reporting efficiency and transparency across business functions.
Partnered with finance teams to build dynamic financial dashboards monitoring spend, variance, and projections, improving budget accountability and driving a 20% rise in data-driven investment decisions.
Developed scalable ETL pipelines with Google Cloud Composer, Python, and BigQuery SQL to ingest and transform transactional data, enabling near real-time analytics for operational teams.
Built anomaly detection models with scikit-learn to flag irregularities in transaction streams, reducing fraud-detection false positives by 30% and improving audit response and risk mitigation.
Migrated financial datasets from SQL Server to BigQuery via Cloud Storage, delivering faster query performance, centralized data access, and simplified schema management for cross-functional analytics.
Automated monthly reporting cycles with Python and Airflow, cutting manual data handling by 15 hours per week and reducing data refresh failures through exception handling and scheduling logic.
Collaborated with risk stakeholders to embed Basel III compliance logic into BigQuery models, supporting accurate exposure calculations and consistent regulatory reporting pipelines.
Refined the GCP warehouse schema with engineering teams by normalizing historical data, enforcing data quality checks, and standardizing naming conventions to improve downstream compatibility and model accuracy.
Optimized data retrieval in Oracle Exadata using PL/SQL, improving the performance of critical financial reporting systems.
Worked in an Agile/Scrum team with minimal supervision, prioritizing tasks and managing multiple projects simultaneously to deliver data-driven solutions on time.
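A minimal illustrative sketch of the anomaly-detection approach described above (the synthetic data, feature choices, and contamination setting are assumptions for demonstration, not details of the production model):

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic stand-in for transaction features (amount, hour of day);
# a real pipeline would read these from BigQuery.
rng = np.random.default_rng(42)
normal = rng.normal(loc=[50.0, 12.0], scale=[10.0, 3.0], size=(500, 2))
outliers = np.array([[500.0, 3.0], [750.0, 2.0]])  # large amounts at odd hours
X = np.vstack([normal, outliers])

# contamination sets the expected share of anomalies; tuning it is one
# lever for trading recall against false positives.
model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(X)  # -1 = flagged anomaly, 1 = normal

flagged = np.where(labels == -1)[0]
print(f"flagged {len(flagged)} of {len(X)} transactions")
```

Raising or lowering `contamination` (or thresholding `decision_function` scores directly) is how false-positive rates are typically tuned in this kind of flow.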
Trustmark Mar 2023 – Oct 2024
Data Analyst
Responsibilities:
Designed Tableau dashboards for HR and payroll metrics by integrating ADP and SQL sources, giving department heads centralized, real-time visibility into team structure and attrition.
Wrote optimized SQL queries to analyze tenure distribution and resignation trends, generating actionable insights that improved HR workforce planning and strategic hiring across business units.
Conducted A/B testing on internal policy communications using Python statistical libraries, guiding leadership decisions with conversion lift, open-rate, and feedback response metrics for targeted employee outreach.
Automated legacy Excel-based reports with pandas and openpyxl, eliminating manual data entry, improving report reliability, and saving over 10 hours of weekly analyst labor across HR and finance functions.
Collaborated with IT to build secure GCP access layers for HR data, developing BigQuery views with row-level security to protect compensation, benefits, and personnel records.
Created self-service dashboards via the Google Sheets API and Apps Script, giving managers near real-time HR data access and reducing dependency on data teams for standard requests.
Developed time-series models for overtime cost forecasting using Python and Tableau, helping finance teams plan labor budgets and forecast spending fluctuations with greater confidence.
Deployed and maintained ETL pipelines in Dataform for BigQuery, centralizing transformation logic, versioning workflows, and supporting automated data refreshes with minimal intervention.
Used the Microsoft Office suite extensively for data analysis and reporting, creating presentations and documentation that communicated findings clearly to technical and non-technical audiences.
Connected data across applications and business processes to build an end-to-end view, supporting comprehensive analysis and informed decision-making with minimal supervision.
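A minimal sketch of the pandas/openpyxl report-automation pattern described above (the column names and sample rows are invented for illustration; the real reports drew on ADP and SQL sources):

```python
import pandas as pd

# Illustrative stand-in for a legacy HR extract.
hr = pd.DataFrame({
    "department": ["Finance", "Finance", "HR", "HR", "IT"],
    "headcount": [12, 8, 5, 7, 20],
    "overtime_hours": [30.5, 12.0, 8.0, 15.5, 42.0],
})

# Aggregate once in code instead of maintaining Excel formulas by hand.
summary = (
    hr.groupby("department", as_index=False)
      .agg(headcount=("headcount", "sum"),
           overtime_hours=("overtime_hours", "sum"))
)

# openpyxl is the engine pandas uses to write .xlsx output.
summary.to_excel("hr_summary.xlsx", index=False)
```

Scheduling a script like this replaces repeated manual copy-paste with a repeatable, auditable refresh.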
BNP Paribas May 2021 – Dec 2022
Associate Data Analyst
Responsibilities:
Created automated SQL queries for credit risk exposure tracking, reducing manual reporting time by 60% and supporting high-frequency reviews across operational and compliance teams.
Built Python scripts to clean CSV exports, enrich records with external market data, and merge datasets, streamlining analysis and reducing manual data handling for risk teams.
Designed Tableau dashboards for operational risk KPIs such as SLA adherence, incident categories, and response times, enabling clearer tracking of risk controls and compliance performance metrics.
Conducted QA validation across daily pipeline runs and introduced anomaly-detection flags to catch outlier values and discrepancies before dashboards were refreshed and distributed.
Supported onboarding of legacy risk systems by transforming flat-file exports and staging them in PostgreSQL, validating schema integrity and minimizing data loss during system migrations.
Collaborated with stakeholders to define business rules, sketch mockups, and iterate Tableau dashboard designs, improving usability and answering ad hoc analytical needs.
Maintained a centralized KPI glossary documenting calculation rules, data lineage, and caveats, ensuring reporting consistency and reducing onboarding friction for new team members.
Developed Power BI dashboards monitoring ETL job success rates and pipeline health, helping engineering and risk teams troubleshoot failures and improve operational resilience.
Automated Excel workbook consolidation with Python (pandas), appending data to SQL Server tables and eliminating repetitive manual copy-paste work for monthly team metrics.
Asked questions and sought assistance when needed, fostering a collaborative team environment and delivering successful project outcomes with minimal supervision.
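A minimal sketch of the CSV clean-and-enrich flow described above (the file contents, column names, and ratings table are fabricated for illustration; real inputs were risk-system exports and external market feeds):

```python
import io
import pandas as pd

# Illustrative raw export; in practice this would be pd.read_csv on a file.
raw_csv = io.StringIO(
    "trade_id,counterparty,exposure\n"
    "T1, ACME Corp ,1000\n"
    "T2,Beta Ltd,\n"         # missing exposure value
    "T1, ACME Corp ,1000\n"  # duplicate row
)
trades = pd.read_csv(raw_csv)

# Clean: trim stray whitespace, drop duplicates, default missing exposure to 0.
trades["counterparty"] = trades["counterparty"].str.strip()
trades = trades.drop_duplicates().fillna({"exposure": 0})

# Enrich: left-join an external reference table (ratings here are made up).
ratings = pd.DataFrame({"counterparty": ["ACME Corp", "Beta Ltd"],
                        "rating": ["A", "BB"]})
enriched = trades.merge(ratings, on="counterparty", how="left")
print(enriched)
```

A left join keeps every trade even when a counterparty is missing from the reference data, which keeps row counts auditable during QA.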
Certifications:
Microsoft Certified: Fabric Data Engineer Associate
Educational Details:
Master's in Computer Science - University of Massachusetts Lowell
Bachelor's in Electronics & Communication Engineering - Geethanjali College of Engineering and Technology