Deepika Gaddam
Data Analyst
+1-412-***-**** | *************.****@*****.*** | LinkedIn: linkedin.com/in/deepikagaddam | Location: United States
PROFESSIONAL SUMMARY
• Dynamic, results-driven Data Analyst with over 7 years of experience in data analysis, data integration, ETL processes, and business intelligence solutions.
• Highly proficient in writing complex SQL queries and performing data integration across multiple databases, with a strong commitment to data accuracy and efficiency.
• Expert in ETL tools such as SSIS, Oracle Data Integrator, and Informatica, with proven success cleansing and transforming large datasets.
• Managed data flows using Apache NiFi and Kafka for efficient system and database integration.
• Advanced user of Microsoft Excel for data analysis, visualization, and reporting to drive strategic business decisions.
• Skilled in data transformation with XML and JSON for seamless data interchange across platforms.
• Proficient with Git version control to support collaborative development practices.
• Strong documentation skills, ensuring clear and precise system and process records.
• Experienced with Linux for system management, SharePoint for content management, and Confluence for team collaboration and project documentation.
• Automated and optimized data processes using Alteryx and AWS tools.
• Developed and deployed comprehensive Tableau dashboards, providing actionable insights for business stakeholders.
• Hands-on experience with AWS services, including S3 for data storage, Redshift for data warehousing, and Rekognition for image and text analytics.
• Applied Apache Spark ALS for predictive analytics and supported machine learning initiatives using Python (NumPy, pandas), TensorFlow, and PyTorch.
• Proficient with Jupyter Notebook for prototyping and visualization, and MATLAB for complex numerical computing.
• Adept in Agile and Scrum methodologies, ensuring timely project delivery and effective stakeholder communication.
• Holds a master’s degree in Information Systems Management from Robert Morris University and a bachelor’s degree in Computer Science from Acharya Nagarjuna University.
TECHNICAL SKILLS
• Data Analysis: SQL, Excel, Tableau, Power BI, R
• Data Integration & ETL: SSIS, Informatica, Oracle Data Integrator, Apache NiFi
• Machine Learning: TensorFlow, PyTorch, AWS SageMaker, Azure ML
• Programming: Python, R, MATLAB
• Data Visualization: Tableau, Power BI, Excel
• Cloud Technologies: AWS (S3, Redshift, SageMaker, Rekognition), Microsoft Azure
• Version Control: Git
• Project Management & Collaboration: SharePoint, Confluence, Agile, Scrum
• APIs: RESTful API integration
• Operating Systems: Linux
• Data Formats: XML, JSON
PROFESSIONAL EXPERIENCE
• Client: Total Bank Solutions, Hackensack, NJ
• Role: Data Analyst Jan 2024 – Present
• Developed complex SQL queries to analyze and interpret financial data, enhancing reporting accuracy and strategic decision-making.
• Utilized Tableau to create dynamic and interactive dashboards, enabling real-time financial monitoring and insights.
• Managed data storage and retrieval operations using AWS S3, ensuring efficient handling of large financial datasets.
• Employed AWS Redshift for robust data warehousing solutions, enabling scalable analytics and storage capabilities.
• Integrated Amazon SageMaker to develop and deploy machine learning models, predicting financial trends and behaviors.
• Configured Apache Spark ALS for advanced analytics, enhancing predictive modeling and data processing efficiency.
• Automated data processes using Alteryx, streamlining data preparation and integration to support analytics projects.
• Maintained comprehensive documentation for all data processes and systems using Confluence, enhancing project transparency.
• Developed and managed SharePoint sites for project collaboration and document management, ensuring team alignment.
• Implemented R language for advanced statistical analysis, supporting complex data studies and analytics.
• Optimized data queries using SQL to improve performance and response times in financial reporting tasks.
• Analyzed data trends and generated financial reports using Excel, providing actionable insights to management.
• Created and maintained AWS cloud environments, ensuring scalable and secure data operations.
• Utilized AWS Rekognition to enhance data analysis capabilities, applying image and text recognition technologies.
• Configured and managed data integrations using RESTful APIs, enhancing system connectivity and data exchange.
• Documented all technical processes and systems comprehensively, ensuring accuracy and adherence to compliance standards.
• Led training sessions on data tools and analytical techniques, enhancing team capabilities and knowledge.
• Designed financial models and forecasts using Amazon SageMaker and Apache Spark, informing strategic planning.
• Performed data validation and quality assurance using SQL and Excel, ensuring data integrity across platforms.
• Collaborated with IT teams to resolve system issues and optimize data workflows, enhancing overall efficiency.
• Advanced the implementation of machine learning projects using R and AWS tools, contributing to predictive analytics initiatives.
• Client: Voya, New York, NY
• Role: Data Mining Specialist Jan 2021 – Jul 2023
• Analyzed large insurance datasets using SQL and Python (NumPy, pandas) to uncover trends and insights.
• Developed neural network models using TensorFlow and PyTorch within Microsoft Azure.
• Used Jupyter Notebook to manage and document end-to-end machine learning workflows.
• Implemented Azure Databricks to process and analyze large-scale datasets.
• Created dashboards and reports using Power BI and Excel to support data-driven decision making.
• Documented project steps using Confluence, ensuring transparency and replicability.
• Applied MATLAB for statistical modeling and algorithm development.
• Led Agile Scrum ceremonies and collaborated with cross-functional teams to deliver analytics solutions.
• Automated routine data processing tasks with Python, improving efficiency.
• Used Azure ML for customer behavior modeling to support targeted marketing and product offerings.
• Client: Juspay Technologies, India
• Role: Data Integration Analyst Feb 2018 – Dec 2020
• Integrated multiple data sources using SQL, SSIS, and Oracle Data Integrator.
• Transformed and cleaned data using Informatica to ensure high data quality.
• Managed real-time data flows using Apache NiFi and Kafka.
• Automated ETL processes and ensured data security via FTP/SFTP and REST APIs.
• Documented and standardized data integration procedures across platforms.
• Used Git for version control and team collaboration.
• Created advanced Excel reports to visualize business metrics.
• Optimized performance of Oracle databases for analytics queries.
• Trained team members on best practices in ETL and data validation.
• Ensured compliance with security and privacy standards throughout data pipelines.
EDUCATION
• Master’s in Information Systems Management, Robert Morris University, United States
• Bachelor’s in Computer Science, Acharya Nagarjuna University, India