
Data Analyst

Location:
Athens, GA
Posted:
April 01, 2024


GANESH KURRA

Data Analyst

404-***-****

ad4pkj@r.postjobfree.com

PROFESSIONAL SUMMARY:

Around 5 years of experience as a Data Analyst in the product domain industry, with ETL development and data modeling.

Performed Data Cleaning, Data Profiling, Data Analysis, and Data Mapping operations to draw deeper insights from raw finance data; created Data Flow Diagrams, Use Cases, Use Case Diagrams, Activity Diagrams, and Entity Relationship Diagrams, and supported Data Integration.

Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export, and in writing SQL and PL/SQL statements: stored procedures, functions, triggers, and packages.

Played a pivotal role in the development of EEG data analysis pipelines, ensuring data quality, accuracy, and reproducibility for critical experiments and studies.

Proficiently managed and documented code repositories on GitHub, enabling efficient code sharing and collaboration among team members.

Conducted EEG data-driven research, helping to uncover insights into brain activity and cognitive processes, and presenting findings in reports and publications.

Proficient in Data Analysis, Cleansing, Transformation, Data Migration, Data Integration, Data Import, and Data Export using ETL tools such as Informatica.

Experienced in working with Spark SQL for data processing and querying, contributing to effective data processing and extraction.

Led the implementation of Microsoft Purview for effective data governance within the organization.

Established data governance policies, standards, and procedures using Purview to ensure data quality and compliance.

Basic understanding of Airflow for simple job orchestration, contributing to task automation and efficient workflow management.

Demonstrated expertise in utilizing Guidewire tools for policy administration, claims management, and billing functionalities.

Worked closely with cross-functional teams, including Data Engineers and Business Analysts, to address data-related challenges and support data-driven initiatives.


Extensive experience convening Joint Requirements Planning (JRP) sessions with business user groups, conducting Joint Application Development (JAD) sessions with IT groups, and managing conflict.

Experience in software QA and testing methodologies, with verification and validation across all phases of the SDLC.

Expertise in creating Unified Modeling Language (UML) diagrams including Use Case diagrams, Class diagrams, Sequence diagrams and Activity diagrams.

TECHNICAL SKILLS:

Big Data Technologies: Hadoop, MapReduce, Hive, Presto, Apache Spark (PySpark), Sqoop, Apache Airflow, Autosys, Snowflake, Teradata, Oracle, RDBMS, AWS Glue, Python, Scala, ETL and Data Visualization, Kafka.

Database/Data Warehouse: Oracle, SQL Server, Snowflake, Teradata, Redshift

Programming languages: Python, Scala, Java and UNIX shell scripting

CI/CD Tools: Jenkins, GitHub, Jira, Confluence, Tableau

IDE & Tools: Eclipse, PyCharm, MS Access, MS Excel

Cloud Platforms: GCP, AWS, Cloudera, Microsoft Azure, Databricks.

Certifications: Google Data Analytics, AWS Cloud Solution Architect, Microsoft Power BI Data Analyst, Tableau Intelligence Analyst Professional Certificate

PROFESSIONAL EXPERIENCE:

Client: Apple, Texas Jan 2023 to present

Role: Data Analyst

Responsibilities:

Proficient in Python for data analysis, scripting, and automation, enabling efficient data processing and manipulation.

Demonstrated proficiency in data protection principles and cryptography applications, including expertise in handling digital certificates, private keys, cryptographic algorithms, and symmetric keys to bolster organizational data protection strategies.

Extensive expertise in the Microsoft Office Suite, particularly Excel, with skills in macro creation, Pivot Tables, VLOOKUP, chart development, and complex formulas to enhance data analysis and reporting.

Proficient in utilizing Guidewire InsuranceNow and Guidewire PolicyCenter for both Personal Lines and Commercial Lines in the Property and Casualty Insurance domain.

Proficient, hands-on knowledge of Microsoft Power BI and other data analytics tools, with the ability to leverage these platforms for effective data analysis and presentation.

Applied Databricks for efficient data processing and analysis within the Property and Casualty Insurance domain.

Applied Splunk queries to investigate and resolve incidents, ensuring rapid response and system stability.

Utilized Databricks for complex data transformations and mapping from source to target systems.

Strong SQL skills, with the ability to write complex queries to extract, transform, and analyze data, ensuring data accuracy and reliability.

Spearheaded ETL processes, streamlining data flow from source systems to the data warehouse.

Crafted end-to-end Business Intelligence (BI) architectures to support analytics and reporting requirements.

Experienced in AWS, specializing in migrating tech stacks from Oracle to AWS services for enhanced scalability and cost-efficiency.

Expertise in data reporting metrics and dashboarding, utilizing tools like Amazon QuickSight to create interactive and visually appealing reports for data-driven decision-making.

Expertise in implementing and managing data governance frameworks within organizations, ensuring the development and enforcement of policies, standards, and procedures for effective data management.

Implemented data classification and sensitivity labels within Microsoft Purview to ensure proper handling of sensitive information.

Developed and maintained data pipelines to automate data extraction, transformation, and loading (ETL) processes, reducing manual work and enhancing data quality.

Played a pivotal role in the successful transition from Oracle to AWS, ensuring data integrity and accessibility while achieving cost savings and scalability.

Leveraged Python libraries, including Pandas and NumPy, for data manipulation and analysis.

Designed and executed PySpark and Airflow jobs to handle complex data transformations and job orchestration, improving data accuracy and workflow efficiency.

Translated business concepts into XML vocabularies by designing XML schemas with UML.

Worked with the following applications: Business Objects, Enterprise Architect, Toad, Planview (project management), and the Microsoft suite (Word, Excel, PowerPoint, Visio, Access, Project).

Environment: AWS, EC2, S3, AWS Glue, Power BI, Airflow, SQL, MS Access, MS Excel, Python, Teradata, Git, Apache Spark, Athena, Snowflake, SQL Server, Tableau, Microsoft Purview, Oracle.

Client: Ryder System, Georgia Jan 2021 to Dec 2022

Role: Data Analyst

Responsibilities:

Worked with the fundraising teams to gather data migration and ETL requirements and prepared requirements documents.

Created ad-hoc and subscription reports using Report Builder in SSRS.

Performed Data Cleaning, Data Profiling, Data Analysis, and Data Mapping operations to draw deeper insights from raw finance data.

Worked on exporting data using Teradata and was involved in the development of the Reporting Data Warehousing System.

Collaborated with business units to understand data governance requirements and incorporated them into Microsoft Purview.

Generated reports and queries using Crystal Reports for product information and products due for analysis.

Implemented and optimized SQL queries within Databricks for large-scale data sets, ensuring high performance.

Implemented data governance tools and technologies to automate and streamline data management processes, enhancing efficiency and reducing the risk of errors.

Collaborated with data engineers to optimize data warehouse performance, resulting in faster data retrieval for Power BI reports.

Implemented row-level security in Power BI to ensure sensitive data access control, meeting strict compliance requirements.

Involved in understanding customer needs with regard to data, documenting requirements, developing complex SQL statements to extract the data, and packaging/encrypting data for delivery to customers.

Proficient in Python, using it for data manipulation, analysis, and automation to streamline data processes.

Experienced in PySpark, harnessing its capabilities for distributed data processing and advanced analytics; utilized Airflow for job orchestration and task automation, enhancing workflow efficiency.

Demonstrated expertise in writing Python scripts to pre-process, clean, and transform data for analysis and reporting.

Employed PySpark for data transformation, ETL processes, and querying, ensuring data readiness for analysis.

Developed and maintained PySpark data pipelines, enabling seamless data integration and processing.

Conducted data analysis using Python, PySpark, and Airflow, identifying trends, patterns, and anomalies.

Collaborated with cross-functional teams to provide data-driven recommendations and support business objectives.

Proficient in utilizing Tableau for data visualization and analysis, demonstrating the ability to create interactive and insightful dashboards to support informed decision-making.

Experience in leveraging Tableau's advanced features for complex data modeling, including handling large datasets, creating calculated fields, and implementing effective data blending techniques.

Strong understanding of Tableau's integration capabilities with various data sources, ensuring seamless connectivity and extraction of valuable insights from diverse datasets.


Wrote SQL queries to validate source data against the data warehouse, including identification of duplicate records.

Tested whether the reports developed in Business Objects and Crystal Reports met company standards.

Tested ad-hoc, drill-down, and drill-through reports using SSRS, along with detail, summary, and on-demand reports.

Environment: SQL, MS Access, MS Excel, Python, Airflow, Microsoft Purview, Oracle, Teradata, Power BI, Git, AWS, Redshift, MySQL, Snowflake, SQL Server, Tableau.

Client: Cloud Big Data Technologies Group, India March 2018 to Dec 2020

Role: Data Analyst

Responsibilities:

Created Source-to-Target Mapping documents, documented business and transformation rules, participated in working sessions, and ensured full business participation throughout the process.

Created DFD (Data Functional Design) artifacts incorporating the Visio process flow, the S-T Mapping document, and all specifications required for proper ETL implementation.

Demonstrated expertise in various RDBMS platforms, including SQL Server and DB2.

Generated Data Definition Language (DDL) statements to define and manage database structures.

Maintained comprehensive documentation for data warehouse structures, ETL processes, and BI architectures.

Tested the FSDs and ETL mappings developed to load data from different source systems into the Teradata Staging/Target areas.

Validated the data between Source and Staging and between Staging and Target, using the Source-to-Target Mapping document as a reference.

Performed data profiling on the source to compare against a list of valid values while mapping source to target, writing exception cases where necessary.

Helped the Infrastructure team load the XMLs generated by Informatica into the ALGO Limit Management system.

Maintained, reviewed, and supported the Data Analysis, Data Quality, and ETL design that feeds the logical data models in ER Studio.

Conducted internal and final DFD reviews with the ETL team, EIS, the Data Quality and Metadata Management team, data architects, and business and application data stewards.

Identified defects and developed a Defect Tracking report using Mercury Quality Center, covering the full bug life cycle.

Environment: ER Studio, Teradata, Oracle, Informatica PowerCenter 8.6, Business Objects XI R3.1, Teradata SQL Assistant, Windows XP, SharePoint, Altova XMLSpy, XML, MS Access, MS Excel, SharePoint Portal.

Education: Master’s in Computer Science


