
Data Analyst Business

Location:
Brooklyn, NY
Salary:
$100,000
Posted:
March 16, 2023


Resume:

Kiran Chavadi

Jersey City, New Jersey, *****

(*****.*.*******@*****.***), +1-515-***-****

linkedin.com/in/kiran-chavadi-b90002104

Results-oriented Data Analyst Engineer with 5 years of experience in data analytics, data extraction, data profiling, cleansing, and data visualization, and 7 years of overall IT experience in insurance and telecommunications. Proficient in SQL, Excel, Power BI, and Tableau, with mid-level Python experience in data wrangling, data modeling, data pipelines, and data warehousing. Skilled at extracting insights and presenting them in understandable, interactive dashboards that effectively communicate complex data to stakeholders. Adept at developing and implementing ETL data pipelines, and familiar with AWS technologies such as S3, Redshift, and Lambda. Excels at collaborating with cross-functional teams to identify business requirements and drive business outcomes through data-driven solutions. Passionate about continuously learning and exploring new technologies.

TECHNICAL SKILLS

Programming/Scripting: Python, JavaScript, HTML, JSON, XML

Databases: MySQL, MongoDB, Oracle, Redshift

Cloud Technologies: AWS (Amazon Web Services)

Data Visualization: Power BI, Tableau, Google Data Studio, Klipfolio

Analytics: Adobe Analytics (AEM, DTM), Matomo (Piwik)

Container Platforms: Docker, Kubernetes

Version Control: Git

Software Development Tools: PyCharm, Eclipse, Sublime Text, Notepad++

Testing: Selenium, JMeter, BlazeMeter, Taurus (bzt)

Microsoft Office Suite: Excel (charts, pivot tables), SharePoint, Outlook, PowerPoint

PROFESSIONAL EXPERIENCE

ExterNetworks Inc Data Analyst Engineer - Piscataway, NJ December 2021 - Current

The project enhances the application's security layer to detect future vulnerabilities and fraudulent activities, including fraud detection and model development to predict fraud patterns by analyzing historical data.

Platform & Applications: AWS, JavaScript, MongoDB, MySQL, Matomo (Piwik; similar to Google Analytics), Retool, Excel, Postman.

•With in-depth knowledge of the application, architecture, and business needs, analyze the web analytics data gathered by the Matomo (Piwik) tracking tool via its open API, along with user application data from MongoDB. Analyze users' actions and visits using custom tags established in the application.
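
A minimal sketch of pulling that tagged event data over the Matomo Reporting API; the host, site id, and auth token here are placeholders, not values from the project:

    import requests

    MATOMO_URL = "https://analytics.example.com/index.php"   # hypothetical Matomo host
    params = {
        "module": "API",
        "method": "Events.getCategory",   # event categories captured by the custom tags
        "idSite": 1,                      # hypothetical site id
        "period": "day",
        "date": "yesterday",
        "format": "JSON",
        "token_auth": "YOUR_TOKEN",       # hypothetical auth token
    }

    resp = requests.get(MATOMO_URL, params=params, timeout=30)
    resp.raise_for_status()
    for row in resp.json():
        print(row.get("label"), row.get("nb_visits"), row.get("nb_events"))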

•Before analysis, performed initial data cleansing in Excel to handle missing values, duplicates, and malformed entries for exploratory analysis, and validated the data using pivot tables, conditional formatting, etc.

•Created a dashboard in Retool consisting of charts, metrics, and reports of the analyzed user activity.

•Analyze event categories (with custom tags set up accordingly) for each visit, looking for deviations from regular action patterns, and gather IP address and geolocation data that could provide critical information.
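
One way to operationalize that pattern check, shown as a hedged sketch: z-score actions-per-visit within each event category and surface the IP/geolocation fields for outliers. The file and column names are assumptions about the exported visit data, not the project's actual schema:

    import pandas as pd

    visits = pd.read_json("matomo_visits.json")   # hypothetical export of visit records
    stats = visits.groupby("event_category")["actions"].agg(["mean", "std"])
    visits = visits.join(stats, on="event_category")
    visits["zscore"] = (visits["actions"] - visits["mean"]) / visits["std"]

    # Visits more than 3 standard deviations from their category's norm.
    suspects = visits.loc[visits["zscore"].abs() > 3,
                          ["ip", "country", "event_category", "actions"]]
    print(suspects)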

•For any gaps, work with the front-end development team to create effective tags that capture detailed information.

•Alongside this, gain insights into the website's visitors to assess application usability and performance from visitor actions and behavior, and determine whether optimization or additional marketing campaigns or content-management work is needed.

SolarWinds Data Analyst Engineer - Piscataway, NJ December 2020 – November 2021

The NOC Dashboard was developed to give an at-a-glance view of metrics, KPIs, progress reports, and key data points for all the network devices ExterNetworks monitors for its customers. Data is captured from different sources, such as SolarWinds, ServiceNow, Ruckus, and In-Control tools, through a custom-developed ETL process that provides real-time data.

Platform & Applications: Python, React JS, AWS, Node.js, MongoDB, MySQL, JSONata, Postman.

•Gathered requirements from the SMEs and stakeholders through initial and regular touch-bases, and created the corresponding SRS document and workflows.

•Planned the technical architecture and designed the data model based on the business model and requirements, working collaboratively with the dev team and SMEs.

•Planned the initial data migration and ensured the data was accurate and gap-free by analyzing it with Excel pivot tables for exploratory data analysis.

•Development phases: the first phase covered data modelling of the Customer, ActiveIssues, CronStats, Customer Lookup, and Tickets collections in MongoDB, where live data was to be stored by the ETL. Live data was gathered from SolarWinds by reading SolarWinds' exposed SOAP XML and converting it into JSON. Cron jobs (set up on the Ubuntu server) read both customer information and active issues into MongoDB.
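
A hedged sketch of that cron-job pattern: POST a SOAP request, convert the XML response to a dict with xmltodict, and upsert it into MongoDB. The endpoint, envelope, response path, and id field are illustrative assumptions, not the actual SolarWinds interface:

    import requests
    import xmltodict
    from pymongo import MongoClient

    SOAP_URL = "https://solarwinds.example.com/InformationService"   # hypothetical endpoint
    ENVELOPE = """<?xml version="1.0"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body><GetActiveIssues/></soap:Body>
    </soap:Envelope>"""                                              # hypothetical request body

    resp = requests.post(SOAP_URL, data=ENVELOPE,
                         headers={"Content-Type": "text/xml"}, timeout=60)
    resp.raise_for_status()
    doc = xmltodict.parse(resp.text)   # XML -> nested dict, trivially serializable as JSON

    issues = doc["soap:Envelope"]["soap:Body"]["GetActiveIssuesResponse"]["Issue"]  # assumed path
    coll = MongoClient("mongodb://localhost:27017")["noc"]["ActiveIssues"]
    for issue in issues:
        coll.replace_one({"_id": issue["@id"]}, issue, upsert=True)  # upsert by assumed id attribute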

•The second phase covered getting live ticket/incident data from ServiceNow using a webhook and storing it in MongoDB.
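
A minimal sketch of such a webhook receiver, assuming a Flask endpoint and an incident-number field; the route and payload shape are placeholders:

    from flask import Flask, request, jsonify
    from pymongo import MongoClient

    app = Flask(__name__)
    tickets = MongoClient("mongodb://localhost:27017")["noc"]["Tickets"]

    @app.route("/webhooks/servicenow", methods=["POST"])   # hypothetical route
    def servicenow_webhook():
        payload = request.get_json(force=True)
        # Upsert on the incident number so later events replace earlier ones.
        tickets.replace_one({"number": payload.get("number")}, payload, upsert=True)
        return jsonify({"status": "stored"}), 200

    if __name__ == "__main__":
        app.run(port=8080)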

•The third phase covered setting up a monthly SolarWinds data dump from its SQL Server into MySQL.
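
The monthly dump could look like the following hedged sketch, streaming in chunks via pandas/SQLAlchemy so a month of rows never has to fit in memory; the connection strings and table names are placeholders:

    import pandas as pd
    from sqlalchemy import create_engine

    mssql = create_engine(
        "mssql+pyodbc://user:pass@solarwinds-db/SolarWindsOrion"
        "?driver=ODBC+Driver+17+for+SQL+Server")              # hypothetical source
    mysql = create_engine("mysql+pymysql://user:pass@reporting-db/noc")  # hypothetical target

    for chunk in pd.read_sql("SELECT * FROM dbo.Nodes", mssql, chunksize=10_000):
        chunk.to_sql("solarwinds_nodes", mysql, if_exists="append", index=False)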

•Final phase: once the ETL was ready, tested it with Postman for accuracy, then analyzed the data to design meaningful visualization widgets for the NOC Analytics Dashboard per the client's needs.

•Enhancement 1: one major enhancement among others was user-management functionality, letting users manage roles and access to the NOC Dashboard application. Proposed this feature so the client could self-manage the application without depending on us.

•Enhancement 2: spoke with the clients who would be using the application, gathered their usage patterns and the data representations they were interested in, and updated the application with more meaningful visualizations and functionality.

ExterNetworks Inc Data Analyst Engineer - Piscataway, NJ October 2018 – November 2020

Field Engineer is a freelance job marketplace for telecommunications companies: businesses post jobs, and skilled telecom engineers apply for and complete the requested work on demand. The project scope was to develop a data warehouse and an ETL that collects data from various sources and provides ready-to-use data for interactive, real-time UI widgets showing transactions, work-order status, statistics of jobs posted, and more, as well as to create day-to-day ad-hoc reports and visualizations of business metrics for the stakeholders.

Platform & Applications: AWS, Java, Angular, React Native, Node.js, Android, iOS, MongoDB, MySQL, Matomo (Piwik), BLOCKSCORE, Tableau, Postman, GitLab, Zoho (project management tool).

•Gathered requirements from the client and stakeholders on the organization's business needs, and created dashboards with data visualization software: Tableau for reporting and Power BI for smart metrics portraying the company's overall performance and statistics.

•Acquired data from numerous sources, including AWS RDS (MySQL), MongoDB, and REST sources such as SendGrid, Matomo (Piwik), Twilio, BLOCKSCORE, Moodle, pay zee, Stripe, etc.

•Performed data cleansing in Excel to handle missing and anomalous data, including imputation, and performed descriptive data analysis using Excel pivot tables. Loaded the cleansed data into the data lake (AWS S3).
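
The same cleanse-then-land step expressed in pandas rather than Excel (a deliberate substitution for illustration), with the S3 upload via boto3; the file, column, bucket, and key names are hypothetical:

    import boto3
    import pandas as pd

    df = pd.read_csv("workorders_raw.csv")                       # hypothetical extract
    df = df.drop_duplicates()
    df["amount"] = df["amount"].fillna(df["amount"].median())    # simple imputation; column assumed
    df.to_csv("workorders_clean.csv", index=False)

    boto3.client("s3").upload_file(
        "workorders_clean.csv", "fe-data-lake", "clean/workorders_clean.csv")  # hypothetical bucket/key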

•Transformed the data as required in a DigitalOcean droplet, with the code written in Python.

•Cleansed and prepared the data and loaded it into AWS Aurora RDS, where the data warehouse is hosted.

•Read the binary logs of the MySQL replica and dumped them into Aurora using the reader and writer endpoints, capturing and reflecting data in near real time (see the sketch after the next bullet).

•Read the binary-log events and constructed AWS SNS messages to deliver specific data directly to the application frontend, avoiding data loss.
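
A combined sketch of the two bullets above using the python-mysql-replication library: stream row events from the replica's binary log, mirror them into the Aurora writer endpoint, and publish them to SNS for the frontend. Hosts, credentials, the topic ARN, and the table/columns are assumptions:

    import json
    import boto3
    import pymysql
    from pymysqlreplication import BinLogStreamReader
    from pymysqlreplication.row_event import WriteRowsEvent

    stream = BinLogStreamReader(
        connection_settings={"host": "mysql-replica", "port": 3306,
                             "user": "repl", "passwd": "secret"},   # hypothetical replica
        server_id=100, only_events=[WriteRowsEvent],
        blocking=True, resume_stream=True)

    aurora = pymysql.connect(
        host="aurora-writer.cluster-xyz.us-east-1.rds.amazonaws.com",  # hypothetical endpoint
        user="etl", password="secret", database="dwh", autocommit=True)
    sns = boto3.client("sns")

    for event in stream:
        for row in event.rows:
            values = row["values"]
            # Mirror the row into the warehouse (simplified single-table example).
            with aurora.cursor() as cur:
                cur.execute("INSERT INTO workorders (id, status) VALUES (%s, %s)",
                            (values.get("id"), values.get("status")))  # assumed columns
            # Fan out to the frontend via SNS so no event is lost.
            sns.publish(TopicArn="arn:aws:sns:us-east-1:123456789012:workorder-events",  # hypothetical
                        Message=json.dumps(values, default=str))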

•Wrote Python APIs to fetch users' chat data in REST format from Twilio and archive it in the data warehouse.
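
A minimal sketch of such a fetch against Twilio's Programmable Chat REST API; the credentials and the service/channel SIDs in the URL are placeholders:

    import requests

    ACCOUNT_SID, AUTH_TOKEN = "ACxxxx", "token"          # hypothetical credentials
    url = ("https://chat.twilio.com/v2/Services/ISxxxx/"
           "Channels/CHxxxx/Messages")                    # hypothetical service/channel SIDs

    resp = requests.get(url, auth=(ACCOUNT_SID, AUTH_TOKEN), timeout=30)
    resp.raise_for_status()
    for msg in resp.json().get("messages", []):
        print(msg["sid"], msg["from"], msg["body"])       # archive into the warehouse here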

•Added indexes and applied query-optimization techniques to improve the performance of the views used by the application.
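
For illustration, a hedged sketch of that tuning loop from Python: create an index behind a slow view's filter column and check the plan with EXPLAIN. The host, table, and column names are assumed:

    import pymysql

    conn = pymysql.connect(host="aurora-writer", user="etl", password="secret",
                           database="dwh", autocommit=True)   # hypothetical connection
    with conn.cursor() as cur:
        cur.execute("CREATE INDEX idx_workorders_status ON workorders (status)")
        cur.execute("EXPLAIN SELECT status, COUNT(*) FROM workorders GROUP BY status")
        for row in cur.fetchall():
            print(row)   # verify the index is used before exposing the view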

•Interpreted and analyzed trends and patterns using statistical techniques to provide ongoing ad-hoc reports and intelligent metrics.

•Monitored the data warehouse, logs, and performance indicators to locate and fix code problems.

•Explored various statistical models for strategies that optimize statistical efficiency and quality, and applied them in developing algorithms best suited to the business use cases.

•Explored and evaluated the latest data-analytics software and frameworks against new requirements, checked compatibility with the existing application's architecture, and helped the development team implement them.

AT&T Technical Requirement Engineer - Piscataway, NJ March 2017 – October 2018

CPE-Advisor is a reporting application AT&T uses to track issues in its Customer Premise Equipment. The application uses the technologies below, along with machine learning, to provide equipment-usage statistics that AT&T analyzes to take effective measures to increase productivity.

Platform & Applications: Java, Angular 4, Hive, Oracle 11g, Swagger, Zoho

•Responsibilities included working closely with the client to explore and review the system requirements of the application, CPE-Advisor (a reporting application of AT&T to track issues in Customer Premise Equipment usage and provide usage statistics).

•Identified and understood the requirements and closed any requirement gaps.

•Analyzed client business requirements and processes through document analysis and workflow analysis.

•Responsible for documenting the technical requirements and for explaining and distributing tasks to the development team.

•Responsible for acquiring timely wireframe baselines for web and mobile devices; involved in planning the application's end-to-end architecture for development and the Azure cloud.

•Created iteration tasks, assigned and followed up on them, and tracked tasks, issues, and timelines with team members daily to ensure no slips.

AIG AEM Business Analyst – New York City, NY April 2016 – March 2017

Platform & Applications: Adobe Experience Manager, Google Analytics, Adobe DTM, Salesforce, JIRA, SharePoint, Scene7, and Adobe Site Catalyst for analytics tracking.

•Involved in daily stand-up meetings, sprint planning, sprint review and retrospective, and sprint release planning.

•Gathered business requirements from stakeholders, created the SRS document, and maintained the artifacts in SharePoint, JIRA, and a shared folder.

•Worked with the Adobe Dynamic Tag Management (DTM) team to understand the business needs and accordingly helped improve web tagging to gather meaningful insights.

•Worked with Adobe Analytics to generate campaign reports for individual sites according to business-user requirements.

•Assisted in the global website migration from Tridion CMS to AEM for countries in the APAC and LAC regions by preparing the new landing pages and components, uploading digital assets, and handling tag management.

•Worked with a Microsoft Access DB to update business-user information using SQL. Also used JavaScript, HTML, and CSS when discussing customization of AEM UI elements with the developers.

•As a Global Admin, handled user management, user permissions, content authoring, workflow administration, and tag-management configuration.

•Worked on the Digital Management Support (DMS) team to provide services: updated site content through authoring to maintain each individual business user's sites, and supported deployments of new sites to the production environment along with UAT.

EDUCATION

Master’s in Computer Science

Texas A&M University, Kingsville, Texas August 2013 – December 2015

Bachelor's in Electronics & Communications

SDMCET, Dharwad, Karnataka, India June 2007 – September 2011


