Vamshi Krishna
*********@*****.*** 469-***-****
Objective:
Highly skilled and detail-oriented Data/Analytics Engineering Analyst with 7+ years of experience collecting, analyzing, and interpreting complex data sets. Seeking a challenging position to leverage expertise in data analysis, statistical modeling, and data visualization to drive business growth and provide valuable insights.
Professional Summary
Data analyst with 7+ years of experience in data science, analytics, and IT business analysis; worked with self-organized, cross-functional teams in the government administration and healthcare industries, with a strong understanding of portfolio management, the trading life cycle, market analysis, and securities lending.
Experience working with different Software Development Life Cycle (SDLC) and Agile methodologies such as Waterfall, XP, Scrum, Waterfall-Scrum hybrid, and SAFe (Scaled Agile Framework).
Experience implementing data analysis with various analytic tools, such as Anaconda 4.0, Jupyter Notebook 4.x, R 3.0 (ggplot2, caret), and Excel.
Solid ability to write and optimize diverse SQL queries; working knowledge of RDBMSs like SQL Server 2008 and NoSQL databases like MongoDB 3.2.
Collaborated with cross-functional stakeholders to identify automation opportunities and deliver scalable Python solutions that reduced manual workload.
Automated business-critical processes including report generation, API testing, and data validation using Python and open-source libraries (a minimal sketch appears at the end of this summary).
Experience with visualization tools like Tableau 9.x and 10.x for creating dashboards.
Strong background in Oracle Database (11g, 12c, 19c); proficient in SQL and a wide range of database objects.
Proficient working experience with database objects like stored procedures, functions, packages, and triggers, and with the latest performance-optimization features such as bulk binds, inline views, and global temporary tables.
Hands-on with database objects including tables, views, primary keys, indexes, constraints, packages, sequences, grants, and synonyms.
Good working experience with Oracle tools such as SQL*Plus, SQL*Loader, SQL Developer, and TOAD in secure databases.
Excellent understanding of Agile and Scrum development methodologies.
Used version control tools such as Git 2.x.
Passionate about gleaning insightful information from massive data assets and developing a culture of sound, data-driven decision making.
Ability to maintain a fun, casual, professional and productive team atmosphere.
Involved in test environment setup for both manual and automated testing; performed functionality, regression, and smoke testing for given software applications.
Specialized in automation and manual testing using tools such as Selenium WebDriver, QTP/UFT, and Quality Center/ALM.
Proficient in system, smoke, regression, integration, functional, user acceptance (UAT), and API testing.
Experienced in database patching and updates for both MSSQL Server and EDB PostgreSQL Enterprise, ensuring system integrity and security.
Strong troubleshooting skills, able to identify and resolve database issues promptly to minimize disruptions and downtime.
Expert in creating Public Synonyms, DB Links.
Experience in database design using SQL to write Stored Procedures, Functions, Triggers and strong experience in writing complex queries, using Oracle, SQL Server and MySQL.
Proficient in creating and managing workflows using Informatica PowerCenter tools.
Supported and maintained ETL processes, emphasizing reusability and efficiency.
Strong analytical and problem-solving skills.
Confident working independently and a proactive team member.
Developed SSRS reports that translate complex data into actionable insights; integrated diverse data sources for accuracy and created parameterized reports optimized for performance.
Designed interactive Power BI dashboards to support decision-making; integrated data sources, parameterized reports, optimized performance, and collaborated with stakeholders on aligned visual solutions.
Crafted Tableau visualizations for informed decisions; integrated varied data sources, parameterized reports, optimized performance, and collaborated on troubleshooting and insights.
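Below is a minimal sketch of the kind of Python data-validation automation described in this summary; the file names, column names, and validation rules are hypothetical placeholders, not a production implementation.

    # Minimal sketch of a Python data-validation step (hypothetical file and columns).
    import pandas as pd

    def validate_extract(path: str) -> pd.DataFrame:
        """Load a CSV extract, split off invalid rows, and return the clean rows."""
        df = pd.read_csv(path)
        # Flag rows with a missing key or a negative amount as validation errors.
        bad = df["record_id"].isna() | (df["amount"] < 0)
        df[bad].to_csv("validation_errors.csv", index=False)  # error report for review
        return df[~bad]

    clean = validate_extract("daily_extract.csv")
    print(f"{len(clean)} rows passed validation")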
Tools:
Requirements: MS Office Suite, JIRA, Confluence
Modeling: MS Visio, Erwin Data Modeler, Mockup Screens, Smartsheet
ETL/Reporting Tools: Informatica, Tableau, Pentaho, MicroStrategy, QlikView
Databases: SQL, Hive, T-SQL, Impala, Pig, Spark SQL, SQL Server, MySQL, MS Access, HDFS, HBase, Teradata, Netezza, MongoDB, Cassandra
Languages: Python, SQL, PL/SQL, HTML, JavaScript
Big Data Tools: Hadoop, Hive, Pig, BigQuery, AWS Redshift, Sqoop, Flume, Storm, Kafka, HBase
Testing Tools: Selenium, Postman, SOAP UI
Data Warehousing: Data Marts, Data Modeling, OLAP, OLTP, Data Mining, ETL, Slicing/Dicing, Drill Down/Roll Up, Pivot, Star and Snowflake Schema
Web Services/API: SOAP, REST, XML, JSON
BI Tools: Tableau, Tableau Server, Tableau Reader, SAP Business Objects, OBIEE, QlikView, SAP Business Intelligence, Amazon Redshift, Azure Data Warehouse
Education Qualification:
Bachelor of Technology in Computer Science, ACE, Hyderabad, Telangana, IND - May 2017
Master of Science in Computer Science, University of Illinois Springfield, IL, USA - Dec 2020
Professional Experience:
Client: Illinois Department of Healthcare & Family Services, Springfield, IL Aug 2023 - Present
Role: Data/ Analytics Engineering Analyst
Responsibilities:
Worked closely with the corporate business team, control teams, and technology teams to understand the end-to-end business process, data security, and regulatory and compliance aspects of the project.
Elicited requirements using interviews, document analysis, business process descriptions, use cases, scenarios, event lists, business analysis, task, and workflow analysis.
Assisted the Data Analytics team with executing, validating, and assembling organized analytics to meet compliance and to create value out of financial data that drives business.
Performed SWOT analysis and gap analysis between the current system application and the targeted enhanced system.
Worked with Change Management Team and Application Development team to deploy codes in QA, UAT, TRN and PROD environments.
Experience in UI development using PHP, JavaScript, and WAMP.
Provided end-to-end solutions, from exploratory data analysis, model development, and ETL in Python to finished dashboards/reports in MicroStrategy.
Developed logic and key performance metrics and managed improvement projects to enhance the QlikView business intelligence reporting tool.
Designed and implemented dashboards and ad-hoc Excel reports for the billing, accounting, accounts receivable, inventory, purchasing, and sales departments.
Wrote Python programs that automated combining large SAS datasets and data files and loading them as Teradata tables for data analysis.
Developed Python programs to manipulate data read from Teradata sources and export it as CSV files (see the Teradata-export sketch after this list).
Developed and deployed Python-based end-to-end automation pipelines to streamline repetitive tasks and reduce manual effort.
Integrated Python automation with APIs, databases, and third-party services to enable seamless data flow and operational efficiency.
Automated REST API testing and validation using Python and the requests library, improving API reliability (see the API-validation sketch after this list).
Developed Python scripts to simulate user workflows and validate end-to-end functionality of web platforms.
Designed dynamic and user-friendly Tableau dashboards to track key performance metrics.
Created interactive reports with advanced charts, filters, and calculations to enhance decision-making.
Extracted, transformed, and optimized datasets for Tableau using SQL and other ETL tools.
Implemented efficient queries and indexing strategies for faster dashboard loading.
End-to-end implementation and deployment of rules using Corticon Server.
Worked on numerous ad-hoc data pulls for business analysis and monitoring.
Designed and developed various monthly and quarterly business-monitoring Excel reports by writing Teradata SQL and using MS Excel pivot tables.
Consistently reprioritized the product backlog to maximize business value and return on investment (ROI) on application development effort.
Helped the PO and PM with ROI analysis, cost-benefit analysis, risk analysis, and SWOT analysis to prepare for potential business and technical risks to the system.
Extensively worked on developing ETL programs supporting data extraction, transformation, and loading using Informatica PowerCenter.
Created UNIX shell scripts to run the Informatica workflows and control the ETL flow.
Acquired technical knowledge and a deep understanding of the system and requirements at a detailed level by interacting with design teams, tech leads, and the product owner, and defined key performance indicators and their attributes.
Experienced in developing web applications using the latest JavaScript ES6 features and frameworks/libraries such as React and Redux to store application state.
Developed UI pages using HTML5, React, Bootstrap, CSS3, JavaScript, jQuery, AJAX, and NodeJS.
Partnered with the product owner in writing user stories, conducted story-writing sessions to decompose epics into user stories, and determined acceptance criteria for user stories.
Assisted the product owner in Sprint Review meetings by facilitating product demos and tracing them back to requirements to determine when work is DONE and shippable.
Effectively worked in an Informatica version-controlled environment and used deployment groups to migrate objects.
Worked with architects to understand design documents and created UML diagrams, such as sequence diagrams, using MS Visio.
Worked with Smartsheet's collaboration and time-management tools.
Used various SQL queries (DDL, DML, TCL; aggregate functions such as MAX and MIN; GROUP BY, DISTINCT, and HAVING clauses; and inner, left, right, and full outer joins) to perform data profiling and data validation.
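The Teradata-to-CSV export mentioned above follows a pattern like the sketch below; the connection details, schema, and query are hypothetical placeholders, and the teradatasql driver is assumed.

    # Minimal sketch: export a Teradata query result to CSV (hypothetical host/table).
    import csv
    import teradatasql  # Teradata's Python DB API driver

    with teradatasql.connect(host="tdhost", user="user", password="***") as con:
        cur = con.cursor()
        cur.execute("SELECT member_id, claim_dt, amount FROM analytics.claims")
        with open("claims.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([col[0] for col in cur.description])  # header row
            writer.writerows(cur.fetchall())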
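Similarly, the requests-based API validation work above can be pictured with this sketch; the endpoint URL and expected fields are hypothetical.

    # Minimal sketch: validate a REST endpoint with requests (hypothetical URL/fields).
    import requests

    def check_endpoint(base_url: str, resource_id: str) -> None:
        resp = requests.get(f"{base_url}/records/{resource_id}", timeout=10)
        assert resp.status_code == 200, f"unexpected status {resp.status_code}"
        body = resp.json()
        # Verify that the required fields are present in the response payload.
        for field in ("id", "status", "updated_at"):
            assert field in body, f"missing field: {field}"

    check_endpoint("https://api.example.com/v1", "12345")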
Environment: Teradata, SQL Assistant, Python, MS Office, NoSQL, MS Excel, Agile, Windows, UNIX, SAS EG, SQL, PuTTY, Scrum-Agile methodology, Balsamiq Mockups, Tableau, Cucumber, Jira, HP ALM, Confluence, Oracle PL/SQL.
Client: State of Texas-Health & Human Services Dept., Austin, TX Jan 2021 – July 2023
Role: Senior Data Analyst
Responsibilities:
Gathered requirements from various departments through interviews, brainstorming, requirements workshop and JAD sessions.
As part of this team, was involved in retrieving data from databases and developing reports based on business requirements.
Transferred data from one server to other servers using tools like Bulk Copy Program (BCP) and SQL Server Integration Services (SSIS).
Generated reports from Teradata and Oracle databases to analyze customer behavior and plan strategies to improve marketing campaign response rates.
Built scalable test automation frameworks using PyTest and Selenium for web applications, achieving 95% test coverage.
Implemented OAuth2 and token-based authentication flows in Python automation for secure API access (see the PyTest sketch after this list).
Automated regression, smoke, and sanity test suites with Python, reducing QA cycle time by 60%.
Implemented CI/CD integration for automated test execution using Jenkins and GitHub Actions.
Created numerous volatile, set, multiset, derived, and global temporary tables.
Extensively used joins while extracting data from multiple tables.
Developed Teradata SQL scripts using various character, numeric, and date functions.
Developed responsive web interfaces using AngularJS, JavaScript, Ajax, CSS3, HTML5, and jQuery.
Developed and auto-deployed content using AWS (Amazon Web Services), Git/Bitbucket, Maven, and Jenkins.
Created new EC2 instances in AWS, allocated volumes, and provisioned access using IAM.
Improved performance using tuning techniques such as primary indexes and collecting statistics on index columns.
Interacted with different business groups and performed Gap Analysis to understand the current system and the goals of the desired system to be built.
Interacted with clients to analyze the organization's business processes and map them to the Oracle E-Business Suite.
Analyzed and comprehended the business requirements of the client organization.
Designed and implemented the integration process to connect traditional information systems with Enterprise Resource Planning (ERP) systems.
Demonstrated proficiency in Oracle Cost Management, Production, Process Execution, and Testing.
Developed interactive dashboards using Tableau for the Supply Chain reporting & analytics team to monitor operational KPIs.
Communicated effectively with clients/stakeholders and documented understandings and actionable items (KPIs) from meetings.
Worked with data modelers and architects in creating UML diagrams (use case diagrams, data flow diagrams, sequence diagrams) by using MS Visio.
Interacted with systems architects and developers in designing data conceptual model and data logical model.
Developed the UI part of applications using JSPs, JavaScript, Ajax, CSS, and HTML.
Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
Developed SQL queries to extract data and generate comprehensive reports.
Facilitated Conference Room Pilot (CRP) and User Acceptance Testing (UAT) sessions for business clients.
Assisted in designing, planning, and managing the data migration process from an on-premises Microsoft SQL Server-based data warehouse to AWS Redshift.
Executed SQL queries to retrieve data from different databases, for data validation & analysis, researching underlying data errors and generating reports.
Interacted with subject matter experts (SMEs) to create the data mapping specification document and data dictionary.
Created landing pages, dashboards, and web applications using Smartsheet, Salesforce, and HTML.
Used HQL (Hive Query Language; joins, unions, ISNULL, and date functions) for data validation to ensure data accuracy.
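A minimal PyTest sketch of the token-based API testing described above; the token URL, client credentials, and endpoint are hypothetical placeholders.

    # Minimal PyTest sketch: OAuth2 client-credentials token + authenticated API call.
    import pytest
    import requests

    @pytest.fixture(scope="session")
    def token():
        # Hypothetical OAuth2 token endpoint and client credentials.
        resp = requests.post(
            "https://auth.example.com/oauth2/token",
            data={"grant_type": "client_credentials",
                  "client_id": "my-client",
                  "client_secret": "***"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["access_token"]

    def test_cases_endpoint_returns_ok(token):
        # Call a protected endpoint with the bearer token and check the status.
        resp = requests.get(
            "https://api.example.com/v1/cases",
            headers={"Authorization": f"Bearer {token}"},
            timeout=10,
        )
        assert resp.status_code == 200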
Environment: Python, R, SQL, Tableau, Spark, SAS, Stata, machine learning software packages, recommendation systems, Scrum, HDFS, Sqoop, Flume, Smartsheet, MS Visio, MS Office, MS PowerPoint, AWS (Amazon Web Services), Amazon S3, AWS EC2, Kafka, AWS DynamoDB, Linux.
Client: Soft Pro Systems Ltd, Hyderabad, India June 2017 – Dec 2019
Role: Business Analyst
Responsibilities:
Interacted with Business Users for gathering, analyzing, and documenting business requirements and data specifications.
Worked on numerous ad-hoc data pulls for business analysis and monitoring.
Developed Spark applications using Scala for easy Hadoop transitions.
Designed and developed various monthly and quarterly business-monitoring Excel reports by writing Teradata SQL and using MS Excel pivot tables.
Performed in-depth analysis on data and prepared ad-hoc reports in MS Excel and SQL scripts.
Performed verification and validation for accuracy of data in the monthly/quarterly reports.
Developed Teradata SQL scripts using various character, numeric, and date functions.
Created multiset tables and volatile tables from existing tables and collected statistics on tables to improve performance.
Analyzed data using statistical software (Stata) and maintained database files.
Developed Teradata SQL scripts using OLAP functions such as RANK() OVER to improve query performance while pulling data from large tables (see the sketch after this list).
Utilized ODBC connectivity to connect to Teradata from MS Excel to automate data pulls and refresh graphs for weekly and monthly reports.
Performed dual data validation on various business-critical reports, working with another analyst.
Responsible for REST API testing using the Postman tool; acted as the point of contact for developers and assisted them in understanding the API documentation.
Logged and tracked defects using HP Quality Center and created an error repository to support the QA team in testing, tracking, and systematically managing defects and bugs.
Assisted end users with User Acceptance Testing. Assisted the QA team in designing test cases, test plans, and test scenarios to perform functional, regression, system, and load testing.
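The Teradata ranking and ODBC work above can be sketched as follows; the DSN, schema, and columns are hypothetical, and the pyodbc driver stands in here for the Excel-side ODBC pull.

    # Minimal sketch: Teradata RANK() OVER query pulled via ODBC (hypothetical DSN/table).
    import pyodbc

    SQL = """
    SELECT region, sales_amt,
           RANK() OVER (ORDER BY sales_amt DESC) AS sales_rank
    FROM marketing.monthly_sales
    QUALIFY sales_rank <= 10  -- Teradata-specific filter on the window result
    """

    con = pyodbc.connect("DSN=TeradataDSN;UID=user;PWD=***")
    for row in con.cursor().execute(SQL):
        print(row.region, row.sales_rank)
    con.close()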
Environment: R, R Studio, PyCharm, MS SQL Server, Oracle SQL Developer, Teradata, Tableau Server, Tableau Desktop, Hadoop, HDFS, HiveQL, MS Excel, MS PowerPoint, MS Word, Windows Server 2008, Linux.