
Senior Data Engineer

Location:
Cypress, CA
Posted:
February 15, 2023


Resume:

Padma Pesaru

Email: advcix@r.postjobfree.com

Phone: 424-***-****

SUMMARY

• Around 9.5 years of IT experience, including around 6 years building data pipelines with Snowflake, Spark, Python, Scala, Hive, Kafka, Redshift, MySQL, PostgreSQL and AWS technologies; around 6 years writing stored procedures, triggers, views, user-defined functions and cursors, and performing performance optimization and indexing; and around 6 years creating dashboards, visualizations and reports using Tableau, Sigma and DOMO.

• Experienced in performing Extract, Transform and Load (ETL) operations and creating mappings/workflows to extract data from different sources and load it into various business entities.

• Good experience in developing Spark jobs using Python and Scala for faster data processing.

• Good knowledge of the major components of the Hadoop ecosystem, such as MapReduce, HDFS, Hive, Pig, HBase, ZooKeeper, Sqoop, Oozie and Flume.

• Extensive experience in designing, developing, and maintaining relational databases (MySQL).

• Good experience in the AWS cloud environment, using various AWS services such as EC2, S3, RDS, EMR, DynamoDB, Kinesis, Step Functions, Lambda, Glue, Athena, IAM, SNS and SQS.

• Experience in creating stored procedures, functions and triggers using MySQL.

• Expertise in creating parameterized, drill-down, drill-through, and visual graph reports.

• Extensive experience in creating Python scripts to automate tasks such as scheduling and monitoring jobs.

• Expertise in using different Business Intelligence Solutions such as Tableau, Sigma and DOMO to analyze the data and create interactive dashboards, visualizations and reports.

• Good knowledge of machine learning models.

• Sound knowledge of Agile methodology using the Jira tool.

• Familiar with all aspects of technology projects including Business Requirements, Technical Architecture, Design Specification, Development and Deployment. Well versed in various methodologies such as Waterfall, iterative, incremental and agile.

• Experience in core Java, EJB, Servlets, JSPs, and Web Services.

• Development expertise on Linux, UNIX, and Windows 8/7/Vista/XP/2000/NT/98.

• Experience in Configuration Management Systems such as Git, CVS, PVCS.

• Good communication and interpersonal skills.

• Excellent analytical and problem-solving skills and the ability to quickly learn new technologies.

• A very good team player with the ability to work independently.

EDUCATION

• B.Tech in Computer Science from JNT University, Hyderabad, India.

CERTIFICATIONS

• AWS Certified Solutions Architect - Associate (2019)

• PCAP – Certified Associate in Python Programming (2019)

• Machine Learning by Stanford University on Coursera (2018)

• Oracle Certified MySQL developer (2015)

• Cloudera Certified Developer for Hadoop (CCDH-2014)

• Sun Certified Developer for Java Web Services (2009)

• Sun Certified Web Component Developer for Java 2, Enterprise Edition 1.4. (2008)

• Sun Certified Programmer for Java 2 Platform, Standard Edition 5.0. (2007)

• Oracle Certified Associate for Oracle 9i (2003)

SKILLS

Databases: Snowflake, Redshift, MongoDB, MySQL 5.x, DocumentDB, Oracle 10g/9.x/8.x, PostgreSQL, MS SQL.

BI tools: Tableau 2022.2, Sigma, Domo 5.x, Kibana 4.3.

Big Data Technologies: Spark, Hive, Kafka, Hadoop, Pig, HBase, Sqoop, Flume, Cassandra, ZooKeeper, Databricks, AWS S3, DynamoDB, Kinesis, Athena, Glue, Lambda, Step Functions, EMR, EC2, RDS, SNS, SQS, GCP, Octave, MATLAB.

Languages & Tools: Python, PySpark, Scala, Java, J2EE, R, Node.js, TypeScript, Airflow, PyCharm, EJB 2.0, JMS, MVC, JSON, Hibernate 4.x, JDBC, Web Services, PHP, Ajax, jQuery, JUnit, JSP, Servlets, Ant 1.x, PuTTY, GitHub, JIRA, UML.

Web Technologies: JavaScript, XML, XSL, XSLT, SOAP, RESTful, WSDL, HTML5.

IDEs: IntelliJ, Aginity, SQL Workbench, Toad, Eclipse.

Frameworks: Struts, Spring 4.x.

Application Servers: BEA WebLogic 10.x, IBM WebSphere, JBoss.

Web Servers: IBM HTTP Server, Apache Tomcat.

Operating Systems: UNIX, Linux, Windows 10/8/7/Vista/XP/NT/98.

PROFESSIONAL EXPERIENCE

• Worked for Super League Gaming Company as a Senior Data Engineer from Feb 2020 to Dec 2022.

• Worked for The Honest Company as a Data Engineer from Jan 2015 to Oct 2017.

• Worked for Virtusa India Private Limited as a Software Engineer from August 2003 to Nov 2006.

Client: Super League Gaming Inc.

Role: Senior Data Engineer Feb’20 – Dec ‘22

Super League Gaming provides a gaming platform and monetization tools in the Metaverse. The system uses MongoDB and DocumentDB as the backend and React JS and Node.js as the front end. It keeps track of game servers and player activities, and also provides services for 3rd parties to advertise and monetize in the Metaverse.

Responsibilities:

• Involved in gathering business requirements from the PMO/business and end users for various projects.

• Extensively worked on developing data pipelines using Python, Snowflake, Fivetran, AWS S3 and Tableau to ingest, transform and analyze data and create dashboards for game session, server and player data.

• Designed and implemented data pipelines to retrieve, transform and load into Snowflake data lake from various 3rd party sources such as Stripe, SendGrid and Xsolla using API calls.

• Built an in-house data framework to replace 3rd-party analytics tools: fetched insights data from various social media accounts via API calls and web scraping, ingested the data, and created interactive dashboards in Tableau.

• Created interactive dashboards with near real time data about the server and player information to serve the Product, Marketing and Customer service teams.

• Helped management understand the user base and deploy targeted campaigns effectively by creating rolling and classic retention dashboards for returning and new players with geo-specific data.

• Built a monitoring system to check data integrity and alert the data team.

• Created complex summary tables in Snowflake and generated various data sources, Dashboards and Stories in Tableau.

• Created various cohorts, weekly and monthly reports showing detailed information for Finance, Product, Marketing and Sales teams.

• Automated many manually handled requests by creating scheduled jobs for a more dynamic approach.

• Orchestrated data pipelines and workflows using Airflow.

• Participated in Sprint meetings and updated status reports in an Agile environment.

• Worked with product managers and technical leads to resolve technical issues.

Environment: Snowflake, Python, React JS, Node.js, TypeScript, MongoDB, DocumentDB, Tableau, Sigma, Fivetran, JSON, AWS S3, Athena, Kinesis, GCP, WooCommerce, Airflow, GitHub, JIRA.
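The rolling and classic retention dashboards above were built in Snowflake and Tableau; as a rough illustrative sketch only (with hypothetical player data, not the production SQL), classic day-N retention counts players who return exactly N days after their first session, while rolling retention counts players who return on day N or any later day:

```python
from datetime import date

def classic_retention(first_seen, activity, day_n):
    """Share of the cohort active exactly `day_n` days after their first session."""
    cohort = set(first_seen)
    if not cohort:
        return 0.0
    returned = {
        p for p in cohort
        if any((d - first_seen[p]).days == day_n for d in activity.get(p, ()))
    }
    return len(returned) / len(cohort)

def rolling_retention(first_seen, activity, day_n):
    """Share of the cohort active on day `day_n` or any later day."""
    cohort = set(first_seen)
    if not cohort:
        return 0.0
    returned = {
        p for p in cohort
        if any((d - first_seen[p]).days >= day_n for d in activity.get(p, ()))
    }
    return len(returned) / len(cohort)

# Hypothetical sample: three players with the same first-session date.
first_seen = {"p1": date(2022, 1, 1), "p2": date(2022, 1, 1), "p3": date(2022, 1, 1)}
activity = {
    "p1": [date(2022, 1, 2)],   # returns on day 1
    "p2": [date(2022, 1, 8)],   # returns on day 7
    "p3": [],                   # never returns
}
print(classic_retention(first_seen, activity, 1))  # 1 of 3 players
print(rolling_retention(first_seen, activity, 1))  # 2 of 3 players
```

Rolling retention is always at least as large as classic retention for the same day, which is why the two views are usually shown side by side.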

Client: Honest Company

Role: Data Engineer Jan’15 – Oct’17

The Honest Company is a consumer goods company that emphasizes non-toxic household products for the ethical-consumerism marketplace. The system uses MySQL as the backend and Ruby on Rails as the front end. It keeps track of e-commerce transaction details, and also holds customer information, order and preference details, and inventory information.

Responsibilities:

• Involved in gathering business requirements from the PMO/business and end users for various projects.

• Developed a data pipeline using Spark, Redshift and Domo to ingest, transform, analyze and create reports for wholesale and online sales data.

• Worked on a data pipeline to process MySQL binary logs and load them into the Hive and Redshift data lakes using Spark Structured Streaming, Maxwell and Kafka, and created near real time reports for the Customer Service and Operations teams.

• Created interactive dashboards with near real time data to serve the Operations and Customer service teams.

• Extensively worked on stored procedures, creating different types of jobs to calculate and generate reports.

• Developed various ad-hoc queries and reports to generate Excel reports based on different criteria and requests from end users.

• Created complex summary tables in MySQL and Redshift databases and generated various data sources, data cards, data flows and visual graph reports in DOMO.

• Created various cohorts, weekly and monthly reports showing detailed information for Finance, Accounting, Marketing and Sales teams.

• Automated many manually handled requests by creating scheduled jobs for a more dynamic approach.

• Created Scripts, parameterized reports, Stored Procedures, User Defined Functions, Triggers, Indexes and Views according to the requirement.

• Replaced the manual import process with job scheduling and redesigned the existing code for better performance.

• Implemented job scheduling and monitoring system using Airflow.

• Performed daily system check and maintenance job.

• Analyzed database performance and optimized indexes to significantly improve queries.

• Participated in Sprint meetings and updated status reports in an Agile environment.

• Worked with the project manager and technical leads to resolve technical issues.

Environment: Apache Spark, Scala, Python, Hive, Redshift, Kafka, Maxwell, MySQL 5.x, PostgreSQL, MongoDB, JSON, Databricks, AWS S3, EMR, IntelliJ, Airflow, GitHub, Domo 5.3, JDE, PHP, Ruby on Rails, PuTTY, SQL Workbench, JIRA, Pentaho, Lucidchart, Kibana 4.3.
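The binlog pipeline above used Spark Structured Streaming, Maxwell and Kafka; as a simplified single-process sketch (assuming Maxwell's JSON event format with `type`/`data` fields, a hypothetical single-column primary key, and leaving Kafka and Spark out entirely), applying CDC events to a table snapshot amounts to an upsert/delete loop:

```python
import json

def apply_maxwell_event(snapshot, raw_event, pk="id"):
    """Apply one Maxwell binlog event (a JSON string) to an in-memory
    table snapshot keyed by primary key. Minimal sketch: assumes a
    single-column primary key and ignores schema-change events."""
    event = json.loads(raw_event)
    row = event.get("data", {})
    key = row.get(pk)
    if event["type"] in ("insert", "update"):
        snapshot[key] = row       # upsert the latest row image
    elif event["type"] == "delete":
        snapshot.pop(key, None)   # remove the deleted row
    return snapshot

# Hypothetical stream of events for an `orders` table.
events = [
    '{"table": "orders", "type": "insert", "data": {"id": 1, "status": "new"}}',
    '{"table": "orders", "type": "update", "data": {"id": 1, "status": "shipped"}, "old": {"status": "new"}}',
    '{"table": "orders", "type": "insert", "data": {"id": 2, "status": "new"}}',
    '{"table": "orders", "type": "delete", "data": {"id": 2, "status": "new"}}',
]
snapshot = {}
for e in events:
    apply_maxwell_event(snapshot, e)
print(snapshot)  # only order 1 survives, with its latest status
```

In a streaming deployment the same upsert/delete logic would run per micro-batch against warehouse tables rather than an in-memory dict.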

Client: Zilliant

Project: Deal Management Feb’06 - Nov’06

Role: Software Engineer

Zilliant, a leader in strategic pricing applications, provides software and services that help companies define and manage their pricing for improved financial results. Deal Management is a feature added to the Zilliant suite that fully automates the deal process and helps increase revenue.

Responsibilities:

• Implemented the server-side design using industry-standard J2EE design patterns such as Business Delegate, Service Locator and Data Access Object.

• Developed JSPs that interact with the EJBs.

• Developed DAOs for persisting backend data to the Oracle database.

• Developed the presentation layer using the MVC framework, JSP, and Action Forms.

• Implemented Form classes, Action classes for the Deal Management module using Struts framework.

• Deployed EJBs on the WebLogic 8.1 application server.

Environment: JSP, Servlets, EJB, J2EE, JAAS, Struts, Spring, Hibernate, Shark, JRules, Windows 2000, WebLogic 8.1, Eclipse, Oracle.

Client: CircuitCity, VA

Project: Vendor Management Administration System Nov’04 – Jan’06

Role: Software Engineer

The main objective of the Vendor Management Administration System was to re-engineer the existing AS/400- and PB-based system and develop a new J2EE- and WebLogic-based system that could overcome various shortcomings of the current system.

Responsibilities:

• Used design patterns such as Business Delegate, Service Locator, Factory, and Singleton for the development of business logic.

• Developed Struts validations and implemented internationalization using resource bundles and wrote various Struts specific Action and Action Form classes for handling different requests.

• Deployed EJBs on the WebLogic 8.1 application server and developed front-end screens using JSP, DHTML and JavaScript.

• Involved in creating use case, class and sequence diagrams using Rational Rose, and unit tested the application using the JUnit test suite.

Environment: Struts 1.1, EJB 2.0, WebLogic 8.1, Oracle, JSP, LDAP, PVCS.

Client: Sprint (LightBridge), Burlington, MA

Project: RMS Aug’03 – Oct’04

Role: Software Engineer

RMS is a software application for managing the retail environment in the telecommunications industry. Involved in implementing the framework using MVC architecture and design patterns such as Data Object and Front Controller. Deployed EJBs on the WebLogic 6.1 application server. Developed Enterprise Java Beans and data access components for the Inventory and Reporting modules using the following technologies: J2EE, Java 2.0, Servlets, LDAP, JSPs, JDBC, EJB, JMS, Struts 1.1, SOAP, Swing, XML, WebLogic 6.1, JavaScript, ANT, Oracle 9i.


