Post Job Free


SQL Developer

Location:
Manhattan, MT
Salary:
$80 per hour
Posted:
November 02, 2023

Contact this candidate

Resume:

ad0tc5@r.postjobfree.com

Phone no: 469-***-****

SUMMARY

• 8+ years of experience building data-oriented systems and supporting projects in ETL development, Data Integration, Data Analysis, Data Modeling, Data Warehousing, Data Management, Business Intelligence, and Machine Learning.

TECHNICAL SKILLS

Programming: Python, SQL, PL/SQL, C/C++, CPLEX, XML, JSON, UNIX Shell Scripting, HTML, CSS, Pig, Hive, Java, Julia

Databases: Oracle 11g/10g/9i/8i, MongoDB 2.6.1, MySQL, PostgreSQL, MS Access 2000, Netezza

ETL Tools: Ab Initio 3.1/3.2, CloverETL 3.2/3.4/3.5, Informatica 7/9, Alteryx

Data Modeling: Erwin 7.x, ER/Studio, ArchiMate

BI and Data Mining: Datameer, SAS, Mathematica, Excel, Tableau, Cognos, MINITAB, SPSS

Other Tools: Airflow, Collibra, MS Azure, AWS, Toad, SQL Developer, Remedy, Bugzilla, Rational Team Concert 4.0, PuTTY, SVN, Docker

PROFESSIONAL EXPERIENCE

Client: Rocket Systems, Sacramento, CA Jan 2021 - Oct 2023

Designation: Data Engineer

Responsibilities:

• Designed and developed a data lake and data pipelines to ingest data from various sources and support the analytics platform and data warehouse

• Designed and developed a data encryption/decryption framework to secure sensitive data throughout the analytics platform

• Implemented a metadata and data governance platform to provide data security and visibility throughout the analytics platform

Tech: Python, Meltano, Singer, Airflow, Snowflake, AWS, Great Expectations, Collibra, Docker, dbt
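For illustration, the ingestion pipelines above follow the standard extract-transform-load pattern. The sketch below is a minimal, self-contained version of that pattern; the field names, sample data, and in-memory "warehouse" table are hypothetical (a production pipeline would use Meltano/Singer taps and load into Snowflake):

```python
# Minimal sketch of the extract-transform-load pattern behind such pipelines.
# All data, field names, and the in-memory "warehouse" are hypothetical.

def extract(source_rows):
    """Yield raw records from a (simulated) source system."""
    for row in source_rows:
        yield dict(row)

def transform(record):
    """Normalize field names and types before loading."""
    return {
        "customer_id": int(record["id"]),
        "email": record["email"].strip().lower(),
    }

def load(records, warehouse_table):
    """Upsert transformed records into the target table by primary key."""
    for rec in records:
        warehouse_table[rec["customer_id"]] = rec
    return warehouse_table

raw = [{"id": "1", "email": " Alice@Example.COM "},
       {"id": "2", "email": "bob@example.com"}]
table = load((transform(r) for r in extract(raw)), {})
```

Keeping extract, transform, and load as separate functions lets each stage be tested and orchestrated (e.g. as Airflow tasks) independently.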

Client: Cubet, Atlanta, GA Apr 2019 - Dec 2020

Designation: Data Engineer/Architect

Responsibilities:

• Architected and designed data and analytics platforms and processes as part of enterprise digital transformation.

• Migrated data from legacy systems (Netezza data warehouse, EDH, and Oracle) to the newly architected platforms (Azure Synapse and Kudu on Cloudera)

• Designed and developed an ML proof of concept, including the Confidential Cloud Pak platform and a customer segmentation use case

• Designed and developed a Client360 proof of concept, including architecture, back end, and front end
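As a toy illustration of the customer segmentation use case, a simple RFM-style rule can assign customers to coarse segments. The thresholds, fields, and sample customers below are hypothetical, not the actual model:

```python
# Illustrative RFM-style segmentation rule; thresholds and fields are hypothetical.

def segment(customer):
    """Assign a coarse segment from recency (days) and total spend."""
    if customer["recency_days"] <= 30 and customer["spend"] >= 1000:
        return "champion"
    if customer["recency_days"] <= 90:
        return "active"
    return "at_risk"

customers = [
    {"id": 1, "recency_days": 10, "spend": 2500},
    {"id": 2, "recency_days": 60, "spend": 300},
    {"id": 3, "recency_days": 200, "spend": 50},
]
segments = {c["id"]: segment(c) for c in customers}
```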

Tech: Django, Python, PySpark, Shell, Hive, HDFS, Oracle, MS Azure, Alteryx, Netezza, MySQL, SQL/PLSQL, Confidential Cloud Pak for Data, Cloudera Data Platform, Hue

Client: Solid Brain, Austin, TX Mar 2017 - Jan 2019

Designation: Database Developer

Responsibilities:

• Supported market analytics and classification machine learning processes

• Built BI dashboards, reports, and ETL jobs and processes

• Maintained a data lake in a Hadoop environment and Hive ingestion processes

• Developed Datameer big data analytics processes and promotion scripts

• Wrote Python scripts for data transformation and utility functions

• Wrote shell script utility functions

Tech: Python, Shell, Scala, Oracle, Hive, HDFS, Ab Initio, Datameer, SQL/PLSQL, Sybase, R, SVN
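The transformation utilities mentioned above can be sketched as small, reusable functions. This example is illustrative only: the pipe-delimited format and NULL markers are assumptions, not the actual project code:

```python
# Sketch of reusable data-transformation utilities; formats are illustrative.
import csv
import io

def parse_delimited(text, delimiter="|"):
    """Parse a pipe-delimited extract into dicts keyed by the header row."""
    reader = csv.DictReader(io.StringIO(text), delimiter=delimiter)
    return list(reader)

def coalesce(value, default=""):
    """Replace NULL markers commonly seen in warehouse extracts."""
    return default if value in (None, "", "NULL", r"\N") else value

rows = parse_delimited("id|name\n1|alice\n2|NULL")
cleaned = [{k: coalesce(v) for k, v in row.items()} for row in rows]
```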

Client: Digi Trends, Wilmington, TX Sep 2015 - Mar 2017

Designation: Data Engineer

Responsibilities:

• Data Engineer for a multi-domain custom master data management solution.

• Built data pipelines that ingested and migrated data from various sources in various formats; the data was consumed by 6,000 different federal and commercial organizations

• Developed Java code for data transformation and shell scripts for file management and operation

• Implemented data governance by creating 4000 data quality rules

• Performed extensive performance tuning and improvement of ETL jobs and processes
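A large catalog of data quality rules like the one described can be applied uniformly through a small rule registry. The sketch below is hypothetical (rule names, record fields, and checks are invented for illustration; the actual rules were not specified):

```python
# Hedged sketch of rule-based data quality checks; rules and fields are hypothetical.

RULES = {
    "ssn_format": lambda r: len(r.get("ssn", "")) == 11,
    "email_present": lambda r: "@" in r.get("email", ""),
    "non_negative_balance": lambda r: r.get("balance", 0) >= 0,
}

def validate(record):
    """Return the names of all rules the record violates."""
    return [name for name, check in RULES.items() if not check(record)]

good = {"ssn": "123-45-6789", "email": "a@b.com", "balance": 10}
bad = {"ssn": "12345", "email": "none", "balance": -5}
```

Registering each rule as a named predicate keeps thousands of rules manageable: new rules are added without changing the validation loop.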

Tech: Shell, Java, Python, Oracle, SQL/PLSQL, CloverETL

Client: XB Software, Reston, VA Dec 2013 - Sep 2015

Designation: Report Developer

Responsibilities:

• Areas of research and study: Statistics and Time Series, Metaheuristics and Simulation, Network Optimization and Graph Theory, Mathematical Programming, Stochastic Programming

• Areas of application: Financial Engineering and Portfolio Optimization, Healthcare, Supply Chain and Contract Management.
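As a small, self-contained example from the network optimization and graph theory area listed above, Dijkstra's shortest-path algorithm on a toy weighted digraph (the graph data is illustrative):

```python
# Dijkstra's shortest-path algorithm on a toy graph (graph data is illustrative).
import heapq

def dijkstra(graph, source):
    """Return shortest distances from source in a weighted digraph."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)], "C": [("D", 1)]}
distances = dijkstra(g, "A")
```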

Tech: C++, SAS, CPLEX
