Role: Oracle Developer, Hadoop Migration
Environment: Oracle 11g, Hadoop, Hive, SQL*Loader, Shell, Informatica, Linux, Cron
ORM is a risk system that logs compliance-sensitive information for Wells Fargo
Oracle: Lead Developer
Designed and developed database components for the ORM compliance engine to accommodate upstream data and downstream ETL requirements
Developed stored procedures, triggers, SQL*Loader scripts, and Bash scripts to load data
Developed reporting scripts in SQL
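A minimal sketch of what one of the SQL*Loader load scripts above might generate; the staging table, column names, and data file are hypothetical placeholders, not from the actual ORM project:

```python
# Sketch: generate a SQL*Loader control file for a hypothetical staging table.
# Table, column, and file names are illustrative placeholders.
def make_control_file(table, columns, datafile):
    cols = ",\n  ".join(columns)
    return (
        "LOAD DATA\n"
        f"INFILE '{datafile}'\n"
        f"APPEND INTO TABLE {table}\n"
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'\n"
        f"(\n  {cols}\n)"
    )

ctl = make_control_file(
    "stg_orm_events",                          # hypothetical staging table
    ["event_id", "event_ts", "risk_code"],     # hypothetical columns
    "orm_events.dat",                          # hypothetical data file
)
print(ctl)
```

In practice the generated file would be passed to `sqlldr` with a credentials wrapper from a Bash driver script.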
Oracle-Hadoop: Migration analytics
Data Modeling - Identified CDEs (Critical Data Elements) in source tables; created mapping diagrams from source Oracle tables/fields to target Hive tables
Data Quality - Performed data validation exercises to identify, evaluate, and rectify discrepancies between source (Oracle) values/types and the target Hive tables
Sqoop queries - Created Sqoop queries to extract data from Oracle
Python - Converted PL/SQL logic to Python scripts to extract data from Hive
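A sketch of the kind of Sqoop import the migration bullets above describe, expressed as a Python helper that assembles the command line; the JDBC URL, schema, table, and user names are hypothetical examples, not the actual Wells Fargo configuration:

```python
# Sketch: assemble a Sqoop import command to pull an Oracle table into Hive.
# All connection details, tables, and paths below are hypothetical.
def sqoop_import_cmd(jdbc_url, user, table, hive_table, split_by):
    return " ".join([
        "sqoop import",
        f"--connect {jdbc_url}",
        f"--username {user}",
        "--password-file /user/etl/.oracle_pw",  # hypothetical HDFS secret path
        f"--table {table}",
        f"--split-by {split_by}",                # column used to parallelize mappers
        "--hive-import",
        f"--hive-table {hive_table}",
    ])

cmd = sqoop_import_cmd(
    "jdbc:oracle:thin:@//dbhost:1521/ORMPRD",  # hypothetical Oracle service
    "etl_user",
    "ORM.COMPLIANCE_EVENTS",                   # hypothetical source table
    "orm.compliance_events",                   # hypothetical Hive target
    "EVENT_ID",
)
print(cmd)
```

Wrapping the command in Python rather than raw shell makes it easy to loop over the full list of CDE source tables from the mapping documents.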
NextEra Energy (FPL), Juno Beach, FL Dec 2015 – May 2016
Role: Oracle Developer
Environment: Oracle 11g, PL/SQL Developer, TOAD, Informatica, Linux, Agile, Control-M
NextEra Energy (FPL) is a major power vendor that sources power from major producing partners and delivers it to consumers; generation sources include wind, solar, and hydro. Power consumption records originate at meters and feed the billing systems for the billing department. Meter data arrives primarily as flat files, which are processed and inserted into the Oracle database via cron jobs. Once data lands in the master tables, triggers and procedures process it to produce customer-level reports.
Roles & Responsibilities:
Attended business meetings to analyze data from new sources and understand business requirements; evaluated and wrote technical specifications describing the changes required in the system
Designed and created database tables, keys, indexes, views, and materialized views to store data per requirements; developed shell scripts for extracting and parsing data from the files
Debugged and tuned PL/SQL code and queries for the Oracle database; developed packages, cursors, procedures, functions, and triggers
Production Support: monitored batches and coordinated with upstream processes on failure
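The flat-file meter ingestion described above could be sketched as a small parser that runs before the load step; the pipe-delimited record layout, field names, and timestamp format here are hypothetical, not FPL's actual file specification:

```python
# Sketch: parse a hypothetical pipe-delimited meter reading record
# (meter_id|YYYYMMDDHHMM|kwh) before loading it into Oracle.
from datetime import datetime

def parse_meter_line(line):
    meter_id, read_ts, kwh = line.strip().split("|")
    return {
        "meter_id": meter_id,
        "read_ts": datetime.strptime(read_ts, "%Y%m%d%H%M"),
        "kwh": float(kwh),
    }

rec = parse_meter_line("MTR001|202312251430|12.75\n")
print(rec)
```

A cron-driven wrapper would apply this line by line, quarantining rows that fail to parse instead of letting them break the batch load.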
Eximius Technology, Mumbai Sept 2007 – July 2009
Environment: Linux, Oracle 11g, Oracle Designer, and TOAD
Eximius Data Master is a fully managed data service solution and utility for reference and market data, including asset descriptive data, pricing, corporate actions, terms and conditions, counterparty/legal entity data, exchange-traded funds, and indices. FIS is an agent for over 100 vendor/supplier feeds, including Bloomberg, Euronext, Thomson Reuters, Telekurs, Standard & Poor's, and Interactive Data Pricing and Reference Data, delivered as a Single Source of Data (SSOD™) that is accessible online or can be replicated locally on site using FIS Container for full local control.
Worked extensively with Oracle SQL, PL/SQL, SQL*Plus, schema management, SQL*Loader, and query performance tuning; created DDL scripts and database objects such as tables, views, indexes, synonyms, and sequences
Prepared data reconciliation reports using Oracle PL/SQL and UNIX
Developed Complex SQL queries using various joins and developed various dynamic SQLs throughout the projects.
Tuning of SQL Queries, Procedures, Functions and Packages using EXPLAIN PLAN and TKPROF.
Gathered and analyzed requirements for Mortgage-Backed Securities, Equity Derivatives, Debt, FX, Swaps, Interest Rate Derivatives, Emerging Markets, Securities Master Data, and pricing products provided by various data vendors
Technical troubleshooting and consultation to development. Performed UAT and SIT.
Nu Horizon Services, India Jan 2007 – July 2007
Programmer – Intern, Environment: C, Solaris, Shell Scripting
Project name: Mailing Server
This was an in-house mailing system project for VSNL (Govt. of India Telecom). The front end was built in C graphics, and the system was designed to handle SMTP and POP3 protocol-level requests. The protocols were coded in C; security and anti-hacking measures were implemented by tracking the Solaris password salt, on which password encoding was based.
Coding with C and requirement gathering
Analysis of user request for change
Chatham, NJ 07928
Since Jan 2018, while preparing for my Cloudera certification, I have been working on use cases such as Activity Review and Reg-W: ingesting data into Hadoop via Sqoop/SFT files, replicating it across clusters, and writing PySpark scripts to extract data. Designed and architected all components of the Hadoop infrastructure under a Cloudera VM.
Database developer turned data engineer with deep insight into data quality and modeling techniques.
With more than 5 years of experience developing complex RDBMS modules, structuring ETL flows, and migrating to the distributed world of Hadoop/Spark, I have learned the art of simplifying complex requirements into reliable, latency-sensitive solutions.
My confidence is grounded in strong architectural knowledge and development skills across all components of a data-driven infrastructure, from ingestion to extraction.
Cloudera: Spark and Hadoop Developer (CCA175)
Oracle: OCA - Oracle Certified Associate
Informatica – PowerCenter Data Integration 9.x: Developer Specialist
HADOOP: Spark, YARN, MapReduce, HDFS, Sqoop, Hive, Pig, HBase, Flume, Oozie
LANGUAGE & SCRIPTING: Scala, Python (Pandas, NumPy, SciPy), SQL, PL/SQL, Bash, Perl, Java
DATABASE: Oracle, SQL Server, MySQL, Sybase
OS: UNIX, Windows XP, Windows 2000, Linux
REPOSITORIES: CVS, SVN, GitHub
TOOLS: Anaconda, PyCharm, Sublime, Eclipse, TOAD, SQL*Loader, SQL*Plus, SQL Developer, DB Artisan, MS Office
NETWORK: FileZilla, WinSCP, FTP, SFTP, PuTTY
SCHEDULERS: Control-M, CronTab, Scheduler (UNIX & Windows)
Master's in Computer Science – 2007
Master's in Arts – Child Nutrition – 2006
University Topper in BA - 2003